Evaluations examining process
Carroll, Jonathan D

When enrollment management divisions evaluate student satisfaction, they do not analyze the process, procedures or policies of the institution. Instead, they merely evaluate the students' "satisfaction" rather than asking them to give a critical analysis of their experience. These narrowly tailored evaluations cannot aid institutions in changing to improve service to students. Assessment tools, by contrast, can examine the service process and provide institutions with useful information to use as a catalyst for change in enrollment management. The author provides ways to evaluate process, methods of gathering information and an outline for developing a continuous improvement model.
Introduction
Often when schools or businesses evaluate customer satisfaction, they do not analyze the process, procedures or policies of the institution. Instead, they merely evaluate the customers' "satisfaction" rather than asking them to give a critical analysis of their experience. Thus, these broadly tailored evaluations cannot effectively aid institutions in making changes to improve their service to customers. Well-designed assessment tools can examine the service process and provide institutions with useful information that can be used as a catalyst for change. This paper will provide ways to evaluate process, methods of gathering information and an outline for developing a continuous improvement model. The suggestions will enable a college to develop multiple assessment tools that examine and help improve the college's enrollment management processes, procedures and services.
Evaluating a customer service process
Customer satisfaction
Asking open-ended questions can address issues of customer service. Customers gain the opportunity to evaluate the overall process rather than merely marking the bubble that best explains their satisfaction level. Kridel (1999), in "Selling smarter, not harder," discusses the use of open-ended survey questions that directly address process:
Surveys are an effective way to determine which salespeople and parts of the sales process need improvement. . . About one week from the sale, Cellular South sends the customer a postcard with a half dozen questions such as: How well did the salesperson explain the product? How easy was it to operate immediately after the purchase? Was the salesperson eager to assist? (pg. 3).
By giving the customer an opportunity to complain about the process, the organization is able to make institutional improvements.
For example, the U.S. Department of Education Office of Student Financial Assistance changed its organizational structure into a Performance-Based Organization (PBO) in order to use complaints as a catalyst for change. According to "Reinventing service," by the U.S. Department of Education Office of Student Financial Assistance (1999), "The best in business use complaints to good advantage. OSFA [Office of Student Financial Assistance] will copy best in business systematic methods of welcoming complaints so we can keep improving service," (pg. 12). It is crucial to understand, however, that one complaint does not merit overhauling the system. Responses to these questions help identify trends before the staff discusses change.
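The tallying step implied here, counting complaints by category and acting only on the ones that recur, can be sketched in a few lines of Python. The complaint categories, data and cutoff below are hypothetical; the article names no particular tooling:

```python
from collections import Counter

# Hypothetical open-ended responses, already coded by staff into
# complaint categories (the coding step itself is a manual judgment).
coded_complaints = [
    "registration wait time", "financial aid paperwork",
    "registration wait time", "advising availability",
    "registration wait time", "financial aid paperwork",
]

# One complaint does not merit overhauling the system, so require
# a category to recur before it is flagged as a trend to discuss.
THRESHOLD = 3

counts = Counter(coded_complaints)
trends = [category for category, n in counts.items() if n >= THRESHOLD]
print(trends)  # categories recurring often enough to treat as trends
```

Only categories at or above the cutoff reach the staff's change discussion; the rest stay in the log until further feedback confirms or dispels them.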
Roth (2001) in "Giving customers a voice" discusses how the company PlanetFeedback.com uses the results of open-ended questions targeting process:
Our customer experience director meticulously reviews monthly feedback analysis and consumer satisfaction surveys to spot trends and opportunities. We use Express Feedback internally to route consumer insights intelligently. . . Every complaint and compliment through Express Feedback teaches us something, helps us understand our own customers, and helps us build better relationships with them (pg.3).
According to the Higher Education Funding Council for Wales (2001), a similar program helped identify a necessary change in an English college:
[Surveys] have provided useful feedback for improving services. For example, one institution revised the registration process following negative feedback from the students through a customer satisfaction survey, by selecting a new location for the main track of registration and allocating students fixed times to attend, thus substantially reducing waiting times (pg. 79).
Customer needs
Another way institutions can survey to analyze process is by questioning customers about their needs. It is critical that institutions understand, in students' own voices, what students want in order to best serve them. Administrators then no longer have to guess what students need or whether those needs are being met. Kridel (1999) discusses how other businesses have used the approach:
Learning about customer need is key to selling wireless more as a solution and less as a gee-whiz collection of technology and services. Questions that draw out the customer's needs include: How do you see yourself using our service? How often would you use it? Do you want it for security or business? Do you plan to use the service while traveling? (pg. 3).
By surveying students' needs to determine if they are being met, an institution can appropriately evaluate process, exactly what businesses with exceptional customer service do. "Reinventing Service," by U.S. Department of Education Office of Student Financial Assistance (1999), addresses how companies adjust process according to need. "Disney, FedEx, American Express, all of the companies that come to mind as among the best at serving their customers, make their living finding out exactly what their customers want and then delivering it," (pg. 12). Institutions of higher education need to follow the lead of such businesses in order to provide students with the best quality services.
Customer service encounters
Asking customers to retell critical events is another way to address process issues. A researcher asks a customer in an interview to recall a specific event. Roos (2002) in "Investigating critical incidents" outlines the technique:
He described various service encounters in detail and then showed which elements of the interaction between customer and staff members were critical to the customer relationship. The ways in which these interactions develop in various directions, depending on the response received by the customer in various interactive situations, can then be determined. These results are important to the continued development of the use of critical incidents, in that they indicate and describe trigger factors that may precede either a strong or a weakened customer relationship (pg. 195).
Once gathered, the critical event data is analyzed using SPAT [Switching Path Analysis Technique]. The SPAT method gives researchers a way to interpret the customer experience and analyze it to determine if changes in process are needed. According to Roos (2002), the SPAT method divides the experience into categories: "The customer relationship is broken down into a trigger, an initial stage, a process, and a consequence," (pg. 196). The retold critical event is analyzed through this lens to determine customer behavior, which is the key to providing better service.
When using SPAT, the trigger stage is most critical because at that point the customer is susceptible to switching service providers. Roos (2002) defines three triggers: a situational trigger occurs outside of the customer relationship; an influential trigger occurs when the service provider to which the customer switches serves as a standard of comparison; and a reactional trigger occurs when a customer switches because of a change within the company. The analysis provides organizations with valuable information about process. It enables them to discover what causes a customer to be pleased or displeased with service.
For example, if former students were asked about the enrollment management process at a particular college, the researcher would document the student accounts and determine what type of triggers caused the students to go elsewhere or stop attending. A student who stops attending because the registration process took too long illustrates a reactional trigger. A student transferring to the University of Phoenix because of more convenient on-line advising illustrates an influential trigger. The SPAT method entails dissecting the critical events and determining what caused the customer to be satisfied or unsatisfied, thus allowing institutions to better understand the behavior of their customers. An actor who is desperately trying to get into character will cry, "What's my motivation?" Critical event data as a research tool is similar. It gets to the heart of customer motivation. Roos (2002) summarizes the use of the technique:
SPAT represents a method in which the domain consists of the customer relationship or, more specifically, one customer relationship that is replaced by another. Insight into what the problem areas will be when behavioral intentions need to be predicted can be obtained by carefully mapping the advantages gained by keeping the focus on actual behavior when analyzing and representing the customer relationship (pg. 202).
Just as banks use the SPAT method to understand why customers switch banks (Roos, 2002), higher education agencies could do the same to determine why students stop attending.
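As a rough illustration only, not Roos's actual instrument, the SPAT breakdown of a retold incident into trigger, initial stage, process and consequence can be represented as a simple record, with the trigger classified into the three types defined above. The field values echo the article's University of Phoenix example and are otherwise invented:

```python
from dataclasses import dataclass

# The three trigger types Roos (2002) defines.
TRIGGER_TYPES = {"situational", "influential", "reactional"}

@dataclass
class CriticalIncident:
    """One retold switching story, broken down per SPAT."""
    trigger_type: str   # situational, influential or reactional
    trigger: str        # what set the switch in motion
    initial_stage: str
    process: str
    consequence: str

    def __post_init__(self):
        # Reject incidents whose trigger falls outside Roos's typology.
        if self.trigger_type not in TRIGGER_TYPES:
            raise ValueError(f"unknown trigger type: {self.trigger_type}")

# Hypothetical student account modeled on the article's example of an
# influential trigger (a competitor serving as a standard of comparison):
incident = CriticalIncident(
    trigger_type="influential",
    trigger="University of Phoenix offered more convenient on-line advising",
    initial_stage="student compared advising options",
    process="student weighed convenience against staying enrolled",
    consequence="student transferred",
)
print(incident.trigger_type)
```

Collecting many such records and grouping them by trigger type is one way an institution could surface which category of trigger most often precedes a student's departure.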
Gap analysis
Gap analysis provides another way organizations can address process issues. Interviewers ask customers specific questions about their satisfaction level and then ask them to mark a level of importance for specific parts of the process. DiDomenico and Bonnici (1996), in "Assessing service quality within the educational environment," address how colleges can use gap analysis information to improve services to students:
The gap analysis model offers a disciplined methodological approach, which scrutinizes the quality of services delivered by institutions of higher education. Students are asked several questions in a detailed process, which assigns attributes to services while evaluating the importance of each service dimension. The resulting gap provides valuable information about areas which need improvement (pg. 353).
Compiled data determines where positive and negative gaps exist. A positive gap shows the satisfaction level exceeds importance. A negative gap occurs when the level of satisfaction is lower than the importance assigned. Such information helps colleges better understand what specific areas students most value and what levels of satisfaction apply. For example, Sinclair Community College used point-of-contact surveys during the enrollment management process and learned that confusion about registration and limited hours of operation were areas the college needed to improve (Sinclair Community College, pg. 4).
Tabulated results are transferred to a matrix such as the one in Figure 1.
In "2001 National Student Satisfaction Report," Noel-Levitz discusses the importance of using a matrix to interpret the results of gap analysis:
Utilizing the matrix permits the institution to conceptualize its student satisfaction data by retention priorities and marketing opportunities. In addition, it allows the institution to pinpoint areas where resources can be redirected from areas of low importance to areas of high importance (pg. 7).
Using the method can provide colleges with valuable information on how to improve service in specific areas.
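The arithmetic behind the method is simple: the gap for each survey item is satisfaction minus importance, and the importance/satisfaction pair places the item in a matrix quadrant. A minimal sketch follows, assuming 1-7 rating scales, a midpoint cutoff and invented item names and scores, since the article specifies none of these:

```python
# A minimal gap-analysis sketch. Scales, cutoff and items are assumptions.
def gap_quadrant(importance, satisfaction, cutoff=5.0):
    """Classify one survey item the way a gap-analysis matrix would."""
    gap = satisfaction - importance  # negative gap = improvement needed
    if importance >= cutoff and satisfaction < cutoff:
        label = "high importance, low satisfaction: priority for improvement"
    elif importance >= cutoff:
        label = "high importance, high satisfaction: strength to maintain"
    else:
        label = "low importance: candidate for redirecting resources"
    return gap, label

# Hypothetical point-of-contact survey results (importance, satisfaction):
items = {
    "registration process": (6.5, 4.2),
    "campus bookstore": (3.1, 5.8),
    "academic advising": (6.8, 6.1),
}

for name, (imp, sat) in items.items():
    gap, label = gap_quadrant(imp, sat)
    print(f"{name}: gap {gap:+.1f} -> {label}")
```

In this toy data, the registration process shows the large negative gap that flags it as a priority, mirroring the Sinclair Community College finding cited above.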
Multiple assessment tools
When an organization wants to research its customers, a variety of methods can be used to gather information to better understand how to improve process. Smith and Mather (2000) examine one college's use of multiple assessment tools.
Using this integrated approach, the unit developed a systematic student satisfaction survey and service impact surveys. The Point of Service Survey contains eleven items that are common to all student service units and allows for the addition of items unique to a particular department. The instrument is administered every two and one-half years, and the results are used to evaluate and enhance programs and services in light of their mission. Sinclair Community College uses multiple assessment tools within this rubric, including focus groups and phone surveys. Information gathered through its assessment program has led to a variety of changes, from modifying and expanding its orientation program to establishing Sinclair Central, a one-stop counseling and advising center (pg. 71).
With several tools, the organization is able to evaluate process from multiple vantage points. Whiteley and Hessan (1996) in "Customer centered growth" highlight tools businesses use to hear their customers:
Focus groups - Small groups (of customers or any target group) are invited to meet with a facilitator to answer open-ended questions.
Customer panels - Like a focus group, a customer panel consists of a small number of customers invited to answer open-ended questions. However, a customer panel is a group that meets on a regular basis.
Face-to-face individual interviews - Personal interviews provide nuances of different customers' thoughts.
Visits to customers and observation of their product use - Thoughtful study of customers in the setting in which they actually use your product provides the greatest intimacy of any technique.
Customer tours - Invite customers to visit your facilities and discuss how you can serve them better.
Meeting customers at trade shows - Setting up a booth in a place where customers will congregate is a cost-effective as well as time-honored method of hearing the customer.
Toll-free numbers - Companies attach telephone numbers to their products.
Surveys by telephone - Surveys ask a fixed menu of questions of a large number of people; they are most useful to obtain opinions on closed-ended questions, the importance of which you've already established by asking open-ended questions in other settings.
Mystery shoppers - Professionals visit your business posing as customers and report how they were treated.
Debriefing of frontline sales and service people - Ask frontline people in a relaxed setting about their experiences to obtain insight into what the customer faces and what he or she wants.
Customer contact logs - Ask customer contact people to report when customers say something interesting or significant.
Customer serviceperson's hot line - This is a phone number customer service people can call to report problems. A voicemail box may be sufficient.
Competitive win/loss debriefings - These are special interviews with the customer at the time when you win or lose a piece of business (pg. 66-68).
While businesses have employed these techniques for years to understand their customers, colleges and universities are just beginning to use the same tactics, and they have applied far fewer of these tools. Typically, colleges and universities have used focus groups, telephone surveys, a complaint system, point-of-contact surveys and surveys using open-ended questions in order to gauge student satisfaction. Apparently, higher education has much to learn from businesses in evaluating customer service.
A growing number of colleges and universities now use traditional business tactics to evaluate process. The following two universities illustrate how such tactics can help institutions better understand their students.
A case in point: The University of Phoenix implements point-of-contact surveys to improve student services
While colleges and universities have a reputation of changing at a slower rate than cemeteries, the use of point-of-contact surveys is a growing trend in higher education. The surveys are given shortly after the service encounter and gauge a customer's satisfaction through a series of open-ended questions targeting the service process.
According to the Detroit representative for Phoenix in a May 15, 2002, interview, beginning in 2003 the University of Phoenix, a nationwide online higher education provider, will survey its students after each critical encounter with a Phoenix student affairs professional. The service encounters include separate appointments with admissions, finance and academic counselors. The university plans to both e-mail and post mail a point-of-contact survey after each encounter. The corporate headquarters of Phoenix will analyze the results of the surveys in order to identify trends. Based on the findings, appropriate changes will be implemented to improve student services at Phoenix.
In addition to Phoenix, both the University of Michigan and the University of Minnesota use point-of-contact surveys to evaluate their student services.
A case in point: The University of Minnesota uses student complaints to improve student services
While many colleges and universities cringe at the term "customers" when applied to students, the University of Minnesota (UM) embraces the concept. According to Craig Swan, a professor of economics who also works in the Office of the Executive Vice President and Provost, the UM atmosphere is a "service culture" (interview May 15, 2002). Every student complaint is used to improve student services at UM.
First, the commitment to student feedback exists at all levels of the organization. Even the president of UM personally responds to student complaints. Although UM is one of the largest universities in the United States, everyone, regardless of rank, has made a commitment to the "service culture."
Furthermore, each complaint is examined in its relationship to the overall system. The university looks for trends in student feedback to determine if changes in policy, procedures and process in student services are necessary. Listening and responding to the needs of students creates a continuous improvement model.
In addition to being responsive to student feedback, the university actively seeks it out. A web link on the UM website solicits student feedback. The entire university system makes it a duty to collect service information. UM uses multiple assessment tools, including focus groups and telephone surveys, to make the service culture a reality.
Process evaluation team
Forming a Process Evaluation Team is the first step in developing a multilayered assessment model for continuous improvement. The team's objective is to develop the assessment tools, gather and analyze the data and communicate the results by completing the following steps. See Figure 2 for a graphic representation of the necessary steps.
Determine the specific assessment tools to be used to evaluate process. The areas being examined and what techniques will provide the necessary information determine the particular tools. The choices may include traditional business assessment tools in the higher education context.
Develop the assessment tools.
Evaluate customer responses.
Analyze the data gathered and identify trends.
The Process Evaluation Team must then transform the collected information into action plans, similar plans to those employed by many financial aid offices that gather customer data and then make changes based on the information. Transforming the information into an action plan entails the following:
Form a Customer Service Task Force comprised of individuals that represent the different needs of each group - students, staff and faculty.
Consider the distinct needs of each group.
Analyze the issue from multiple vantage points.
Recommend an action plan based on the findings and different needs of the stakeholders within the institution (U.S. Department of Education Office of Student Financial Assistance).
Implementation of the plan
Once developed, the action plan needs to be presented to the people within the institution who have the power to actually make the changes happen. Smith and Mather (2000), in discussing how institutional change is realized, address the importance of communicating the results. "Student affairs research is useful only if it leads to improvement in the education and lives of students. This requires the effective communication of results to the people who set policy and implement student affairs programs," (pg. 49). Gathering information and creating action plans is not enough; the findings must be effectively communicated to and used by the appropriate stakeholders in order for change to become a reality.
After the results have been communicated, the Process Evaluation Team and the Customer Service Task Force must meet to create the continuous loop of the model. The two teams determine what information needs analysis before the next phase can begin. Instead of providing one-time results for enrollment management, the process provides continuous improvement. Further research into ways of evaluating process would include examining the guidelines of agencies including the American College Personnel Association (ACPA), the National Association of Student Personnel Administrators (NASPA), the International Customer Service Association, the American Association of Higher Education's Total Quality Management (TQM) work, the North Central Association, and The Higher Learning Commission, including its Academic Quality Improvement Project (AQIP). The concepts and practices from the literature review are clear: optimum customer service in any organization requires continuous improvement.
References
Noel-Levitz (2001). 2001 National Student Satisfaction Report. Retrieved May 20, 2002, from http://www.noellevitz.com
DiDomenico, E. & Bonnici, J. (1996), "Assessing Service Quality within the Educational Environment," Education, 116 (3), 353-9.
Higher Education Funding Council for Wales (2001). The management of student administration: A guide to good practice. 1-91. Retrieved April 29, 2002, from the ERIC database.
Roos, I. (2002). Methods of investigating critical incidents: A comparative review. Journal of Service Research, 193-204. Retrieved April 29, 2002, from the Lexis-Nexis database.
Jackson, K. (2001). Taking the lead in the customer satisfaction game. Call Center Magazine, 1-8. Retrieved April 29, 2002, from the ERIC database.
Kridel, T. (1999). Selling smarter, not harder. Wireless Review, 1-5. Retrieved April 29, 2002, from the Lexis-Nexis database.
Roth, A. (2001). Giving customers a voice. Web Techniques, 1-3. Retrieved April 29, 2002, from the Lexis-Nexis database.
Smith, K. & Mather, P. (2000). Best practices in student affairs research. In Pickering, J. & Hanson, G. (Eds.), Collaborations between student affairs and institutional researchers to improve institutional effectiveness (pg. 63-78). San Francisco: Jossey-Bass Publishers.
U.S. Department of Education Office of Student Financial Assistance (1999). A report from the customer service task force: Reinventing Service. 1-81. Retrieved April 29, 2002, from the ERIC database.
Whiteley, R. & Hessan, D. (1996). Customer centered growth. Cambridge, Massachusetts: Perseus Books.
Jonathan D. Carroll
Mr. Carroll is a doctoral student in the Community College Leadership Program at The University of Texas at Austin.
Copyright Schoolcraft College Fall 2002