Quality programs involve assessing the service delivery and process adherence of support teams. The goal is to improve the customer experience and increase customer satisfaction. Yet, too often, quality programs stop at the scorecards. And scorecards are just that: a numerical score that people strive to achieve. Some organizations become so numbers oriented that the support professionals become focused on "hitting the numbers" instead of restoring service and creating a great customer experience. As a result, quality actually decreases instead of improving. Additionally, you can inadvertently override all the great customer service and problem-solving skills your analysts possess, the very skills you hired them for.
So, what does it take to truly improve the customer experience? In short: alignment, balance, coaching, and continual improvement. Goals need to be defined, and the quality program needs to drive toward achieving them. You also need to balance numeric and qualitative elements in the quality program, and scoring and coaching need to be carried out consistently. Lastly, a quality program continually evolves. Through trending, scoring, coaching, and maturing processes, you can identify areas of opportunity. As new goals are defined, the scorecard will be adjusted.
Too often, organizations build a scorecard without clearly defining (and documenting) their goals. What are the problems you are trying to solve? What are the goals you are trying to achieve? In other words, why do we need to establish a quality program? The why is what drives the scorecard elements.
Once the details of the scorecard are built, companies often become so focused on the scorecard metrics that they lose sight of the why behind the scorecard. When the why is not defined and documented, it often leads to an imbalance of metrics and scoring criteria. Often, the scorecards become more numbers-driven and less about the quality of how each customer experience is handled.
Overall, you are trying to establish an environment where all of your support center analysts understand departmental priorities and goals and have the skills needed to manage each customer experience. For example, you might want to grow individual skills and improve the overall customer experience. You want your team to focus on that customer experience with every contact and to use their skills to work within the parameters of policy and standard operating procedures. When they understand the why behind the quality standards and standard operating procedures, they will use their judgment and take the right action.
Next, gather information that supports these goals and challenges. Once the why is defined and information is gathered, you can build the foundation for your scorecard. For example, when reviewing customer feedback, you find that customers often say that analysts don't listen and just follow a checklist. You listen to calls and review tickets and notice two things:
When analysts shortcut information gathering, tickets are often categorized incorrectly, sent to the wrong queue, and missing the details needed to work toward resolution. This increases the time the user is unable to work. Additionally, when analysts fail to acknowledge the customer and show empathy, bad experiences lead to unnecessary escalations to management. To change behavior and improve service, you set goals for the team:
Not only have you clearly defined the why behind the goals, but you’ve identified specific actions the team needs to take to overcome the current perception, become more efficient, and improve the quality of the customer experience.
This example shows how to lay the foundation for building the scorecard. Next, the focus needs to be on balancing quality efforts with numeric targets.
To assess the quality of the customer experience, incident monitoring (ticket audits), knowledge monitoring, and call monitoring all need to take place. Based on the previous example, here are the scorecard elements and how to assess them:
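To picture how such a balanced scorecard might roll monitoring results up into a single score, here is a minimal sketch. The element names, weights, and pass/fail scoring below are illustrative assumptions for this article's scenario, not a standard scorecard:

```python
# Hypothetical weighted scorecard blending call-monitoring, ticket-audit,
# and knowledge-monitoring elements. Names and weights are assumptions.
WEIGHTS = {
    "acknowledged_and_showed_empathy": 0.25,      # call monitoring
    "gathered_complete_information": 0.25,        # call monitoring
    "confirmed_resolution_with_customer": 0.25,   # call monitoring
    "ticket_categorized_correctly": 0.15,         # ticket audit
    "knowledge_article_linked_or_created": 0.10,  # knowledge monitoring
}

def score(assessment: dict) -> float:
    """Return a 0-100 quality score from pass/fail assessments."""
    if set(assessment) != set(WEIGHTS):
        raise ValueError("assessment must cover every scorecard element")
    return round(100 * sum(WEIGHTS[k] for k, passed in assessment.items() if passed), 1)

example = {
    "acknowledged_and_showed_empathy": True,
    "gathered_complete_information": True,
    "confirmed_resolution_with_customer": True,
    "ticket_categorized_correctly": False,
    "knowledge_article_linked_or_created": True,
}
print(score(example))  # 85.0
```

Note that the qualitative call-monitoring elements carry most of the weight here; that is the balance the article argues for, with numeric checks present but not dominant.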
It is important to recognize that the way each analyst shows empathy and acknowledges what the customer says might be different. Don’t require certain words to be said. When phrases are required, analysts sound unnatural and insincere. Also, instead of listening and using appropriate words or phrases, the analyst is focusing on making the specific statement. Common examples of required phrases include the following:
Here is an example of what might happen as a result.
Joe received a deduction on his quality scorecard because he did not literally say, “Have I resolved your issue today?” Yet, the reason that question is on the scorecard is to ensure that the analyst verifies that the resolution works and that the customer agrees.
Let’s review what Joe did say during the call:
Customer: “It’s working now! Thank you!”
Joe: “It is working? Great. Is there anything else I can do for you today?”
Customer: “No, that’s all I needed. Thank you so much!”
The analyst needs to ensure that the customer agrees that the issue has been resolved (this is the "why"). It's not about the exact words; it's about checking to make sure it is resolved. Joe definitely did that. He responded to the customer's comment, and it was natural and showed that he was listening and engaged. Now let's revisit the scenario with Joe forced to meet the literal wording on the scorecard:
Joe: “It is working? Great. Have I resolved your issue today?"
This is unnatural and makes it look like Joe didn’t really listen and is scripted. This would result in lower credibility of the service desk and lower customer satisfaction overall. This could also be the reason the customer satisfaction survey feedback reported that the service desk follows a checklist. It also strayed too far from the why.
This shows that you need to define not only the overall why for the quality program, but also the why for each element on the scorecard. Joe met the quality criteria in his call (the first example). He checked with the customer to make sure the issue was resolved. Yet, as the quality program evolved, the criteria became too restrictive and the why was forgotten.
We hire people for their ability to listen and respond appropriately in any situation. Forcing support center analysts to follow a script on responses forces them to focus on their scores, not on being engaged in the call and managing the customer experience.
Analysts with rigid scorecards frequently report that they feel so much pressure to avoid getting "dinged" on their quality scorecard that they don't always do what is best for the customer. This is especially true when scorecards include numbers such as these:
Analysts become focused on hitting the numbers and aren’t fully listening to their customers. Instead of listing rigid numbers to hit, keep it high level as in the example with Joe and allow the analysts to use their natural approaches and their great customer service skills.
The results will be service desk analysts who are really listening and helping their customers. They will make real connections, sound sincere, and facilitate service restoration and higher productivity—both for the customers and service desk.
A quality program needs to evolve as the service desk improves its service delivery. It is important to solicit feedback both from customers and the service desk analysts on a regular basis (quarterly or every six months). Feedback can identify what needs to be modified or added to the quality program.
Also, keep in mind that the quality program is in place to grow analysts' skills and improve the customer experience. It should be a way of recognizing improvement, not a punitive effort. Recognition and achievement are two motivational factors that engage employees. As the analysts become more proficient in their roles, their pride and confidence grow and they deliver better service. Here are three ways to keep analysts engaged and focused on improvement:
In addition to coaching and soliciting feedback, it is important to document the responses. It makes it easier to identify what needs to be reviewed and what areas to improve. Also, if it is documented, it is easier to check against the why and avoid making changes that contradict the goals.
For example, if you get feedback that the service desk doesn’t understand business issues, it is important to identify how to put quality criteria in place that will help to assess and grow their skills through coaching.
As you recognize trends or habits that need to change, build them into the scorecard. As needed, communicate what trends or comments are driving the changes, then educate, assess, and coach the team members and help them to grow their skills and wow their customers. Coaching needs to remain a constant; don't stop coaching!
It is important to remember that customer service and technical support is a profession; not just anyone can do it. Pay attention to alignment, balance, coaching, and continual improvement to build a quality program that lets analysts' natural skills emerge, meets its goals, identifies improvements, and allows people to continually grow. You'll have engaged employees who trust their companies. Loyal, happy support center analysts attract and keep loyal, happy customers.
Rae Ann Bruno is the president of Business Solutions Training, Inc., where she consults and trains in various areas of ITIL, KCS, communications, internal marketing, metrics, and process improvement. Rae Ann holds several ITIL certifications, is a faculty trainer for HDI, and is the author of Translating IT Metrics into Business Benefits and What Have You Done for Me Lately? Creating an Internal Marketing Culture. She is also a member of the HDI International Standards Committee.