The Problem with Customer Service Surveys

In the world of service management, we’re not evaluating products, but services, which are even more difficult to comprehensively assess via customer surveys alone. Here’s how to make sure the data you are getting is much more than noise.

As part of our “From the Vaults” series, we’re resurfacing this excellent examination of how to make sure surveys deliver true and actionable data.

Customer surveys enable us to validate our assumptions about the “voice of the customer,” and they give us feedback we can use to accelerate our design, development, or refinement of services to effectively and continually meet our customers’ needs. But collecting meaningful survey data is more difficult than ever before. Attention spans are short, social media surveys are prolific, and few people enjoy completing surveys in the first place. In addition, in the world of service management, we’re not evaluating products, but services, which are even more difficult to comprehensively assess via customer surveys alone.

Back to Basics

To develop an effective approach to measuring services, we must start by unpacking the definition of a service, and in particular the outcomes a service is meant to enable. According to ITIL, outcomes are:

The result of carrying out an activity, following a process, or delivering an IT service…The term is used to refer to intended results, as well as to actual results. Customers seek outcomes but do not wish to have accountability or ownership of all the associated costs and risks…The customer will only be exposed to the overall cost or price of a service…then judge the value of a service based on a comparison of cost or price and reliability with the desired outcome.

In short, customers don’t care about the details of the activities, the steps in a process, or the complexity of the components involved in providing a service. Customers evaluate performance on the degree to which their expectations were met and their perception of the value received, compared to overall price. “Close” seldom counts.

Services are very different from products, which is why product-based customer surveys aren’t effective tools for evaluating services. There are several key differences, as you can see in the table below.

 

| Services | Products |
| --- | --- |
| Services are dynamic interactions. | Products are physical entities. |
| Services are delivered in real time. | Products are created in advance. |
| Success can only be determined if the customer has been able to achieve a desired outcome; the output is secondary. | Success is determined by the quality and delivery of the product itself. |
| The value of a service is only realized when it is actually being used by a customer. It retains no value after it has been used and cannot be resold. | Value is created and realized every time a product changes hands. A product retains value over time and can be purchased and sold many times. |
| The value of a service is carried in the relationship between the customer and the service provider. | The value of a product is carried in the product itself. |
| Quality is defined by the level of the customer's satisfaction, based on his or her subjective experience of the service. | Quality is first based on whether the product meets certain predefined criteria, and only then on the customer's experience of whether the product does what the vendor claims. |

In addition, customer expectations are constantly shifting, and service providers who don’t track these shifts will find themselves losing business or out of business entirely.

In my organization, a simple adjustment to the customer survey was all it took to create a more strategic and practical process that addressed the expectation/perception issue. Making this adjustment, however, required a return to the fundamentals of metrics and measurement, as well as the acknowledgment that when we surveyed customers, we were only guessing (or to be scientific about it, hypothesizing) about what they believed was important. However, we believed that, as long as we were guessing, we should at least be honest and thorough. Consequently, we added a second dimension—importance—to make sure that what we were evaluating was what actually mattered most to our customers.

Reverse Engineer the Value Proposition

Fortunately, the concepts of value, value propositions, and value realization are well defined in ITIL—as well as in the marketing, sales, and process improvement disciplines—and they provide an ideal foundation on which to build a service-oriented customer survey. Although a full discussion on value propositions is outside the scope of this article, the following extended definition of a value proposition provides sufficient background on the concept to support the design of a service-oriented survey:

The value proposition is the reason why customers turn to one company over another. It solves a customer problem or satisfies a customer need. Each value proposition consists of a selected bundle of products and/or services that caters to the requirements of a specific customer segment. In this sense, the value proposition is an aggregation, or bundle, of benefits that a company offers customers.

As a way to illustrate the linkage between value propositions and customer service surveys, consider the following example of a value proposition for a “collaborative service.”

We provide global collaboration and meeting services to our R&D office workers so that they can collaborate with their customers and meet, or even accelerate, the attainment of their customers' goals. We provide flexible options to connect internally or externally, delivering an easy, reliable, and repeatable process for event setup and launch. For more complex collaborative needs, we provide configurable options designed in coordination with our collaboration and meeting concierge support team. Our inventory of supporting technologies includes audio conferencing, email, instant messaging, video conferencing, video on demand, video streaming, web conferencing, internal team sites, portals, and file sharing. In summary, we are committed to getting and keeping you connected.

At the end of the design process, the stated value proposition represented our client's best thinking about what was important to their customers. It is important to acknowledge, however, that this was still the team's best guess; further refinement of the value proposition came from interaction with the customer segment. (When designed appropriately, customer surveys are excellent conversation starters.)

We provided our client with two survey questions for each key service attribute: one measuring satisfaction and one measuring importance. Based on the example above, we identified the following service attributes for the “R&D office worker” customer segment:

  • Easy
  • Reliable
  • Repeatable
  • Flexible
  • Configurable
  • Concierge supported
  • Internal connectivity
  • External connectivity
  • Broad range of supporting technology

For each key attribute, we then asked the following questions (using the “Easy” attribute as an example):

 

How satisfied are you with the ease of using our collaborative meeting service?
Scale: 0 (Not at all satisfied) to 10 (Completely satisfied)

How important to you is the ease of using our collaborative meeting service?
Scale: 0 (Not at all important) to 10 (Extremely important)
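To make the mechanics concrete, here is a minimal sketch (in Python; not from the original article) of how the satisfaction/importance question pair could be generated for every attribute in the list above. The attribute phrasings, the service name, and the helper `question_pair` are illustrative assumptions.

```python
# Illustrative sketch: generate the satisfaction/importance question pair
# for each service attribute, using the same 0-10 scale shown above.
# Attribute phrasings are adapted from the attribute list and are assumptions.

ATTRIBUTE_PHRASES = [
    "ease of using",
    "reliability of",
    "repeatability of",
    "flexibility of",
    "configurability of",
    "concierge support for",
    "internal connectivity of",
    "external connectivity of",
    "range of technologies supporting",
]

SERVICE = "our collaborative meeting service"

def question_pair(phrase: str) -> tuple[str, str]:
    """Return the satisfaction and importance questions for one attribute."""
    return (
        f"How satisfied are you with the {phrase} {SERVICE}? "
        "(0 = Not at all satisfied, 10 = Completely satisfied)",
        f"How important to you is the {phrase} {SERVICE}? "
        "(0 = Not at all important, 10 = Extremely important)",
    )

for phrase in ATTRIBUTE_PHRASES:
    for question in question_pair(phrase):
        print(question)
```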

Prioritize and Increase Customer Loyalty

For any given customer segment, not all service attributes are equally important. Plotting “importance” and “satisfaction” for each service attribute generates an easily understood framework that aids in both the planning and prioritization of improvement efforts. In the “collaborative service” example, we collected the following summary attribute data:

 

| Service Attribute | Satisfaction | Importance |
| --- | --- | --- |
| Easy | 9 | 6 |
| Reliable | 9 | 7 |
| Repeatable | 9 | 7 |
| Flexible | 8 | 4 |
| Configurable | 4 | 8 |
| Concierge supported | 3 | 9 |
| Internal connectivity | 8 | 8 |
| External connectivity | 4 | 3 |
| Broad range of supporting technology | 8 | 3 |

Although the tabulated data provides some insight, when the information is plotted on a two-dimensional grid, a far clearer story emerges.

[Figure: service attributes plotted on a two-dimensional satisfaction/importance grid]
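As a rough illustration of how such a grid might be produced, the following sketch plots the summary data from the table above. It assumes matplotlib is available; the quadrant lines at the scale midpoint are an illustrative choice, not something prescribed by the article.

```python
# Illustrative sketch: plot each service attribute by satisfaction and
# importance, using the summary data from the table above.
import matplotlib.pyplot as plt

# attribute: (satisfaction, importance), both on a 0-10 scale
attributes = {
    "Easy": (9, 6),
    "Reliable": (9, 7),
    "Repeatable": (9, 7),
    "Flexible": (8, 4),
    "Configurable": (4, 8),
    "Concierge supported": (3, 9),
    "Internal connectivity": (8, 8),
    "External connectivity": (4, 3),
    "Broad range of supporting technology": (8, 3),
}

fig, ax = plt.subplots(figsize=(7, 7))
for name, (satisfaction, importance) in attributes.items():
    ax.scatter(satisfaction, importance)
    ax.annotate(name, (satisfaction, importance),
                textcoords="offset points", xytext=(5, 5))

# Quadrant lines at the scale midpoint (an assumed cutoff) help surface
# priority candidates: high importance with low satisfaction.
ax.axvline(5, linestyle="--")
ax.axhline(5, linestyle="--")
ax.set_xlim(0, 10)
ax.set_ylim(0, 10)
ax.set_xlabel("Satisfaction (0-10)")
ax.set_ylabel("Importance (0-10)")
ax.set_title("Service attributes: satisfaction vs. importance")
plt.show()
```

In this view, attributes such as "Configurable" and "Concierge supported" (high importance, low satisfaction) stand out as the obvious improvement targets, which is exactly the prioritization story the tabulated numbers obscure.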

Although customer loyalty is traditionally an external market perspective, it has become increasingly relevant to internal service providers as well. IT organizations are now routinely asked to compare their service offerings to external market alternatives. By collecting data that focuses their service designs on customer-centric attributes, internal IT professionals will be well prepared to make their case for service improvement.

Data-Driven Strategic Partnerships

Customer surveys are often mistakenly positioned as tools for assessing the customer’s satisfaction with the service desk experience. However, a far more strategic use of surveys is evaluating the overall performance of the service provided. By integrating the customer survey with the service value proposition, the service provider and the service desk resources can work together to collect data that will enrich the overall service experience for customers.
