By the Numbers: When satisfaction scores go flat

January 2014, page 24
Peter Gurney

Article Abstract

Peter Gurney offers some dos and don’ts for dealing with a stubborn trend line.

Editor's note: Peter Gurney is senior director, VOC solutions, at Seattle-based NetReflector Inc.

Once a slow and expensive process, collecting feedback from customers is now quick, simple and relatively cheap with the widespread availability of online survey tools and reporting systems. Companies can collect ratings and feedback at every point of contact, including phone calls, e-mails, Web visits and point-of-sale purchases. In addition, managers can view survey results instantly instead of waiting weeks or months to find out what their customers are saying.

This information is a valuable and necessary component of any voice of the customer (VOC) program. But if you’ve been collecting survey results for a while, you’ve probably run into a situation that many organizations face: flat trend lines. Once the easy wins are behind you, any upward movement in the overall ratings becomes increasingly difficult to achieve. This wouldn’t be a problem if you could confidently say that your organization had reached a state of customer experience perfection, but in most cases, employees and managers are painfully aware that there is still plenty of improvement to be made.

The problem with flat trend lines isn’t simply that they suggest a lack of progress. It’s also that they’re boring. It’s difficult to keep stakeholders interested and motivated when they see the same scores month after month. Many customer-experience initiatives have stalled once satisfaction ratings reached a plateau.

Flat scores are actually just a sign that the VOC program needs to evolve. Various actions can be taken to push the program along, and different organizations approach the challenge in different ways. As a start, we offer a few dos and don’ts:

Do: Bring other metrics to the foreground. Satisfaction ratings (or NPS or however you’re keeping score) are not meant to be an end in themselves. They are intended to reflect customer attitudes and experiences as a means to achieving better business results. Eventually, satisfaction scores need to become less prominent as other success measures take the lead. Depending on what the goals of the program are, various operational and financial metrics may be brought forward, including complaint volumes, retention rates, new accounts, customer spend and average cost-to-serve. This doesn’t mean that satisfaction ratings disappear; they should continue to serve as an important indicator of the customer relationship. But as the Chinese proverb goes, “When the finger points at the moon, the fool looks at the finger.”

Don’t: Change the scale. Some organizations fall into the trap of blaming the messenger, assuming that a different scale or manner of asking about satisfaction will change the result. Here are some hard truths:

  • Bigger satisfaction scales don’t give you more precision. As a practical matter, all satisfaction analyses tend to break down into three buckets: negative, neutral or positive. Whether you’re using a five-point scale or a 100-point scale, you’ll still be looking at those three categories in the end. 
  • Using multidimensional indexes may not help, either. Combining and weighting several metrics, such as overall satisfaction, willingness to recommend and likelihood to repurchase, sounds scientific and gives the illusion of greater precision. Unfortunately, these formula-based indexes are seldom better predictors of business performance than simply tracking overall satisfaction.
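The three-bucket point above can be illustrated with a small sketch. The cutoffs below are purely illustrative assumptions (practices vary; some teams use top-two-box for five-point scales); the point is only that ratings from scales of any size end up in the same few categories.

```python
def bucket(score, scale_max):
    """Collapse a satisfaction rating into negative/neutral/positive.

    The 40%/80% cutoffs are hypothetical, chosen for illustration only.
    """
    frac = score / scale_max
    if frac <= 0.4:
        return "negative"
    if frac < 0.8:
        return "neutral"
    return "positive"

# A 5-point rating and a 100-point rating land in the same buckets.
print(bucket(2, 5), bucket(40, 100))   # negative negative
print(bucket(3, 5), bucket(60, 100))   # neutral neutral
print(bucket(5, 5), bucket(90, 100))   # positive positive
```

Whatever the scale, the analysis and reporting end up comparing those three groups, so a bigger scale adds response burden without adding insight.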

Do: Focus more heavily on open-ended responses. Numbers are nice because they’re easy to analyze and display. Words, on the other hand, are messy and analyzing them is labor-intensive. As a result, it is common for VOC researchers to severely limit the use of open-ended questions on their surveys. It is also common to find that the research team is sitting on a pile of unanalyzed comments, hoping they will eventually have the time to make sense of them.

Although customer comments are indeed more difficult to analyze and report on than ratings, it is often in the comments that the richest and most actionable information can be found. Companies that have hit a wall with their satisfaction ratings may want to look at redesigning their surveys to better allow customers to tell their stories in their own words. This may require additional work but it will ultimately provide more powerful and actionable information.

Don’t: Shrink the scope. Satisfaction surveys can become overly focused on the needs of a specific user group, often at the expense of providing in-depth information about the customer relationship. For example, post-transaction surveys may be used primarily for coaching and rewarding call agents and other frontline service personnel, and over time they may be shortened to exclude any questions not directly related to the customer’s interaction with the agent. But this narrowly scoped data leaves out important information about the customer’s overall experience and relationship with the company. In general, voice-of-the-customer programs should include both in-depth relationship surveys and transaction-based feedback, and the transaction feedback should capture information about the entire experience, not just the performance of the service agent.

Do: Segment the results. Rather than tracking an overall satisfaction score for the company, it is often more productive to break the scores out by relevant customer groups and monitor them separately. Different groups may have different satisfaction criteria, as well as different expected ranges of satisfaction. For example, business travelers typically give lower satisfaction ratings than leisure travelers, even though they may, on paper, appear to be more loyal to a specific hotel brand or airline. Understanding how different groups are best satisfied and what the relevant ranges of their satisfaction ratings are will allow you to focus your improvement efforts more effectively.
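As a minimal sketch of this segmentation idea, the snippet below groups hypothetical survey records (the segment names and ratings are invented for illustration) and compares per-segment averages with the single company-wide score:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (customer segment, satisfaction rating on a 5-point scale)
responses = [
    ("business", 3), ("business", 4), ("business", 3),
    ("leisure", 5), ("leisure", 4), ("leisure", 5),
]

# Group ratings by segment so each can be tracked against its own range.
by_segment = defaultdict(list)
for segment, rating in responses:
    by_segment[segment].append(rating)

for segment, ratings in sorted(by_segment.items()):
    print(f"{segment}: {mean(ratings):.2f}")

# The blended average hides the gap between the two groups.
print(f"overall: {mean(r for _, r in responses):.2f}")
```

Here the overall figure sits between the two segment averages and would stay flat even if one group improved while the other slipped, which is exactly why separate tracking is more productive.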

Don’t: Settle for “good enough.” If satisfaction ratings have reached a plateau, it may be tempting to rationalize by claiming that further improvement is unnecessary or unaffordable. But this is seldom true. Executives at companies with superior service levels, such as Nordstrom, are frequently heard to use phrases such as, “We’re still far from perfect,” “We have a long way to go” and “We’re always working at getting better.” If scores are flat, it’s time to work harder, not to relax.

Do: Recruit new stakeholders. As VOC programs mature, they often apply customer feedback in new ways to meet the needs of an expanding base of internal clients. While VOC may initially be used for service recovery, frontline coaching and satisfaction monitoring, over time the information can be systematically applied to support product innovation, process improvement, vendor relations, training and communications content, and other important organizational needs. At the same time, the VOC team may evolve from an analytical and report-generating group into an internal consulting organization, working closely with a wide range of stakeholders to help them advance their business objectives.

Continually evolve

The main point to keep in mind is that customer satisfaction and VOC programs are not meant to be static. As the organization becomes comfortable with the process of measuring and sharing customer feedback, the program must continually evolve by incorporating new measures, serving new stakeholders, and making more effective use of the information.
