By Gregory Ciotti, marketing at Help Scout

There are many ways to get a quantitative look at how your customer service team is doing, but far fewer to assess the care and thought being put into your replies.

Customer satisfaction thus becomes an easy metric to spin your wheels on. But you shouldn't throw the baby out with the bathwater; for many teams, a few simple questions will do the trick.

While analytics help in a big way, even great data does not guarantee good decision-making; you need context. As the saying often attributed to Einstein goes, "Not everything that counts can be counted, and not everything that can be counted counts." In other words, you need to understand the why behind what you're tracking. Once you have this data, what insight can be pulled from it? How will it help you improve?

Here are a few important questions you should be asking.

1. Are customers' expectations being met?


Getting a good read on the overall quality of your support means collecting a large volume of feedback over an extended period of time. Keeping it lightweight is the only way to go: the easier you make it to give feedback, the more feedback you'll get.

At Help Scout, we use the Happiness Ratings built into our help desk, so customers can rate the service they received at the bottom of every reply. That lets us collect reactions across thousands of conversations and get a bird's-eye view of the delight we're delivering over a week, month, or quarter.

Calculating Happiness Ratings

We purposefully calculate our ratings like the Net Promoter Score. We take the percentage of "Great" ratings and subtract the percentage of "Not Good" ratings to get the Happiness Score.
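
For the arithmetic-minded, here's a minimal sketch of that calculation in Python. The rating labels and the list-of-strings input are illustrative assumptions, not Help Scout's actual reporting API:

    def happiness_score(ratings):
        """NPS-style Happiness Score: percent "Great" minus percent "Not Good"."""
        total = len(ratings)
        great = sum(1 for r in ratings if r == "Great")
        not_good = sum(1 for r in ratings if r == "Not Good")
        return 100 * (great - not_good) / total

    # 3 of 5 "Great" (60%) minus 1 of 5 "Not Good" (20%) = 40
    print(happiness_score(["Great", "Great", "Okay", "Not Good", "Great"]))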

One thing you need to be aware of: Happiness Ratings must be presented with plenty of context, because piling incentives onto better ratings won't correlate with better customer support. Dave Cole, product manager at Wistia, gave me this explanation:

The pushback I have about ranking team members on their happiness stats is that doing so could incentivize people to work towards the numbers in an unhealthy way.

If people want to make sure they only get "happy" ratings, they might only reply to soft emails they see in the inbox -- ones that they know will be super easy to handle. See a customer with a really challenging question? Or asking for a feature that we definitely won't build? Perhaps typing in all-caps and freaking out? No thanks! On to the next one.

Many support teams will include a "happiness section" in their monthly updates that is dedicated to individual ratings. The idea is that there is sometimes--but not always--something to be learned from the best and worst ratings you get, so it's worth the time to reflect on a few of them.

2. What has the workload been like?


Knowing your workload will help you make hiring decisions, avoid burnout, and spot opportunities to reduce "bad" contacts -- those problems with the product or process that should simply be fixed. Here are a few key metrics to track.

Total volume. Does the team need to discuss bringing on another person (or two)? Why was Monday so busy? Was there an outage or an issue with the product? It looks like far more customers were helped this month than last--why was that?

Types of questions. When you use tags to categorize and sort conversations, you'll also have data on the top tags for any given time period. Why was the Refund tag used 13% more often this month? Better get to the bottom of it (see the sketch after this list).

Types of responses. If your team uses saved replies to answer conversations, you'll see how many times a particular reply has been used. Say you notice that the Pricing: Subscriptions reply has been inserted quite a few times -- maybe you need to make the pricing options clearer on the website?
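
To make that month-over-month tag comparison concrete, here's a rough Python sketch. The conversation data shape is a made-up assumption; real help desks expose this through their own reports or APIs:

    from collections import Counter

    def tag_counts(conversations, month):
        """Tally tag usage for conversations opened in a given month."""
        tags = Counter()
        for convo in conversations:
            if convo["month"] == month:
                tags.update(convo["tags"])
        return tags

    # Hypothetical data: each conversation carries a month and its tags.
    conversations = [
        {"month": "2016-07", "tags": ["Refund"]},
        {"month": "2016-08", "tags": ["Refund", "Pricing"]},
        {"month": "2016-08", "tags": ["Refund"]},
    ]

    this_month = tag_counts(conversations, "2016-08")
    last_month = tag_counts(conversations, "2016-07")
    for tag, count in this_month.most_common():
        print(f"{tag}: {count} this month vs. {last_month[tag]} last month")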

3. What has customer activity been like?


Highlight which times are hectic and when things quiet down, and identify the trends you've seen over the long term. This helps the team know when they're most needed, and consistent trends will sway how you hire in the future.

At Help Scout, we ended up searching for a customer champion in Europe to better cover our bases with activity we were seeing in the evenings and early mornings (U.S. time).

4. How has my team been performing?


Team and user reports (available in most help desks) help you drill down into how much work -- and what kind of work -- each of your team members is doing. Use them as jumping-off points to identify red flags.

Common red flags include someone moving through conversations too fast or one person being overloaded with the difficult ones.

Discussing these numbers openly lets the team know when adjustments are needed; hearing that a teammate needs help is all the best people need.

5. What have our response times been like?


There are only so many "Sorry for the wait!" messages you can send before customers stop waiting and start getting in line for your competitors. Response times improve when this information is shared with the whole team.

What steps will be taken next to improve? Is there a certain time of day when the team lags? Why is that? What goals are you setting, and how are you staying accountable? Will a new escalation Workflow be used to keep older emails moving?

6. How likely are customers to recommend us?

The Net Promoter Score -- a loyalty measurement approach first put forward by Fred Reichheld of Bain & Company -- is a popular and useful way to gain a snapshot of how your company is perceived through customers' eyes.

As a quick refresher, NPS is built around a single question: "How likely are you to recommend XYZ Company to a friend?" Responses are most often collected through a survey that asks participants to rank their likelihood on a scale of 0 to 10.

  • Those who indicate a 9 or 10 are "promoters".
  • Those who indicate a 7 or 8 are "passives".
  • The rest are "detractors".
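
The score itself is simple subtraction. Here's a minimal sketch, assuming responses arrive as integer ratings on that 0-to-10 scale:

    def net_promoter_score(responses):
        """NPS: percent promoters (9-10) minus percent detractors (0-6)."""
        total = len(responses)
        promoters = sum(1 for r in responses if r >= 9)
        detractors = sum(1 for r in responses if r <= 6)
        return 100 * (promoters - detractors) / total

    # 3 promoters, 1 passive, 2 detractors out of 6 -> roughly 17
    print(net_promoter_score([10, 9, 8, 6, 10, 3]))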


The model is simple: we all want more promoters than detractors. But bear in mind that this score is ephemeral; it's about understanding current sentiment from a 50,000-foot view.

If your score is negative, that's a red flag that customers are dissatisfied. If your score is positive, great--keep up the good work. Either way, follow up on some of your ratings; if you don't understand why your customers rated you the way they did, you'll have no idea how to improve. Start by asking:

  1. Why the customer gave you the rating they did.
  2. What your company could do to get to a 9 or 10. Opinionated customers will have plenty to share.

7. How much 'effort' are customers expending?

Matthew Dixon, Karen Freeman, and Nicholas Toman, all from Corporate Executive Board, shook up the customer service world a few years back in a Harvard Business Review article titled Stop Trying to Delight Your Customers.

Armed with a convincing set of data, they sought to prove that extra effort spent on delight was overrated, and that true loyalty comes from reducing customer effort.

The chart below -- adapted from data featured in The Effortless Experience, a book by the same authors -- shows the disparity between effort and delight:

[Chart: effort vs. delight]

The CEB now recommends evaluating a Customer Effort Score by asking customers how easy it was to get the answer they wanted. Their seven-option survey is too heavy for day-to-day emails, though, so a lightweight approach is to edit your signature to include a single link to "Rate My Reply."

If you want to measure effort, it'd be better to ask: "How easy was it to get the help you needed today?" Three options -- Very Easy, Okay, and Not Easy -- should work.

When a customer responds with "Not Easy," you now have an opportunity to follow up and ask why: "Because I followed your documentation step-by-step and still had to contact you! It was incredibly confusing!" That's feedback you can dig into and act on.
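
The tally behind a lightweight effort survey is equally simple; this sketch (with made-up responses) surfaces the share of "Not Easy" replies worth following up on:

    from collections import Counter

    # Hypothetical "Rate My Reply" responses collected from the signature link.
    responses = ["Very Easy", "Okay", "Not Easy", "Very Easy", "Very Easy", "Not Easy"]

    counts = Counter(responses)
    total = len(responses)

    # The "Not Easy" share is the number to watch -- each one is a customer
    # worth a follow-up "why?" email.
    not_easy = 100 * counts["Not Easy"] / total
    print(f"Not Easy: {not_easy:.0f}% of {total} responses")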

Your data, your decisions

The reason you pull and analyze this data is so you can make better decisions. At the end of the day, it's not about improving your customer satisfaction metrics; it's about creating happier customers.

Published on: Aug 24, 2016