Customer satisfaction survey — everything you need to know
This article began with a question from a member of one of the business groups shown in the main graphic.
The customer must be satisfied: suppliers of products and services repeat this mantra to one another, and it is the feedback they expect from their customers. But is it really so? Do customers “have to” be satisfied? Should we even ask about satisfaction? And if so, then:
- What to ask?
- When to ask?
- Who to ask?
What do you want to know? Customer satisfaction survey
User satisfaction is a bit like show business: if nobody is talking about us, the show didn't work out. In general, everyone:
- marketing,
- sales,
- service,
- production, etc.
Everyone wants to hear that customers are satisfied.
From the UX side, I can say:
“I prefer a customer who says it's okay, but if you do something, it will be even better.”
But what do customer satisfaction surveys actually give us?
By asking empathetic questions and lending the user a kind ear, we gain the customer's trust and loyalty. We like to have an opinion, and we like to be heard. Clicking stars or emoticons doesn't give us that feeling. Because what if the rating isn't good...
I'm not an enemy of this method, but... what is NPS?
I have to admit: I very often question the NPS indicator as a measure of customer satisfaction. Net Promoter Score (NPS) boils down to one standard question: “Would you recommend us to your friends?” As you can see, it does not ask about satisfaction with contact with the brand itself; it examines the willingness to recommend, and in a very narrow way.
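For context, the NPS score is conventionally computed from answers on a 0-10 scale as the percentage of promoters (9-10) minus the percentage of detractors (0-6). Here is a minimal sketch; the scale and cutoffs are the standard NPS definition rather than anything specific to this article, and the sample data is hypothetical:

```python
def nps(scores: list[int]) -> float:
    """Standard NPS: percentage of promoters (answers 9-10) minus
    percentage of detractors (answers 0-6) on a 0-10 scale."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical responses: 5 promoters, 2 detractors out of 10 -> 30.0
print(nps([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))
```

Note that the result is a single number between -100 and 100: it says nothing about why customers would or wouldn't recommend you, which is exactly the narrowness in question.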
Satisfied customers or trying to please us?
Focusing on NPS narrows our business point of view: instead of examining the customer's mood, it can capture their attempt “not to hurt our feelings”. Besides, the question leaves little leeway in answering, and that always pushes us toward the middle of the scale.
Remember! Users avoid choosing extreme answers!
The answer is not in the middle, or how to count
Since we already know this, it is also worth knowing that in scale-based questionnaires extremely good and extremely bad answers occur less often, because they require extreme emotions.
Users are more than 50% more likely to give an extreme rating after a bad experience than after a good one. We talked about this phenomenon in an episode of our Design and Business podcast.
Sample questions like these can sink the effectiveness of a customer satisfaction survey
With an average of 4.2, it may turn out that good ratings were “spoiled” by a narrow group of extremely dissatisfied customers. Or that we have the same number of very good and very bad ratings.
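A quick illustration of this pitfall, using hypothetical 1-5 ratings: two very different groups of answers produce exactly the same 4.2 average, so the mean alone cannot distinguish a consistently satisfied audience from a polarized one.

```python
from statistics import mean, median

# Hypothetical ratings on a 1-5 scale; both groups average exactly 4.2.
consistent = [4, 4, 4, 4, 4, 4, 4, 4, 5, 5]  # everyone roughly satisfied
polarized = [5, 5, 5, 5, 5, 5, 5, 5, 1, 1]   # good ratings "spoiled" by a few very unhappy customers

for name, ratings in [("consistent", consistent), ("polarized", polarized)]:
    print(name, "mean:", mean(ratings), "median:", median(ratings),
          "share of 1s:", ratings.count(1) / len(ratings))
```

Looking at the median and at the share of extreme answers, not just the mean, reveals the difference immediately.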
CES Customer Satisfaction Survey
That's why I prefer CES, which is based on the interval (median) of responses rather than a simple average. Customer Effort Score focuses on capturing the bigger picture.
It asks: how easy was it to use the service or perform a task with your products?
The answers are scored on a scale of 1-7. The result is reported in the range 0-100, with answers counted in groups. For example, if 65 out of 100 customers rate the quality of service as 5, 6 or 7 on the 7-point scale, the CES score is 65. In addition, by tracking CES over time, we can observe how changes or iterations affect customer satisfaction: what raises the indicator and what lowers it.
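Here is a minimal sketch of the counting rule just described, reproducing the worked example above (65 of 100 customers answering 5, 6 or 7 yields a score of 65); the function name and the sample distribution are illustrative assumptions:

```python
def ces(scores: list[int], threshold: int = 5) -> float:
    """CES as described above: the share (0-100) of respondents
    who answered `threshold` or higher on the 1-7 scale."""
    if not scores:
        raise ValueError("no responses")
    easy = sum(1 for s in scores if s >= threshold)
    return 100 * easy / len(scores)

# 65 of 100 customers answering 5, 6 or 7 gives a CES of 65.0.
responses = [7] * 30 + [6] * 20 + [5] * 15 + [4] * 20 + [2] * 15
print(ces(responses))  # 65.0
```

Tracked release after release, the same calculation makes it easy to see whether a change raised or lowered the indicator.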
Hint: maybe you will learn more if CES becomes your customer satisfaction survey?
How to ask about customer satisfaction at all?
A good product is simple to use. The same should be true of customer satisfaction surveys. Questions about the quality of service and how it was received should go not only to satisfied customers but also to dissatisfied ones. So let's be honest with the user and with ourselves.
A few simple rules for building a customer satisfaction survey
1. Don't ask questions that you can't answer yourself
It may seem strange, but there are surveys that we abandon because we do not understand what they ask us.
One rule reigns here: the simpler, the safer.
If we want to know what the customer is happy with, let's ask directly, in their words.
Anti-examples of satisfaction questions:
Have I really “completed the online purchase verification process”? Shouldn't I simply be asked whether I felt safe paying my bill online?
Also, avoid industry jargon: “Was the design of our website intuitive to you?”
Not every customer is a UX specialist!
Instead, ask: “Did you easily and conveniently make a purchase on our site?”.
2. Do not suggest a ready-made answer with your question
As users, we are sensitive to tricks reminiscent of the famous joke from communist-era Poland (PRL): “Who is your role model, and why is it Stalin?” Suggesting an answer in the question not only makes it difficult to get an honest opinion, it also spoils the research. Do not try to influence the result. Don't focus on what you would like to hear.
Anti-example of suggesting an answer:
The popular “What did you like about our website?” followed by a series of answers to choose from. I hardly expect to find the option “Nothing” among them. And yet customers may hold exactly that opinion.
Instead, let's ask: “What are we doing really well?” This gives clear information about the product, the perception of your company, and its strengths. It can also test whether new features or elements arouse the expected enthusiasm.
3. Survey customer satisfaction briefly and on topic
Questions that are too long are the bane of surveys.
Why?
- they display poorly on mobile devices (which are already replacing our computers),
- they cause respondent fatigue: if I don't have the patience to read a question to the end, I won't answer it. The more long questions there are, the smaller the chance that the customer will rate your brand at all. And they will remember it well.
4. People also don't have time for long surveys
If you promise a user that the survey will take 3 minutes, keep that promise. It inspires confidence.
For your questionnaire, I would advise three or four questions. This gives you the best chance of collecting a complete set of answers (fewer people will drop out halfway), which translates into a more reliable result.
Very bad anti-example:
I must mention here the famous customer satisfaction questionnaire in one of Bethesda's games. A player cancelling their subscription could explain their reasons to the company... across 48 questions. I don't know anyone who got further than question 12.
If you plan longer surveys, think carefully about the purpose of the questions and how they will help assess customer satisfaction.
A good survey resembles a short story that we read to the end out of curiosity.
5. Give a chance for a precise answer
Questions with too few answer options produce substandard, that is, unreliable data. Don't avoid choices like “I don't know” or “I don't have an opinion”, or the option for the user to add their own answer.
Anti-example:
I like this one the most: “Do you like using our online store?” with the answers Yes/No. If the user is to evaluate your website or the quality of your services, let them expand on their thoughts.
Never mind that we use an online store mostly to shop, not because we like going there to look around...
6. Don't ask about everything, because then you won't learn anything
Focus on what's most important to your business or what you don't know. Don't treat customer satisfaction research like writing an encyclopedia.
- Not sure about the buying process? Ask about the basket.
- Have you implemented a new website design? Ask if the user finds all the information they need.
It is worth asking about the factors we have not yet had a chance to understand well.
7. Do not exclude users
In a broad survey, avoid questions like: “How do you rate your complexion after 6 months of using our product?” (unless the survey goes only to a carefully selected group of customers). What if I have only been using it for a month?
When to ask a question about customer satisfaction?
When looking for knowledge about customer satisfaction, it is worth looking broadly and considering at what point the user actually had the opportunity to reflect on their experience with the product.
Anti-example:
We often face surveys shown, for example, right after purchase but before delivery, when I do not yet know what I think about the store (because I don't have the goods yet). A survey on the homepage, a few seconds after opening it, also gives me no chance to evaluate the experience.
When is the best time to investigate customer satisfaction?
In e-commerce, for example, it is worth asking customers to fill out a survey after a successful purchase, or upon (or soon after) confirmed delivery of the product. When introducing new features, you can, as Miro does, ask for feedback a moment after the first use: ask how it went and whether the user likes the new solution.
For loyal users, a survey can be “sneaked” into summaries: Spotify, for example, sums up a whole year of music and asks about favorite moments with the application. This lets the user reflect on the whole experience.
The Zeigarnik effect, or why do we like to close cases?
In a nutshell: Tasks that we could not complete stay with us longer than those that were successful.
This phenomenon was studied as early as the beginning of the 20th century. That is why showing a progress bar or dividing processes into steps engages the user so well in the action.
How to use this effect in research?
- Ask the first question in the email or in the survey window on the website, and invite the user to complete the rest after clicking a link.
- Suggest that since they found time to answer the first question, the next ones will take only a moment.
Are you satisfied? Start asking!
As you can see, a customer satisfaction survey can be planned by learning good practices from other industries. You can also learn from competitors' anti-examples. The most important thing, from the point of view of UX/UI professionals, is to design a survey that remembers there is another person on the other side.