
We’ve already talked a lot about best practices for creating online surveys (check out our Cheat Sheet on the subject!). Now we want to talk to you about practices you should avoid – mistakes that damage the effectiveness of your surveys. To avoid adding bias to data and to maximise your conversion rate, follow our advice below!

1. Bombarding customers 

First bit of advice: don’t send surveys to your customers too often! When multiple surveys are sent in a short period of time, responses tend to be less detailed and more negative starting with the second satisfaction survey sent. The closer together the survey requests are, the more apparent this trend becomes.

So, be careful with the marketing pressure you apply! While it’s important to regularly track your customers’ satisfaction, make sure not to overdo it. Find the right balance! A guideline you can follow is that a customer should not receive more than one survey per quarter. More than that and your marketing pressure could have negative effects on the data collected, as well as on the image of your brand. Nobody likes spammers.

2. Asking identification questions at the start of a survey

Yes, we recommend that you always authenticate the opinions you collect and that you gather personal information (age, sex, location, etc.) on respondents. This allows you to use these data for targeted marketing campaigns.

However, never ask for this personal information at the start of a survey! First off, the respondent will think that the survey is designed solely to collect their personal data and that their opinion doesn’t matter much to you after all. Additionally, asking these questions first will not set the right tone for customers to interact with your brand and they will be less likely to respond. 

It’s better to start with questions that are less personal for the respondent, tackling a chosen satisfaction criterion (for example, customer service, choice, price, etc.). Identification questions, which are more personal, should be asked at the end of the survey. By that point, the respondent will be sufficiently engaged in the process and there will be less risk of them not finishing.

3. Using a complicated, technical vocabulary

To get accurate, useful data, it is essential to make your questions clear and understandable for your customers. Questions should not contain terms that are too technical or complicated grammatical structures that might lead to confusion. 

Also, consider defining any concepts or vocabulary that your respondents might not be familiar with. For example: ‘Expressimo is our new service that delivers your groceries to your door for free. Would you be interested in using this service?’ 

Respondents may misunderstand ambiguous or non-explicit questions, leading the collected responses to be biased. Consider testing your survey in-house before distributing it. A clear survey will get more responses and those responses will be more reliable!

4. Asking questions that influence the response

You want to keep a neutral tone when you write your questions. Here’s an example of a leading question: ‘80% of our customers are satisfied with Expressimo. What rating do you give this service?’ By writing the question like this, we are pushing the respondent to give a good rating.

While it’s always nice to get a good rating (that’s another purpose behind surveys!), identifying an unsatisfied customer is a real opportunity for your business! The goal of a survey is to get accurate, useful data. So, formulate your questions in the most neutral way possible so as not to influence respondents.


DOWNLOAD THE CHECKLIST: Become an expert at creating effective online surveys



5. Distributing a survey that’s not adapted for mobile devices

According to a study conducted by NP6, around 50% of French people read their emails on mobile devices. That’s around one out of every two customers who may be opening your satisfaction survey on their smartphone. With content not adapted to mobile devices (size of font and buttons, layout, etc.), customers will find the experience to be off-putting and will be tempted to give up easily.

With more people checking emails on mobile devices, it has become crucial to adapt your survey designs to these interfaces. Go for easy reading and a simple design with clearly marked action items. It’s 2017: ‘responsive web design’ is no longer optional. It’s a necessity.

6. Asking more than 5 questions per survey

This is definitely one of the most important pieces of advice: do not overestimate the time that your respondents are willing to give you. At Critizr, we advise asking no more than five questions. The shorter your survey is, the more responses you will get. Furthermore, a limited number of questions helps maintain respondents’ interest and reduces the risk of them answering randomly or giving up.

Clearly define the objectives of your survey. Your questions must be based on your objectives. If you have a lot of questions to ask, perhaps you haven’t narrowed the theme of your survey enough. In this case, it may be a good idea to split it into two different surveys or to consolidate your questions. This will definitely make a difference in the response rate.

7. Asking for too much effort from respondents

Besides time, you have to be careful to not ask your customers to make too great of a mental effort. The questions in your survey must target a specific, concrete subject. 

Similarly, multiple-choice questions must have easy-to-understand answer options. At Critizr, we recommend very visual rating systems (like stars or emoticons) that customers can interpret at a glance.





8. Asking multiple questions in one 

This seems obvious, yet it’s a common mistake in surveys. Out of concern for reducing the amount of time the survey takes, it may be tempting to evaluate multiple criteria in a single question.

However, when you ask a question like ‘Was the salesperson friendly and did they provide good advice?’, you make it very difficult to interpret the results. The customer may be satisfied with the way they were treated by the salesperson, but not the advice given, or vice versa. You will have no way of knowing! This type of question produces unusable data, so limit your question to a single criterion.

9. Always using required questions

Don’t force your customers to respond to (all of) your questions! If your respondent is taking a little of their time to answer your questions, you can let them skip certain ones. It’s up to you to decide which data are not essential for you and, in this way, to choose which questions will be optional.

Generally speaking, it’s better not to require an answer for an open-response question: customers do not necessarily have anything to put into words, and it would be unfortunate for them to abandon the survey or, worse, just write whatever. Consider making certain questions optional to increase the conversion rate of your surveys and limit the risk of bias or of respondents giving up.

10. Offering non-specific responses 

When writing your multiple-choice questions, you have to offer specific answers. Let’s take, for example, the question ‘How frequently do you use our Expressimo service?’, with the following responses: very often, often, rarely, very rarely. 

Each customer will have a very subjective idea of what ‘often’ or ‘rarely’ means. As such, it will be difficult to discern anything when interpreting the results. 

To make data analysis easier, you should guide the respondent by offering specific ranges of values. For example: every day, once per week, once per month, etc. With a high enough level of precision, the responses you get will be much more useful.

So, there you have the ten mistakes that we frequently find in customer satisfaction surveys. Try to break these bad habits to maximise your conversion rates and make analysis easy. As for the practices recommended by Critizr, check out our Cheat Sheet called How to create an online satisfaction survey?
