Bad questions: 7 common survey design mistakes

Who knows your users best? The answer is obvious: the users themselves. And the good news is that your customers are willing to share their thoughts and opinions. No matter what your company does, users will always be able to provide feedback that can be turned into valuable insights.

One of the easiest and most common ways to get feedback is through surveys. They come in all shapes and sizes, but they share a common foundation – the right questions at the right time. That, however, is not the only rule to keep in mind when creating surveys. After all, it is very easy to collect user feedback the wrong way.

So today we will look at the most common survey design mistakes and what they can lead to.

Using only closed-ended questions

You've probably seen statement-style survey questions, for example: "When watching political programs, I feel bad," followed by a scale of answers from "Completely disagree" to "Completely agree" (a Likert scale). This is an example of a closed-ended question – the user selects one of the options offered in advance.

Such questions have clear advantages. For example, they make it easier to understand which audience mainly uses the site, how often they make purchases, and so on. In other words, closed-ended multiple-choice questions help you quantify feedback and build statistics.

But surveys in which users can write their answers in their own words are often much more effective. They provide the insights into the user's language that we are hunting for. So do not overdo it with closed-ended questions – it is better to open the survey with them and then ask something open-ended depending on the answer.

For example, "How likely would you recommend

our site to friends or colleagues?" (NPS) –> "How could we improve the site... and your rating?" (score 0-8) / "We are happy that you have made such an assessment.

What prompted you to do this the most?" (score 9-10).
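If your survey tool supports branching, this kind of follow-up logic is easy to express in code. Below is a minimal sketch in TypeScript, assuming the NPS score has already been collected; the function name and question texts are illustrative, not taken from any particular survey library.

```typescript
// Minimal sketch: pick the open-ended follow-up question based on the NPS score.
// Assumes the score is an integer from 0 to 10 collected by a previous question.
function followUpFor(npsScore: number): string {
  if (!Number.isInteger(npsScore) || npsScore < 0 || npsScore > 10) {
    throw new Error(`NPS score must be an integer from 0 to 10, got ${npsScore}`);
  }
  if (npsScore <= 8) {
    // Detractors and passives: ask what would improve the product (and the rating).
    return "How could we improve the site... and your rating?";
  }
  // Promoters (9-10): ask what drove the high score.
  return "We are glad you gave us such a high score. What influenced it the most?";
}

// Usage example
console.log(followUpFor(6));  // improvement question
console.log(followUpFor(10)); // promoter question
```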

Too many questions in one survey

Don't ask more than 3-5 questions in one survey. Also, don't mislead users – show them how much is left until the end. Short surveys have much better completion rates and, most importantly, more objective answers. It is very difficult to answer questions honestly for 30 minutes and stay engaged until the very end.

Ambiguity

It is easy to make questions ambiguous. It happens when the user could give more than one answer at once, or when there is almost no difference between the answer options.

In addition, a question should be about one subject only – do not pack several into one. "Do you drink coffee in the morning, and what do you think about instant drinks?" – split it into two consecutive questions. And forget about ending a question with "...and why?". That is definitely a bad idea.

Then there are trap words, when it is impossible to tell what the question actually means. Take the pronoun "you" in "How much money do you spend on household appliances in a year?" Who is "you" here – me or my family? Other examples of traps are words like "all," "what," "nobody," "everyone," and "never."

And, of course, always check the precision of the wording. Take "What computer are you working on?" It invites the counter-question "At home or in the office?" After all, even watching videos on YouTube is working on a computer.

Ambiguity is the hardest thing to deal with because we always try to combine several concepts into one, and digging into the details feels daunting – subconsciously, we want feedback of a more general nature. Even experts make such mistakes sometimes, and that is okay. Practice and train your attention, and you will quickly learn to spot inconsistencies and ambiguities.

Questions about the distant future

"I don't even know what will happen to me tomorrow! And you're inviting me for a trip in six months," ― does this sound familiar? So users perceive questions about the distant future in almost the same way. Moreover, the answers to questions like: "Are you planning to purchase a car?" – will never be objective. Planning does not mean buying.

The same goes for questions about new features. You can't just ask users what they want. Everything is much more complicated, and you will have to work hard.

We have noticed that if a product lets you ask a question to a large audience and collect at least 500 insights, you can always find intersections among user wishes.

If you study the sample in detail, it often turns out that 30-70% of respondents say the same thing – perhaps in different words, sometimes even proposing opposite solutions. But the main thing is that their wants stem from the same problems.

As a researcher or UX specialist, your task is to get to the bottom of that problem. Remember that users talking about the future most often generate wants; you have to learn how to turn them into needs. That is the basis of the Voice of the Customer methodology. So when asking about new features, analyze the answers in depth, and don't just rush to dump a million tasks into the developers' backlog.
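To make the "intersections" idea concrete, here is a minimal sketch of tallying how often the same underlying problem shows up across open-ended answers. It assumes each response has already been manually coded with one or more problem themes; the type, function, and sample data are illustrative assumptions, not part of any specific methodology or tool.

```typescript
// Minimal sketch: count how many respondents mention each underlying problem.
// Assumes open-ended answers have already been coded into theme tags.
type TaggedResponse = { respondentId: string; themes: string[] };

function themeFrequency(responses: TaggedResponse[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const response of responses) {
    // Count each theme at most once per respondent.
    for (const theme of new Set(response.themes)) {
      counts.set(theme, (counts.get(theme) ?? 0) + 1);
    }
  }
  return counts;
}

// Usage example: differently worded "wants" that point to the same problem.
const responses: TaggedResponse[] = [
  { respondentId: "u1", themes: ["slow checkout"] },
  { respondentId: "u2", themes: ["slow checkout", "confusing navigation"] },
  { respondentId: "u3", themes: ["slow checkout"] },
];
console.log(themeFrequency(responses));
// Map { "slow checkout" => 3, "confusing navigation" => 1 }
```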

Biased questions or answers

Design questions and answers to get the most honest feedback, not just what you want to hear.

An example of a biased question: "Don't you think that a separate page for registration would be more convenient?" The question itself already nudges the respondent toward a particular opinion. Hmm, yes, perhaps it would be more convenient!

If you instead ask, in sequence, "What do you dislike about the registration process?" and then "Could a separate page resolve your dissatisfaction?", you will not only get a ready-made A/B test but also learn, for example, which features should be carried over from the old registration page to the new one.

Incomprehensible language

Avoid slang and technical terms whenever possible (unless you have explained them as clearly as possible). That will make it much easier for new or inexperienced users to get on the same page with you.

Following patterns and unwillingness to experiment

All the tips in this article are general recommendations that will help you create high-quality surveys. But following them blindly would be wrong. Working with feedback is a constant experiment. Compare the effectiveness of questions, come up with new ones, and try different formats for collecting feedback. This way, you will get even more useful insights.

In general, surveys are a living tool. There are often situations when lots of users write in their answers and we notice many overlapping responses. In that case, it is better to pause the feedback collection and launch a new survey that includes the most common responses as answer options.