A survey is a list of questions aimed at extracting specific data from a particular group of people, known as a sample (as Wikipedia defines it). With a representative sample, one that mirrors the larger user base (or population of interest), you can understand the characteristics, preferences and attitudes of your users.
Conversion optimization is the systematic process of getting more users to take a specific desired action. Understanding user behavior and attitudes is at the center of any successful conversion optimization program.
Survey research is one of the most versatile tools in an optimizer’s toolbox for understanding users, alongside usability testing, customer interviews and A/B testing. Good sample selection is key: it allows you to generalize findings from the sample to the population, which is the whole purpose of survey research.
Surveys are typically an easy tool to start with. In fact, survey research is the one research method most people study fairly extensively in school, at least the statistical analysis behind it. So everyone comes equipped with a certain level of survey experience, even fresh out of school!
When to use surveys
Surveys are great for generating hypotheses.
They add a qualitative layer to your quantitative data analysis, i.e. the why behind the what. For example, you may see a high bounce rate on your homepage and have many theories about what is causing it. Popping a visitor survey on that page can help identify what visitors are looking for and how you can improve the page, narrowing your focus to the top 2–3 theories or hypotheses.
Surveys serve a lot of different use cases. You can survey anyone — your own customers, your prospects or website visitors, competitors’ customers and pretty much any other group of interest. You can survey to understand your customers, gauge their attitudes towards your brand, stay aware of the competition and evaluate new products. From a UX standpoint, surveys can be run to understand the end user, inform the direction of a design or assess a live website.
Surveys can also be placed and customized to ask questions at every stage of the funnel.
Surveys are typically cheap and can be done on a shoestring budget. Tools such as SurveyMonkey, Typeform and Google Surveys are either free or have really affordable pricing tiers. There is usually no additional cost if you have your own list to survey. However, if you are a start-up or are testing a new concept, tools like SurveyMonkey have a panel of users you can screen to find your target audience and survey them.
Surveys can help confirm A/B test results.
If you are A/B testing a few headlines, you can add another layer of analysis by surveying users to see which headlines resonate with them. This helps provide additional credibility to your test results. Additionally, you can collect demographics and other data that can provide a deeper understanding of the segments and the messaging that’s resonating with them.
Surveys can replace A/B testing.
Although surveys can’t truly replace A/B testing, there are times when organizations are not in a position to do much else. Perhaps the most common reason to optimize through surveys is that the website simply doesn’t get enough traffic, which means an A/B test would have to run for months to reach the required sample size and detect a lift. Besides risking sample pollution, this greatly slows down the optimization process. In such cases, surveys can be a decent alternative that guides optimization efforts and provides faster results.
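To see why low traffic stretches test duration, here is a rough back-of-the-envelope sketch using the standard two-proportion sample size formula (the baseline rate, lift and traffic figures below are purely hypothetical):

```python
import math

def ab_sample_size(baseline, rel_lift):
    """Approximate visitors needed per variant for a two-sided z-test
    at 95% confidence and 80% power."""
    p1 = baseline
    p2 = baseline * (1 + rel_lift)
    z_alpha = 1.96    # two-sided test, alpha = 0.05
    z_beta = 0.8416   # power = 0.80
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Hypothetical low-traffic site: 3% baseline conversion, hoping for a 20% lift
n = ab_sample_size(0.03, 0.20)
daily_visitors = 500                      # assumed traffic, split across two variants
days = math.ceil(2 * n / daily_visitors)
print(n, days)
```

With these assumed numbers, each variant needs roughly 14,000 visitors, so at 500 visitors a day the test would run close to two months — exactly the situation where a survey can deliver insights much faster.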
Many organizations have limited developer resources to set up A/B tests. Again, surveys are a good way to keep generating insights.
When not to survey
- Surveys are not the most effective when you have a lot of open-ended questions and the research is exploratory in nature. In that case, customer interviews may be more effective, since they allow you to probe deeper into a customer’s responses.
- Surveys are not the most effective when observation is important, for example when you want to understand how users find a product on your website or where they get stuck in the checkout or sign-up flow. In such cases, usability testing is helpful because it allows you to observe users carrying out a task.
- Surveys require a minimum sample size, which depends on the size of the population. If that requirement isn’t met, survey data may not be statistically reliable. However, this is not a stringent rule. Firstly, if the population itself is really small, a representative sample of even a few customers may still provide helpful input. Secondly, users may simply prefer the anonymity that surveys provide.
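As a rough guide, the commonly used 95%-confidence sample size formula with a finite population correction can be sketched like this (the population figures below are hypothetical):

```python
import math

def survey_sample_size(population, margin=0.05, p=0.5):
    """Responses needed at 95% confidence, applying the
    finite population correction. p = 0.5 is the most
    conservative assumption about response variability."""
    z = 1.96                                    # 95% confidence level
    n0 = z ** 2 * p * (1 - p) / margin ** 2     # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))

for pop in (200, 1000, 100000):
    print(pop, survey_sample_size(pop))
```

Note how the requirement shrinks with the population: a population of 1,000 needs roughly 278 responses, while 200 people need only about 132 — which supports the point that even small populations can be surveyed meaningfully.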
Tips for running successful surveys
According to Dr. Rob Balon’s Survey Design Theory course on CXL, here are some tips to get the most out of surveys:
- Avoid using jargon in the questions and focus on clarity.
- Do not ask leading questions.
- Use intuitive scales, where the highest number represents the most positive response (1 = least, 5 = best).
- Limit the survey to as few questions as possible. Fatigue sets in during longer surveys, leading to the error of central tendency: beyond a certain question, answers start converging to the mean. That convergence signals a good stopping point!
- Always explain your organization and services. Do not assume product knowledge, since any confusion in the survey can lead to unusable responses.
- Be aware of the following cognitive biases that can creep into survey design and analysis:
  - Presenting the answer options of a closed-ended question in a fixed order can lead respondents to select the first option more often.
  - Presenting the results based on what clients or stakeholders want to hear.
  - Clients may only remember what they agree with; push them to be more rigorous and take in all the important takeaways.