On-Site Survey Biases (and what to do about them)

I'm a big fan of on-site surveys (the anonymous surveys, like Qualaroo's, that you find on many websites).

Visit Intent and Task completion. The bread and butter of CRO.

They rank high among the most useful tools that Conversion Optimisers have, mainly because they are an endless source of insights that help us understand the context behind the clickstream data. Very often these insights end up in a hypothesis and in a winning experiment. But... it's truly difficult to convince web owners to use them.

Why is it so hard to convince our colleagues and clients to use them?

Every time we suggest launching an "On-site Survey" we are met with concerns regarding accuracy, sampling, low response rates and lack of representativeness.

The answer to these concerns is simple: they are right. On-site surveys have low response rates, but that's not the worst part (2% of 10,000 visits is still a lot of responses to analyse). The problem is that the sample is biased due to a combination of Voluntary Response Bias and Non-Response Bias; therefore, it will not be representative of our website traffic.

Don't close your Qualaroo or WebEngage accounts yet.  

Yes, these biases exist, but on-site surveys still work very well for CRO when used properly. Knowing these biases allows us to design the best survey to overcome them, and to gather and analyse the data in a way that will not mislead us.

Let's go for it.

Non-Response & Voluntary Response Bias

The #1 bias in on-site surveys (and in most online surveys) is Non-Response Bias. Long story short: most of the time the users who answer our surveys are not a representative sample of our studied population (all website visitors).

Dilbert wisdom

This is not a problem of volume or numbers. The bias stems from the composition of these individuals: they are not a representation of all the traffic (for example, they won't include visits that bounce or leave quickly in the same proportion as those who stay on the site).

Let's see an example. 

Have you ever asked about Visit Intent on your website?

In my experience, across dozens of surveys where we asked this question, at least 15% of the responses claim a "Purchase" intent.

In this example from The Little Saint Store (great handmade geeky tie clips and handcuffs) it's 24%, based on a sample of 62 users out of 798 who saw the survey (notice that total visits during the same period were much higher, but due to the triggering rules that launch the survey, not every visit saw it).

24% of the responses claim to visit to make a purchase

That ~8% response rate (62 answers out of 798) is calculated only over the users who SAW the survey, and the population of "website visitors" was much bigger still. Therefore we can't infer that these results apply to all the website traffic.

The mistake is to report this survey as "24% of the visits come to purchase, so we have plenty of room for improvement in Conversion Rate".

This 24% only applies to the users who answered the survey, which is not a representative sample of overall traffic.
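
To make the arithmetic concrete, here is a minimal sketch of the numbers above. The purchase-intent count and the total-visits figure are illustrative assumptions (the example only gives the 62 answers, the 798 impressions and the 24%):

```python
# Back-of-the-envelope numbers for The Little Saint Store example above.
# 15 purchase-intent answers (~24% of 62) and the 5,000 total visits are
# illustrative assumptions; the example only gives 62, 798 and 24%.
saw_survey = 798            # visits that were shown the survey
answered = 62               # visits that answered it
purchase_intent = 15        # respondents claiming a purchase intent

response_rate = answered / saw_survey                 # ~7.8% -> the ~8% above
share_of_respondents = purchase_intent / answered     # ~24% -> the headline figure

total_visits = 5_000        # hypothetical: real traffic was much bigger
share_of_all_visits = purchase_intent / total_visits  # ~0.3% -- the only intents we actually observed

print(f"Response rate among those who saw it:  {response_rate:.1%}")
print(f"Purchase intent among respondents:     {share_of_respondents:.1%}")
print(f"Purchase-intent answers vs all visits: {share_of_all_visits:.1%}")
```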

But there's still value in asking about Visit Intent.

  1. The main reason to do so is to find out the Task Completion rate by Visit Intent. Even a biased sample will allow us to detect which user journeys need our attention (see the sketch after this list).
  2. Visit Intent also gives us valuable directional data. If, after 100 responses, only 1% of users had replied "Purchase", this would lead us to further investigate a possible brand positioning issue.
  3. We will find other visit intents that we are not contemplating in our CRO strategy.
  4. We can compare Visit Intent across similar segments, e.g. traffic medium, landing page or campaign.
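
For point 1, a minimal sketch of what that cross-tab could look like, assuming the responses have been exported from the survey tool to a CSV; the file name and the visit_intent / task_completed columns are hypothetical:

```python
import pandas as pd

# Hypothetical export from the survey tool: one row per respondent, with the
# stated visit intent and a 0/1 flag for "did you complete what you came for?".
responses = pd.read_csv("survey_responses.csv")   # columns: visit_intent, task_completed

# Task completion rate per stated intent -- valid for respondents only,
# never to be extrapolated to all traffic.
completion_by_intent = (
    responses.groupby("visit_intent")["task_completed"]
    .agg(responses="count", completion_rate="mean")
    .sort_values("completion_rate")
)
print(completion_by_intent)
```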

Voluntary Response Bias

When we use on-site surveys like 4Q, our users "choose" whether to participate or not; therefore we are only analysing results from those subjects who "wanted" to reply.

Normally these users are the ones who have strong opinions about us or our website, both positive and negative. The same happens with feedback-gathering tools. This self-selection bias is known as Voluntary Response Bias.

A poll like this on a website called "STOPNRA.org" is a clear example of Voluntary Response Bias

The secret of On-site Surveys

The secret to succeeding with on-site surveys is to use them to gather insights, not to validate hypotheses or take measurements. We should not use on-site surveys to quantify a behaviour. For that we have much better tools based on a "census" (not samples), like Google Analytics or SiteCatalyst (yes, there's a small % of users who won't be tracked by the web analytics tool, but we are talking about 95%+ coverage in most cases).

On-site surveys are great for the "discovery" and research phases of the Conversion Rate Optimisation process. Tools like 4Q are excellent for gathering insights from these non-representative samples. They will shed some light on the big "Known Unknowns" and often on the "Unknown Unknowns".

Donald Rumsfeld's famous quote

Once we gather a known or unknown insight (a comment, a complaint, a new visit intent, etc.) that shows up frequently in our responses, the next step is to validate it with our analytics tool and form a hypothesis.

If we find data backing up this hypothesis, we can design an experiment to validate it. The experiment (as opposed to observation) allows us to infer causation.

Example:

We launch an exit survey in our ecommerce checkout page asking "Why are you leaving the checkout?".

Only 0.5% of users respond, and we know it's not a representative sample. But after 100 answers we identify several responses complaining that international shipping is not available. Our hypothesis is that international users may be leaving for this reason, so we check our abandonment rate for foreign users in Google Analytics. It looks like abandonment is higher (hold on, this is only correlation so far). An experiment will tell us whether offering international shipping decreases abandonment at this stage.
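
As a quick sanity check on that correlation (before investing in the experiment), here is a minimal sketch of a two-proportion z-test on abandonment rates pulled from Google Analytics. All the counts below are illustrative assumptions, not figures from the post:

```python
from math import sqrt, erfc

# Checkout sessions and abandonments by segment (illustrative numbers).
intl_abandons, intl_sessions = 420, 500          # international: 84% abandonment
dom_abandons, dom_sessions = 5_600, 8_000        # domestic:      70% abandonment

p_intl = intl_abandons / intl_sessions
p_dom = dom_abandons / dom_sessions

# Two-proportion z-test: is the gap bigger than what sampling noise would give?
p_pool = (intl_abandons + dom_abandons) / (intl_sessions + dom_sessions)
se = sqrt(p_pool * (1 - p_pool) * (1 / intl_sessions + 1 / dom_sessions))
z = (p_intl - p_dom) / se
p_value = erfc(abs(z) / sqrt(2))   # two-sided p-value, normal approximation

print(f"International {p_intl:.1%} vs domestic {p_dom:.1%}: z = {z:.2f}, p = {p_value:.4f}")
# Even a tiny p-value only confirms the correlation; the experiment with
# international shipping enabled is what lets us claim causation.
```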

Having experiments to validate hypotheses allows us to work with non-representative samples, as we will validate our hypotheses at a later stage.

Tips for succeeding with On-Site surveys

If I had asked people what they wanted, they would have said faster horses
— Henry Ford
  1. Keep them short. One or two questions maximum. This will reduce the non-response bias (but won't solve the voluntary response bias).
  2. Always add open fields. Always! If we give a closed set of options it's likely that we miss something, and we'd be falling into a new bias known as Design Bias.
  3. Segment. This will allow us to avoid the "overall population" discussion. By asking returning visitors only, we increase the chances of having a more representative sample of the population "returning visits". Another great segment to start with is "purchasing visits": a one-question survey on the confirmation page asking how the buying experience was.
  4. Never ask about the future or "what would you do". Remember Henry Ford's quote on "faster horses". People don't want to lie, but if you ask me what I want... honestly, I really don't know. To find out what they actually DO, use your analytics tool.
  5. Ask them WHY they do it, but still don't take it at face value. Again, we are not always conscious of what drives our decisions, as we saw in posts like Dopamine and Motivation. Ask WHY, but analyse these answers and try to understand the underlying motivations.
  6. Don't worry about low response rates, but try to get 100 responses before analysing the data (a sketch for tallying recurring themes once you get there follows this list).
  7. Go easy on the reporting of the results. Remember: these surveys are great for "discovery", not for assessing or measuring. Never ever use a survey result % in a report, inferring that the results apply to the overall population. Never say "24% of the users visit us to purchase" but rather "The survey showed a larger research intent among the respondents".
  8. Avoid questions about the user's opinion. We want to discover insights. It's better to ask "Did you have any problem during your visit?" than "What do you think of our website?".
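
For points 2 and 6, here is a minimal sketch of how the open-field answers could be tallied once ~100 responses are in; the CSV file, the "answer" column and the keyword lists are all hypothetical:

```python
import csv
import re
from collections import Counter

# Rough theme buckets to look for in the open-text answers.
THEMES = {
    "shipping": {"shipping", "ship", "delivery", "international"},
    "price":    {"price", "expensive", "cost", "fees"},
    "payment":  {"paypal", "card", "payment"},
}

counts = Counter()
with open("open_answers.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):                    # assumed column name: "answer"
        words = set(re.findall(r"[a-z']+", row["answer"].lower()))
        for theme, keywords in THEMES.items():
            if words & keywords:
                counts[theme] += 1

# Recurring themes feed hypotheses; they are not percentages for a report.
for theme, n in counts.most_common():
    print(f"{theme}: {n} mentions")
```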

Conclusions:

  • "Only few people answer these surveys and polls, somewhere around 2%" - TRUE
  • "You usually have extreme answers, super fans and super detractors " - YEP
  • "They are not a representative sample of our website users" - AGREE
  • "People lie on Internet" - Well, not really. 99.9% of the times users intend to answer truthfully, it's just that they just don't know the truth of why they behave like they do
  • "They will affect our conversion rate". Definitely, NO. And by not using them you are not going to improve them.

As we can see, there are many myths and opinions about on-site surveys. Still, the best Conversion Rate experts in the world, and the top companies, keep using them. Do you know why? I do: they allow us to discover new insights that other tools (like web analytics) can't.

As data-driven marketers we need to be fully aware of the biases involved in using on-site surveys and be transparent about them.

At Conversion Garden Ltd I'll help you design surveys that allow you and your business to discover new insights to feed your testing plan. Send me an email and let's talk.

Did you like this Post? Then, please comment (or share it).

Update 3 September: Thanks to David Cameron's PR team for this great example of Voluntary Response Bias!