Monday, December 17, 2012

How to write great user surveys - Part 6


This is part 6 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.


Polarizing Questions


It's worth noting here that there are some situations where you really do want only one bit of information from your users: is your site/product/whatever acceptable or not? I suspect this is often the motivation behind polarizing questions like my example above. The key to asking polarizing questions safely, though, is to divide the solution space into exactly two parts. The easiest way to do that is usually to ask either very general questions with very general answers, or very specific questions with very specific answers. Here's a general example:
How does our site make you feel?
  • :-)
  • :-(
This question is about as general as it gets, but it can accurately condense a huge variety of complex feelings into two simple options -- good enough or not. Ikea.com uses lots of smileys (and frowny faces) in its surveys. This can be very effective if used properly -- just be careful not to read too much detail into what your user meant by selecting :-).

Now here's a specific example:
Are you satisfied with how Gmail handles sending and receiving email?
  • Yes
  • No
When you ask specific questions, there's still going to be a little variability in the accuracy of the answers. For this example, some users expect wild innovation and are satisfied with nothing less. Those people may have a perfectly acceptable Gmail experience but still consider themselves dissatisfied. But the more specific you get in your questions, the more you can combat that effect. Consider this even more specific example:
When using Gmail, are you usually able to send and receive emails without problems, errors, or frustration?
  • Yes
  • No
Even users with the highest expectations for innovation and UX excitement would have trouble saying no to that question (assuming they aren't actually having problems sending and receiving email, that is). By being more specific, you've restricted their feedback to only what you want to know: whether it's good enough or not.

Now let's look at some examples of polarizing questions that aren't specific enough:

  • Are you satisfied with Gmail? (Gmail does lots of things -- is this question asking about all of them or just email?)
  • Does Gmail meet your needs? (Needs for what? Integrating with Google Calendar? Accessing mail on your phone? Handling multiple accounts? Almost everybody will have a complex answer for this.)
  • Do you like Gmail? (In an absolute sense? Relative to its competitors? Including all its features?)
  • Are you able to accomplish your goals with Gmail? (What goals? Email only? The rich-text editor? Attaching files easily?)

If you find yourself asking a polarizing question, your first thought should be "Am I sure this has to be polarizing?" If 25% of your users think your product is "meh" and you force them to select "good" or "bad," don't you want to hear from those meh users? That's useful information, and you're spending time and money to get their opinions. Why would you throw it away by forcing a polarized answer? Worse, making the "meh" users choose introduces hidden inaccuracies into your results: you'll have no way of knowing how many respondents actually gave you their true opinion. What if it's 50%? What if it's 80%? You won't know.
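To make that hidden inaccuracy concrete, here's a minimal simulation sketch (not from the original post). The population mix, the coin-flip behavior for "meh" respondents, and all names are assumptions chosen purely for illustration:

import random

random.seed(0)

# Hypothetical population: 60% satisfied, 25% "meh", 15% dissatisfied.
# These proportions are made up for illustration only.
TRUE_MIX = {"satisfied": 0.60, "meh": 0.25, "dissatisfied": 0.15}
N = 10_000

def forced_binary_answer(opinion):
    """Satisfied -> Yes, dissatisfied -> No; a forced "meh" user
    picks Yes or No roughly at random."""
    if opinion == "meh":
        return "Yes" if random.random() < 0.5 else "No"
    return "Yes" if opinion == "satisfied" else "No"

respondents = random.choices(list(TRUE_MIX), weights=list(TRUE_MIX.values()), k=N)
yes_share = sum(forced_binary_answer(o) == "Yes" for o in respondents) / N

print(f"Truly satisfied: {TRUE_MIX['satisfied']:.0%}")
print(f"Answered 'Yes':  {yes_share:.0%}")  # comes out around 72-73%
# The survey looks like ~72% satisfaction, but you can't tell how much of
# that "Yes" came from genuinely satisfied users versus coin-flipping
# "meh" users -- the inaccuracy is invisible in the results.

Run it a few times and the reported "Yes" share barely moves, which is exactly the problem: the number looks stable and trustworthy while quietly blending two very different groups of users.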

Having no information on your users is bad, but having false information can be much, much worse. You should think very carefully before going this route.

Tomorrow in part 7 I'll cover one last way that poor answer options can give you bad data.

--
This is part 6 of my series on writing a good user survey. If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me.
