Monday, December 10, 2012

What every entrepreneur should know about writing user surveys - Part 1


I've been working on a series of posts about how to do a site redesign without making your users hate you. We all know that redesigns are a lightning rod for hate and abuse, but there are lots of ways to mitigate that. As I was writing it, though, I realized that I really need some data to back me up. I know what I find frustrating in a redesign, and I hear from my friends and Hacker News commenters, but we're not exactly a typical cross section of society. So I did what everybody should do when they want some quantitative user data: made a survey!

I really like taking surveys. Who doesn't love giving an opinion to people who want to hear it? Writing a good survey question is a really important aspect of being a user experience designer, and it was a really fun part of my HCI curriculum in college. So whenever a little popup window asks me to tell it about my site experience, I usually say yes. For me, half the fun is seeing how the survey was designed and thinking "Good job, UX guy!" (Sadly, it is a bit more common that I shake my head and think "You guys should have brought in a UX person to write your surveys.") But that's okay; I'm here to help!

Step 1: Figure out what you want to find out.

This part doesn't have to be formal or grammatically correct or well written. Just write down your goals using regular language. Here's what I want to find out:

  • I think lots of people are bothered by redesigns. Is that true? How many people are?
  • For people who are bothered, what things bother them about it?
  • Does an irritating redesign make them feel that the company doesn't care about its customers?
  • What do they wish the company had done differently?


Step 2: Find the squishy parts of your questions.

What parts of these questions are squishy? Let's focus on this one:

"I think lots of people are bothered by redesigns. Is that true? How many people are?"

When I wrote that question, terms like "lots of people," "bothered," "redesigns," and "how many" felt pretty reasonable and solid in my head. As soon as I begin making a survey out of it, though, I have to make sure that each term gets translated into a precise or quantitative phrase. Let's look at each of those squishy terms and how they could cause problems:

  • "lots of people" - Is that a percentage of... people on the internet? Percentage of your users? Percentage of people who may someday become users? Whose opinions do you actually care about? Figuring out the precise definition of "lots of people" here will determine who actually takes your survey. This may in fact be the most important part of your survey to get right, since getting it wrong can mean that your whole survey was a waste of time. In addition to wasting your own time as the survey-maker, it also wastes your users' time and any costs the survey incurs (such as a user discount for taking the survey in the first place). Furthermore, only some of your users will want to take a survey at all, and of those, many don't want to take two in a short amount of time. So if you squander their survey goodwill, you may have to wait several weeks or months for it to recharge before they'll take another one.
  • "bothered" - This could mean practically anything, from "it took me thirty extra seconds to send an email" to "your changes prevented me from paying my bill on time, I got a credit hit, and now I'm irate!" If you want your survey results to be useful and accurate, you need to close these open-ended terms. There are exceptions: if you want to hear about the entire range of experiences, from "slightly inconvenienced" to "because of you my home loan was denied," then a broad term can be okay. But if you plan to distinguish between those two ends of the spectrum when you're looking at the data later, it's wise to add a follow-up question, such as "On a scale of 1 to 10, how unhappy did this make you?"
  • "redesign" - Most of your users are probably not Silicon Valley entrepreneur hopefuls. Someone in Mississippi may not have the precise ideas in their head that you do when it comes to technical or business language. To someone who's in a different sphere, a "redesign" could mean a new color scheme, a new comments system, a new feature, or just a new navigation bar at the top of the screen. In the context of the particular survey that I'm creating, that sort of ambiguity might not harm the data too much, but it can be very harmful in other contexts. Terms like "redesign" need to be defined before asking the user any questions about them.
  • "how many" - Do you want to know how many people have ever experienced a frustrating redesign? If you're asking about all the redesigns they've ever encountered in their lives, that percentage is probably pretty high, and it's probably quite different from the percentage of people who find most redesigns frustrating. You can get hugely varying survey results from two similar-sounding questions: "Do you find site redesigns frustrating?" and "Have you ever been frustrated by a site redesign?"

Check back tomorrow for part 2, where I'll discuss how this qualitative language can be fixed.

--
Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me
