Thursday, December 20, 2012

How to write great user surveys - Part 9


This is part 9 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 8: Don't be too hard on yourself. This is a process.

Writing good survey questions is sometimes really hard. You're seeking the unbiased opinions of your users, but just by writing the questions you unintentionally insert your own biases and influences into your users' thoughts. And your own bias is inherently difficult to spot -- that's why there are entire sites dedicated to finding it. But fear not; knowing that you have hidden bias is half the battle. (Well... maybe only 20% of the battle. But still.)

Once you know to scrutinize your questions and define your terms and ask people for feedback, you're on the right path. If you know what traps to look for, all you've got left is an optimization problem: how much of your time is worth the benefit you receive from knowing your users' opinions? Some questions probably deserve a great deal of care if their results will affect hiring decisions or big purchases. If you're trying to decide whether a button should say "Click here" or "Submit," then not so much. ...Unless the button is submitting a user's payment. That deserves some study!

I hope this series has been helpful! If you have questions, comments, or criticism, I'd love to hear about it. Also, are there other UX or UI topics you'd like me to discuss? Leave a comment or send me an email. And thanks for reading!

--
This is part 9 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me


Wednesday, December 19, 2012

How to write great user surveys - Part 8


This is part 8 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 6: Make a catch-all question -- and tell people about it. 


It's very common that survey-takers have an agenda. After all, they're using their time to answer a bunch of questions, and many people won't do that unless they think they'll get something out of it. Sometimes they're doing it for a coupon or free item or a chance at winning something, but other times they're doing it because they have something they want you to hear. Or both.

That's great, of course. If a customer wants to tell you something and hasn't had another opportunity, we should all be thrilled that he's willing to take a whole survey to get his message across. Unfortunately, sometimes those users with an agenda will commandeer any text-entry box available to tell you what they're thinking -- regardless of what that text-entry box is for. So why not help them out? Not only does it make their lives easier, but it increases the likelihood that they'll actually answer the questions that are being asked. So when you ask "What changes would you like to see in our app?", your user can answer honestly instead of telling you about the time the app didn't work right and he missed a flight to Cleveland.

Accomplishing this is easy, of course: just tell them up front. Usually a simple message on page 1 is pretty effective:
At the end of this survey there is an optional area for any additional feedback you'd like to give.
If you already have too much text on page 1, you run the risk of users skipping over it. So if there's more than one sentence there already, just move the message back a little -- maybe to the same page as your first text-entry box. Then, at the end of your survey, include one last question: "Is there anything else you'd like to tell us?" Just be sure that you set the users' expectations appropriately regarding feedback from that box. (Depending on your survey tool, you may want to ask users to enter an email address if they want a response.)


Step 7: Ask for feedback on the survey itself.


This question is often overlooked in surveys, but it can be invaluable. Asking for survey feedback gives you one last chance to find out about all kinds of mistakes: if you accidentally made a user give you an incorrect answer (with a false-dichotomy polarizing question), if a question had confusing language, or if part of the survey wasn't formatted correctly on the user's screen. If your survey is perfect, then you'll see a bunch of empty text fields in your results. But if it's not perfect (which is extremely likely), this question can save you from making decisions based on bad data. You might learn that some of your data is invalid and the survey was a huge waste of resources, but at least it'll prevent you from doing even more damage based on something that's not true. Here's a good sample question for this:
Do you have any questions or comments about this survey? Did you have difficulty with any of the questions?
Of course, it's worth noting that this question isn't foolproof. Many users won't bother to explain a survey problem -- they don't really care and it's not worth their time. All you need is one, though -- enough to alert you to reevaluate your results.

Come back tomorrow for the conclusion of this series -- part 9 -- where I'll talk a little more about the process behind creating a survey.

--
This is part 8 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Tuesday, December 18, 2012

How to write great user surveys - Part 7


This is part 7 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Lazy questions


My favorite kind of poorly-written question has to do with lists of items. I see this all the time:
How did you hear about us?
  • Yelp
  • Google
  • A friend
  • TV commercial
  • Radio commercial
  • that's it, there are NO MORE WAYS!
Now most of the time you see an "other" option at the bottom of lists like this, but I'm constantly amazed at how often there isn't one. Sometimes the list has fifteen or twenty items -- the survey's author clearly thought about this for a while or asked a bunch of people -- but he just can't conceive of the possibility that someone heard about the company some other way, or that they simply don't know how they heard. For me, it's often something like "I read a press release when you started the company and I've had my eye on you for years" or "You've been around for decades, you're on the news, and people all over the place use your products." How did you first hear about Apple? I bet you don't know!

If you find yourself making a list of items in a question -- whether it's the example above or some other kind of listy question -- you should always include an "other" option. (Ideally, that option should let the user fill in his own value.) There is no reason not to. And there's nothing more irksome for a survey-taker than having to pick some wrong choice because there is no right one. Okay, there are actually many things that are more irksome than that, but you know what I mean.

Sorry today's post is a short one. It turns out that not all subjects can be conveniently divided into perfectly-sized chunks of knowledge! Check back tomorrow for part 8, where I'll cover two easy-to-write questions that should be on every survey -- and can make or save you lots of money.

--
This is part 7 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Monday, December 17, 2012

How to write great user surveys - Part 6


This is part 6 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.


Polarizing Questions


It's worth noting here that there are some situations where you do only want one bit of information from your users: is your site/product/whatever acceptable or not? I suspect that this is often the motivation behind polarizing questions like the example in part 5. The key to asking polarizing questions safely, though, is to divide the solution space into exactly two parts. The easiest way to do that is usually to ask very general questions with general answers or very specific questions with very specific answers. Here's a general example:
How does our site make you feel?
  • :-)
  • :-(
This question is about as general as it gets, but it can accurately condense a huge variety of complex feelings into two simple options -- good enough or not. Ikea.com uses lots of smileys (and frowny faces) in its surveys. This can be very effective if used properly -- it's just important that you're careful not to read too much detail into what your user meant by selecting :-).

Now here's a specific example:
Are you satisfied with how Gmail handles sending and receiving email?
  • Yes
  • No
When you ask specific questions, there's still going to be a little variability in the accuracy of the answers. For this example, some users expect wild innovation and are satisfied with nothing less. Those people may have a perfectly acceptable Gmail experience but still consider themselves dissatisfied. But the more specific you get in your questions, the more you can combat that effect. Consider this even more specific example:
When using Gmail, are you usually able to send and receive emails without problems, errors, or frustration?
  • Yes
  • No
Even users with the highest expectations for innovation and UX excitement would have trouble saying no to that question. (Assuming that they aren't having problems sending and receiving email, that is.) By being more specific, you've restricted their feedback to only what you want to know: is it good enough or not.

Now let's look at some examples of polarizing questions that aren't specific enough:

  • Are you satisfied with Gmail? (Gmail does lots of things -- is this question asking about all of them or just email?)
  • Does Gmail meet your needs? (Needs for what? Integrating with Google Calendar? Accessing mail on your phone? Handling multiple accounts? Almost everybody will have a complex answer for this.)
  • Do you like Gmail? (In an absolute sense? Relative to its competitors? Including all its features?)
  • Are you able to accomplish your goals with Gmail? (What goals? Email only? The rich-text editor? Attaching files easily?)

If you find yourself asking a polarizing question, your first thought should be "Am I sure that this has to be polarizing?" If 25% of your users think your product is "meh" and you want them to select "good" or "bad," don't you want to hear from those meh users? That's useful information, and you're spending time and money to get their opinions. Why would you throw that away by forcing a polarized answer? Furthermore, making the "meh" users choose introduces hidden inaccuracies into your results: you'll have no way of knowing how many respondents are actually giving you their true opinion. What if it's 50%? What if it's 80%? You won't know.

Having no information on your users is bad, but having false information can be much, much worse. You should think very carefully before going this route.
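To make that hidden-inaccuracy point concrete, here's a quick, throwaway simulation (Python, with invented numbers -- this isn't data from any real survey). It shows how the forced answers from "meh" users blend invisibly into the answers from users with real opinions:

```python
# Hypothetical illustration: 10,000 survey-takers, 25% of whom are ambivalent
# ("meh") but are forced to pick "good" or "bad". All rates are made up.
import random

random.seed(0)

def simulate(n_users=10_000, share_meh=0.25, share_good=0.55):
    results = {"good": 0, "bad": 0}
    for _ in range(n_users):
        r = random.random()
        if r < share_meh:
            # Ambivalent user: forced to pick a side more or less at random.
            results[random.choice(["good", "bad"])] += 1
        elif r < share_meh + share_good:
            results["good"] += 1   # genuinely satisfied
        else:
            results["bad"] += 1    # genuinely dissatisfied
    return results

print(simulate())
# Prints something like {'good': 6750, 'bad': 3250} -- and nothing in that
# output tells you that roughly a quarter of the answers carry no signal at all.
```

The totals look perfectly authoritative; that's exactly the danger.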

Tomorrow in part 7 I'll cover one last way that poor answer options can give you bad data.

--
This is part 6 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Friday, December 14, 2012

How to write great user surveys - Part 5


This is part 5 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 5: Allow for the entire spectrum of possible answers.

One of the most common mistakes I see in surveys is that the survey's author didn't give the user enough options to express his opinion accurately. This happens constantly. Consider the following example:
Did your experience with the site today make you more or less likely to recommend us to a friend?
  • More
  • Less
Hypothetical user Bob: "The site is great, but I practically never recommend sites to people."
or
Hypothetical user Jay: "I don't know; the site is fine. It doesn't really affect my odds of recommending. Whatever, man."
or
Hypothetical user Randall: "I recently had crappy customer service from you and I'm taking this survey with plans to express that in one of the questions. Your site is fine, but I'm not recommending you to anyone. You guys suck."
Now think about those users. They're probably in the middle of this survey. They've already invested some time in this, and they want to finish it so you can hear their opinions. They all think "I don't know what to put here... I wish there were more options." Let's look at how each user can affect your bottom line:

Bob: "The site is great, but I practically never recommend sites to people."
Bob isn't too bothered -- this question doesn't really apply to him. If he's smart and is thinking about how you'll perceive his answers, you may still get valid data out of him. He could think "Well, I don't recommend sites very much, but this site is pretty good. I'll tell them 'more'." He's not too risky, because he probably won't give you a wrong answer.

Jay: "I don't know; the site is fine. It doesn't really affect my odds of recommending."
Jay is ambivalent but dangerous -- whichever answer he picks will actually be wrong. And if that's the case for a substantial percentage of your users, just think what an expensive error that could be! If a bunch of those ambivalent users choose "less," that could translate to thousands of hours of work by your software engineers and designers -- all to fix a site that is already fine. That's not a cheap mistake!

Randall: "I recently had crappy customer service from you and..."
Randall isn't going to do you any favors either. He's already unhappy with you, and this survey is giving him an outlet to express that. He may or may not have an opinion about the quality of your site, but without a neutral option he's far likelier to give you negative feedback because of his state of mind. Like Jay, he can contribute to a very costly snipe hunt.

(Of course, if Randall thinks that taking a survey is the easiest way to express his dissatisfaction to you, you're already doing it wrong. He should be swimming in opportunities to tell you -- in your product itself (if possible), on the main page of the site, in your promotional emails, etc. But that's a subject for another article.)

It's worth noting here that there are some situations where you do only want one bit of information from your users. Check back tomorrow for part 6 where I'll talk about that special case, what's special about it, and how to do it right.

--
This is part 5 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Thursday, December 13, 2012

How to write great user surveys - Part 4

This is part 4 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.


Conflicts of interest


Imagine the following question (which I just made up):
Do you agree or disagree with the following statement? Munchery provides a good value for the money I spend.
  • Strongly agree
  • Mostly agree
  • Neither agree nor disagree
  • Mostly disagree
  • Strongly disagree
When that question is written, the writer thinks "Customers will tell me what they think of Munchery's value." When the customer reads that question, however, sometimes he thinks "Munchery's value is okay. But if I say that I disagree, maybe Munchery will lower its prices." Some customers will do this and some won't -- it depends on your customers and your product. But it will skew your results, and it might skew them in a way that costs you money you didn't need to spend.

Essentially, customers answering that question have a conflict of interest -- their incentives are not aligned with yours. But if you need that information anyway, you can usually still get it if you're sneaky -- you just have to hide your motives a little bit. In my made-up Munchery example, there's some business question hiding behind that survey question. Maybe it's "should we lower or raise our prices?" or "do our customers consider us a luxury?" or "would we get more orders if we gave out more coupons?" There's almost certainly some business need as the motivation, and it's probably not "The CEO was just wondering how users perceive the service's value on a five-point scale."

So think about what you really want to find out. Is there a way you can get that information from the user without asking him to think about it? Like the hole-in-the-jeans example from part 3, there's probably another question you can ask which will allow you to measure results instead of asking for opinions. And the right answer might not always be a survey question. But whether it is or not, it's important to avoid these conflicts of interest if you care about getting the most accurate data.

Come back tomorrow for part 5, where I'll talk about why the answers you provide are just as important (and can be just as harmful) as the questions you ask.

--
This is part 4 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Wednesday, December 12, 2012

How to write great user surveys - Part 3


This is part 3 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 4: Don't tip your hand.

When creating a survey, it's very important to remember that users don't always know what they want. Everybody has heard Henry Ford's famous quote about his customers wanting a faster horse (which it turns out he probably never actually said, but the point still holds). Ford's apocryphal quote doesn't mean that customer feedback isn't useful -- it should simply serve as a warning about asking your users the wrong questions.

Customers usually aren't good at imagining innovative and financially-viable solutions to life's problems -- that's your job as the businessman. So why would you ask customers what solutions they want? What they are good at is... complaining. Which isn't a bad thing! It just means that when you ask them questions, sometimes you need to be careful to ask only about problems -- not solutions. And that can be tricky since customers are good at reading between the lines.

Is this hole noticeable?


Never ask this. If you have a hole in your jeans and you want to know if it's noticeable, why would you point it out to someone in the question? You've just ruined their objectivity. They can't un-notice it, so whatever they tell you is untrustworthy before they even say anything. If you really want to know whether the hole is noticeable, create a test and measure the results: make sure the hole is visible to your friend, but instead of pointing it out, ask "What do you think of these jeans? Does anything about them look unusual?"

Fundamentally, the difference between these two approaches is whether you're asking for an opinion or measuring results. It's easy to imagine ten friends saying "Nah, I don't think it's very noticeable." They might be wrong or trying to make you feel good, but you just don't know. Opinions may be well-meaning, but they aren't very trustworthy.

What is trustworthy, however, is a measurable, repeatable result. By keeping your friend in the dark about your motivations, you don't have to wonder how accurately he can judge his own observational skills. You can find the answer -- the correct answer -- just by asking your question a little differently. Instead of saying "Do you think this is noticeable?" you are secretly asking "Do people notice this?" If you can remove "do you think" from your questions, your results will be far more trustworthy.

Sometimes you'll find that measuring results obviates the need for a survey question in the first place. Which is fine -- sometimes surveys aren't the best way to find out what your users want. Before you go to a user survey to plan your roadmap for the next six months, ask yourself if there's a way to solve your problems by measuring results. You can learn hugely valuable things (and have a lot more confidence in your data) from AdWords tests or A/B testing. (A multi-armed bandit approach is even better if you have the infrastructure for it.)
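In case "multi-armed bandit" is unfamiliar, here's a toy epsilon-greedy sketch in Python. The button labels and conversion rates are invented for illustration, and a real setup would serve live traffic through your own analytics pipeline rather than a random-number simulation:

```python
# Toy epsilon-greedy bandit over two hypothetical button labels.
# The "true" conversion rates below are made up purely for demonstration.
import random

random.seed(1)

ARMS = {"Click here": 0.040, "Submit": 0.055}   # hypothetical conversion rates
stats = {arm: {"shows": 0, "clicks": 0} for arm in ARMS}
EPSILON = 0.1                                    # fraction of traffic spent exploring

def pick_arm():
    # Explore occasionally; otherwise exploit the best-performing arm so far.
    if random.random() < EPSILON:
        return random.choice(list(ARMS))
    return max(stats, key=lambda a: stats[a]["clicks"] / max(stats[a]["shows"], 1))

for _ in range(20_000):                          # simulated visitors
    arm = pick_arm()
    stats[arm]["shows"] += 1
    if random.random() < ARMS[arm]:              # did this visitor convert?
        stats[arm]["clicks"] += 1

for arm, s in stats.items():
    print(arm, s["shows"], round(s["clicks"] / max(s["shows"], 1), 4))
```

The appeal over a fixed A/B split is that the bandit automatically shifts more and more traffic toward the better-performing option while the test is still running.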

Check back tomorrow for part 4, where I'll talk about another kind of question that can mess up your survey results.

--
This is part 3 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Tuesday, December 11, 2012

How to write great user surveys - Part 2

This is part 2 of my series on writing a good user survey. If you just got here, you may want to start with part 1.

Step 3: Expect your questions to expand.

Let's rewrite my question from step 2 with some better language. Just as a reminder, that question was:
I think lots of people are bothered by redesigns. Is that true? How many people are?

 First I'll define all my squishy terms:
  • "lots of people" - For my post on site redesigns, I care about users who are likely to be current or future customers of Silicon-Valley-type startups.
  • "bothered" - I'm going to leave this term pretty broad, because I don't really know what kind of data to expect. However, I'm mostly interested in knowing if a redesign made people actually unhappy, so I'm going to use something slightly stronger: "frustrated." Then I'll user a 1-to-10 scale to get some more data on that afterwards.
  • "redesign" - It would take a lot of time to figure out what users think qualifies as a redesign. Since their initial understanding of the term isn't really important to the study, I'm just going to define it for the user with my own made-up definition. It doesn't have to be the most elegant or concise definition -- it just has to be clear and unambiguous.
  • "how many" - I'm mainly interested in the opinions of people who are irritated, because I expect that to be the majority. However, I could be wrong. If I am wrong, my post on redesigns will have a lot less value, and that is important information to have. (Primarily because it will save me many hours of writing several lengthy articles.) So even though I don't really care about the opinions of people who aren't usually irritated by redesigns, I need to find out how many there are. So in this case my "how many" will grow into two questions: whether people are usually frustrated by a redesign, and then a second question asking them to think of any time they have been frustrated by a redesign.


So! The results:


Before:
 I think lots of people are bothered by redesigns. Is that true? How many people are?
After:
For this survey, a "redesign" means a major change in the look, feel, location of features or actions, or other major functionality changes.
Are you usually frustrated when a website or app you use gets a redesign?
  • Yes. Usually some parts of a redesign have made me frustrated or annoyed. 
  • No. I don't usually mind when an app or site is redesigned.
If you answered Yes above, how unhappy does that frustration make you?
On the scale below, 1 is slightly unhappy and 10 is extremely unhappy.  
Of all the redesigns you encounter, how many of them make you frustrated?
  • All or almost all of them are frustrating.
  • Most of them are frustrating.
  • 50-50. Some are, some aren't.
  • Not that many. Most redesigns don't frustrate me.
  • None or almost none of them frustrate me.

Boy, that one thing I wanted to find out turned into quite a lot of questions, eh? This is actually very common. When you're interviewing a user in person, a small question can produce paragraphs of results. For multiple-choice survey questions, however, you're only getting one or two bits of information each time. The simple question "is my product good?" has answers in dozens of dimensions -- price, durability, value, convenience, appearance, utility, quality, enjoyability, etc. And if you want to capture all of that information, you're going to have to ask about it all explicitly, because each of your users has a unique collection of dimensions in his head for what constitutes product goodness, and he'll answer your question according to his own definitions.

So be prepared to ask a lot of questions, and try to keep things focused. Nobody likes twenty-minute surveys, so it's important to keep your goals narrow. You can take shortcuts by cutting questions, but whatever you cut will show up as a loss in the quality of your results.
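If your survey tool lets you define questions programmatically (many do, though the exact format varies), the expanded question set above maps naturally onto structured data. This is only an illustrative sketch -- the field names and types below are invented, not any particular tool's API:

```python
# Hypothetical structured representation of the expanded redesign questions.
# Field names ("type", "prompt", "options", ...) are made up for illustration;
# translate them into whatever format your survey tool actually expects.
redesign_questions = [
    {
        "type": "single_choice",
        "prompt": "Are you usually frustrated when a website or app you use gets a redesign?",
        "options": [
            "Yes. Usually some parts of a redesign have made me frustrated or annoyed.",
            "No. I don't usually mind when an app or site is redesigned.",
        ],
    },
    {
        "type": "scale",
        "prompt": "If you answered Yes above, how unhappy does that frustration make you?",
        "min": 1,   # 1 = slightly unhappy
        "max": 10,  # 10 = extremely unhappy
    },
    {
        "type": "single_choice",
        "prompt": "Of all the redesigns you encounter, how many of them make you frustrated?",
        "options": [
            "All or almost all of them are frustrating.",
            "Most of them are frustrating.",
            "50-50. Some are, some aren't.",
            "Not that many. Most redesigns don't frustrate me.",
            "None or almost none of them frustrate me.",
        ],
    },
]
```

Writing the questions out this way also makes it obvious how quickly "one thing I wanted to find out" grows into a multi-question survey.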

Check back tomorrow for part 3, where I'll cover some common errors that can skew your survey results.
--
This is part 2 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Monday, December 10, 2012

What every entrepreneur should know about writing user surveys - Part 1


I've been working on a series of posts about how to do a site redesign without making your users hate you. We all know that redesigns are a lightning rod for hate and abuse, but there are lots of ways to mitigate that. As I was writing it, though, I realized that I really need some data to back me up. I know what I find frustrating in a redesign, and I hear from my friends and Hacker News commenters, but we're not exactly a typical cross section of society. So I did what everybody should do when they want some quantitative user data: made a survey!

I really like taking surveys. Who doesn't love giving an opinion to people who want to hear it? Writing a good survey question is a really important aspect of being a user experience designer, and it was a really fun part of my HCI curriculum in college. So whenever a little popup window asks me to tell it about my site experience, I usually say yes. For me, half the fun is seeing how the survey was designed and thinking "Good job, UX guy!" (Sadly, it is a bit more common that I shake my head and think "You guys should have brought in a UX person to write your surveys.") But that's okay; I'm here to help!

Step 1: Figure out what you want to find out.

This part doesn't have to be formal or grammatically correct or well written. Just write down your goals using regular language. Here's what I want to find out:

  • I think lots of people are bothered by redesigns. Is that true? How many people are?
  • For people who are bothered, what things bother them about it?
  • Does an irritating redesign make them feel that the company doesn't care about its customers?
  • What do they wish the company had done differently?


Step 2: Find the squishy parts of your questions.

 What parts of these questions are squishy? Let's focus on this one:
I think lots of people are bothered by redesigns. Is that true? How many people are?
When I wrote that question, terms like "lots of people" and "bothered" felt pretty reasonable and solid in my head. As soon as I begin making a survey out of it, though, I have to make sure that each term gets translated into a precise or quantitative phrase. Let's look at each of my squishy terms and how they could cause problems:

  • "lots of people" - Is that a percentage of... people on the internet? Percentage of your users? Percentage of people who may someday become users? Whose opinions do you actually care about? Figuring out the precise definition of "lots of people" here will determine who actually takes your survey. This may in fact be the most important part of your survey to get right, since getting it wrong can mean that your whole survey was a waste of time. In addition to wasting your own time as the survey-maker, it also wastes your users' time and any costs the survey incurs (such as a user discount for taking the survey in the first place). Furthermore, only some of your users will want to take a survey at all, and of those, many don't want to take two in a short amount of time. So if you squander their survey goodwill, you may have to wait several weeks or months for it to recharge before they'll take another one.
  • "bothered" - This could mean practically anything -- from "it took me thirty extra seconds to send an email" to "your changes prevented me from paying my bill on time, I got a credit hit, and now I'm irate!" If you want your survey results to be useful and accurate, you need to close your open-ended terms. Though there are some exceptions to this -- if you want to hear about the entire range of experiences from "slightly inconvenienced" to "because of you my home loan was denied," then this kind of broad term can be okay. But if you plan to distinguish between those two ends of the spectrum when you're looking at the data later, it's wise to add an expository question afterwards. Such as: "On a scale of 1 to 10, how unhappy did this make you?"
  • "redesign" - Most of your users are probably not Silicon Valley entrepreneur hopefuls. Someone in Mississippi may not have the precise ideas in their head that you do when it comes to technical or business language. To someone who's in a different sphere, a "redesign" could mean a new color scheme, a new comments system, a new feature, or just a new navigation bar at the top of the screen. In the context of the particular survey that I'm creating, that sort of ambiguity might not harm the data too much, but it can be very harmful in other contexts. Terms like "redesign" need to be defined before asking the user any questions about it.
  • "how many" - Do you want to know how many people have ever experienced a frustrating redesign? That's probably a pretty high percentage if you're asking about all the redesigns they've ever encountered in their lives. That's also probably quite a different number than the percentage of people who find most redesigns frustrating. You can get hugely varying survey results from two similar-sounding questions: "Do you find site redesigns frustrating?" and "Have you ever been frustrated by a site redesign?" 

Check back tomorrow for part 2, where I'll discuss how this qualitative language can be fixed.

--
Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me