Thursday, January 10, 2013

UX in the wild: Amazon's Kindle app

I discovered a rather nifty UI mechanism a few days ago: a usable, elegant, and minimal control for locking and unlocking screen rotation. It was in Amazon's Kindle app on my Galaxy Nexus (though a quick check shows that it's in their iOS app as well).

In most apps I've seen, developers leave auto-rotation management up to the operating system. That is, there is no way to specify your auto-rotation preferences in the app itself -- it just honors the device's main settings. There is a small class of apps that manage it themselves, though -- mostly news apps or e-readers, where users might want to read lots of text while lying on their side. In most of these, the user can lock or unlock auto-rotation from some settings menu. This placement has some obvious UX drawbacks:

  • Unless the user explores the settings menu, he may never know the option exists -- and he may stop using the app because of it.
  • It's not easy to toggle quickly. With several taps required each time, switching back and forth is frustrating.
  • It's impossible to know the current state of the rotation lock without checking. This is especially frustrating on older devices, which lag a bit before the rotation kicks in. And once the user learns the state, he must navigate back into the menu to change the setting if necessary.

The rotation-lock mechanism in the Kindle app, however, solves all of these problems in a simple way: whenever you rotate your device, a little lock icon appears in the corner, either locked or unlocked to reflect the current auto-rotation state. Tapping the lock toggles the state, and the rotation adjusts immediately. After a few seconds, the icon disappears.
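For the developers in the audience: the behavior described above amounts to a tiny state machine, and it's instructive to see how little state it actually needs. This Python sketch is purely my own model -- the class, the method names, and the three-second hide delay are all assumptions, not anything from Amazon's code:

```python
class RotationLockOverlay:
    """Toy model of a Kindle-style rotation-lock control.

    All names and timings here are hypothetical -- this just models the
    observable behavior: show on rotate, toggle on tap, hide after a delay.
    """

    HIDE_AFTER = 3.0  # seconds the icon stays on screen (assumed value)

    def __init__(self):
        self.locked = False     # is auto-rotation currently locked?
        self.visible = False    # is the lock icon on screen?
        self._shown_at = None   # timestamp of the last rotate/tap

    def on_rotate(self, now):
        # Rotating the device reveals the icon, which reflects the
        # current lock state without changing it.
        self.visible = True
        self._shown_at = now

    def on_tap(self, now):
        # Tapping the visible icon toggles the lock immediately and
        # resets the hide timer.
        if self.visible:
            self.locked = not self.locked
            self._shown_at = now

    def tick(self, now):
        # Called periodically: after a few seconds with no interaction,
        # the icon fades away on its own.
        if self.visible and now - self._shown_at >= self.HIDE_AFTER:
            self.visible = False
```

What makes the design elegant is visible right in the model: the control never demands attention (it only appears when rotation is already on the user's mind), it communicates its state just by being shown, and it cleans up after itself.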


This feature is easy to discover, easy to toggle, and easy to check. While it may have some small drawbacks for certain types of users, it appears to be an across-the-board improvement on the way this option is usually accessed and presented. It's really nice to see novel UI mechanisms continue to spring up in mobile -- and especially from big companies. Great job, Amazon!

(Note: I'm not sure how long this feature has been around, because I usually disable auto-rotate device-wide. It's new to me, though, and good UI makes me happy.)


Hire me, and I'll create innovative user interfaces for your app! I'm available for full-time or consulting work -- email me

Monday, January 7, 2013

Why you shouldn't trust your users

Your users are like your children. They have opinions and hopes and entitlements and tantrums. Sometimes their demands are reasonable ("I'm thirsty") or unreasonable ("I want brownies for breakfast every day!"). If you want to create a great experience for your user, it's your job to figure out which requests are really in their best interest. They might think they know, and sometimes they're correct -- but like parents, sometimes you need to protect them from the times that they're wrong. (And conveniently, protecting them also protects you and your business.)

The parable of the pine nuts


Last week I went to Whole Foods to get some pine nuts for a kale salad. In case you're not aware, pine nuts are not cheap -- the smallest bag they had was $8.99 for what appeared to be 1.5 cups. Of course, one salad only takes a smattering of nuts, so spending $8.99 on it seemed gratuitously spendy. (Especially since I'm the only one in my apartment who even eats salad in the first place.) But I really did want that salad -- it's the most delicious recipe -- so I thought to myself "Perhaps I can justify this pricey purchase if the nuts will keep for a long time. Then, even if I don't have another salad for two months, they'll still be good." This logic seemed sound and also allowed me to get my nuts guilt-free, so I looked all over the bag for something indicating how long they'd last after opening -- to no avail.

But Whole Foods employees are almost always really knowledgeable, so I started looking around for someone to ask. I wandered around from one aisle to the next, trying in vain to find someone. Eventually I found a fellow who had no idea how long pine nuts last, but he determinedly escorted me around the store, from employee to employee, each one saying "Oh, I bet Dan would know," or "Ask Clyde in the back." Down the rabbit hole we went, and at the end of our trek there was a very nice man who told me that pine nuts can last months without any ill effects. Great!

Then as I began to walk away -- now in a small hurry after the lengthy tour of Whole Foods employees -- the man added "You know, if the bag is too big..." I was a little impatient at being called back, since I now had my answer and wanted to get on my way. But I'm a nice person, so I put on a polite smile and turned back to face him. He continued: "If the bag is too big, you could just get them from the bulk foods aisle. Then you can buy however much you need." I nodded and thanked him, thinking "Yes, I know how grocery stores work." But it was only as I was walking away again that I realized that that was exactly what I needed; in my single-minded efforts to create my own solution, I had forgotten my ultimate goal. So I bought half a cup of pine nuts at a much more reasonable price.

The critical reader might point out that Whole Foods made less money off of me than if I'd bought the $8.99 bag. Maybe so. Or maybe I'll buy a lot more kale and pine nuts from them in the future, now that I know I don't have to risk wasting them. (I'd be very surprised if my lifetime customer value to Whole Foods hasn't increased after this.)

So how can I use this in my business?


The most important thing to understand from this post is that you can't always trust your customers. Not because they're liars, but because they're really susceptible to getting trapped in their own context. We're all susceptible to that in every aspect of life -- it's just that in business, we have a lot more incentive to fix it.

On the surface, of course, the question "how long will this food last?" doesn't seem very similar to "can I buy a smaller amount?" This Whole Foods employee was able to see past the question to my underlying goal -- even though I wasn't really aware of that goal anymore. That's a very impressive UX feat, and it shows that this fellow really understands his customers. On the other hand, many people understand the needs of shoppers in a grocery store -- it's not exactly a new business model. The tricky part is learning how to gain that understanding when your users' goals aren't as obvious.

So what can you do? There are lots of great ways to conduct user research that can mitigate this effect, such as writing surveys carefully and well. (See my guide on writing effective user surveys.) But every business is different, and the best way to get user feedback will vary from case to case. (If there were a one-size-fits-all solution to this, it wouldn't be a problem anymore.) Hiring a user experience expert -- preferably one with a solid understanding of user research -- is a great first step, though. As a matter of fact, I'm looking for work myself! Send me an email -- I'd love to talk about your company's needs.


Thursday, December 20, 2012

How to write great user surveys - Part 9


This is part 9 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 8: Don't be too hard on yourself. This is a process.

Writing good survey questions is sometimes really hard. You're seeking the unbiased opinions of your users, but just by writing the questions you unintentionally insert your own biases and influences into your users' thoughts. And your own bias is inherently difficult to spot -- that's why there are entire sites dedicated to finding it. But fear not; knowing that you have hidden bias is half the battle. (Well... maybe only 20% of the battle. But still.)

Once you know to scrutinize your questions and define your terms and ask people for feedback, you're on the right path. If you know what traps to look for, all you've got left is an optimization problem: how much of your time is worth the benefit you receive from knowing your users' opinions? Some questions probably deserve a great deal of care if their results will affect hiring decisions or big purchases. If you're trying to decide whether a button should say "Click here" or "Submit," then not so much. ...Unless the button is submitting a user's payment. That deserves some study!

I hope this series has been helpful! If you have questions, comments, or criticism, I'd love to hear about it. Also, are there other UX or UI topics you'd like me to discuss? Leave a comment or send me an email. And thanks for reading!

--
This is part 9 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me


Wednesday, December 19, 2012

How to write great user surveys - Part 8


This is part 8 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 6: Make a catch-all question -- and tell people about it. 


It's very common that survey-takers have an agenda. After all, they're using their time to answer a bunch of questions, and many people won't do that unless they think they'll get something out of it. Sometimes they're doing it for a coupon or free item or a chance at winning something, but other times they're doing it because they have something they want you to hear. Or both.

That's great, of course. If a customer wants to tell you something and hasn't had another opportunity, we should all be thrilled that he's willing to take a whole survey to get his message across. Unfortunately, sometimes those users with an agenda will commandeer any text-entry box available to tell you what they're thinking -- regardless of what the box is for. So why not help them out? Not only does it make their lives easier, but it increases the likelihood that they'll actually answer the questions being asked. Then, when you ask "What changes would you like to see in our app?", your user can answer honestly instead of telling you about the time the app didn't work right and he missed a flight to Cleveland.

Accomplishing this is easy, of course: just tell them up front. Usually a simple message on page 1 is pretty effective:
At the end of this survey there is an optional area for any additional feedback you'd like to give.
If you already have too much text on page 1, you run the risk of users skipping over it. So if there's more than one sentence there already, just move the message back a little -- maybe to the same page as your first text-entry box. Then, at the end of your survey, include one last question: "Is there anything else you'd like to tell us?" Just be sure that you set the users' expectations appropriately regarding feedback from that box. (Depending on your survey tool, you may want to ask users to enter an email address if they want a response.)


Step 7: Ask for feedback on the survey itself.


This question is often overlooked in surveys, but it can be invaluable. Asking for survey feedback gives you one last chance to find out about all kinds of mistakes: if you accidentally made a user give you an incorrect answer (with a false-dichotomy polarizing question), if a question had confusing language, or if part of the survey wasn't formatted correctly on the user's screen. If your survey is perfect, then you'll see a bunch of empty text fields in your results. But if it's not perfect (which is extremely likely), this question can save you from making decisions based on bad data. You might learn that some of your data is invalid and the survey was a huge waste of resources, but at least it'll prevent you from doing even more damage based on something that's not true. Here's a good sample question for this:
Do you have any questions or comments about this survey? Did you have difficulty with any of the questions?
Of course, it's worth noting that this question isn't foolproof. Many users won't bother to explain a survey problem -- they don't really care, and it's not worth their time. But all you need is one who does -- enough to alert you to reevaluate your results.

Come back tomorrow for the conclusion of this series -- part 9 -- where I'll talk a little more about the process behind creating a survey.

--
This is part 8 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Tuesday, December 18, 2012

How to write great user surveys - Part 7


This is part 7 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Lazy questions


My favorite kind of poorly-written question has to do with lists of items. I see this all the time:
How did you hear about us?
  • Yelp
  • Google
  • A friend
  • TV commercial
  • Radio commercial
  • that's it, there are NO MORE WAYS!
Now most of the time you see an "other" option at the bottom of lists like this, but I'm constantly amazed at how many times there isn't one. Sometimes the list has fifteen or twenty items -- the maker clearly thought about this for a while or asked a bunch of people -- but he just can't conceive of the possibility that maybe someone heard about them some other way. Or that they don't know how they heard. For me, it's often something like "I read a press release when you started the company and I've had my eye on you for years" or "You've been around for decades and you're on the news and people all over the place use your products." How did you first hear about Apple? I bet you don't know!

If you find yourself making a list of items in a question -- whether it's the example above or some other kind of listy question -- you should always include an "other" option. (Ideally, that option should let the user fill in his own value.) There is no reason not to. And there's nothing more irksome as a survey taker than having to pick some wrong choice because there is no right one. Okay, there are actually many things more irksome than that, but you know what I mean.

Sorry today's post is a short one. It turns out that not all subjects can be conveniently divided into perfectly-sized chunks of knowledge! Check back tomorrow for part 8, where I'll cover two easy-to-write questions that should be on every survey -- and can make or save you lots of money.

--
This is part 7 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Monday, December 17, 2012

How to write great user surveys - Part 6


This is part 6 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.


Polarizing Questions


It's worth noting here that there are some situations where you do only want one bit of information from your users: is your site/product/whatever acceptable or not? I suspect that this is often the motivation behind polarizing questions like my example in part 5. The key to asking polarizing questions safely, though, is to divide the solution space into exactly two parts. The easiest way to do that is usually to ask very general questions with general answers, or very specific questions with very specific answers. Here's a general example:
How does our site make you feel?
  • :-)
  • :-(
This question is about as general as it gets, but it can accurately condense a huge variety of complex feelings into two simple options -- good enough or not. Ikea.com uses lots of smileys (and frowny faces) in its surveys. This can be very effective if used properly -- it's just important that you're careful not to read too much detail into what your user meant by selecting :-).

Now here's a specific example:
Are you satisfied with how Gmail handles sending and receiving email?
  • Yes
  • No
When you ask specific questions, there's still going to be a little variability in the accuracy of the answers. For this example, some users expect wild innovation and are satisfied with nothing less. Those people may have a perfectly acceptable Gmail experience but still consider themselves dissatisfied. But the more specific you get in your questions, the more you can combat that effect. Consider this even more specific example:
When using Gmail, are you usually able to send and receive emails without problems, errors, or frustration?
  • Yes
  • No
Even users with the highest expectations for innovation and UX excitement would have trouble saying no to that question. (Assuming that they aren't having problems sending and receiving email, that is.) By being more specific, you've restricted their feedback to only what you want to know: is it good enough or not?

Now let's look at some examples of polarizing questions that aren't specific enough:

  • Are you satisfied with Gmail? (Gmail does lots of things -- is this question asking about all of them or just email?)
  • Does Gmail meet your needs? (Needs for what? Integrating with Google Calendar? Accessing mail on your phone? Handling multiple accounts? Almost everybody will have a complex answer for this.)
  • Do you like Gmail? (In an absolute sense? Relative to its competitors? Including all its features?)
  • Are you able to accomplish your goals with Gmail? (What goals? Email only? The rich-text editor? Attaching files easily?)

If you find yourself asking a polarizing question, your first thought should be "Am I sure that this has to be polarizing?" If 25% of your users think your product is "meh" and you want them to select "good" or "bad," don't you want to hear from those meh users? That's useful information, and you're spending time and money to get their opinions. Why would you throw it away by forcing a polarized answer? Furthermore, making the "meh" users choose will introduce hidden inaccuracies into your results: you'll have no way of knowing how many respondents are actually giving you their true opinion. What if it's 50%? What if it's 80%? You won't know.

Having no information on your users is bad, but having false information can be much, much worse. You should think very carefully before going this route.

Tomorrow in part 7 I'll cover one last way that poor answer options can give you bad data.

--
This is part 6 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me

Friday, December 14, 2012

How to write great user surveys - Part 5


This is part 5 of my series on writing a good user survey. If you're just joining us, you may want to start with part 1.

Step 5: Allow for the entire spectrum of possible answers.

One of the most common mistakes I see in surveys is that the test-maker didn't give the user enough options to express his opinion accurately. This happens constantly. Consider the following examples:
Did your experience with the site today make you more or less likely to recommend us to a friend?
  • More
  • Less
Hypothetical user Bob: "The site is great, but I practically never recommend sites to people."
or
Hypothetical user Jay: "I don't know; the site is fine. It doesn't really affect my odds of recommending. Whatever, man."
or
Hypothetical user Randall: "I recently had crappy customer service from you and I'm taking this survey with plans to express that in one of the questions. Your site is fine, but I'm not recommending you to anyone. You guys suck."
Now think about those users. They're probably in the middle of this survey. They've already invested some time in this, and they want to finish it so you can hear their opinions. They all think "I don't know what to put here... I wish there were more options." Let's look at how each user can affect your bottom line:

Bob: "The site is great, but I practically never recommend sites to people."
Bob isn't too bothered -- this question doesn't really apply to him. If he's smart and is thinking about how you'll perceive his answers, you may still get valid data out of him. He could think "Well, I don't recommend sites very much, but this site is pretty good. I'll tell them 'more'." He's not too risky, because he probably won't give you a wrong answer.

Jay: "I don't know; the site is fine. It doesn't really affect my odds of recommending."
Jay is ambivalent but dangerous -- whichever answer he picks will be wrong. And if that's the case for a substantial percentage of your users, just think what an expensive error that could be! If a bunch of those ambivalent users choose "less," that could translate to thousands of hours of work by your software engineers and designers -- all to fix a site that is already fine. That's not a cheap mistake!

Randall: "I recently had crappy customer service from you and..."
Randall isn't going to do you any favors either. He's already unhappy with you, and this survey is giving him an outlet to express that. He may or may not have an opinion about the quality of your site, but without a neutral option he's far likelier to give you negative feedback because of his state of mind. Like Jay, he can contribute to a very costly snipe hunt.

(Of course, if Randall thinks that taking a survey is the easiest way to express his dissatisfaction to you, you're already doing it wrong. He should be swimming in opportunities to tell you -- in your product itself (if possible), on the main page of the site, in your promotional emails, etc. But that's a subject for another article.)

It's worth noting here that there are some situations where you do only want one bit of information from your users. Check back tomorrow for part 6 where I'll talk about that special case, what's special about it, and how to do it right.

--
This is part 5 of my series on writing a good user survey.  If you want to start at the beginning, head back over to part 1.

Hire me, and then I'll write your next survey! I'm available for full-time or consulting work -- email me