So You Want to Conduct a Usability Test?

2.2.2016

Leading questions, asking too much of the user in too short a time frame, not asking enough, guiding instead of observing: avoiding these mistakes, and knowing the reasons behind them, is what separates the seasoned practitioners from the novices in the user experience field. I've been conducting user tests for years, and it's been an invaluable skill to hone as user experience, and the proper way to run usability tests, only grows in importance.

[Illustration: laboratory-type testing tools]

This post is the third in a series on user experience, with each discipline giving its take on a different facet of user testing.


Setting Up the Tasks
When conducting a usability test, the first step is to decide on a series of tasks. The tasks should be common actions a typical user would perform on your website, such as buying an item, requesting more information, or finding your phone number.

The key is to identify 5-10 typical use cases for your website and then design the tasks in such a way that you don't give away the answer.

When prepping for these tests, you must first test your script. Avoid leading questions; they can distort your entire test. Say you're a college working on the redesign of your recruiting website. You wouldn't want to say, "Go to the footer, click on the Contact Us link, and then complete the form to send us an email."

The wording of the task is important. A better way to say it would be "You've already applied for admission to the college, but you want to ask about the status of your application. Where would you send in a question?"
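One practical way to keep the wording consistent from session to session is to script the tasks ahead of time. Here's a minimal sketch in Python, using the college example above; the scenario wording and success criteria are just illustrative, not a prescribed format:

```python
# A minimal sketch of a scripted task list. The scenario wording and
# success criteria below are illustrative, not a prescribed format.
tasks = [
    {
        "id": 1,
        "scenario": (
            "You've already applied for admission to the college, but you "
            "want to ask about the status of your application. Where would "
            "you send in a question?"
        ),
        # Define success by outcome, never by the path you expect them to take.
        "success": "Reaches the Contact Us form and starts filling it out",
    },
    {
        "id": 2,
        "scenario": "You'd like to see the campus in person. Find out how "
                    "to schedule a visit.",
        "success": "Reaches the campus visit scheduling page",
    },
]

# Read each scenario aloud, word for word, in every session.
for task in tasks:
    print(f"Task {task['id']}: {task['scenario']}")
```

Scripting the tasks this way also keeps you honest: if you catch yourself wanting to add "click the footer" to a scenario, the task probably needs rewording, not the user.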

Recruiting Users
If you’ve done your recruiting correctly, you’ll have a good sample of typical site users. When it comes to tech knowledge, you want to avoid both ends of the spectrum, from computer nerds to absolute noobs. Computer nerds should be avoided, unless you’re designing a site for them, since they aren't your typical user. In general, you want to identify people in the middle of the spectrum, your average web user. A typical user at least knows how to use a web browser and click a link.

[Illustration: user demographic graph]

And tech savvy doesn't directly correlate with age. I once tested a 62-year-old woman who didn't know how to use a mouse. In the same study, I also tested an 82-year-old man who read the news online daily.

Depending on your demographics, you only need about five users for a usability test; there's no need to test with dozens. With four or five users, you'll start to see common patterns in your results.
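The math behind this rule of thumb comes from Jakob Nielsen and Tom Landauer's problem-discovery model: the share of usability problems found by n users is roughly 1 - (1 - L)^n, where L is the probability that a single user uncovers a given problem (about 31% in their data). A quick sketch of what that curve looks like:

```python
# Problem-discovery model (Nielsen & Landauer): estimated share of
# usability problems found by n test users, where lam is the chance a
# single user uncovers a given problem (~0.31 in their studies).
def problems_found(n, lam=0.31):
    return 1 - (1 - lam) ** n

for n in (1, 3, 5, 10):
    print(f"{n:>2} users: ~{problems_found(n):.0%} of problems found")
# 5 users already surface about 84% of the problems, which is why small,
# repeated rounds of testing beat one giant study.
```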

Starting a Test
At the beginning of the test, you want to introduce yourself and give a little background about what you’re trying to accomplish. It's important to make your testers feel comfortable. People tend to focus on the "test" part and feel like they want to avoid "failing." They need to know you’re testing the site, not them.

If they fail to complete one of the tasks, that means we haven't yet found the best design solution. And that's exactly what we want to find out.

[Illustration: user testing across multiple devices]

Conducting a Test
Encourage your users to think aloud as they work through the tasks. Don't interfere or give suggestions. Actually, avoid helping them in any way. When a user is struggling, it can be very awkward, sometimes painful to watch—but it speaks volumes.

If you see them struggling for a long time, prompt them to keep thinking out loud:

  • "In your own words, what you are you looking for?"
  • "What have you tried already?"

Your job is that of an observer. Don't interfere; instead, try to capture as much information as you can about the process the user is going through. Write fast. Capture mistakes, but more importantly, capture their thought process.

Some quantitative metrics you might want to capture are listed below; a simple way to log them consistently follows the list:

  • How long did the task take?
  • What was the first thing they clicked on?
  • How many clicks did it take to find the correct solution?
  • Did they end up on the wrong page? What page was it?
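Recording each task the same way makes results comparable across users and sessions. Here's a minimal sketch of a per-task observation log in Python; the field names are hypothetical, so adapt them to whatever your study actually needs:

```python
# A minimal sketch of a per-task observation log. The field names are
# hypothetical; adjust them to the metrics your study actually needs.
from dataclasses import dataclass, field

@dataclass
class TaskObservation:
    user_id: str
    task_id: int
    seconds_to_complete: float
    first_click: str                 # e.g. "footer > Contact Us"
    click_count: int
    wrong_pages: list[str] = field(default_factory=list)
    completed: bool = True
    notes: str = ""                  # the thought process matters most

# Example record from a single session:
obs = TaskObservation(
    user_id="P03",
    task_id=1,
    seconds_to_complete=94.0,
    first_click="main nav > Admissions",
    click_count=6,
    wrong_pages=["/admissions/faq"],
    notes="Expected a 'check application status' link under Admissions.",
)
print(obs)
```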

Once they have completed the task, let them know. If they struggled with the task, ask them, "How could we make it easier for you?"

Reviewing Results
If you've tested a wide variety of subjects, you will have seen many different types of users, and an even wider variety of user interactions. As you review your notes, ask the questions below; a simple way to tally the answers across sessions follows the list.

  • What were the common mistakes that people made?
  • Which tasks did people struggle with?
  • Was there any terminology that caused confusion?
  • What was their navigation style?
  • Did they use the navigation system as expected?
  • Did they do something unusual?
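Individual stumbles matter less than patterns, so it helps to count how often each issue appears across sessions. A minimal sketch, with hypothetical issue labels pulled from session notes:

```python
# A minimal sketch of tallying issues across sessions so the most common
# problems rise to the top. The issue labels are hypothetical.
from collections import Counter

# One list of issue labels per session, pulled from your notes.
session_issues = [
    ["missed the slide-out nav", "confused by 'Bursar' label"],
    ["confused by 'Bursar' label", "wrong page: /admissions/faq"],
    ["missed the slide-out nav", "confused by 'Bursar' label"],
    ["wrong page: /admissions/faq"],
    ["missed the slide-out nav"],
]

tally = Counter(issue for session in session_issues for issue in session)
for issue, count in tally.most_common():
    print(f"{count}/{len(session_issues)} users: {issue}")
```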

In one of our recent tests of a mobile website, I had a user who completely ignored the mobile navigation, a menu that slides out from the side and that we'd considered the most intuitive way to navigate a mobile site. Instead, he would scroll down and navigate through the entire homepage. Fortunately, the homepage was designed so a user could easily reach most of the important pages on the site. Believe it or not, he was able to complete nearly all of the tasks without ever using the menu navigation.

People navigate your site in different ways, so include in-line links in the copy of the site; some people think that way.

In the end, we want to determine what changes can be made to the site to make improvements. Small changes can have a big impact on user experience. At Sanger & Eby, we take these things into account with every web project. While there are industry standards and best practices, each website and target audience is nuanced. Part of our expertise is knowing we don’t know everything. That’s why usability testing—and knowing how to test—is so powerful. Contact us if you’d like to learn more about our process.
