Last week at Newfangled we ordered a few pizzas and gathered ’round our conference table for our monthly lunch and learn session. For my presentation, I wanted to review the simple usability testing process, show the team some examples of usability tests performed on our clients’ sites, and share some lessons I’ve learned so far from my experience.
Usability Testing Basics
In case you missed Chris’s recent newsletter on simple usability testing and my recent blog post sharing tips for the testing process, here’s a brief review:
Making a Test Plan
A good place to start is with a homepage orientation: give the volunteer one to two minutes to observe and interact with the homepage (without actually clicking any links) and then ask them to explain what they understand the site to be about. The remainder of the test should consist of three to four tasks that correspond to the main goals of the site, such as subscribing to content, completing a purchase, registering for an event, or requesting more information.
Finding the Right Volunteers
Since the goal of these simple usability tests is to uncover problems that almost any visitor to the site would encounter, it’s not necessary to recruit volunteers who represent the site’s target audience (although you may find it helpful in some cases…more on this later). The ideal volunteer is someone unfamiliar with the site you’re testing but comfortable using the web in general. I’ve had success recruiting family and friends, and Chris talks about how we enlisted the help of our upstairs neighbors from BlogAds in his case study of the usability tests we performed on our own site. In general, I’ve found that people with more talkative personalities tend to make better volunteers (not to discriminate against the shy types, as I’m cut from that cloth myself).
Preparing the Test Area
It can be tempting to go for convenience when you’re deciding where to perform the tests, but a quiet place free of distractions is a must. I made the mistake of choosing a friend’s apartment as my first testing site and had to throw out most of the footage (he has a dog). It’s a good idea to clear the cache and cookies on the test machine and to turn off or disable anything that might disrupt the test: instant messaging, Twitter alerts, calendar invitations, and scheduled virus scans, to name a few. We’ve found that the best setup is a laptop set on a stand with an external mouse and keyboard.
Four Lessons Learned (and counting…)
Don’t be afraid to ask follow-up questions.
Sometimes it’s difficult to know when to ask questions and when to stay quiet. Ideally, you want the volunteer to complete each task as if they were sitting at their computer at home without you (the facilitator) there to answer any questions. However, don’t let that make you feel like you’ve taken a vow of silence. Narrating your every thought and action as you navigate through a website isn’t intuitive or natural for most people, so volunteers may need some coaxing as they get comfortable with the process. If a volunteer is being particularly quiet, don’t be afraid to ask questions like “What are you looking at/reading now?” or “What are you thinking?” If someone reacts with surprise but doesn’t say why, asking “Is that what you expected to happen?” will probably give you extremely helpful information that would otherwise have been lost. The key is to remain neutral in any questions you ask.
Don’t be afraid to move on if the volunteer is stuck on a task and doesn’t know how to proceed.
If a volunteer clearly understands what you’re asking them to do but is having trouble completing a task, feel free to move on to the next one. When it seems you’re not likely to learn more by continuing, it’s probably a good idea to move on for time’s and sanity’s sake. Once you hit a point where it’s clear that something isn’t working and the volunteer can articulate the problem, you have what you need. Don’t feel like you have to let them struggle through to the bitter end.
Volunteers probably won’t do what you expect them to do.
There were several times when I thought the volunteers would uncover something I saw as a problem, but they completely bypassed it or didn’t comment on it at all. The simplest explanation may be that what you think are problems really aren’t. However, taking some time to review and reflect on the test could shed light on alternate, tester-induced causes.
Recently, I had a friend complete a usability test on the site of a client who specializes in family-friendly, expertly planned western driving adventures, complete with RV rental and a personalized guidebook. Naturally, this client ultimately wants users to contact them to learn more about a specific trip, so my test included the following task:
You’re interested in taking a summer trip with your family of four. You would like the trip to be one week long and would like to see the Grand Canyon as part of your adventure. Find a trip that meets your criteria, find out what’s included in the trip and how much it costs, and contact the company to learn more about the trip.
My friend proceeded to quickly find the first trip that mentioned the Grand Canyon, scan the trip detail page a bit, and submit a form after checking off a random assortment of that particular trip’s optional activities (without actually reading about them, of course). In hindsight, I realized that my 27-year-old single male friend would probably have a hard time getting into the mindset of a person planning a summer vacation for his wife and kids. I redid the same test with a friend who actually has a family of four, and the results were much more indicative of the average user coming to that site. Which brings me to my last lesson…
Always be ready to do another test
I’ve written about this point previously, but I didn’t realize how true it was until I was preparing for my presentation. I started with the goal of doing three simple tests to show as examples and ended up doing a total of seven. You may encounter a problem that warrants further investigation, realize that tweaking the wording of a task and retesting could provide much more valuable insight, or just feel that a test is “off” in a way you can’t explain. Since these tests should ideally take only 5-10 minutes of your time, don’t hesitate to do another test, or even another round.