February 12, 2014
Usability Testing: Getting the Most out of Your Test Participants
One of my first assignments was to conduct training sessions for content managers on our content management system at the time, Vignette. I assumed that content managers would have a certain level of fluency with software and web tools. I was proved wrong within the first hour of my first class, as I watched the class struggle with some of the basic functions and features.
That training class quickly turned into a user testing session, and while the content managers learned the system just fine, I walked out with a list of findings that could improve Vignette. It was an unplanned usability test. I have since had the opportunity to conduct several usability tests for websites and web applications, and I would like to share some of the lessons I learned.
For those of you who attended GOVTalks: Usability or GOVTalks: Understanding Our Audience, you heard us say how important usability testing is and why you should be doing it often. Peter Lee covered some of the important tools for analyzing web usage, and Pooja Berrong covered the process of doing a heuristic evaluation, a very effective exercise for finding the low-hanging fruit. While heuristic evaluation covers certain aspects of your site, testing with real users gives you a lot of specific, actionable information.
The most common test where you, the moderator, interact directly with the user (or test participant) is the interview-and-observation test: you observe your test participants (anywhere from five to ten) use your website or web application and then discuss the experience with them. Here are a few pointers to help you run a smoother user test:
1. Know what you are looking for.
If you don't know what you need to learn from a usability test, you are as much in the dark as if you had not conducted one at all. Find out from your stakeholders, program director, or communications director what the purpose of your website or application is. What are users expected to do? What key information do the stakeholders want from the test? The more of this you know, the better prepared you will be to design your questions.
2. Let users know you are testing your site and not them.
When people sign up as test participants, they are usually doing it to help you or your agency, or to be part of a new experience; state agencies can rarely afford to pay test participants what private companies do. Whatever the reason your participants showed up, they are certainly not there to show off their browsing skills. On the contrary, they often get uncomfortable as soon as they know they are being watched. You want to see them in their element, so without putting a spotlight on their behavior, remind them that you are only testing the system, not them. Not being able to find something is not a user issue but a system or tool issue. They might need some initial help, so feel free to answer their questions, as long as the question isn't "Now where do I click?"
3. Let go and breathe!
Not every user is going to be your rock star who reads your mind and clicks on the exact pixel; I have yet to meet mine. There is a good chance that users will not click where you need them to and will choose a different click path. It's okay for them to do this. Unless they quit your site and start browsing YouTube videos, don't keep reeling them back to the exact page where they missed the correct click. The journey is important, so document where and how they landed. You never know what they will uncover for you in their detour.
4. Read their mind; well, try to.
It is natural for you as a moderator to get uneasy when a test participant takes a detour. "Maybe they did not understand the task, or maybe they are lost?" you keep telling yourself. One easy way to find out is to ask them. I don't mean interrupt them after every click; rather, before you explain the task, ask your test participant if they feel comfortable thinking out loud. This does not come naturally to everyone, but it is not that difficult. If needed, demonstrate what you mean by thinking out loud. Once they start verbalizing their thought process, there are all sorts of gems you can capture, not only about the core task they are working on but also about the peripheral elements they interact with.
5. Don’t do this alone.
Ideally, you should not be conducting user tests by yourself. Get some help capturing the findings so you can reserve your attention for analyzing, answering questions, and, more importantly, keeping the test participant at ease. Nothing is more important for a moderator than making sure the participant is in the same mental state they would be in if they were using your website or application unmonitored. Sometimes you will need to modify the tasks based on the participant; spending some time in light conversation with your participant ahead of the session will help if you need to steer away from the specific task.
6. It's not me, it's you.
Yes, remind them that the website is for the user, not for the agency, the board members, or even the commissioner. Do your best to convince them to be honest. If they associate you with the website or the agency, they might hold back constructive criticism. There are various ways to address this; some experts suggest having users test competitor or similar websites first and eventually landing them on the website you actually need to test. By the time they reach your site, they should have shrugged off their initial inhibitions and can give you an honest first impression.
7. What’s wrong with me?
Last but not least, do not ask the test participant to solve the usability flaws you uncover. Don't ask questions like "What would you do if you were designing this page?" Your job is to get the participant to uncover the gaps, not to have them fix the gaps or redesign the system. Sometimes test participants feel obligated to offer solutions to the problems they discover; if they do, just make a note and thank them. Do not encourage them to take on the design role. This keeps your test on target, and the participant will move on to uncover more gaps rather than spend time coming up with design strategies (experience speaking here).
There is no secret recipe for successful usability testing. The more testing you do, the more you will find out what works for you. Test a lot, improvise a lot, and don't cling rigidly to a script. Remember, you are not just seeking answers to the questions you have drafted; you are looking for problems to solve.