Planning usability testing – Writing test plans

Friday 28th of July 2017
Written by: David Humphreys

“In preparing for battle I have always found that plans are useless, but planning is indispensable.” – Dwight D. Eisenhower

This article is part 1 of a 3-part series on usability testing. Planning is a critical part of the usability testing process. Without planning, your testing activity runs the risk of disorganisation at best and misleading or useless results at worst. Spending time planning reduces the risk of the testing going off the rails and enhances the potential for great insights. This article explores the reasons for planning your usability testing process and just what you should think about. Hint: we mention checklists quite a bit.

Why plan?

The Eisenhower quote above suggests that it is the act of planning that is important, not the plan itself. Carefully considering the needs of the situation, in this case usability testing, and allowing for as many variables as you can (there are always more) is more critical than the plan document you end up with. The plan for its own sake is “useless” according to Eisenhower, but the act of sitting down and thinking about what you need to achieve and how you are going to do it is essential to success. You will never have infinite resources to run any testing project, and even if you are blessed with vast resources, using them wisely will allow you to save the excess for future activities.

Writing your test plan

There are four critical things you need to consider when running usability testing activities of any size or scale. These considerations are:

  • What are your research goals?
  • What tools and methods will you use to achieve them?
  • Who are you going to test with?
  • What resources do you need to achieve those goals?

The very first thing to consider is: what is the goal of your research? Understanding what you want to achieve will determine how you achieve it. Think about:

  • What are your research goals? Are you trying to increase conversion? Are there suspected problem areas with your site or system? Do you have disagreement in your team about a particular design element? Are you trying to determine why users are abandoning at a particular point in a process? Decide what outcomes are important. Hint: check the project and business goals too.
  • What test artefacts or deliverables will you produce after testing? Are you going to produce a test report, an issues register in Excel, or a PowerPoint presentation with video clips?
  • What is your methodology? What tools and methods are you going to use to achieve your goals? Examine your research goals and your test assumptions and design an approach to meet them. We often use a range of tools and methods beyond simple scenario-based task testing, to do things like explore user page impressions and visual hierarchy, pre- vs post-use impressions, and brand alignment using semantic differential scales, or even mix IA testing using tools like Treejack with usability testing of a site or prototype.
  • Determine who you will test with and how many participants you need. Who are your users and who do you want to test with? The general rule is 5-10 participants from each unique user group you are testing with. Avoid friends, family and co-workers as they are often not reflective of your actual target audience and may have knowledge that your users typically do not have. We generally recommend using a professional market recruitment agency to find participants as they are worth their weight in gold.
  • Test measures. How are you measuring success? We often measure task efficiency using a scale: easy/medium/hard/fail. You may look at error rates if testing a system. Task completion time might be important (although there is a trade-off here as users can’t think aloud while being timed). Measuring subjective user satisfaction is also usually important. A rough sketch of how these ratings can be tallied follows this list.
  • Write test scenarios. This gets back to your testing objectives. What is the purpose of the testing? The tasks you are asking your participants to complete should reflect those goals. Be sure to write tasks in context, i.e. give a sentence of context before the actual task, for example: “You’re planning a weekend away and want to know if the venue is open on Sundays. Find the opening hours.”
  • How much time do you plan to spend on each task? It’s important to prioritise your tasks. You won’t have time for everything (we rarely go over an hour), so make sure you’re targeting your high priority goals first based on your users’ most common goals. Be sure to run a pilot test before the day of testing to confirm your timing too – you’ll often have to adjust to fit things in.
  • Lastly, are there any constraints to the testing or limits to scope? Are you dependent on a critical system or person? Are results due at a certain time, or are other projects dependent on getting testing results by a particular time or in a particular format? These constraints will influence your approach and deliverables. A project designed to test a prototype that will be quickly iterated is likely to need simpler reporting than one trying to prove a business case.
  • Location & equipment. Where are you testing? A lab? In the field? Are you testing onsite? What equipment do you need? You don’t need a lab for testing but it can help streamline your analysis, communicate findings and allow your stakeholders to unobtrusively observe sessions.  We use Morae to record test sessions but if you have a Mac you might try Silverback.
  • Testing team. Who will moderate test sessions?  Often it is good to have two moderators to ensure you stay fresh and provide different perspectives. Will you have observers logging data?  Who is writing up the results?
  • Paperwork & materials. What paperwork do you need? There are often a bunch of forms you need and there are many templates available. If recording, you will need an informed consent form. A moderator guide is also useful for recording results in each test session. Task cards are useful for participants to refer back to during testing.
  • Create a checklist for all the items above. Have you made that checklist yet? Consider using Steve Krug’s testing checklist. Getting the basics right can help you concentrate on the important things: the testing itself and making the participants comfortable on the day. The less you are worried about the forms, the right money in the envelopes or the technology crapping out on you, the better your test session will be.
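
As a minimal illustration of the test measures above, here is a rough sketch (in Python) of how task ratings on the easy/medium/hard/fail scale could be tallied across participants. The session log, participant labels and task names are hypothetical examples rather than a prescribed format; in practice the same tally is just as easy to keep in Excel or whatever issues register you chose as a deliverable.

    from collections import Counter, defaultdict

    # Hypothetical session log: one (participant, task, rating) entry per task
    # attempt, using the easy/medium/hard/fail scale described above.
    results = [
        ("P1", "Find the opening hours", "easy"),
        ("P1", "Book an appointment", "hard"),
        ("P2", "Find the opening hours", "medium"),
        ("P2", "Book an appointment", "fail"),
        ("P3", "Find the opening hours", "easy"),
        ("P3", "Book an appointment", "fail"),
    ]

    # Tally ratings per task so problem areas stand out at a glance.
    by_task = defaultdict(Counter)
    for participant, task, rating in results:
        by_task[task][rating] += 1

    for task, tally in by_task.items():
        total = sum(tally.values())
        print(f"{task}: {dict(tally)} ({tally['fail']}/{total} failed)")

Even a simple summary like this makes it obvious which tasks need attention before you get to richer measures such as completion time or satisfaction scores.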

Run a pilot test

There are a couple of key things you can do to ensure a smooth test day. The first is to run a pilot session. Get a participant (they don’t need to meet the recruitment brief exactly, but find someone as close to it as you can without paying for recruitment) and run them through the test plan exactly as if you were running a normal session.

This will do a few things for you:

  • It allows you to test timings and prioritise or shorten if too long.
  • It lets you ensure the scenarios are understood by participants.
  • It confirms the test methods you are using are going to give you the results you planned.
  • It enables you to identify technology issues with your testing environment or the system you are testing, e.g. bugs you need to be aware of during testing if the system is not live yet.

You should run a pilot, at the very latest, the day before testing to give you enough time to make any tweaks or adjustments you need to, and to allow you to produce and print the final versions of tools like the moderator’s guide.

It is best to have everything prepared the day before testing so you aren’t worrying about it the next morning. Having a checklist with everything that needs to be considered is a great idea as it takes a lot of that ‘need to remember lots of details’ away. Just follow the checklist. Did we mention a checklist?

Oh, and the last thing on our internal checklist is often overlooked but very important.

Make sure you get a good night’s sleep!

Resources

Usability.gov usability test plan template

Usability test plan dashboard (Userfocus)

Usability testing checklist (Steve Krug)
