Remote Usability Testing - A Christmas ghost story

Tuesday 1st of December 2015
Written by: David Humphreys

Ho! Ho! Ho! Merry Christmas! Uncle Dave here with a Christmas story about UX lessons relearned, and the ghost of an old client, to bring you a bit of UX joy.

The ghost of clients past

We were recently contacted by a client we had not heard from for years. In the interim, they had been building their in-house UX capability. Then, one bright sunny summer's morning in mid-December, we received a call that went something like this:

Client: Hi, we have a couple of different mobile designs. We are divided. We need an expert review to tell us which to use. We need an answer by COB tomorrow ...and don't want to spend a lot of money.

Peak: Sure. We can do that ...we won't pick one design, but we will catalogue the issues and help you make up your mind.

To us, it seemed straightforward: look at the two designs, apply our experience and knowledge, and identify the issues. Until a voice was raised in dissent, perhaps a Christmas angel (but more likely Tania), who said, "We need to test this. What if we are wrong?"

One of our core philosophies is 'Some testing is always better than none', so we told the client we would test their designs. But how could we do it within the time frame and budget (i.e. in a 24-hour window)?

A saviour is (re)introduced

There are a few tools we regularly use (Optimal Workshop's excellent suite, for example) that offer inbuilt online recruitment of participants, but their lead times all seemed too long for our needs.

However, we had noticed that fivesecondtest.com had broadened into a suite of products under UsabilityHub, and we knew fivesecondtest.com had always been very good at generating responses. The suite now included a click-testing tool, and we hoped that, with rapid responses from its global user base, we would get a quick enough set of responses to meet our client's tight time frame.

We anticipated that most of the respondents would come from the US overnight, while we slept. Not being terribly concerned about any potential cultural differences, we set up the test with 10 tasks per design, launched it at 6pm and went home.

Sort of...

Because, before we finished for the day, we already had half of our planned 20 or so responses for each task. Just a few hours behind Australia is India, and results were flooding in from the subcontinent. Again, this was not a significant concern, as English is a second language for many Indians, but it was an unexpected outcome nonetheless.

Expectations overturned

So what happened? If not a Christmas miracle, then certainly a Christmas reminder. Our first impressions had favoured one design over the other. As is often the case, one design was more visually appealing and 'modern looking', but it didn't test as well when users attempted common tasks. The other design was also marginally more efficient when we looked at time on task. Our first impressions and assumptions about which design was the most efficient and usable were disproved. The testing highlighted a few usability issues with the "pretty" design and reminded us of the value of testing with users, even under extremely tight timeframes.

Lessons learned and relearned

What did we learn? I think we re-schooled ourselves in some fundamentals and also learnt some practical lessons about rapid usability testing.

  1. Some testing is ALWAYS better than none - Ultimately, there were strong elements in both designs that should be incorporated into a hybrid design, but without the testing we would have overlooked some of the gems. Our initial reaction was that one design was better than the other; the results from the testing told a different story.
  2. Efficiency is a better measure than preference - Don't ask users what they think; observe what they do. We initially considered UsabilityHub's Preference Test, which asks users which design they prefer. However, if users haven't tried to complete tasks on the site, they tend to base their choice purely on visual aesthetics. By using the click test we were able to measure efficiency (task completion rate and average time on task), so we could give more informed recommendations to our client (see the sketch after this list).
  3. Plan your launch times - If it is important to target particular users, from a geographic location, language group or region, then you need to plan your launch times. We don't think the 44% of responses from India influenced the results significantly, but it might in other circumstances.
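To make the efficiency comparison in point 2 concrete, here is a minimal sketch of the arithmetic involved. The data, field names and figures are all hypothetical (this is not UsabilityHub's actual export format), and time to first click is used as a simple stand-in for time on task:

```python
# Minimal sketch: comparing two designs on click-test efficiency.
# All figures below are made up for illustration; a real click-test
# export would be parsed into a similar structure.

results = {
    "Design A": [  # (task completed?, seconds to first click)
        (True, 4.2), (True, 3.8), (False, 9.1), (True, 5.0), (True, 4.6),
    ],
    "Design B": [
        (True, 3.1), (True, 2.9), (True, 3.4), (False, 7.8), (True, 3.0),
    ],
}

for design, trials in results.items():
    completed = [t for t in trials if t[0]]
    completion_rate = len(completed) / len(trials)
    # Average time on task, counted over successful attempts only.
    avg_time = sum(seconds for _, seconds in completed) / len(completed)
    print(f"{design}: {completion_rate:.0%} completion, "
          f"{avg_time:.1f}s average time on task")
```

Even with toy numbers like these, the point of the exercise is visible: a design that looks better can still lose on completion rate and time on task, which is exactly what our test showed.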

Want to learn more about UX?

Sign up for our online newsletter with UX tips and the latest in user-centred design.

Find out more about our UX course and certification training courses.
