Expediting the Usability Testing Process


Robert Bailey and Kent Bailey

One constant of usability testing, in whatever form it takes, is that there is a lot of data to go through. The tool and process that Bob and Kent were demoing is something they have been building to help expedite the testing process.

The tool itself is in the early stages of development. They have been using it on some government Web sites recently, but they also spoke often (usually in answering “will it do x” questions) about features they are going to add.

The tool is split into two interfaces: one for test management and one for test facilitation. The idea is that the tool can be used remotely or in a lab setting. You can choose to track many different aspects of interaction, set the starting and stopping points of a test (via an entered URL), and set how many steps the user can take before the tool sets them back on track. It tracks mouse clicks, and whether the user used the mouse wheel or dragged or clicked the scroll bar.

Everything is, of course, time-stamped and sequenced in order of action. The interface for the tool could use some work; it is still a very plain GUI with a gray background and everything, but as I said before, they are still building the tool. It is not Web-based, which could be a detractor. You have to download it, but they have set it up so the tool deletes itself after sending the data back to the specified address.
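To make the tracking concrete, here is a minimal Python sketch of what a time-stamped, sequenced interaction log could look like. The event names and structure here are my own assumptions for illustration, not the tool's actual format:

```python
import time
from dataclasses import dataclass


@dataclass
class InteractionEvent:
    """One recorded user action, sequenced and time-stamped."""
    sequence: int
    timestamp: float
    kind: str      # e.g. "click", "scroll_wheel", "scrollbar_drag" (hypothetical names)
    detail: str = ""


class InteractionLog:
    """Appends events in order; each gets a sequence number and a timestamp."""

    def __init__(self):
        self.events = []

    def record(self, kind, detail=""):
        self.events.append(
            InteractionEvent(len(self.events) + 1, time.time(), kind, detail)
        )


# Example session: two actions recorded in order.
log = InteractionLog()
log.record("click", "Search button")
log.record("scroll_wheel", "down 3 notches")
```

The point is simply that once every action carries a sequence number and timestamp, compiling and comparing sessions later becomes a data problem rather than a note-taking problem.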

The tool they showed helps standardize the features of your test, so you set up all your failure and success criteria before the test actually happens (isn’t that a good idea?).
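As an illustration of pre-registering criteria, a setup like this could work; the field names and URLs below are hypothetical, since the tool's actual configuration format wasn't shown:

```python
# Hypothetical pre-test configuration: success and failure criteria are
# defined before any participant sits down, not improvised afterward.
test_config = {
    "task": "Find the clinical trials page",
    "start_url": "https://www.example.gov/",                # where the task begins
    "success_urls": ["https://www.example.gov/trials"],     # reaching one of these = success
    "max_steps": 12,                # steps allowed before the tool intervenes
    "time_limit_seconds": 300,     # hard stop: past this, the task is a failure
}


def is_success(visited_url: str) -> bool:
    """A task succeeds when the participant reaches a designated success URL."""
    return visited_url in test_config["success_urls"]
```

Because the criteria are fixed up front, every participant is scored the same way, and the results from multiple subjects can be compiled mechanically.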

Someone in the audience asked how this tool compared to Vividence or similar online tools. Bob mentioned that they share aspects, but the idea behind their tool is to make tests more effective and efficient for the tester, which he said was a different impetus from the other products mentioned.

The tool also has a satisfaction questionnaire built in.

Requirements used in building the tool (overview list for the purposes of the presentation):

  • Present test subjects with one or more tasks to perform using the Internet or intranet
  • Transparently watch what they do & record all interactions
  • Time them
  • Intervene when they go too far astray
  • Know when they are finished
  • Elicit comments and explanations from them
  • Compile and analyze results from multiple subjects
  • Automatically generate reports in a standardized format
  • Be easily configured by non-technical personnel

Things the tool reports on:

  • Automatically produces a report
  • Offers standardized reporting
  • Automatically conducts statistical analyses
    • Statistical significance (t-tests, F-tests)
    • Correlations (e.g., age and task success)
    • Detection of outliers (> 4 standard deviations)
  • Automatically compares with
    • Usability objectives
    • Previous usability test results
  • Can produce a test report the same day the data are collected
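As a sketch of what the simpler analyses above might involve, here is a stdlib-only Python version of the outlier rule and a Pearson correlation. The data are made up, and the tool's actual algorithms weren't shown; real t-tests and F-tests would need more machinery than this:

```python
import statistics

# Hypothetical per-participant data: task completion times (seconds) and ages.
times = [42.0, 55.0, 48.0, 51.0, 47.0, 300.0]
ages = [23, 34, 29, 41, 38, 52]


def outliers(values, threshold=4.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mean) > threshold * sd]


def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

One subtlety worth noting: an extreme value inflates the standard deviation it is measured against, so a strict > 4 SD rule (as the slide described) flags only truly wild observations.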

This would be a good tool for my work at my company, because when I am doing informal (not in-lab) testing I am often on my own, so it is difficult to be facilitator, note taker, observer, and tech support (if the prototype balks during testing :) all at once, especially if I am visiting multiple participants in one day. Often I am so tired that it is hard to sit down at the end of the day and compile notes. With something like this tool I could focus primarily on observation and facilitation.

And if you are wondering why I keep saying “tool,” it’s because there isn’t a name for it yet. :)

[Edit: Obviously I misunderstood the answer, so I edited the following paragraph based on information from the comment below.]

If you want to find out about using the tool, contact Bob or Kent. Based on my (mis)understanding of an audience question, I originally wrote that the tool was in the public domain because it was being developed for the US government (the National Cancer Institute, to be precise); as the comment below explains, that isn’t the case.

While some phases of development were under contract with some Federal agencies to use in testing of agency websites, Bob and Kent retain the copyright on the software. They expect to have a commercial version of the software available by the end of this year.


  1. Just a few comments.

    First, the automated usability testing tool we demonstrated at UPA is not in the public domain. Though some phases of it were developed under contract with some Federal agencies to use in testing of agency websites, we retained the copyright on the software. We expect to have a commercial version available by the end of this year.

    Second, I’d appreciate it if you’d change the wording in the final paragraph “If you want to find out about suing the tool…” from suing to using ;). We’d really like to get our relationships started on better footing than that. :)

    Third, you can add my email address as contact info for the usability testing tool.

    Kent Bailey, President
    Mind Design Systems, Inc.

  2. Thanks for the feedback, Kent. I fixed the post as requested.

    My apologies for misunderstanding the answer. Next time I will sit more toward the front so I can hear better. :)

Comments are closed.