They say write what you know, so this week, as with my latest Usability Geek piece, I am writing about … writing. However, while last week’s article focused on UX writing, this one concerns itself with a different kind of scribing: the usability test script.
The foundation of any sound usability testing exercise, a well-crafted test script yields more in-depth data and more representative results. A poor one, in contrast, can skew your findings, leaving them unable to inform the rest of your design process and rendering the entire exercise useless.
I polled a few members of Codal’s user experience design team to learn how they build a usability test script from scratch.
1. Determine Scope And Subjects
Before you pick up a pen or open a new Google Doc, you need to firmly establish the scope of your test and recruit the subjects you will be testing with. Determining scope and enlisting interviewees deserves an article in and of itself, so for now just know that these two factors dictate how your script is written. To use screenwriting terms: the scope, or the tasks you will ask the user to complete, comprises most of the script’s narrative, while the subject is its protagonist.
2. Ask For Consent To Record
All usability tests should be recorded for later analysis, and whether that is via webcam, digital recorder, or screen capture program, you need to inform your test subject that you plan to record them, and obtain their explicit consent to do so. Obtaining consent not only makes the subject more comfortable (especially if you are pointing a camera at their face); failing to get it compromises the integrity of the test.
3. Begin With Preliminary Information
Before you dive into the bulk of the test, be sure to document the subject’s basic personal information. That means recording the subject’s name, age, occupation, or any other data relevant to your test. You can do this in the form of a question (e.g. ‘what is your name?’, ‘how old are you?’, ‘have you used rideshare apps before?’) or simply requesting their confirmation (‘please confirm your name is X, your age is Y,’ etc.).
Most designers prefer to use the former – it better mimics a natural conversation and generally makes the subject feel more at ease. As always, be sure to record the information in multiple places. Files can get corrupted, so it is always practical to have a backup, just to be safe.
4. Reassure The Subject They Are Not The Ones Being Tested
It is natural for test subjects to feel as if they are the ones under examination, rather than the platform or prototype you are testing. Be sure to assuage those fears by reinforcing the fact that you and the subject are on the same team; that you are both working together to help make a better product. Be honest and open with your subject, and you will cultivate an environment more conducive to collecting representative, empirical data.
5. Encourage Them To Voice Their Thought Process
So much of UX design is attempting to capture split-second decisions, to influence impressions that users form almost instantaneously. It is difficult to determine what is happening inside a user’s mind while they are interfacing with your prototype. The user may not even recognise their own subconscious reactions to it.
That is why it is so crucial to encourage your test subject to think aloud and give voice to their thoughts, actions, and opinions as they use the platform. Narrating your thought process can feel awkward (especially in front of strangers), which is why so much of your usability script should be dedicated to making the subject feel at ease.
6. Give Them An Opportunity To Ask Questions Before Beginning
It is a single line in your script, but an important one: make sure you and the subject are on the same page before assigning tasks. If applicable, it can also be helpful to inform the subject that they can ask questions any time during the test.
7. Remove Bias From Your Statements
Most usability tests are task-based: a proctor assigns the subject an action or series of actions to complete. These vary depending on the platform’s use cases, but common ones include locating a specific product, making a return, or creating a profile.
Make no mistake: the language you use when assigning these tasks will affect the outcome. Consider the difference between “now, see if you can try to find the men’s Oxford dress shirts” and “now, find the men’s Oxford dress shirts”. It may not seem like much, but the former’s wording subtly implies that the task will be difficult, or that the platform is expected to deliver a poor user experience.
Any time the test subject thinks they are supposed to answer a certain way, it taints their response. This bias is a well-known phenomenon that plagues all qualitative research, and while it is difficult to remove completely, your script should do its best to avoid it.
8. Keep Questions Open-Ended
While the user completes the task, the proctor will sometimes ask questions about different parts of their experience. If you plan on including interstitial questions like these in your test, you should ensure they are concise, but also leave room for the user’s thoughts, opinions, and feelings. Avoid yes or no questions like “did you enjoy the experience?” and instead opt for the much broader “how did you feel about the experience?”.
If you are still struggling with your script, or maybe just need some inspiration, there are plenty of templates out there to help you get started. You will find they follow most or all of the tips I have listed here. Here is one I am partial to, written by web design guru Steve Krug.
If you want to learn other best practices for usability testing, feel free to check out Usability Geek’s comprehensive vertical. Beyond the tips I have listed above, the best general writing advice I can give is simply to get started.
(Lead image: Depositphotos)