How I teach usability
I prefer a hands-on, practical approach to teaching usability testing.
Aside from my consulting practice, I also teach a few university courses in technical writing. One course is usability and human factors, which I've taught for several years now, including an undergraduate and a graduate version. I've experimented with different ways to teach usability testing to students, but I've always come back to a hands-on approach where students learn about usability testing and then practice it themselves.
I am currently teaching a course on usability testing for technical communicators. Because it is an online, asynchronous course, we do not meet at a specific day or time; instead, we work together through an online learning system to discuss topics. Each week begins on Sunday, when I share a pair of videos: one that looks back to review the previous week's work, and another that looks forward to the coming week. Every week, students read articles, book chapters, and other resources, and complete two assignments: typically a synthesis due Wednesday and a practice due Friday.
The course is structured into three main units: learning about usability testing, practicing usability testing, and performing a client usability test.
Unit 1. Learning about usability testing
To provide a solid foundation for the course, we spend the first few weeks learning about "what is usability" and "different ways to do usability testing" for any kind of information product. An information product might be documentation, instructions, websites, or software interfaces.
We also discuss ethics in usability testing, such as why it's important to respect the tester, including not asking the tester to submit their personal data into a web form just for the sake of a usability test. For example, if a test requires that the tester enter some information, we must provide the tester with fake credentials, such as a user ID like test01, a dummy email address, or a generated username.
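For instance, a simple way to hand each tester a throwaway identity is to generate it before the test session. The sketch below is just an illustration in Python (the helper name and fields are my own, not part of the course materials): nothing in it belongs to a real person, so the tester never has to expose personal data to complete a task.

```python
import secrets

def make_test_credentials(session_number: int) -> dict:
    """Build throwaway credentials for one usability test session.

    Nothing here belongs to a real person, so the tester never has to
    enter personal data just to complete a scenario task.
    """
    user_id = f"test{session_number:02d}"          # e.g. "test01"
    return {
        "user_id": user_id,
        "email": f"{user_id}@example.com",         # example.com is reserved for testing
        "username": f"{user_id}-{secrets.token_hex(3)}",  # random, non-identifying suffix
    }

# One set of fake credentials per planned test session.
for n in range(1, 4):
    print(make_test_credentials(n))
```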
At the end of this unit, students perform a design analysis of an everyday thing. This kind of review is also called a heuristic review because it involves an expert providing their opinion of the usability of an item, usually according to certain usability factors.
Unit 2. Practicing usability testing
With this solid grounding in "what is usability," we then turn to a more typical moderated usability test. Week by week in this unit, the students learn how to design a usability test: starting with audience analysis, then personas, use scenarios, and scenario tasks, and finally how to perform a remote usability test.
The target for this usability test is our department's MS and Graduate Certificate website, and I serve as the "client" representative. Because I am both the client and the instructor, this is a low-stakes, low-pressure usability test. If students "fall down" during this usability test, that's okay; I am the instructor, and I know this is meant to be an educational exercise.
Since most of the students are MS and Certificate students in the technical communication program, they can start from a place of familiarity in writing their personas and use scenarios; they only need to cast their minds back a year or two, to when they were considering applying to the program. I encourage the students to craft personas and use scenarios based on their own experiences: write about who you are and why you applied to the graduate program.
The students write five personas and use scenarios. A persona is a fictional but realistic representation of the actual users of the system. If the persona is the "Who," then a use scenario answers the "What," "Where," "When," "Why," and "How" of that persona's interaction with the system. As the "client" for this usability test, I select the best personas and use scenarios from their submissions.
The following week, the students use those personas to craft five scenario tasks. A scenario task is the backbone of a usability test: it sets a brief context and then asks the tester to do something specific. Again, I select the best scenario tasks for our usability test.
This completes the design of a simple usability test. After that, each student performs a usability test with one tester, using the same scenario tasks. With twenty students in the class, this is effectively the same as running a single test with twenty testers. I coach the students on how to moderate their usability test and collect data. Afterwards, I collate their data into a "heat grid map" so we can talk about any "hot rows" or "hot columns" in the data.
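The exact format of the heat grid isn't shown here, but the idea is easy to sketch: one row per tester, one column per scenario task, and each cell recording whether that tester completed the task. In this rough Python sketch (the data and scoring scheme are made up for illustration), the "hot" rows and columns fall out of simple totals:

```python
# Collate one-tester results into a "heat grid": rows are testers,
# columns are scenario tasks, and each cell is 1 (completed) or 0 (failed).
results = {
    "tester01": {"task1": 1, "task2": 0, "task3": 1, "task4": 1, "task5": 0},
    "tester02": {"task1": 1, "task2": 0, "task3": 1, "task4": 1, "task5": 1},
    "tester03": {"task1": 1, "task2": 1, "task3": 0, "task4": 1, "task5": 0},
}

tasks = sorted(next(iter(results.values())))

# "Hot columns": tasks that many testers failed point to a design problem.
for task in tasks:
    failures = sum(1 - scores[task] for scores in results.values())
    print(f"{task}: {failures} of {len(results)} testers failed")

# "Hot rows": a tester who failed many tasks may say more about that
# session or that moderator than about the design itself.
for tester, scores in results.items():
    failures = sum(1 - v for v in scores.values())
    print(f"{tester}: failed {failures} of {len(tasks)} tasks")
```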
The last discussion in this unit asks how many testers we really needed in the usability test. We discuss Nielsen's "5 users" number and the assumptions underlying that assertion, including iterative testing and getting "good enough" data to make informed changes to the design.
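The arithmetic behind that number is worth making explicit. Nielsen and Landauer modeled the share of problems found by n testers as 1 - (1 - L)^n, where L is the chance that a single tester reveals a given problem (about 0.31 in their data). A quick calculation shows why five testers is often "good enough" for one iteration:

```python
# Nielsen and Landauer's problem-discovery model: the share of usability
# problems found by n testers is roughly 1 - (1 - L)^n, where L is the
# probability that one tester uncovers a given problem (L is about 0.31
# in the data Nielsen often cites).
L = 0.31

for n in (1, 3, 5, 10, 15):
    found = 1 - (1 - L) ** n
    print(f"{n:2d} testers -> about {found:.0%} of problems found")

# With L = 0.31, five testers find roughly 84% of the problems, which is
# where the "5 users is enough" rule of thumb comes from, assuming
# iterative testing and a reasonably uniform user group.
```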
Unit 3. Client usability test
With that practice, students are more confident going into the second half of the semester, where they work in a small team to perform a usability test for an outside client. In previous iterations of this course, we have worked with a university, an online "magazine" website, an overseas nonprofit, and a local government. To start this unit, I record an interview with the client and share it with my students; the students can watch the video to understand the client, their website, the audience, and the client's goals and success criteria for the usability test.
The students use that information to craft personas and use scenarios, which I filter, passing the most applicable ones to the client. For their review, I ask the client to sort the personas and use scenarios into three "piles": "Yes," "No," and "Maybe." When reading a persona, if the client thinks "we definitely have users like this," then that persona goes into the "Yes" pile. If the client responds with "I don't recognize this person," then that persona goes into the "No" pile. If the client thinks "it's a little off-target, it needs a few changes to hit the mark," then I recommend the "Maybe" pile. The goal of this quick review is to identify the personas that accurately describe the users: the "Yes" pile.
Based on that feedback, the students craft scenario tasks. Again, I review them and send the best scenario tasks to our client for their review. As the instructor, I assign the approved scenario tasks to the student teams. This semester, I have five student teams; teams 1, 2, and 3 are using the same set of six scenario tasks, and teams 4 and 5 are using a different yet similar set of six scenario tasks. This "divide and conquer" approach allows us to provide different perspectives to our client.
Over the next few weeks, the groups work independently to design, plan, execute, analyze, and report on their usability test. At the end of the semester, students submit a written test report with recommendations, and record a brief video to summarize the test and results for the client.
Because the client test is a group project, I also have the students complete an individual "final exam," which is basically a reflection on what they learned about usability testing.
Course outline
It may be easier to understand the course outline by representing it visually. This table shows an overview of the semester, with the weekly topics and assignments. To keep this brief, I did not list the assigned articles and chapters for each week.
Week | Topic | Due Wednesday | Due Friday |
---|---|---|---|
1 | Welcome and introduction | Introduce yourself online | What is usability? |
2 | Understanding usability | Usability test methods | Human factors and user-centered design |
3 | Usability research | Design analysis of an everyday thing | Usability research ethics |
4 | Understanding the audience | What are personas and use scenarios? | Write 5 personas and use scenarios (1-person test) |
5 | Designing the usability test | What are scenario tasks? | Write 5 scenario tasks (1-person test) |
6 | Our first usability test | How to moderate remote usability tests | Usability test results (1-person test) |
7 | Understanding the results | Heat map grid report | How many testers do you really need? |
8 | Spring Break week (no classes) | - | - |
9 | Starting the client usability test | Client interview impressions | Write 5 personas and use scenarios (client test) |
10 | Designing the client usability test | Article review assignment (for graduate students) | Write 5 scenario tasks (client test) |
11 | Working as a team | Usability testing roles | Research questions and objectives |
12 | Usability test planning | Background questionnaires and debriefing questions | Report your test plan readiness |
13 | Usability testing | - | Test results first impressions (client test) |
14 | Preparing for the client report | How to make a great online presentation | Usability test recommendations |
15 | Usability test results | Usability test results presentation (client test) | Usability test results report (client test) |
16 | Individual reflection | - | Final "exam" (reflection) |
Note that most of the Wednesday and Friday assignments are structured as discussions, which require students to post their synthesis or practice, then reply to at least two other students.