HCD Qualitative User Testing Project (40 points)
Remember, while working on this project, take notes!!
Write down the names of people who help you develop tasks / scenarios.
Later, write down the names of the students who take your tests. (Not the same people!)
Web sites were randomly assigned to students by the professor.
Here is the list of web sites and students. (If you don't like your assigned web site, blame the alphabet.)
Why: We want to study first-time user learnability of web sites, creating a benchmark for future designs. (Future web sites should be as usable, or more usable. Not less!)
Purpose: We are looking for an easy-to-use interface where errors are unlikely and common tasks are easier and faster than rare tasks.
You will be looking for design improvements for the next web site, so pick some suitably hard tasks, ones that will strain the usability of your web site.
Where: Via video-conferencing. You will use 'share screen' tools, in combination with a webcam view of test subjects. That way, the test giver can see the facial expressions and the web browser interactions of the test taker.
What to test: you pick the tasks. Pick 5 to 6 or so tasks, applying insights from the textbook. Pick:
If possible, try to get all of your test takers on the same web browser and, ideally, the same version of the web site. (Many web sites have a desktop version and a smartphone version, and automatically display one or the other depending on a visitor's hardware.) If you cannot keep the web browser and web site versions consistent, record in your notes each test taker's web browser software and the web site version they visited.
Ask in class if you want advice picking tasks.
Create one or more scenarios (for two or more tasks each). (More scenarios is better. In the reflections of students who've done usability testing in previous semesters, the most common improvement students have mentioned is "I should have used more scenarios." Some students have said scenarios are more fun and some have said more scenarios would have improved the quality of their testing.)
Required pre-test questions for your test participants are:
Create one or more additional pre-test questions to ask each test participant. (Note that "what is your age?" is an inappropriately personal question, and also provides no useful insights for web site usability analysis.)
Create 2-4 de-briefing questions to ask each test participant after testing is over. At least one de-briefing question should, in some way, try to gather "impressions and opinions" from your test takers. Avoid yes/no de-briefing questions. Ask questions that provoke reflection and will give you insight into the user's experiences and impressions.
Write down a plan for actions you need to perform in between test participants, to clean up after the last test participant and prepare for a new one. For example, if you have your test participants go through most of the process of purchasing something from the web site, write down that you will need to empty your "cart" on the web site and delete any other records of the purchase process each and every time you prepare for a new test participant.
Keep track of errors: their frequency, type, and severity. We're interested in learnability, error severity and frequency, and errors of omission or commission.
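The error tracking above can be kept as a simple tally. Here is a minimal sketch in Python; the task names, error kinds, and severity scale are illustrative assumptions, not values prescribed by the assignment:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class ErrorObservation:
    task: str       # which task the participant was attempting (example names)
    kind: str       # "omission" (missed a step) or "commission" (wrong action)
    severity: int   # assumed scale: 1 = minor slip, 3 = task-blocking

# Example observations from one hypothetical session.
observations = [
    ErrorObservation("find return policy", "omission", 2),
    ErrorObservation("add item to cart", "commission", 1),
    ErrorObservation("find return policy", "omission", 3),
]

# Frequency by task and by error kind, for the report's error summary.
by_task = Counter(o.task for o in observations)
by_kind = Counter(o.kind for o in observations)

print(by_task.most_common())
print(by_kind)
```

A tally like this makes it easy to report, per task, how often new users stumbled and how severe the worst stumble was.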
Who: Tester: you, for your web site.
Test subjects: Two or more fellow students. The test results will be best if test subjects have never used your web site before.
(If you are not able to attend class on the testing dates, contact the professor promptly. To protect the welfare of test participants, usability tests performed outside of CS 3500 will not count for your project, unless you get the professor's permission ahead of time.)
We will use "think out loud" observation. "I would like you to think out loud and describe why you do things and how you feel about the way they are done." (RI, Ch 8)
Getting stuck is discouraging for subjects. Try to give test subjects an early success experience. Start with a task you think will be relatively easy. Remind the test subjects that they can quit at any time, for any reason.
"If subjects get stuck: While the experimenter should not help the subject with the task, there are a few exceptions to this rule."
If you have to give your user hints, discuss why in your report -- you may have found a usability problem for new users of your web site.
Remember to de-brief your test participants using your pre-planned debriefing questions. Answer any questions they have.
When you are a test participant: Go slowly, think out loud, and go slowly. The person giving you the test needs time to make observations and take notes.
For more information:
"Write Better Qualitative Usability Tasks: Top 10 Mistakes to Avoid" by Amy Schade at Nielsen Norman Group, on April 9, 2017
The qualitative analysis report will be an MS Word or PDF document (.doc, .docx, or .pdf file extension).
The qualitative analysis report will contain:
Remember, we are examining the usability of your web site for new users of the web site only. What a new user has trouble with might be very different from what an experienced user (like you) has trouble with.
It may help to remember what we learned from Don Norman. Would your suggested improvement help the user develop a more accurate conceptual model? Improve feedback? Improve mapping? Improve visibility? Reduce the likelihood of some kind of error, or make error recovery easier?
Spelling, grammar and quality of writing always count.
The analysis report, as an MS Word or PDF document (.doc, .docx or .pdf), is to be submitted to the Canvas Assignment.
Total Points (40)
|Completeness of Report|Missing|Incomplete portions|Satisfactory|
|---|---|---|---|
|Web site identified|0|0|0|
|Names of other students|0|0|0|
|Pre-test and de-briefing questions|0|0|0|
|Conclusions / future testing improvements|0|0|0|

|Literary Quality|Poor|Ok, but proofread|Good|
|---|---|---|---|
|Spelling / typos / grammar|0|0|0|
|Formatting particulars, email subject|0|0|0|
|Clarity of writing|0|0|0|
|Quality of writing|0|0|0|

|Technical Quality of Testing|Poor|Ok|Great|
|---|---|---|---|
|Test observations|0 (hard to understand)|0 (sketchy, incomplete)|0|
|Interpretations|0 (little insight added)|0|0|
|Improvements|0 (poor/minor)|0|0 (excellent/low-cost)|
|Conclusions / future testing improvements|0 (no conclusion)|0|0|
Last Updated on 6/22/2020
By Megan Thomas
With thanks to Saul Greenberg for some ideas.