HCD Qualitative User Testing Project (40 points)

Remember, while working in the classroom, take notes!!
Write down the names of people who help you develop tasks / scenarios.
Later, write down the names of your testers. (Not the same people!)

Web sites will be randomly selected in class.

Why: We want to study the first-time-user learnability of web sites, as a benchmark for future designs. (Future web sites should be as usable or more usable, not less!)

Purpose: We are looking for an easy-to-use interface where errors are unlikely and common tasks are easier and faster than rare tasks.

You will be looking for design improvements for the next web site, so pick some nice, hard tasks: tasks that will strain the usability of your web site.

Where: Here in our classroom. "In the field" testing.

What to test: you pick the tasks. Pick 7 to 9 or so tasks, applying insights from the textbook.

Make certain to select which version of the web site you will test (the laptop/desktop version or the smart phone version) and test only that version for all your tasks.

Ask in class if you want advice picking tasks. You will work with a group, so you will have help. (Include the names of your group members in your final report.)

Create two or more scenarios (covering two or more tasks each). (More scenarios are better. In the reflections of students who've done usability testing in previous semesters, the most common improvement mentioned is "I should have used more scenarios." Some students have said scenarios are more fun, and some have said more scenarios would have improved the quality of their testing.)

Create 2 or more pre-test questions to ask each test participant. ("What is your name?" is a good choice for one of your pre-test questions. "What is your age?" is an inappropriately personal question, and also provides no useful insights for this analysis.)

Create 2-4 de-briefing questions to ask each test participant after testing is over. At least one de-briefing question should, in some way, try to gather "impressions and opinions" from your test takers. Avoid yes/no de-briefing questions. Look for questions that provoke reflection and will give you insight into the user's experiences and impressions.

Write down a plan for actions you need to perform in between test participants, to clean up after the last test participant and prepare for a new one. For example, if you have your test participants go through most of the process of purchasing something from the web site, write down that you will need to empty your "cart" on the web site and delete any other records of the purchase process each and every time you prepare for a new test participant.

Keep track of errors: their frequency, type, and severity. We're interested in learnability, in the severity and frequency of errors, and in errors of omission and commission.
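If you want to tally your observations after testing, a minimal sketch in Python may help (this is optional and not required by the assignment; the task names, the omission/commission labels, and the 1-4 severity scale below are illustrative assumptions, not mandated categories):

```python
from collections import Counter

# Hypothetical error log: one entry per error observed during testing.
# "type" distinguishes omission (a step skipped) from commission (a wrong
# action taken); "severity" is an illustrative 1 (cosmetic) to 4
# (task-blocking) rating.
errors = [
    {"task": "find return policy", "type": "omission",   "severity": 3},
    {"task": "find return policy", "type": "commission", "severity": 2},
    {"task": "empty the cart",     "type": "commission", "severity": 4},
]

# Frequency: how many errors occurred on each task.
per_task = Counter(e["task"] for e in errors)

# Type: how many errors of omission vs. commission overall.
per_type = Counter(e["type"] for e in errors)

# Severity: the worst severity seen on each task.
worst = {}
for e in errors:
    worst[e["task"]] = max(worst.get(e["task"], 0), e["severity"])

print(per_task)
print(per_type)
print(worst)
```

Counts like these make it easy to spot which tasks produced the most, or the most severe, errors when you write up your interpretations.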

Who: Tester: you, for your web site.

Test subjects: Two or more fellow students, best if test subjects have never used your web site before.

(If you are not able to attend class on the testing dates, contact the professor promptly. To protect the welfare of test participants, usability tests performed outside of CS 3500 will not count for your project unless you get the professor's permission ahead of time.)

We will use "think out loud" observation. "I would like you to think out loud and describe why you do things and how you feel about the way they are done." (RI, Ch 8)

Getting stuck is discouraging for subjects. Try to give test subjects an early success experience: start with a task you think will be relatively easy. Remind the test subjects that they can quit at any time, for any reason.

If subjects get stuck: while the experimenter should not help the subject with the task, there are a few exceptions to this rule.

If you have to give your user hints, discuss why in your report -- you have found a usability issue for new users of your web site.

Remember to de-brief your test participants using your pre-planned debriefing questions. Answer any questions they have.

When you are a test participant: Go slowly, think out loud, and go slowly. The person giving you the test needs time to make observations and take notes.

The qualitative analysis report will be an MS Word or PDF document (.doc, .docx, or .pdf file extension).

The qualitative analysis report will contain the items listed under "Completeness of Report" in the grading sheet below.

Spelling, grammar and quality of writing always count.

Email the analysis report, as an attachment, to the instructor (mthomas@cs.csustan.edu) by the due date. CC yourself. The email "Subject" must be "CS3500: User Test" with no quotes (capitalized and spaced exactly as shown).

Grading Sheet:

Total Points (40)

Completeness of Report                         Missing   Incomplete portions   Satisfactory
  Web site identified                             0               0                 0
  Task list                                       0               0                 0
  Scenarios                                       0               0                 0
  Test plan                                       0               0                 0
  Pre-test and de-briefing questions              0               0                 0
  Names of other students                         0               0                 0
  Test observations                               0               0                 0
  Interpretations                                 0               0                 0
  Improvements                                    0               0                 0
  Conclusions / future testing improvements       0               0                 0

Literary Quality                               Poor      Ok, but proofread     Good
  Spelling / typos / grammar                      0               0                 0
  Formatting particulars, email subject           0               0                 0
  Clarity of writing                              0               0                 0
  Quality of writing                              0               0                 0

Technical Quality of Testing                   Poor      Ok                    Great
  Task list                                       0               0                 0
  Scenarios                                       0               0                 0
  Test plan                                       0               0                 0
  De-briefing questions                           0               0                 0
  Test observations                               0 (hard to understand)    0 (sketchy, incomplete)    0
  Interpretations                                 0 (little insight added)  0                          0
  Improvements                                    0 (poor/minor)            0                          0 (excellent/low-cost)
  Conclusions / future testing improvements       0 (no conclusion)         0                          0

Last Updated on 10/26/2016
By Megan Thomas
With thanks to Saul Greenberg for some ideas.