Heuristic Evaluation


The goal of this assignment is to gain experience performing a heuristic evaluation of a real user interface, and to provide useful feedback to your classmates on their interface designs. Note that there is also an in-class discussion of your recommendations, which you will write up and submit.

Whose project am I evaluating?

Feedback sessions are scheduled explicitly during class time. You may also arrange additional time beyond the 40-minute allotment outside of class if needed. The in-class 40-minute evaluation slots cannot easily be moved, since shifting one would generally force someone else to miss another session.

See the Design Refinement Phase Checklist for your pairing.

The Task

In this assignment, you will act as a consultant to another group in the class. Assume that they would like some outside assistance in finding problems with their prototype interface.

This assignment will proceed in two steps.

  1. Each student will perform a heuristic evaluation individually and write up a report in the format described below.
  2. While you and a partner present your problems and recommendations to the creators of the interface you evaluated, write up the take-aways you notice when comparing your feedback with your partner's. Are your problems the same? Do the severity ratings match? Do you agree or differ on the recommendations? You will turn these notes in as part of the assignment as well. Discuss aspects of the interface that you found particularly compelling, and summarize the major issues you found.

Individual Evaluation:

Visit the online prototype provided by the team whose work you are evaluating (available to you by the date specified in the Design Refinement Checklist). Read their instructions and explore their prototype. Apply Nielsen's heuristics (or another set of heuristics, if you think one is more appropriate and your evaluation group has agreed) to the interface. You should do this in two passes. In the first pass, read the group's notes/instructions and then try out their prototype (i.e., run it); walk through their task scenarios. In the second pass, go through the interface (using the scenarios again if appropriate) and look for and note heuristic violations.

You should concentrate on the interface the group has designed, not only on what has been implemented. Do not focus on features that are missing but would clearly be added (e.g., "there should be help on this screen... and on this screen..."); if a feature is missing globally, you can report it once. We want to focus on evaluating what they have designed so far. You should also bear in mind whatever the team may tell you (or whatever their site may suggest) about their target audience.

Use this severity rating scheme:

0. Don't think this is a usability problem. 
1. Cosmetic problem 
2. Minor usability problem 
3. Major usability problem; important to fix 
4. Usability catastrophe; imperative to fix

The individual report should list each of the problems found in the following format:

problem # [heuristic # and name] (severity rating) 
description of problem and reasoning why you think this violates the heuristic

For example:

5. [H4 Consistency & Standards] (Severity 2) 
The interface used the string "Save" on the first screen for saving the user's file, but used the string "Write file" on the second screen. Users may be confused by this different terminology for the same function. 
6. [H3 User Control & Freedom] (Severity 3) 
The interface brings the user into a set of preference screens when they select "New User", but doesn't allow the user out of the dialog until they fill out all 4 screens. There is no way to cancel from any of the screens if a user came into the first screen by accident. 
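If you are collecting findings in a spreadsheet or script before writing them up, a small helper like the sketch below can emit entries in the required format. This is only an illustration; the class, field names, and helper are hypothetical, and the assignment requires only the textual format shown above.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One heuristic-evaluation finding (hypothetical helper, not required)."""
    number: int        # problem #
    heuristic: str     # e.g. "H4 Consistency & Standards"
    severity: int      # 0-4, per the severity scale above
    description: str   # why this violates the heuristic

    def format_entry(self) -> str:
        # First line: problem # [heuristic] (severity); second line: description.
        return (f"{self.number}. [{self.heuristic}] (Severity {self.severity})\n"
                f"{self.description}")

finding = Finding(
    number=5,
    heuristic="H4 Consistency & Standards",
    severity=2,
    description='The interface used "Save" on one screen but "Write file" on another.',
)
print(finding.format_entry())
```

Running this prints the example entry in the same two-line format the report asks for, so you can paste the output directly into your individual writeup.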

You may also want to include some notes about things that you found that you thought were good, but there is not a formal format for these. The individual evaluation writeup should take no more than two printed pages. You shouldn't spend more than two hours on this part of the assignment.

Here is an example from a previous team that can guide what your final heuristic evaluation summary might look like: http://hfid.olin.edu/sa2012/s_engr3220-eris/designrefinement/hfid_eris_heuristic_evaluation.pdf

Turning it in:

Please submit your individual and group reports to the folder of the team you are evaluating in this shared drive >> LINK <<