HCI Exercise 2

Human-Computer Interaction SS 2025

Exercise 2: Heuristic Evaluation (HE Report)

For this exercise, your group will play the role of user interface consultants who have been contracted to evaluate a web site. The idea is to identify as many potential usability problems as possible and feed this input into the next design phase.

Your tutor will play the role of your client (the manager of the web site you are evaluating). You will present the results of your heuristic evaluation to your client at Meeting M2 at M2 Slots.

Use the report template and materials provided in the Ex2 Materials.

1 Plan the Heuristic Evaluation

First of all, read the sections in the lecture notes on heuristic evaluation.

  1. The four (or three) members of your group will be the evaluators for the heuristic evaluation.

  2. The web site which has been assigned to your group is visible in TeachCenter in the Practical Exercises section on the page entitled Web Site to Evaluate.

  3. Unless otherwise notified, assume that you will be evaluating the entire web site.

    Generally speaking, do not follow links to external web sites and evaluate them too. However, it often does make sense to follow links to subsites and consider them too (say, tickets.site.com). Check with your tutor if you are unsure.

  4. Consider the target user population(s) for your assigned web site. Are there distinct user groups with distinct needs and goals?

  5. If the web site has several distinct user groups, you should consider all of the potential user groups in the heuristic evaluation.

  6. Consider the goals the users from these groups might have and typical tasks they might want to perform on the web site.

  7. Plan for all of your evaluators to work at around the same time, to minimise the risk that the web site changes in between evaluations.

  8. Coordinate between yourselves, so that the evaluators use a mix of devices and browsers to inspect the web site:

    • Two of the evaluators should evaluate on a mobile device (= smartphone or tablet, running Android or iOS), in portrait orientation, each using a different approved mobile web browser. On mobile, the approved browsers are: Chrome, Firefox, Safari, and Samsung Internet Browser.

    • The other evaluator(s) should evaluate on a PC (= desktop or laptop, running Windows, MacOS, or Unix), in landscape orientation, each using a different approved PC web browser. On PC, the approved browsers are: Chrome, Firefox, and Safari.

    It is OK for one evaluator to use (say) Chrome on PC, and another evaluator to use Chrome on mobile, since they are different devices.

    You must use real devices and not an emulator or simulator.

    If you are using a mobile browser and your web site suggests you should go to the mobile version of the site, then do so. However, do not install a native app, even if your web site suggests you should do so.

  9. To reflect real usage patterns, one evaluator on mobile and one evaluator on PC should plan to use an ad blocker.

    If you find a problem which may be due to blocked content, unblock ads temporarily to see whether the problem persists. If the problem then goes away, document it as a problem apparently caused by ad blocking.

  10. Decide how you will make screen video recordings on the various platforms you have chosen to evaluate. See my Guide To Session Capture on various platforms. You should plan to record your voice (with a live commentary while you work) as well as what happens on screen. For this heuristic evaluation, do not record the face of the evaluator in any video recording, only the screen and audio.

    Do not use recording software which leaves behind a watermark. On mouse devices, turn on recording of the mouse pointer (even make it slightly larger), but turn off any mouse trails. On touch devices, see if your recording software can turn on display of touch events, but turn off touch trails.

    The screen recording should be at most at FullHD resolution (1920×1080 pixels in landscape orientation or 1080×1920 pixels in portrait orientation). If your device has higher resolution than FullHD, please plan to adjust the settings or make the browser window smaller. If your device has lower resolution than FullHD, record at its maximum resolution. The standard browser window and GUI should be included in the recording. Other things outside the browser window should not be included in the recording. Do not leave unnecessary margins around the video.

    If the native resolution of your device is too high in one or both directions, and you cannot adjust the settings or make the browser window smaller, you will have to record at higher pixel resolution and later transcode down uniformly to at most FullHD. Do not distort the video and do not create black or empty strips to either side (or top and bottom). See my Guide to Video Transcoding. Unfortunately, transcoding always results in a loss of quality.

    Evaluate and record in light mode (dark text on a light background).

    Use a good microphone, and make a test recording to make sure everything is working and the audio can be heard.

    Our preferred video format for screen recordings is MP4 with H.264 video and AAC audio. If your device/software can record in that format natively, that is perfect.

    Otherwise, you will have to convert/transcode your video clips to MP4 (with H.264 video and AAC audio) format later on. Unfortunately, transcoding always results in a loss of quality.

    Plan to use the following recommended video settings:

    Container:                MP4
    Output Video Resolution:  1920×1080 (FullHD) [or 1080×1920] or lower
    Frame Rate:               20
    Codec:                    H.264
    Rate Control:             VBR (Variable Bit Rate)
    Bit Rate:                 5000 Kbps (= 5 Mbps)

    Plan to use the following recommended audio settings:

    Channels:         Stereo
    Codec:            AAC (= mp4a)
    Sample Rate:      44100 Hz
    Bit Rate:         160 Kbps
    Bits per Sample:  32
  11. Decide how you will, if necessary, extract video clips from the screen capture videos or trim video clips. See my Guide to Video Editing.

    Every finding must be illustrated with a video clip.

  12. Adapt the HE materials as necessary for your evaluation:

    • Assign each evaluator a two-letter shorthand code based on their initials. For example, Keith Andrews would be “ka” in lower case and “KA” in upper case.

      If two evaluators in your team both have the same initials, then assign one of them a variant. For example, Ken Anderson might be assigned "kn" in lower case and “KN” in upper case.

    • Copy the skeleton template log file log-ee.txt from the materials to create a plain text log file for each evaluator:

        log-ee.txt
      

      where ee is replaced by the lower case initials of the evaluator. Do not use any upper case letters, special characters, umlauts, or spaces in file names.

      For example, if your name is Keith Andrews, the log files should be named log-ka.txt.

    • Each evaluator should fill in the metadata at the start of their log file to match their (planned) evaluation environments. Enter the name of the corresponding evaluator. Take care to preserve the character encoding of the log files as UTF-8.

    • Each negative finding (problem) will be assigned an ID of the form EE-Neg01, EE-Neg02, etc. Each positive finding will be assigned an ID of the form EE-Pos01, EE-Pos02, etc. In each log file, replace the initials EE in the example problem and positive finding IDs with the upper case initials of the corresponding evaluator.
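The recommended video and audio settings from step 10 can be expressed as a single transcode command. The following sketch builds one plausible ffmpeg invocation for a landscape recording; the file names are placeholders and the exact flags are assumptions to check against your ffmpeg version:

```python
# Sketch: build an ffmpeg command implementing the recommended
# settings from step 10. File names are placeholders; run the
# printed command in a shell with ffmpeg installed.
def ffmpeg_transcode_cmd(src, dst):
    return [
        "ffmpeg", "-i", src,
        # Video: H.264, 20 fps, VBR at ~5 Mbps, scaled down to at
        # most 1920 pixels wide while preserving the aspect ratio.
        "-c:v", "libx264", "-r", "20", "-b:v", "5000k",
        "-vf", "scale='min(1920,iw)':-2",
        # Audio: AAC, stereo, 44100 Hz, 160 kbps.
        "-c:a", "aac", "-ac", "2", "-ar", "44100", "-b:a", "160k",
        dst,  # the .mp4 extension selects the MP4 container
    ]

cmd = ffmpeg_transcode_cmd("capture.mkv", "capture.mp4")
print(" ".join(cmd))
```

For a portrait (mobile) recording, the width limit in the scale filter would be 1080 rather than 1920.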

2 Individual Evaluations

Each evaluator must inspect the interface alone.

2a) Prepare for Individual Evaluations

Make the following preparations:

  1. Set up screen and audio capture on your allocated device, as described above. Make a test recording to make sure everything is working and the audio can be heard.

  2. Print out an A4 copy of the Andrews General Usability Heuristics provided in the materials.

  3. Fill in the metadata at the top of the log file.

  4. Reset the browser to a fresh state before you start evaluating:

    • Make sure all add-ons and extensions are disabled (except for an ad blocker, if you are using one via an extension).
    • Delete all cookies.
    • Delete all temporary files (clear the cache).

2b) Conduct the Individual Evaluations

Conduct the individual evaluation using the allocated device and browser (and ad blocker if allocated):

  1. Most of the chosen web sites are available in German. Some are available in both English and German, and some are available only in English.

    If the web site has both English and German versions, each evaluator should pick one language (English or German) for the bulk of their evaluation, but then also take a look with the other language too.

  2. Speak English for the audio commentary while evaluating, regardless of the language of the web site.

  3. Keep in mind the target user population(s) and their typical tasks.

  4. Start the screen video (+ audio) capture, making sure to hide any UI controls of the recording software.

    To be clear, the evaluator should record their entire inspection session, so that video clips can later be cut from it, in case some findings cannot be reconstructed.

  5. Try out your assigned web site first with all cookies enabled and then with most cookies disabled (only necessary cookies enabled) to see if there is any difference.

    If you find a problem which appears to be due to disabling cookies, document it as such.

  6. Inspect the interface, talking out loud and also noting problems and positive impressions in chronological order of discovery in the plain text log file you adapted from the materials.

    The log file should be called log-ee.txt, where ee are your initials in lower case.

  7. Each negative finding (problem) is assigned an ID in chronological order of discovery of the form EE-Neg01, EE-Neg02, etc., where EE are the evaluator's upper case initials, for example KA-Neg01.

  8. Proceed analogously for positive findings: EE-Pos01, EE-Pos02, etc.

  9. For each finding, enter the following information into the log file:

    • ID: The finding ID, e.g. KA-Neg01.
    • Title: A short heading concisely summarising the finding.
    • Description: Two or more sentences describing the problem.
    • Video Clip: The name of the corresponding video clip, e.g. ka-neg01-keywords.mp4.
    • How Reproducible?: Instructions as to how to reproduce the finding, either in sentences or as a path through the interface, e.g. Options → That → This.
  10. In addition, for negative findings (problems), also make a note of:

    • Heuristic: The corresponding heuristic (if any) under which the problem falls, e.g. A04 Consistency.

      Most problems will fall under one of the heuristics, but it is OK if some of the problems you report are not covered by a heuristic (leave the heuristic field in the log file blank). We will not assign heuristics to positive findings in this evaluation, only to negative findings.

    • Only When: If the problem occurs only in a specific situation (say with cookies disabled, or in dark mode), make a note of this here.
  11. Once you have finished evaluating, save your screen video (+ audio) recording and your log file.

  12. Make sure your log file is a plain text file encoded in UTF-8, of at most 100 KB.

    Make sure that when you edit and save the log file, it remains encoded in UTF-8.

  13. If there are many examples of the same general problem (for example, typos across several pages), count this as one problem and give two or three examples in the description.

  14. In total, we would generally expect each individual evaluator to find between 10 and 20 problems and at least 3 positives on each device, depending on the quality of the web site you are evaluating.
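The encoding and size constraints from step 12 can be checked mechanically before handing in. A minimal sketch; the file name and content are placeholder examples:

```python
import os

# Sketch: check that a log file is at most 100 KB and is valid
# UTF-8 (step 12). The file name and content are placeholders.
def check_log_file(path, max_bytes=100_000):
    if os.path.getsize(path) > max_bytes:
        return False
    try:
        with open(path, "rb") as f:
            f.read().decode("utf-8")
    except UnicodeDecodeError:
        return False
    return True

# Create a tiny example log file and check it.
with open("log-ka.txt", "w", encoding="utf-8") as f:
    f.write("Evaluator: Keith Andrews\n")
print(check_log_file("log-ka.txt"))  # True
```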

2c) Create or Extract Video Clips

  1. Re-create a short video clip to illustrate each finding from each device.

    Every finding must be accompanied by at least one video clip (sometimes two, say with and without cookies, if that is what best illustrates the problem). The other evaluators will need the video clip to understand the finding, so they can properly assign a severity rating.

  2. Each video clip must contain an audio track in English, concisely describing the finding.

    While the audio quality of your original voice will be better, if you are concerned about privacy (the potential for voice cloning, etc.), it is OK to modify your voice with a voice modifier or to use synthetic speech (text-to-speech) for the audio commentaries. A voice modifier may be preferable to synthetic speech, since the latter can sound quite robotic, often with misplaced pauses and shaky pronunciation. In any case, the audio commentary must remain clear and understandable.

  3. Where a finding cannot be recreated, extract a video clip from your evaluation recording.

    If the existing audio commentary does not describe the finding adequately, replace the audio commentary with a new one.

  4. Make sure the audio levels are high enough, so that your audio commentary can be clearly heard.

    If necessary, you might want to normalise the audio level and/or remove any noise.

  5. Each video clip should be long enough to illustrate the finding, but no longer. Typically, video clips are 10 to 15 seconds. At a maximum, each video clip should be no longer than 20 seconds duration.

    In some rare cases, say to illustrate an extremely slow loading time, it might be tempting to include a video clip longer than 20 seconds. In such a case (for the purposes of this course), edit the video clip to show, say, an initial 8 seconds, then a transition frame reading something like “28 seconds later” for 2 seconds, then the final 8 seconds (18 seconds duration in total).

  6. Each video clip should be no larger than FullHD resolution (1920×1080 pixels in landscape orientation or 1080×1920 pixels in portrait orientation).

    Otherwise, you will have to transcode down uniformly to at most FullHD. Do not distort the video and do not create black or empty strips to either side (or top and bottom). See my Guide to Video Transcoding. Unfortunately, transcoding always results in a loss of quality.

  7. Each video clip should be no larger than 10 MB in size!

    Otherwise, make it shorter in duration if possible, or else transcode to a lower video bitrate (with the resulting loss of quality).

  8. Name your video clips according to the following naming scheme:

    ee-negxx-keywords.mp4
    ee-posxx-keywords.mp4

    where ee is replaced by the lower case initials of the evaluator (e.g. ka for Keith Andrews), neg indicates a negative finding (problem) and pos indicates a positive finding, xx is the two-digit number in chronological order of discovery, and keywords comprises one to four words describing the finding separated by hyphens.

    For example:

    ka-neg01-links-yellow.mp4
    ka-pos01-breadcrumbs-work-well.mp4
    ...

    For file and folder names, use only lower case letters, digits, and hyphens, from the 7-bit ASCII character set. Do not use any upper case letters, spaces, underscores, or special characters in either the name or extension. The name ka_neg01_links_yellow.MP4 does not conform to the naming scheme.

    Under Windows, turn on the display of file extensions, so that you can see them!

  9. Enter the names of your video clips into your log file.
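The naming scheme from step 8 can be checked mechanically. A sketch; the regular expression is my reading of the scheme (two lower-case initials, neg or pos, a two-digit number, one to four hyphen-separated keywords):

```python
import re

# Sketch: validate clip names against the scheme
# ee-negxx-keywords.mp4 / ee-posxx-keywords.mp4 from step 8.
CLIP_NAME = re.compile(
    r"^[a-z]{2}-(neg|pos)\d{2}(-[a-z0-9]+){1,4}\.mp4$"
)

def is_valid_clip_name(name):
    return CLIP_NAME.fullmatch(name) is not None

print(is_valid_clip_name("ka-neg01-links-yellow.mp4"))  # True
print(is_valid_clip_name("ka_neg01_links_yellow.MP4"))  # False
```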

3 Prepare the Findings

As a group, assemble the individual log files and associated video clips from each evaluator in your group.

3a) Aggregate the Findings

Form a combined (aggregated) list of problems (and a combined list of positives). The spreadsheet helist.xlsx is provided for you to use (but it is not required that you hand it in).

Proceed as follows:

  1. Choose the evaluator with the longest list of problems and enter the problems into the spreadsheet.

  2. Merge the problems from the other evaluators into this list.

  3. If the same (or very similar) findings are found by two or more evaluators, combine them into one.

    You then have multiple video clips to illustrate the finding.

  4. If the same (or very similar) findings are found on two or more devices, combine them into one.

    You then have multiple video clips to illustrate the finding.

  5. List many small related issues (such as 15 individual typos) as one problem with many instances (rather than 15 problems).

    The same for many examples of German content not being available in English (or vice versa).

  6. Indicate which problems were found by which evaluator(s).

  7. For each problem, determine whether it is general to all browsers and platforms, or whether it is specific to a particular browser or platform, or occurs only when cookies are disabled or an ad blocker is used. In the latter cases, indicate this in the “Only When...” column. If a problem is general, leave the corresponding cell empty.

Proceed analogously for the positive findings.

3b) Assign Ratings

  1. Individually (not working together), assign severity ratings to each problem. Use the 0 to 4 integer severity scale below:

    Severity  Meaning
    4         Catastrophic Problem
    3         Serious Problem
    2         Minor Problem
    1         Cosmetic Problem
    0         Not a Problem

    Individual ratings must be integers, fractional ratings are not allowed.

  2. Individually (not working together), assign positivity ratings to each positive finding. Use the 0 to 4 integer positivity scale below:

    Positivity  Meaning
    4           Extremely Positive
    3           Major Positive
    2           Minor Positive
    1           Cosmetic Positive
    0           Not a Positive

    Individual ratings must be integers, fractional ratings are not allowed.

  3. Calculate the mean severity rating for each problem to 2 decimal places. Calculate the mean positivity rating for each positive finding to 2 decimal places.

  4. Sort the problems into descending order of average severity (most severe first). Sort the positives into descending order of average positivity (most positive first).

  5. Renumber the problems and positives, so that the finding at the top of each list is number 1. We will call these the ranked finding numbers. The top-ranked positive finding will henceforth be named P01, the second-ranked positive finding P02, etc. The top-ranked negative finding will be named N01, the second-ranked negative finding N02, and so forth.
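Steps 3 to 5 above amount to a small aggregation: compute means to two decimal places, sort descending, and renumber. A sketch with invented example ratings (not real findings):

```python
# Sketch: aggregate individual severity ratings, compute means,
# sort, and assign ranked IDs (steps 3-5). The ratings below are
# invented example data.
problems = {
    "KA-Neg01": [3, 4, 3, 4],
    "HR-Neg01": [2, 3, 2, 3],
    "KA-Neg02": [1, 2, 1, 1],
}

# Mean severity to 2 decimal places (step 3).
means = {pid: round(sum(r) / len(r), 2) for pid, r in problems.items()}

# Descending order of mean severity (step 4), then ranked IDs (step 5).
ranked = sorted(means.items(), key=lambda kv: kv[1], reverse=True)
for rank, (pid, mean) in enumerate(ranked, start=1):
    print(f"N{rank:02d}  {pid}  {mean:.2f}")
```

Positive findings are handled analogously, with P01, P02, etc.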

3c) Select Video Clips

  1. For each finding, select the best (most representative) available video clip(s) to illustrate the finding. It often makes sense to include multiple video clips to illustrate one finding, for example, if they show different aspects of the same issue (say, one on mobile and the other on PC).

    However, do not always select every available video clip from every evaluator for every finding. Typically, you might select one or two, possibly three, video clips per finding. Only these are to be included in the report and handed in.

    If you are using the provided spreadsheet, you can list all of the available video clips for a finding in the column “All Available Video Clips”, then enter the ones you want to use in the column “Selected Video Clip(s)”.

  2. Copy and rename the selected video clips for each finding, so that the ranked finding number is prepended to the file name, according to the following scheme:

    nnn-ee-negxx-keywords.mp4
    pnn-ee-posxx-keywords.mp4

    where n indicates a negative finding (problem) and p indicates a positive finding, nn is the ranked finding number, ee are the lower case initials of the evaluator, and posxx or negxx is the original numbering in discovery order. For example:

    n01-ka-neg01-links-yellow.mp4
    p01-ka-pos01-breadcrumbs-work-well.mp4

    where n01-ka-neg01-links-yellow.mp4 is one of the selected video clips illustrating the highest ranked (most severe) problem N01 and was found by evaluator KA.

    If you are using the provided spreadsheet, you can rename the selected video clips in the column “Selected Video Clip(s)”, leaving the original names in the column “All Available Video Clips”.
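The renaming in step 2 is mechanical: the ranked finding number is prepended to the original clip name. A sketch; the ranks and file names are the example values from above:

```python
# Sketch of step 2: prepend the ranked finding number to a selected
# clip's original file name. Ranks and names are example values.
def ranked_clip_name(kind, rank, original):
    # kind: "n" for a negative finding (problem), "p" for a positive
    return f"{kind}{rank:02d}-{original}"

print(ranked_clip_name("n", 1, "ka-neg01-links-yellow.mp4"))
# n01-ka-neg01-links-yellow.mp4
print(ranked_clip_name("p", 1, "ka-pos01-breadcrumbs-work-well.mp4"))
# p01-ka-pos01-breadcrumbs-work-well.mp4
```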

3d) Create the Tables as HTML

Transfer the sorted lists of problems and of positives into simple XHTML5 table entries.

Do not just save the spreadsheet from Excel as HTML, since this creates bloated HTML code, which will not be valid XHTML5.

  1. For example, save the spreadsheet as a CSV (comma-separated values) file and then use a text editor to manipulate it:

    • Replace the commas with HTML cell boundaries (</td><td>).
    • Prepend an initial <tr><td> to the first cell and append a final </td></tr> to the final cell on each row.

    If you use the provided spreadsheet, the column “All Available Video Clips” should not be included in the HTML5 table in the final report. You could, say, make a temporary copy of the spreadsheet, delete the column, and save the CSV from there. The column “Selected Video Clip(s)” is simply called “Video Clip(s)” in the HTML5 table in the final report.

  2. If you know Perl, you might also use a Perl script to generate simple XHTML5 table entries from a CSV file. I have an example Perl script (rename it to csv2html.pl).
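If you prefer Python to Perl, the same conversion can be sketched as follows. The CSV content here is an invented two-row example; using a proper CSV parser (rather than splitting on commas) handles quoted cells, and escaping guards against characters such as & and < in cell text:

```python
import csv, html, io

# Sketch: convert CSV rows into simple XHTML5 table rows, as
# described in step 1. The CSV content is an invented example.
csv_text = "N01,Links are yellow,3.50\nN02,No search function,2.75\n"

rows = []
for record in csv.reader(io.StringIO(csv_text)):
    cells = "</td><td>".join(html.escape(cell) for cell in record)
    rows.append(f"<tr><td>{cells}</td></tr>")

print("\n".join(rows))
```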

4 Write the Heuristic Evaluation Report

  1. Write a report of the heuristic evaluation using the skeleton report provided.

  2. Your report should be in English.

  3. Include some title information indicating the group number, the title/topic of the evaluation, and the name of each group member.

  4. Include a summary of the evaluation and the most important findings written for your client's management. The description of the evaluation scope and procedure should take up no more than 25% of the executive summary. Most of the executive summary should summarise the main findings.

    In real life, a manager would typically not have time to read the whole report, but would read only the executive summary to find out how the evaluation was done and what the main findings were.

  5. Include a description of the methodology behind a Heuristic Evaluation, in your own words.

  6. Adapt the citations and references as instructed in the comments.

    See my Guide to Citations and References.

  7. Describe the assumptions made about user characteristics and groupings and the extent of the evaluation.

  8. Describe the evaluation environments (hardware, browser version, OS, date of evaluation, etc.) used by each evaluator.

    Explain how each evaluator extracted or created video clips on each particular evaluation device or platform, and how they were edited and transcoded if necessary.

  9. Discuss the three most positive findings, each illustrated with a video clip.

  10. Include an aggregate table of positive findings in descending order of mean positivity.

  11. Analyse and discuss the five most severe problems (the top five from your sorted list), each illustrated with a video clip.

  12. Include an aggregate table of problems in descending order of mean severity.

5 Create a Presentation

  1. Create a slide deck for a 15-minute group presentation of your HE Report to your client (tutor).

  2. Use slides (screenful by screenful) with bulleted lists (not full sentences).

    Make a separate presentation. Do not simply open your report in a web browser and project that.

  3. Your slides should be in English and you should present in English.

  4. Use PowerPoint. If you do not have PowerPoint, try LibreOffice Impress as an alternative. Save the file in PowerPoint .pptx format. Do not use Google Slides!

    Students at Graz University of Technology have free access to Microsoft Office 365 by going to portal.office.com and logging in with their university account.

    In Google Slides, it is not possible to embed video within the slide deck file, only to link it, which breaks if a video file moves or is deleted, or if you are offline.

  5. You may have to install a video codec pack (such as the K-Lite Codec Pack for Windows or the GStreamer plugins for Linux), to play back video clips from within PowerPoint or Impress.

  6. For the slide deck:

    • Use an aspect ratio of 16:9.
    • Use light mode (dark text on a light background).
    • Use a large enough font, so people at the back of the room can still see.
    • Embed images and videos into the slide deck rather than linking them, so they are included within the .pptx file and the presentation is completely self-contained.
  7. Create a title slide with the following information:

    • Heuristic Evaluation of <Web Site URL>
    • Names of Group Members
    • Group Number (e.g. G1-01)
    • HCI SS 2025
    • Date of Presentation
    • And, I ask (but cannot require) that you place your slide deck under a CC BY 4.0 licence. To do so, include the following statement in a smaller font at the bottom of the title slide:

      This work is placed under a Creative Commons Attribution 4.0 International (CC BY 4.0) licence.
  8. Address (at least) the following talking points within the slide deck:

    • Evaluation methodology.
    • Evaluation environments.
    • Top two (or more) positive findings.
    • Top four (or more) negative findings.
  9. Name the presentation gT-GG-he-slides.pptx, where T-GG is the number of your tutor and group.

    For example, g1-01-he-slides.pptx for Group G1-01.

6 Prepare the HE Report Directory

  1. Make a directory called gT-GG-he for your heuristic evaluation report, where T-GG is the number of your tutor and group. For example, g1-01-he for Group G1-01.

  2. Include your main file he.html.

  3. Copy over the files heuristics.pdf and report.css unchanged.

  4. Create a subdirectory logs for your log files. Include the individual evaluation log files from each evaluator. Do not include the original template log file (log-ee.txt).

  5. Create a subdirectory presentation for your presentation slides.

  6. Place the video clips (and any corresponding poster images) selected for your report into a subdirectory called videos.

    All video clips referenced in your report must be handed in as local copies, so that the report is self-contained. Do not include video clips which are not referenced in the report.

  7. When naming your files and directories, use only lower case letters, digits, and hyphens, from the 7-bit ASCII character set.

  8. Your directory structure should look something like this:

    g1-01-he/
      he.html
      heuristics.pdf
      report.css
      logs/
        log-hr.txt
        log-ct.txt
        ...
      presentation/
        g1-01-he-slides.pptx
      videos/
        p01-ka-pos01-breadcrumbs-work-well.mp4
        ...
        n01-ka-neg01-links-yellow.mp4
        ...
    
  9. Tidy up your directory. Do not leave junk files, backup files, etc. lying around.

  10. Make a zip file of your hand-in directory, including the directory itself (not just the files inside).

    Name the zip file gT-GG-he.zip, where T-GG is the number of your tutor and group. For example, g1-01-he.zip for Group G1-01.

  11. The maximum size of your zip file is 500 MB (500,000,000 bytes). Check the size before uploading.

    If your files are too big, you may have to delete something or reduce the size of something. Contact your tutor if you are unsure what to remove or make smaller.
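Step 10 (zipping the directory, including the directory itself) can be done with `zip -r g1-01-he.zip g1-01-he` on the command line, or in Python as sketched below. The directory contents here are minimal placeholders:

```python
import os, shutil

# Sketch of step 10: create a minimal hand-in directory and zip it
# so that the directory itself (not just its contents) is included
# in the archive. The directory contents are placeholders.
os.makedirs("g1-01-he/logs", exist_ok=True)
with open("g1-01-he/he.html", "w", encoding="utf-8") as f:
    f.write("<!-- report -->\n")

# root_dir is the parent of the hand-in directory; base_dir is the
# directory to archive, so entries are prefixed with "g1-01-he/".
shutil.make_archive("g1-01-he", "zip", root_dir=".", base_dir="g1-01-he")
print(os.path.getsize("g1-01-he.zip") > 0)  # True
```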

7 Submit Your Work

  1. Upload your zip file to TeachCenter before the submission deadline.

  2. This exercise is a group exercise. The group makes a single submission (one zip file) as a group.

  3. If you make any changes to your submission after the deadline, your submission will be flagged with the new timestamp and will be considered to be a late submission with the corresponding points deduction.

  4. Submissions will cease to be accepted 48 hours after the deadline.

8 Present Your Work

  1. You will present your work at Meeting M2 at M2 Slots.

  2. At the meeting, you must present the same version of your work which was handed in (uploaded to TeachCenter).