Heuristic Evaluation Report
Human-Computer Interaction SS 2024
Group GT-XX
Harald Roth
Christian Traum
Thomas Gelb
Sabine Schwarz
Heuristic Evaluation of the Web Site
example.com
Report of XXth Apr 2024
{My instructions and comments are contained inside curly brackets. Remove them before you hand in your work!}
{You must use your own words. Do not copy material from the web, from colleagues from previous years, or from anywhere else. Do not generate text using AI-based tools. You are allowed to copy your own words from your own HE Plan.}
1 Executive Summary
{Executive summary of the main results from the heuristic evaluation aimed at higher management (between 300 and 500 words). In paragraphs, no subsections, no bulleted lists. Do not list the ten heuristics.}
{Your client's manager will not read the whole report, but only the executive summary, and wants to know how the evaluation was done and what the main findings were.}
{The description of procedure, methodology, and setup should take up no more than 25% of the executive summary. Most of the executive summary should summarise the main findings.}
2 Introduction
{Short description of the web site to be evaluated and the motivation behind the evaluation.}
3 Evaluation Procedure
This section describes the procedure used in the heuristic evaluation.
3.1 Evaluation Methodology
{Describe what a HE is and how it was done. Write between 300 and 500 of your own words. Explain the method; do not just list the heuristics. Replace my sample text below with your own.}
Heuristic evaluation is ... ... by Jakob Nielsen and Rolf Molich in 1990 [Nie1990]. ... as can be seen in Moran and Gordon's guide [Mor2023].
For this evaluation, the "Andrews General Usability Heuristics", shown in Appendix A.1, were used. These are based on and slightly adapted from Nielsen's revised set of ten usability heuristics [Nie1994].
{Cite at least two more references of your own in this section and include them in the References section below. Use only indirect quotations (paraphrasing), no direct quotations. You may remove or retain my sample citations and references as you wish.}
3.2 User Profiles
{Describe the kinds of user the site is trying to attract.}
{Group these users into categories according to their characteristics.}
{Describe the goals and typical tasks for each of these user groups.}
{Do not create personas for these user groups.}
3.3 Extent of the Evaluation
{Describe which parts of the interface were evaluated and which parts were not.}
3.4 Evaluators and Evaluation Environments
{Tabular overview of the different evaluators and the evaluation environments they used: i.e. the demographic data, hardware, browser, type and speed of internet connection, etc.}
The evaluation environments used by each evaluator are shown in Table 1. For this evaluation, mobile devices were operated in portrait mode.
{Note: the recording resolution is the resolution of the actual recorded video of your browser window.}
{Enter the name, version, and platform of the video editing software and any transcoding software you used.}
Evaluator | Harald Roth (HR) | Christian Traum (CT) | Thomas Gelb (TG) | Sabine Schwarz (SS) |
---|---|---|---|---|
Age | 24 | 26 | 31 | 25 |
Gender | male | male | male | female |
Device | Sony Vaio VGN-Z51XG Laptop | Dell Precision 5510 Laptop | Samsung Galaxy S3 | iPad Pro 10.5" (2017) |
OS and Version | Windows 7 Pro DE SP1 | Windows 10 Pro EN v 2004 | Android 4.1.2 | iOS 13.5.1 |
Screen Size | 19″ | 15.6″ | 4.8″ | 10.5″ |
Screen Resolution | 1280×1024 | 3840×2160 | 720×1280 | 1668×2224 |
Web Browser | Chrome 85.0.4183.83 (64-bit) | Firefox 80.0 EN | Firefox 68.0 | Safari 605.1.15 |
Ad Blocker | Privacy Badger 2020.7.21 | none | none | AdGuard 1.6 |
Internet Connection | Magenta Take-IT, xdsl | A1, dsl | HoT, LTE (hotspot) | WiFi |
Download Speed | 15 Mbps | 18 Mbps | 20 Mbps | 30 Mbps |
Screen Recording Software | OBS Studio 25.0.8 | Camtasia 2022.4.1 | AZ Screen Recorder | iOS Screen Recording |
Recording Resolution | 1280×1024 | 1920×1080 | 720×1280 | 1668×2224 |
Video Editing Software | Lossless Cut 3.59.1 Win | Lossless Cut 3.59.1 Win | Adobe Premiere Pro 24.2.1 Win | DaVinci Resolve 18 Mac |
Video Transcoding Software | not required | not required | not required | Handbrake 1.7.3 Mac |
Date of Evaluation | 2024-04-16 | 2024-04-14 | 2024-04-15 | 2024-04-15 |
Time of Evaluation | 10:00-11:30 | 14:00-15:00 | 18:00-19:20 | 09:30-10:30 |
4 Results of the Evaluation
This section describes the results of the heuristic evaluation.
4.1 Top Three Positive Findings
The top three positive findings according to their average (mean) positivity ratings are described in more detail below. The positivity rating scheme used to rank the positive findings is shown in Table 2.
Positivity | Meaning |
---|---|
4 | Extremely Positive |
3 | Major Positive |
2 | Minor Positive |
1 | Cosmetic Positive |
0 | Not a Positive |
{Describe the top three positive findings which emerged during the evaluation, according to the mean positivity ratings.}
{For each finding, include at least one paragraph of text and select the best (most representative) available video clip(s) to illustrate the finding.}
P01. Cookie Preferences are Clear
Title: | Cookie Preferences are Clear |
---|---|
Description: | The cookie preferences are clear and are pre-configured to turn off all non-essential cookies. |
Video Clip(s): | p01-ct-pos02-cookie-prefs.mp4, p01-tg-pos03-cookies.mp4 |
How Reproducible?: | Home → bottom of screen |
Mean Positivity: | 4.00 |
Two evaluators noted that the cookie preferences dialogue is clear and does not attempt to trick the user into accepting unwanted cookies. Both choices are given equal visual weight, and the selection is pre-configured to include only essential cookies, as shown in Figure 1.
P02. etc.
4.2 List of All Positive Findings
{Aggregated list of all positive findings, in descending order of mean positivity.}
{For each positive finding, include a link to the best (most representative) available video clip(s) to illustrate the finding. The name of each video clip should be linked to the corresponding video clip.}
Table 3 shows a list of all the positive findings which emerged from the evaluation, sorted in decreasing order of average (mean) positivity, i.e. the most positive findings are at the top of the table. The name codes assigned to each evaluator are shown in Table 4.
{Individual positivity ratings are integer values between 0 and 4 (no fractions allowed). The mean positivity is a fractional value given to 2 decimal places.}
No. | Title | Description | Video Clip(s) | How Reproducible? | Found By | Positivity | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HR | CT | TG | SS | HR | CT | TG | SS | Mean | |||||
1 | Cookie Preferences are Clear | The cookie preferences are clear and are pre-configured to turn off all non-essential cookies. | p01-ct-pos02-cookie-prefs.mp4, p01-tg-pos03-cookies.mp4 | tugraz.at/en/home | y | y | 4 | 4 | 4 | 4 | 4.00 |
... | Chat → MyChat → Options | ||||||||||||
... | {in descending order of mean positivity until...} | ... | |||||||||||
... | |||||||||||||
12 | Exchange Rate Calculator | A link is available to an exchange rate calculator. | p12-tg-pos02-exchange-rate.mp4 | Every page | y | 0 | 0 | 1 | 0 | 0.25 |
Code | Evaluator |
---|---|
HR | Harald Roth |
CT | Christian Traum |
TG | Thomas Gelb |
SS | Sabine Schwarz |
y | Found by this evaluator |
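As a sketch of how the mean ratings in Table 3 (and, analogously, Table 6) can be computed: each evaluator assigns an integer rating from 0 to 4, and the mean is reported to two decimal places. The function and variable names below are illustrative and not part of the report.

```python
# Sketch: computing the mean positivity (or severity) rating for one finding.
# Individual ratings are integers from 0 to 4, one per evaluator;
# the mean is reported to two decimal places.

def mean_rating(ratings):
    """Return the mean of integer ratings (0-4), rounded to 2 decimal places."""
    if not all(isinstance(r, int) and 0 <= r <= 4 for r in ratings):
        raise ValueError("each rating must be an integer between 0 and 4")
    return round(sum(ratings) / len(ratings), 2)

# Example: the ratings for finding 12 above (HR, CT, TG, SS).
print(mean_rating([0, 0, 1, 0]))  # 0.25
```

The same helper applies unchanged to the severity ratings in Section 4.4, since both scales run from 0 to 4.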
4.3 Top Five Problems
The top five problems according to their average (mean) severity ratings are described in more detail below. Problem number 1 is the problem (negative finding) with the highest mean severity. The severity rating scheme used to rank the problems is shown in Table 5.
Severity | Meaning |
---|---|
4 | Catastrophic problem |
3 | Serious problem |
2 | Minor problem |
1 | Cosmetic problem |
0 | Not a problem |
{Describe the top five most severe problems (negative findings) which emerged during the evaluation, according to the mean severity ratings.}
{For each problem, include at least one paragraph of text describing the problem and select the best (most representative) available video clip(s) to illustrate the problem. After the problem description, make a recommendation for how the problem might be addressed.}
N01. Gaudy Animated Image
Title: | Gaudy Animated Image |
---|---|
Description: | The home page contains a gaudy, animated image, which is extremely distracting and irritating. |
Video Clip(s): | n01-ct-neg01-gaudy-animated-image.mp4 |
How Reproducible?: | /en/home |
Heuristic: | A08 Aesthetic and Minimalist Design |
Only When: | |
Mean Severity: | 4.00 |
When the user arrives at the home page, they are greeted with a gaudy animated image containing seemingly random content, as shown in Figure X. It is extremely distracting and irritating.
The screen real estate in the middle of the home page is among the most valuable on a web site. It could be used much more productively, say to highlight particularly important news or success stories.
N02. Content Cut-Off on Narrow Screens
Title: | Content Cut-Off on Narrow Screens |
---|---|
Description: | On narrow screens, the content is initially cut off, ... |
Video Clip(s): | n02-tg-neg05-content-cut-off.mp4 |
How Reproducible?: | ... |
Heuristic: | ... |
Only When: | |
Mean Severity: | ... |
On narrow screens, the content is cut off on the right side of the page, as shown in Figure X+1. The user has to scroll right.
This issue could potentially be resolved by re-implementing the HTML table as a responsive layout using CSS Grid ...
N03. etc.
{Here follow the next three problems in the order they appear in the mean severity rankings...}
4.4 List of All Problems Found
{Aggregated list of all problems found (negative findings), in descending order of mean severity.}
{For each problem, include a link to the best (most representative) available video clip(s) to illustrate the finding. The name of each video clip should be linked to the corresponding video clip.}
Table 6 shows a list of all the problems found (negative findings) in the evaluation, sorted in decreasing order of average (mean) severity. The name codes assigned to each evaluator are shown in Table 7.
{Individual severity ratings are integer values between 0 and 4 (no fractions allowed). The mean severity is a fractional value given to 2 decimal places.}
No. | Title | Description | Video Clip(s) | How Reproducible? | Heuristic | Only When | Found By | Severity | |||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
HR | CT | TG | SS | HR | CT | TG | SS | Mean | |||||||
1 | Gaudy Animated Image | The home page contains a gaudy, animated image, which is extremely distracting and irritating. | n01-ct-neg01-gaudy-animated-image.mp4 | /en/home | A08 Aesthetic and Minimalist Design | y | y | 4 | 4 | 4 | 4 | 4.00 |
... | Chat → MyChat → Options | Only on Android | |||||||||||||
... | {in descending order of mean severity until...} | ... | |||||||||||||
... | |||||||||||||||
53 | Green Links | Visited links are displayed in green, which conflicts somewhat with the light grey background. | n53-tg-neg13-green-links.mp4 | Every page | A08 Aesthetic and Minimalist Design | y | 0 | 0 | 1 | 0 | 0.25 |
Code | Evaluator |
---|---|
HR | Harald Roth |
CT | Christian Traum |
TG | Thomas Gelb |
SS | Sabine Schwarz |
y | Found by this evaluator |
References
{References to related work and related studies. Include at least two more references of your own. Do not include references to Wikipedia (or copies of Wikipedia). You may remove or retain my sample citations and references as you wish. All references you list in this section must be cited somewhere in the document.}
- [Mor2023] Kate Moran and Kelley Gordon; How to Conduct a Heuristic Evaluation; Nielsen Norman Group, 25 Jun 2023. https://nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
- [Nie1990] Jakob Nielsen and Rolf Molich; Heuristic Evaluation of User Interfaces; Proc. Conference on Human Factors in Computing Systems (CHI’90). ACM. Seattle, Washington, USA, Apr 1990, pages 249–256. doi:10.1145/97243.97281
- [Nie1994] Jakob Nielsen; Enhancing the Explanatory Power of Usability Heuristics; Proc. Conference on Human Factors in Computing Systems (CHI’94). ACM. Boston, Massachusetts, USA, Apr 1994, pages 152–158. doi:10.1145/191666.191729
A Materials
The following materials were used or created by the evaluation team.
A.1 Heuristics
The evaluators used the Andrews General Usability Heuristics 2013, found in the file heuristics.pdf.
A.2 Individual Evaluation Logs
The individual log files for each evaluator are provided below:
- Harald Roth: log-hr.txt
- Christian Traum: log-ct.txt
- Thomas Gelb: log-tg.txt
- Sabine Schwarz: log-ss.txt