Heuristic Evaluation Plan
Human-Computer Interaction SS 2024
Group GT-XX
Harald Roth
Christian Traum
Thomas Gelb
Sabine Schwarz
Evaluation of the Web Site
example.com
HE Plan of XXth March 2024
{My instructions and comments are contained inside curly brackets. Remove them before you hand in your work!}
{You must use your own words. Do not copy material from the web, from colleagues from previous years, or from anywhere else. Do not generate text using AI-based tools.}
1 Introduction
{Short description of the web site to be evaluated and what should come out of the evaluation.}
2 Evaluation Methodology
{Describe what an HE is and how it is done. Write between 300 and 500 of your own words. Explain the method; do not just list the heuristics. Replace my sample text below with your own.}
Heuristic evaluation is ... ... by Jakob Nielsen and Rolf Molich in 1990 [Nie1990]. ... as can be seen in Moran and Gordon's guide [Mor2023].
For this evaluation, the "Andrews General Usability Heuristics", shown in Appendix A.1, will be used. These are based on and slightly adapted from Nielsen's revised set of ten usability heuristics [Nie1994].
{Cite at least two more references of your own in this section and include them in the References section below. Use only indirect quotations (paraphrasing), no direct quotations. You may remove or retain my sample citations and references as you wish.}
3 User Profiles
{Describe the kinds of user the site is trying to attract.}
{Group these users into categories according to their characteristics.}
{Describe the goals and typical tasks for each of these user groups.}
{Do not create personas for these user groups.}
4 Extent of the Evaluation
{Describe which parts of the web site will be evaluated and which parts will not. Unless agreed otherwise with your tutor, assume that you will evaluate the entire web site.}
5 Evaluators and Evaluation Environments
{Tabular overview of the different evaluators and their planned evaluation environments: i.e. the demographic data, hardware, browser, type and speed of internet connection, etc., which each evaluator will be using.}
The evaluators will use the evaluation environments shown in Table 1. For this evaluation, mobile devices will be operated in portrait mode.
{Fill out the tables as best you can at the moment of writing the HE Plan. Measure the download speed of your internet connection (in Austria, use netztest.at).}
{Note: the recording resolution is the resolution of the actual recorded video of your browser window. Make a test recording to find out. For example, be aware that Windows Display Scaling can change the recording resolution, OBS Studio can scale the recording, the browser window may not be maximised to full screen, etc.}
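One way to verify the actual recording resolution, assuming ffmpeg's `ffprobe` tool is installed and the test recording has been saved as `test.mp4` (a placeholder filename), is:

```shell
# Print the width and height of the first video stream of the test
# recording, e.g. "1920x1080". Replace test.mp4 with the actual
# filename of your own test recording.
ffprobe -v error -select_streams v:0 \
        -show_entries stream=width,height -of csv=s=x:p=0 test.mp4
```

This reports the resolution of the recorded file itself, which is what matters here, rather than the nominal screen resolution.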
{Enter the name, version, and platform of the video editing software and any transcoding software you plan to use.}
| Evaluator | Harald Roth (HR) | Christian Traum (CT) | Thomas Gelb (TG) | Sabine Schwarz (SS) |
|---|---|---|---|---|
| Age | 24 | 26 | 31 | 25 |
| Gender | male | male | male | female |
| Device | Sony Vaio VGN-Z51XG Laptop | Dell Precision 5510 Laptop | Samsung Galaxy S3 | iPad Pro 10.5″ (2017) |
| OS and Version | Windows 7 Pro DE SP1 | Windows 10 Pro EN v2004 | Android 4.1.2 | iOS 13.5.1 |
| Screen Size | 19″ | 15.6″ | 4.8″ | 10.5″ |
| Screen Resolution | 1280×1024 | 3840×2160 | 720×1280 | 1668×2224 |
| Web Browser | Chrome 85.0.4183.83 (64-bit) | Firefox 80.0 EN | Firefox 68.0 | Safari 605.1.15 |
| Ad Blocker | Privacy Badger 2020.7.21 | none | none | AdGuard 1.6 |
| Internet Connection | Magenta Take-IT, xDSL | A1, DSL | HoT, LTE (hotspot) | WiFi |
| Download Speed | 15 Mbit/s | 18 Mbit/s | 20 Mbit/s | 30 Mbit/s |
| Screen Recording Software | OBS Studio 25.0.8 | Camtasia 2022.4.1 | AZ Screen Recorder 5.9.2 | iOS Screen Recording |
| Recording Resolution | 1280×1024 | 1920×1080 | 720×1280 | 1668×2224 |
| Video Editing Software | LosslessCut 3.59.1 Win | LosslessCut 3.59.1 Win | Adobe Premiere Pro 24.2.1 Win | DaVinci Resolve 18 Mac |
| Video Transcoding Software | not required | not required | not required | HandBrake 1.7.3 Mac |
| Planned Date of Evaluation | 2024-04-14 | 2024-04-14 | 2024-04-15 | 2024-04-15 |
| Planned Time of Evaluation | 09:00-10:30 | 14:00-15:00 | 21:00-23:00 | 09:30-10:30 |

Table 1: The evaluators and their planned evaluation environments.
References
{References to related work and related studies. Include at least two more references of your own. Do not include references to Wikipedia (or clones of Wikipedia). You may remove or retain my sample citations and references as you wish. All references you list in this section must be cited somewhere in the document.}
- [Mor2023] Kate Moran and Kelley Gordon; How to Conduct a Heuristic Evaluation; Nielsen Norman Group, 25 Jun 2023. https://nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
- [Nie1990] Jakob Nielsen and Rolf Molich; Heuristic Evaluation of User Interfaces; Proc. Conference on Human Factors in Computing Systems (CHI’90), ACM, Seattle, Washington, USA, Apr 1990, pages 249–256. doi:10.1145/97243.97281
- [Nie1994] Jakob Nielsen; Enhancing the Explanatory Power of Usability Heuristics; Proc. Conference on Human Factors in Computing Systems (CHI’94), ACM, Boston, Massachusetts, USA, Apr 1994, pages 152–158. doi:10.1145/191666.191729
A Materials
The following materials will be used by the evaluation team.
A.1 Heuristics
The evaluators will use the Andrews General Usability Heuristics 2013, found in the file heuristics.pdf.
A.2 Skeleton Log Files
The evaluators will use the following (plain text) log files to collect notes during their individual evaluations:
- Harald Roth: log-hr.txt
- Christian Traum: log-ct.txt
- Thomas Gelb: log-tg.txt
- Sabine Schwarz: log-ss.txt
{Copy and adapt the skeleton log file log-ee.txt for each evaluator. Replace ee with the initials of each evaluator. Fill in the metadata at the start of the log file for each evaluator with their (planned) evaluation environments. Replace the initials EE in the example problem and positive finding IDs with the initials of the corresponding evaluator (e.g. HR-Neg01, HR-Pos01, etc.).}