For this exercise, your group will play the role of user interface consultants who have been contracted to evaluate a web site. The idea is to identify as many potential usability problems as possible and feed this input into the next design phase.
Your tutor will play the role of your client (the manager of the web site you are evaluating). You will present the results of your heuristic evaluation to your client at Meeting M2 (in one of the M2 slots).
Use the report template and materials provided in the Ex2 Materials.
First of all, read the sections in the lecture notes on heuristic evaluation.
The four (or three) members of your group will be the evaluators for the heuristic evaluation.
The web site which has been assigned to your group is visible in TeachCenter in the Practical Exercises section on the page entitled Web Site to Evaluate.
Unless otherwise notified, assume that you will be evaluating the entire web site.
Generally speaking, do not follow links to external web sites and evaluate them too. However, it often does make sense to follow links to subsites and consider them too (say, tickets.site.com). Check with your tutor if you are unsure.
Consider the target user population(s) for your assigned web site. Are there distinct user groups with distinct needs and goals?
If the web site has several distinct user groups, you should consider all of the potential user groups in the heuristic evaluation.
Consider the goals the users from these groups might have and typical tasks they might want to perform on the web site.
Plan for all of your evaluators to work at around the same time, to minimise the risk that the web site changes in between evaluations.
Coordinate between yourselves, so that the evaluators use a mix of devices and browsers to inspect the web site:
Two of the evaluators should evaluate on a mobile device (= smartphone or tablet, running Android or iOS), in portrait orientation, each using a different approved mobile web browser. On mobile, the approved browsers are: Chrome, Firefox, Safari, and Samsung Internet Browser.
The other evaluator(s) should evaluate on a PC (= desktop or laptop, running Windows, MacOS, or Unix), in landscape orientation, each using a different approved PC web browser. On PC, the approved browsers are: Chrome, Firefox, and Safari.
It is OK for one evaluator to use (say) Chrome on PC, and another evaluator to use Chrome on mobile, since they are different devices.
You must use real devices and not an emulator or simulator.
If you are using a mobile browser and your web site suggests you should go to the mobile version of the site, then do so. However, do not install a native app, even if your web site suggests you should do so.
To reflect real usage patterns, one of the evaluators on mobile and one of the evaluators on PC should plan to use an ad blocker.
If you find a problem which may be due to blocked content, unblock ads temporarily to see whether the problem persists. If the problem then goes away, document it as a problem apparently caused by ad blocking.
Decide how you will make screen video recordings on the various platforms you have chosen to evaluate. See my Guide To Session Capture on various platforms. You should plan to record your voice (with a live commentary while you work) as well as what happens on screen. For this heuristic evaluation, do not record the face of the evaluator in any video recording, only the screen and audio.
Do not use recording software which leaves behind a watermark. On mouse devices, turn on recording of the mouse pointer (even make it slightly larger), but turn off any mouse trails. On touch devices, see if your recording software can turn on display of touch events, but turn off touch trails.
The screen recording should be at most at FullHD resolution (1920×1080 pixels in landscape orientation or 1080×1920 pixels in portrait orientation). If your device has higher resolution than FullHD, please plan to adjust the settings or make the browser window smaller. If your device has lower resolution than FullHD, record at its maximum resolution. The standard browser window and GUI should be included in the recording. Other things outside the browser window should not be included in the recording. Do not leave unnecessary margins around the video.
If the native resolution of your device is too high in one or both directions, and you cannot adjust the settings or make the browser window smaller, you will have to record at higher pixel resolution and later transcode down uniformly to at most FullHD. Do not distort the video and do not create black or empty strips to either side (or top and bottom). See my Guide to Video Transcoding. Unfortunately, transcoding always results in a loss of quality.
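As a rough illustration, the following sketch downscales a too-large recording uniformly so that it fits within FullHD, calling ffmpeg from Python. It assumes ffmpeg is installed and on your PATH; the file names are placeholders for your own recordings.

```python
# Sketch (assuming ffmpeg is installed): downscale a recording uniformly so
# it fits within FullHD, without distortion and without black strips.
import subprocess

def downscale_to_fullhd(src: str, dst: str, portrait: bool = False) -> None:
    # Limit to 1920x1080 (landscape) or 1080x1920 (portrait).
    # force_original_aspect_ratio=decrease keeps the aspect ratio intact;
    # force_divisible_by=2 keeps the dimensions valid for H.264.
    width, height = (1080, 1920) if portrait else (1920, 1080)
    scale = (f"scale=w={width}:h={height}:"
             "force_original_aspect_ratio=decrease:force_divisible_by=2")
    subprocess.run(
        ["ffmpeg", "-i", src, "-vf", scale, "-c:v", "libx264", "-c:a", "copy", dst],
        check=True,
    )

downscale_to_fullhd("raw-capture.mp4", "capture-fullhd.mp4")
```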
Evaluate and record in light mode (dark text on a light background).
Use a good microphone, and make a test recording to make sure everything is working and the audio can be heard.
Our preferred video format for screen recordings is MP4 with H.264 video and AAC audio. If your device/software can record in that format natively, that is perfect.
Otherwise, you will have to convert/transcode your video clips to MP4 (with H.264 video and AAC audio) format later on. Unfortunately, transcoding always results in a loss of quality.
Plan to use the following recommended video settings:
Setting | Value |
---|---|
Container | MP4 |
Output Video Resolution | 1920×1080 (FullHD) [or 1080×1920] or lower |
Frame Rate | 20 fps |
Codec | H.264 |
Rate Control | VBR (Variable Bit Rate) |
Bit Rate | 5000 kbps (= 5 Mbps) |
Plan to use the following recommended audio settings:
Setting | Value |
---|---|
Channels | Stereo |
Codec | AAC (= mp4a) |
Sample Rate | 44100 Hz |
Bit Rate | 160 kbps |
Bits per Sample | 32 |
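If your recording software cannot produce this format directly, one possible transcode along the lines of the recommended settings above is sketched below, again assuming ffmpeg is available; the input file name is a placeholder.

```python
# Sketch (assuming ffmpeg is installed): transcode a recording to MP4 with
# H.264 video and AAC audio, using the recommended settings above.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "capture-original.mkv",  # placeholder input file
    "-c:v", "libx264",                       # H.264 video
    "-b:v", "5000k",                         # ~5000 kbps video bit rate
    "-r", "20",                              # 20 fps frame rate
    "-c:a", "aac",                           # AAC audio
    "-b:a", "160k",                          # 160 kbps audio bit rate
    "-ar", "44100",                          # 44100 Hz sample rate
    "-ac", "2",                              # stereo
    "capture.mp4",                           # MP4 container
], check=True)
```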
Decide how you will, if necessary, extract video clips from the screen capture videos or trim video clips. See my Guide to Video Editing.
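For instance, a clip could be cut from the full session recording roughly as follows (a sketch, assuming ffmpeg; the timestamps, duration, and file names are placeholders). Re-encoding rather than stream-copying gives frame-accurate cuts at the cost of some quality.

```python
# Sketch (assuming ffmpeg is installed): extract a 12-second clip starting
# at 01:23 from the full session recording.
import subprocess

subprocess.run([
    "ffmpeg", "-ss", "00:01:23",             # start time within the recording
    "-i", "session-ka.mp4",                  # placeholder recording name
    "-t", "12",                              # clip duration in seconds
    "-c:v", "libx264", "-b:v", "5000k",
    "-c:a", "aac", "-b:a", "160k",
    "ka-neg01-links-yellow.mp4",
], check=True)
```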
Every finding must be illustrated with a video clip.
Adapt the HE materials as necessary for your evaluation:
Assign each evaluator a two-letter shorthand code based on their initials. For example, Keith Andrews would be “ka” in lower case and “KA” in upper case.
If two evaluators in your team both have the same initials, then assign one of them a variant. For example, Ken Anderson might be assigned “kn” in lower case and “KN” in upper case.
Copy the skeleton template log file log-ee.txt from the materials to create a plain text log file for each evaluator: log-ee.txt, where ee is replaced by the lower case initials of the evaluator. Do not use any upper case letters, special characters, umlauts, or spaces in file names. For example, if your name is Keith Andrews, your log file should be named log-ka.txt.
Each evaluator should fill in the metadata at the start of their log file to match their (planned) evaluation environments. Enter the name of the corresponding evaluator. Take care to preserve the character encoding of the log files as UTF-8.
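If you prefer to script this step, a minimal sketch is shown below: it copies the template once per evaluator and writes the copies in UTF-8. The initials in the list are placeholders for your own team.

```python
# Sketch: create one UTF-8 log file per evaluator from the template.
from pathlib import Path

template = Path("log-ee.txt").read_text(encoding="utf-8")
for initials in ["ka", "hr", "ct"]:          # placeholder initials, lower case
    Path(f"log-{initials}.txt").write_text(template, encoding="utf-8")
```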
Each negative finding (problem) will be assigned an ID of the form EE-Neg01, EE-Neg02, etc. Each positive finding will be assigned an ID of the form EE-Pos01, EE-Pos02, etc. In each log file, replace the initials EE in the example problem and positive finding IDs with the upper case initials of the corresponding evaluator.
Each evaluator must inspect the interface alone.
Make the following preparations:
Set up screen and audio capture on your allocated device, as described above. Make a test recording to make sure everything is working and the audio can be heard.
Print out an A4 copy of the Andrews General Usability Heuristics provided in the materials.
Fill in the metadata at the top of the log file.
Reset the browser to a fresh state before you start evaluating.
Conduct the individual evaluation using the allocated device and browser (and ad blocker if allocated):
Most of the chosen web sites are available in German. Some are available in both English and German, and a few are available only in English.
If the web site has both English and German versions, each evaluator should pick one language (English or German) for the bulk of their evaluation, but then also take a look with the other language too.
Speak English for the audio commentary while evaluating, regardless of the language of the web site.
Keep in mind the target user population(s) and their typical tasks.
Start the screen video (+ audio) capture, making sure to hide any UI controls of the recording software.
To be clear, each evaluator should record their entire inspection session, so that video clips can later be cut from it, in case a finding cannot be reconstructed afterwards.
Try out your assigned web site first with all cookies enabled and then with most cookies disabled (only necessary cookies enabled) to see if there is any difference.
If you find a problem which appears to be due to disabling cookies, document it as such.
Inspect the interface, talking out loud and also noting problems and positive impressions in chronological order of discovery in the plain text log file you adapted from the materials.
The log file should be called log-ee.txt, where ee are your initials in lower case. Each negative finding (problem) is assigned an ID in chronological order of discovery of the form EE-Neg01, EE-Neg02, etc., where EE are the evaluator's upper case initials, for example KA-Neg01. Proceed analogously for positive findings: EE-Pos01, EE-Pos02, etc.
For each finding, enter the following information into the log file:
Finding ID: The ID of the finding, for example KA-Neg01.
Video Clip: The file name of the corresponding video clip, for example ka-neg01-keywords.mp4.
In addition, for negative findings (problems), also make a note of:
Heuristic: The corresponding heuristic (if any) which the problem falls under, e.g. A04 Consistency.
Most problems will fall under one of the heuristics, but it is OK if some of the problems you report are not covered by a heuristic (leave the heuristic field in the log file blank). We will not assign heuristics to positive findings in this evaluation, only to negative findings.
Once you have finished evaluating, save your screen video (+ audio) recording and your log file.
Make sure your log file is a plain text file encoded in UTF-8, of at most 100 kb.
Make sure that when you edit and save the log file, it is still encoded in UTF-8.
If there are many examples of the same general problem (for example, typos across several pages), count this as one problem and give two or three examples in the description.
In total, we would generally expect each individual evaluator to find between 10 and 20 problems and at least 3 positives on each device, depending on the quality of the web site you are evaluating.
Recreate each finding on the device where it was found and record a short video clip to illustrate it.
Every finding must be accompanied by at least one video clip (sometimes two, say with and without cookies, if that is what best illustrates the problem). The other evaluators will need the video clip to understand the finding, so they can properly assign a severity rating.
Each video clip must contain an audio track in English, concisely describing the finding.
While the audio quality of an original voice will be better, if you are concerned about privacy (the potential of voice cloning, etc.), it is OK to modify your voice with a voice modifier or to use synthetic speech (text-to-speech) for the audio commentaries. A voice modifier might be preferable to synthetic speech, since that can sound quite robotic, often with misplaced pauses and shaky pronunciation. In any case, the audio commentary must remain clear and understandable.
Where a finding cannot be recreated, extract a video clip from your evaluation recording.
If the existing audio commentary does not describe the finding adequately, replace the audio commentary with a new one.
Make sure the audio levels are high enough, so that your audio commentary can be clearly heard.
If necessary, you might want to normalise the audio level and/or remove any noise.
Each video clip should be long enough to illustrate the finding, but no longer. Typically, video clips are 10 to 15 seconds. At a maximum, each video clip should be no longer than 20 seconds duration.
In some rare cases, say to illustrate extremely slow loading time, it might be tempting to include a video clip longer than 20 seconds. In such a case (for the purposes of this course), edit the video clip to show say an initial 8 seconds, then a transition frame something like “28 seconds later” for 2 seconds, then the final 8 seconds (18 seconds duration in total).
Each video clip should be no larger than FullHD resolution (1920×1080 pixels in landscape orientation or 1080×1920 pixels in portrait orientation).
Otherwise, you will have to transcode down uniformly to at most FullHD. Do not distort the video and do not create black or empty strips to either side (or top and bottom). See my Guide to Video Transcoding. Unfortunately, transcoding always results in a loss of quality.
Each video clip should be no larger than 10 MB in size!
Otherwise, make it shorter in duration if possible, or else transcode to a lower video bitrate (with the resulting loss of quality).
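As a rule of thumb for picking a lower bit rate, the following sketch estimates the highest video bit rate that keeps a clip under 10 MB, given its duration and the recommended 160 kbps audio track (10 MB is taken as 10,000,000 bytes).

```python
# Sketch: estimate the highest video bit rate (in kbps) that keeps a clip
# under the 10 MB limit, given its duration and a 160 kbps audio track.
def max_video_kbps(duration_s: float, size_limit_mb: float = 10.0,
                   audio_kbps: int = 160) -> int:
    total_kbits = size_limit_mb * 8000                 # 10 MB = 80,000 kbits
    video_kbits = total_kbits - audio_kbps * duration_s
    return int(video_kbits / duration_s)

print(max_video_kbps(20))   # a 20-second clip should stay below ~3840 kbps
```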
Name your video clips according to the following naming scheme:
ee-negxx-keywords.mp4
ee-posxx-keywords.mp4
where ee is replaced by the lower case initials of the evaluator (e.g. ka for Keith Andrews), neg indicates a negative finding (problem) and pos indicates a positive finding, xx is the two-digit number in chronological order of discovery, and keywords comprises one to four words describing the finding separated by hyphens.
For example:
ka-neg01-links-yellow.mp4
ka-pos01-breadcrumbs-work-well.mp4
...
For file and folder names, use only lower case letters, digits, and hyphens, from the 7-bit ASCII character set. Do not use any upper case letters, spaces, underscores, or special characters in either the name or extension. The name ka_neg01_links_yellow.MP4 does not conform to the naming scheme.
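If you want to double-check your file names, a small sketch like the following can verify them against the naming scheme (the regular expression encodes the rules above; the example names are placeholders).

```python
# Sketch: check clip names against the naming scheme (two lower case initials,
# neg/pos, two-digit number, one to four hyphen-separated keywords, .mp4).
import re

PATTERN = re.compile(r"^[a-z]{2}-(neg|pos)\d{2}(-[a-z0-9]+){1,4}\.mp4$")

for name in ["ka-neg01-links-yellow.mp4", "ka_neg01_links_yellow.MP4"]:
    print(name, "conforms" if PATTERN.match(name) else "does not conform")
```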
Under Windows, turn on the display of file extensions, so that you can see them!
Enter the names of your video clips into your log file.
As a group, assemble the individual log files and associated video clips from each evaluator in your group.
Form a combined (aggregated) list of problems (and a combined list of positives). The spreadsheet helist.xlsx is provided for you to use (but you are not required to hand it in).
Proceed as follows:
Choose the evaluator with the longest list of problems and enter the problems into the spreadsheet.
Merge the problems from the other evaluators into this list.
If the same (or very similar) findings are found by two or more evaluators, combine them into one.
You then have multiple video clips to illustrate the finding.
If the same (or very similar) findings are found on two or more devices, combine them into one.
You then have multiple video clips to illustrate the finding.
List many small related issues (such as 15 individual typos) as one problem with many instances (rather than 15 problems).
The same applies to many examples of German content not being available in English (or vice versa).
Indicate which problems were found by which evaluator(s).
For each problem, determine whether it is general to all browsers and platforms, specific to a particular browser or platform, or occurs only when cookies are disabled or when an ad blocker is used. In the latter cases, indicate this in the “Only When...” column. If a problem is general, leave the corresponding cell empty.
Proceed analogously for the positive findings.
Individually (not working together), assign severity ratings to each problem. Use the 0 to 4 integer severity scale below:
Severity | Meaning |
---|---|
4 | Catastrophic Problem |
3 | Serious Problem |
2 | Minor Problem |
1 | Cosmetic Problem |
0 | Not a Problem |
Individual ratings must be integers; fractional ratings are not allowed.
Individually (not working together), assign positivity ratings to each positive finding. Use the 0 to 4 integer positivity scale below:
Positivity | Meaning |
---|---|
4 | Extremely Positive |
3 | Major Positive |
2 | Minor Positive |
1 | Cosmetic Positive |
0 | Not a Positive |
Individual ratings must be integers; fractional ratings are not allowed.
Calculate the mean severity rating for each problem to 2 decimal places. Calculate the mean positivity rating for each positive finding to 2 decimal places.
Sort the problems into descending order of average severity (most severe first). Sort the positives into descending order of average positivity (most positive first).
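If you are not doing this in the spreadsheet, the calculation and sorting amount to something like the following sketch (the ratings shown are made-up placeholders).

```python
# Sketch: mean severity to two decimal places, then descending sort.
problems = {
    "KA-Neg01": [4, 3, 3, 4],    # placeholder ratings, one per evaluator
    "HR-Neg02": [2, 2, 1, 2],
}

means = {pid: round(sum(r) / len(r), 2) for pid, r in problems.items()}
ranked = sorted(means.items(), key=lambda item: item[1], reverse=True)

for rank, (pid, mean) in enumerate(ranked, start=1):
    print(f"N{rank:02d}  {pid}  mean severity {mean:.2f}")
```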
Renumber the problems and positives, so that the finding at the top of each list is number 1. We will call these the ranked finding numbers. The top-ranked positive finding will henceforth be named P01, the second-ranked positive finding P02, etc. The top-ranked negative finding will be named N01, the second-ranked negative finding N02, and so forth.
For each finding, select the best (most representative) available video clip(s) to illustrate the finding. It often makes sense to include multiple video clips to illustrate one finding, for example, if they show different aspects of the same issue (say, one on mobile and the other on PC).
However, do not simply include every available video clip from every evaluator for every finding. Typically, you might select one or two, possibly three, video clips per finding. Only these are to be included in the report and handed in.
If you are using the provided spreadsheet, you can list all of the available video clips for a finding in the column “All Available Video Clips”, then enter the ones you want to use in the column “Selected Video Clip(s)”.
Copy and rename the selected video clips for each finding, so that the ranked finding number is prepended to the file name, according to the following scheme:
nnn-ee-negxx-keywords.mp4
pnn-ee-posxx-keywords.mp4
where n indicates a negative finding (problem) and p indicates a positive finding, nn is the ranked finding number, ee are the lower case initials of the evaluator, and negxx or posxx is the original numbering in discovery order. For example:
n01-ka-neg01-links-yellow.mp4
p01-ka-pos01-breadcrumbs-work-well.mp4
where n01-ka-neg01-links-yellow.mp4 is one of the selected video clips illustrating the highest ranked (most severe) problem N01 and was found by evaluator KA.
If you are using the provided spreadsheet, you can rename the selected video clips in the column “Selected Video Clip(s)”, leaving the original names in the column “All Available Video Clips”.
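The renaming itself can also be scripted; a minimal sketch (with a placeholder selection) is shown below.

```python
# Sketch: copy each selected clip into videos/ with its ranked finding
# number prepended. The mapping below is a placeholder for your selection.
from pathlib import Path
import shutil

selected = {
    "n01": "ka-neg01-links-yellow.mp4",
    "p01": "ka-pos01-breadcrumbs-work-well.mp4",
}

Path("videos").mkdir(exist_ok=True)
for rank, clip in selected.items():
    shutil.copy(clip, Path("videos") / f"{rank}-{clip}")
```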
Transfer the sorted lists of problems and of positives into simple XHTML5 table entries.
Do not just save the spreadsheet from Excel as HTML, since this creates bloated HTML code, which will not be valid XHTML5.
For example, save the spreadsheet as a CSV (comma separated value) file and then use a text editor to manipulate it:
Replace each field separator (comma) with </td><td>.
Prepend <tr><td> to the first cell and append a final </td></tr> to the final cell on each row.
If you use the provided spreadsheet, the column “All Available Video Clips” should not be included in the HTML5 table in the final report. You could, say, make a temporary copy of the spreadsheet, delete the column, and save the CSV from there. The column “Selected Video Clip(s)” is simply called “Video Clip(s)” in the HTML5 table in the final report.
If you know Perl, you might also use a Perl script to generate simple XHTML5 table entries from a CSV file. I have an example Perl script (rename it to csv2html.pl).
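If you are more comfortable with Python, a sketch of the same idea is shown below (this is not the provided Perl script): it reads a CSV file and prints one XHTML table row per line, escaping the cell contents. The file name is a placeholder.

```python
# Sketch: convert CSV rows into simple XHTML table rows.
import csv
import html

with open("problems.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        cells = "</td><td>".join(html.escape(cell) for cell in row)
        print(f"<tr><td>{cells}</td></tr>")
```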
Write a report of the heuristic evaluation using the skeleton report provided.
Your report should be in English.
Include some title information indicating the group number, the title/topic of the evaluation, and the name of each group member.
Include a summary of the evaluation and the most important findings written for your client's management. The description of the evaluation scope and procedure should take up no more than 25% of the executive summary. Most of the executive summary should summarise the main findings.
In real life, a manager would typically not have time to read the whole report, but would read only the executive summary to find out how the evaluation was done and what the main findings were.
Include a description of the methodology behind a Heuristic Evaluation, in your own words.
Adapt the citations and references as instructed in the comments.
Describe the assumptions made about user characteristics and groupings and the extent of the evaluation.
Describe the evaluation environments (hardware, browser version, OS, date of evaluation, etc.) used by each evaluator.
Explain how each evaluator extracted or created video clips on each particular evaluation device or platform, and how they were edited and transcoded if necessary.
Discuss the three most positive findings, each illustrated with a video clip.
Include an aggregate table of positive findings in descending order of mean positivity.
Analyse and discuss the five most severe problems (the top five from your sorted list), each illustrated with a video clip.
Include an aggregate table of problems in descending order of mean severity.
For the HE Report:
Use the skeleton report provided. Insert your own content where indicated.
Follow any instructions and guidance embedded in the skeleton {the text in curly brackets}.
Remove any instructions and guidance {the text in curly brackets} from the file before you hand it in.
All text files (HTML, CSS, log files, etc.) must be encoded in UTF-8 format.
Make sure that when you edit and save such a file, it remains in UTF-8 format.
Write straightforward, simple, valid XHTML5 using a plain text editor or an IDE like Visual Studio Code (single file, no frames, no Javascript, no export from Word, no conversion from LaTeX).
Do not prettify or change the structure of the HTML code. Use two spaces for indentation. Do not use any Tab characters. Leave the <section> structure and <section> ids intact. Do not make any changes to the CSS file or add any <style> elements to the HTML.
To ensure that your HTML is well-formed, make sure that it is valid XHTML5. See my Guide to Validating XHTML5.
If you are not familiar with XHTML5, consult Chapters 10 and 11 of my INM 2014 lecture notes.
All video clips and images referenced in the report must be handed in as local copies (referenced by relative links), so that the report is self-contained. Do not link to such assets remotely.
Video clips for the top three positives and top five problems should be given space in the report using the HTML5 video element, as shown in the skeleton template.
By default, the HTML5 video element initially displays the first frame of the video. For our video clips, it is often more appropriate to display a frame from somewhere within the video clip as the initial image. This can be achieved by extracting a still frame (in JPEG format) which better illustrates the finding, for example n01-ka-links.jpg for a video clip named n01-ka-links.mp4, and specifying it in the poster attribute of the video element.
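A poster frame can be extracted with ffmpeg, for example as in the sketch below (assuming ffmpeg is installed; the timestamp and file names are placeholders).

```python
# Sketch (assuming ffmpeg is installed): grab a single frame at 5 seconds
# as a JPEG poster image for the corresponding video clip.
import subprocess

subprocess.run([
    "ffmpeg", "-ss", "5",
    "-i", "n01-ka-neg01-links-yellow.mp4",   # placeholder clip name
    "-frames:v", "1",
    "n01-ka-neg01-links-yellow.jpg",
], check=True)
```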
Remember, all the video clips which are included in the HE Report must be no longer than 20 seconds duration, no larger than 10 MB, and no larger than FullHD resolution (1920×1080 pixels in landscape orientation or 1080×1920 pixels in portrait orientation).
Shorten or transcode any of the videos to be handed in with the report which exceed these limitations.
And, I ask (but cannot require) that you place your HE Report under a CC BY 4.0 licence. To do so, leave the following statement in place at the bottom of the report:
This work is placed under a Creative Commons Attribution 4.0 International (CC BY 4.0) licence.
If you do not wish to do so, then remove the statement.
Create a slide deck for a 15-minute group presentation of your HE Report to your client (tutor).
Use slides (screenful by screenful) with bulleted lists (not full sentences).
Make a separate presentation. Do not simply open your report in a web browser and project that.
Your slides should be in English and you should present in English.
Use PowerPoint. If you do not have PowerPoint, try LibreOffice Impress as an alternative. Save the file in PowerPoint .pptx format. Do not use Google Slides!
Students at Graz University of Technology have free access to Microsoft Office 365 by going to portal.office.com and logging in with their university account.
In Google Slides, it is not possible to embed video within the slide deck file, only to link it, which breaks if a video file moves or is deleted, or if you are offline.
You may have to install a video codec pack (such as the K-Lite Codec Pack for Windows or the GStreamer plugins for Linux), to play back video clips from within PowerPoint or Impress.
For the slide deck:
Embed any video clips within the .pptx file itself, so that the presentation is completely self-contained.
Create a title slide with the following information:
And, I ask (but cannot require) that you place your slide deck under a CC BY 4.0 licence. To do so, include the following statement in a smaller font at the bottom of the title slide:
This work is placed under a Creative Commons Attribution 4.0 International (CC BY 4.0) licence.
Address (at least) the following talking points within the slide deck:
Name the presentation gT-GG-he-slides.pptx, where T-GG is the number of your tutor and group. For example, g1-01-he-slides.pptx for Group G1-01.
Make a directory called gT-GG-he for your heuristic evaluation report, where T-GG is the number of your tutor and group. For example, g1-01-he for Group 1-01.
Include your main file he.html.
Copy over the files heuristics.pdf and report.css unchanged.
Create a subdirectory logs for your log files. Include the individual evaluation log files from each evaluator. Do not include the original template log file (log-ee.txt).
Create a subdirectory presentation for your presentation slides.
Place the video clips (and any corresponding poster images) selected for your report into a subdirectory called videos.
All video clips referenced in your report must be handed in as local copies, so that the report is self-contained. Do not include video clips which are not referenced in the report.
When naming your files and directories, use only lower case letters, digits, and hyphens, from the 7-bit ASCII character set.
Your directory structure should look something like this:
g1-01-he/
  he.html
  heuristics.pdf
  report.css
  logs/
    log-hr.txt
    log-ct.txt
    ...
  presentation/
    g1-01-he-slides.pptx
  videos/
    p01-ka-pos01-breadcrumbs-work-well.mp4
    ...
    n01-ka-neg01-links-yellow.mp4
    ...
Tidy up your directory. Do not leave junk files, backup files, etc. lying around.
Make a zip file of your hand-in directory, including the directory itself (not just the files inside).
Name the zip file gT-GG-he.zip, where T-GG is the number of your tutor and group. For example, g1-01-he.zip for Group 1-01.
The maximum size of your zip file is 500 MB (500,000,000 bytes). Check the size before uploading.
If your files are too big, you may have to delete something or reduce the size of something. Contact your tutor if you are unsure what to remove or make smaller.
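One way to create the zip file (including the top-level directory itself) and check its size is sketched below, using Group 1-01 as a placeholder.

```python
# Sketch: zip the hand-in directory itself (not just its contents) and
# check the result against the 500 MB limit.
import os
import shutil

shutil.make_archive("g1-01-he", "zip", root_dir=".", base_dir="g1-01-he")
size = os.path.getsize("g1-01-he.zip")
status = "over the 500 MB limit!" if size > 500_000_000 else "within the limit"
print(f"g1-01-he.zip is {size:,} bytes ({status})")
```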
Upload your zip file to TeachCenter before the submission deadline.
This exercise is a group exercise. The group makes a single submission (one zip file) as a group.
If you make any changes to your submission after the deadline, your submission will be flagged with the new timestamp and will be considered to be a late submission with the corresponding points deduction.
Submissions will cease to be accepted 48 hours after the deadline.