OSTI.GOV · U.S. Department of Energy
Office of Scientific and Technical Information

Title: Evaluation of Visualization Heuristics

Conference

Heuristics have been used in the Human-Computer Interaction (HCI) domain since the early 1990s for evaluating user interfaces; most notably, the heuristics developed by Nielsen and Molich (1990) have been widely used by the HCI community. Forsell and Johansson (2010) applied the same methodology as Nielsen and Molich to develop a set of 10 heuristics for information visualization. Currently, neither the information visualization nor the visual analytics community has adopted these heuristics, although several studies have been conducted using them. Hearst et al. (2015) found that applying the visualization heuristics alongside questions about the data produced complementary results, as the questions about the data could compensate for heuristics that were difficult to understand and apply. Väätäjä et al. (2016) found that heuristics related to interaction, veracity, and aesthetics needed to be added to the Forsell and Johansson (2010) set; they also noted that a lack of domain knowledge caused evaluators problems in conducting a heuristic evaluation.

We conducted a controlled experiment to determine what factors might affect expert evaluators' ability to conduct a heuristic evaluation of a static information visualization. In particular, we wanted to understand how their familiarity with the visualizations and their experience in conducting heuristic evaluations affected their evaluations of visualizations. We recruited 10 participants who were domain experts in the visualization field, each having experience with one or more of the following: user experience design or research, user interface development, and human-computer interaction research. We used the heuristic set proposed by Forsell and Johansson.
Participants were also asked to complete a demographic survey in which they rated their general experience with each visualization type, their experience conducting heuristic evaluations, their experience using each visualization type to extract information, and their experience developing and creating each visualization type. The participants were shown a series of 5 common visualizations (scatterplots, sunbursts, tree maps, parallel sets, and area graphs), all displaying the VAST 2008 Mini Challenge Two (Migrant Boats) data. Participants were given a suggested analytical question related to each visualization to keep in mind as they applied the 10 heuristics to each visualization. They were asked whether they could find any usability problems in the visualizations and which heuristic(s) helped them identify those problems. Participants were also asked to rate how relevant each heuristic was for evaluating each visualization and how confident they felt in their ability to apply the heuristic to each visualization. At the conclusion of the study, participants provided qualitative feedback on the usability issues of each visualization. Additionally, we collected eye-tracking data from 6 of the 10 participants; we will use these data to determine how long participants looked at the various visualizations and which parts of the visualizations they focused on.

We had two hypotheses for this empirical evaluation:
1. The Forsell and Johansson heuristics of Orientation and help, Data set reduction, and Prompting are less useful for static visualizations because they require some interaction to test completely.
2. Participants who do not have experience with heuristic evaluation will provide lower evaluation scores for the heuristics than those who have experience.

Related to hypothesis 1, because our study used only static visualizations, interaction-related usability challenges should be harder to identify, which should result in lower usefulness ratings for those heuristics. Preliminary results indicate that participants found it more difficult to apply heuristics to unfamiliar visualizations.
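The per-heuristic relevance and confidence ratings described above lend themselves to a simple aggregation before hypothesis testing. The sketch below is a minimal, hypothetical illustration of that tabulation step; the rating records, the 1-to-5 scale, and the specific values are assumptions for demonstration only, not data from the study (the heuristic names come from the Forsell and Johansson set mentioned above).

```python
# Hypothetical sketch: tabulating mean relevance per (visualization,
# heuristic) pair from participant ratings. All values are invented
# for illustration and are not the study's actual data.
from collections import defaultdict
from statistics import mean

# Each record: (participant, visualization, heuristic, relevance, confidence)
ratings = [
    (1, "Scatterplot", "Orientation and help", 5, 4),
    (2, "Scatterplot", "Orientation and help", 4, 5),
    (1, "Sunburst", "Prompting", 2, 3),
    (2, "Sunburst", "Prompting", 1, 2),
]

# Group relevance scores by (visualization, heuristic)
by_pair = defaultdict(list)
for _, vis, heuristic, relevance, _ in ratings:
    by_pair[(vis, heuristic)].append(relevance)

# Mean relevance for each pair
summary = {pair: mean(vals) for pair, vals in by_pair.items()}
print(summary[("Sunburst", "Prompting")])  # 1.5
```

Under hypothesis 1, interaction-dependent heuristics such as Prompting would be expected to show lower mean relevance for static visualizations in a summary like this.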

Research Organization:
Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
Sponsoring Organization:
USDOE
DOE Contract Number:
AC05-76RL01830
OSTI ID:
1599075
Report Number(s):
PNNL-SA-132234
Resource Relation:
Journal Volume: 10901; Conference: International Conference on Human-Computer Interaction (HCI 2018): Human-Computer Interaction. Theories, Methods, and Human Issues, July 15-20, 2018, Las Vegas, NV. Lecture Notes in Computer Science
Country of Publication:
United States
Language:
English

Similar Records

User-Centered Evaluation of Visual Analytics
Book · October 2017

Automated Cache Performance Analysis And Optimization
Technical Report · December 2013

A Heuristic Approach to Value-Driven Evaluation of Visualizations
Journal Article · September 2018 · IEEE Transactions on Visualization and Computer Graphics
