OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Visual Analytics Science and Technology

Abstract

It is an honor to welcome you to the first theme issue of Information Visualization (IVS) dedicated entirely to the study of visual analytics. It all started with the establishment of the U.S. Department of Homeland Security (DHS) sponsored National Visualization and Analytics Center™ (NVAC™) at the Pacific Northwest National Laboratory (PNNL) in 2004. In 2005, under the leadership of NVAC, a team of the world’s best and brightest multidisciplinary scholars coauthored its first research and development (R&D) agenda, Illuminating the Path, which defines visual analytics as “the science of analytical reasoning facilitated by interactive visual interfaces.” Among the most exciting, challenging, and educational developments since then was the first IEEE Symposium on Visual Analytics Science and Technology (VAST), held in Baltimore, Maryland, in October 2006. This theme issue features seven outstanding articles selected from the IEEE VAST proceedings and a commentary article contributed by Jim Thomas, the director of NVAC, on the status and progress of the center.

Authors:
Wong, Pak C.
Publication Date:
2007-03-01
Research Org.:
Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
902682
Report Number(s):
PNNL-SA-53463
400904120; TRN: US200718%%56
DOE Contract Number:
AC05-76RL01830
Resource Type:
Journal Article
Resource Relation:
Journal Name: Information Visualization; Journal Volume: 6; Pages: 1-2; Publication Year: 2007
Country of Publication:
United States
Language:
English
Subject:
45 MILITARY TECHNOLOGY, WEAPONRY, AND NATIONAL DEFENSE; 99 GENERAL AND MISCELLANEOUS//MATHEMATICS, COMPUTING, AND INFORMATION SCIENCE; INFORMATION; COMPUTER GRAPHICS; NATIONAL SECURITY; PATTERN RECOGNITION; visual analytics

Citation Formats

Wong, Pak C. Visual Analytics Science and Technology. United States: N. p., 2007. Web. doi:10.1057/palgrave.ivs.9500149.
Wong, Pak C. Visual Analytics Science and Technology. United States. doi:10.1057/palgrave.ivs.9500149.
Wong, Pak C. Mar 01, 2007. "Visual Analytics Science and Technology". United States. doi:10.1057/palgrave.ivs.9500149.
@article{osti_902682,
  title = {Visual Analytics Science and Technology},
  author = {Wong, Pak C.},
  abstractNote = {It is an honor to welcome you to the first theme issue of Information Visualization (IVS) dedicated entirely to the study of visual analytics. It all started with the establishment of the U.S. Department of Homeland Security (DHS) sponsored National Visualization and Analytics Center™ (NVAC™) at the Pacific Northwest National Laboratory (PNNL) in 2004. In 2005, under the leadership of NVAC, a team of the world’s best and brightest multidisciplinary scholars coauthored its first research and development (R&D) agenda, Illuminating the Path, which defines visual analytics as “the science of analytical reasoning facilitated by interactive visual interfaces.” Among the most exciting, challenging, and educational developments since then was the first IEEE Symposium on Visual Analytics Science and Technology (VAST), held in Baltimore, Maryland, in October 2006. This theme issue features seven outstanding articles selected from the IEEE VAST proceedings and a commentary article contributed by Jim Thomas, the director of NVAC, on the status and progress of the center.},
  doi = {10.1057/palgrave.ivs.9500149},
  journal = {Information Visualization},
  volume = {6},
  pages = {1--2},
  place = {United States},
  year = {2007},
  month = mar
}
  • The evaluation of visual analytics environments was identified in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation involves assessing not only the visualizations, interactions, or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation of a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally, to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end users and can be incorporated into their infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics, and evaluation methodologies (a minimal sketch of these components appears after this list). As noted in [Thomas 2005], it is difficult and expensive for any one researcher to set up an evaluation test bed, so in many cases evaluation is set up for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found in [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end user. This requires representative users, representative tasks, and metrics that measure utility to the end user. This is even more difficult because the test methodology now depends on access to representative end users who can participate in the evaluation. In many cases, the sensitive nature of the data and tasks and the difficulty of reaching busy analysts place an even greater burden on researchers completing this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997; Shneiderman 2009]. Some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to work on the types of problems faced by inaccessible users, and commercial vendors have difficulty evaluating and improving their products when they cannot observe real users working with them. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006, when the visual analytics community was initially getting organized, no such guidelines existed.
Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology that would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to researchers so that they could improve their systems.
  • Five years after the science of visual analytics was formally established, we use two different studies to assess the current state of the community and evaluate the progress it has made in the past few years. The first study is a comparative analysis of intellectual and scholastic accomplishments recently made by the visual analytics community. The second aims to measure the degree of community reach and internet penetration of visual-analytics-related resources. This paper describes our efforts to harvest the study data, conduct the analysis, and make interpretations based on parallel comparisons with five other established computer science areas.
  • The authors describe the transition process for visual analytic tools and contrast it with the transition process for more traditional software tools. The paper takes these differences into account and describes a user-oriented approach to technology transition, including a discussion of key factors that should be considered and adapted to each situation. The progress made in transitioning visual analytic tools over the past five years is described, and the remaining challenges are enumerated.
  • Multimedia analysis has focused on images, video, and to some extent audio, and has made progress in single channels, excluding text. Visual analytics has focused on user interaction with data during the analytic process, plus the fundamental mathematics, and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is combining multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.
  • The term Visual Analytics has been around for almost five years, but there are still ongoing discussions about what it actually is and, in particular, what is new about it. The core of our view on Visual Analytics is the new enabling and accessible analytic reasoning interactions supported by the combination of automated and visual analytics. In this paper, we outline the scope of Visual Analytics using two problem classes and three methodological classes in order to work out the need for and purpose of Visual Analytics. The respective methods are explained, along with examples of analytic reasoning interaction, leading to a glimpse into the future of how Visual Analytics methods will enable us to go beyond what is possible when using the two kinds of methods separately.
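To make the test-bed components named in the first abstract concrete (datasets with embedded tasks, utility metrics, and a scoring methodology), here is a minimal, hypothetical Python sketch. Every class, field, and function name below is an illustrative assumption, not an API drawn from any of the papers cited above.

from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Illustrative sketch only: names are assumptions, not from the cited work.

@dataclass
class Task:
    description: str        # e.g., "identify the coordinated activity in the data"
    ground_truth: object    # embedded answer key used for scoring

@dataclass
class Dataset:
    name: str
    records: List[dict]
    tasks: List[Task] = field(default_factory=list)

# A utility metric compares a submitted answer with the ground truth
# and returns a score (e.g., accuracy of identified entities).
Metric = Callable[[object, object], float]

@dataclass
class TestBed:
    dataset: Dataset
    metrics: Dict[str, Metric]

    def evaluate(self, answers: Dict[int, object]) -> Dict[str, float]:
        # Score one participant's answers (keyed by task index) against
        # every task, averaging each metric over the task set.
        n = max(len(self.dataset.tasks), 1)
        scores = {name: 0.0 for name in self.metrics}
        for i, task in enumerate(self.dataset.tasks):
            for name, metric in self.metrics.items():
                scores[name] += metric(answers.get(i), task.ground_truth)
        return {name: total / n for name, total in scores.items()}

# Example usage: an exact-match metric applied to a one-task toy data set.
exact = lambda answer, truth: 1.0 if answer == truth else 0.0
bed = TestBed(Dataset("toy", [], [Task("who did it?", "insider")]),
              {"exact": exact})
print(bed.evaluate({0: "insider"}))   # {'exact': 1.0}

Packaging the ground truth with each task is what lets a community test bed score many participants' submissions without granting everyone access to sensitive analysts or data, which is the surrogate-user problem the abstract describes.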