OSTI.GOV title logo U.S. Department of Energy
Office of Scientific and Technical Information

Title: RadViz Deluxe: An Attribute-Aware Display for Multivariate Data

Publication Date: 2017
Resource Type: Journal Article: Published Article
Journal Name: Processes
Additional Journal Information: Journal Volume: 5; Journal Issue: 4; Journal ID: ISSN 2227-9717
Country of Publication: Switzerland

Citation Formats

Cheng, Shenghui, Xu, Wei, and Mueller, Klaus. RadViz Deluxe: An Attribute-Aware Display for Multivariate Data. Switzerland: N. p., 2017. Web. doi:10.3390/pr5040075.
Cheng, Shenghui, Xu, Wei, & Mueller, Klaus. RadViz Deluxe: An Attribute-Aware Display for Multivariate Data. Switzerland. doi:10.3390/pr5040075.
Cheng, Shenghui, Xu, Wei, and Mueller, Klaus. 2017. "RadViz Deluxe: An Attribute-Aware Display for Multivariate Data". Switzerland. doi:10.3390/pr5040075.
@article{cheng2017radviz,
  title   = {RadViz Deluxe: An Attribute-Aware Display for Multivariate Data},
  author  = {Cheng, Shenghui and Xu, Wei and Mueller, Klaus},
  doi     = {10.3390/pr5040075},
  journal = {Processes},
  number  = {4},
  volume  = {5},
  place   = {Switzerland},
  year    = {2017}
}

Journal Article:
Free Publicly Available Full Text
Publisher's Version of Record at 10.3390/pr5040075
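
For readers unfamiliar with the underlying technique: classical RadViz, which RadViz Deluxe extends, places one anchor per attribute on a unit circle and maps each record to the anchor-weighted average of its (normalized) attribute values. The sketch below illustrates that classical mapping only; it is not the paper's code, and the function name `radviz_point` is our own.

```python
import math

def radviz_point(values):
    """Project one multivariate record into 2-D via the classic RadViz rule:
    anchors evenly spaced on the unit circle, point = weighted mean of anchors."""
    n = len(values)
    anchors = [(math.cos(2 * math.pi * j / n), math.sin(2 * math.pi * j / n))
               for j in range(n)]
    total = sum(values)
    if total == 0:
        return (0.0, 0.0)  # an all-zero record maps to the center
    x = sum(v * ax for v, (ax, _) in zip(values, anchors)) / total
    y = sum(v * ay for v, (_, ay) in zip(values, anchors)) / total
    return (x, y)

# A record dominated by attribute 0 lands at that attribute's anchor (1, 0):
p = radviz_point([1.0, 0.0, 0.0, 0.0])
```

The "attribute-aware" improvements of RadViz Deluxe address known distortions of this basic rule (e.g., distinct records mapping to the same point).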

  • Geospatial technologies and digital data have developed and disseminated rapidly in conjunction with increasing computing performance and internet availability. The ability to store and transmit large datasets has encouraged the development of national datasets in geospatial format. National datasets are used by numerous agencies for analysis and modeling purposes because these datasets are standardized and are considered to be of acceptable accuracy. At Oak Ridge National Laboratory, a national population model incorporating multiple ancillary variables was developed, and one of its required inputs is a school database. This paper examines inaccuracies present within two national school datasets, TeleAtlas North America (TANA) and National Center of Education Statistics (NCES). Schools are an important component of the population model because they serve as locations containing dense clusters of vulnerable populations. It is therefore essential to validate the quality of the school input data, which was made possible by increasing national coverage of high-resolution imagery. Schools were also chosen because a 'real-world' representation of K-12 schools for the Philadelphia School District was produced, thereby enabling 'ground-truthing' of the national datasets. Analyses found the national datasets to be neither standardized nor complete, containing 76 to 90% of existing schools. Temporal accuracy was also poor: 89% of enrollment values failed to match 2003 data. Spatial rectification was required for 87% of the NCES points, of which 58% of the errors were attributed to the geocoding process. Lastly, it was found that combining the two national datasets produced a more useful and accurate solution. Acknowledgment: Prepared by Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831-6285, managed by UT-Battelle, LLC for the U.S. Department of Energy under contract no. DE-AC05-00OR22725. Copyright: This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains, and the publisher, by accepting the article for publication, acknowledges that the United States Government retains, a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
  • In our paper, we present and analyze a BDDC algorithm for a class of elliptic problems in the three-dimensional H(curl) space. Compared with existing results, our condition number estimate requires fewer assumptions and also involves two fewer powers of log(H/h), making it consistent with optimal estimates for other elliptic problems. Here, H/h is the maximum of Hi/hi over all subdomains, where Hi and hi are the diameter and the smallest element diameter of the subdomain Ωi. The analysis makes use of two recent developments. The first is our new approach to averaging across the subdomain interfaces, while the second is a new technical tool that allows arguments involving trace classes to be avoided. Furthermore, numerical examples are presented to confirm the theory and demonstrate the importance of the new averaging approach in certain cases.
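
As context for the log(H/h) claim above, optimal condition-number estimates for BDDC-type preconditioners of elliptic problems are typically of the following generic form (a sketch, not the paper's exact statement; M is the preconditioner, A the system matrix, and C a constant independent of H, h, and the number of subdomains):

```latex
\kappa\left(M^{-1} A\right) \;\le\; C \left(1 + \log \frac{H}{h}\right)^{2}
```

Earlier H(curl) estimates carried additional powers of log(H/h); the abstract states that removing two of them brings the bound in line with this standard form.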
  • Here, a BDDC domain decomposition preconditioner is defined by a coarse component, expressed in terms of primal constraints; a weighted average across the interface between the subdomains; and local components given in terms of solvers of local subdomain problems. BDDC methods for vector field problems discretized with Raviart-Thomas finite elements are introduced. The methods are based on a new type of weighted average and an adaptive selection of primal constraints, developed to deal with coefficients with high contrast even inside individual subdomains. For problems with very many subdomains, a third level of the preconditioner is introduced. Assuming that the subdomains are all built from elements of a coarse triangulation of the given domain, and that in each subdomain the material parameters are consistent, one obtains a bound for the condition number of the preconditioned linear system which is independent of the values and jumps of these parameters across the subdomains' interface. Numerical experiments, using the PETSc library, are also presented which support the theory and show the effectiveness of the algorithms even for problems not covered by the theory. Also included are experiments with Brezzi-Douglas-Marini finite-element approximations.
  • Bulk data transfer is facing significant challenges in the coming era of big data. There are multiple performance bottlenecks along the end-to-end path from the source to the destination storage system. The limitations of current-generation data transfer tools themselves can have a significant impact on end-to-end data transfer rates. In this paper, we identify the issues that lead to underperformance of these tools, and present a new data transfer tool with an innovative I/O scheduler called MDTM. The MDTM scheduler exploits underlying multicore layouts to optimize throughput by reducing delay and contention for I/O reading and writing operations. With our evaluations, we show how MDTM successfully avoids NUMA-based congestion and significantly improves end-to-end data transfer rates across high-speed wide area networks.
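
The NUMA-aware scheduling idea described in that abstract can be sketched as follows. This is an illustrative toy, not MDTM's actual API: the function `plan_affinity`, the two-node `topology`, and the reader/writer naming are all our own assumptions, chosen only to show how pinning reader and writer threads to disjoint core groups keeps each thread on the NUMA node nearest its device and avoids cross-node contention.

```python
def plan_affinity(numa_nodes, readers, writers):
    """Assign reader threads to cores on node 0 (assumed near the NIC) and
    writer threads to cores on node 1 (assumed near the storage), round-robin
    within each node. `numa_nodes` maps node id -> list of core ids."""
    plan = {}
    for i in range(readers):
        cores = numa_nodes[0]          # network-side node
        plan[f"reader-{i}"] = cores[i % len(cores)]
    for i in range(writers):
        cores = numa_nodes[1]          # storage-side node
        plan[f"writer-{i}"] = cores[i % len(cores)]
    return plan

# Hypothetical 2-node, 8-core layout:
topology = {0: [0, 1, 2, 3], 1: [4, 5, 6, 7]}
plan = plan_affinity(topology, readers=2, writers=2)
```

On Linux, such a plan could be applied per thread with `os.sched_setaffinity`; the point of the sketch is only the disjoint, locality-aware partition of cores between the I/O roles.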