DOE PAGES · U.S. Department of Energy
Office of Scientific and Technical Information

Title: Z-checker: A framework for assessing lossy compression of scientific data

Journal Article · International Journal of High Performance Computing Applications
 [1];  [2];  [2];  [3];  [4]
  1. Univ. of California, Riverside, CA (United States)
  2. Argonne National Lab. (ANL), Lemont, IL (United States)
  3. Univ. of California, Riverside, CA (United States); Beijing Univ. of Technology, Beijing (China)
  4. Argonne National Lab. (ANL), Lemont, IL (United States); Univ. of Illinois Urbana-Champaign, Champaign, IL (United States)

Because of the vast volume of data being produced by today's scientific simulations and experiments, lossy data compressors that allow user-controlled loss of accuracy during compression are a relevant solution for significantly reducing the data size. However, lossy compressor developers and users lack a tool for exploring the features of scientific data sets and understanding how the data are altered by compression in a systematic and reliable way. To address this gap, we have designed and implemented a generic framework called Z-checker. On the one hand, Z-checker combines a battery of data analysis components for data compression. On the other hand, Z-checker is implemented as an open-source community tool to which users and developers can contribute and add new analysis components based on their additional analysis demands. In this study, we present a survey of existing lossy compressors. Then, we describe the design of the Z-checker framework, in which we integrated evaluation metrics proposed in prior work as well as other analysis tools. Specifically, for lossy compressor developers, Z-checker can be used to characterize critical properties of any data set (such as entropy, distribution, power spectrum, principal component analysis, and autocorrelation) in order to improve compression strategies. For lossy compression users, Z-checker can assess the compression quality (compression ratio and bit rate) and provide various global distortion analyses comparing the original data with the decompressed data (peak signal-to-noise ratio, normalized mean squared error, rate-distortion, rate-compression error, spectral, distribution, and derivatives), as well as statistical analysis of the compression error (maximum, minimum, and average error; autocorrelation; and distribution of errors). Z-checker can perform the analysis with either coarse granularity (throughout the whole data set) or fine granularity (over user-defined blocks), so that users and developers can select the best-fit adaptive compressors for different parts of the data set. Z-checker features a visualization interface that displays all analysis results, in addition to basic views of the data sets such as time series. To the best of our knowledge, Z-checker is the first tool designed to assess lossy compression comprehensively for scientific data sets.
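The distortion and quality metrics named in the abstract are standard quantities. As a rough illustration only (this is not Z-checker's own interface, and every function and variable name below is hypothetical), the following NumPy sketch computes compression ratio, bit rate, maximum pointwise error, normalized mean squared error, PSNR based on the data's value range, and the lag-1 autocorrelation of the compression error from an original array, its decompressed counterpart, and the compressed size:

```python
import numpy as np

def assess_lossy_compression(original, decompressed, compressed_nbytes):
    """Hypothetical sketch of the global metrics listed in the abstract.

    original, decompressed : 1-D float arrays of equal length
    compressed_nbytes      : size of the compressed stream in bytes
    """
    err = decompressed - original
    value_range = original.max() - original.min()

    # Compression quality
    compression_ratio = original.nbytes / compressed_nbytes
    bit_rate = 8.0 * compressed_nbytes / original.size      # bits per value

    # Global distortion: original vs. decompressed data
    max_abs_err = np.abs(err).max()
    mse = np.mean(err ** 2)
    nmse = mse / np.mean(original ** 2)                      # one common normalization
    psnr = 20.0 * np.log10(value_range) - 10.0 * np.log10(mse)

    # Statistical analysis of the compression error: lag-1 autocorrelation
    e = err - err.mean()
    autocorr_lag1 = np.dot(e[:-1], e[1:]) / np.dot(e, e)

    return dict(compression_ratio=compression_ratio, bit_rate=bit_rate,
                max_abs_err=max_abs_err, nmse=nmse, psnr=psnr,
                autocorr_lag1=autocorr_lag1)
```

The fine-granularity analysis described above would amount to applying the same computation to user-defined blocks of the array instead of the whole data set; the block size and the exact normalization conventions are choices left to the tool's configuration.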

Research Organization:
Argonne National Laboratory (ANL), Argonne, IL (United States)
Sponsoring Organization:
National Science Foundation (NSF); USDOE
Grant/Contract Number:
AC02-06CH11357
OSTI ID:
1437773
Alternate ID(s):
OSTI ID: 1510019
Journal Information:
International Journal of High Performance Computing Applications, Vol. 33, Issue 2; ISSN 1094-3420
Publisher:
SAGE
Country of Publication:
United States
Language:
English

References (19)

Exploration of Lossy Compression for Application-Level Checkpoint/Restart conference May 2015
Fast Error-Bounded Lossy HPC Data Compression with SZ conference May 2016
Fast Lossless Compression of Scientific Floating-Point Data conference January 2006
Fixed-Rate Compressed Floating-Point Arrays journal December 2014
HACC: extreme scaling and performance across diverse architectures journal December 2016
A study of the characteristics of white noise using the empirical mode decomposition method journal June 2004
  • Wu, Zhaohua; Huang, Norden E.
  • Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, Vol. 460, Issue 2046 https://doi.org/10.1098/rspa.2003.1221
ISABELA for effective in situ compression of scientific data journal July 2012
  • Lakshminarasimhan, Sriram; Shah, Neil; Ethier, Stephane
  • Concurrency and Computation: Practice and Experience, Vol. 25, Issue 4 https://doi.org/10.1002/cpe.2887
Significantly Improving Lossy Compression for Scientific Data Sets Based on Multidimensional Prediction and Error-Controlled Quantization conference May 2017
The JPEG still picture compression standard journal January 1992
Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics journal January 2014
A methodology for evaluating the impact of data compression on climate simulation data conference January 2014
  • Baker, Allison H.; Xu, Haiying; Dennis, John M.
  • Proceedings of the 23rd international symposium on High-performance parallel and distributed computing - HPDC '14 https://doi.org/10.1145/2600212.2600217
The Earth System Grid: Supporting the Next Generation of Climate Modeling Research journal March 2005
A universal algorithm for sequential data compression journal May 1977
NUMARCK: Machine Learning Algorithm for Resiliency and Checkpointing conference November 2014
  • Chen, Zhengzhang; Son, Seung Woo; Hendrix, William
  • SC14: International Conference for High Performance Computing, Networking, Storage and Analysis https://doi.org/10.1109/SC.2014.65
Advanced Photon Source journal March 2016
Industrial-era global ocean heat uptake doubles in recent decades journal January 2016
A Method for the Construction of Minimum-Redundancy Codes journal September 1952
Evaluating lossy data compression on climate simulation data within a large ensemble journal January 2016
