Z-checker: A framework for assessing lossy compression of scientific data
- Department of Computer Science and Engineering, University of California, Riverside, CA, USA
- Division of Computer Science and Mathematics, Argonne National Laboratory, Lemont, IL, USA
- Department of Computer Science and Engineering, University of California, Riverside, CA, USA; Beijing University of Technology, Beijing, China
- Division of Computer Science and Mathematics, Argonne National Laboratory, Lemont, IL, USA; Parallel Computing Institute, University of Illinois Urbana–Champaign, Champaign, IL, USA
Because of the vast volume of data being produced by today's scientific simulations and experiments, lossy data compressors that allow user-controlled loss of accuracy during compression are a relevant solution for significantly reducing the data size. However, lossy compressor developers and users lack a tool to explore the features of scientific data sets and to understand the data alteration after compression in a systematic and reliable way. To address this gap, we have designed and implemented a generic framework called Z-checker. On the one hand, Z-checker combines a battery of data analysis components for data compression. On the other hand, Z-checker is implemented as an open-source community tool to which users and developers can contribute and add new analysis components based on their additional analysis demands. In this article, we present a survey of existing lossy compressors. Then, we describe the design framework of Z-checker, in which we integrated evaluation metrics proposed in prior work as well as other analysis tools. Specifically, for lossy compressor developers, Z-checker can be used to characterize critical properties (such as entropy, distribution, power spectrum, principal component analysis, and autocorrelation) of any data set to improve compression strategies. For lossy compression users, Z-checker can assess the compression quality (compression ratio and bit rate) and provide various global distortion analyses comparing the original data with the decompressed data (peak signal-to-noise ratio, normalized mean squared error, rate–distortion, rate-compression error, spectral, distribution, and derivatives), as well as statistical analysis of the compression error (maximum, minimum, and average error; autocorrelation; and distribution of errors).
Z-checker can perform the analysis with either coarse granularity (across the whole data set) or fine granularity (by user-defined blocks), such that users and developers can select the best-fit, adaptive compressors for different parts of the data set. Z-checker features a visualization interface that displays all analysis results in addition to some basic views of the data sets, such as time series. To the best of our knowledge, Z-checker is the first tool designed to assess lossy compression comprehensively for scientific data sets.
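To make the global distortion metrics named in the abstract concrete, the sketch below computes a few of them (PSNR, normalized RMSE, error statistics, compression ratio, and bit rate) from an original array, its decompressed counterpart, and the compressed size. This is an illustrative reimplementation of standard formulas, not Z-checker's actual API; the function and key names are hypothetical.

```python
import numpy as np

def distortion_metrics(original, decompressed, compressed_nbytes):
    """Illustrative global distortion metrics in the spirit of Z-checker.
    Names here are hypothetical, not Z-checker's interface."""
    orig = np.asarray(original, dtype=np.float64).ravel()
    dec = np.asarray(decompressed, dtype=np.float64).ravel()
    err = dec - orig
    value_range = orig.max() - orig.min()          # peak-to-peak range of the data
    mse = np.mean(err ** 2)                        # mean squared error
    nrmse = np.sqrt(mse) / value_range             # RMSE normalized by the value range
    # PSNR in dB, with the value range as the peak signal
    psnr = 20 * np.log10(value_range) - 10 * np.log10(mse)
    cr = orig.nbytes / compressed_nbytes           # compression ratio
    bit_rate = 8 * compressed_nbytes / orig.size   # bits stored per data value
    return {
        "max_abs_err": np.abs(err).max(),
        "min_abs_err": np.abs(err).min(),
        "avg_err": err.mean(),
        "nrmse": nrmse,
        "psnr_db": psnr,
        "compression_ratio": cr,
        "bit_rate": bit_rate,
    }
```

For the blockwise (fine-granularity) mode described above, the same function could simply be applied to each user-defined block of the arrays, yielding per-block distortion maps.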
- Research Organization:
- Argonne National Laboratory (ANL), Argonne, IL (United States)
- Sponsoring Organization:
- USDOE; National Science Foundation (NSF)
- Grant/Contract Number:
- AC02-06CH11357
- OSTI ID:
- 1437773
- Alternate ID(s):
- OSTI ID: 1510019
- Journal Information:
- International Journal of High Performance Computing Applications, Vol. 33, Issue 2; ISSN 1094-3420
- Publisher:
- SAGE Publications
- Country of Publication:
- United States
- Language:
- English