DOE Patents | U.S. Department of Energy
Office of Scientific and Technical Information

Title: Compression embedding

Abstract

A method of embedding auxiliary information into the digital representation of host data created by a lossy compression technique. The method applies to data compressed with lossy algorithms based on series expansion, quantization to a finite number of symbols, and entropy coding. Lossy compression methods represent the original data as integer indices having redundancy and an uncertainty in value of one unit. Indices that are adjacent in value are manipulated to encode auxiliary data. By a substantially reverse process, the embedded auxiliary data can be retrieved easily by an authorized user. Lossy compression methods use lossless compression, also known as entropy coding, to reduce the intermediate index representation to its final size. The efficiency of the entropy coding is increased by manipulating the indices at the intermediate stage in the manner taught by the method.
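
In current terminology, the abstract describes hiding a payload in the quantization indices produced by a transform coder, choosing between index values that are adjacent (and therefore already uncertain by one unit). The following minimal Python sketch illustrates that general idea, assuming a parity (least-significant-bit) convention for the indices; the function names, the parity choice, and the demonstration parameters are illustrative assumptions and are not taken from the patent.

    import numpy as np


    def embed_bits(indices, bits):
        """Embed auxiliary bits by nudging quantization indices to an
        adjacent value so that each index's parity carries one payload bit.
        Illustrative sketch only, not the literal patented procedure."""
        out = indices.copy()
        for k, bit in enumerate(bits):
            if out[k] % 2 != bit:                   # parity disagrees with the payload bit
                out[k] += 1 if out[k] >= 0 else -1  # move to the adjacent index value
        return out


    def extract_bits(indices, n_bits):
        """The 'substantially reverse process': read the payload back from
        the parity of the first n_bits indices."""
        return [int(indices[k] % 2) for k in range(n_bits)]


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        coeffs = rng.normal(scale=50.0, size=16)    # stand-in series-expansion coefficients
        q = 8.0                                     # quantization step
        indices = np.round(coeffs / q).astype(int)  # lossy step: each index uncertain by one unit
        payload = [1, 0, 1, 1, 0, 1, 0, 0]

        marked = embed_bits(indices, payload)
        assert extract_bits(marked, len(payload)) == payload

        # Reconstruction from the marked indices differs from the unmarked
        # reconstruction by at most one quantization step per coefficient.
        print(np.max(np.abs(marked * q - indices * q)))  # <= q

Because the manipulated indices are then passed to the entropy coder, the choice between adjacent values also affects the final compressed size, which is the entropy-coding efficiency the abstract refers to.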

Inventors:
 Sandford, Maxwell T., II [1];  Handel, Theodore G. [2];  Bradley, Jonathan N. [2]
  1. Los Alamos, NM
  2. Los Alamos, NM
Issue Date:
1998
Research Org.:
Los Alamos National Laboratory (LANL), Los Alamos, NM (United States)
OSTI Identifier:
871418
Patent Number(s):
5727092
Assignee:
Regents of the University of California (Oakland, CA)
Patent Classifications (CPCs):
H (ELECTRICITY) > H04 (ELECTRIC COMMUNICATION TECHNIQUE) > H04N (PICTORIAL COMMUNICATION, e.g. TELEVISION)
DOE Contract Number:  
W-7405-ENG-36
Resource Type:
Patent
Country of Publication:
United States
Language:
English
Subject:
compression embedding; lossy compression; entropy coding; quantization; series expansion; auxiliary information; auxiliary data; host data; digital representation; authorized user; intermediate stage; /382/

Citation Formats

Sandford, II, Maxwell T., Handel, Theodore G, and Bradley, Jonathan N. Compression embedding. United States: N. p., 1998. Web.
Sandford, II, Maxwell T., Handel, Theodore G, & Bradley, Jonathan N. Compression embedding. United States.
Sandford, II, Maxwell T., Handel, Theodore G, and Bradley, Jonathan N. 1998. "Compression embedding". United States. https://www.osti.gov/servlets/purl/871418.
@article{osti_871418,
title = {Compression embedding},
author = {Sandford, II, Maxwell T. and Handel, Theodore G and Bradley, Jonathan N},
abstractNote = {A method of embedding auxiliary information into the digital representation of host data created by a lossy compression technique. The method applies to data compressed with lossy algorithms based on series expansion, quantization to a finite number of symbols, and entropy coding. Lossy compression methods represent the original data as integer indices having redundancy and an uncertainty in value of one unit. Indices that are adjacent in value are manipulated to encode auxiliary data. By a substantially reverse process, the embedded auxiliary data can be retrieved easily by an authorized user. Lossy compression methods use lossless compression, also known as entropy coding, to reduce the intermediate index representation to its final size. The efficiency of the entropy coding is increased by manipulating the indices at the intermediate stage in the manner taught by the method.},
place = {United States},
year = {1998}
}