U.S. Department of Energy
Office of Scientific and Technical Information

Faster Johnson–Lindenstrauss transforms via Kronecker products

Journal Article · Information and Inference (Online)
Jin, Ruhui [1]; Kolda, Tamara G. [2]; Ward, Rachel [1]
  1. Univ. of Texas, Austin, TX (United States)
  2. Sandia National Lab. (SNL-CA), Livermore, CA (United States)
The Kronecker product is an important matrix operation with a wide range of applications in signal processing, graph theory, quantum computing and deep learning. In this work, we introduce a generalization of the fast Johnson–Lindenstrauss projection for embedding vectors with Kronecker product structure, the Kronecker fast Johnson–Lindenstrauss transform (KFJLT). When applied to vectors with Kronecker structure, the KFJLT reduces the embedding cost of the standard fast Johnson–Lindenstrauss transform by an exponential factor, by avoiding explicit formation of the full Kronecker product. We prove that this computational gain comes at only a small price in embedding power: consider a finite set of $p$ points in a tensor product of $d$ constituent Euclidean spaces $\bigotimes_{k=d}^{1}\mathbb{R}^{n_k}$, and let $N = \prod_{k=1}^{d} n_k$. With high probability, a random KFJLT matrix of dimension $m \times N$ embeds the set of points up to multiplicative distortion $(1 \pm \varepsilon)$ provided $m \gtrsim \varepsilon^{-2} \log^{2d-1}(p) \log N$. We conclude by describing a direct application of the KFJLT to the efficient solution of large-scale Kronecker-structured least squares problems arising in fitting the CP tensor decomposition.
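As a concrete illustration of the structure-exploiting step, the following is a minimal NumPy sketch of a KFJLT-style embedding, written under stated assumptions rather than taken from the paper: each factor is mixed by a Rademacher sign flip followed by a unitary DFT, and the $m$ embedded entries are read off the (never formed) Kronecker product by sampling multi-indices uniformly with replacement. The paper's construction may differ in details such as the choice of mixing transform and the sampling scheme; the function name `kfjlt` and all variable names are illustrative only.

```python
import numpy as np

def kfjlt(factors, m, rng=None):
    """Sketch of a KFJLT-style embedding of kron(factors) into C^m (illustrative)."""
    rng = np.random.default_rng() if rng is None else rng
    # Mix each factor independently: y_k = F_k D_k x_k, with D_k a random
    # Rademacher sign diagonal and F_k the unitary DFT (cost ~ n_k log n_k).
    mixed = []
    for x in factors:
        signs = rng.choice([-1.0, 1.0], size=x.size)
        mixed.append(np.fft.fft(signs * x, norm="ortho"))
    # Draw m multi-indices uniformly with replacement; each embedded entry is
    # a product of one entry from each mixed factor, i.e. a subsampled entry
    # of the Kronecker product, which is never formed explicitly.
    sizes = [y.size for y in mixed]
    N = float(np.prod(sizes))
    out = np.ones(m, dtype=complex)
    for y, n in zip(mixed, sizes):
        out *= y[rng.integers(0, n, size=m)]
    return np.sqrt(N / m) * out

# Example: embed x3 ⊗ x2 ⊗ x1 (N = 64**3) into C^256 without forming the
# length-N vector; E[||y||^2] matches ||x1||^2 * ||x2||^2 * ||x3||^2.
x1, x2, x3 = (np.random.randn(64) for _ in range(3))
y = kfjlt([x3, x2, x1], m=256)
```

In this sketch the per-factor FFTs cost on the order of $\sum_k n_k \log n_k$ and the sampling step costs $O(md)$, which is the source of the savings over applying a standard fast Johnson–Lindenstrauss transform to the fully formed length-$N$ Kronecker product.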
Research Organization:
Sandia National Laboratories (SNL-CA), Livermore, CA (United States)
Sponsoring Organization:
National Science Foundation (NSF); US Air Force Office of Scientific Research (AFOSR); USDOE National Nuclear Security Administration (NNSA); USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC-21)
Grant/Contract Number:
AC04-94AL85000; NA0003525
OSTI ID:
1738933
Report Number(s):
SAND--2020-13781J; 692879
Journal Information:
Information and Inference (Online); Vol. 10, Issue 4; ISSN 2049-8772
Publisher:
Oxford University Press
Country of Publication:
United States
Language:
English
