Stochastic Gradients for Large-Scale Tensor Decomposition
- Sandia National Lab. (SNL-CA), Livermore, CA (United States)
- Univ. of Michigan, Ann Arbor, MI (United States)
Tensor decomposition is a well-known tool for multiway data analysis. This work proposes using stochastic gradients for efficient generalized canonical polyadic (GCP) tensor decomposition of large-scale tensors. GCP tensor decomposition is a recently proposed version of tensor decomposition that allows for a variety of loss functions, such as Bernoulli loss for binary data or Huber loss for robust estimation. Here, the stochastic gradient is formed from randomly sampled elements of the tensor and is efficient because it can be computed using the sparse matricized-tensor-times-Khatri-Rao-product (MTTKRP) kernel. For dense tensors, we simply use uniform sampling. For sparse tensors, we propose two types of stratified sampling that give precedence to sampling nonzeros. Numerical results demonstrate the advantages of the proposed approach and its scalability to large-scale problems.
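To illustrate the sampled-gradient idea from the abstract, here is a minimal NumPy sketch of one stochastic gradient step for a 3-way GCP model, assuming Bernoulli loss with a log-odds link and uniform entry sampling (the dense-tensor case described above). This is not the authors' implementation: the function name `gcp_sgd_step`, the learning rate, and the sample count are illustrative assumptions, and the stratified sampling used for sparse tensors is not shown.

```python
import numpy as np

def gcp_sgd_step(X, factors, n_samples, lr, rng):
    """One SGD step for 3-way GCP with Bernoulli loss (log-odds link),
    using uniformly sampled tensor entries (dense-tensor case).
    Sketch only; not the paper's implementation."""
    shape = X.shape
    # Sample entry indices uniformly with replacement.
    idx = [rng.integers(0, n, size=n_samples) for n in shape]
    # Model values at sampled entries: m_i = sum_r prod_k A_k[i_k, r].
    m = (factors[0][idx[0]] * factors[1][idx[1]] * factors[2][idx[2]]).sum(axis=1)
    x = X[tuple(idx)]
    # Elementwise derivative of f(x, m) = log(1 + e^m) - x*m.
    y = 1.0 / (1.0 + np.exp(-m)) - x
    # Rescale so the sampled sum is an unbiased estimate of the full sum.
    y *= X.size / n_samples
    # Factor gradients via a sampled (sparse) MTTKRP: scatter-add the
    # weighted Hadamard products of the other factors' sampled rows.
    for k in range(3):
        j, l = [j for j in range(3) if j != k]
        z = y[:, None] * factors[j][idx[j]] * factors[l][idx[l]]
        g = np.zeros_like(factors[k])
        np.add.at(g, idx[k], z)
        factors[k] -= lr * g
    return factors

# Hypothetical usage on a small synthetic binary tensor.
rng = np.random.default_rng(0)
shape, R = (30, 40, 50), 5
X = (rng.random(shape) < 0.1).astype(float)
factors = [0.1 * rng.standard_normal((n, R)) for n in shape]
for _ in range(500):
    gcp_sgd_step(X, factors, n_samples=1000, lr=1e-4, rng=rng)
```

The scatter-add over sampled indices is the key efficiency point: each step touches only the sampled entries and the corresponding factor rows, rather than the full tensor.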
- Research Organization:
- Sandia National Laboratories (SNL-CA), Livermore, CA (United States)
- Sponsoring Organization:
- USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC-21)
- Grant/Contract Number:
- AC04-94AL85000
- OSTI ID:
- 1738932
- Alternate ID(s):
- OSTI ID: 1767900
- Report Number(s):
- SAND-2020-13780J; 692878
- Journal Information:
- SIAM Journal on Mathematics of Data Science, Vol. 2, Issue 4; ISSN 2577-0187
- Publisher:
- Society for Industrial and Applied Mathematics (SIAM)
- Country of Publication:
- United States
- Language:
- English
Similar Records
Practical Leverage-Based Sampling for Low-Rank Tensor Decomposition (preprint, January 2020)
Streaming Generalized Canonical Polyadic Tensor Decompositions
Software for Sparse Tensor Decomposition on Emerging Computing Architectures