DOE PAGES, U.S. Department of Energy
Office of Scientific and Technical Information

Title: On the Solution of ℓ0-Constrained Sparse Inverse Covariance Estimation Problems

Abstract

The sparse inverse covariance matrix is used to model conditional dependencies between variables in a graphical model to fit a multivariate Gaussian distribution. Estimating the matrix from data is well known to be computationally expensive for large-scale problems. Sparsity is employed to handle noise in the data and to promote interpretability of the learned model. Although the use of a convex ℓ1 regularizer to encourage sparsity is common practice, the combinatorial ℓ0 penalty often has more favorable statistical properties. In this paper, we directly constrain sparsity by specifying a maximum allowable number of nonzeros, in other words, by imposing an ℓ0 constraint. Here, we introduce an efficient approximate Newton algorithm using warm starts for solving the nonconvex ℓ0-constrained inverse covariance learning problem. Numerical experiments on standard data sets show that the performance of the proposed algorithm is competitive with state-of-the-art methods.
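As a rough illustration of the problem described in the abstract (not the authors' algorithm), the sketch below shows a single gradient-projection step for an ℓ0-constrained inverse covariance problem of the form: minimize <S, X> - log det X over positive definite X, subject to at most kappa nonzero off-diagonal pairs, where S is the sample covariance. All names (l0_project, kappa, the step size) and the toy data are illustrative assumptions; a practical solver, such as the paper's approximate Newton method with warm starts, would also safeguard positive definiteness.

    import numpy as np

    def l0_project(X, kappa):
        """Keep only the kappa largest-magnitude off-diagonal pairs of a symmetric X."""
        P = X.copy()
        rows, cols = np.triu_indices_from(P, k=1)            # strict upper triangle
        vals = np.abs(P[rows, cols])
        if kappa < vals.size:
            drop = np.argsort(vals)[: vals.size - kappa]      # smallest entries get zeroed
            P[rows[drop], cols[drop]] = 0.0
            P[cols[drop], rows[drop]] = 0.0                   # mirror to preserve symmetry
        return P

    def projected_gradient_step(X, S, kappa, step=1e-2):
        """One gradient-projection iteration on f(X) = <S, X> - log det X."""
        grad = S - np.linalg.inv(X)                           # gradient of the negative log-likelihood
        return l0_project(X - step * grad, kappa)             # no positive-definiteness safeguard here

    # Toy usage: 5 variables, allow at most 3 nonzero off-diagonal pairs.
    rng = np.random.default_rng(0)
    data = rng.standard_normal((200, 5))
    S = np.cov(data, rowvar=False) + 1e-3 * np.eye(5)
    X = np.linalg.inv(S)                                      # dense warm start
    for _ in range(50):
        X = projected_gradient_step(X, S, kappa=3)

The projection step is simply hard thresholding of the off-diagonal entries, which is what makes the ℓ0 constraint cheap to enforce even though the feasible set is nonconvex.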

Authors:
Phan, Dzung T. [1] (ORCiD); Menickelly, Matt [2]
  1. IBM Thomas J. Watson Research Center, Yorktown Heights, NY (United States)
  2. Argonne National Lab. (ANL), Lemont, IL (United States)
Publication Date:
October 8, 2020
Research Org.:
Argonne National Lab. (ANL), Argonne, IL (United States)
Sponsoring Org.:
IBM Thomas J. Watson Research Center, Yorktown Heights, NY (United States)
OSTI Identifier:
1839067
Grant/Contract Number:  
AC02-06CH11357
Resource Type:
Accepted Manuscript
Journal Name:
INFORMS Journal on Computing
Additional Journal Information:
Journal Volume: 33; Journal Issue: 2; Journal ID: ISSN 1091-9856
Publisher:
INFORMS
Country of Publication:
United States
Language:
English
Subject:
97 MATHEMATICS AND COMPUTING; ℓ0-constrained; approximate Newton; gradient projection; inverse covariance; machine learning; optimization; sparsity

Citation Formats

Phan, Dzung T., and Menickelly, Matt. On the Solution of ℓ0-Constrained Sparse Inverse Covariance Estimation Problems. United States: N. p., 2020. Web. doi:10.1287/ijoc.2020.0991.
Phan, Dzung T., & Menickelly, Matt. On the Solution of ℓ0-Constrained Sparse Inverse Covariance Estimation Problems. United States. https://doi.org/10.1287/ijoc.2020.0991
Phan, Dzung T., and Menickelly, Matt. Thu Oct 08, 2020. "On the Solution of ℓ0-Constrained Sparse Inverse Covariance Estimation Problems". United States. https://doi.org/10.1287/ijoc.2020.0991. https://www.osti.gov/servlets/purl/1839067.
@article{osti_1839067,
title = {On the Solution of ℓ0-Constrained Sparse Inverse Covariance Estimation Problems},
author = {Phan, Dzung T. and Menickelly, Matt},
abstractNote = {The sparse inverse covariance matrix is used to model conditional dependencies between variables in a graphical model to fit a multivariate Gaussian distribution. Estimating the matrix from data is well known to be computationally expensive for large-scale problems. Sparsity is employed to handle noise in the data and to promote interpretability of the learned model. Although the use of a convex ℓ1 regularizer to encourage sparsity is common practice, the combinatorial ℓ0 penalty often has more favorable statistical properties. In this paper, we directly constrain sparsity by specifying a maximum allowable number of nonzeros, in other words, by imposing an ℓ0 constraint. Here, we introduce an efficient approximate Newton algorithm using warm starts for solving the nonconvex ℓ0-constrained inverse covariance learning problem. Numerical experiments on standard data sets show that the performance of the proposed algorithm is competitive with state-of-the-art methods.},
doi = {10.1287/ijoc.2020.0991},
journal = {INFORMS Journal on Computing},
number = 2,
volume = 33,
place = {United States},
year = {2020},
month = {Oct}
}

Works referenced in this record:

The Adaptive Lasso and Its Oracle Properties
journal, December 2006


Sparse Reconstruction by Separable Approximation
journal, July 2009

  • Wright, S. J.; Nowak, R. D.; Figueiredo, M. A. T.
  • IEEE Transactions on Signal Processing, Vol. 57, Issue 7
  • DOI: 10.1109/TSP.2009.2016892

An efficient optimization approach for a cardinality-constrained index tracking problem
journal, July 2015


Best subset selection via a modern optimization lens
journal, April 2016

  • Bertsimas, Dimitris; King, Angela; Mazumder, Rahul
  • The Annals of Statistics, Vol. 44, Issue 2
  • DOI: 10.1214/15-AOS1388

Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm
journal, January 2010

  • Wang, Chengjing; Sun, Defeng; Toh, Kim-Chuan
  • SIAM Journal on Optimization, Vol. 20, Issue 6
  • DOI: 10.1137/090772514

Sparse Inverse Covariance Estimation with L0 Penalty for Network Construction with Omics Data
journal, March 2016

  • Liu, Zhenqiu; Lin, Shili; Deng, Nan
  • Journal of Computational Biology, Vol. 23, Issue 3
  • DOI: 10.1089/cmb.2015.0102

An inexact interior point method for L1-regularized sparse covariance selection
journal, November 2010


Sparse inverse covariance estimation with the graphical lasso
journal, December 2007


A Nonmonotone Line Search Technique for Newton’s Method
journal, August 1986

  • Grippo, L.; Lampariello, F.; Lucidi, S.
  • SIAM Journal on Numerical Analysis, Vol. 23, Issue 4
  • DOI: 10.1137/0723046

Sparse Approximation via Penalty Decomposition Methods
journal, January 2013

  • Lu, Zhaosong; Zhang, Yong
  • SIAM Journal on Optimization, Vol. 23, Issue 4
  • DOI: 10.1137/100808071

Nonmonotone Spectral Projected Gradient Methods on Convex Sets
journal, January 2000

  • Birgin, Ernesto G.; Martínez, José Mario; Raydan, Marcos
  • SIAM Journal on Optimization, Vol. 10, Issue 4
  • DOI: 10.1137/S1052623497330963

Projection algorithms for nonconvex minimization with application to sparse principal component analysis
journal, February 2016

  • Hager, William W.; Phan, Dzung T.; Zhu, Jiajie
  • Journal of Global Optimization, Vol. 65, Issue 4
  • DOI: 10.1007/s10898-016-0402-z

A Multilevel Framework for Sparse Optimization with Application to Inverse Covariance Estimation and Logistic Regression
journal, January 2016

  • Treister, Eran; Turek, Javier S.; Yavneh, Irad
  • SIAM Journal on Scientific Computing, Vol. 38, Issue 5
  • DOI: 10.1137/15M102469X

Two-Point Step Size Gradient Methods
journal, January 1988

  • Barzilai, Jonathan; Borwein, Jonathan M.
  • IMA Journal of Numerical Analysis, Vol. 8, Issue 1
  • DOI: 10.1093/imanum/8.1.141

Sparse permutation invariant covariance estimation
journal, January 2008

  • Rothman, Adam J.; Bickel, Peter J.; Levina, Elizaveta
  • Electronic Journal of Statistics, Vol. 2
  • DOI: 10.1214/08-EJS176

ℓ0 Sparse Inverse Covariance Estimation
journal, June 2015

  • Marjanovic, Goran; Hero, Alfred O.
  • IEEE Transactions on Signal Processing, Vol. 63, Issue 12
  • DOI: 10.1109/TSP.2015.2416680

First-Order Methods for Sparse Covariance Selection
journal, January 2008

  • d'Aspremont, Alexandre; Banerjee, Onureena; El Ghaoui, Laurent
  • SIAM Journal on Matrix Analysis and Applications, Vol. 30, Issue 1
  • DOI: 10.1137/060670985