MODEL SELECTION IN GAUSSIAN REGRESSION FOR HIGH-DIMENSIONAL DATA

Felix Abramovich and Vadim Grinshtein
Abstract. We consider model selection in Gaussian regression, where the number of predictors may even exceed the number of observations. The proposed procedure is based on a penalized least squares criterion with a complexity penalty on the model size. We discuss asymptotic properties of the resulting estimators corresponding to linear and so-called 2k ln(p/k)-type nonlinear penalties, for nearly orthogonal and for multicollinear designs. We show that no linear penalty can be simultaneously adapted to both sparse and dense setups, while 2k ln(p/k)-type penalties achieve a wide adaptivity range. We also present a Bayesian perspective on the procedure that provides additional insight and can be used as a tool for obtaining a wide class of penalized estimators associated with various complexity penalties.
1 Introduction
Modern statistics faces new challenges, as problems have exploded in both size and complexity. The analysis of very large, complex high-dimensional data sets requires a fresh look at traditional statistical methods.
Consider the standard Gaussian linear regression setup

y = Xβ + ε,    (1)
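The penalized least squares idea described in the abstract admits a direct, if brute-force, illustration: minimize the residual sum of squares over all candidate models plus a complexity penalty that grows with the model size k. The sketch below is not the authors' procedure; the function name, the constant `c`, the exact penalty form `c * sigma2 * k * (ln(p/k) + 1)` (a variant of the 2k ln(p/k) type), and the assumption of known noise variance are all illustrative.

```python
import itertools
import numpy as np

def complexity_penalized_ls(y, X, sigma2, c=2.0):
    """Illustrative model selection by penalized least squares.

    Exhaustively searches all subsets of predictors and returns the one
    minimizing RSS + c * sigma2 * k * (ln(p/k) + 1), a stand-in for the
    2k ln(p/k)-type complexity penalties discussed in the paper.
    The noise variance sigma2 is assumed known here for simplicity.
    """
    n, p = X.shape
    best_score, best_subset = np.inf, ()
    for k in range(p + 1):
        # Complexity penalty on the model size k (illustrative constants).
        pen = c * sigma2 * k * (np.log(p / k) + 1.0) if k > 0 else 0.0
        for subset in itertools.combinations(range(p), k):
            if k == 0:
                rss = float(y @ y)  # null model: no predictors
            else:
                Xs = X[:, list(subset)]
                beta_hat, *_ = np.linalg.lstsq(Xs, y, rcond=None)
                rss = float(np.sum((y - Xs @ beta_hat) ** 2))
            if rss + pen < best_score:
                best_score, best_subset = rss + pen, subset
    return best_subset
```

Exhaustive search over all 2^p subsets is only feasible for small p; the point here is the shape of the criterion, not the search strategy.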

  

Source: Abramovich, Felix - School of Mathematical Sciences, Tel Aviv University

 

Collections: Mathematics