Computational Statistics & Data Analysis 50 (2006) 267-284
www.elsevier.com/locate/csda
Improved predictions penalizing both slope and
curvature in additive models
Magne Aldrin
Norwegian Computing Center, SAMBA P.O. Box 114, Blindern, N-0314 Oslo, Norway
Received 1 August 2004; accepted 2 August 2004
Available online 28 August 2004
Abstract
A new method is proposed to estimate the nonlinear functions in an additive regression model.
Usually, these functions are estimated by penalized least squares, penalizing the curvatures of the
functions. The new method penalizes the slopes as well, which is the type of penalization used in
ridge regression for linear models. Tuning (or smoothing) parameters are estimated by permuted leave-
k-out cross-validation. The prediction performance of the various methods is compared in a simulation
experiment: penalizing both slope and curvature is either better than or as good as penalizing curvature
only.
© 2004 Elsevier B.V. All rights reserved.
Keywords: Penalized B-splines; Penalized least squares; Penalized likelihood; Ridge regression; Generalized
additive models; Cross-validation
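The combined penalty described in the abstract can be sketched with a simple discrete (Whittaker-style) smoother: the fitted values minimize a residual sum of squares plus a first-difference (slope) penalty and a second-difference (curvature) penalty. This is an illustrative assumption, not the paper's penalized B-spline implementation; the function names and tuning values below are hypothetical.

```python
import numpy as np

def diff_matrix(n, order):
    # Build the order-th difference operator as a dense (n-order, n) matrix.
    D = np.eye(n)
    for _ in range(order):
        D = np.diff(D, axis=0)
    return D

def penalized_fit(y, lam_slope, lam_curv):
    # Minimize ||y - f||^2 + lam_slope*||D1 f||^2 + lam_curv*||D2 f||^2
    # over fitted values f at the observation points.
    n = len(y)
    D1 = diff_matrix(n, 1)  # slope (ridge-type) penalty, as in the abstract
    D2 = diff_matrix(n, 2)  # curvature penalty, as in standard smoothing
    A = np.eye(n) + lam_slope * (D1.T @ D1) + lam_curv * (D2.T @ D2)
    return np.linalg.solve(A, y)

# Illustrative usage on noisy data from a smooth function.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, size=100)
f_hat = penalized_fit(y, lam_slope=1.0, lam_curv=50.0)
```

With both tuning parameters set to zero the fit reproduces the data exactly; increasing `lam_slope` shrinks the fitted function toward a constant, which is the ridge-regression-like behavior the abstract attributes to the slope penalty.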
1. Introduction

  

Source: Aldrin, Magne - Norsk Regnesentral

 

Collections: Biology and Medicine; Mathematics