OSTI.GOV — U.S. Department of Energy
Office of Scientific and Technical Information

Title: Training neural networks using sequential extended Kalman filtering

Conference · OSTI ID: 46580

Recent work has demonstrated the use of the extended Kalman filter (EKF) as an alternative to gradient-descent backpropagation when training multi-layer perceptrons. The EKF approach significantly improves convergence properties but at the cost of greater storage and computational complexity. Feldkamp et al. have described a decoupled version of the EKF which preserves the training advantages of the general EKF but which reduces the storage and computational requirements. This paper reviews the general and decoupled EKF approaches and presents sequentialized versions which provide further computational savings over the batch forms. The usefulness of the sequentialized EKF algorithms is demonstrated on a pattern classification problem.
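The core idea reviewed above — treating the network weights as the state of an extended Kalman filter and processing training patterns one at a time — can be illustrated with a small NumPy sketch. This is not the paper's decoupled or sequentialized formulation; it is a generic global-EKF weight update for a tiny 2-2-1 perceptron, and the network size, the noise covariances `R` and `Q`, and the use of a numerical Jacobian are all assumptions made for illustration.

```python
import numpy as np

def mlp(w, x):
    # Tiny 2-2-1 perceptron; weights unpacked from the flat state vector w.
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2  # scalar output

def jacobian(w, x, eps=1e-6):
    # Central-difference derivative of the network output w.r.t. each weight.
    J = np.zeros_like(w)
    for i in range(len(w)):
        wp = w.copy(); wp[i] += eps
        wm = w.copy(); wm[i] -= eps
        J[i] = (mlp(wp, x) - mlp(wm, x)) / (2 * eps)
    return J

def ekf_step(w, P, x, y, R=0.1, Q=1e-6):
    # One sequential EKF update from a single training pattern (x, y).
    H = jacobian(w, x)                  # measurement Jacobian (1 x n, stored flat)
    S = H @ P @ H + R                   # innovation variance (scalar output case)
    K = P @ H / S                       # Kalman gain
    w = w + K * (y - mlp(w, x))         # weight (state) update
    P = P - np.outer(K, H @ P) + Q * np.eye(len(w))  # covariance update
    return w, P

# Toy pattern-classification problem (XOR), chosen for illustration only.
rng = np.random.default_rng(0)
w = 0.5 * rng.standard_normal(9)
P = np.eye(9)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([0, 1, 1, 0], float)

mse0 = np.mean([(mlp(w, x) - y) ** 2 for x, y in zip(X, Y)])
for epoch in range(200):
    for x, y in zip(X, Y):              # sequential: one pattern per update
        w, P = ekf_step(w, P, x, y)
mse = np.mean([(mlp(w, x) - y) ** 2 for x, y in zip(X, Y)])
```

The sequential structure is visible in the inner loop: rather than accumulating a batch gradient, each pattern immediately updates both the weights and the covariance matrix P, which is what gives EKF training its fast convergence at the cost of the O(n²) storage for P that the decoupled variants reduce.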

Research Organization:
Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
Sponsoring Organization:
USDOE, Washington, DC (United States)
DOE Contract Number:
W-7405-ENG-36
Report Number(s):
LA-UR-95-423; CONF-950789-1; ON: DE95007874
Resource Relation:
Conference: 1995 world congress on neural networks, Washington, DC (United States), Jul 1995; Other Information: PBD: [1995]
Country of Publication:
United States
Language:
English