
Learning methods for ill-posed problems. Applications to γ-spectrometry; Méthodes d'apprentissage statistiques et problèmes inverses. Applications à la spectrographie

Abstract

Until recently, feature extraction has mostly been considered a supervised process in which (linear) filters map the original measurements into more effective features so as to minimize a criterion, assuming that the variables are already selected and given. Furthermore, data are often scarce and/or expensive, and sometimes not even representative of the true distribution. From an experimental device, the physicist obtains measurements corrupted by noise and by deterministic distortions. The 'problem' is then to seek 'good' values for a 'number' of 'interesting' parameters, but neither 'good', nor the 'number', nor 'interesting' are clearly defined notions. Frequently, the physicist is unable to write down the mathematical equations of the observed phenomenon. He hopes that the usual recipes (Fourier transform, deconvolution, least squares, ...) will produce shining revelations. Of course, these recipes are well known and their respectability well established, sometimes with the name of a mathematician as a quality label. In pattern recognition, the input items have to be identified under various transformations of their representations. Contemporary neural-network research concentrates mostly on decision-making systems, whereas the fundamental functions associated with the preprocessing of observations have often been ignored. This work is a step toward theories that are expected to help the emergence of invariant features. In this context, the learning-theory approach (through advanced tools such as principal component analysis (PCA), canonical correlation analysis (CCA), or factorial cumulants) offers great potential for achieving optimal solutions to complex real-world problems, because it deals with the ill-defined knowledge the physicist has in mind before carrying out the experiment: non-linear correlations, hidden dependencies, and so on. These questions are complex and very problem-dependent, but we focus on a specific one: ill-conditioned problems, i.e. when the physicist does not have a sufficient amount of experimental data. To illustrate our approach, we propose a wide range of examples, from real to artificial, in fields of non-destructive analysis ranging from X-ray fluorescence to neutronography.
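The ill-conditioned case singled out in the abstract (a nearly singular response and too few noisy measurements) is the classical setting for regularized least squares. The sketch below is not taken from the thesis; it is a minimal Python illustration, with an entirely hypothetical response matrix R and intensity vector x_true, of how Tikhonov (ridge) regularization stabilizes an estimate that ordinary least squares cannot recover reliably.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear model y = R @ x + noise: R plays the role of a detector
# response matrix whose columns are almost collinear, so R is ill-conditioned.
n_channels, n_sources = 200, 5
base = rng.normal(size=(n_channels, 1))
R = base + 1e-3 * rng.normal(size=(n_channels, n_sources))

x_true = rng.uniform(1.0, 10.0, size=n_sources)        # unknown source intensities
y = R @ x_true + 0.05 * rng.normal(size=n_channels)    # noisy measurement

# Ordinary least squares: formally solvable, but the huge condition number
# of R lets the noise dominate the estimate.
x_ols, *_ = np.linalg.lstsq(R, y, rcond=None)

# Tikhonov (ridge) regularization: solve (R^T R + lam * I) x = R^T y,
# trading a small bias for a large reduction in variance.
lam = 1e-2
x_ridge = np.linalg.solve(R.T @ R + lam * np.eye(n_sources), R.T @ y)

print("cond(R)            :", np.linalg.cond(R))
print("true intensities   :", x_true)
print("least-squares guess:", x_ols)
print("regularized guess  :", x_ridge)

The regularization weight lam is fixed by hand here purely for illustration; choosing it (or a richer model) from the data is the kind of statistical learning question the abstract raises.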
Authors:
Vigneron, V [1] 
  1. CEA Saclay, 91 - Gif-sur-Yvette (France). Dépt. de Mécanique et de Technologie
Publication Date:
May 01, 1997
Product Type:
Thesis/Dissertation
Report Number:
CEA-R-5778
Reference Number:
SCA: 440103; PA: AIX-30:003763; EDB-99:026017; SN: 99002059692
Resource Relation:
Other Information: DN: 303 refs.; TH: Thèse (D. ès Sc.); PBD: May 1997
Subject:
44 INSTRUMENTATION, INCLUDING NUCLEAR AND PARTICLE DETECTORS; ARSENIC; ARTIFICIAL INTELLIGENCE; COMPUTER ARCHITECTURE; COMPUTERIZED SIMULATION; GAMMA SPECTROSCOPY; ISOTOPE SEPARATION; MONTE CARLO METHOD; NEURAL NETWORKS; PLUTONIUM; STATISTICAL MODELS; THORIUM; URANIUM; X-RAY FLUORESCENCE ANALYSIS
OSTI ID:
308941
Research Organizations:
CEA Centre d'Études de Saclay, 91 - Gif-sur-Yvette (France). Dépt. de Mécanique et de Technologie
Country of Origin:
France
Language:
French
Other Identifying Numbers:
Other: ON: DE99612405; TRN: FR9803950003763
Availability:
INIS; OSTI as DE99612405
Submitting Site:
FRN
Size:
263 p.
Announcement Date:
Mar 03, 1999

Citation Formats

Vigneron, V. Learning methods for ill-posed problems. Applications to γ-spectrometry; Méthodes d'apprentissage statistiques et problèmes inverses. Applications à la spectrographie. France: N. p., 1997. Web.
Vigneron, V. Learning methods for ill-posed problems. Applications to γ-spectrometry; Méthodes d'apprentissage statistiques et problèmes inverses. Applications à la spectrographie. France.
Vigneron, V. 1997. "Learning methods for ill-posed problems. Applications to γ-spectrometry; Méthodes d'apprentissage statistiques et problèmes inverses. Applications à la spectrographie." France.
@misc{etde_308941,
  title = {Learning methods for ill-posed problems. Applications to $\gamma$-spectrometry; Méthodes d'apprentissage statistiques et problèmes inverses. Applications à la spectrographie},
  author = {Vigneron, V},
  abstractNote = {Until recently, feature extraction has mostly been considered a supervised process in which (linear) filters map the original measurements into more effective features so as to minimize a criterion, assuming that the variables are already selected and given. Furthermore, data are often scarce and/or expensive, and sometimes not even representative of the true distribution. From an experimental device, the physicist obtains measurements corrupted by noise and by deterministic distortions. The 'problem' is then to seek 'good' values for a 'number' of 'interesting' parameters, but neither 'good', nor the 'number', nor 'interesting' are clearly defined notions. Frequently, the physicist is unable to write down the mathematical equations of the observed phenomenon. He hopes that the usual recipes (Fourier transform, deconvolution, least squares, ...) will produce shining revelations. Of course, these recipes are well known and their respectability well established, sometimes with the name of a mathematician as a quality label. In pattern recognition, the input items have to be identified under various transformations of their representations. Contemporary neural-network research concentrates mostly on decision-making systems, whereas the fundamental functions associated with the preprocessing of observations have often been ignored. This work is a step toward theories that are expected to help the emergence of invariant features. In this context, the learning-theory approach (through advanced tools such as principal component analysis (PCA), canonical correlation analysis (CCA), or factorial cumulants) offers great potential for achieving optimal solutions to complex real-world problems, because it deals with the ill-defined knowledge the physicist has in mind before carrying out the experiment: non-linear correlations, hidden dependencies, and so on. These questions are complex and very problem-dependent, but we focus on a specific one: ill-conditioned problems, i.e. when the physicist does not have a sufficient amount of experimental data. To illustrate our approach, we propose a wide range of examples, from real to artificial, in fields of non-destructive analysis ranging from X-ray fluorescence to neutronography. (author) 303 refs.},
  place = {France},
  year = {1997},
  month = {May}
}