A Bayesian approach to extracting meaning from system behavior
The modeling relation, and its reformulation to include the semiotic hierarchy, is essential for the understanding, control, and successful re-creation of natural systems. This presentation will argue for a careful application of Rosen's modeling relation to the problems of intelligence and autonomy in natural and artificial systems. To this end, the authors discuss the essential need for a correct theory of induction, learning, and probability, and suggest that modern Bayesian probability theory, developed by Cox, Jaynes, and others, can adequately meet such demands, especially at the operational level of extracting meaning from observations. The methods of Bayesian and maximum-entropy parameter estimation have been applied to measurements of system observables to directly infer the underlying differential equations generating system behavior. This approach bypasses the usual method of parameter estimation, which assumes a functional form for the observable and then estimates the parameters that would produce the particular observed behavior. The computational savings are substantial, since only location parameters enter the maximum-entropy calculations; this innovation finesses the need for nonlinear parameters altogether. Such an approach extracts the semantics inherent in a given system more directly, by going to the root of system meaning as expressed in abstract form or shape rather than in syntactic particulars such as signal amplitude and phase. Examples will show how the form of a system can be followed while unnecessary details are ignored; in this sense, the authors observe the meaning of the words rather than concern themselves with their particular expression or language. For the present discussion, empirical models are embodied by the differential equations underlying, producing, or describing the behavior of a process as measured or tracked by a particular variable set: the observables.
The a priori models are probability structures that capture syntactic relationships within the formal system mirroring the natural system under observation. Inductive learning is then a prescription for incorporating the current, and possibly changing, empirical model into an iterative syntactic relationship, in the form of a statement about parameters, producing an updated a priori model that remains subject to future modification in light of changing parameter sets.
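The updating scheme described above can be illustrated with a minimal sketch, not drawn from the paper itself: a hypothetical toy system governed by dx/dt = -k x is observed with noise, a uniform prior over the rate constant k is updated to a posterior via a Gaussian likelihood, and that posterior then serves as the prior for the next batch of observations. The grid-based posterior, the noise level, and all parameter values here are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

# Hypothetical toy system: dx/dt = -k*x with x(0) = 1, so x(t) = exp(-k*t).
# We infer the rate constant k directly from noisy observations.
rng = np.random.default_rng(0)
k_true = 0.7
t = np.linspace(0.0, 5.0, 50)
x_clean = np.exp(-k_true * t)
sigma = 0.05
x_obs = x_clean + rng.normal(0.0, sigma, size=t.shape)

# A priori model: uniform prior over a grid of candidate rate constants.
k_grid = np.linspace(0.01, 2.0, 400)
log_prior = np.zeros_like(k_grid)

# Gaussian log-likelihood of the observations under each candidate k.
pred = np.exp(-np.outer(k_grid, t))                      # shape (400, 50)
log_like = -0.5 * np.sum((x_obs - pred) ** 2, axis=1) / sigma**2

# Posterior over k (normalized on the grid, with a shift for stability).
log_post = log_prior + log_like
posterior = np.exp(log_post - log_post.max())
posterior /= posterior.sum()
k_map = k_grid[np.argmax(posterior)]

# Inductive update: the current posterior becomes the prior for new data,
# mirroring the iterative revision of the a priori model described above.
x_obs2 = x_clean + rng.normal(0.0, sigma, size=t.shape)
log_like2 = -0.5 * np.sum((x_obs2 - pred) ** 2, axis=1) / sigma**2
log_post2 = np.log(posterior + 1e-300) + log_like2
posterior2 = np.exp(log_post2 - log_post2.max())
posterior2 /= posterior2.sum()
k_map2 = k_grid[np.argmax(posterior2)]

print(f"MAP estimate after batch 1: {k_map:.3f} (true k = {k_true})")
print(f"MAP estimate after batch 2: {k_map2:.3f}")
```

The estimate concerns only the parameter of the generating differential equation, the "form" of the system, while the pointwise signal values are absorbed into the likelihood and never fitted directly.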
- Research Organization:
- Oak Ridge National Lab., Instrumentation and Controls Div., TN (United States)
- Sponsoring Organization:
- USDOE Office of Energy Research, Washington, DC (United States)
- DOE Contract Number:
- AC05-96OR22464
- OSTI ID:
- 666161
- Report Number(s):
- ORNL/CP-98644; CONF-981019-; ON: DE98007185; BR: 260900000; TRN: AHC29819%%317
- Resource Relation:
- Conference: 1998 IEEE international conference on systems, man and cybernetics, La Jolla, CA (United States), 14 Oct 1998; Other Information: PBD: [1998]
- Country of Publication:
- United States
- Language:
- English