U.S. Department of Energy
Office of Scientific and Technical Information

Rethinking the learning of belief network probabilities

Conference
OSTI ID: 421266
[1] Lawrence Livermore National Lab., CA (United States)

Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem: learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium-sized car insurance belief network. The results demonstrate improvements of 10% to 100% in model error rate over the current approaches.
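To make the distinction concrete, here is a minimal sketch (not from the paper; variable names, data, and the smoothing choice are illustrative assumptions) contrasting "rote" conditional probability learning, i.e. raw maximum-likelihood counts for a node given its parents, with one simple alternative learning algorithm, Laplace-smoothed estimation, which stands in for the richer learners the abstract mentions:

```python
# Hypothetical sketch: filling in a belief network's conditional probability
# table (CPT) for a single edge Rain -> WetGrass, given the qualitative
# structure. Toy data; not the paper's car insurance network.
from collections import Counter

# Observations as (rain, wet_grass) pairs.
data = [(1, 1), (1, 1), (1, 0), (0, 0), (0, 0), (0, 1), (1, 1), (0, 0)]

def rote_cpt(data):
    """'Rote' learning: raw maximum-likelihood counts P(wet | rain)."""
    joint = Counter(data)                     # counts of (rain, wet) pairs
    parent = Counter(r for r, _ in data)      # counts of each rain value
    return {(r, w): joint[(r, w)] / parent[r]
            for r in (0, 1) for w in (0, 1)}

def smoothed_cpt(data, alpha=1.0):
    """A stand-in 'learning algorithm': Laplace-smoothed estimates, which
    behave better than raw counts when some parent configurations are rare."""
    joint = Counter(data)
    parent = Counter(r for r, _ in data)
    return {(r, w): (joint[(r, w)] + alpha) / (parent[r] + 2 * alpha)
            for r in (0, 1) for w in (0, 1)}

print(rote_cpt(data)[(1, 1)])      # P(wet=1 | rain=1) = 3/4 = 0.75
print(smoothed_cpt(data)[(1, 1)])  # pulled toward 0.5: 4/6 ≈ 0.667
```

The paper's point is that the `rote_cpt` step can be replaced by any learning algorithm (neural networks, among others); the smoothed variant here merely illustrates that even a small change to the estimator alters the resulting CPT.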

Report Number(s):
CONF-960830--; CNN: Grant CDA-8722788; Grant IRI-9058427
Country of Publication:
United States
Language:
English

Similar Records

Rethinking the learning of belief network probabilities
Conference · 1996 · OSTI ID:251271

Quantum Graphical Models and Belief Propagation
Journal Article · 2008 · Annals of Physics (New York) · OSTI ID:21163722

Order priors for Bayesian network discovery with an application to malware phylogeny
Journal Article · 2017 · Statistical Analysis and Data Mining · OSTI ID:1398911