0.1 Likelihood Ratios, Decision Rules, and Anti-Neurons (Notes for CNS 102, prepared by Jonathan Harel, Feb. 18, 2009)
 

Suppose we have two hypotheses h_1 and h_2 and only one observation of one
event e_1. The best decision strategy is to pick the hypothesis that is
more probable given the observation. We will express this in formal symbols
below.
Define:
p(h_i is true given observation e) ≜ p(h_i is true | e) = p(h_i | e)
p(observing e given that hypothesis h_i is true) ≜ p(e | h_i is true) = p(e | h_i)
Then, as we know from the Bayesian rules of probability:
p(h_1 | e) = p(e | h_1) p(h_1) / p(e)
p(h_2 | e) = p(e | h_2) p(h_2) / p(e)
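The two posterior formulas above can be sketched numerically. The priors and likelihoods below are made-up illustrative numbers (not from the notes); p(e) is computed by summing p(e | h_i) p(h_i) over both hypotheses, and the decision rule picks whichever hypothesis has the larger posterior:

```python
# Illustrative (assumed) numbers: equal priors, e is more likely under h1.
priors = {"h1": 0.5, "h2": 0.5}        # p(h_i): prior probability of each hypothesis
likelihoods = {"h1": 0.8, "h2": 0.3}   # p(e | h_i): probability of the observation under h_i

# p(e): probability of observing e under either hypothesis,
# p(e) = p(e | h1) p(h1) + p(e | h2) p(h2)
p_e = sum(likelihoods[h] * priors[h] for h in priors)

# Bayes' rule: p(h_i | e) = p(e | h_i) p(h_i) / p(e)
posteriors = {h: likelihoods[h] * priors[h] / p_e for h in priors}

# Decision rule from the text: pick the hypothesis more probable given e
best = max(posteriors, key=posteriors.get)
print(best, posteriors)
```

Note that p(e) is the same in both posteriors, so the comparison reduces to comparing p(e | h_1) p(h_1) against p(e | h_2) p(h_2); with equal priors, this is just a comparison of the likelihoods.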
where p(h_i) is the prior (before evidence) probability of h_i being true, and p(e)
is the probability with which you would observe e under either hypothesis. You
can derive this arithmetic, namely p(a, b) = p(a | b) p(b) = p(b | a) p(a), from a
frequentist interpretation, in which what we are describing using these p( )s are
Source: Adolphs, Ralph - Psychology and Neuroscience, California Institute of Technology
Collections: Biology and Medicine