 
Summary: Efficient Convergence Implies Ockham's Razor
Kevin T. Kelly
Department of Philosophy
Carnegie Mellon University
kk3n@andrew.cmu.edu
March 6, 2005
Abstract
A finite data set is consistent with infinitely many alternative theories. Scientific realists
recommend that we prefer the simplest one. Antirealists ask how a fixed simplicity bias
could track the truth when the truth might be complex. It is no solution to impose a prior
probability distribution biased toward simplicity, for such a distribution merely embodies
the bias at issue without explaining its efficacy. In this note, I argue, on the basis of
computational learning theory, that a fixed simplicity bias is necessary if inquiry is to
converge to the right answer efficiently, whatever the right answer might be. Efficiency is
understood in the sense of minimizing the least fixed (i.e., worst-case) bound on retractions
or errors prior to convergence.
Keywords: learning, induction, simplicity, Ockham's razor, realism, skepticism
1 Introduction
There are infinitely many alternative theories compatible with any finite amount of
experience, so choosing the right one on the basis of the evidence alone seems hopeless.
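To make the retraction measure mentioned in the abstract concrete, consider a hypothetical toy setup (an illustration, not an example from this paper): data arrive as a 0/1 stream in which 1 marks a newly observed empirical effect, the question is how many effects there are in total, and a learner's cost is the number of times it changes its conjecture before stabilizing. The `ockham` and `violator` learners below are invented for illustration; the violator pads its guess with an extra, as-yet-unseen effect until a quiet stretch passes.

```python
def retractions(learner, stream):
    """Count how many times the learner changes its conjecture on the stream."""
    prev, count = None, 0
    for t in range(len(stream)):
        guess = learner(stream[:t + 1])
        if prev is not None and guess != prev:
            count += 1
        prev = guess
    return count

def ockham(data):
    # Simplicity-biased learner: conjecture exactly the effects seen so far.
    return sum(data)

def violator(data):
    # Non-Ockham learner (hypothetical): posit one extra, unseen effect
    # until three consecutive quiet (effect-free) stages have passed.
    if any(data):
        last = max(i for i, x in enumerate(data) if x)
        quiet = len(data) - (last + 1)
    else:
        quiet = len(data)
    return sum(data) + (1 if quiet < 3 else 0)

# Effects appear at stages 0 and 5; nothing further is ever observed.
stream = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0]
print(retractions(ockham, stream))    # the Ockham learner retracts once
print(retractions(violator, stream))  # the violator retracts three times
```

On this stream the simplicity-biased learner retracts only when nature forces it to (once, when the second effect appears), while the violator retracts both when its posited extra effect fails to materialize and again when a genuine effect arrives, illustrating the sense in which violating Ockham's razor inflates the worst-case retraction bound.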
