Bayes rules all: On the equivalence of various forms of learning in a probabilistic setting

Abstract

Jeffrey conditioning is said to provide a more general method of assimilating uncertain evidence than Bayesian conditioning. We show that Jeffrey learning is merely a particular type of Bayesian learning if we accept either of the following two observations:

– Learning comprises both probability kinematics and proposition kinematics.
– What can be updated is not the same as what can do the updating; the set of the latter is richer than the set of the former.

We address the problem of commutativity and isolate commutativity from invariance under conditioning on conjunctions. We also present a disjunctive model of Bayesian learning, which suggests that Jeffrey conditioning is better understood as a method for incorporating unspecified but certain evidence than as a method for incorporating specific but uncertain evidence. The results also generalize to many other subjective probability update rules, such as those proposed by Field and Gallow.
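To make the contrast concrete, here is a minimal sketch (not drawn from the paper; the partition, worlds, and numbers are all illustrative assumptions) of Jeffrey's rule on a finite probability space. Jeffrey conditioning redistributes probability so that a partition {E, ~E} receives new weights q and 1−q, while Bayesian conditioning on E is recovered as the special case q = 1.

```python
# Illustrative sketch of Jeffrey conditioning on a two-cell partition.
# All probabilities below are made-up example values, not the paper's.

def conditional(joint, event):
    """P(. | event): renormalize the joint distribution within `event`."""
    total = sum(p for w, p in joint.items() if w in event)
    return {w: (p / total if w in event else 0.0) for w, p in joint.items()}

def jeffrey_update(joint, partition_weights):
    """Jeffrey's rule: P_new(w) = sum_i q_i * P(w | E_i),
    where q_i is the agent's new credence in partition cell E_i."""
    new = {w: 0.0 for w in joint}
    for event, q in partition_weights:
        if q == 0.0:
            continue
        cond = conditional(joint, event)
        for w in new:
            new[w] += q * cond[w]
    return new

# Worlds are (hypothesis, observation) pairs.
joint = {("H", "E"): 0.3, ("H", "~E"): 0.2,
         ("~H", "E"): 0.1, ("~H", "~E"): 0.4}

E  = {("H", "E"), ("~H", "E")}
nE = {("H", "~E"), ("~H", "~E")}

# Jeffrey update: the agent becomes 80% (not 100%) confident in E.
post = jeffrey_update(joint, [(E, 0.8), (nE, 0.2)])

# Classical Bayesian conditioning on E is the limiting case q_E = 1.
bayes = jeffrey_update(joint, [(E, 1.0), (nE, 0.0)])
```

In this toy example the Jeffrey posterior for world ("H", "E") is 0.8 × P(("H", "E") | E) = 0.8 × 0.75 = 0.6, while full conditioning on E gives 0.75; setting q = 1 collapses the rule to ordinary Bayesian conditioning, which is the equivalence direction the abstract exploits.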


