F-Information, a Unitless Variant of Fisher Information

Foundations of Physics 29 (10):1521-1541 (1999)

Abstract

A new information matrix $[F]$ with elements $F_{mn} = \langle (y_m - a_m)(y_n - a_n)\,(\partial \ln p(y\,|\,a)/\partial a_m)(\partial \ln p(y\,|\,a)/\partial a_n) \rangle$ is analyzed. The PDF $p(y\,|\,a)$ is the usual likelihood law. $[F]$ differs from the Fisher information matrix by the presence of the first two factors in the given expectation. These factors make $F_{mn}$ unitless, in contrast with the Fisher information. This lack of units allows $F_{mn}$ values from entirely different phenomena to be compared, just as Shannon information values can be. Each element $F_{mn}$ defines an error inequality analogous to the Cramér-Rao inequality. In the scalar case $F_{mn} \equiv F$; for a normal $p(y\,|\,a)$ law $F = 3$, while for an exponential law $F = 9$. A variational principle $F = \min$ (called FMIN) allows an unknown PDF $p(x)$ to be estimated in the presence of weak information. Under certain conditions $F$ obeys a "Boltzmann F-theorem" $\partial F/\partial t \le 0$, indicating that $F$ is a physical entropy. Finally, the trace $\mathcal{F}$ of $[F]$ may be used as the scalar information quantity in an information-based principle for deriving the distribution laws $p$ of physics.
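In the scalar case the definition reduces to $F = \langle (y - a)^2\,(\partial \ln p(y\,|\,a)/\partial a)^2 \rangle$. The following is a minimal Monte Carlo sketch of that scalar quantity, assuming $a$ is the mean of $y$ and using illustrative parameter values (not taken from the paper); it reproduces the quoted results $F = 3$ for a normal law and $F = 9$ for an exponential law.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000  # Monte Carlo sample size

# Normal law p(y|a) with mean a and standard deviation sigma:
# d ln p / da = (y - a) / sigma**2, so
# F = <(y - a)**2 * ((y - a) / sigma**2)**2> = E[(y - a)**4] / sigma**4 = 3.
a, sigma = 2.0, 0.7          # illustrative values; F is unitless, so the result does not depend on them
y = rng.normal(a, sigma, N)
score = (y - a) / sigma**2   # d ln p / da
F_normal = np.mean((y - a)**2 * score**2)

# Exponential law p(y|a) = (1/a) exp(-y/a) with mean a:
# d ln p / da = (y - a) / a**2, so F = E[(y - a)**4] / a**4 = 9.
a = 1.5
y = rng.exponential(a, N)
score = (y - a) / a**2       # d ln p / da
F_exp = np.mean((y - a)**2 * score**2)

print(f"F (normal):      {F_normal:.2f}")   # approximately 3
print(f"F (exponential): {F_exp:.2f}")      # approximately 9
```

Because the extra factors $(y - a)^2$ cancel the units carried by the score, the two estimates above are directly comparable even though the normal and exponential examples use different, arbitrarily chosen scales.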
