You might look at the Fisher information number. This plays a role in the
Cramér-Rao lower bound, which says that the variance of an unbiased
estimator is bounded below by the reciprocal of the Fisher information
number. You can find these topics discussed in many mathematical
statistics texts, for example, Bickel and Doksum (Holden-Day 1977).
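To make the bound concrete, here is a small numerical sketch (my own illustration, not from Bickel and Doksum; the function name is mine) using the Bernoulli case, where the Fisher information has the closed form I(p) = 1/(p(1-p)) and the sample mean actually attains the Cramér-Rao bound:

```python
import numpy as np

# Fisher information for one Bernoulli(p) observation: I(p) = 1 / (p(1-p)).
def fisher_information_bernoulli(p):
    return 1.0 / (p * (1.0 - p))

# Cramér-Rao: any unbiased estimator of p from n i.i.d. draws has
# variance >= 1 / (n * I(p)) = p(1-p)/n.  The sample mean is unbiased
# and attains this bound, so its empirical variance should match it.
rng = np.random.default_rng(0)
p, n, trials = 0.3, 50, 20000
estimates = rng.binomial(n, p, size=trials) / n   # sample means
bound = 1.0 / (n * fisher_information_bernoulli(p))
print(round(bound, 5))            # p(1-p)/n = 0.0042
print(round(estimates.var(), 5))  # empirical variance, close to the bound
```

For estimators that are not efficient, the empirical variance would sit strictly above the bound rather than on it.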
The other obvious answer is entropy, and its relative, the cross-entropy
between two random variables. The introductory probability text by Ross
(Prentice-Hall 1998) has a nice discussion of entropy and Shannon
information. My favorite discussion of entropy and cross-entropy is in
Bernardo and Smith, *Bayesian Theory* (Wiley 1994). However, it turns out
these notions are value-related. As Bernardo and Smith point out, if the
decision problem is to guess the distribution P(x) of a subsequently
observed X, and if your guess P* for P is valued according to the utility
function U(x) = ln P*(x), then the information value of an uncertain
quantity Y is equal to the cross-entropy between Y and X.
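A quick sketch of the decision-theoretic link (my own illustration of the log-utility connection, not code from Bernardo and Smith; function names are mine): under U(x) = ln P*(x), the expected utility of guessing P* when X ~ P is exactly minus the cross-entropy, so maximizing expected utility means minimizing cross-entropy, and the best guess is P* = P:

```python
import numpy as np

# Entropy H(P) = -sum_x P(x) ln P(x); cross-entropy H(P, P*) = -sum_x P(x) ln P*(x).
# Expected log utility E[ln P*(X)] under X ~ P is -H(P, P*), which is
# maximized (cross-entropy minimized) by the honest guess P* = P.
def entropy(p):
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

p = [0.5, 0.25, 0.25]
print(round(entropy(p), 4))                  # H(P) = 1.0397
print(round(cross_entropy(p, p), 4))         # equals H(P): honest guess is optimal
print(round(cross_entropy(p, [1/3] * 3), 4)) # any other guess scores worse (larger)
```

The gap H(P, P*) - H(P) is the Kullback-Leibler divergence, which is nonnegative and zero only at P* = P.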
Gordon Hazen
Department of Industrial Engineering and Management Sciences
McCormick School of Engineering and Applied Science
Northwestern University
Evanston IL 60208-3119
Fax 847-491-8005
Phone 847-491-5673
www.iems.nwu.edu/~hazen/