Re: "Value of information without utilities"

Gordon Hazen (hazen@iems.nwu.edu)
Thu, 20 May 1999 11:45:48 -0500

At 12:39 AM 5/20/99 -0800, you wrote:
>Colleagues,
>
>I'm looking for pointers to papers that deal with the issue of value of
>information that is based purely on probabilities (rather than the
>decision-theoretic notion of the difference in expected utilities). I will
>be glad to summarize the responses if there is sufficient interest.
>Greetings,
>
>Marek
>--------------------------------------------------------------------------
>Marek J. Druzdzel http://www.pitt.edu/~druzdzel
>

You might look at the Fisher information number. It plays a role in the
Cramér-Rao lower bound, which says that the variance of any unbiased
estimator is bounded below by the reciprocal of the Fisher information
number. You can find these topics discussed in many mathematical
statistics texts, for example, Bickel and Doksum (Holden-Day 1977).
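As a quick numerical illustration (a sketch of my own, not from Bickel and
Doksum), take n i.i.d. Bernoulli(p) observations: the Fisher information per
observation is 1/(p(1-p)), so the Cramér-Rao bound for an unbiased estimator
of p is p(1-p)/n, and the sample mean attains it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fisher information for one Bernoulli(p) observation: I(p) = 1 / (p (1 - p)).
p, n = 0.3, 1000
fisher_per_obs = 1.0 / (p * (1.0 - p))

# Cramer-Rao lower bound on the variance of an unbiased estimator of p
# from n i.i.d. observations: Var(p_hat) >= 1 / (n * I(p)) = p (1 - p) / n.
cr_bound = 1.0 / (n * fisher_per_obs)

# The sample proportion is unbiased and attains the bound; check by simulation.
estimates = rng.binomial(n, p, size=20000) / n
print(cr_bound)          # p (1 - p) / n, i.e. about 0.00021
print(estimates.var())   # simulated variance, close to the bound
```

The simulated variance of the sample proportion matches the bound to within
sampling error, which is the sense in which Fisher information measures how
much each observation tells you about p.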

The other obvious answer is entropy, and its relative, the cross-entropy
between two random variables. The introductory probability text by Ross
(Prentice-Hall 1998) has a nice discussion of entropy and Shannon
information. My favorite discussion of entropy and cross-entropy is in
Bernardo and Smith, *Bayesian Theory* (Wiley 1994). However, it turns out
these are value-related. As Bernardo and Smith point out, if the decision
problem is to guess the distribution P(x) of a subsequently observed X, and
if your guess P* for P is valued according to the utility function U(x) =
ln P*(x), then the information value of an uncertain quantity Y is equal to
the cross-entropy between Y and X.

Gordon Hazen
Department of Industrial Engineering and Management Sciences
McCormick School of Engineering and Applied Science
Northwestern University
Evanston IL 60208-3119

Fax 847-491-8005
Phone 847-491-5673
www.iems.nwu.edu/~hazen/