Re: "information gained" sometimes != "entropy reduction" ??

Peter Szolovits (psz@mit.edu)
Wed, 12 Aug 1998 10:01:51 -0400

My take on your question is that there is an expected information gain
before you introduce the new evidence, which is relevant to deciding
whether to ask some question or do some test. If you contemplate doing a
test, its expected information gain is the difference between the entropy
of some distribution of interest before testing and its expected entropy after
testing. The latter is the sum of the entropies that would result from each
possible test result, weighted by the probability of that result. Tony
Gorry did this in his MIT PhD in 1967, with an idiot Bayes model. For a
complex network, it's not quite clear (to me) what distribution(s) you
should be computing entropy over. In any case, although the expected effect of
adding new evidence is a decrease in entropy, any particular result is not
guaranteed to turn out that way; for example, you may get "contradictory"
evidence.
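
For concreteness, here is a minimal sketch of that calculation in Python (mine,
not from the original exchange; the variable names and the toy numbers are made
up purely for illustration). It takes a prior over hypotheses and a table of
result likelihoods, and returns the prior entropy minus the probability-weighted
average of the posterior entropies, i.e. H(prior) - sum_r P(r) H(posterior | r):

    import math

    def entropy(dist):
        # Shannon entropy (in bits) of a discrete distribution {value: probability}
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    def expected_information_gain(prior, likelihood):
        # prior:      {hypothesis: P(hypothesis)}
        # likelihood: {result: {hypothesis: P(result | hypothesis)}}
        h_before = entropy(prior)
        expected_h_after = 0.0
        for lik in likelihood.values():
            p_result = sum(prior[h] * lik[h] for h in prior)  # predictive P(result)
            if p_result == 0:
                continue
            posterior = {h: prior[h] * lik[h] / p_result for h in prior}
            expected_h_after += p_result * entropy(posterior)
        return h_before - expected_h_after

    # Toy example: a test with 90% sensitivity and 80% specificity.
    prior = {"disease": 0.3, "healthy": 0.7}
    likelihood = {"positive": {"disease": 0.9, "healthy": 0.2},
                  "negative": {"disease": 0.1, "healthy": 0.8}}
    print(expected_information_gain(prior, likelihood))  # roughly 0.33 bits

(This expected gain is the mutual information between the hypothesis and the
test result, so it is never negative, even though an individual result can
raise the entropy -- which is the point about "contradictory" evidence above.)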

At 05:51 PM 8/11/98 -0600, Robert Dodier wrote:
>Esteemed colleagues,
>
>I have a brief question concerning terminology, this time
>about "information."