Re: "information gained" sometimes != "entropy reduction" ??

Bruce D'Ambrosio (dambrosi@CS.ORST.EDU)
Mon, 24 Aug 1998 20:39:50 -0700

> I thought this was true also but consider the following simple
> counterexample:
>
> p(y=0,x=0)=0 p(y=1,x=0)=.98
> p(y=0,x=1)=.01 p(y=1,x=1)=.01
>
> Entropy = -.98 * lg .98 - .02 * lg .01 = .16
>
> When we learn that x=1 this becomes:
>
> p(y=0,x=0)=0 p(y=1,x=0)=0
> p(y=0,x=1)=.5 p(y=1,x=1)=.5
>
> with entropy = -lg .5 = 1
>
> So the entropy goes up after we gain information about the state! Most
> disturbing!
>
> Stephen Omohundro

Very interesting.
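
(For anyone who wants to check the arithmetic, here is a minimal Python sketch, my own and not part of Stephen's post, that reproduces the two entropy values in the quoted message:

    import math

    def entropy(probs):
        # Shannon entropy in bits; zero-probability entries contribute 0
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # prior joint: p(y=0,x=0)=0, p(y=1,x=0)=.98, p(y=0,x=1)=.01, p(y=1,x=1)=.01
    print(entropy([0.0, 0.98, 0.01, 0.01]))   # ~0.16 bits

    # after observing x=1 and renormalizing: p(y=0|x=1)=.5, p(y=1|x=1)=.5
    print(entropy([0.5, 0.5]))                # 1.0 bit -- larger than before
)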

Perhaps it is normalization that is misleading us here? The entropy of
the unnormalized distribution does decrease (unless all the values in
the joint that are set to zero by the observation were already zero, in
which case we have learned nothing):

0 0
.01 .01

- .02 * lg .01 = .1329
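
(The same entropy() helper from the sketch above gives this value directly; again this is my own check, not from the original thread:

    # unnormalized table left after zeroing out the x=0 column
    print(entropy([0.0, 0.0, 0.01, 0.01]))    # ~0.1329 -- below the prior 0.16
)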

I suspect this is a fairly simple question for someone familiar with
information theory - anyone?

tnx - Bruce
dambrosi@cs.orst.edu