Re: "information gained" sometimes != "entropy reduction" ??

Stephen M. Omohundro (om3@worldnet.att.net)
Tue, 25 Aug 1998 01:20:20 +0000

> From: "Bruce D'Ambrosio" <dambrosi@CS.ORST.EDU>
> Date: Tue, 11 Aug 1998 18:34:31 -0700
>
> Similarly, isn't it true that the entropy of the joint over all
> variables in the model has decreased?

I thought this was true as well, but consider the following simple
counterexample:

p(y=0, x=0) = 0       p(y=1, x=0) = .98
p(y=0, x=1) = .01     p(y=1, x=1) = .01

Entropy = -.98 * lg .98 - .01 * lg .01 - .01 * lg .01 ≈ .16

When we learn that x=1, the joint becomes:

p(y=0, x=0) = 0       p(y=1, x=0) = 0
p(y=0, x=1) = .5      p(y=1, x=1) = .5

with entropy = -.5 * lg .5 - .5 * lg .5 = -lg .5 = 1
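
To double-check the arithmetic, here is a minimal Python sketch (the
entropy helper and the joint table are illustrative names, not anyone's
library code):

    import math

    def entropy(probs):
        # Shannon entropy in bits; zero-probability cells contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # The joint distribution p(y, x) from the counterexample above.
    joint = {(0, 0): 0.0, (1, 0): 0.98,
             (0, 1): 0.01, (1, 1): 0.01}

    print(entropy(joint.values()))   # prior joint entropy, ~0.16 bits

    # Learning x = 1 zeroes out the x = 0 column; renormalize what is left.
    px1 = sum(v for (y, x), v in joint.items() if x == 1)
    posterior = [v / px1 for (y, x), v in joint.items() if x == 1]
    print(entropy(posterior))        # posterior entropy, exactly 1 bit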

So the entropy goes up after we gain information about the state! Most
disturbing!
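
The disturbance softens if we average over what we might see: observing
x=0 (probability .98) drives the entropy to 0, so the expected posterior
entropy is .98 * 0 + .02 * 1 = .02 bits, below the .16-bit prior, as the
non-negativity of mutual information guarantees. A sketch of that check
(repeating the setup so it runs on its own):

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    joint = {(0, 0): 0.0, (1, 0): 0.98,
             (0, 1): 0.01, (1, 1): 0.01}

    # Average the posterior entropy over both possible observations of x.
    expected = 0.0
    for xv in (0, 1):
        px = sum(v for (y, x), v in joint.items() if x == xv)
        post = [v / px for (y, x), v in joint.items() if x == xv]
        expected += px * entropy(post)

    print(expected)   # 0.02 bits: entropy falls on average, just not always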

Stephen Omohundro