First, thanks to everyone who contributed to the thread on the
topic of the relation between information gain and entropy change.
I now have a question that is a little more specialized. I am
working with conditional distributions which are mixtures of
conditional Gaussians. I am assuming that the mixing coefficients
can depend on the parent variables. That is, the mixture density is
p(x|u) = \sum_j p(x|j,u) p(j|u)
where u is the parent, x is the child, and j is the mixture index.
This allows the mixture components to vary depending on where you
are in u's range. In belief network algorithms for mixtures of
conditional Gaussians that I know of (Hugin, Driver and Morrell),
mixing coefficients are not allowed to depend on the parent u.
Such a mixture arises naturally as the conditional density of a
joint (x,u) mixture of Gaussians. The mixing coefficient p(j|u)
is the ``responsibility'' of bump j of the marginal of u (itself
a mixture) for the given value of u; p(j|u) has the form
p(j|u) = p(j) p(u|j) / \sum_i p(i) p(u|i)
with p(i) the mixing coefficients of the joint mixture and p(u|i)
the components of the marginal of u.
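For concreteness, here is how I evaluate p(j|u) and p(x|u) in code. This is
just a small sketch with made-up component parameters (two bumps), and it
assumes each conditional p(x|j,u) is linear-Gaussian, N(a_j u + b_j, s_j^2);
nothing here is specific to any particular package.

```python
import math

# Hypothetical parameters of a joint (x,u) Gaussian mixture with 2 bumps.
p_j  = [0.4, 0.6]    # joint mixing coefficients p(j)
mu_u = [0.0, 3.0]    # means of the marginal-of-u components p(u|j)
s_u  = [1.0, 1.0]    # ... and their standard deviations
a    = [0.5, -0.2]   # p(x|j,u) assumed to be N(a[j]*u + b[j], s_x[j]^2)
b    = [0.0, 2.0]
s_x  = [0.8, 1.2]

def normal_pdf(t, mu, s):
    return math.exp(-0.5 * ((t - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def responsibility(j, u):
    """p(j|u) = p(j) p(u|j) / \\sum_i p(i) p(u|i) -- note it depends on u."""
    num = [p_j[i] * normal_pdf(u, mu_u[i], s_u[i]) for i in range(len(p_j))]
    return num[j] / sum(num)

def cond_density(x, u):
    """p(x|u) = \\sum_j p(x|j,u) p(j|u)."""
    return sum(responsibility(j, u) * normal_pdf(x, a[j] * u + b[j], s_x[j])
               for j in range(len(p_j)))
```

The point of the sketch is just that the responsibilities renormalize at
every value of u, which is exactly the u-dependence that the fixed-weight
algorithms do not have.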
I am now trying to compute inferences with such a mixture, for example
p(x|e+) = \int p(x|u) p(u|e+) du
with p(u|e+) a Gaussian or Gaussian mixture. However, I can't see
how to carry out the integration -- the normalizing terms in the
denominator of p(j|u) make life difficult. It's not even clear to
me that the exact result would again be a mixture of Gaussians, as
one would certainly hope.
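For what it's worth, the integral is easy enough to approximate by
brute-force quadrature; here is a self-contained Python sketch (again with
made-up two-bump parameters, p(x|j,u) assumed linear-Gaussian, and p(u|e+)
taken to be a single Gaussian), just to make the quantity I'm after concrete:

```python
import math

def normal_pdf(t, mu, s):
    return math.exp(-0.5 * ((t - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# Hypothetical parameters: joint (x,u) mixture with 2 bumps.
p_j  = [0.4, 0.6]                  # p(j)
mu_u = [0.0, 3.0]                  # marginal-of-u means
s_u  = [1.0, 1.0]                  # marginal-of-u std. deviations
a, b = [0.5, -0.2], [0.0, 2.0]     # p(x|j,u) = N(a[j]*u + b[j], s_x[j]^2)
s_x  = [0.8, 1.2]

def p_x_given_u(x, u):
    """p(x|u) = \\sum_j p(x|j,u) p(j|u), with u-dependent weights."""
    w = [p_j[i] * normal_pdf(u, mu_u[i], s_u[i]) for i in range(2)]
    z = sum(w)
    return sum((w[j] / z) * normal_pdf(x, a[j] * u + b[j], s_x[j])
               for j in range(2))

def p_x_given_e(x, mu_e=1.0, s_e=0.5, lo=-10.0, hi=10.0, n=400):
    """p(x|e+) = \\int p(x|u) p(u|e+) du with p(u|e+) = N(mu_e, s_e^2),
    approximated by the trapezoid rule on [lo, hi]."""
    h = (hi - lo) / n
    total = 0.0
    for k in range(n + 1):
        u = lo + k * h
        weight = 0.5 if k in (0, n) else 1.0
        total += weight * p_x_given_u(x, u) * normal_pdf(u, mu_e, s_e)
    return total * h
```

Of course this only restates the problem numerically; what I am after is a
closed form, or at least an argument about what family p(x|e+) belongs to.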
Although the functions involved are all well-behaved, I would still
rather not carry out the integration numerically. Can someone point
me to papers or books that treat variable mixing coefficients?
I hope I am overlooking something simple. :)
Regards,
Robert Dodier
PS. Even Mathematica (via www.integrals.com) cannot integrate such a
simple example as Integral[Exp[-u^2]/(Exp[-u^2]+Exp[-(u-3)^2])
Exp[-(x-u)^2] Exp[-(u-5)^2]] wrt u, so I don't feel so bad. :)