Dear UAIers:
On Thu, 5 Apr 2001, Fabio Gagliardi Cozman wrote:
> Dear UAIers,
>
> I must thank Russell Almond for adding to my list of
> references; I also must note that the SPI algorithm
> by Li and D'Ambrosio offered an early insight into
> the use of serial elimination (this is referenced in
> the paper I mentioned, together with Shenoy and Shafer's
> scheme):
>
> Z. Li and B. D'Ambrosio, Efficient inference in Bayes networks
> as a combinatorial optimization problem, Int. J. of Approximate
> Reasoning, 11, 1994.
>
> In SPI, as in standard variable elimination, the focus
> is on inference for a single query. In fact, variable
> elimination can be faster for single queries than
> the junction tree approach. And Russell is correct that
> doing elimination is equivalent to building a triangulated
> graph (it is a result in graph theory that an elimination
> ordering produces a triangulated graph; I think this is
> used in Jensen's 1996 book).
> Coupled with the possibility of doing a second pass and
> so getting all marginal probabilities, variable elimination
> becomes a quite powerful scheme.
>
> Best,
>
> Fabio
>
> On Wed, 4 Apr 2001, Fabio Gagliardi Cozman wrote:
> > But it is actually possible to derive a more
> > general form of variable elimination, in which a second pass
> > generates *all* marginal probabilities. The complexity is
> > similar to usual junction tree algorithms, and the result
> > is essentially the same. The bonus is that operations are much
> > easier to understand, without any mention of triangulation
> > or other graph-theoretic techniques. The result is similar
> > to a tree propagation algorithm, but in a tree of clusters.
> >
> > The more general variable elimination is presented at
> >
> > Fabio G. Cozman, Generalizing Variable Elimination in
> > Bayesian Networks, Workshop on Probabilistic Reasoning in
> > Artificial Intelligence, pp. 27-32, Atibaia, Brazil, 2000.
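To make the single-query form of variable elimination that Fabio
describes concrete, here is a toy Python sketch. The factor
representation, the names, and the restriction to binary variables are
all mine, purely for illustration; none of this code is taken from the
papers cited in this thread.

from itertools import product

def restrict_vars(factor_vars, assignment):
    # Pick the values of 'assignment' in the order used by this factor.
    return tuple(assignment[v] for v in factor_vars)

def multiply(f1, f2):
    # Pointwise product of two factors; a factor is (vars, table), with
    # 'table' mapping value tuples (aligned with 'vars') to numbers.
    v1, t1 = f1
    v2, t2 = f2
    vs = tuple(dict.fromkeys(v1 + v2))  # union of variables, order kept
    table = {}
    for vals in product([0, 1], repeat=len(vs)):  # binary variables only
        a = dict(zip(vs, vals))
        table[vals] = t1[restrict_vars(v1, a)] * t2[restrict_vars(v2, a)]
    return vs, table

def sum_out(factor, var):
    # Marginalize 'var' out of a factor.
    vs, t = factor
    i = vs.index(var)
    new_t = {}
    for vals, p in t.items():
        key = vals[:i] + vals[i + 1:]
        new_t[key] = new_t.get(key, 0.0) + p
    return vs[:i] + vs[i + 1:], new_t

def eliminate(factors, order):
    # Single-query variable elimination: for each variable in 'order',
    # multiply together the factors that mention it and sum it out.
    for var in order:
        involved = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        prod = involved[0]
        for f in involved[1:]:
            prod = multiply(prod, f)
        factors = rest + [sum_out(prod, var)]
    return factors  # multiply (and normalize) what is left for the query

# Example: P(A) and P(B|A); the query P(B) is answered by eliminating A.
pA  = (("A",), {(0,): 0.6, (1,): 0.4})
pBA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
(pB,) = eliminate([pA, pBA], ["A"])
print(pB)  # P(B=0) = 0.62, P(B=1) = 0.38, up to floating-point rounding

As Fabio notes, the same machinery with a second pass over the
intermediate factors yields all marginals; the sketch above only does
the single-query case.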
A similar (as far as I can tell from the brief description above)
extension of the variable elimination (or bucket elimination or SPI)
method that computes all marginals was first presented in UAI-98:
Mark Bloemeke and Marco Valtorta, "A Hybrid Algorithm to Compute Marginal
and Joint Beliefs in Bayesian Networks and Its Complexity," UAI-98,
pp. 16-23.
You may get this paper from http://www.cse.sc.edu/~mgv/papers/uai98.ps
There is a brief description of the algorithm in Bruce D'Ambrosio's
tutorial on inference in Bayesian networks (_AI Magazine_, Summer 1999,
pp. 21-36).
I second Russell Almond's recommendation that Umberto Bertele' and
Francesco Brioschi's book be added to the list of key references. (One
may want to add Donald Rose's work on symbolic Gaussian elimination, some
of which is referenced in that book.) The importance of this reference is
also that it shows clearly that bucket elimination (or variable
elimination or SPI) *is* a graphical technique. A triangulation is built
implicitly by operating on what Bertele' and Brioschi called the
interaction graph (see the small sketch below).
The advantage of the Bloemeke-Valtorta scheme over the
junction tree algorithm is that there is no need to precompile the
interaction graph into a junction tree, and one can take full advantage of
the query. Much of this advantage is also achieved, I believe, by using
Jensen and Madsen's lazy propagation scheme, although the precise tradeoff
between the two approaches has not been studied, to the best of my
knowledge.
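Here is the sketch promised above: a few lines of Python, again entirely
mine and only for illustration, showing that eliminating a vertex from
the interaction graph connects its remaining neighbors, and that the
original edges plus the accumulated fill-in edges form a triangulated
(chordal) graph.

def eliminate_vertices(edges, order):
    # Interaction-graph view of elimination: removing a vertex first
    # connects all of its remaining neighbors ("fill-in" edges).
    graph = {frozenset(e) for e in edges}
    remaining = {v for e in graph for v in e}
    for v in order:
        nbrs = {u for e in graph if v in e
                  for u in e if u != v and u in remaining}
        for u in nbrs:
            for w in nbrs:
                if u != w:
                    graph.add(frozenset((u, w)))  # fill-in edge, if new
        remaining.discard(v)
    return graph  # original edges plus all fill-ins: a chordal graph

# The 4-cycle A-B-C-D is not triangulated; eliminating B first adds the
# chord A-C, and the resulting edge set is chordal.
cycle = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "A")]
tri = eliminate_vertices(cycle, ["B", "A", "C", "D"])
print(sorted(tuple(sorted(e)) for e in tri))
# [('A', 'B'), ('A', 'C'), ('A', 'D'), ('B', 'C'), ('C', 'D')]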
Cheers,
Marco
Marco Valtorta Associate Professor
Dept. of Computer Science and Engineering mgv@cse.sc.edu
University of South Carolina tel: (1)(803)777-4641 fax: -3767
Columbia, SC 29208, U.S.A. http://www.cse.sc.edu/~mgv/ tlx: 805038 USC
---------------------------------------------------------------------------
"Probability is not about numbers. It is about the structure of reasoning."
--Glenn Shafer
---------------------------------------------------------------------------