Re: [UAI] A general-purpose algorithm for belief networks?

From: Fabio Gagliardi Cozman (fgcozman@usp.br)
Date: Wed Apr 04 2001 - 12:34:31 PDT

    Dear colleagues,

        The following answer to one of Will Briggs's questions may be
        of interest to others (question: where to find a general, easy
        algorithm for inference in general Bayesian networks).
        
        The easiest inference algorithm to understand and teach is (in my
        opinion, anyway) the variable elimination algorithm, also called
        bucket elimination. The drawback of variable elimination is that
        it does not generate marginal probabilities for all variables in
        a network, only for a query variable.
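
        To make the discussion concrete, here is a small Python sketch of
        single-query variable elimination. The toy network, the variable
        names, and the restriction to binary variables with no evidence
        handling are simplifying assumptions of the sketch only; they are
        not taken from the paper or from JavaBayes.

    from itertools import product

    class Factor:
        """A probability table over a tuple of binary (0/1) variables."""
        def __init__(self, scope, table):
            self.vars = tuple(scope)    # variable names, in a fixed order
            self.table = dict(table)    # assignment tuple -> number

        def multiply(self, other):
            scope = self.vars + tuple(v for v in other.vars if v not in self.vars)
            table = {}
            for assign in product([0, 1], repeat=len(scope)):
                full = dict(zip(scope, assign))
                table[assign] = (self.table[tuple(full[v] for v in self.vars)]
                                 * other.table[tuple(full[v] for v in other.vars)])
            return Factor(scope, table)

        def sum_out(self, var):
            scope = tuple(v for v in self.vars if v != var)
            table = {}
            for assign, p in self.table.items():
                rest = tuple(a for v, a in zip(self.vars, assign) if v != var)
                table[rest] = table.get(rest, 0.0) + p
            return Factor(scope, table)

    def query_marginal(factors, query, order):
        """Single-query variable elimination; `order` lists every variable except `query`."""
        assert query not in order
        factors = list(factors)
        for var in order:
            bucket = [f for f in factors if var in f.vars]   # the "bucket" of var
            rest = [f for f in factors if var not in f.vars]
            if not bucket:
                continue
            message = bucket[0]
            for f in bucket[1:]:
                message = message.multiply(f)
            factors = rest + [message.sum_out(var)]          # replace the bucket by its message
        answer = factors[0]
        for f in factors[1:]:
            answer = answer.multiply(f)
        total = sum(answer.table.values())
        return {k: p / total for k, p in answer.table.items()}

    # Hypothetical toy network R -> S, (R, S) -> W, with made-up numbers.
    pR = Factor(['R'], {(0,): 0.8, (1,): 0.2})
    pSR = Factor(['S', 'R'], {(0, 0): 0.6, (1, 0): 0.4, (0, 1): 0.99, (1, 1): 0.01})
    pWRS = Factor(['W', 'R', 'S'],
                  {(0, 0, 0): 1.0, (1, 0, 0): 0.0, (0, 0, 1): 0.1, (1, 0, 1): 0.9,
                   (0, 1, 0): 0.2, (1, 1, 0): 0.8, (0, 1, 1): 0.05, (1, 1, 1): 0.95})
    print(query_marginal([pR, pSR, pWRS], 'W', order=['R', 'S']))    # -> P(W)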

        But it is actually possible to derive a more general form of
        variable elimination, in which a second pass generates *all*
        marginal probabilities. The complexity is similar to that of the
        usual junction tree algorithms, and the result is essentially the
        same. The bonus is that the operations are much easier to
        understand, without any mention of triangulation or other
        graph-theoretic techniques. The result resembles a tree
        propagation algorithm, but operating on a tree of clusters.
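
        Just to convey the flavor of the second pass, here is a sketch
        reusing the Factor class from the sketch above, under the same
        simplifying assumptions (binary variables, no evidence, an
        elimination ordering that covers all variables). The inward pass
        is ordinary variable elimination that also records where each
        message is sent, yielding a tree of buckets; the outward pass
        then sends messages back down that tree so every bucket can
        report the marginal of its variable. This is only my compact
        illustration of the idea; please consult the paper for the
        actual presentation.

    def all_marginals(factors, order):
        """Marginals of every variable in `order`, via two passes on the bucket tree."""
        pos = {v: i for i, v in enumerate(order)}
        unit = Factor((), {(): 1.0})                     # multiplicative identity

        def product_of(fs):
            result = unit
            for f in fs:
                result = result.multiply(f)
            return result

        def project(f, keep):                            # sum out everything not in `keep`
            for v in [v for v in f.vars if v not in keep]:
                f = f.sum_out(v)
            return f

        # Place each original factor in the bucket of its earliest-eliminated variable.
        bucket = {v: [] for v in order}
        for f in factors:
            bucket[min(f.vars, key=pos.get)].append(f)

        # Inward pass: ordinary variable elimination, recording the tree of buckets.
        parent, inward = {}, {}
        children = {v: [] for v in order}
        for v in order:
            lam = product_of(bucket[v] + [inward[c] for c in children[v]]).sum_out(v)
            inward[v] = lam
            if lam.vars:                                 # route the message to the next bucket
                parent[v] = min(lam.vars, key=pos.get)
                children[parent[v]].append(v)

        # Outward pass: each bucket sends back to every child the product of
        # everything else it holds, projected onto that child's message scope.
        outward = {}
        for v in reversed(order):
            held = bucket[v] + ([outward[v]] if v in outward else [])
            for c in children[v]:
                siblings = [inward[s] for s in children[v] if s != c]
                outward[c] = project(product_of(held + siblings), set(inward[c].vars))

        # Each bucket now holds enough information to report its own marginal.
        marginals = {}
        for v in order:
            seen = (bucket[v] + [inward[c] for c in children[v]]
                    + ([outward[v]] if v in outward else []))
            belief = project(product_of(seen), {v})
            total = sum(belief.table.values())
            marginals[v] = {k: p / total for k, p in belief.table.items()}
        return marginals

    # With the toy factors above, this returns P(R), P(S) and P(W) in one call:
    # print(all_marginals([pR, pSR, pWRS], order=['R', 'S', 'W']))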
        
        The more general variable elimination is presented at

    Fabio G. Cozman, Generalizing Variable Elimination in
    Bayesian Networks, Workshop on Probabilistic Reasoning in
    Artificial Intelligence, pp. 27-32, Atibaia, Brazil, 2000.

        (get this paper from
          http://www.cs.cmu.edu/~javabayes/Download/jb-heading.ps.gz)
        
        This paper presents standard variable elimination and then
        adds the second pass that generates all marginal probabilities.
        I tried to make the presentation compact and understandable,
        without mathematical complications (no theorems, just
        explanations). I have used the paper as teaching material
        and I think it works well. I would of course be happy to
        receive comments.
        
        This general variable elimination scheme is implemented
        in the JavaBayes system (a system for manipulation of
        Bayesian networks available at http://www.cs.cmu.edu/~javabayes).
        Because JavaBayes is distributed under the GNU license,
        the source code is available and you can read the
        implementation of the general variable elimination
        scheme.
        
        I hope this is useful; as I said, I would appreciate
        comments on the algorithm/paper/JavaBayes.
        
          Best,
          
                Fabio Cozman
                
                
    PS: Variable elimination uses a simple idea that already appears
    in the peeling algorithms of Cannings, Thompson and Skolnick;
    the idea is also mentioned in the 1988 paper by Lauritzen and
    Spiegelhalter. The idea resurfaced in 1996 in two papers, one by
    Zhang and Poole (which used the term "variable elimination") and
    another by Dechter (which used "bucket elimination"). There have
    been efforts to derive inference algorithms that use junction trees
    without the triangulation step (I know of results by Draper and
    by Darwiche), but the general variable elimination scheme seems
    simpler and more direct. The papers containing these results are:

    C. Cannings and E. A. Thompson and M. H. Skolnick, Probability
    Functions in Complex Pedigrees, Advances in Applied Probability,
    10, pp. 26-61, 1978.

    S. L. Lauritzen and D. J. Spiegelhalter, Local Computations with
    Probabilities on Graphical Structures and their Application to
    Expert Systems, Journal of the Royal Statistical Society, Series B,
    50(2):157-224, 1988.

    N. L. Zhang and D. Poole, Exploiting Causal Independence in Bayesian
    Network Inference, Journal of Artificial Intelligence Research,
    5:301-328, 1996.

    R. Dechter, Bucket Elimination: A Unifying Framework for
    Probabilistic Inference, Proceedings of the Twelfth Conference on
    Uncertainty in Artificial Intelligence, E. Horvitz and F. Jensen
    (eds.), pp. 211-219, Morgan Kaufmann, San Francisco, 1996. (There
    is also an extended version in the book Learning in Graphical
    Models, MIT Press, 1999.)

    D. Draper, Clustering Without (Thinking About) Triangulation,
    Proceedings of the Eleventh Conference on Uncertainty in Artificial
    Intelligence, P. Besnard and S. Hanks (eds.), Morgan Kaufmann,
    San Francisco, 1995.

    Adnan Darwiche, Dynamic Jointrees, Proceedings of the Fourteenth
    Conference on Uncertainty in Artificial Intelligence, G. F. Cooper
    and S. Moral (eds.), Morgan Kaufmann, San Francisco, 1998.

    On Tue, 3 Apr 2001, Briggs, Will wrote:
    > Is there an algorithm that would handle any belief net?
    > Has anyone written it down or implemented it?
    > I had thought of using the one Russell & Norvig presents for polytrees, on
    > nets that _aren't_ polytrees. (I do realize this will give the wrong
    > answer, but hope we can tweak it to fix the problem.)


