Two new papers in the Journal of Machine Learning Research

From: David 'Pablo' Cohn (David.Cohn@acm.org)
Date: Tue Oct 31 2000 - 09:55:53 PST

    [Apologies for the broad distribution. Future announcements of this type
    will only be posted to jmlr-announce@ai.mit.edu. See the end of this
    message for information on subscribing.]

    The Journal of Machine Learning Research is pleased to announce the
    availability of two papers in electronic form.

    ----------------------------------------
    Learning with Mixtures of Trees
    Marina Meila and Michael I. Jordan.
    Journal of Machine Learning Research 1 (October 2000), pp. 1-48.

    Abstract
    This paper describes the mixtures-of-trees model, a probabilistic model for
    discrete multidimensional domains. Mixtures-of-trees generalize the
    probabilistic trees of Chow and Liu (1968) in a different and complementary
    direction to that of Bayesian networks. We present efficient algorithms for
    learning mixtures-of-trees models in maximum likelihood and Bayesian
    frameworks. We also discuss additional efficiencies that can be obtained
    when data are "sparse," and we present data structures and algorithms that
    exploit such sparseness. Experimental results demonstrate the performance
    of the model for both density estimation and classification. We also
    discuss the sense in which tree-based classifiers perform an implicit form
    of feature selection, and demonstrate a resulting insensitivity to
    irrelevant attributes.
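
    For readers who want to see the Chow and Liu (1968) building block that
    the abstract refers to, here is a minimal Python sketch under simplifying
    assumptions: it weights every pair of discrete variables by empirical
    mutual information and keeps a maximum-weight spanning tree. The function
    names and data layout are illustrative, not code from the paper, and the
    paper's mixture model goes further by fitting several such trees jointly.

        import numpy as np
        from collections import Counter

        def mutual_information(x, y):
            # Empirical mutual information between two discrete columns.
            n = len(x)
            px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
            mi = 0.0
            for (a, b), c in pxy.items():
                mi += (c / n) * np.log(c * n / (px[a] * py[b]))
            return mi

        def chow_liu_tree(data):
            # Maximum-weight spanning tree over the variables, with pairwise
            # mutual information as edge weights (Prim's algorithm).
            n_vars = data.shape[1]
            w = np.zeros((n_vars, n_vars))
            for i in range(n_vars):
                for j in range(i + 1, n_vars):
                    w[i, j] = w[j, i] = mutual_information(data[:, i], data[:, j])
            in_tree, edges = {0}, []
            while len(in_tree) < n_vars:
                i, j = max(((a, b) for a in in_tree for b in range(n_vars)
                            if b not in in_tree), key=lambda e: w[e])
                edges.append((i, j))
                in_tree.add(j)
            return edges

        # Toy data: column 2 copies column 0, so the learned tree links them.
        rng = np.random.default_rng(0)
        x0 = rng.integers(0, 2, 500)
        data = np.column_stack([x0, rng.integers(0, 2, 500), x0])
        print(chow_liu_tree(data))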

    ----------------------------------------
    Dependency Networks for Inference, Collaborative Filtering, and Data
    Visualization
    David Heckerman, David Maxwell Chickering, Christopher Meek, Robert
    Rounthwaite, and Carl Kadie.
    Journal of Machine Learning Research 1 (October 2000), pp. 49-75.

    Abstract
    We describe a graphical model for probabilistic relationships--an
    alternative to the Bayesian network--called a dependency network. The graph
    of a dependency network, unlike a Bayesian network, is potentially cyclic.
    The probability component of a dependency network, like a Bayesian network,
    is a set of conditional distributions, one for each node given its parents.
    We identify several basic properties of this representation and describe a
    computationally efficient procedure for learning the graph and probability
    components from data. We describe the application of this representation to
    probabilistic inference, collaborative filtering (the task of predicting
    preferences), and the visualization of acausal predictive relationships.
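
    To make the representation concrete, the sketch below fits one smoothed
    conditional table per node given a (possibly cyclic) parent set, then
    draws approximate joint samples with a Gibbs-style sweep through those
    conditionals, one simple route to the probabilistic inference the
    abstract mentions. The parent sets, smoothing constant, and
    binary-variable restriction are simplifying assumptions of this sketch,
    not details from the paper.

        import numpy as np
        from collections import defaultdict

        def fit_conditionals(data, parents, alpha=1.0):
            # One smoothed table per node: P(x_i = 1 | values of its parents).
            tables = {}
            for i, pa in parents.items():
                counts = defaultdict(lambda: [alpha, alpha])  # [x_i = 0, x_i = 1]
                for row in data:
                    counts[tuple(row[p] for p in pa)][row[i]] += 1
                tables[i] = {k: v[1] / (v[0] + v[1]) for k, v in counts.items()}
            return tables

        def ordered_gibbs(tables, parents, n_vars, n_steps, rng):
            # Sweep the nodes in a fixed order, resampling each from its
            # conditional given the current values of its parents.
            x = rng.integers(0, 2, n_vars)
            for _ in range(n_steps):
                for i in range(n_vars):
                    key = tuple(x[p] for p in parents[i])
                    p1 = tables[i].get(key, 0.5)  # fallback for unseen parent values
                    x[i] = int(rng.random() < p1)
            return x

        # Toy data with the cyclic structure 0 -> 1 and 1 -> 0, which is
        # legal in a dependency network but not in a Bayesian network.
        rng = np.random.default_rng(1)
        x0 = rng.integers(0, 2, 1000)
        x1 = x0 ^ (rng.random(1000) < 0.1).astype(int)  # x1 mostly copies x0
        data = np.column_stack([x0, x1])
        parents = {0: [1], 1: [0]}
        tables = fit_conditionals(data, parents)
        print(ordered_gibbs(tables, parents, n_vars=2, n_steps=50, rng=rng))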

    These first two papers of Volume 1 are available at http://www.jmlr.org in
    PostScript, PDF, and HTML formats; a bound hardcopy edition of Volume 1
    will be available within the next year.

    -David Cohn, <david.cohn@acm.org>
      Managing Editor, Journal of Machine Learning Research

    -------
    This message has been sent to the mailing list "jmlr-announce@ai.mit.edu",
    which is maintained automatically by majordomo. To subscribe to the list,
    send email to listserv@ai.mit.edu with the line "subscribe jmlr-announce"
    in the body; to unsubscribe, send email to listserv@ai.mit.edu with the
    line "unsubscribe jmlr-announce" in the body.


