Re: [UAI] learning HMM structure

From: Kevin Murphy (murphyk@cs.berkeley.edu)
Date: Wed Jun 13 2001 - 13:51:01 PDT

    MAP estimation of HMMs using a prior that encourages 0s is an effective
    way of learning structure (i.e., sparse transition matrices). See

    @article{Brand99,
      author = "M. Brand",
      title = "Structure learning in conditional probability models via an
    entropic prior and parameter extinction",
      journal = "Neural Computation",
      year = 1999,
      volume = 11,
      pages = "1155--1182"
    }
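
    To make the entropic-prior idea concrete, here is a rough SciPy sketch
    (mine, not Brand's code) of the MAP update for one row of expected
    transition counts. The stationarity condition for the entropic prior can
    be solved with the Lambert W function; the sketch assumes the row has at
    least one reasonably large expected count, and the trimming threshold in
    the example is arbitrary.

    import numpy as np
    from scipy.special import lambertw

    def entropic_map_row(omega):
        """MAP estimate of one multinomial row (e.g. one row of an HMM
        transition matrix) under the entropic prior
        P(theta) ~ prod_i theta_i**theta_i, given expected counts omega
        from the E-step.  The stationarity condition
            omega_i/theta_i + log(theta_i) + 1 + lam = 0
        gives theta_i = -omega_i / W_{-1}(-omega_i * exp(1 + lam)); lam is
        found by bisection so the row sums to one.  Zero-count entries are
        extinguished (set exactly to zero)."""
        omega = np.asarray(omega, dtype=float)
        theta = np.zeros_like(omega)
        alive = omega > 0
        w = omega[alive]
        if w.size == 0:
            return theta                 # state never visited; leave row at zero

        def theta_of(lam):
            return -w / lambertw(-w * np.exp(1.0 + lam), k=-1).real

        # Largest lam keeping every Lambert-W argument in its real domain.
        lam_hi = -2.0 - np.log(w.max()) - 1e-9
        if theta_of(lam_hi).sum() < 1.0:
            # Too little evidence in this row for the closed form (all
            # counts far below one); fall back to the plain ML estimate.
            theta[alive] = w / w.sum()
            return theta
        lam_lo = lam_hi - 2.0
        while theta_of(lam_lo).sum() > 1.0:  # widen the bracket geometrically
            lam_lo = lam_hi - 2.0 * (lam_hi - lam_lo)
        for _ in range(200):                 # bisect on the multiplier lam
            lam_mid = 0.5 * (lam_lo + lam_hi)
            if theta_of(lam_mid).sum() > 1.0:
                lam_hi = lam_mid
            else:
                lam_lo = lam_mid
        theta[alive] = theta_of(lam_lo)
        return theta / theta.sum()

    # Example: expected transition counts for one state.  Weakly supported
    # transitions end up below their ML estimates and can be trimmed
    # ("parameter extinction") to sparsify the matrix; the 1e-3 cutoff is
    # arbitrary.
    row = entropic_map_row([12.0, 6.0, 0.02, 0.0])
    row[row < 1e-3] = 0.0
    print(row / row.sum())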

    Another classic approach is to start from a model that fits the training
    data exactly (essentially a tree with one state per observation) and then
    merge states to simplify the model. See

    @inproceedings{Stolcke92,
      author = "A. Stolcke and S. M. Omohundro",
      title = "Hidden Markov Model Induction by Bayesian Model Merging",
      year = 1992,
      booktitle = "NIPS-5"
    }
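
    As a toy illustration of the merging idea (not the paper's actual
    algorithm), the sketch below starts with one state per observation,
    scores the data with ML parameters computed from the fixed path counts,
    and greedily merges the pair of states that most improves
    log-likelihood minus a per-state penalty. The fixed penalty is a crude
    stand-in for the Dirichlet and structural priors the paper actually
    uses, and all names here are made up.

    from collections import defaultdict
    from itertools import combinations
    from math import log

    def counts_loglik(start, trans, emit):
        """Data log-likelihood with ML parameters, computed from the fixed
        path counts (each table maps an outcome to its count)."""
        def term(table):
            total = sum(table.values())
            return sum(c * log(c / total) for c in table.values())
        return (term(start)
                + sum(term(row) for row in trans.values())
                + sum(term(row) for row in emit.values()))

    def merge_states(start, trans, emit, keep, drop):
        """Return new count tables with state `drop` folded into `keep`."""
        def fold(table):
            out = defaultdict(float)
            for k, v in table.items():
                out[keep if k == drop else k] += v
            return dict(out)
        new_trans, new_emit = {}, {}
        for src in trans:
            row = new_trans.setdefault(keep if src == drop else src,
                                       defaultdict(float))
            for nxt, c in trans[src].items():
                row[keep if nxt == drop else nxt] += c
        for s in emit:
            row = new_emit.setdefault(keep if s == drop else s,
                                      defaultdict(float))
            for sym, c in emit[s].items():
                row[sym] += c
        return (fold(start),
                {s: dict(r) for s, r in new_trans.items()},
                {s: dict(r) for s, r in new_emit.items()})

    def model_merging(sequences, state_penalty=2.0):
        """Start with one state per observation (the model fits the data
        exactly), then greedily merge the pair of states that most improves
        log-likelihood - state_penalty * num_states."""
        start, trans, emit, uid = defaultdict(float), {}, {}, 0
        for seq in sequences:
            states = list(range(uid, uid + len(seq)))
            uid += len(seq)
            start[states[0]] += 1
            for s, sym in zip(states, seq):
                emit[s] = {sym: 1.0}
            for s, nxt in zip(states, states[1:]):
                trans[s] = {nxt: 1.0}
            trans[states[-1]] = {"END": 1.0}  # end-of-sequence marker
        model = (dict(start), trans, emit)

        def score(m):
            return counts_loglik(*m) - state_penalty * len(m[2])

        while True:
            candidates = [merge_states(*model, a, b)
                          for a, b in combinations(sorted(model[2]), 2)]
            best = max(candidates, key=score, default=None)
            if best is None or score(best) <= score(model):
                break
            model = best
        return model

    final_start, final_trans, final_emit = model_merging(["ab", "ab", "abab"])
    print("states kept:", sorted(final_emit))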

    HTH,
    Kevin


