Learning the structure of HMMs can be handled within the framework of Dynamic
Bayesian Networks (DBNs). Smyth97 is a good reference showing that an HMM is a
specific instance of this framework. Friedman98a and the references therein
discuss general structure-learning algorithms and scoring metrics for DBNs. In
our recent work we use the DBN setting to learn the structure of HMMs for
speech recognition tasks; initial results will appear at Eurospeech 2001.
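
Regarding scoring functions: one standard choice is the BIC score, i.e. the
log-likelihood of the data minus a penalty on the number of free parameters.
Below is a minimal Python sketch (my own illustration, not code from the papers
above) that compares two candidate HMM structures with different numbers of
hidden states. It assumes maximum-likelihood parameter estimates are already
available (e.g. from a Baum-Welch run elsewhere); all numbers are made up.

# Minimal sketch: BIC-style scoring of discrete HMMs, assuming parameters
# are already estimated. Log-likelihood comes from the scaled forward
# algorithm; the penalty counts free parameters of the structure.
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi[i]   : initial probability of state i
    A[i][j] : transition probability from state i to state j
    B[i][o] : probability of emitting symbol o from state i
    """
    n = len(pi)
    # Scaled forward recursion to avoid underflow on long sequences.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    scale = sum(alpha)
    loglik = math.log(scale)
    alpha = [a / scale for a in alpha]
    for t in range(1, len(obs)):
        alpha = [
            sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
            for j in range(n)
        ]
        scale = sum(alpha)
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return loglik

def bic_score(obs, pi, A, B, n_symbols):
    """BIC = log-likelihood - (k/2) * log(T), with k free parameters."""
    n = len(pi)
    # Free parameters: initial (n-1), transitions n(n-1), emissions n(V-1).
    k = (n - 1) + n * (n - 1) + n * (n_symbols - 1)
    return forward_loglik(obs, pi, A, B) - 0.5 * k * math.log(len(obs))

# Compare two candidate structures (2 vs. 3 hidden states) on the same data.
obs = [0, 1, 1, 0, 2, 2, 1, 0, 0, 2, 1, 1]
hmm2 = dict(pi=[0.6, 0.4],
            A=[[0.7, 0.3], [0.4, 0.6]],
            B=[[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
hmm3 = dict(pi=[0.5, 0.3, 0.2],
            A=[[0.6, 0.3, 0.1], [0.2, 0.6, 0.2], [0.1, 0.3, 0.6]],
            B=[[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.1, 0.2, 0.7]])
for name, m in [("2-state", hmm2), ("3-state", hmm3)]:
    print(name, bic_score(obs, n_symbols=3, **m))

The structure with the higher BIC score is preferred; the penalty keeps the
larger model from winning on likelihood alone.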
@Article{Smyth97,
  author  = {Padhraic Smyth and David Heckerman and Michael I. Jordan},
  title   = {Probabilistic Independence Networks for Hidden Markov Probability Models},
  journal = {Neural Computation},
  year    = {1997},
  volume  = {9},
  number  = {2},
  pages   = {227--269},
}

@InProceedings{Friedman98a,
  author    = {Nir Friedman and Kevin Murphy and Stuart Russell},
  title     = {Learning the Structure of Dynamic Probabilistic Networks},
  booktitle = {Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence (UAI '98)},
  year      = {1998},
  address   = {Madison, Wisconsin},
}
Best Regards,
Murat Deviren
INRIA-LORIA
15, rue du Jardin Botanique - B.P. 101
54602 Villers-les-Nancy FRANCE
http://www.loria.fr/equipes/parole
Denver Dash wrote:
> Hi,
>
> Can anybody point me to some references for learning the structure of Markov chains and Hidden Markov Models from data?
>
> Also of interest would be some standard references discussing metrics or scoring functions for HMMs.
>
> Thanks,
> Denver.
> --
> http://www.sis.pitt.edu/~ddash