Training Second-Order Hidden Markov Models with Multiple Observation Sequences
Second-order hidden Markov models (HMM2) have been widely used in pattern recognition, especially in speech recognition. Their main advantage is their ability to model noisy temporal signals of variable length. In this article, we introduce a new HMM2 with multiple observable sequences, assuming that all the observable sequences are statistically correlated. In this treatment, the multiple observation probability is expressed as a combination of individual observation probabilities without losing generality. This combinatorial method gives one more degree of freedom in making different dependence/independence assumptions. By generalizing Baum's auxiliary function into this framework and building up an associated objective function using the Lagrange multiplier method, several new formulae solving the model training problem are theoretically derived. We show that the model training equations can be easily derived under an independence assumption.
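The sketch below illustrates the two ingredients the abstract refers to: a forward pass for a second-order HMM, and the combination of per-sequence observation probabilities, where the independence assumption makes the joint probability a simple product and a weighted combination (the weights here are an illustrative assumption, not the authors' exact formulation) covers the correlated case. All function names, shapes, and conventions are assumptions for illustration, not the paper's notation.

```python
import numpy as np

def forward_hmm2(pi, A1, A2, B, obs):
    """Forward pass for a second-order HMM (minimal sketch).

    pi  : (N,)      initial state distribution
    A1  : (N, N)    first-order transitions a_ij (used for the step t=1 -> t=2)
    A2  : (N, N, N) second-order transitions a_ijk = P(q_{t+1}=k | q_{t-1}=i, q_t=j)
    B   : (N, M)    emission probabilities b_j(o)
    obs : (T,)      observation symbol indices
    Returns P(obs | model).
    """
    N, T = len(pi), len(obs)
    if T == 1:
        return float(np.sum(pi * B[:, obs[0]]))
    # alpha[i, j] = P(o_1..o_t, q_{t-1} = i, q_t = j)
    alpha = (pi * B[:, obs[0]])[:, None] * A1 * B[None, :, obs[1]]
    for t in range(2, T):
        # alpha'[j, k] = ( sum_i alpha[i, j] * a_ijk ) * b_k(o_t)
        alpha = np.einsum("ij,ijk->jk", alpha, A2) * B[None, :, obs[t]]
    return float(alpha.sum())

def multi_obs_probability(pi, A1, A2, B, sequences, weights=None):
    """Combine individual observation probabilities of several sequences.

    Under the independence assumption the joint probability factorizes into a
    product; passing weights instead forms a weighted combination (a modelling
    assumption used here only to illustrate the extra degree of freedom)."""
    probs = np.array([forward_hmm2(pi, A1, A2, B, o) for o in sequences])
    if weights is None:
        return float(np.prod(probs))      # independence assumption
    return float(np.dot(weights, probs))  # combinatorial (weighted) form
```

In a Baum-Welch style re-estimation, each sequence contributes its own forward-backward statistics, and the combination rule above determines how those per-sequence statistics are weighted when the model parameters are updated.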
Second-order hidden Markov models; forward-backward procedure; Baum-Welch algorithm; multiple observable sequences
Du Shiping, Chen Tao, Zeng Xianyin, Wang Jian, Wei Yuming
College of Biology and Science, Sichuan Agricultural University, Yaan 625014, Sichuan, China; Triticeae Research Institute, Sichuan Agricultural University, Yaan 625014, Sichuan, China
International conference
Chongqing
English
25-29
2009-12-25 (date first posted on the Wanfang platform; not the publication date of the paper)