In this seminar I will talk about the work of Hsu and Kakade (2009) on spectral methods for learning a Hidden Markov Model (HMM). I will introduce and exemplify this method and describe how we can estimate the best set of state transitions or predict a sequence of future observations. Hidden Markov Models are one of the most fundamental tools for modeling discrete time series, but learning an HMM is in general computationally hard, and the classical approach to this problem resorts to search heuristics, such as expectation maximization, which are prone to local optima. I will talk about a new approach that learns an HMM based on a spectral decomposition of its parameters. This method depends only implicitly on the number of distinct observations, making the algorithm particularly applicable to settings with a large number of observations, such as those in natural language processing.
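To give a rough idea of what "learning from a spectral decomposition" looks like in practice, here is a minimal sketch in the spirit of the spectral HMM algorithm discussed above. It assumes discrete observations indexed 0..n-1 and access to i.i.d. triples of consecutive observations (x1, x2, x3) from the HMM; all function and variable names are illustrative, not taken from the paper, and the sketch omits the statistical refinements of the actual method.

```python
import numpy as np


def learn_spectral_hmm(triples, n_obs, n_states):
    """Sketch: estimate an observable-operator representation of an HMM.

    triples  : iterable of (x1, x2, x3) integer observation triples
    n_obs    : number of distinct observations
    n_states : assumed number of hidden states (rank of the decomposition)
    """
    # Empirical low-order moments of consecutive observations.
    P1 = np.zeros(n_obs)                     # P1[i]         ~ Pr[x1 = i]
    P21 = np.zeros((n_obs, n_obs))           # P21[i, j]     ~ Pr[x2 = i, x1 = j]
    P3x1 = np.zeros((n_obs, n_obs, n_obs))   # P3x1[x][i, j] ~ Pr[x3 = i, x2 = x, x1 = j]
    triples = list(triples)
    for x1, x2, x3 in triples:
        P1[x1] += 1
        P21[x2, x1] += 1
        P3x1[x2][x3, x1] += 1
    P1 /= len(triples)
    P21 /= len(triples)
    P3x1 /= len(triples)

    # Spectral step: top left singular vectors of the observation-pair matrix.
    U, _, _ = np.linalg.svd(P21)
    U = U[:, :n_states]

    # Observable operators B_x and boundary vectors b1, binf.
    pinv = np.linalg.pinv(U.T @ P21)
    b1 = U.T @ P1
    binf = np.linalg.pinv(P21.T @ U) @ P1
    B = [U.T @ P3x1[x] @ pinv for x in range(n_obs)]
    return b1, binf, B


def joint_probability(seq, b1, binf, B):
    """Estimate Pr[x_1, ..., x_t] for an observation sequence `seq`."""
    b = b1
    for x in seq:
        b = B[x] @ b   # apply the operator of each observation in order
    return float(binf @ b)
```

Note that the learned quantities involve only observable statistics (singular vectors of a matrix of pair probabilities), which is what avoids the local-optima issues of likelihood-based search heuristics.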
Spectral Algorithms for Learning Hidden Markov Models
March 26, 2013
1:00 pm