Tags: python, hidden-markov-models, hmmlearn

hmmlearn: how to get the predicted hidden-state probabilities at time T+1, given a full observation sequence 1:T


I'm using hmmlearn's GaussianHMM to train a Hidden Markov Model with Gaussian observations. Each hidden state k has its corresponding Gaussian parameters: mu_k, Sigma_k.

After training the model, I would like to calculate the following quantity:

P(z_{T+1} = j | x_{1:T}),

where j = 1, 2, ..., K, and K is the number of hidden states.

The above probability is the one-step-ahead hidden-state distribution given the full sequence of observations x_1, x_2, ..., x_T, where the x_i, i = 1, ..., T, are the same observations used to train the HMM.
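For a first-order HMM such as the one GaussianHMM fits, this quantity follows from marginalizing over the hidden state at time T (the Markov property makes z_{T+1} independent of x_{1:T} given z_T):

    P(z_{T+1} = j \mid x_{1:T}) = \sum_{k=1}^{K} P(z_T = k \mid x_{1:T}) \, P(z_{T+1} = j \mid z_T = k) = \sum_{k=1}^{K} P(z_T = k \mid x_{1:T}) \, A_{kj}

where A is the K x K transition matrix (model.transmat_ in hmmlearn).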

I read the documentation, but couldn't find a function to calculate this probability. Is there any workaround?


Solution

  • The probability you are looking for is simply one row of the transition matrix: row k of the transition matrix gives the probability of transitioning to each state at time t+1, given that the system is in state k at time t.

    To determine which state the system is in at time t, given the observations x_1, ..., x_t, you can use the Viterbi algorithm, which is the default decoding algorithm of the predict method in hmmlearn.

    from hmmlearn import hmm

    # Viterbi decoding is the default 'algorithm' setting, so predict() returns the most likely state path.
    model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100)
    model.fit(data)                       # data: observations x_1, ..., x_T, shape (T, n_features)
    state_sequence = model.predict(data)  # last element is the decoded state at time T
    prob_next_step = model.transmat_[state_sequence[-1], :]  # its row of the transition matrix
    

    I suggest you take a closer look at this documentation, which shows concrete usage examples. An exact, posterior-weighted variant of the computation above is sketched just after this answer.
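A minimal sketch of that exact variant, assuming the observations are stacked into a single (T, n_features) array (the synthetic X below is only a placeholder for your real data): instead of committing to the single Viterbi state at time T, weight the rows of the transition matrix by the posterior over the final state, which predict_proba provides.

    import numpy as np
    from hmmlearn import hmm

    # Placeholder observations: a single sequence of length T = 500 with 2 features.
    # Replace with your real x_1, ..., x_T stacked row-wise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))

    model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100)
    model.fit(X)

    # Exact computation: P(z_{T+1} = j | x_{1:T}) = sum_k P(z_T = k | x_{1:T}) * transmat_[k, j].
    # predict_proba returns the posterior P(z_t = k | x_{1:T}) for every t; its last row is the
    # posterior over the final state, which is pushed through the transition matrix.
    posterior_last = model.predict_proba(X)[-1]          # shape (K,)
    prob_next_exact = posterior_last @ model.transmat_   # shape (K,), sums to 1

    # Viterbi-based variant from the answer above: condition on the single most likely
    # state at time T instead of averaging over the posterior.
    prob_next_viterbi = model.transmat_[model.predict(X)[-1], :]

    print(prob_next_exact)
    print(prob_next_viterbi)

Both vectors have length K; the Viterbi-based row approximates the exact result whenever the posterior over the final state is sharply concentrated on one state.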