I'm going to rewrite a MATLAB script that uses Kevin Murphy's toolbox in Python.
I know there are Python implementations of the main HMM algorithms (Viterbi, Baum-Welch, forward-backward), so I think I have everything I need to do the MATLAB-to-Python port.
My MATLAB script uses the procedure written in learn_dhmm.m:
function [LL, prior, transmat, obsmat, gamma] = learn_dhmm(data, prior, transmat, obsmat, max_iter, thresh, verbose, act, adj_prior, adj_trans, adj_obs, dirichlet)
% LEARN_HMM Find the ML parameters of an HMM with discrete outputs using EM.
%
% [LL, PRIOR, TRANSMAT, OBSMAT] = LEARN_HMM(DATA, PRIOR0, TRANSMAT0, OBSMAT0)
% computes maximum likelihood estimates of the following parameters,
% where, for each time t, Q(t) is the hidden state, and
% Y(t) is the observation
% prior(i) = Pr(Q(1) = i)
% transmat(i,j) = Pr(Q(t+1)=j | Q(t)=i)
% obsmat(i,o) = Pr(Y(t)=o | Q(t)=i)
% It uses PRIOR0 as the initial estimate of PRIOR, etc.
I don't understand what this procedure actually does.
Sorry, I'm just getting started with machine learning.
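To be concrete, this is how I would set up the initial parameters in NumPy for a toy model with 2 hidden states and 3 observation symbols. This is just my reading of the comments above, so please correct me if the shapes or meanings are wrong:

    import numpy as np

    # Toy model: 2 hidden states, 3 discrete observation symbols (0, 1, 2).
    prior0 = np.array([0.6, 0.4])             # prior(i)      = Pr(Q(1) = i)
    transmat0 = np.array([[0.7, 0.3],         # transmat(i,j) = Pr(Q(t+1)=j | Q(t)=i)
                          [0.4, 0.6]])
    obsmat0 = np.array([[0.5, 0.4, 0.1],      # obsmat(i,o)   = Pr(Y(t)=o | Q(t)=i)
                        [0.1, 0.3, 0.6]])

    # One observation sequence, encoded as integer symbols.
    data = np.array([0, 1, 2, 2, 1, 0, 0, 2])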
I think the comment already says it: the function finds the maximum-likelihood (ML) parameters of an HMM with discrete outputs using EM (the Baum-Welch algorithm).
You can read this classic paper to understand HMMs: L. Rabiner, "A tutorial on Hidden Markov Models and selected applications in speech recognition", Proc. IEEE 77(2):257-286, 1989.
The function above solves Problem 3 (parameter estimation, page 264) in that paper.
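If it helps to see the idea in Python, here is a minimal Baum-Welch (EM) sketch for a single discrete observation sequence. It mirrors the roles of prior, transmat and obsmat from learn_dhmm.m, but it is only an illustration of the algorithm, not a drop-in replacement for Murphy's function (no support for multiple sequences, Dirichlet priors, or the act/adjustment flags):

    import numpy as np

    def baum_welch(obs, prior, transmat, obsmat, max_iter=100, thresh=1e-4):
        """EM for a discrete-output HMM on a single observation sequence.

        obs      : 1-D int array of observation symbols
        prior    : (K,)   prior[i]      = Pr(Q(1)=i)
        transmat : (K, K) transmat[i,j] = Pr(Q(t+1)=j | Q(t)=i)
        obsmat   : (K, M) obsmat[i,o]   = Pr(Y(t)=o | Q(t)=i)
        Returns the log-likelihood per iteration and the updated parameters.
        """
        T, K = len(obs), len(prior)
        loglik_trace = []
        for _ in range(max_iter):
            # E-step: scaled forward pass.
            alpha = np.zeros((T, K))
            scale = np.zeros(T)
            alpha[0] = prior * obsmat[:, obs[0]]
            scale[0] = alpha[0].sum()
            alpha[0] /= scale[0]
            for t in range(1, T):
                alpha[t] = (alpha[t - 1] @ transmat) * obsmat[:, obs[t]]
                scale[t] = alpha[t].sum()
                alpha[t] /= scale[t]

            # E-step: scaled backward pass.
            beta = np.zeros((T, K))
            beta[-1] = 1.0
            for t in range(T - 2, -1, -1):
                beta[t] = transmat @ (obsmat[:, obs[t + 1]] * beta[t + 1])
                beta[t] /= scale[t + 1]

            # Posterior state probabilities gamma(t, i) = Pr(Q(t)=i | obs).
            gamma = alpha * beta
            gamma /= gamma.sum(axis=1, keepdims=True)

            # Expected transition counts, summed over time.
            xi_sum = np.zeros((K, K))
            for t in range(T - 1):
                xi = (alpha[t][:, None] * transmat *
                      (obsmat[:, obs[t + 1]] * beta[t + 1])[None, :])
                xi_sum += xi / xi.sum()

            # M-step: re-estimate parameters from expected counts.
            prior = gamma[0].copy()
            transmat = xi_sum / xi_sum.sum(axis=1, keepdims=True)
            obsmat = np.zeros_like(obsmat)
            for o in range(obsmat.shape[1]):
                obsmat[:, o] = gamma[obs == o].sum(axis=0)
            obsmat /= obsmat.sum(axis=1, keepdims=True)

            loglik = np.log(scale).sum()
            loglik_trace.append(loglik)
            if len(loglik_trace) > 1 and abs(loglik - loglik_trace[-2]) < thresh:
                break
        return loglik_trace, prior, transmat, obsmat

With the toy arrays from your question you would call LL, prior, transmat, obsmat = baum_welch(data, prior0, transmat0, obsmat0), and LL should be non-decreasing from one iteration to the next, which is a useful sanity check when you compare against the MATLAB output. For real work I would still use an existing, tested library rather than this sketch.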