Let’s practice this equation with our dishonest casino example. Suppose our initial
probabilities are a_0,Fair = ½ and a_0,Loaded = ½, and we have a sequence of rolls x = 1, 2, 1, 5,
6, 2, 1, 5, 2, 4. Then the likelihood of the path π = Fair, Fair, Fair, Fair, Fair, Fair, Fair, Fair,
Fair, Fair (that is, the joint probability P(x, π)) is
P(start at Fair) P(1 | Fair) P(Fair | Fair) P(2 | Fair) P(Fair | Fair) … P(4 | Fair)
= ½ (1/6)^10 (0.95)^9 = 0.00000000521158647211 ≈ 5.2 × 10^-9,
where ½ is the start probability, 1/6 is the emission probability of each face from the fair
die, and 0.95 is the transition probability from the Fair state to the Fair state.
With the same sequence of rolls, the likelihood of the path π = Loaded, Loaded, Loaded, Loaded,
Loaded, Loaded, Loaded, Loaded, Loaded, Loaded is
P(start at Loaded) P(1 | Loaded) P(Loaded | Loaded) … P(4 | Loaded)
= ½ (1/10)^9 (1/2)^1 (0.95)^9 = 0.00000000015756235243 ≈ 0.16 × 10^-9,
where the first ½ is the start probability, 1/10 is the emission probability of each non-six
face from the loaded die (nine of the rolls), the second ½ is the emission probability of rolling
a six from the loaded die (one of the rolls), and 0.95 is the transition probability from the
Loaded state to the Loaded state.
Therefore, it is somewhat more likely (by a factor of about 33) that all the rolls were made
with the fair die than that they were all made with the loaded die.
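As a sanity check on this arithmetic, here is a minimal Python sketch (the function name joint_prob and the table layout are illustrative assumptions, not code from these notes) that multiplies out the joint probability P(x, π) for a constant path:

# Dishonest-casino HMM parameters as used in the worked example above.
START = {"Fair": 0.5, "Loaded": 0.5}    # a_0,Fair = a_0,Loaded = 1/2
STAY = {"Fair": 0.95, "Loaded": 0.95}   # probability of staying in the same state
EMIT = {
    "Fair": {r: 1/6 for r in range(1, 7)},                # fair die: uniform over 1..6
    "Loaded": {**{r: 1/10 for r in range(1, 6)}, 6: 1/2}, # loaded die: six has probability 1/2
}

def joint_prob(rolls, state):
    """P(x, pi) when every position of the path pi is the same state."""
    p = START[state] * EMIT[state][rolls[0]]
    for r in rolls[1:]:
        p *= STAY[state] * EMIT[state][r]
    return p

x1 = [1, 2, 1, 5, 6, 2, 1, 5, 2, 4]
print(joint_prob(x1, "Fair"))    # ≈ 5.2e-09
print(joint_prob(x1, "Loaded"))  # ≈ 1.6e-10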
Now change the sequence of rolls to x = 1, 6, 6, 5, 6, 2, 6, 6, 3, 6. The likelihood of the path
π = F, F, …, F is again ½ (1/6)^10 (0.95)^9 ≈ 5.2 × 10^-9, the same as before, since the fair
die emits every face with probability 1/6. But the likelihood of the path π = L, L, …, L is now
½ (1/10)^4 (1/2)^6 (0.95)^9 ≈ 4.9 × 10^-7, because this sequence contains six 6s and only four
other rolls. This time, it is roughly 100 times more likely that the die is loaded.
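Re-using joint_prob from the sketch above, the same check reproduces the factor of roughly 100 for the six-heavy sequence:

x2 = [1, 6, 6, 5, 6, 2, 6, 6, 3, 6]
print(joint_prob(x2, "Fair"))                             # ≈ 5.2e-09, unchanged
print(joint_prob(x2, "Loaded"))                           # ≈ 4.9e-07
print(joint_prob(x2, "Loaded") / joint_prob(x2, "Fair"))  # ≈ 95, i.e. about 100×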
7. Overview of the Three Main Questions on HMMs
We can define the three main questions on HMMs as follows:
Evaluation: What is the probability of the observed sequence?
o Given: an HMM M and a sequence x
o Find: Prob[ x | M ]
o Algorithm: Forward (a minimal sketch follows this list)
o Example of dishonest casino: How likely is this sequence, given our
model of how the casino works?
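For the Evaluation question, the following is a minimal sketch of the Forward algorithm on the two-state casino model, re-using the START and EMIT tables from the earlier sketch (the function name forward and the 0.95/0.05 transition matrix are assumptions based on the self-transition probability stated above, not code from these notes). Instead of fixing one path π, it sums P(x, π) over all paths:

def forward(rolls, states=("Fair", "Loaded")):
    """Forward algorithm: returns P(x | M), summing over all state paths."""
    # Transition matrix: stay with probability 0.95, switch with probability 0.05.
    trans = {s: {t: (0.95 if s == t else 0.05) for t in states} for s in states}
    # Initialization: f_k(1) = a_0,k * e_k(x_1)
    f = {k: START[k] * EMIT[k][rolls[0]] for k in states}
    # Recursion: f_l(i) = e_l(x_i) * sum_k f_k(i-1) * a_kl
    for r in rolls[1:]:
        f = {l: EMIT[l][r] * sum(f[k] * trans[k][l] for k in states) for l in states}
    # Termination: P(x | M) = sum_k f_k(N)
    return sum(f.values())

print(forward([1, 2, 1, 5, 6, 2, 1, 5, 2, 4]))  # total probability of the observed rolls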