In this video, I'll talk about ergodic Markov chains.
First, I'll introduce some new concepts about Markov chains.
In a Markov chain, a path from state i to state j is a sequence of transitions that
begins in i and ends in j, such that each transition in the sequence has a positive
probability.
A state j is reachable from state i if there is a path leading from i to j.
Two states i and j communicate if j is reachable from i and i is reachable from j.
A set of states S in a Markov chain is a closed set if no state outside of S is reachable
from any state in S. Let's see an example.
This Markov chain has 5 states.
This is the state transition matrix.
In this example, state 3 is reachable from state 1 via the path 1–2–3.
But state 5 is not reachable from state 1.
There is no path from 1 to 5.
States 1 and 2 communicate.
We can go from 1 to 2 and from 2 to 1.
Set S1, which contains the three states 1, 2, and 3, is a closed set.
Set S2, which contains the two states 4 and 5, is also a closed set.
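These definitions translate directly into a short search over the positive-probability transitions. Here is a minimal Python sketch; the matrix P is a hypothetical stand-in with the same zero/nonzero structure as the example on the slide, and states are 0-indexed, so state 1 is row 0:

```python
import numpy as np

# Hypothetical stand-in for the 5-state transition matrix on the slide;
# the exact probabilities are assumptions, only the zero/nonzero pattern matters here.
P = np.array([
    [0.5, 0.5, 0.0, 0.0, 0.0],
    [0.3, 0.4, 0.3, 0.0, 0.0],
    [0.0, 0.6, 0.4, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.0, 0.7, 0.3],
])

def reachable(P, i, j):
    """True if state j is reachable from state i through positive-probability transitions."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        if s == j:
            return True
        for t in np.nonzero(P[s] > 0)[0]:
            if int(t) not in seen:
                seen.add(int(t))
                stack.append(int(t))
    return False

def communicate(P, i, j):
    """States communicate when each is reachable from the other."""
    return reachable(P, i, j) and reachable(P, j, i)

def is_closed(P, S):
    """S is closed if no state outside S is reachable from any state in S."""
    return all(not reachable(P, i, j)
               for i in S for j in range(len(P)) if j not in S)

print(reachable(P, 0, 2))       # True: the path 1-2-3
print(reachable(P, 0, 4))       # False: no path from 1 to 5
print(communicate(P, 0, 1))     # True: 1 and 2 communicate
print(is_closed(P, {0, 1, 2}))  # True: S1 is closed
print(is_closed(P, {3, 4}))     # True: S2 is closed
```

Note that the search only looks at which entries are positive, not at their exact values.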
A state i is an absorbing state if p_ii = 1.
That means the probability of the transition from i to itself is 100%.
If we enter an absorbing state, we will never leave the state.
An absorbing state also forms a closed set containing only one state.
Here is an example.
In this Markov chain, both the first state and the last state are absorbing states.
Other states are not absorbing states.
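The check itself is one comparison on the diagonal of the transition matrix. A minimal sketch, again with a hypothetical matrix whose first and last states are absorbing:

```python
import numpy as np

# Hypothetical matrix shaped like this example: the first and last states
# are absorbing, the middle ones are not (probabilities are assumptions).
P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.4, 0.2, 0.4, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.4, 0.2, 0.4],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])

def is_absorbing(P, i):
    """State i is absorbing if p_ii = 1."""
    return P[i, i] == 1.0

print([i + 1 for i in range(len(P)) if is_absorbing(P, i)])  # [1, 5]
```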
A state i is a transient state if there exists a state j that is reachable from i, but state
i is not reachable from state j.
This means that once you leave a transient state, it is possible that you will never return to it.
If a state is not transient, then it is called a recurrent state.
In this Markov chain, there are 3 transient states, 2, 3, and 4.
For example, we can go from 3 to 2, and then from 2 to 1.
Then we get trapped in state 1 and will never come back to state 3 again.
The other two states, 1 and 5, are not transient, so they are recurrent states.
In this Markov chain, the three states on the left are all transient states.
If we go from one of these three states to one of the states on the right, we will never
come back to the left.
The three states on the right are all recurrent states.
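The definition of a transient state also gives a direct test: collect every state reachable from i, and check whether any of them cannot reach i back. A sketch under the same assumptions as before (hypothetical probabilities, 0-indexed states):

```python
import numpy as np

def reachable_set(P, i):
    """All states reachable from i through positive-probability transitions."""
    seen, stack = {i}, [i]
    while stack:
        s = stack.pop()
        for t in np.nonzero(P[s] > 0)[0]:
            if int(t) not in seen:
                seen.add(int(t))
                stack.append(int(t))
    return seen

def is_transient(P, i):
    """i is transient if some state reachable from i cannot reach i back."""
    return any(i not in reachable_set(P, j) for j in reachable_set(P, i))

# Hypothetical matrix for the first example: states 1 and 5 are absorbing.
P = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],
    [0.4, 0.2, 0.4, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.4, 0.2, 0.4],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])
print([i + 1 for i in range(5) if is_transient(P, i)])      # [2, 3, 4]
print([i + 1 for i in range(5) if not is_transient(P, i)])  # [1, 5]
```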
A state i is periodic with period k, where k > 1, if k is the largest number such that every path leading from state i back to state i has a length that is a multiple of k.
In other words, k is the greatest common divisor of all return path lengths.
Absorbing states are aperiodic.
If we can return to a recurrent state at irregular times, so that the return lengths share no common divisor greater than 1, the state is aperiodic.
In this example, we have three states.
They are all recurrent states.
There is no absorbing state.
All states are periodic with a period of 3.
For example, starting from state 1, we need 3, 6, or 9 steps to come back to state 1.
Every return path length is a multiple of 3.
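Since the period is the greatest common divisor of the return lengths, it can be estimated by scanning the diagonal of successive matrix powers. A sketch, assuming a deterministic 1 -> 2 -> 3 -> 1 cycle for the stand-in matrix and an arbitrary cutoff n_max:

```python
import math
import numpy as np

def period(P, i, n_max=50):
    """gcd of the return lengths n <= n_max with (P^n)_ii > 0
    (a heuristic with an arbitrary cutoff, not an exact algorithm)."""
    g, Pn = 0, np.eye(len(P))
    for n in range(1, n_max + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            g = math.gcd(g, n)
    return g

# Stand-in for this example: a deterministic three-state cycle 1 -> 2 -> 3 -> 1.
P = np.array([
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 0.0, 0.0],
])
print([period(P, i) for i in range(3)])  # [3, 3, 3]
```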
Now, let's take a look at this example.
We know that the absorbing states 1 and 5 are not periodic.
The transient states 2, 3, and 4 are not periodic, because we may never come back to these states.
Similarly, in this example, the transient states 1, 2, and 3 are not periodic.
But the recurrent states 4, 5, and 6 are periodic and the period is 2.
If we start from any of these states, we can return to the state at time 2, 4, 6, and so
on.
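Reusing the `period` sketch above on a hypothetical two-state cycle that stands in for the recurrent part of this example gives period 2:

```python
# Reuses period() and the numpy import from the sketch above.
P2 = np.array([
    [0.0, 1.0],
    [1.0, 0.0],
])
print([period(P2, i) for i in range(2)])  # [2, 2]
```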
If all states in a Markov chain are recurrent, aperiodic, and communicate with each other,
then this Markov chain is ergodic.
Ergodic Markov chains have some special properties, which will be introduced in another video.
Let's see some examples.
Is this Markov chain ergodic?
No, because it has some transient states, 2, 3, and 4.
And state 1 cannot communicate with state 5.
Is this Markov chain ergodic?
No, because it also has some transient states, 1, 2, and 3.
And some states cannot communicate with each other.
For example, we can go from 1 to 2, but not from 2 to 1.
Is the Markov chain represented by states 1, 2, and 3 ergodic?
Yes.
None of the states is transient.
No state is periodic.
And every state can communicate with other states.
What about the Markov chain represented by states 4 and 5?
Yes, it is ergodic for the same reasons.
What about the Markov chain represented by states 1 through 5?
No, it is not ergodic, because states 1, 2, and 3 cannot communicate with states 4 and
5.
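Putting the pieces together: for a finite chain, if all states communicate, then every state is automatically recurrent, so it suffices to check irreducibility and aperiodicity. A sketch with hypothetical matrices mirroring the last two questions:

```python
import math
import numpy as np

def is_ergodic(P, n_max=50):
    """Sketch: a finite chain is ergodic if it is irreducible and aperiodic;
    irreducibility already implies every state is recurrent."""
    n = len(P)
    # Irreducible: (I + P)^n is everywhere positive iff all states communicate.
    if not (np.linalg.matrix_power(np.eye(n) + P, n) > 0).all():
        return False
    # Aperiodic: the gcd of the return lengths of state 0 reaches 1;
    # in an irreducible chain, every state has the same period.
    g, Pk = 0, np.eye(n)
    for k in range(1, n_max + 1):
        Pk = Pk @ P
        if Pk[0, 0] > 0:
            g = math.gcd(g, k)
            if g == 1:
                return True
    return False

# Hypothetical chain like states 1-3 above: all states communicate,
# and the self-loops make it aperiodic.
P3 = np.array([
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
])
# Hypothetical 5-state chain with two closed sets, states 1-3 and 4-5.
P5 = np.array([
    [0.5, 0.5, 0.0, 0.0, 0.0],
    [0.3, 0.4, 0.3, 0.0, 0.0],
    [0.0, 0.6, 0.4, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.0, 0.7, 0.3],
])
print(is_ergodic(P3))  # True
print(is_ergodic(P5))  # False: states 1-3 cannot communicate with 4-5
```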
Okay, that is about ergodic Markov chains.
Thanks for watching.