Properties of Markov Chains

Jul 22, 2024

Introduction

  • Welcome and invitation to subscribe
  • Follow-up to a previous video on fundamentals of Markov Chains
  • Links provided so new viewers can catch up on the prior video

Simple Markov Chain Example

  • An example chain drawn without its transition probabilities labeled
  • Each arrow indicates a non-zero transition probability between two states
  • The outgoing probabilities from any state sum to 1 (see the sketch after this list)
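
A minimal sketch of such a chain as a transition matrix in Python. The video leaves the probabilities unlabeled, so the values in `P` below are assumptions chosen only to match the drawn arrows: non-zero entries play the role of arrows, and each row sums to 1.

```python
# Illustrative 3-state chain; the actual probabilities are unlabeled in the video,
# so these values are assumptions chosen only to reflect which arrows exist.
P = [
    [0.5, 0.5, 0.0],  # state 0: arrows to states 0 and 1
    [0.0, 0.5, 0.5],  # state 1: arrows to states 1 and 2
    [0.0, 0.5, 0.5],  # state 2: arrows to states 1 and 2
]

for state, row in enumerate(P):
    assert abs(sum(row) - 1.0) < 1e-12, f"row {state} does not sum to 1"
    targets = [j for j, p in enumerate(row) if p > 0]  # the "arrows" out of this state
    print(f"state {state}: arrows to {targets}, row sum = {sum(row)}")
```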

Properties of States

Transient State

  • Example with state 0
  • Random walk starting at state 0
  • Starting from state 0, the probability of ever returning to state 0 is strictly less than 1
  • Such a state is called a transient state (estimated in the sketch after this list)
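
A quick Monte Carlo check of this claim under the same assumed probabilities as in the earlier sketch. The helper `returns_to` is hypothetical: it runs one capped random walk and reports whether state 0 is ever revisited. With these values the estimate comes out near 0.5, i.e. strictly less than 1.

```python
import random

# Assumed probabilities, as in the earlier sketch.
P = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.0, 0.5, 0.5],
]

def returns_to(start, P, max_steps=200):
    """Run one random walk from `start`; report whether it ever revisits `start`."""
    state = start
    for _ in range(max_steps):
        state = random.choices(range(len(P)), weights=P[state])[0]
        if state == start:
            return True
    return False

runs = 20_000
hits = sum(returns_to(0, P) for _ in range(runs))
print(f"estimated return probability to state 0: {hits / runs:.3f}")  # about 0.5, i.e. < 1
```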

Recurrent State

  • Example with state 1
  • A random walk starting at state 1 returns to state 1 with probability 1 (see the sketch after this list)
  • Such a state is called a recurrent state; the same argument applies to state 2
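
The same kind of simulation for state 1, again under the assumed probabilities. A walk started at state 1 can only bounce between states 1 and 2, so every run comes back; the hypothetical helper `first_return_time` simply records when.

```python
import random

# Assumed probabilities, as before.
P = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.0, 0.5, 0.5],
]

def first_return_time(start, P, max_steps=10_000):
    """Steps until the walk first revisits `start`, or None if it never does within the cap."""
    state = start
    for step in range(1, max_steps + 1):
        state = random.choices(range(len(P)), weights=P[state])[0]
        if state == start:
            return step
    return None

runs = 10_000
times = [first_return_time(1, P) for _ in range(runs)]
print("walks that returned to state 1:", sum(t is not None for t in times), "of", runs)
# Every walk returns (probability 1); the same reasoning applies to state 2.
```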

Reducible and Irreducible Chains

Reducibility

  • State 0 can never be revisited once the chain moves to state 1 or 2
  • The Markov chain is therefore reducible (checked in the sketch after this list)
  • Adding a single edge from state 2 back to state 0 changes this
  • With that edge, every state becomes reachable from every other state (an irreducible chain)
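
A reachability sketch over the assumed matrix from before. Only the arrow structure (which entries are non-zero) matters here; the hypothetical `reachable` helper just does a graph search over those arrows.

```python
# Same assumed probabilities as before; only the arrow structure matters here.
P = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.0, 0.5, 0.5],
]

def reachable(start, P):
    """Graph search over the arrows (non-zero entries) of the chain."""
    seen, frontier = {start}, [start]
    while frontier:
        i = frontier.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                frontier.append(j)
    return seen

for s in range(len(P)):
    print(f"reachable from state {s}: {sorted(reachable(s, P))}")
# State 0 never appears in the sets for states 1 and 2, so the chain is reducible.
```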

Irreducibility

  • Explained with the modified example (extra edge from state 2 to state 0), as verified in the sketch after this list
  • In an irreducible chain, every state can be reached from every other state
  • A reducible chain can instead be broken into smaller, self-contained irreducible chains
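
The same reachability check on a modified matrix `P_mod` that adds one edge from state 2 back to state 0 (the 0.2 weight and the renormalized row are assumptions). Every state now reaches every other state, so the chain is irreducible.

```python
# Modified chain: one extra arrow from state 2 back to state 0.
# The weights are assumptions; row 2 is renormalized so it still sums to 1.
P_mod = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.2, 0.4, 0.4],  # new edge 2 -> 0
]

def reachable(start, P):
    seen, frontier = {start}, [start]
    while frontier:
        i = frontier.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                frontier.append(j)
    return seen

n = len(P_mod)
irreducible = all(reachable(s, P_mod) == set(range(n)) for s in range(n))
print("irreducible:", irreducible)  # True: every state reaches every other state
```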

Communicating Classes

Example: Gambler’s Ruin

  • States 0 and 3 are self-contained: once entered, the chain never leaves them
  • States 1 and 2 communicate with each other (each is reachable from the other) but not with 0 or 3
  • Communication partitions the states into classes: three in this example, {0}, {1, 2}, and {3}
  • These classes are known as communicating classes (computed in the sketch after this list)
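
A sketch that recovers the three classes for a four-state Gambler's Ruin chain. The fair-coin probabilities are an assumption; the grouping itself only uses mutual reachability, which is exactly the definition of communication.

```python
# Gambler's Ruin with four states and fair coin flips (the 0.5 values are an assumption).
P = [
    [1.0, 0.0, 0.0, 0.0],  # state 0: absorbing
    [0.5, 0.0, 0.5, 0.0],  # state 1: down to 0 or up to 2
    [0.0, 0.5, 0.0, 0.5],  # state 2: down to 1 or up to 3
    [0.0, 0.0, 0.0, 1.0],  # state 3: absorbing
]

def reachable(start, P):
    seen, frontier = {start}, [start]
    while frontier:
        i = frontier.pop()
        for j, p in enumerate(P[i]):
            if p > 0 and j not in seen:
                seen.add(j)
                frontier.append(j)
    return seen

# Two states communicate when each is reachable from the other;
# grouping states by mutual reachability yields the communicating classes.
n = len(P)
reach = [reachable(s, P) for s in range(n)]
classes = []
for s in range(n):
    cls = {t for t in range(n) if t in reach[s] and s in reach[t]}
    if cls not in classes:
        classes.append(cls)
print("communicating classes:", classes)  # [{0}, {1, 2}, {3}]
```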

Conclusion

  • Invitation to comment for more videos
  • Reminder to subscribe
  • Thanks to viewers