📈

Understanding Discrete Time Markov Chains

Mar 7, 2025

Lecture Notes: Stochastic Processes - Discrete Time Markov Chains

Introduction

  • Presenter: Dylan Wong, a senior math major at the University of Nevada, Reno.
  • Purpose: To share complex math concepts that have limited accessible resources online.
  • Focus: Stochastic processes.

Stochastic Processes

  • First topic: Discrete Time Markov Chains.

What is a Markov Chain?

  • Definition: A system with a set of states (nodes) and fixed probabilities of transitioning between them, where the next state depends only on the current state (the Markov property).
  • Example: Choosing among restaurants A, B, C, and D, with various transition probabilities between them.

Questions and Probabilities in a Markov Chain

  • Define x_t = state at time t.
  • Key questions (formalized in the display below):
    • What is the probability that x_t equals a given state s, given that the chain starts at A?
    • As t approaches infinity, what is the probability of being at a particular restaurant?
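
In symbols, with s ranging over the states A, B, C, D, these two questions read:

```latex
P(x_t = s \mid x_0 = A)
\qquad \text{and} \qquad
\lim_{t \to \infty} P(x_t = s \mid x_0 = A)
```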

Transition Matrix

  • Purpose: Represents probabilities of moving from one state to another.
  • Structure: Rows represent starting states; columns represent ending states.
  • Example Matrix (written out in the sketch after this list):
    • From A: 0.2 to A, 0.8 to B, 0 to C/D.
    • From B: 0.5 to A, 0.5 to D, 0 to B/C.
    • From C: Always stays at C.
    • From D: Always goes to A.
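
The example matrix can be encoded directly. A minimal sketch in Python with NumPy, assuming only the state ordering A, B, C, D:

```python
import numpy as np

# Transition matrix for the restaurant example (rows = starting state,
# columns = ending state, ordered A, B, C, D as in the bullets above).
T = np.array([
    [0.2, 0.8, 0.0, 0.0],  # from A: 0.2 to A, 0.8 to B
    [0.5, 0.0, 0.0, 0.5],  # from B: 0.5 to A, 0.5 to D
    [0.0, 0.0, 1.0, 0.0],  # from C: always stays at C
    [1.0, 0.0, 0.0, 0.0],  # from D: always goes to A
])

# Each row is a probability distribution over next states, so it must sum to 1.
assert np.allclose(T.sum(axis=1), 1.0)
```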

Types of States in Markov Chains

  • Transient States: Visited only a finite number of times; the probability of ever returning is less than 1.
  • Recurrent States: Once entered, visited infinitely often; the probability of returning is 1.
  • Irreducible: Every state can be reached from every other state.
  • Reducible: Some states cannot reach, or be reached from, others (see the sketch after this list).
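
Irreducibility can be checked mechanically: a chain is irreducible exactly when the directed graph of its nonzero transitions is strongly connected. A minimal sketch using SciPy, applied to the restaurant matrix above:

```python
import numpy as np
from scipy.sparse.csgraph import connected_components

def is_irreducible(T):
    # The chain is irreducible iff the graph with an edge i -> j wherever
    # T[i, j] > 0 forms a single strongly connected component.
    n_components, _ = connected_components(T > 0, directed=True,
                                           connection='strong')
    return n_components == 1

T = np.array([[0.2, 0.8, 0.0, 0.0],
              [0.5, 0.0, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 0.0]])
print(is_irreducible(T))  # False: C is absorbing, so it never reaches A, B, or D
```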

Periodicity

  • The period of a state is the greatest common divisor of all step counts at which the chain can return to that state; a state with period 1 is called aperiodic.
  • Example: State C can return to itself in a single step (it always stays put), so its period is 1 (see the sketch after this list).
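
The period can be computed straight from this definition. A minimal sketch that truncates the search at a fixed number of steps (the cutoff is a practical assumption, not part of the definition):

```python
import math
import numpy as np

def period(T, i, max_steps=50):
    # gcd of all step counts n <= max_steps at which a return to state i
    # is possible, i.e. (T^n)[i, i] > 0. math.gcd(0, n) == n, so g starts at 0.
    g = 0
    Tn = np.eye(len(T))
    for n in range(1, max_steps + 1):
        Tn = Tn @ T
        if Tn[i, i] > 0:
            g = math.gcd(g, n)
    return g

T = np.array([[0.2, 0.8, 0.0, 0.0],
              [0.5, 0.0, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 0.0]])
print(period(T, 2))  # 1: state C returns to itself in one step
```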

Solving Probability Distributions

  • Example: P(x_1 = A | x_0 = A) = 0.2, read directly from the transition matrix.
  • Distribution: Multiplying by the transition matrix once per step gives the distribution after t steps.
  • Matrix Power: The distribution P(x_t | x_0 = A) is the row of T^t corresponding to A (see the sketch after this list).
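
A minimal sketch of the matrix-power computation with NumPy; row 0 corresponds to state A under the ordering assumed above:

```python
import numpy as np

T = np.array([[0.2, 0.8, 0.0, 0.0],
              [0.5, 0.0, 0.0, 0.5],
              [0.0, 0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0, 0.0]])

# The distribution of x_t given x_0 = A is row A of T^t.
t = 3
Tt = np.linalg.matrix_power(T, t)
print(Tt[0])  # [P(x_3=A), P(x_3=B), P(x_3=C), P(x_3=D)] starting from A
```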

Stationary Distribution

  • Definition: A distribution π that is unchanged by a step of the chain, i.e. πT = π; for an irreducible, aperiodic chain it is also the limiting distribution as t approaches infinity.
  • Finding: π is a left eigenvector of T with eigenvalue 1, so solve an eigenvector/eigenvalue problem.
  • Normalization: Scale the eigenvector so its entries sum to 1.
  • Example: A new transition matrix in the lecture leads to a stationary distribution of 1/3 for every state (see the sketch after this list).
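
A minimal sketch of the eigenvector computation. These notes do not reproduce the lecture's new matrix, so the 3-state matrix below is an illustrative stand-in; it is doubly stochastic (columns also sum to 1), which is enough to force the uniform stationary distribution of 1/3 per state:

```python
import numpy as np

# Illustrative stand-in matrix, not the lecture's exact example.
T = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

# A stationary distribution pi satisfies pi @ T = pi, i.e. pi is a left
# eigenvector of T with eigenvalue 1. np.linalg.eig returns right
# eigenvectors, so apply it to the transpose.
vals, vecs = np.linalg.eig(T.T)
k = np.argmin(np.abs(vals - 1.0))  # eigenvalue closest to 1
pi = np.real(vecs[:, k])
pi /= pi.sum()                     # normalize so entries sum to 1
print(pi)                          # ~[0.3333, 0.3333, 0.3333]
```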

Discrete vs Continuous

  • Discrete time: The chain changes state at countable, fixed time steps.
  • Continuous time: In many real-life systems, transitions can happen at any moment; continuous time processes are covered next.

Wrap Up

  • Next Steps: Explore continuous time Markov processes.
  • Conclusion: Markov chains provide a mathematical framework for understanding state transitions over time.
  • Engagement: Encourage feedback and ideas for future topics.