Introduction to Markov Chains

A Markov chain is a sequence of random states in which the probability of the next state depends only on the current state, not on any prior history.
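Formally, writing $X_0, X_1, X_2, \dots$ for the sequence of states (notation introduced here for illustration), this memoryless property says:

$$
\Pr(X_{n+1} = j \mid X_n = i,\, X_{n-1} = i_{n-1},\, \dots,\, X_0 = i_0) \;=\; \Pr(X_{n+1} = j \mid X_n = i).
$$

In words: once the current state $X_n$ is known, the earlier states carry no additional information about $X_{n+1}$.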

Shown below are two ways to represent a Markov chain: a transition matrix and its corresponding state-transition diagram.
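As a quick concrete sketch of the matrix representation, here is a hypothetical two-state weather chain in Python (the states and probabilities are invented for illustration, not taken from the figures below):

```python
import numpy as np

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# Entry P[i, j] is the probability of moving from state i to state j.
P = np.array([
    [0.9, 0.1],   # from sunny: stay sunny 90%, turn rainy 10%
    [0.5, 0.5],   # from rainy: turn sunny 50%, stay rainy 50%
])

# Each row is a probability distribution over the next states,
# so every row must sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)
```

The corresponding state-transition diagram would draw one node per state and one arrow per nonzero entry of the matrix, labeled with that entry's probability.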

I will also explain and prove a theorem about the powers of a transition matrix.
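The theorem in question is presumably the standard n-step result: the $(i, j)$ entry of $P^n$ gives the probability of moving from state $i$ to state $j$ in exactly $n$ steps. As a numerical preview under that assumption, continuing the hypothetical chain above:

```python
import numpy as np

# Same hypothetical two-state weather chain as above.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The entries of the matrix power P^n give the n-step transition
# probabilities: (P^n)[i, j] is the chance of being in state j
# exactly n steps after starting in state i.
P3 = np.linalg.matrix_power(P, 3)
print(P3[0, 1])  # probability of sunny -> rainy in three steps

# A power of a transition matrix is again a transition matrix:
# every row still sums to 1.
assert np.allclose(P3.sum(axis=1), 1.0)
```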