Markov Chain
- Sequence of random variables (X_n) such that X_{n+1} depends only on X_n (the Markov property)
- Discrete time
- Stochastic process without memory
- Finite interval : [0, 1, …, n]
- Right-infinite : n = 1, 2, 3, …
- Doubly (left-right) infinite : all integers, so there is no start
- Markov initial distribution : the distribution of the first state X_0
- Markov transition kernel : P(X_{n+1} = j | X_n = i), the rule for moving between states
- Markov for continuous distributions : the kernel becomes a conditional density p(x_{n+1} | x_n) and sums become integrals
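The pieces above (initial distribution plus transition kernel) are enough to simulate a discrete-time, finite-state chain. A minimal sketch, where `simulate_chain` and its arguments are illustrative names, not from any particular library:

```python
import random

def simulate_chain(init, kernel, n_steps, seed=0):
    """Simulate a discrete-time Markov chain on states 0..k-1.

    init:    list of probabilities for the initial state X_0.
    kernel:  row-stochastic matrix, kernel[i][j] = P(X_{n+1}=j | X_n=i).
    Returns the visited states [X_0, ..., X_{n_steps}].
    """
    rng = random.Random(seed)
    states = list(range(len(init)))
    # Draw X_0 from the initial distribution.
    x = rng.choices(states, weights=init)[0]
    path = [x]
    for _ in range(n_steps):
        # Memoryless step: the next state depends only on the current one.
        x = rng.choices(states, weights=kernel[x])[0]
        path.append(x)
    return path

# Deterministic two-state example: start in state 0, then alternate 0 -> 1 -> 0 ...
path = simulate_chain(init=[1.0, 0.0],
                      kernel=[[0.0, 1.0],
                              [1.0, 0.0]],
                      n_steps=5)
print(path)  # [0, 1, 0, 1, 0, 1]
```

For a continuous state space the same loop applies, except each step samples from a conditional density instead of a weighted choice over finitely many states.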