# Markov Chains

WIP. Reference: https://en.wikipedia.org/wiki/Markov_chain

**State space**: the set of all configurations of a system.
- countable (or finite) state space
- continuous or general state space

**Time index**:
- discrete time: a discrete-time Markov chain on a countable/finite state space
- continuous time: [ ]
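As a concrete illustration of the discrete-time, finite-state case, here is a minimal sketch (a hypothetical two-state weather chain, not taken from the reference above) that samples a trajectory from a row-stochastic transition matrix:

```python
import random

# Hypothetical example: a discrete-time Markov chain on the
# finite state space {"sunny", "rainy"}.
STATES = ["sunny", "rainy"]

# TRANSITION[i][j] = probability of moving from STATES[i] to STATES[j];
# each row sums to 1 (row-stochastic).
TRANSITION = [
    [0.8, 0.2],  # from sunny
    [0.4, 0.6],  # from rainy
]

def step(state_idx, rng):
    """Sample the next state index using the current state's row."""
    return rng.choices(range(len(STATES)), weights=TRANSITION[state_idx])[0]

def simulate(n_steps, start_idx=0, seed=0):
    """Run the chain for n_steps and return the visited state names."""
    rng = random.Random(seed)
    idx = start_idx
    path = [STATES[idx]]
    for _ in range(n_steps):
        idx = step(idx, rng)
        path.append(STATES[idx])
    return path

print(simulate(5))
```

The Markov property is visible in `step`: the next state depends only on the current state index, never on the earlier history of the path.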