A "memoryless" discrete stochastic process.
A Markov chain is a discrete stochastic process in which the conditional probability of moving from state $x_t$ to state $x_{t+1}$ depends only on $x_t$, not on the sequence $x_0, x_1, \ldots, x_{t-1}$ of preceding states: $P(x_{t+1} \mid x_0, x_1, \ldots, x_t) = P(x_{t+1} \mid x_t)$. The chain is characterized by a transition matrix: conventionally, each row corresponds to a possible state and contains the conditional probabilities of transitioning to every state (including itself), so each row sums to 1. Properties of the chain, such as the distribution over states after $t$ steps, can then be computed from the transition matrix.
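A minimal sketch of these ideas in Python with NumPy, using a hypothetical three-state chain (the states and probabilities are illustrative, not from the source): each row of the matrix holds the transition probabilities out of one state, and multiplying a distribution by powers of the matrix gives the state distribution after several steps.

```python
import numpy as np

# Hypothetical 3-state transition matrix (illustrative values).
# Row i holds the conditional probabilities of moving from state i
# to each state, including i itself; each row must sum to 1.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.4, 0.4],
])
assert np.allclose(P.sum(axis=1), 1.0)  # rows are probability distributions

# The distribution over states after t steps is the initial (row-vector)
# distribution multiplied by the t-th power of the transition matrix.
pi0 = np.array([1.0, 0.0, 0.0])           # start in state 0 with certainty
pi3 = pi0 @ np.linalg.matrix_power(P, 3)  # distribution after 3 steps

# Repeated multiplication approximates a stationary distribution,
# i.e. a pi satisfying pi = pi @ P (when the chain converges to one).
pi = pi0
for _ in range(1000):
    pi = pi @ P
```

Here `pi3` sums to 1 like any distribution, and `pi` is (approximately) unchanged by a further transition step, which is the defining property of a stationary distribution.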