markoff chain

/M AA1 R K AO0 F CH EY0 N/
noun
  1.

    a Markov process for which the parameter takes discrete time values
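
The defining property — discrete time steps, with the next state depending only on the current state — can be sketched in Python. The states and transition probabilities below are invented purely for illustration:

```python
import random

# A two-state Markov chain over hypothetical weather states, advanced
# at discrete time steps. The transition probabilities are made up.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """One discrete time step: the next state depends only on the current one."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def walk(start, n, seed=0):
    """Generate a trajectory of n steps starting from `start`."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states

print(walk("sunny", 5))
```

Because each step consults only the current state, the chain is memoryless: the full history of the walk never influences the next transition.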