markov chain mixing time

/M AA1 R K AO0 F CH EY1 N M IH1 K S IH0 NG T AY1 M/
N
  1.

    In probability theory, the mixing time of a Markov chain is the number of steps until the chain's distribution is "close" to its stationary (steady-state) distribution.
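
    The definition can be illustrated with a minimal sketch (not part of the entry): for a hypothetical two-state chain, iterate the transition matrix and report the first step at which the total variation distance to the stationary distribution drops to 1/4, a conventional threshold for "close".

    ```python
    # Sketch: mixing time of a small Markov chain via total variation distance.
    # The transition matrix P and threshold eps=0.25 are illustrative choices.

    def step(dist, P):
        """Advance a distribution one step: dist @ P for a 2x2 chain."""
        return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

    def total_variation(p, q):
        """TV distance: half the L1 distance between two distributions."""
        return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

    def mixing_time(P, pi, start, eps=0.25):
        """Smallest t with TV(dist_t, pi) <= eps, starting from state `start`."""
        dist = [1.0 if i == start else 0.0 for i in range(2)]
        t = 0
        while total_variation(dist, pi) > eps:
            dist = step(dist, P)
            t += 1
        return t

    P = [[0.9, 0.1], [0.2, 0.8]]   # hypothetical transition matrix
    pi = [2 / 3, 1 / 3]            # its stationary distribution (pi P = pi)
    print(mixing_time(P, pi, start=1))  # → 3
    ```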