Markoff chain

Noun
  1. A Markov process in which the parameter takes discrete time values (synset 113532571)
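
For illustration, a minimal Python sketch of such a process (the states "A" and "B" and the transition probabilities are made up for this example; they are not part of the entry). The defining property is that each discrete step depends only on the current state:

    import random

    # Hypothetical two-state chain: each row lists (next_state, probability).
    transitions = {
        "A": [("A", 0.9), ("B", 0.1)],
        "B": [("A", 0.5), ("B", 0.5)],
    }

    def step(state):
        """Sample the next state using only the current state's row."""
        r = random.random()
        cumulative = 0.0
        for next_state, p in transitions[state]:
            cumulative += p
            if r < cumulative:
                return next_state
        return next_state  # guard against floating-point rounding

    state = "A"
    chain = [state]
    for _ in range(10):      # ten discrete time steps
        state = step(state)
        chain.append(state)
    print(chain)             # e.g. ['A', 'A', 'A', 'B', 'B', 'A', ...]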
