Markov process

Noun
  1. A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (synset 113532710)
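The defining "memoryless" property can be seen in a small simulation. Below is a minimal sketch (the two weather states and their transition probabilities are invented for illustration): each step samples the next state using only the present state, never the earlier history.

```python
import random

# Hypothetical two-state Markov process: "sunny" / "rainy".
# Transition probabilities depend only on the present state,
# not on how that state was reached.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(state, rng):
    """Sample the next state given only the present state."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions and return the full path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path

print(simulate("sunny", 5, seed=1))
```

Note that `next_state` receives only the current state as input; passing the whole history would change nothing, which is exactly the Markov property.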
