markoff chain
Noun
-
A Markov process for which the parameter takes discrete time values (synset 113532571)
is a type of: markoff process, markov process - a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state
same as: markov chain
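To illustrate the "depends only on the present state" property for discrete time steps, here is a minimal sketch of a discrete-time Markov chain in Python; the two weather states and their transition probabilities are arbitrary examples chosen for demonstration, not part of the definition.

```python
import random

# Illustrative two-state chain; states and probabilities are made up for the example.
# transition[s] gives the probability of each next state given the current state s.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state (the Markov property)."""
    next_states = list(transition[state])
    weights = [transition[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps discrete time steps, recording each visited state."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

if __name__ == "__main__":
    print(simulate("sunny", 10))
```

Note that `step` never looks at how the chain arrived at its current state; the distribution of the next state is determined entirely by the current one.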