Markov process
Noun
- A simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (synset 113532710)
- is a type of: stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
- subtypes: Markoff chain, Markov chain - a Markov process in which the parameter takes discrete time values
- same as: Markoff process
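As a brief sketch of the defining condition in the discrete-time case (the Markov chain subtype above), the Markov property can be written, for states x_0 through x_n and any candidate next state x:

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
\]

That is, conditioning on the entire history of the process yields the same distribution over the next state as conditioning on the present state alone.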