markoff process
Noun
is a type of: stochastic process - a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
subtypes: markoff chain, markov chain - a Markov process for which the parameter is discrete time values
same as: markov process
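The discrete-time subtype (a Markov chain) can be illustrated with a minimal sketch: the next state depends only on the current state, via fixed transition probabilities. The two-state "sunny"/"rainy" model below is a hypothetical example, not part of the definition above.

```python
import random

# Hypothetical two-state Markov chain: each state maps to a list of
# (next_state, probability) pairs. The probabilities in each row sum to 1.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state using only the current state (the Markov property)."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs, k=1)[0]

def simulate(start, n_steps):
    """Run the chain for n_steps of discrete time, returning the visited states."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because the parameter here is discrete time (step 0, 1, 2, ...), this is a Markov chain; a general Markov process allows a continuous time parameter.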