Markov chain

noun
1.
A Markov process whose parameter takes discrete time values. Synonym: Markoff chain.

WordNet 3.0 © 2010 Princeton University
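
Illustrative sketch (not part of the WordNet entry): a minimal Python simulation of a discrete-time Markov chain. The two-state weather model, its state names, and its transition probabilities are assumptions invented for this example.

import random

# Hypothetical transition probabilities P(next state | current state);
# the numbers are made up for illustration.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # Markov property: the next state depends only on the current state,
    # not on the path taken to reach it.
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n):
    # The time parameter advances in discrete steps, which is what makes
    # this Markov process a Markov chain.
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))  # e.g. ['sunny', 'sunny', 'rainy', 'rainy', 'sunny', 'sunny']

Each run samples a different path, but over many steps a chain like this one settles into a stable long-run fraction of time spent in each state.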

Words linked to "Markov chain": Markoff process, Markov process


