A Markov process in which the time parameter takes discrete values. Synonym: Markov chain.
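Since the defining property is that the next state depends only on the current state, the idea can be sketched with a small simulation. The following is a minimal illustration, not from the source: the two-state weather model and its transition probabilities are hypothetical.

```python
import random

# Hypothetical two-state weather model: each row gives the
# transition probabilities out of a state, and each row sums to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state using only the current state's row."""
    row = TRANSITIONS[state]
    return rng.choices(list(row), weights=list(row.values()))[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps discrete time steps."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], rng))
    return states
```

Because time advances in discrete steps and the transition probabilities never look further back than the current state, this is a Markov chain in the sense of the definition above.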
Words linked to "Markoff chain": Markov chain, Markoff process, Markov process