Markov process — Markov process, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) a random process in which the probabilities of states in a series depend only on the properties of the immediately preceding state or the next… … The Collaborative International Dictionary of English
Markov process — [mär′kôf] n. a chain of random events in which only the present state influences the next future state, as in a genetic code: also Markoff process … English World dictionary
Markov process — In probability theory and statistics, a Markov process, named after the Russian mathematician Andrey Markov, is a time-varying random phenomenon for which a specific property (the Markov property) holds. In a common description, a stochastic… … Wikipedia
Markov process — noun a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state • Syn: ↑Markoff process • Hypernyms: ↑stochastic process • Hyponyms: ↑Markov chain,… … Useful english dictionary
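All of these entries describe the same Markov property; as a point of reference, here is a minimal sketch of that property in LaTeX for a discrete-time process (X_n), with the notation chosen purely for illustration:

```latex
% Markov property (discrete time): the conditional law of the next state,
% given the entire history, depends only on the present state.
\[
  \Pr\bigl(X_{n+1} = x \mid X_n = x_n,\, X_{n-1} = x_{n-1},\, \dots,\, X_0 = x_0\bigr)
  = \Pr\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]
```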
Markov process — Markovo vyksmas [Lithuanian]; status: term; field: physics; equivalents: English: Markov process; Markovian process; German: Markow Prozeß, m; Markowscher Prozeß, m; Russian: марковский процесс, m; процесс Маркова, m; French: processus de Markoff, m; processus marcovien, m;… … Fizikos terminų žodynas
Markov process — noun Date: 1938 a stochastic process (as Brownian motion) that resembles a Markov chain except that the states are continuous; also: Markov chain; called also Markoff process … New Collegiate Dictionary
Markov process — noun A stochastic process in which the conditional probability distribution of future states, given the present state, is independent of the path of past states. See Also: Markov property, Markov chain … Wiktionary
Markov process — A stochastic process in which, given the present state, the probability of an event in the future is not affected by the past history of events … Dictionary of molecular biology
Continuous-time Markov process — In probability theory, a continuous-time Markov process is a stochastic process { X(t) : t ≥ 0 } that satisfies the Markov property and takes values from a set called the state space; it is the continuous-time version of a Markov chain. The… … Wikipedia
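For the continuous-time process { X(t) : t ≥ 0 } mentioned in this entry, the Markov property is usually written with ordered time points; a sketch in LaTeX, with the time and state symbols assumed here for illustration:

```latex
% Continuous-time Markov property: for any times 0 <= t_1 < t_2 < ... < t_n < t,
\[
  \Pr\bigl(X(t) = x \mid X(t_n) = x_n,\, \dots,\, X(t_1) = x_1\bigr)
  = \Pr\bigl(X(t) = x \mid X(t_n) = x_n\bigr)
\]
```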
Semi-Markov process — A continuous-time stochastic process is called a semi-Markov process or Markov renewal process if the embedded jump chain (the discrete process registering what values the process takes) is a Markov chain, and where the holding times (time… … Wikipedia
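The construction in this last entry (a Markov embedded jump chain plus a holding time drawn in each visited state) is concrete enough to simulate; below is a minimal Python sketch in which the two-state jump probabilities and holding-time distributions are invented for illustration, not taken from the entry. With exponential holding times the result is a continuous-time Markov chain; the non-exponential holding time used for one state makes it only semi-Markov.

```python
import random

# Embedded jump chain: transition probabilities between states "A" and "B".
# (Values are illustrative only.)
jump_probs = {
    "A": [("A", 0.2), ("B", 0.8)],
    "B": [("A", 0.6), ("B", 0.4)],
}

# Holding-time distributions per state. Exponential holding times would give a
# continuous-time Markov chain; the uniform holding time in state "B" makes the
# overall process semi-Markov rather than Markov.
def holding_time(state):
    if state == "A":
        return random.expovariate(1.0)   # exponential, mean 1.0
    return random.uniform(0.5, 2.0)      # non-exponential

def simulate(start="A", horizon=10.0):
    """Return the trajectory as a list of (jump time, new state) records."""
    t, state, path = 0.0, start, [(0.0, start)]
    while t < horizon:
        t += holding_time(state)          # wait in the current state
        nxt, = random.choices(            # then jump according to the embedded chain
            [s for s, _ in jump_probs[state]],
            weights=[p for _, p in jump_probs[state]],
        )
        state = nxt
        path.append((t, state))
    return path

if __name__ == "__main__":
    for time_point, state in simulate():
        print(f"t = {time_point:6.3f}  state = {state}")
```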