 stochastic process

In probability theory, a family of random variables indexed by some other set and having the property that, for each finite subset of the index set, the collection of random variables indexed by it has a joint probability distribution. It is one of the most widely studied subjects in probability. Examples include Markov processes (in which the present value of the variable depends only upon the immediate past and not upon the whole sequence of past events), such as stock market fluctuations, and time series (in which temperature or rainfall measurements, for example, are taken at the same time each day over several days).
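The Markov property described above can be illustrated with a simple symmetric random walk, one of the most basic Markov processes. This is an illustrative sketch, not from the entry itself; the function name and parameters are chosen for the example.

```python
import random

def random_walk(steps, seed=0):
    """Simulate a simple symmetric random walk.

    At each step the position moves up or down by 1 with equal
    probability. The next value depends only on the current value,
    not on the whole sequence of past events: the Markov property.
    """
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice([-1, 1])
        path.append(position)
    return path

walk = random_walk(10)
print(walk)
```

Each index here is a time step, so the walk is also a discrete-time stochastic process: the family of random variables is the position at each step.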
* * *
in probability theory, a process involving the operation of chance. For example, in radioactive decay every atom is subject to a fixed probability of breaking down in any given time interval. More generally, a stochastic process refers to a family of random variables indexed by some other variable or set of variables. It is one of the most general objects of study in probability. Some basic types of stochastic processes include Markov processes, Poisson processes (such as radioactive decay), and time series, with the index variable referring to time. This indexing can be either discrete or continuous, the interest being in the nature of changes of the variables with respect to time.

* * *
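The radioactive-decay example above, in which every atom has a fixed probability of breaking down in each time interval, can be sketched as a discrete-time simulation. This is a minimal illustration under assumed parameter names, not part of the original entry.

```python
import random

def decay_counts(n_atoms, p_decay, n_steps, seed=0):
    """Simulate radioactive decay in discrete time steps.

    Each surviving atom independently breaks down with fixed
    probability p_decay in every time interval, so the number of
    decays per interval is a chance-driven (stochastic) quantity.
    Returns the count of surviving atoms after each step.
    """
    rng = random.Random(seed)
    remaining = n_atoms
    history = [remaining]
    for _ in range(n_steps):
        decayed = sum(1 for _ in range(remaining) if rng.random() < p_decay)
        remaining -= decayed
        history.append(remaining)
    return history

hist = decay_counts(1000, 0.1, 5)
print(hist)
```

In the continuous-time limit, the decay times of individual atoms follow an exponential distribution and the total count of decays forms a Poisson process, the type named in the entry.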
Universalium. 2010.