Stationary random process

an important special class of random processes (see Random process), often encountered in applications of probability theory to various branches of natural science and technology. A random process X(t) is called stationary if none of its probabilistic characteristics change with time t (thus, for example, the probability distribution of the value X(t) is the same for all t, the joint probability distribution of the values X(t1) and X(t2) depends only on the length of the time interval t2 − t1, i.e. the distributions of the pairs {X(t1), X(t2)} and {X(t1 + s), X(t2 + s)} coincide for any t1, t2 and s, and so on).
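More formally (a standard restatement of the definition above), strict stationarity requires that all finite-dimensional distributions be invariant under a time shift:

    P{X(t1) ≤ x1, …, X(tn) ≤ xn} = P{X(t1 + s) ≤ x1, …, X(tn + s) ≤ xn}

for every n, all moments t1, …, tn, every shift s, and all x1, …, xn.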

The scheme of a stationary random process describes, to a good approximation, many real phenomena accompanied by disordered fluctuations. For example, pulsations of current or voltage in an electrical circuit (electrical "noise") can be regarded as a stationary random process if the circuit is in a stationary regime, i.e. if all its macroscopic characteristics and all the conditions causing the current to flow through it do not change in time; velocity pulsations at a point of a turbulent flow constitute a stationary random process if the general conditions generating the flow do not change (i.e. the flow is steady), and so on. These and other examples of stationary random processes encountered in physics (in particular, in geophysics and astrophysics), mechanics and technology have stimulated the development of research on stationary random processes; at the same time, certain generalizations of the concept have also proved important (for example, the concepts of a random process with stationary increments of a given order, a generalized stationary random process, and a homogeneous random field).

In the mathematical theory of stationary random processes the main role is played by the moments of the probability distribution of the process values X(t), which are the simplest numerical characteristics of these distributions. The moments of the first two orders are especially important: the mean value of the stationary random process EX(t) = m, i.e. the mathematical expectation of the random variable X(t), and the correlation function of the stationary random process EX(t1)X(t2) = B(t2 − t1), i.e. the mathematical expectation of the product X(t1)X(t2) (which is simply expressed in terms of the variance of the values X(t) and the correlation coefficient between X(t1) and X(t2); see Correlation). Many mathematical studies of stationary random processes consider only those of their properties that are completely determined by the characteristics m and B(τ) alone, where τ = t2 − t1 (the so-called correlation theory of stationary random processes). In this connection, random processes X(t) having a constant mean value EX(t) = m and a correlation function B(t2, t1) = EX(t1)X(t2) that depends only on t2 − t1 are often called stationary random processes in the wide sense (while the more special random processes, all of whose characteristics do not change with time, are then called stationary random processes in the narrow sense).
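For illustration, the following minimal Python sketch estimates the mean m and the correlation function B(τ) from a single simulated realization of a wide-sense stationary process; the AR(1) model with parameter phi and all variable names are assumptions made only for this example.

import numpy as np

# One realization of the AR(1) process x[k] = phi * x[k-1] + e[k],
# which is wide-sense stationary for |phi| < 1.
rng = np.random.default_rng(0)
phi, n = 0.7, 100_000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0] / np.sqrt(1 - phi**2)   # start from the stationary distribution
for k in range(1, n):
    x[k] = phi * x[k - 1] + e[k]

# Sample estimate of the mean m = EX(t).
m_hat = x.mean()

# Sample estimate of B(tau) = E[(X(t) - m)(X(t + tau) - m)] for small lags tau.
xc = x - m_hat
def b_hat(tau):
    return np.mean(xc[: n - tau] * xc[tau:])

# For this AR(1) model the theoretical value is B(tau) = phi**tau / (1 - phi**2).
for tau in range(5):
    print(tau, round(b_hat(tau), 3), round(phi**tau / (1 - phi**2), 3))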

An important place in the mathematical theory of social science. are occupied by studies based on the expansion of a random process X(t) and its correlation function B ( t 2 -t 1) = In (τ) in the Fourier integral, or Fourier - Stieltjes (see. Fourier integral). The main role here is played by Khinchin’s theorem, according to which the correlation function of the system. p. X(t) can always be represented in the form

where F(λ) is a monotonically non-decreasing function of λ (and the integral on the right is a Stieltjes integral); if B(τ) decreases sufficiently rapidly as |τ| → ∞ (as is most often the case in applications, provided that X(t) is actually understood to mean the difference X(t) − m), then the integral on the right-hand side of (1) turns into an ordinary Fourier integral:

    B(τ) = ∫_{−∞}^{+∞} e^{iλτ} f(λ) dλ,    (2)
where f(λ) = F′(λ) is a non-negative function. The function F(λ) is called the spectral function of the stationary random process X(t), and the function f(λ) [in the cases where equality (2) holds] its spectral density. It also follows from Khinchin's theorem that the process X(t) itself admits a spectral decomposition of the form

    X(t) = ∫_{−∞}^{+∞} e^{iλt} dZ(λ),    (3)
where Z(λ) is a random function with uncorrelated increments, and the integral on the right is understood as the mean-square limit of the corresponding sequence of integral sums. The decomposition (3) gives grounds for regarding any stationary random process X(t) as a superposition of mutually uncorrelated harmonic oscillations of various frequencies with random amplitudes and phases; the spectral function F(λ) and the spectral density f(λ) then determine the distribution of the average energy of the harmonic oscillations entering into X(t) over the frequency spectrum λ (for this reason, in applied research the function f(λ) is often also called the energy spectrum or power spectrum of the stationary random process X(t)).
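As a numerical illustration of the relation between B(τ) and f(λ), the Python sketch below estimates the power spectrum of the same assumed AR(1) example by averaging periodograms over independent segments (a simple Bartlett-type estimate) and compares it with the known theoretical spectral density; all names and parameters are illustrative.

import numpy as np

rng = np.random.default_rng(1)
phi = 0.7
n_seg, seg_len = 200, 1024

# Average periodograms of independent AR(1) segments to estimate f(lambda).
acc = np.zeros(seg_len // 2 + 1)
for _ in range(n_seg):
    e = rng.standard_normal(seg_len)
    x = np.empty(seg_len)
    x[0] = e[0] / np.sqrt(1 - phi**2)
    for k in range(1, seg_len):
        x[k] = phi * x[k - 1] + e[k]
    acc += np.abs(np.fft.rfft(x)) ** 2 / (2 * np.pi * seg_len)   # periodogram ordinates
f_hat = acc / n_seg

# Theoretical spectral density of this AR(1) process (unit innovation variance):
# f(lambda) = 1 / (2*pi*(1 - 2*phi*cos(lambda) + phi**2)),  -pi <= lambda <= pi.
lam = 2 * np.pi * np.fft.rfftfreq(seg_len)
f_theory = 1.0 / (2 * np.pi * (1 - 2 * phi * np.cos(lam) + phi**2))

print(f_hat[:4])
print(f_theory[:4])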

The identification of the concept of a stationary random process and the first mathematical results concerning it are due to E. E. Slutsky (see Slutsky) and date back to the late 1920s and early 1930s. Subsequently, important work on the theory of stationary random processes was carried out by A. Ya. Khinchin, A. N. Kolmogorov, H. Cramér, N. Wiener, and others.

Lit.: Slutsky E. E., Izbrannye trudy (Selected Works), M., 1960; Khinchin A. Ya., Theory of correlation of stationary stochastic processes, "Uspekhi Matematicheskikh Nauk", 1938, issue 5, pp. 42-51; Rozanov Yu. A., Stationary Random Processes, M., 1963; Prokhorov Yu. V., Rozanov Yu. A., Probability Theory (Basic Concepts. Limit Theorems. Random Processes), 2nd ed., M., 1973; Gikhman I. I., Skorokhod A. V., Theory of Random Processes, vol. 1, M., 1971; Hannan E., Multiple Time Series, translated from English, M., 1974.

A. M. Yaglom.


Great Soviet Encyclopedia. M.: Soviet Encyclopedia, 1969-1978.

Definition

A family of random variables

    X_t(·): Ω → ℝ,    t ∈ T,

defined on a common probability space, where T is an arbitrary set, is called a random function.

Terminology

This terminological classification is not strict. In particular, the term "random process" is often used as a full synonym of the term "random function".

Classification

  • A random process X(t) is called a process with discrete time if the system in which it runs changes its state only at moments of time t1, t2, …, whose number is finite or countable. A random process is called a process with continuous time if transitions from state to state can occur at any moment of time.
  • A random process is called a process with continuous states if the value of the random process is a continuous random variable. A random process is called a process with discrete states if the value of the random process is a discrete random variable.
  • A random process is called stationary if all multidimensional distribution laws depend only on the relative positions of the moments of time t1, t2, …, tn and not on the values of these moments themselves. In other words, a random process is called stationary if its probabilistic regularities do not change with time. Otherwise it is called non-stationary.
  • A random function is called stationary in the wide sense if its mathematical expectation and variance are constant and its autocorrelation function (ACF) depends only on the difference between the moments of time at which the ordinates of the random function are taken. The concept was introduced by A. Ya. Khinchin.
  • A random process is called a process with stationary increments of a certain order if the probabilistic regularities of such increments do not change with time. Such processes were considered by Yaglom.
  • If the ordinates of a random function obey the normal (Gaussian) distribution law, the function itself is called normal.
  • Random functions whose law of distribution of ordinates at a future moment of time is completely determined by the value of the process ordinate at the present moment and does not depend on the values of the process ordinates at previous moments are called Markovian.
  • A random process is called a process with independent increments if for any set t1, t2, …, tn, where n > 2 and t1 < t2 < … < tn, the random variables (X(t2) − X(t1)), (X(t3) − X(t2)), …, (X(tn) − X(tn−1)) are jointly independent.
  • If, in determining the moment functions of a stationary random process, averaging over the statistical ensemble can be replaced by averaging over time, then such a stationary random process is called ergodic (see the sketch after this list).
  • Among random processes, impulse random processes are distinguished as a separate class.
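The sketch below illustrates the ergodic property in Python under an assumed AR(1) model (all names and parameters are illustrative): for an ergodic stationary process, the time average along one long realization is close to the ensemble average over many independent realizations.

import numpy as np

rng = np.random.default_rng(2)
phi = 0.7

def ar1(n):
    # One realization of a stationary (and ergodic) AR(1) process.
    e = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = e[0] / np.sqrt(1 - phi**2)
    for k in range(1, n):
        x[k] = phi * x[k - 1] + e[k]
    return x

# Time average of X(t) along a single long trajectory ...
time_avg = ar1(200_000).mean()

# ... versus the ensemble average of X(t) at one fixed moment over many trajectories.
ensemble_avg = np.mean([ar1(200)[-1] for _ in range(5_000)])

print(time_avg, ensemble_avg)   # both are close to the true mean m = 0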

Trajectory of a random process

Let a random process {X_t}_{t ∈ T} be given. Then for each fixed t ∈ T the quantity X_t is a random variable, called a cross-section of the process. If an elementary outcome ω ∈ Ω is fixed, then the mapping t ↦ X_t(ω), T → ℝ, is a deterministic function of the parameter t. This function is called a trajectory, or realization, of the random function {X_t}.
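A small Python sketch of this distinction (the white-noise model and all names are purely illustrative): rows of the simulated matrix are trajectories (ω fixed), columns are samples of the cross-sections X_t (t fixed).

import numpy as np

rng = np.random.default_rng(3)
n_paths, n_times = 4, 6

# Each row is one trajectory t -> X_t(omega) of a discrete-time white-noise process;
# each column collects values of the cross-section X_t over different outcomes omega.
paths = rng.standard_normal((n_paths, n_times))

trajectory_0 = paths[0, :]     # realization for one fixed elementary outcome
cross_section_2 = paths[:, 2]  # samples of the random variable X_2

print(trajectory_0)
print(cross_section_2)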
