
Stationary process


In mathematics and statistics, a stationary process (or a strict/strictly stationary process or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as mean and variance also do not change over time.

Since stationarity is an assumption underlying many statistical procedures used in time series analysis, non-stationary data are often transformed to become stationary. The most common cause of violation of stationarity is a trend in the mean, which can be due either to the presence of a unit root or of a deterministic trend. In the former case of a unit root, stochastic shocks have permanent effects, and the process is not mean-reverting. In the latter case of a deterministic trend, the process is called a trend-stationary process, and stochastic shocks have only transitory effects after which the variable tends toward a deterministically evolving (non-constant) mean.

A trend stationary process is not strictly stationary, but can easily be transformed into a stationary process by removing the underlying trend, which is solely a function of time. Similarly, processes with one or more unit roots can be made stationary through differencing. An important type of non-stationary process that does not include a trend-like behavior is a cyclostationary process, which is a stochastic process that varies cyclically with time.

For many applications strict-sense stationarity is too restrictive. Other forms of stationarity such as wide-sense stationarity or N-th-order stationarity are then employed. The definitions for different kinds of stationarity are not consistent among different authors (see Other terminology).

Strict-sense stationarity

Definition

Formally, let $\{X_t\}$ be a stochastic process and let $F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau})$ represent the cumulative distribution function of the unconditional (i.e., with no reference to any particular starting value) joint distribution of $\{X_t\}$ at times $t_1+\tau,\ldots,t_n+\tau$. Then, $\{X_t\}$ is said to be strictly stationary, strongly stationary or strict-sense stationary if

$$F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau}) = F_X(x_{t_1},\ldots,x_{t_n}) \quad \text{for all } \tau, t_1,\ldots,t_n \in \mathbb{R} \text{ and for all } n \in \mathbb{N}_{>0}$$ (Eq.1)

Since $\tau$ does not affect $F_X(\cdot)$, $F_X$ is independent of time.

Examples

[Figure: two simulated time series, one stationary and one non-stationary. The augmented Dickey–Fuller (ADF) test statistic is reported for each process; non-stationarity cannot be rejected for the second process at a 5% significance level.]
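
As a minimal sketch of how such a comparison can be reproduced (the series, sample size, and seed below are illustrative assumptions, not the article's original data), the ADF test is available in statsmodels:

```python
# Minimal sketch: ADF test on one stationary and one non-stationary series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
n = 500
eps = rng.standard_normal(n)

ar1 = np.empty(n)                  # AR(1) with |phi| < 1: stationary
ar1[0] = eps[0]
for t in range(1, n):
    ar1[t] = 0.5 * ar1[t - 1] + eps[t]

random_walk = np.cumsum(eps)       # unit-root process: non-stationary

for name, series in [("AR(1)", ar1), ("random walk", random_walk)]:
    stat, pvalue, *rest = adfuller(series)
    print(f"{name}: ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")
# The AR(1) p-value is small (a unit root is rejected); the random-walk
# p-value is large, so non-stationarity cannot be rejected at the 5% level.
```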

White noise is the simplest example of a stationary process.

An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with continuous sample space include some autoregressive and moving average processes which are both subsets of the autoregressive moving average model. Models with a non-trivial autoregressive component may be either stationary or non-stationary, depending on the parameter values, and important non-stationary special cases are where unit roots exist in the model.

Example 1

Let $Y$ be any scalar random variable, and define a time series $\{X_t\}$ by

$$X_t = Y \qquad \text{for all } t.$$

Then $\{X_t\}$ is a stationary time series, for which realisations consist of a series of constant values, with a different constant value for each realisation. A law of large numbers does not apply in this case, as the limiting value of an average from a single realisation takes the random value determined by $Y$, rather than taking the expected value of $Y$.

The time average of $X_t$ does not converge to the expected value of $Y$, since the process is not ergodic.
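
A short simulation sketch of this example (the Gaussian choice of $Y$ and all names below are illustrative assumptions):

```python
# Each realisation of {X_t} is one constant path X_t = Y, so its time average
# equals that realisation's draw of Y rather than E[Y] = 0: no ergodicity.
import numpy as np

rng = np.random.default_rng(1)
for _ in range(3):
    y = rng.standard_normal()      # one draw of Y per realisation
    x = np.full(1000, y)           # X_t = Y for all t
    print(f"time average = {x.mean():+.3f}, this realisation's Y = {y:+.3f}")
```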

Example 2

As a further example of a stationary process for which any single realisation has an apparently noise-free structure, let $Y$ have a uniform distribution on $[0, 2\pi]$ and define the time series $\{X_t\}$ by

$$X_t = \cos(t + Y) \quad \text{for } t \in \mathbb{R}.$$

Then $\{X_t\}$ is strictly stationary since $(t + Y)$ modulo $2\pi$ follows the same uniform distribution as $Y$ for any $t$.
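
A brief numerical sketch of this (seed and sample sizes are arbitrary choices), checking that the marginal distribution of $X_t$ is the same at two different times:

```python
# X_t = cos(t + Y), Y ~ Uniform[0, 2*pi]: each path is a shifted cosine, yet
# the marginal distribution of X_t is the same at every t (that of cos(U)).
import numpy as np

rng = np.random.default_rng(2)
u0 = rng.uniform(0.0, 2.0 * np.pi, size=200_000)
u5 = rng.uniform(0.0, 2.0 * np.pi, size=200_000)
x_at_0 = np.cos(0.0 + u0)          # samples of X_0
x_at_5 = np.cos(5.0 + u5)          # samples of X_5

print(f"means:     {x_at_0.mean():+.4f} vs {x_at_5.mean():+.4f}")   # both ~ 0
print(f"variances: {x_at_0.var():.4f} vs {x_at_5.var():.4f}")       # both ~ 0.5
```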

Example 3

Note that a weak white noise is not necessarily strictly stationary. Let $\omega$ be a random variable uniformly distributed in the interval $(0, 2\pi)$ and define the time series $\{z_t\}$ by

$$z_t = \cos(t\omega) \quad (t = 1, 2, \ldots)$$

Then

$$\begin{aligned}
\mathbb{E}(z_t) &= \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\,d\omega = 0, \\
\operatorname{Var}(z_t) &= \frac{1}{2\pi}\int_0^{2\pi} \cos^2(t\omega)\,d\omega = \frac{1}{2}, \\
\operatorname{Cov}(z_t, z_j) &= \frac{1}{2\pi}\int_0^{2\pi} \cos(t\omega)\cos(j\omega)\,d\omega = 0 \quad \forall\, t \neq j.
\end{aligned}$$

So $\{z_t\}$ is white noise in the weak sense (the mean and cross-covariances are zero, and the variances are all the same); however, it is not strictly stationary.
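
A Monte Carlo sketch of the moment calculations above, plus one higher-order moment that exposes the failure of strict stationarity (the specific third-moment comparison is an added illustration, not from the original text):

```python
# z_t = cos(t*omega), omega ~ Uniform(0, 2*pi): weak white noise, but the
# joint law of (z_t, z_{t+1}) depends on t, so strict stationarity fails.
import numpy as np

rng = np.random.default_rng(3)
omega = rng.uniform(0.0, 2.0 * np.pi, size=500_000)
z1, z2, z3 = np.cos(omega), np.cos(2 * omega), np.cos(3 * omega)

print(f"E(z_1)        ~ {z1.mean():+.4f}")            # ~ 0
print(f"Var(z_1)      ~ {z1.var():.4f}")              # ~ 0.5
print(f"Cov(z_1, z_2) ~ {np.mean(z1 * z2):+.4f}")     # ~ 0
# Third moments of the shifted pairs differ, so (z_1, z_2) and (z_2, z_3)
# do not share a joint distribution: ~0.25 versus ~0.
print(f"E(z_1^2 z_2)  ~ {np.mean(z1**2 * z2):+.4f}")
print(f"E(z_2^2 z_3)  ~ {np.mean(z2**2 * z3):+.4f}")
```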

Nth-order stationarity

In Eq.1, the distribution of $n$ samples of the stochastic process must be equal to the distribution of the samples shifted in time for all $n$. $N$-th-order stationarity is a weaker form of stationarity where this is only required for all $n$ up to a certain order $N$. A random process $\{X_t\}$ is said to be $N$-th-order stationary if:

$$F_X(x_{t_1+\tau},\ldots,x_{t_n+\tau}) = F_X(x_{t_1},\ldots,x_{t_n}) \quad \text{for all } \tau, t_1,\ldots,t_n \in \mathbb{R} \text{ and for all } n \in \{1,\ldots,N\}$$ (Eq.2)

Weak or wide-sense stationarity

Definition

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the first moment (i.e. the mean) and the autocovariance do not vary with respect to time and that the second moment is finite for all times. Any strictly stationary process which has a finite mean and covariance is also WSS.

So, a continuous-time random process $\{X_t\}$ which is WSS has the following restrictions on its mean function $m_X(t) \triangleq \operatorname{E}[X_t]$ and autocovariance function $K_{XX}(t_1, t_2) \triangleq \operatorname{E}[(X_{t_1} - m_X(t_1))(X_{t_2} - m_X(t_2))]$:

$$\begin{aligned}
& m_X(t) = m_X(t + \tau) && \text{for all } \tau, t \in \mathbb{R} \\
& K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\
& \operatorname{E}[|X_t|^2] < \infty && \text{for all } t \in \mathbb{R}
\end{aligned}$$ (Eq.3)

The first property implies that the mean function $m_X(t)$ must be constant. The second property implies that the autocovariance function depends only on the difference between $t_1$ and $t_2$ and only needs to be indexed by one variable rather than two. Thus, instead of writing

$$K_{XX}(t_1 - t_2, 0)$$

the notation is often abbreviated by the substitution $\tau = t_1 - t_2$:

$$K_{XX}(\tau) \triangleq K_{XX}(t_1 - t_2, 0)$$

This also implies that the autocorrelation depends only on $\tau = t_1 - t_2$, that is

$$R_X(t_1, t_2) = R_X(t_1 - t_2, 0) \triangleq R_X(\tau).$$

The third property says that the second moments must be finite for any time $t$.
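
To make the lag-only dependence concrete, here is a sketch that estimates $K_{XX}(\tau)$ from a simulated stationary AR(1) path (the model, coefficient, and estimator details are illustrative assumptions):

```python
# For a stationary AR(1) x_t = phi*x_{t-1} + e_t with unit-variance noise,
# the autocovariance depends only on the lag: K(tau) = phi^|tau| / (1 - phi^2).
import numpy as np

rng = np.random.default_rng(4)
phi, n = 0.7, 200_000
e = rng.standard_normal(n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

def autocov(x, tau):
    """Sample autocovariance at lag tau; under WSS it is a function of tau only."""
    xc = x - x.mean()
    return np.mean(xc[: len(xc) - tau] * xc[tau:])

for tau in range(4):
    print(f"K({tau}): sample ~ {autocov(x, tau):.3f}, "
          f"theory = {phi**tau / (1 - phi**2):.3f}")
```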

Motivation

The main advantage of wide-sense stationarity is that it places the time series in the context of Hilbert spaces. Let $H$ be the Hilbert space generated by $\{X_t\}$ (that is, the closure of the set of all linear combinations of these random variables in the Hilbert space of all square-integrable random variables on the given probability space). By the positive definiteness of the autocovariance function, it follows from Bochner's theorem that there exists a positive measure $\mu$ on the real line such that $H$ is isomorphic to the Hilbert subspace of $L^2(\mu)$ generated by $\{e^{-2\pi i\lambda\cdot t}\}$. This then gives the following Fourier-type decomposition for a continuous-time stationary stochastic process: there exists a stochastic process $\omega_\lambda$ with orthogonal increments such that, for all $t$

$$X_t = \int e^{-2\pi i \lambda \cdot t} \, d\omega_\lambda,$$

where the integral on the right-hand side is interpreted in a suitable (Riemann) sense. The same result holds for a discrete-time stationary process, with the spectral measure now defined on the unit circle.

When processing WSS random signals with linear, time-invariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (depends only on the difference between the two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable—all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
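
A sketch of this frequency-domain tractability using scipy (the filter coefficients and spectral-estimation settings are arbitrary illustrative choices): for a WSS input, the output power spectral density is the input PSD multiplied by $|H(f)|^2$.

```python
# Pass WSS white noise through an LTI filter and verify, up to estimation
# noise, that PSD_out(f) ~ |H(f)|^2 * PSD_in(f).
import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
x = rng.standard_normal(1_000_000)           # WSS white noise, flat PSD

b, a = [1.0, 0.5], [1.0, -0.6]               # an arbitrary stable LTI filter
y = signal.lfilter(b, a, x)

f, pxx = signal.welch(x, nperseg=4096)       # input PSD estimate
_, pyy = signal.welch(y, nperseg=4096)       # output PSD estimate
_, h = signal.freqz(b, a, worN=f, fs=1.0)    # H(f) at the same frequencies

ratio = pyy / (np.abs(h) ** 2 * pxx)         # should hover around 1
print(f"median ratio = {np.median(ratio):.3f}")
```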

Definition for complex stochastic process

In the case where $\{X_t\}$ is a complex stochastic process the autocovariance function is defined as $K_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1} - m_X(t_1))\overline{(X_{t_2} - m_X(t_2))}]$ and, in addition to the requirements in Eq.3, it is required that the pseudo-autocovariance function $J_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1} - m_X(t_1))(X_{t_2} - m_X(t_2))]$ depends only on the time lag. In formulas, $\{X_t\}$ is WSS if

$$\begin{aligned}
& m_X(t) = m_X(t + \tau) && \text{for all } \tau, t \in \mathbb{R} \\
& K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\
& J_{XX}(t_1, t_2) = J_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\
& \operatorname{E}[|X(t)|^2] < \infty && \text{for all } t \in \mathbb{R}
\end{aligned}$$ (Eq.4)

Joint stationarity

The concept of stationarity may be extended to two stochastic processes.

Joint strict-sense stationarity

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly strict-sense stationary if their joint cumulative distribution $F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t'_1},\ldots,y_{t'_n})$ remains unchanged under time shifts, i.e. if

$$F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t'_1},\ldots,y_{t'_n}) = F_{XY}(x_{t_1+\tau},\ldots,x_{t_m+\tau}, y_{t'_1+\tau},\ldots,y_{t'_n+\tau}) \quad \text{for all } \tau, t_1,\ldots,t_m, t'_1,\ldots,t'_n \in \mathbb{R} \text{ and for all } m, n \in \mathbb{N}$$ (Eq.5)

Joint (M + N)th-order stationarity

Two random processes $\{X_t\}$ and $\{Y_t\}$ are said to be jointly ($M$ + $N$)-th-order stationary if:

$$F_{XY}(x_{t_1},\ldots,x_{t_m}, y_{t'_1},\ldots,y_{t'_n}) = F_{XY}(x_{t_1+\tau},\ldots,x_{t_m+\tau}, y_{t'_1+\tau},\ldots,y_{t'_n+\tau}) \quad \text{for all } \tau, t_1,\ldots,t_m, t'_1,\ldots,t'_n \in \mathbb{R} \text{ and for all } m \in \{1,\ldots,M\}, n \in \{1,\ldots,N\}$$ (Eq.6)

Joint weak or wide-sense stationarity

Two stochastic processes $\{X_t\}$ and $\{Y_t\}$ are called jointly wide-sense stationary if they are both wide-sense stationary and their cross-covariance function $K_{XY}(t_1, t_2) = \operatorname{E}[(X_{t_1} - m_X(t_1))(Y_{t_2} - m_Y(t_2))]$ depends only on the time difference $\tau = t_1 - t_2$. This may be summarized as follows:

$$\begin{aligned}
& m_X(t) = m_X(t + \tau) && \text{for all } \tau, t \in \mathbb{R} \\
& m_Y(t) = m_Y(t + \tau) && \text{for all } \tau, t \in \mathbb{R} \\
& K_{XX}(t_1, t_2) = K_{XX}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\
& K_{YY}(t_1, t_2) = K_{YY}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R} \\
& K_{XY}(t_1, t_2) = K_{XY}(t_1 - t_2, 0) && \text{for all } t_1, t_2 \in \mathbb{R}
\end{aligned}$$ (Eq.7)
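
A small sketch of joint wide-sense stationarity (the delayed-noise construction and the lag convention in the helper below are assumed examples): if $Y_t$ is a delayed copy of $X_t$, the pair is jointly WSS and the cross-covariance depends only on the lag, peaking at the delay.

```python
# X is white noise and Y_t = X_{t-3}; both are WSS and their cross-covariance
# is a function of the lag alone, peaking at tau = 3.
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(100_000)
y = np.roll(x, 3)        # Y_t = X_{t-3} (circular shift; edge effect negligible)

def crosscov(x, y, tau):
    """Sample cross-covariance E[(X_t - mX)(Y_{t+tau} - mY)]."""
    xc, yc = x - x.mean(), y - y.mean()
    return np.mean(xc[: len(xc) - tau] * yc[tau:])

for tau in range(5):
    print(f"K_XY({tau}) ~ {crosscov(x, y, tau):+.3f}")   # ~1 at tau = 3, else ~0
```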

Relation between types of stationarity

  • If a stochastic process is $N$-th-order stationary, then it is also $M$-th-order stationary for all $M \leq N$.
  • If a stochastic process is second-order stationary ($N = 2$) and has finite second moments, then it is also wide-sense stationary.
  • If a stochastic process is wide-sense stationary, it is not necessarily second-order stationary.
  • If a stochastic process is strict-sense stationary and has finite second moments, it is wide-sense stationary.
  • If two stochastic processes are jointly (M + N)-th-order stationary, this does not guarantee that the individual processes are M-th- respectively N-th-order stationary.

Other terminology

The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

  • Priestley uses stationary up to order m if conditions similar to those given here for wide sense stationarity apply relating to moments up to order m. Thus wide sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.
  • Honarkhah and Caers also use the assumption of stationarity in the context of multiple-point geostatistics, where higher n-point statistics are assumed to be stationary in the spatial domain.

Differencing

One way to make some time series stationary is to compute the differences between consecutive observations. This is known as differencing. Differencing can help stabilize the mean of a time series by removing changes in the level of a time series, and so eliminating trends. This can also remove seasonality, if differences are taken appropriately (e.g. differencing observations one year apart to remove yearly seasonality).

Transformations such as logarithms can help to stabilize the variance of a time series.
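
A compact sketch of these transformations with numpy (the toy monthly series and the seasonal period of 12 are assumptions for illustration):

```python
# Differencing to remove trend and seasonality, and a log transform to
# stabilise variance; the toy monthly series below is purely illustrative.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(240)                                  # 20 years of monthly data
x = 10 + 0.05 * t + np.sin(2 * np.pi * t / 12) \
    + 0.2 * rng.standard_normal(t.size)

first_diff = np.diff(x)                             # removes the linear trend
seasonal_diff = x[12:] - x[:-12]                    # removes yearly seasonality
log_x = np.log(x)                                   # variance stabilisation (x > 0)
```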

One way of identifying non-stationary time series is the ACF plot. Sometimes, patterns will be more visible in the ACF plot than in the original time series; however, this is not always the case.
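
For instance, a sketch with statsmodels (the series and lag counts are arbitrary choices): the sample ACF of a unit-root series decays very slowly, while that of a stationary series drops off quickly.

```python
# Compare sample autocorrelations of a random walk and of white noise.
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(8)
eps = rng.standard_normal(500)
random_walk = np.cumsum(eps)

print(np.round(acf(random_walk, nlags=5), 2))   # stays near 1: non-stationary
print(np.round(acf(eps, nlags=5), 2))           # ~0 beyond lag 0: stationary
```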

Another approach to identifying non-stationarity is to look at the Laplace transform of a series, which will identify both exponential trends and sinusoidal seasonality (complex exponential trends). Related techniques from signal analysis such as the wavelet transform and Fourier transform may also be helpful.

References

  1. Park, Kun Il (2018). Fundamentals of Probability and Stochastic Processes with Applications to Communications. Springer. ISBN 978-3-319-68074-3.
  2. Florescu, Ionut (7 November 2014). Probability and Stochastic Processes. John Wiley & Sons. ISBN 978-1-118-59320-2.
  3. Priestley, M. B. (1981). Spectral Analysis and Time Series. Academic Press. ISBN 0-12-564922-3.
  4. Priestley, M. B. (1988). Non-linear and Non-stationary Time Series Analysis. Academic Press. ISBN 0-12-564911-8.
  5. Honarkhah, M.; Caers, J. (2010). "Stochastic Simulation of Patterns Using Distance-Based Pattern Modeling". Mathematical Geosciences. 42 (5): 487–517. Bibcode:2010MatGe..42..487H. doi:10.1007/s11004-010-9276-7.
  6. "8.1 Stationarity and differencing". OTexts. Retrieved 2016-05-18.
