
Wold's theorem

This article is about the theorem as used in time series analysis. For an abstract mathematical statement, see Wold decomposition.

In statistics, Wold's decomposition or the Wold representation theorem (not to be confused with the Wold theorem that is the discrete-time analog of the Wiener–Khinchin theorem), named after Herman Wold, says that every covariance-stationary time series Y_t can be written as the sum of two time series, one deterministic and one stochastic.

Formally

Y_t = Σ_{j=0}^{∞} b_j ε_{t-j} + η_t,

where:

  • Y_t is the time series being considered,
  • ε_t is an uncorrelated sequence which is the innovation process to the process Y_t – that is, a white noise process that is input to the linear filter {b_j},
  • b is the possibly infinite vector of moving average weights (coefficients or parameters),
  • η_t is a "deterministic" time series, in the sense that it is completely determined as a linear combination of its past values (see e.g. Anderson (1971), Ch. 7, Section 7.6.3, pp. 420–421). It may include "deterministic terms" like sine/cosine waves of t, but it is a stochastic process and is itself covariance-stationary; it cannot be an arbitrary deterministic process that violates stationarity. Both components are illustrated in the numerical sketch following this list.
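The decomposition can be illustrated with a short simulation. The sketch below (Python with NumPy; the weights b_j = 0.8^j, the sample length, the truncation point, and the seasonal period are illustrative choices, not part of the theorem) builds the stochastic component as a truncated moving average of white noise and the "deterministic" component as a sinusoid with random phase, which is covariance-stationary and perfectly predictable from its own past:

    import numpy as np

    rng = np.random.default_rng(0)         # illustrative seed
    T = 500                                # sample length (illustrative)

    # Stochastic component: truncated MA(infinity) with weights b_j = 0.8**j (b_0 = 1).
    b = 0.8 ** np.arange(200)              # square-summable weights, truncated at 200 terms
    eps = rng.normal(size=T)               # white-noise innovations epsilon_t
    stochastic = np.convolve(eps, b)[:T]   # sum_j b_j * eps_{t-j}, with eps_t = 0 for t < 0

    # "Deterministic" component: a sinusoid with random phase is covariance-stationary
    # and completely determined by its own past values.
    phase = rng.uniform(0, 2 * np.pi)
    eta = np.cos(2 * np.pi * np.arange(T) / 12 + phase)

    y = stochastic + eta                   # Y_t = sum_j b_j eps_{t-j} + eta_t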

The moving average coefficients have these properties:

  1. Stable, that is, square summable: Σ_{j=1}^{∞} |b_j|^2 < ∞
  2. Causal (i.e. there are no terms with j < 0)
  3. Minimum delay
  4. Constant (b_j independent of t)
  5. It is conventional to define b_0 = 1 (an AR(1) illustration of these properties follows this list)
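As a concrete check of these properties, consider a stationary AR(1) process Y_t = φ Y_{t-1} + ε_t with |φ| < 1; inverting it gives the Wold weights b_j = φ^j, which are causal, constant, square summable, and satisfy b_0 = 1. A minimal numerical sketch (Python with NumPy; the value of φ and the truncation point are illustrative):

    import numpy as np

    phi = 0.6                  # illustrative AR(1) coefficient, |phi| < 1 for stationarity
    b = phi ** np.arange(50)   # Wold weights b_j = phi**j, truncated for display

    print(b[:5])               # [1.  0.6  0.36  0.216  0.1296]  -> b_0 = 1, causal, constant
    print((b ** 2).sum())      # ~1.5625, i.e. 1 / (1 - phi**2): the weights are square summable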

This theorem can be considered an existence theorem: any stationary process has this seemingly special representation. Not only is the existence of such a simple linear and exact representation remarkable, but even more so is the special nature of the moving average model. Imagine creating a process that is a moving average but does not satisfy properties 1–4. For example, the coefficients b_j could define an acausal and non-minimum delay model. Nevertheless, the theorem assures the existence of a causal minimum delay moving average that exactly represents this process. How this all works for the case of causality and the minimum delay property is discussed in Scargle (1981), where an extension of the Wold decomposition is discussed.

The usefulness of the Wold Theorem is that it allows the dynamic evolution of a variable Y_t to be approximated by a linear model. If the innovations ε_t are independent, then the linear model is the only possible representation relating the observed value of Y_t to its past evolution. However, when ε_t is merely an uncorrelated but not independent sequence, then the linear model exists but it is not the only representation of the dynamic dependence of the series. In this latter case, the linear model may not be very useful, and a nonlinear model may better relate the observed value of Y_t to its past evolution. However, in practical time series analysis it is often the case that only linear predictors are considered, partly on the grounds of simplicity, in which case the Wold decomposition is directly relevant.
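The distinction between uncorrelated and independent innovations can be made concrete with a conditionally heteroskedastic sequence of ARCH type: it is white noise (uncorrelated), yet not independent, because its squares are predictable from the past. A sketch in Python with NumPy (the parameter values, seed, and simple lag-1 autocorrelation check are illustrative):

    import numpy as np

    rng = np.random.default_rng(1)    # illustrative seed
    T, a0, a1 = 20000, 0.2, 0.5       # illustrative ARCH(1) parameters

    eps = np.zeros(T)
    z = rng.normal(size=T)
    for t in range(1, T):
        sigma2 = a0 + a1 * eps[t - 1] ** 2   # conditional variance depends on the past
        eps[t] = np.sqrt(sigma2) * z[t]      # uncorrelated, but not independent, sequence

    def acf1(x):
        # lag-1 sample autocorrelation
        x = x - x.mean()
        return (x[1:] * x[:-1]).sum() / (x * x).sum()

    print(acf1(eps))        # near 0: the sequence itself is (approximately) white noise
    print(acf1(eps ** 2))   # clearly positive: the squares are linearly predictable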

The Wold representation depends on an infinite number of parameters, although in practice the coefficients usually decay rapidly. The autoregressive model is an alternative that may have only a few coefficients even if the corresponding moving average has many. These two models can be combined into an autoregressive-moving average (ARMA) model, or into an autoregressive integrated moving average (ARIMA) model if non-stationarity is involved. See Scargle (1981) and references therein; in addition, this paper gives an extension of the Wold Theorem that allows more generality for the moving average (not necessarily stable, causal, or minimum delay), accompanied by a sharper characterization of the innovation (independently and identically distributed, not just uncorrelated). This extension allows the possibility of models that are more faithful to physical or astrophysical processes, and in particular can sense "the arrow of time".
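As a sketch of how a few ARMA parameters encode an infinite Wold representation, the implied MA(∞) weights can be generated by a simple recursion (Python with NumPy; the function name and the ARMA(1,1) parameter values below are illustrative):

    import numpy as np

    def arma_to_wold(phi, theta, n_weights):
        # Wold (MA-infinity) weights psi_j implied by the ARMA model
        # Y_t = sum_i phi_i Y_{t-i} + eps_t + sum_k theta_k eps_{t-k}
        psi = np.zeros(n_weights)
        psi[0] = 1.0                              # convention b_0 = 1
        for j in range(1, n_weights):
            psi[j] = theta[j - 1] if j - 1 < len(theta) else 0.0
            for i in range(1, min(j, len(phi)) + 1):
                psi[j] += phi[i - 1] * psi[j - i]
        return psi

    # Two ARMA(1,1) parameters generate an infinite, rapidly decaying sequence of Wold weights.
    print(arma_to_wold(phi=[0.5], theta=[0.3], n_weights=6))
    # [1.  0.8  0.4  0.2  0.1  0.05]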

References

  • Anderson, T. W. (1971). The Statistical Analysis of Time Series. Wiley.
  • Nerlove, M.; Grether, David M.; Carvalho, José L. (1995). Analysis of Economic Time Series (Revised ed.). San Diego: Academic Press. pp. 30–36. ISBN 0-12-515751-7.
  • Scargle, J. D. (1981). "Studies in astronomical time series analysis. I – Modeling random processes in the time domain". Astrophysical Journal Supplement Series. 45: 1–71.
  • Wold, H. (1954). A Study in the Analysis of Stationary Time Series (Second revised ed., with an Appendix on "Recent Developments in Time Series Analysis" by Peter Whittle). Uppsala: Almqvist and Wiksell.