The smoothing problem (not to be confused with smoothing in statistics, image processing and other contexts) is the problem of estimating an unknown probability density function recursively over time using incremental incoming measurements. It is one of the main problems defined by Norbert Wiener. A smoother is an algorithm that implements a solution to this problem, typically based on recursive Bayesian estimation. The smoothing problem is closely related to the filtering problem, both of which are studied in Bayesian smoothing theory.
A smoother is often a two-pass process, composed of forward and backward passes. Consider estimating (by prediction or retrodiction) an ongoing process, e.g. tracking a missile, based on incoming observations. When new observations arrive, the estimates about the past need to be updated to give a smoother (more accurate) estimate of the whole path up to the present, taking the newer observations into account. Without a backward pass (for retrodiction), the sequence of predictions produced by an online filtering algorithm does not look smooth. In other words, retrospectively, it is as if future observations are being used to improve the estimate of a point in the past, once those later observations become available. Note that the time of estimation (which determines which observations are available) can differ from the time of the point being estimated (the subject of prediction or retrodiction). Observations from later times can be used to update and improve the estimates of earlier times, which leads to smoother-looking estimates (retrodiction) of the whole path.
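In the notation that is standard in Bayesian filtering and smoothing (the symbols below are an assumption of this sketch and are not introduced elsewhere in the article), where x_k is the hidden state at time k, y_1:k are the observations up to time k, and T is the time of the latest available observation, the contrast can be written as:

```latex
% Filtering vs. smoothing densities (notation assumed for this sketch):
% filtering conditions only on observations up to time k,
% smoothing also conditions on later observations.
\text{filtering:}\quad p(x_k \mid y_{1:k})
\qquad\qquad
\text{smoothing:}\quad p(x_k \mid y_{1:T}), \quad T > k
```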
Examples of smoothers
Some variants include:
- Rauch–Tung–Striebel (RTS) smoother (a minimal sketch appears after this list)
- Gaussian smoothers (e.g., extended Kalman smoother or sigma-point smoothers) for non-linear state-space models.
- Particle smoothers
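The two-pass structure described above can be made concrete for a linear-Gaussian state-space model. The following Python sketch is illustrative only: the scalar model, the parameter values, and the function names are assumptions of this sketch, not taken from the article. It runs a forward Kalman filter and then the backward Rauch–Tung–Striebel recursion; on simulated data the smoothed estimates typically have lower error than the filtered ones, which is the "smoother (more accurate) estimation of the whole path" described above.

```python
# Minimal sketch of a Rauch–Tung–Striebel (RTS) smoother for a scalar
# linear-Gaussian state-space model: x_k = a*x_{k-1} + process noise,
# y_k = x_k + measurement noise.  All parameter values are illustrative.
import numpy as np

def kalman_filter(ys, a=1.0, q=0.1, r=1.0, m0=0.0, p0=1.0):
    """Forward pass: filtered moments of p(x_k | y_{1:k}),
    plus the one-step predicted moments needed by the backward pass."""
    ms, ps, mps, pps = [], [], [], []
    m, p = m0, p0
    for y in ys:
        mp = a * m                      # predicted mean
        pp = a * p * a + q              # predicted variance
        k = pp / (pp + r)               # Kalman gain
        m = mp + k * (y - mp)           # filtered mean
        p = (1.0 - k) * pp              # filtered variance
        ms.append(m); ps.append(p); mps.append(mp); pps.append(pp)
    return np.array(ms), np.array(ps), np.array(mps), np.array(pps)

def rts_smoother(ms, ps, mps, pps, a=1.0):
    """Backward pass: smoothed moments of p(x_k | y_{1:T})."""
    n = len(ms)
    sms, sps = ms.copy(), ps.copy()     # at k = T the smoothed = filtered moments
    for k in range(n - 2, -1, -1):
        g = ps[k] * a / pps[k + 1]      # smoother gain
        sms[k] = ms[k] + g * (sms[k + 1] - mps[k + 1])
        sps[k] = ps[k] + g * (sps[k + 1] - pps[k + 1]) * g
    return sms, sps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xs = np.cumsum(rng.normal(0.0, 0.3, 50))   # simulated hidden random walk
    ys = xs + rng.normal(0.0, 1.0, 50)         # noisy observations of it
    ms, ps, mps, pps = kalman_filter(ys, q=0.09, r=1.0)
    sms, sps = rts_smoother(ms, ps, mps, pps)
    print("filtered RMSE :", np.sqrt(np.mean((ms - xs) ** 2)))
    print("smoothed RMSE :", np.sqrt(np.mean((sms - xs) ** 2)))
```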
Confusion of terms and the relation between the filtering and smoothing problems
The terms smoothing and filtering are used for four concepts that may initially be confusing: smoothing in two senses (estimation and convolution), and filtering, again in two senses (estimation and convolution).
Smoothing (estimation) and smoothing (convolution), despite carrying the same name in English, can mean entirely different mathematical procedures, and the problems they solve have different requirements. The two concepts are distinguished by context (signal processing versus estimation of stochastic processes).
The historical reason for this confusion is that Wiener's original proposal was a "smoothing" filter that was just a convolution. His work later developed into two distinct concepts: one concerned attaining a smoother estimate by taking past observations into account, and the other concerned smoothing by filter design (the design of a convolution filter).
Both the smoothing problem (in the sense of estimation) and the filtering problem (in the sense of estimation) are often confused with smoothing and filtering in other contexts (especially non-stochastic signal processing, where these are often names for various types of convolution). The estimation terms date from World War II era problems framed by people such as Norbert Wiener. One source of confusion is that the Wiener filter has the form of a simple convolution: in Wiener's formulation two time series are given, and once the filter is defined, a straightforward convolution is the answer. In later developments such as Kalman filtering, however, the nature of the filtering is different from convolution and deserves a different name.
The distinction is described in the following two senses:
1. Convolution: Smoothing in the sense of convolution is the simpler notion; examples include the moving average, low-pass filtering, convolution with a kernel, and blurring with Laplace filters in image processing. It is often a filter-design problem, typically in non-stochastic, non-Bayesian signal processing with no hidden variables (a minimal sketch appears after this list).
2. Estimation: The smoothing problem (smoothing in the sense of estimation) uses Bayesian and state-space models to estimate the hidden state variables. This usage arose from World War II era problems posed by people such as Norbert Wiener, in (stochastic) control theory, radar, signal detection, tracking, and related fields. The most common example is the Kalman smoother used with the Kalman filter, which was in fact developed by Rauch; the procedure is sometimes called the Kalman–Rauch recursion. It is one of the main problems solved by Norbert Wiener. Most importantly, in the filtering problem (in this estimation sense) only the observations up to the time of the current sample are used, whereas in smoothing (also in the estimation sense) all observation samples, including those from later times, are used. Filtering is causal, but smoothing is batch processing of the same problem, namely estimation of a time-series process from serial, incremental observations.
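The sketch referred to in sense 1 above illustrates smoothing in the convolution sense: a moving average applied to a noisy signal. The signal and window length in this Python snippet are illustrative assumptions; unlike the estimation sense, there is no state-space model and no hidden variable, and the output is simply a weighted sum of nearby samples.

```python
# Minimal sketch of smoothing in the convolution sense: a moving average,
# i.e. convolution with a uniform kernel.  No state-space model, no hidden state.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * t)
noisy = clean + rng.normal(0.0, 0.3, t.size)

window = 11                                   # kernel length (odd, for symmetry)
kernel = np.ones(window) / window             # uniform (moving-average) kernel
smoothed = np.convolve(noisy, kernel, mode="same")

print("residual std before:", np.std(noisy - clean))
print("residual std after :", np.std(smoothed - clean))
```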
The more common smoothing and filtering (in the sense of 1, convolution) do not involve such a distinction, because they make no distinction between hidden and observable variables.
The distinction between smoothing (estimation) and filtering (estimation) is thus: in smoothing all observation samples are used, including those from later times; filtering is causal, whereas smoothing is batch processing of the given data. Filtering is the estimation of a (hidden) time-series process from serial, incremental observations.
See also
- Filtering problem
- Filter (signal processing)
- Kalman filter, a well-known filtering algorithm related both to the filtering problem and the smoothing problem
- Generalized filtering
- Smoothing
References
- ^ Wiener, Norbert (1942). Extrapolation, Interpolation and Smoothing of Stationary Time Series. A wartime classified report, nicknamed "the yellow peril" because of the color of the cover and the difficulty of the subject; published postwar by MIT Press, 1949. http://www.isss.org/lumwiener.htm Archived 2015-08-16 at the Wayback Machine
- ^ Wiener, Norbert (1949). Extrapolation, Interpolation, and Smoothing of Stationary Time Series. New York: Wiley. ISBN 0-262-73005-7.
- Särkkä, Simo (2013). Bayesian Filtering and Smoothing. Cambridge University Press. ISBN 978-1-107-61928-9.