
Hausdorff moment problem

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.

In mathematics, the Hausdorff moment problem, named after Felix Hausdorff, asks for necessary and sufficient conditions that a given sequence (m0, m1, m2, ...) be the sequence of moments

{\displaystyle m_{n}=\int _{0}^{1}x^{n}\,d\mu (x)}

of some Borel measure μ supported on the closed unit interval [0, 1]. In the case m0 = 1, this is equivalent to the existence of a random variable X supported on [0, 1], such that E[X^n] = mn for all n.
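As a concrete illustration (not from the article), the moments of the uniform (Lebesgue) measure on [0, 1] can be computed in closed form, since the integral of x^n over [0, 1] is 1/(n + 1); the sketch below uses exact rational arithmetic:

```python
from fractions import Fraction

def uniform_moment(n):
    """n-th moment of Lebesgue measure on [0, 1]: integral of x^n dx = 1/(n+1)."""
    return Fraction(1, n + 1)

# The first few moments of the uniform measure: 1, 1/2, 1/3, 1/4, 1/5
moments = [uniform_moment(n) for n in range(5)]
```

Here m0 = 1, so this is also the moment sequence of a uniformly distributed random variable X on [0, 1], with E[X^n] = 1/(n + 1).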

The essential difference between this and other well-known moment problems is that this is on a bounded interval, whereas in the Stieltjes moment problem one considers a half-line [0, ∞), and in the Hamburger moment problem one considers the whole line (−∞, ∞). The Stieltjes and Hamburger moment problems, if solvable, may have infinitely many solutions (the indeterminate case), whereas a Hausdorff moment problem, if solvable, always has a unique solution (the determinate case). In the indeterminate case, there are infinitely many measures with the same prescribed moments, and they form a convex set. If the moment problem is indeterminate, the polynomials may or may not be dense in the associated Hilbert space, depending on whether the measure is extremal. In the determinate case, however, the polynomials are always dense in the associated Hilbert space.

Completely monotonic sequences

In 1921, Hausdorff showed that (m0, m1, m2, ...) is such a moment sequence if and only if the sequence is completely monotonic, that is, its difference sequences satisfy the inequality

{\displaystyle (-1)^{k}(\Delta ^{k}m)_{n}\geq 0}

for all n, k ≥ 0. Here, Δ is the difference operator given by

{\displaystyle (\Delta m)_{n}=m_{n+1}-m_{n}.}
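A minimal sketch of the difference operator and its iterates, acting on a finite prefix of a sequence (the helper names are my own, not from the article):

```python
def delta(m):
    """Forward difference: (Δm)_n = m_{n+1} - m_n."""
    return [m[i + 1] - m[i] for i in range(len(m) - 1)]

def delta_k(m, k):
    """k-fold iterated difference Δ^k m (each application shortens the list by one)."""
    for _ in range(k):
        m = delta(m)
    return m

# Example: delta([1, 3, 6, 10]) -> [2, 3, 4]
```

Note that checking complete monotonicity from a finite prefix can only verify the inequalities for n + k within the available range; the full condition quantifies over all n, k ≥ 0.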

The necessity of this condition follows from the identity

{\displaystyle (-1)^{k}(\Delta ^{k}m)_{n}=\int _{0}^{1}x^{n}(1-x)^{k}\,d\mu (x),}

which is non-negative since it is the integral of a non-negative function. For example, it is necessary to have

{\displaystyle (\Delta ^{4}m)_{6}=m_{6}-4m_{7}+6m_{8}-4m_{9}+m_{10}=\int _{0}^{1}x^{6}(1-x)^{4}\,d\mu (x)\geq 0.}
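This identity can be checked concretely for the uniform measure on [0, 1], whose n-th moment is 1/(n + 1) (a sketch with hypothetical helper names, not from the article). Expanding Δ^k by the binomial theorem gives (−1)^k(Δ^k m)_n = Σ_{j=0}^{k} (−1)^j C(k, j) m_{n+j}, and for the uniform measure the right-hand side is the Beta integral ∫₀¹ x^n(1−x)^k dx = n! k!/(n+k+1)!:

```python
from fractions import Fraction
from math import comb, factorial

def uniform_moment(n):
    # n-th moment of Lebesgue measure on [0, 1]
    return Fraction(1, n + 1)

def signed_difference(m, n, k):
    """(-1)^k (Δ^k m)_n computed via the binomial expansion
    sum_{j=0}^{k} (-1)^j C(k, j) m(n + j)."""
    return sum((-1) ** j * comb(k, j) * m(n + j) for j in range(k + 1))

# Check the identity against the Beta integral n! k! / (n+k+1)! and non-negativity.
for n in range(5):
    for k in range(5):
        lhs = signed_difference(uniform_moment, n, k)
        rhs = Fraction(factorial(n) * factorial(k), factorial(n + k + 1))
        assert lhs == rhs and lhs >= 0
```

In particular, for the example above with n = 6 and k = 4, the sum m6 − 4m7 + 6m8 − 4m9 + m10 evaluates to 6! 4!/11! = 1/2310 > 0, as the theorem requires.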


References

  • Hausdorff, F. (1921). "Summationsmethoden und Momentfolgen. I." Mathematische Zeitschrift 9: 74–109.
  • Hausdorff, F. (1921). "Summationsmethoden und Momentfolgen. II." Mathematische Zeitschrift 9: 280–299.
  • Feller, W. (1971). An Introduction to Probability Theory and Its Applications, Vol. II. John Wiley & Sons.
  • Shohat, J. A.; Tamarkin, J. D. (1943). The Problem of Moments. American Mathematical Society, New York.