Trying to map moments to a measure that generates them
Example: given a specified mean and variance (with all higher cumulants equal to zero), the normal distribution is the unique distribution solving the moment problem.
In mathematics, a moment problem arises as the result of trying to invert the mapping that takes a measure $\mu$ to the sequence of moments
$$m_n = \int_{-\infty}^{\infty} x^n \, d\mu(x).$$
In the classical setting, $\mu$ is a measure on the real line, and $m$ is the sequence $\{m_n : n = 0, 1, 2, \dots\}$. In this form the question appears in probability theory, asking whether there is a probability measure having specified mean, variance and so on, and whether it is unique.
A sequence of numbers $(m_n)$ is the sequence of moments of a measure $\mu$ if and only if a certain positivity condition is fulfilled; namely, the Hankel matrices
$$H_n = (m_{i+j})_{0 \le i, j \le n}$$
should be positive semi-definite. This is because a positive semi-definite Hankel matrix corresponds to a linear functional $\Lambda$ such that $\Lambda(x^n) = m_n$ and $\Lambda(f^2) \ge 0$ (non-negative on sums of squares of polynomials). Assume $\Lambda$ can be extended to $\mathbb{R}[x]^*$. In the univariate case, a non-negative polynomial can always be written as a sum of squares, so the linear functional is positive on all non-negative polynomials in the univariate case. By Haviland's theorem, the linear functional has a measure form, that is, $\Lambda(x^n) = \int_{-\infty}^{\infty} x^n \, d\mu$. A condition of similar form is necessary and sufficient for the existence of a measure supported on a given interval $[a, b]$.
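The positivity condition is easy to check numerically. The sketch below (assuming NumPy; the sample moment sequences are illustrative choices, not from the original text) builds the Hankel matrix $H_n = (m_{i+j})$ from a moment sequence $m_0, \dots, m_{2n}$ and tests whether it is positive semi-definite:

```python
import numpy as np

def hankel_psd(m):
    """Test the Hankel positivity condition for a moment sequence m_0, ..., m_{2n}."""
    n = (len(m) - 1) // 2
    # Hankel matrix H_n with entries H[i][j] = m_{i+j}
    H = np.array([[m[i + j] for j in range(n + 1)] for i in range(n + 1)], dtype=float)
    # Positive semi-definite iff all eigenvalues are non-negative (up to rounding)
    return bool(np.linalg.eigvalsh(H).min() >= -1e-9)

# Moments 1, 0, 1, 0, 3 of the standard normal pass the condition...
print(hankel_psd([1, 0, 1, 0, 3]))   # True
# ...while m_2 = -1 cannot be a second moment of any measure.
print(hankel_psd([1, 0, -1]))        # False
```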
One way to prove these results is to consider the linear functional $\varphi$ that sends a polynomial
$$P(x) = \sum_k a_k x^k$$
to
$$\varphi(P) = \sum_k a_k m_k.$$
If $(m_k)$ are the moments of some measure $\mu$ supported on $[a, b]$, then evidently
$$\varphi(P) \ge 0 \text{ for any polynomial } P \text{ that is non-negative on } [a, b]. \qquad (1)$$
Thus the existence of the measure $\mu$ is equivalent to (1). Using a representation theorem for positive polynomials on $[a, b]$, one can reformulate (1) as a condition on Hankel matrices.[2][3]
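The functional $\varphi$ is straightforward to realize concretely. In this sketch (the coefficient convention and the sample normal moments are illustrative choices), $\varphi(P) = \sum_k a_k m_k$ is evaluated for a polynomial that is non-negative on the whole line, and the value is non-negative as expected:

```python
def phi(coeffs, m):
    """Apply the moment functional: coeffs[k] is the coefficient of x^k in P."""
    return sum(a * mk for a, mk in zip(coeffs, m))

# Moments m_0..m_4 of the standard normal (illustrative choice)
m = [1, 0, 1, 0, 3]

# P(x) = (x - 1)^2 = 1 - 2x + x^2 is non-negative on R, and
# phi(P) = m_0 - 2 m_1 + m_2 = 1 - 0 + 1 = 2 >= 0.
print(phi([1, -2, 1], m))   # 2
```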
The uniqueness of $\mu$ in the Hausdorff moment problem follows from the Weierstrass approximation theorem, which states that polynomials are dense under the uniform norm in the space of continuous functions on $[a, b]$. For the problem on an infinite interval, uniqueness is a more delicate question.[4] There are distributions, such as the log-normal distribution, which have finite moments for all positive integers $n$ but for which other, distinct distributions have exactly the same moments.
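The log-normal indeterminacy can be illustrated numerically. The sketch below uses the classical perturbation family $f_a(x) = f(x)\,(1 + a \sin(2\pi \ln x))$, where $f$ is the standard log-normal density (the grid bounds and step are illustrative choices). Substituting $t = \ln x$, the $k$-th moment becomes $\int e^{kt}\varphi(t)(1 + a\sin 2\pi t)\,dt$ with $\varphi$ the standard normal density; for every integer $k$ the oscillatory term integrates to zero, so all $f_a$ share the moments $e^{k^2/2}$:

```python
import numpy as np

def moment(k, a):
    """k-th moment of f_a(x) = f(x) (1 + a sin(2 pi ln x)), via t = ln x."""
    t = np.linspace(-15.0, 20.0, 400001)
    y = np.exp(k * t - t**2 / 2) / np.sqrt(2 * np.pi) * (1 + a * np.sin(2 * np.pi * t))
    # Simple uniform-grid quadrature; the integrand vanishes at both endpoints
    return float(np.sum(y) * (t[1] - t[0]))

# Different a, same moments: the log-normal is not determined by its moments.
for a in [0.0, 0.5, 0.9]:
    print(round(moment(3, a), 4))   # all approximately e^{9/2}
```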
An important variation is the truncated moment problem, which studies the properties of measures with fixed first $k$ moments (for a finite $k$). Results on the truncated moment problem have numerous applications to extremal problems, optimisation and limit theorems in probability theory.[3]
The moment problem has applications to probability theory. The following is commonly used:[5]
Theorem (Fréchet–Shohat) — If $\mu$ is a determinate measure (i.e. its moments determine it uniquely), and the measures $\mu_1, \mu_2, \dots$ are such that $\lim_{n \to \infty} \int x^k \, d\mu_n(x) = \int x^k \, d\mu(x)$ for every $k$, then $\mu_n \to \mu$ in distribution.
By checking Carleman's condition ($\sum_k m_{2k}^{-1/(2k)} = \infty$ for the moment sequence $(m_k)$), we know that the standard normal distribution is a determinate measure; thus we have the following form of the central limit theorem:
Corollary — If a sequence of probability measures $\nu_1, \nu_2, \dots$ satisfies $\lim_{n \to \infty} \int x^k \, d\nu_n(x) = \int x^k \, d\mathcal{N}(0,1)(x)$ for every $k$, then $\nu_n$ converges to the standard normal distribution $\mathcal{N}(0, 1)$ in distribution.
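The corollary can be observed numerically on standardized binomial distributions (the choice $\mathrm{Bin}(n, 1/2)$ is an illustrative one). Their moments, computed exactly from the probability mass function, approach the normal moments as $n$ grows; for instance, the fourth moment is exactly $3 - 2/n$, converging to the normal value $3$:

```python
import math

def std_binomial_moment(k, n):
    """Exact k-th moment of the standardized Bin(n, 1/2) distribution."""
    mean, sd = n / 2, math.sqrt(n / 4)
    return sum(math.comb(n, j) * 0.5 ** n * ((j - mean) / sd) ** k
               for j in range(n + 1))

# Fourth moments approach 3 = E[Z^4] for Z ~ N(0, 1): 2.8, 2.98, 2.998, ...
for n in [10, 100, 1000]:
    print(round(std_binomial_moment(4, n), 4))
```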
Kreĭn, M. G.; Nudel′man, A. A. (1977). The Markov Moment Problem and Extremal Problems. Translations of Mathematical Monographs. Providence, Rhode Island: American Mathematical Society. doi:10.1090/mmono/050. ISBN 978-0-8218-4500-4. ISSN 0065-9282.