In mathematics, particularly in linear algebra, the Schur product theorem states that the Hadamard product of two positive definite matrices is also a positive definite matrix.
The result is named after Issai Schur (Schur 1911, p. 14, Theorem VII); note that Schur signed as J. Schur in Journal für die reine und angewandte Mathematik.
The converse of the theorem holds in the following sense: if $M$ is a symmetric matrix and the Hadamard product $M \circ N$ is positive definite for all positive definite matrices $N$, then $M$ itself is positive definite.
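As an illustration, a minimal NumPy sketch that checks the statement numerically: random positive definite matrices are generated and the smallest eigenvalue of their Hadamard product is confirmed to be strictly positive.

# Minimal sketch: the Hadamard (elementwise) product of two random positive
# definite matrices again has strictly positive eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
n = 5

def random_positive_definite(size):
    # A @ A.T + I is symmetric positive definite for any real square A.
    A = rng.standard_normal((size, size))
    return A @ A.T + np.eye(size)

M = random_positive_definite(n)
N = random_positive_definite(n)

print(np.linalg.eigvalsh(M).min() > 0)      # True
print(np.linalg.eigvalsh(N).min() > 0)      # True
print(np.linalg.eigvalsh(M * N).min() > 0)  # True, as the theorem asserts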
Proof
Proof using the trace formula
For any matrices $M$ and $N$, the Hadamard product $M \circ N$ considered as a bilinear form acts on vectors $a, b$ as

$$a^* (M \circ N) b = \operatorname{tr}\left(M^{\mathsf T} \operatorname{diag}(a^*) N \operatorname{diag}(b)\right)$$

where $\operatorname{tr}$ is the matrix trace and $\operatorname{diag}(a)$ is the diagonal matrix having as diagonal entries the elements of $a$.

Suppose $M$ and $N$ are positive definite, and so Hermitian. We can consider their square-roots $M^{1/2}$ and $N^{1/2}$, which are also Hermitian, and write

$$\operatorname{tr}\left(M^{\mathsf T} \operatorname{diag}(a^*) N \operatorname{diag}(b)\right) = \operatorname{tr}\left(M^{{\mathsf T}/2} M^{{\mathsf T}/2} \operatorname{diag}(a^*) N^{1/2} N^{1/2} \operatorname{diag}(b)\right) = \operatorname{tr}\left(M^{{\mathsf T}/2} \operatorname{diag}(a^*) N^{1/2} N^{1/2} \operatorname{diag}(b) M^{{\mathsf T}/2}\right)$$

Then, for $a = b$, this is written as $\operatorname{tr}\left(A^* A\right)$ for $A = N^{1/2} \operatorname{diag}(a) M^{{\mathsf T}/2}$, and thus is strictly positive for $A \neq 0$, which occurs if and only if $a \neq 0$. This shows that $M \circ N$ is a positive definite matrix.
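The trace identity above can be verified numerically; a minimal sketch for real symmetric positive definite matrices, where $a^*$ reduces to $a^{\mathsf T}$:

# Minimal sketch: a^T (M∘N) b equals tr(M^T diag(a) N diag(b)) for real symmetric M, N.
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)
N = B @ B.T + np.eye(n)
a = rng.standard_normal(n)
b = rng.standard_normal(n)

lhs = a @ (M * N) @ b
rhs = np.trace(M.T @ np.diag(a) @ N @ np.diag(b))
print(np.isclose(lhs, rhs))  # True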
Proof using Gaussian integration
Case of M = N
Let $X$ be an $n$-dimensional centered Gaussian random variable with covariance $\langle X_i X_j \rangle = M_{ij}$. Then the covariance matrix of $X_i^2$ and $X_j^2$ is

$$\operatorname{Cov}(X_i^2, X_j^2) = \langle X_i^2 X_j^2 \rangle - \langle X_i^2 \rangle \langle X_j^2 \rangle$$

Using Wick's theorem to develop $\langle X_i^2 X_j^2 \rangle = 2 \langle X_i X_j \rangle^2 + \langle X_i^2 \rangle \langle X_j^2 \rangle$, we have

$$\operatorname{Cov}(X_i^2, X_j^2) = 2 \langle X_i X_j \rangle^2 = 2 M_{ij}^2$$

Since a covariance matrix is positive definite, this proves that the matrix with elements $M_{ij}^2$ is a positive definite matrix.
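A minimal Monte Carlo sketch of this computation: sampling a centered Gaussian with covariance $M$ and comparing the empirical covariance of the squared components with $2 M_{ij}^2$.

# Minimal sketch: for a centered Gaussian X with covariance M,
# Cov(X_i^2, X_j^2) = 2 * M_ij^2, i.e. twice the Hadamard square of M.
import numpy as np

rng = np.random.default_rng(2)
n, samples = 3, 200_000
A = rng.standard_normal((n, n)) / np.sqrt(n)
M = A @ A.T + np.eye(n)

X = rng.multivariate_normal(np.zeros(n), M, size=samples)
empirical = np.cov(X**2, rowvar=False)
print(np.round(empirical, 2))
print(np.round(2 * M**2, 2))  # the two matrices agree up to Monte Carlo error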
General case
Let $X$ and $Y$ be $n$-dimensional centered Gaussian random variables with covariances $\langle X_i X_j \rangle = M_{ij}$, $\langle Y_i Y_j \rangle = N_{ij}$, and independent from each other so that we have

$$\langle X_i Y_j \rangle = 0 \quad \text{for any } i, j$$

Then the covariance matrix of $X_i Y_i$ and $X_j Y_j$ is

$$\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i Y_i X_j Y_j \rangle - \langle X_i Y_i \rangle \langle X_j Y_j \rangle$$

Using Wick's theorem to develop

$$\langle X_i Y_i X_j Y_j \rangle = \langle X_i X_j \rangle \langle Y_i Y_j \rangle + \langle X_i Y_i \rangle \langle X_j Y_j \rangle + \langle X_i Y_j \rangle \langle X_j Y_i \rangle$$

and also using the independence of $X$ and $Y$, we have

$$\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i X_j \rangle \langle Y_i Y_j \rangle = M_{ij} N_{ij}$$

Since a covariance matrix is positive definite, this proves that the matrix with elements $M_{ij} N_{ij}$ is a positive definite matrix.
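Similarly, a minimal Monte Carlo sketch of the general case: with independent samples of $X \sim \mathcal{N}(0, M)$ and $Y \sim \mathcal{N}(0, N)$, the empirical covariance of the products $X_i Y_i$ approaches the Hadamard product $M \circ N$.

# Minimal sketch: for independent centered Gaussians X ~ N(0, M) and Y ~ N(0, N),
# Cov(X_i Y_i, X_j Y_j) = M_ij * N_ij, i.e. the Hadamard product M∘N.
import numpy as np

rng = np.random.default_rng(3)
n, samples = 3, 200_000
A = rng.standard_normal((n, n)) / np.sqrt(n)
B = rng.standard_normal((n, n)) / np.sqrt(n)
M = A @ A.T + np.eye(n)
N = B @ B.T + np.eye(n)

X = rng.multivariate_normal(np.zeros(n), M, size=samples)
Y = rng.multivariate_normal(np.zeros(n), N, size=samples)
empirical = np.cov(X * Y, rowvar=False)
print(np.round(empirical, 2))
print(np.round(M * N, 2))  # the two matrices agree up to Monte Carlo error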
Proof using eigendecomposition
Proof of positive semidefiniteness
Let $M = \sum_i \mu_i m_i m_i^{\mathsf T}$ and $N = \sum_j \nu_j n_j n_j^{\mathsf T}$ be the eigendecompositions of $M$ and $N$, with positive eigenvalues $\mu_i, \nu_j$. Then

$$M \circ N = \sum_{i,j} \mu_i \nu_j \left(m_i m_i^{\mathsf T}\right) \circ \left(n_j n_j^{\mathsf T}\right) = \sum_{i,j} \mu_i \nu_j \left(m_i \circ n_j\right) \left(m_i \circ n_j\right)^{\mathsf T}$$

Each $\left(m_i \circ n_j\right) \left(m_i \circ n_j\right)^{\mathsf T}$ is positive semidefinite (but, except in the 1-dimensional case, not positive definite, since they are rank 1 matrices). Also, $\mu_i \nu_j > 0$; thus the sum $M \circ N$ is also positive semidefinite.
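A minimal numerical sketch of the rank-one expansion above: summing $\mu_i \nu_j \left(m_i \circ n_j\right) \left(m_i \circ n_j\right)^{\mathsf T}$ over all pairs $(i, j)$ reproduces $M \circ N$.

# Minimal sketch: M∘N equals the double sum of the rank-one terms
# mu_i * nu_j * (m_i∘n_j)(m_i∘n_j)^T over all eigenvector pairs.
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)
N = B @ B.T + np.eye(n)

mu, U = np.linalg.eigh(M)   # columns of U are the eigenvectors m_i
nu, V = np.linalg.eigh(N)   # columns of V are the eigenvectors n_j

total = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        v = U[:, i] * V[:, j]               # Hadamard product m_i ∘ n_j
        total += mu[i] * nu[j] * np.outer(v, v)

print(np.allclose(total, M * N))  # True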
Proof of definiteness
To show that the result is positive definite requires even further proof. We shall show that for any vector $a \neq 0$, we have $a^{\mathsf T} (M \circ N) a > 0$. Continuing as above, each $a^{\mathsf T} \left(m_i \circ n_j\right) \left(m_i \circ n_j\right)^{\mathsf T} a \geq 0$, so it remains to show that there exist $i$ and $j$ for which the corresponding term above is nonzero. For this we observe that

$$a^{\mathsf T} \left(m_i \circ n_j\right) \left(m_i \circ n_j\right)^{\mathsf T} a = \left( \sum_k m_{i,k} n_{j,k} a_k \right)^2$$
Since $M$ is positive definite, there is an $i$ for which $m_i \circ a$ is not the zero vector (since otherwise $m_{i,k} a_k = 0$ for all $i, k$, which would give $m_i^{\mathsf T} a = 0$ for all $i$ and hence $a = 0$, because the eigenvectors $m_i$ span the whole space), and likewise since $N$ is positive definite there exists a $j$ for which $n_j^{\mathsf T} \left(m_i \circ a\right) \neq 0$. However, this last sum is just $\sum_k m_{i,k} n_{j,k} a_k$. Thus its square is positive. This completes the proof.
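A minimal numerical sketch of this last step: for a random nonzero vector $a$, at least one of the rank-one terms is strictly positive, and the terms sum to $a^{\mathsf T} (M \circ N) a > 0$.

# Minimal sketch: for a nonzero a, some term mu_i*nu_j*((m_i∘n_j)^T a)^2 is positive,
# and the terms sum to a^T (M∘N) a, which is therefore positive.
import numpy as np

rng = np.random.default_rng(5)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + np.eye(n)
N = B @ B.T + np.eye(n)
mu, U = np.linalg.eigh(M)
nu, V = np.linalg.eigh(N)
a = rng.standard_normal(n)

terms = np.array([[mu[i] * nu[j] * ((U[:, i] * V[:, j]) @ a) ** 2
                   for j in range(n)] for i in range(n)])
print(terms.max() > 0)                           # at least one term is strictly positive
print(np.isclose(terms.sum(), a @ (M * N) @ a))  # the terms sum to a^T (M∘N) a
print(a @ (M * N) @ a > 0)                       # True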
References
- Schur, J. (1911). "Bemerkungen zur Theorie der beschränkten Bilinearformen mit unendlich vielen Veränderlichen". Journal für die reine und angewandte Mathematik. 1911 (140): 1–28. doi:10.1515/crll.1911.140.1. S2CID 120411177.
- Zhang, Fuzhen, ed. (2005). The Schur Complement and Its Applications. Numerical Methods and Algorithms. Vol. 4. doi:10.1007/b105056. ISBN 0-387-24271-6. See p. 9, Ch. 0.6, on publication under the name J. Schur.
- Ledermann, W. (1983). "Issai Schur and His School in Berlin". Bulletin of the London Mathematical Society. 15 (2): 97–106. doi:10.1112/blms/15.2.97.