In mathematics, particularly linear algebra, the Schur product theorem, named after Issai Schur (Schur 1911, p. 14, Theorem VII) (note that Schur signed as J. Schur in Journal für die reine und angewandte Mathematik) states that the Hadamard product of two positive definite matrices is also a positive definite matrix.
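As a quick numerical sanity check of the statement (a minimal sketch in Python with NumPy; the dimension, seed and the way the test matrices are generated are arbitrary illustrative choices, not part of the theorem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random positive definite matrices: A @ A.T is positive semi-definite,
# and adding a small multiple of the identity makes it positive definite.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + 1e-3 * np.eye(n)
N = B @ B.T + 1e-3 * np.eye(n)

# Hadamard (entrywise) product.
H = M * N

# All eigenvalues of the symmetric matrix H should be strictly positive,
# as the Schur product theorem asserts.
print(np.linalg.eigvalsh(H))
assert np.all(np.linalg.eigvalsh(H) > 0)
```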
Proof
Proof using the trace formula
It is easy to show that for matrices <math>M</math> and <math>N</math>, the Hadamard product <math>M \circ N</math> considered as a bilinear form acts on vectors <math>a, b</math> as

: <math>a^* (M \circ N) b = \operatorname{tr}\left(M^T \operatorname{diag}(a^*) N \operatorname{diag}(b)\right)</math>

where <math>\operatorname{tr}</math> is the matrix trace and <math>\operatorname{diag}(a)</math> is the diagonal matrix having as diagonal entries the elements of <math>a</math>.

Since <math>M</math> and <math>N</math> are positive definite, we can consider their square-roots <math>M^{1/2}</math> and <math>N^{1/2}</math> and write

: <math>\operatorname{tr}\left(M^T \operatorname{diag}(a^*) N \operatorname{diag}(b)\right) = \operatorname{tr}\left(\overline{M^{1/2}} \, \overline{M^{1/2}} \operatorname{diag}(a^*) N^{1/2} N^{1/2} \operatorname{diag}(b)\right)</math>

Then, for <math>a = b</math>, this is written as <math>\operatorname{tr}(A^* A)</math> for <math>A = N^{1/2} \operatorname{diag}(a) \overline{M^{1/2}}</math> and thus is positive. This shows that <math>M \circ N</math> is a positive definite matrix.
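The trace formula can be checked numerically, here for the real symmetric case where <math>a^*</math> reduces to <math>a^T</math> (a minimal sketch; the matrices and vectors are random test data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Random real symmetric positive definite M and N, and arbitrary vectors a, b.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + 1e-3 * np.eye(n)
N = B @ B.T + 1e-3 * np.eye(n)
a = rng.standard_normal(n)
b = rng.standard_normal(n)

# Bilinear form of the Hadamard product ...
lhs = a @ (M * N) @ b
# ... equals the trace formula (real case, so a* is just a transposed).
rhs = np.trace(M.T @ np.diag(a) @ N @ np.diag(b))

assert np.isclose(lhs, rhs)
print(lhs, rhs)
```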
Proof using Gaussian integration
Case of M=N
Let <math>X</math> be an <math>n</math>-dimensional centered Gaussian random variable with covariance <math>\langle X_i X_j \rangle = M_{ij}</math>. Then the covariance matrix of <math>X_i^2</math> and <math>X_j^2</math> is

: <math>\operatorname{Cov}(X_i^2, X_j^2) = \langle X_i^2 X_j^2 \rangle - \langle X_i^2 \rangle \langle X_j^2 \rangle</math>

Using Wick's theorem to develop <math>\langle X_i^2 X_j^2 \rangle = 2 \langle X_i X_j \rangle^2 + \langle X_i^2 \rangle \langle X_j^2 \rangle</math>, we have

: <math>\operatorname{Cov}(X_i^2, X_j^2) = 2 \langle X_i X_j \rangle^2 = 2 M_{ij}^2</math>
Since a covariance matrix is positive semi-definite, this proves that the matrix with elements <math>M_{ij}^2</math> is a positive semi-definite matrix.
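A Monte Carlo sketch of this computation (the sample size, dimension and covariance <math>M</math> are arbitrary test choices; the empirical covariance of the <math>X_i^2</math> should approach <math>2 M_{ij}^2</math>):

```python
import numpy as np

rng = np.random.default_rng(2)
n, samples = 3, 1_000_000

# A positive definite covariance matrix M (arbitrary choice for the demo).
A = rng.standard_normal((n, n))
M = A @ A.T + 1e-3 * np.eye(n)

# Draw centered Gaussian vectors X with covariance M.
X = rng.multivariate_normal(np.zeros(n), M, size=samples)

# Empirical covariance matrix of the squared coordinates X_i^2.
emp = np.cov(X**2, rowvar=False)

# Wick's theorem predicts Cov(X_i^2, X_j^2) = 2 M_ij^2.
print(np.round(emp, 2))
print(np.round(2 * M**2, 2))
```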
General case
Let <math>X</math> and <math>Y</math> be <math>n</math>-dimensional centered Gaussian random variables with covariances <math>\langle X_i X_j \rangle = M_{ij}</math>, <math>\langle Y_i Y_j \rangle = N_{ij}</math>, independent from each other, so that we have

: <math>\langle X_i Y_j \rangle = 0</math> for any <math>i, j</math>

Then the covariance matrix of <math>X_i Y_i</math> and <math>X_j Y_j</math> is

: <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i Y_i X_j Y_j \rangle - \langle X_i Y_i \rangle \langle X_j Y_j \rangle</math>

Using Wick's theorem to develop

: <math>\langle X_i Y_i X_j Y_j \rangle = \langle X_i X_j \rangle \langle Y_i Y_j \rangle + \langle X_i Y_i \rangle \langle X_j Y_j \rangle + \langle X_i Y_j \rangle \langle X_j Y_i \rangle</math>

and also using the independence of <math>X</math> and <math>Y</math>, we have

: <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i X_j \rangle \langle Y_i Y_j \rangle = M_{ij} N_{ij}</math>
Since a covariance matrix is positive semi-definite, this proves that the matrix with elements <math>M_{ij} N_{ij}</math> is a positive semi-definite matrix.
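The general case can be checked the same way (again with arbitrary test covariances <math>M</math> and <math>N</math>; the empirical covariance of the products <math>X_i Y_i</math> should approach <math>M_{ij} N_{ij}</math>):

```python
import numpy as np

rng = np.random.default_rng(3)
n, samples = 3, 1_000_000

# Independent covariance structures M and N (arbitrary demo choices).
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + 1e-3 * np.eye(n)
N = B @ B.T + 1e-3 * np.eye(n)

# Independent centered Gaussian vectors X ~ N(0, M) and Y ~ N(0, N).
X = rng.multivariate_normal(np.zeros(n), M, size=samples)
Y = rng.multivariate_normal(np.zeros(n), N, size=samples)

# Empirical covariance of the products Z_i = X_i * Y_i.
emp = np.cov(X * Y, rowvar=False)

# Independence plus Wick's theorem predicts Cov(X_i Y_i, X_j Y_j) = M_ij N_ij.
print(np.round(emp, 2))
print(np.round(M * N, 2))
```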
References
- Schur, J. (1911). "Bemerkungen zur Theorie der beschränkten Bilinearformen mit unendlich vielen Veränderlichen". Journal für die reine und angewandte Mathematik. 140: 1–28. doi:10.1515/crll.1911.140.1
- Zhang, Fuzhen, ed. (2005). The Schur Complement and Its Applications. Numerical Methods and Algorithms. Vol. 4. Springer. doi:10.1007/b105056. Page 9, Ch. 0.6: publication under J. Schur.
- Ledermann, W. (1983). "Issai Schur and His School in Berlin". Bulletin of the London Mathematical Society. 15 (2): 97–106. doi:10.1112/blms/15.2.97