In mathematics, particularly linear algebra, the Schur product theorem, named after Issai Schur (Schur 1911, p. 14, Theorem VII) (note that Schur signed as J. Schur in Journal für die reine und angewandte Mathematik) states that the Hadamard product of two positive definite matrices is also a positive definite matrix.
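As a quick numerical illustration of the statement (a sketch, not part of the article's argument; the matrix size, random seed and helper names below are arbitrary), one can generate two positive definite matrices and check that every eigenvalue of their entrywise product is positive:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
n = 5

def random_spd(n):
    """Random symmetric positive definite n-by-n matrix."""
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

M = random_spd(n)
N = random_spd(n)

hadamard = M * N                        # entrywise (Hadamard) product
eigenvalues = np.linalg.eigvalsh(hadamard)
print(eigenvalues)                      # all strictly positive
assert np.all(eigenvalues > 0)
</syntaxhighlight>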
Proof
Proof using the trace formula
It is easy to show that for matrices <math>M</math> and <math>N</math>, the Hadamard product <math>M \circ N</math> considered as a bilinear form acts on vectors <math>a, b</math> as

: <math>a^T (M \circ N) b = \operatorname{tr}\left( M^T \operatorname{diag}(a) \, N \operatorname{diag}(b) \right)</math>

where <math>\operatorname{tr}</math> is the matrix trace and <math>\operatorname{diag}(a)</math> is the diagonal matrix having as diagonal entries the elements of <math>a</math>.

Since <math>M</math> and <math>N</math> are positive definite, we can consider their square-roots <math>M^{1/2}</math> and <math>N^{1/2}</math> and write

: <math>\operatorname{tr}\left( M^T \operatorname{diag}(a) \, N \operatorname{diag}(b) \right) = \operatorname{tr}\left( M^{1/2} \operatorname{diag}(a) N^{1/2} \, N^{1/2} \operatorname{diag}(b) M^{1/2} \right)</math>

Then, for <math>a = b</math>, this is written as <math>\operatorname{tr}\left( A^T A \right)</math> for <math>A = N^{1/2} \operatorname{diag}(a) M^{1/2}</math> and thus is non-negative; it is strictly positive whenever <math>a \neq 0</math>, since then <math>A \neq 0</math> (the square-roots being invertible). This shows that <math>M \circ N</math> is a positive definite matrix.
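The trace identity and its square-root rewriting can be checked numerically; the following is a minimal sketch (matrix size, seed and helper names are arbitrary):

<syntaxhighlight lang="python">
import numpy as np

def psd_sqrt(S):
    """Symmetric square root of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.sqrt(w)) @ V.T

rng = np.random.default_rng(1)
n = 4
A0 = rng.standard_normal((n, n))
B0 = rng.standard_normal((n, n))
M = A0 @ A0.T + n * np.eye(n)           # symmetric positive definite
N = B0 @ B0.T + n * np.eye(n)           # symmetric positive definite
a = rng.standard_normal(n)

# Quadratic form of the Hadamard product ...
lhs = a @ (M * N) @ a
# ... equals the trace formula, rewritten via square roots
A = psd_sqrt(N) @ np.diag(a) @ psd_sqrt(M)
rhs = np.trace(A.T @ A)

print(lhs, rhs)                         # the two values agree
assert np.isclose(lhs, rhs)
</syntaxhighlight>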
Proof using Gaussian integration
Case of M=N
Let <math>X</math> be an <math>n</math>-dimensional centered Gaussian random variable with covariance <math>\langle X_i X_j \rangle = M_{ij}</math>. Then the covariance matrix of <math>X_i^2</math> and <math>X_j^2</math> is

: <math>\operatorname{Cov}(X_i^2, X_j^2) = \langle X_i^2 X_j^2 \rangle - \langle X_i^2 \rangle \langle X_j^2 \rangle</math>

Using Wick's theorem to develop <math>\langle X_i^2 X_j^2 \rangle = 2 \langle X_i X_j \rangle^2 + \langle X_i^2 \rangle \langle X_j^2 \rangle</math>, we have

: <math>\operatorname{Cov}(X_i^2, X_j^2) = 2 \langle X_i X_j \rangle^2 = 2 M_{ij}^2</math>

Since a covariance matrix is positive definite, this proves that the matrix with elements <math>M_{ij}^2</math> is a positive definite matrix.
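The relation <math>\operatorname{Cov}(X_i^2, X_j^2) = 2 M_{ij}^2</math> can be illustrated by Monte Carlo sampling; the following sketch uses an arbitrary covariance matrix and sample size:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)
n, samples = 3, 200_000

A = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)             # covariance of X

# Centered Gaussian samples with covariance M, one row per sample
X = rng.multivariate_normal(np.zeros(n), M, size=samples)

# Sample covariance matrix of the squared coordinates X_i^2
sample_cov = np.cov(X**2, rowvar=False)

print(np.round(sample_cov, 1))
print(np.round(2 * M**2, 1))            # close to the sample estimate
</syntaxhighlight>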
General case
Let <math>X</math> and <math>Y</math> be <math>n</math>-dimensional centered Gaussian random variables with covariances <math>\langle X_i X_j \rangle = M_{ij}</math>, <math>\langle Y_i Y_j \rangle = N_{ij}</math> and independent from each other so that we have

: <math>\langle X_i Y_j \rangle = 0</math> for any <math>i, j</math>

Then the covariance matrix of <math>X_i Y_i</math> and <math>X_j Y_j</math> is

: <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i Y_i X_j Y_j \rangle - \langle X_i Y_i \rangle \langle X_j Y_j \rangle</math>

Using Wick's theorem to develop

: <math>\langle X_i Y_i X_j Y_j \rangle = \langle X_i X_j \rangle \langle Y_i Y_j \rangle + \langle X_i Y_i \rangle \langle X_j Y_j \rangle + \langle X_i Y_j \rangle \langle X_j Y_i \rangle</math>

and also using the independence of <math>X</math> and <math>Y</math>, we have

: <math>\operatorname{Cov}(X_i Y_i, X_j Y_j) = \langle X_i X_j \rangle \langle Y_i Y_j \rangle = M_{ij} N_{ij}</math>

Since a covariance matrix is positive definite, this proves that the matrix with elements <math>M_{ij} N_{ij}</math> is a positive definite matrix.
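The general case can be illustrated the same way, drawing <math>X</math> and <math>Y</math> independently (again with arbitrary test covariances and sample size):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)
n, samples = 3, 200_000

A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)             # covariance of X
N = B @ B.T + n * np.eye(n)             # covariance of Y

# Independent centered Gaussian vectors X and Y
X = rng.multivariate_normal(np.zeros(n), M, size=samples)
Y = rng.multivariate_normal(np.zeros(n), N, size=samples)

# Sample covariance of the products X_i Y_i
sample_cov = np.cov(X * Y, rowvar=False)

print(np.round(sample_cov, 1))
print(np.round(M * N, 1))               # close to the sample estimate
</syntaxhighlight>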
Proof using eigendecomposition
Proof of positivity
Let <math>M = \sum \mu_i m_i m_i^T</math> and <math>N = \sum \nu_i n_i n_i^T</math> be the eigendecompositions of <math>M</math> and <math>N</math>, with eigenvalues <math>\mu_i > 0</math> and <math>\nu_i > 0</math>. Then

: <math>M \circ N = \sum_{ij} \mu_i \nu_j \, (m_i m_i^T) \circ (n_j n_j^T) = \sum_{ij} \mu_i \nu_j \, (m_i \circ n_j) (m_i \circ n_j)^T</math>
Each <math>(m_i \circ n_j) (m_i \circ n_j)^T</math> is positive semidefinite (but, except in the 1-dimensional case, not positive definite, since it is a rank 1 matrix) and <math>\mu_i \nu_j > 0</math>, thus the sum giving <math>M \circ N</math> is also positive semidefinite.
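The rank-one decomposition above can also be verified numerically; a short sketch (with arbitrary test matrices):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(4)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)
N = B @ B.T + n * np.eye(n)

mu, m_vecs = np.linalg.eigh(M)          # columns are eigenvectors m_i
nu, n_vecs = np.linalg.eigh(N)          # columns are eigenvectors n_j

# Rebuild M ∘ N as the double sum of rank-one terms (m_i ∘ n_j)(m_i ∘ n_j)^T
rebuilt = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        v = m_vecs[:, i] * n_vecs[:, j]  # entrywise product m_i ∘ n_j
        rebuilt += mu[i] * nu[j] * np.outer(v, v)

assert np.allclose(rebuilt, M * N)       # matches the Hadamard product
</syntaxhighlight>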
Complete proof
To show that the result is positive definite requires further proof. We shall show that for any vector <math>a \neq 0</math>, we have <math>a^T (M \circ N) a > 0</math>. Continuing as above, each <math>a^T (m_i \circ n_j) (m_i \circ n_j)^T a \ge 0</math>, so it remains to show that there exist <math>i</math> and <math>j</math> for which the inequality is strict. For this we observe that

: <math>a^T (m_i \circ n_j) (m_i \circ n_j)^T a = \left( \sum_k m_{i,k} n_{j,k} a_k \right)^2</math>
Since <math>N</math> is positive definite, its eigenvectors <math>n_j</math> span the whole space, so there is a <math>j</math> for which <math>\sum_k n_{j,k} a_k \neq 0</math>; in particular the vector with components <math>n_{j,k} a_k</math> is not zero. Then, since <math>M</math> is positive definite, its eigenvectors <math>m_i</math> likewise span the whole space, so there is an <math>i</math> for which <math>\sum_k m_{i,k} n_{j,k} a_k \neq 0</math>. For this <math>i</math> and <math>j</math> we have <math>\left( \sum_k m_{i,k} n_{j,k} a_k \right)^2 > 0</math>. This completes the proof.
References
- Schur, J. (1911), "Bemerkungen zur Theorie der beschränkten Bilinearformen mit unendlich vielen Veränderlichen", Journal für die reine und angewandte Mathematik, 140: 1–28, doi:10.1515/crll.1911.140.1
- Zhang, Fuzhen, ed. (2005), The Schur Complement and Its Applications, Numerical Methods and Algorithms, Springer, doi:10.1007/b105056, page 9, Ch. 0.6 "Publication under J. Schur"
- Ledermann, W. (1983), "Issai Schur and His School in Berlin", Bulletin of the London Mathematical Society, 15 (2): 97–106, doi:10.1112/blms/15.2.97