
Mutual coherence (linear algebra)


In linear algebra, the coherence or mutual coherence of a matrix A is defined as the maximum absolute value of the cross-correlations between the columns of A.

Formally, let \(a_1, \ldots, a_m \in \mathbb{C}^d\) be the columns of the matrix A, which are assumed to be normalized such that \(a_i^H a_i = 1\). The mutual coherence of A is then defined as

\[ M = \max_{1 \leq i \neq j \leq m} \left| a_i^H a_j \right|. \]
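As a concrete illustration (not part of the original article; the function name is my own), the definition can be evaluated directly from the Gram matrix of the normalized columns:

```python
import numpy as np

def mutual_coherence(A):
    """Maximum absolute cross-correlation between distinct columns of A."""
    # Normalize each column so that a_i^H a_i = 1.
    A = A / np.linalg.norm(A, axis=0)
    # Gram matrix of absolute inner products |a_i^H a_j|.
    G = np.abs(A.conj().T @ A)
    # Exclude the diagonal (i = j) and take the largest remaining entry.
    np.fill_diagonal(G, 0)
    return G.max()

# Example: d = 2, m = 3, columns e1, e2, and (1, 1) (normalized internally).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
print(mutual_coherence(A))  # prints 1/sqrt(2) ≈ 0.7071
```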

A lower bound, known as the Welch bound, is

\[ M \geq \sqrt{\frac{m - d}{d(m - 1)}}. \]
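As an illustrative sketch (the example and code are not from the article), the three-vector "Mercedes-Benz" frame of equally spaced unit vectors in the plane meets this bound exactly:

```python
import numpy as np

# m = 3 unit vectors in d = 2 dimensions, at angles 0°, 120°, 240°.
d, m = 2, 3
angles = 2 * np.pi * np.arange(m) / m
A = np.vstack([np.cos(angles), np.sin(angles)])  # columns are unit vectors

# Mutual coherence: largest off-diagonal entry of the absolute Gram matrix.
G = np.abs(A.T @ A)
np.fill_diagonal(G, 0)
M = G.max()

# Welch lower bound for this (d, m).
welch = np.sqrt((m - d) / (d * (m - 1)))
print(M, welch)  # both equal 1/2: the bound is tight for this frame
```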

A deterministic matrix whose mutual coherence nearly meets this lower bound can be constructed using Weil's theorem.

This concept was reintroduced by David Donoho and Michael Elad in the context of sparse representations. A special case of this definition for the two-ortho case appeared earlier in a paper by Donoho and Huo. The mutual coherence has since been used extensively in the field of sparse representations of signals; in particular, it serves as a measure of the ability of suboptimal algorithms such as matching pursuit and basis pursuit to correctly identify the true representation of a sparse signal. Joel Tropp introduced a useful extension of the mutual coherence, known as the Babel function, which generalizes the cross-correlation between pairs of columns to the cross-correlation between one column and a set of other columns. For a single pair of columns the Babel function reduces exactly to the mutual coherence, and it extends the coherence relationship in a way that is useful for any number of columns of the sparse representation matrix.
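The Babel function described above can be sketched as follows (the function name and interface are my own, not from the article): for each column, sum the k largest absolute correlations with the other columns, then take the maximum over columns. At k = 1 this reduces to the mutual coherence.

```python
import numpy as np

def babel(A, k):
    """Babel function mu_1(k): worst-case total correlation between one
    column and its k most-correlated other columns."""
    A = A / np.linalg.norm(A, axis=0)    # unit-norm columns
    G = np.abs(A.conj().T @ A)
    np.fill_diagonal(G, 0)               # exclude self-correlation
    G.sort(axis=1)                       # ascending within each row
    # Sum the k largest correlations per column, maximize over columns.
    return G[:, -k:].sum(axis=1).max()

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
print(babel(A, 1))  # equals the mutual coherence of A
print(babel(A, 2))  # correlation of one column against the two others
```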

References

  1. Tropp, J.A. (March 2006). "Just relax: Convex programming methods for identifying sparse signals in noise" (PDF). IEEE Transactions on Information Theory. 52 (3): 1030–1051. doi:10.1109/TIT.2005.864420. S2CID 6496872.
  2. Donoho, D.L.; M. Elad; V.N. Temlyakov (January 2006). "Stable recovery of sparse overcomplete representations in the presence of noise". IEEE Transactions on Information Theory. 52 (1): 6–18. doi:10.1109/TIT.2005.860430. S2CID 14813938.
  3. Welch, L. R. (1974). "Lower bounds on the maximum cross-correlation of signals". IEEE Transactions on Information Theory. 20 (3): 397–399. doi:10.1109/tit.1974.1055219.
  4. Zhiqiang, Xu (April 2011). "Deterministic Sampling of Sparse Trigonometric Polynomials". Journal of Complexity. 27 (2): 133–140. arXiv:1006.2221. doi:10.1016/j.jco.2011.01.007. S2CID 2613562.
  5. Donoho, D.L.; Michael Elad (March 2003). "Optimally sparse representation in general (nonorthogonal) dictionaries via L1 minimization". Proc. Natl. Acad. Sci. 100 (5): 2197–2202. Bibcode:2003PNAS..100.2197D. doi:10.1073/pnas.0437847100. PMC 153464. PMID 16576749.
  6. Donoho, D.L.; Xiaoming Huo (November 2001). "Uncertainty principles and ideal atomic decomposition". IEEE Transactions on Information Theory. 47 (7): 2845–2862. CiteSeerX 10.1.1.39.3696. doi:10.1109/18.959265. S2CID 9500527.
  7. Fuchs, J.-J. (June 2004). "On sparse representations in arbitrary redundant bases". IEEE Transactions on Information Theory. 50 (6): 1341–1344. doi:10.1109/TIT.2004.828141. S2CID 18432970.
  8. Joel A. Tropp (2004). "Greed is good: Algorithmic results for sparse approximation" (PDF). CiteSeerX 10.1.1.84.5256.
