T(1) theorem

In mathematics, the T(1) theorem, first proved by David & Journé (1984), describes when an operator T given by a kernel can be extended to a bounded linear operator on the Hilbert space L²(ℝⁿ). The name T(1) theorem refers to a condition on the distribution T(1), given by the operator T applied to the function 1.
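
Concretely, the usual way to make "given by a kernel" precise is the off-support representation (stated here as the standard convention): for a Schwartz function f and a point x outside its support,

  T f(x) = \int_{\mathbb{R}^n} K(x,y)\, f(y)\, dy, \qquad x \notin \operatorname{supp} f.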

Statement

Suppose that T is a continuous operator from Schwartz functions on ℝⁿ to tempered distributions, so that T is given by a kernel K which is a distribution on ℝⁿ × ℝⁿ. Assume that the kernel is standard, which means that off the diagonal it is given by a function satisfying size and smoothness estimates of Calderón–Zygmund type, displayed below.
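
In the usual formulation (the common convention in the Calderón–Zygmund literature, with a constant C > 0 and an exponent 0 < δ ≤ 1 as part of the data), these estimates read:

  |K(x,y)| \le \frac{C}{|x-y|^n} \qquad (x \ne y),
  |K(x,y) - K(x',y)| \le \frac{C\,|x-x'|^{\delta}}{|x-y|^{n+\delta}} \qquad \text{for } |x-x'| \le \tfrac{1}{2}|x-y|,
  |K(x,y) - K(x,y')| \le \frac{C\,|y-y'|^{\delta}}{|x-y|^{n+\delta}} \qquad \text{for } |y-y'| \le \tfrac{1}{2}|x-y|.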

Then the T(1) theorem states that T can be extended to a bounded operator on the Hilbert space L²(ℝⁿ) if and only if the following conditions are satisfied:

  • T(1) is of bounded mean oscillation (where T is extended to an operator on bounded smooth functions, such as 1).
  • T*(1) is of bounded mean oscillation, where T* is the adjoint of T.
  • T is weakly bounded, a weak condition that is easy to verify in practice (both notions are sketched after this list).
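
For orientation, the two notions appearing above can be sketched as follows; these are the standard definitions, though normalizations vary slightly between authors. A locally integrable function b has bounded mean oscillation when

  \|b\|_{\mathrm{BMO}} = \sup_{Q} \frac{1}{|Q|} \int_{Q} \bigl| b(x) - b_Q \bigr| \, dx < \infty, \qquad b_Q = \frac{1}{|Q|} \int_{Q} b(y) \, dy,

the supremum running over all cubes Q in ℝⁿ. Weak boundedness of T asks that, for smooth bump functions φ and ψ supported in the unit ball, with translates and dilates φ^{x,R}(y) = φ((y − x)/R),

  \bigl| \langle T \varphi^{x,R}, \psi^{x,R} \rangle \bigr| \le C R^{n}

uniformly in x ∈ ℝⁿ and R > 0, with C depending only on finitely many seminorms of φ and ψ. Note that boundedness of T on L²(ℝⁿ) would give this estimate immediately, since ‖φ^{x,R}‖₂² is comparable to Rⁿ, which is why the condition is regarded as weak.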

References

  • David, Guy; Journé, Jean-Lin (1984), "A boundedness criterion for generalized Calderón–Zygmund operators", Annals of Mathematics, 120 (2): 371–397.