Tensor (intrinsic definition)

Article snapshot taken from Wikipedia, licensed under Creative Commons Attribution-ShareAlike.
This article assumes an understanding of the tensor product of vector spaces without chosen bases. An introduction to the nature and significance of tensors in a broad context can be found in the main Tensor article.

In mathematics, the modern component-free approach to the theory of a tensor views a tensor as an abstract object, expressing some definite type of multilinear concept. Their properties can be derived from their definitions as linear maps or more generally, and the rules for manipulating tensors arise as an extension of linear algebra to multilinear algebra.

In differential geometry, an intrinsic geometric statement may be described by a tensor field on a manifold, and then need make no reference to coordinates at all. The same is true in general relativity, of tensor fields describing a physical property. The component-free approach is also used extensively in abstract algebra and homological algebra, where tensors arise naturally.

Definition via tensor products of vector spaces

Given a finite set {V1, ..., Vn} of vector spaces over a common field F, one may form their tensor product V1 ⊗ ... ⊗ Vn, an element of which is termed a tensor.

A tensor on the vector space V is then defined to be an element of (i.e., a vector in) a vector space of the form
$$V \otimes \cdots \otimes V \otimes V^* \otimes \cdots \otimes V^*,$$
where $V^*$ is the dual space of V.

If there are m copies of V and n copies of $V^*$ in our product, the tensor is said to be of type (m, n), contravariant of order m, covariant of order n, and of total order m + n. The tensors of order zero are just the scalars (elements of the field F), those of contravariant order 1 are the vectors in V, and those of covariant order 1 are the one-forms in $V^*$ (for this reason, the elements of the last two spaces are often called the contravariant and covariant vectors). The space of all tensors of type (m, n) is denoted
$$T_n^m(V) = \underbrace{V \otimes \dots \otimes V}_{m} \otimes \underbrace{V^* \otimes \dots \otimes V^*}_{n}.$$

Example 1. The space of type (1, 1) tensors, $T_1^1(V) = V \otimes V^*$, is isomorphic in a natural way to the space of linear transformations from V to V.
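This isomorphism can be made concrete in coordinates. A minimal sketch, assuming numpy and a chosen basis (the vectors and values below are illustrative, not from the article): a simple (1, 1) tensor v ⊗ φ acts on a vector w as the linear map w ↦ φ(w) v, whose matrix is the outer product of v with the coefficient row of φ.

```python
import numpy as np

# v is an element of V, phi an element of V* (a covector),
# both written in a chosen basis.
v = np.array([1.0, 2.0, 3.0])
phi = np.array([0.5, -1.0, 2.0])

# Under the natural isomorphism V ⊗ V* ≅ L(V; V), the simple tensor
# v ⊗ phi becomes the matrix of the linear map w ↦ phi(w) v.
T = np.outer(v, phi)

w = np.array([2.0, 0.0, 1.0])
# Both sides compute phi(w) v, confirming the correspondence.
assert np.allclose(T @ w, phi.dot(w) * v)
```

A general element of $V \otimes V^*$ is a sum of such simple tensors, matching the fact that every matrix is a sum of outer products.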

Example 2. A bilinear form on a real vector space V, $V \times V \to F$, corresponds in a natural way to a type (0, 2) tensor in $T_2^0(V) = V^* \otimes V^*$. An inner product is an example of such a bilinear form; the corresponding tensor is termed the associated metric tensor and is usually denoted g.
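In a basis, a (0, 2) tensor is just a matrix of coefficients, and the bilinear form is evaluated as $g(x, y) = x^{\mathrm T} G y$. A minimal sketch, assuming numpy; the matrix G below is an arbitrary illustrative choice, not from the article:

```python
import numpy as np

# A symmetric positive-definite coefficient matrix G represents a
# metric tensor g in T_2^0(V) = V* ⊗ V* for a chosen basis of V.
G = np.array([[2.0, 1.0],
              [1.0, 3.0]])

def g(x, y):
    # The bilinear form g(x, y) = sum_ij G_ij x_i y_j.
    return x @ G @ y

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
assert g(x, y) == g(y, x)            # symmetry, since G is symmetric
assert g(x, x) > 0 and g(y, y) > 0   # positivity on these basis vectors
```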

Tensor rank

Main article: Tensor rank decomposition

A simple tensor (also called a tensor of rank one, elementary tensor or decomposable tensor) is a tensor that can be written as a product of tensors of the form
$$T = a \otimes b \otimes \cdots \otimes d,$$
where a, b, ..., d are nonzero and in V or $V^*$ – that is, if the tensor is nonzero and completely factorizable. Every tensor can be expressed as a sum of simple tensors. The rank of a tensor T is the minimum number of simple tensors that sum to T.

The zero tensor has rank zero. A nonzero order 0 or 1 tensor always has rank 1. The rank of a nonzero order 2 or higher tensor is less than or equal to the product of the dimensions of all but the highest-dimensioned vectors in (a sum of products of) which the tensor can be expressed, which is $d^{n-1}$ when each product is of n vectors from a finite-dimensional vector space of dimension d.

The term rank of a tensor extends the notion of the rank of a matrix in linear algebra, although the term is also often used to mean the order (or degree) of a tensor. The rank of a matrix is the minimum number of column vectors needed to span the range of the matrix. A matrix thus has rank one if it can be written as an outer product of two nonzero vectors:
$$A = v w^{\mathrm T}.$$

The rank of a matrix A is the smallest number of such outer products that can be summed to produce it:
$$A = v_1 w_1^{\mathrm T} + \cdots + v_k w_k^{\mathrm T}.$$
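For matrices, such a minimal decomposition can be computed explicitly. A minimal sketch, assuming numpy; the singular value decomposition supplies one valid choice of the vectors $v_i, w_i$ (each term is a singular triple), and the example matrix is an illustrative one of rank 2:

```python
import numpy as np

# A has rank 2: its second row is twice the first.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

# The SVD writes A = sum_i s_i u_i w_i^T; the number of nonzero
# singular values s_i is the rank.
U, s, Wt = np.linalg.svd(A)
k = int(np.sum(s > 1e-10))
assert k == np.linalg.matrix_rank(A) == 2

# Rebuild A as a sum of k rank-one outer products (simple tensors).
B = sum(s[i] * np.outer(U[:, i], Wt[i]) for i in range(k))
assert np.allclose(A, B)
```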

In indices, a tensor of rank 1 is a tensor of the form
$$T_{ij\dots}^{k\ell\dots} = a_i b_j \cdots c^k d^\ell \cdots.$$

The rank of a tensor of order 2 agrees with the rank when the tensor is regarded as a matrix, and can be determined from Gaussian elimination, for instance. The rank of an order 3 or higher tensor is however often very difficult to determine, and low-rank decompositions of tensors are sometimes of great practical interest. In fact, the problem of finding the rank of an order 3 tensor over any finite field is NP-complete, and over the rationals, NP-hard. Computational tasks such as the efficient multiplication of matrices and the efficient evaluation of polynomials can be recast as the problem of simultaneously evaluating a set of bilinear forms
$$z_k = \sum_{ij} T_{ijk} x_i y_j$$
for given inputs $x_i$ and $y_j$. If a low-rank decomposition of the tensor T is known, then an efficient evaluation strategy is known.
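The simultaneous evaluation of these bilinear forms can be sketched directly. A minimal illustration, assuming numpy; the tensor, inputs, and dimensions below are arbitrary illustrative choices, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3, 2))   # an order-3 tensor T_ijk
x = rng.standard_normal(3)
y = rng.standard_normal(3)

# Evaluate all bilinear forms z_k = sum_ij T_ijk x_i y_j at once.
z = np.einsum('ijk,i,j->k', T, x, y)

# Cross-check one component against the explicit double sum.
z0 = sum(T[i, j, 0] * x[i] * y[j] for i in range(3) for j in range(3))
assert np.isclose(z[0], z0)
```

If T admits a rank-R decomposition $T = \sum_r a_r \otimes b_r \otimes c_r$, then $z = \sum_r (a_r \cdot x)(b_r \cdot y)\, c_r$ uses only R scalar multiplications of the form $(a_r \cdot x)(b_r \cdot y)$; this is the idea underlying fast matrix multiplication algorithms such as Strassen's.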

Universal property

The space $T_n^m(V)$ can be characterized by a universal property in terms of multilinear mappings. Among the advantages of this approach is that it gives a way to show that many linear mappings are "natural" or "geometric" (in other words, independent of any choice of basis). Explicit computational information can then be written down using bases, and this order of priorities can be more convenient than proving that a formula gives rise to a natural mapping. Another aspect is that tensor products are not used only for free modules, and the "universal" approach carries over more easily to more general situations.

A scalar-valued function on a Cartesian product (or direct sum) of vector spaces
$$f : V_1 \times \cdots \times V_N \to F$$
is multilinear if it is linear in each argument. More generally, the space of all multilinear mappings from $V_1 \times \cdots \times V_N$ to a vector space W is denoted $L^N(V_1, \ldots, V_N; W)$. When N = 1, a multilinear mapping is just an ordinary linear mapping, and the space of all linear mappings from V to W is denoted L(V; W).

The universal characterization of the tensor product implies that, for each multilinear function
$$f \in L^{m+n}(\underbrace{V^*, \ldots, V^*}_{m}, \underbrace{V, \ldots, V}_{n}; W)$$
(where W can represent the field of scalars, a vector space, or a tensor space) there exists a unique linear function
$$T_f \in L(\underbrace{V^* \otimes \cdots \otimes V^*}_{m} \otimes \underbrace{V \otimes \cdots \otimes V}_{n}; W)$$
such that
$$f(\alpha_1, \ldots, \alpha_m, v_1, \ldots, v_n) = T_f(\alpha_1 \otimes \cdots \otimes \alpha_m \otimes v_1 \otimes \cdots \otimes v_n)$$
for all $v_i$ in V and $\alpha_i$ in $V^*$.
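The correspondence $f \mapsto T_f$ can be seen concretely in the bilinear case (m = 0, n = 2, W = F). A minimal sketch, assuming numpy and a chosen basis; the coefficient matrix and vectors are illustrative: in coordinates, $v_1 \otimes v_2$ is the Kronecker product, and the induced linear functional $T_f$ is the flattened coefficient matrix of f.

```python
import numpy as np

# A bilinear form f(x, y) = sum_ij A_ij x_i y_j on V × V.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def f(x, y):
    return x @ A @ y

# The unique linear map T_f on V ⊗ V with f(x, y) = T_f(x ⊗ y):
# in coordinates, x ⊗ y is np.kron(x, y) and T_f is vec(A).
T_f = A.reshape(-1)

x = np.array([1.0, -1.0])
y = np.array([2.0, 5.0])
# The bilinear map factors through the tensor product.
assert np.isclose(f(x, y), T_f @ np.kron(x, y))
```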

Using the universal property, it follows, when V is finite-dimensional, that the space of (m, n)-tensors admits a natural isomorphism
$$T_n^m(V) \cong L(\underbrace{V^* \otimes \cdots \otimes V^*}_{m} \otimes \underbrace{V \otimes \cdots \otimes V}_{n}; F) \cong L^{m+n}(\underbrace{V^*, \ldots, V^*}_{m}, \underbrace{V, \ldots, V}_{n}; F).$$

Each $V^*$ in the definition of the tensor corresponds to a V inside the argument of the linear maps, and vice versa. (Note that in the former case, there are m copies of V and n copies of $V^*$, and in the latter case vice versa.) In particular, one has
$$T_0^1(V) \cong L(V^*; F) \cong V,$$
$$T_1^0(V) \cong L(V; F) = V^*,$$
$$T_1^1(V) \cong L(V; V).$$

Tensor fields

Main article: Tensor field

Differential geometry, physics and engineering must often deal with tensor fields on smooth manifolds. The term tensor is sometimes used as a shorthand for tensor field. A tensor field expresses the concept of a tensor that varies from point to point on the manifold.
