
Tensors - Part 4: Behavior Under Change of Coordinates

The fourth installment of the Tensors series covers the flavors and mixtures of tensors.


In the first post in this series, I mentioned several apparently unrelated things that are all called tensors, one of these being objects that behave a certain way under changes of coordinates. That’s what we’ll look at this time.

In the second post, we said that a tensor is a multilinear functional. A k-tensor takes k vectors and returns a number, and it is linear in each argument if you hold the rest constant. We mentioned that this relates to the “box of numbers” idea of a tensor. You can describe how a k-tensor acts by writing out k nested sums. The terms in these sums are called the components of the tensor.
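The "nested sums" description of a 2-tensor can be made concrete with a few lines of NumPy. This is an illustrative sketch (the matrix and vectors below are made up): the components T_ij are just a box of numbers, and applying the tensor to two vectors means summing T_ij u^i v^j over both indexes.

```python
import numpy as np

# A 2-tensor on R^3, represented by its components T[i, j].
T = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [2.0, 0.0, 1.0]])
u = np.array([1.0, 0.0, 2.0])
v = np.array([0.0, 1.0, 1.0])

# The nested-sum definition: sum over i and j of T_ij * u^i * v^j.
value = sum(T[i, j] * u[i] * v[j] for i in range(3) for j in range(3))

# For a 2-tensor this collapses to the matrix expression u^T T v.
assert np.isclose(value, u @ T @ v)
```

Linearity in each argument separately is exactly what lets the double sum collapse to a matrix product; for a k-tensor you would have k nested sums instead of two.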

Tensors are usually defined in a way that has more structure. They vary from point to point in a space, and they do so in a way that in some sense is independent of the coordinates used to label these points. At each point, you have a tensor in the sense of a multilinear functional, but the emphasis is usually on the changes of coordinates.

Components, Indexes, and Coordinates

Tensors in the sense that we’re talking about here come in two flavors: covariant and contravariant. They also come in mixtures; more on that later.

We consider two coordinate systems, one denoted by x's and another by x's with bars on top. The components of a tensor in the x-bar coordinate system will also have bars on top. For a covariant tensor of order one, the components satisfy

\bar{T}_i =T_r \frac{\partial x^r}{\partial \bar{x}^i}

First of all, coordinates are written with superscripts. So x^r is the rth coordinate, not x raised to the power r. Also, this uses Einstein summation notation: there is an implicit sum over repeated indexes, in this case over r.
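NumPy's `einsum` uses exactly this convention: a repeated index letter means "sum over it." As a sketch (the matrix J below is a stand-in for the Jacobian entries ∂x^r/∂x̄^i, with made-up values), the covariant rule T̄_i = T_r ∂x^r/∂x̄^i reads:

```python
import numpy as np

# J[r, i] stands in for the Jacobian entry dx^r / dxbar^i (values made up).
J = np.array([[1.0, 2.0],
              [0.0, 1.0]])
T = np.array([3.0, 4.0])    # components T_r in the original coordinates

# Tbar_i = T_r * dx^r/dxbar^i: the repeated index r is summed over.
T_bar = np.einsum('r,ri->i', T, J)

# Equivalent matrix form: the transpose of the Jacobian times T.
assert np.allclose(T_bar, J.T @ T)
```

The index string `'r,ri->i'` is a direct transcription of the formula: `r` appears twice on the left and not on the right, so it is summed away.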

The components of a contravariant tensor of order one satisfy a similar but different equation:

\bar{T}^i =T^r \frac{\partial \bar{x}^i}{\partial x^r}

The components of a covariant tensor are written with subscripts and the components of a contravariant tensor with superscripts. In the equation for covariant components, the partial derivatives are with respect to the new coordinates, the x bars. In the equation for contravariant components, the partial derivatives are with respect to the original coordinates, the x's. Mnemonic: when the indexes go down (covariant tensors) the new coordinates go down (in the partial derivatives). When the indexes go up, the new coordinates go up.
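The classic example of a covariant vector is a gradient. Here is a numerical sketch, using a polar-to-Cartesian change of coordinates chosen for this illustration: treating (x, y) as the original coordinates and (r, θ) as the barred ones, the covariant rule applied to the gradient of f = x² + y² should reproduce the gradient of f = r² in polar coordinates, namely (2r, 0).

```python
import numpy as np

# Evaluate everything at one sample point (r0, th0).
r0, th0 = 2.0, 0.3

# Jacobian J[s, i] = d x^s / d xbar^i for x = r cos(theta), y = r sin(theta).
J = np.array([[np.cos(th0), -r0 * np.sin(th0)],   # dx/dr, dx/dtheta
              [np.sin(th0),  r0 * np.cos(th0)]])  # dy/dr, dy/dtheta

# Cartesian components of the gradient of f = x^2 + y^2, i.e. (2x, 2y).
x0, y0 = r0 * np.cos(th0), r0 * np.sin(th0)
T = np.array([2 * x0, 2 * y0])

# Covariant rule: Tbar_i = T_s * dx^s / dxbar^i, a product with J transpose.
T_bar = J.T @ T

# In polar coordinates f = r^2, so the components should come out (2r, 0).
assert np.allclose(T_bar, [2 * r0, 0.0])
```

Note that the partial derivatives in J are taken with respect to the new (barred) coordinates, matching the mnemonic: indexes down, new coordinates down.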

For covariant tensors of order two, the change of coordinate formula is:

\bar{T}_{ij} = T_{rs} \frac{\partial x^r}{\partial\bar{x}^i} \frac{\partial x^s}{\partial \bar{x}^j}

Here the summation convention says that there are two implicit sums, one over r and one over s.

The contravariant counterpart says:

 \bar{T}^{ij} = T^{rs} \frac{\partial\bar{x}^i}{\partial x^r} \frac{\partial\bar{x}^j}{\partial x^s}

In general, you could have tensors that are a mixture of covariant and contravariant. A tensor with covariant order p and contravariant order q has p subscripts and q superscripts. The partial derivatives have x-bars on bottom corresponding to the covariant components and x-bars on top corresponding to contravariant components.
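The order-2 covariant rule also transcribes directly into `einsum`, with both implicit sums spelled out in the index string. A sketch with made-up components (A stands in for the Jacobian entries ∂x^r/∂x̄^i):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))   # A[r, i] plays the role of dx^r / dxbar^i
T = rng.standard_normal((2, 2))   # covariant order-2 components T_rs

# Tbar_ij = T_rs * (dx^r/dxbar^i) * (dx^s/dxbar^j): sums over r and s.
T_bar = np.einsum('rs,ri,sj->ij', T, A, A)

# Equivalent matrix form: A^T T A.
assert np.allclose(T_bar, A.T @ T @ A)
```

For a mixed tensor, you would use one factor of the Jacobian per contravariant index and one factor of its "barred-on-the-bottom" counterpart per covariant index, one letter pair in the `einsum` string per summed index.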

Relation to Multilinear Functionals

We initially said a tensor was a multilinear functional. A tensor of order k takes k vectors and returns a number. Now we’d like to refine that definition to take two kinds of vectors. A tensor with covariant order p and contravariant order q takes p contravariant vectors and q covariant vectors. In linear algebra terms, instead of simply taking k elements of a vector space V, we say our tensor takes p vectors from the dual space V* and q vectors from V.
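For instance, a tensor with covariant order 1 and contravariant order 1 (a (1,1)-tensor, which in linear algebra is just a linear map) takes one covector and one vector and returns a number. A small sketch with made-up components:

```python
import numpy as np

# Components T^i_j of a (1,1)-tensor on R^2 (values made up).
T = np.array([[1.0, 2.0],
              [0.0, 3.0]])
w = np.array([1.0, -1.0])   # covector components w_i (an element of V*)
v = np.array([2.0, 1.0])    # vector components v^j (an element of V)

# The tensor applied to (w, v) is the number  w_i T^i_j v^j.
value = np.einsum('i,ij,j->', w, T, v)
assert np.isclose(value, w @ (T @ v))
```

Feeding in the vector alone, T^i_j v^j, leaves one free index: the tensor eats one argument and returns a vector, which is the familiar matrix-times-vector picture.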

Relation to Category Theory

You may be familiar with the terms covariant and contravariant from category theory, or its application to object-oriented programming. The terms are related. As Michael Spivak explains, "It's very easy to remember which kind of vector field is covariant, and which is contravariant — it's just the opposite of what it logically ought to be [from category theory]."



Published at DZone with permission of John Cook. See the original article here.

Opinions expressed by DZone contributors are their own.
