# Tensors - Part 2: Tensors as Multilinear Operators

### John Cook continues his series on tensors with examples of dot products and determinants.

The simplest definition of a tensor is that it is a multilinear functional, i.e. a function that takes several vectors, returns a number, and is linear in each argument. Tensors over real vector spaces return real numbers, tensors over complex vector spaces return complex numbers, and you could work over other fields if you’d like.

A dot product is an example of a tensor. It takes two vectors and returns a number, and it is linear in each argument. Suppose you have vectors *u*, *v*, and *w*, and a real number *a*. Then the dot product (*u* + *v*, *w*) equals (*u*, *w*) + (*v*, *w*), and (*au*, *w*) = *a*(*u*, *w*). This shows that the dot product is linear in its first argument, and you can show similarly that it is linear in the second argument.
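The two identities above are easy to verify numerically. Here is a small sketch (the vectors and scalar are arbitrary choices, not anything from the text):

```python
import numpy as np

# Arbitrary test data
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])
w = np.array([2.0, 0.0, -3.0])
a = 2.5

# Linearity in the first argument:
# (u + v, w) = (u, w) + (v, w)
assert np.isclose(np.dot(u + v, w), np.dot(u, w) + np.dot(v, w))
# (a u, w) = a (u, w)
assert np.isclose(np.dot(a * u, w), a * np.dot(u, w))
```

By symmetry of the dot product, the same checks with the second argument varied would also pass.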

Determinants are also tensors. You can think of the determinant of an *n* by *n* matrix as a function of its *n* rows (or columns). This function is linear in each argument, so it is a tensor.
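You can check row-by-row linearity of the determinant numerically as well. The sketch below uses an arbitrary 3 by 3 matrix and varies its first row:

```python
import numpy as np

# Arbitrary 3x3 matrix and an arbitrary replacement row
M = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [0.0, 5.0, 2.0]])
r = np.array([2.0, -1.0, 1.0])
a = 3.0

# Copies of M with the first row replaced
M_r = M.copy();      M_r[0]      = r            # first row -> r
M_scaled = M.copy(); M_scaled[0] = a * M[0]     # first row scaled by a
M_sum = M.copy();    M_sum[0]    = M[0] + r     # first row -> M[0] + r

# det is linear in the first row:
assert np.isclose(np.linalg.det(M_scaled), a * np.linalg.det(M))
assert np.isclose(np.linalg.det(M_sum),
                  np.linalg.det(M) + np.linalg.det(M_r))
```

The analogous checks hold for every row, which is what makes the determinant an *n*-linear functional of the rows.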

The introduction to this series mentioned the interpretation of tensors as a box of numbers: a matrix, a cube, etc. This is consistent with our definition because you can write a multilinear functional as a sum. For every vector that a tensor takes in, there is an index to sum over. A tensor taking *n* vectors as arguments can be written as *n* nested summations. You could think of the coefficients of this sum being spread out in space, each index corresponding to a dimension.
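For a tensor taking two vectors, the "box of numbers" is a matrix of coefficients and the sum is a double summation. A minimal sketch, with arbitrary coefficients chosen for illustration:

```python
import numpy as np

# Coefficient array: one index per vector argument.
# For a bilinear functional, the box of numbers is a matrix.
T = np.array([[1.0, 0.0],
              [2.0, -1.0]])
u = np.array([3.0, 1.0])
v = np.array([0.5, 2.0])

# T(u, v) as nested summations: sum over i, then over j
value = sum(T[i, j] * u[i] * v[j]
            for i in range(2)
            for j in range(2))

# The same double sum written with matrix products
assert np.isclose(value, u @ T @ v)
```

A tensor taking three vectors would need a cube of coefficients and a triple sum, and so on, with one index (one spatial dimension of the box) per argument.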

Tensor products are simple in this context as well. If you have a tensor *S* that takes *m* vectors at a time, and another tensor *T* that takes *n* vectors at a time, you can create a tensor that takes *m* + *n* vectors by sending the first *m* of them to *S*, the rest to *T*, and multiply the results. That’s the tensor product of *S* and *T*.
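The definition above translates almost verbatim into code. Here is a hedged sketch, where `tensor_product`, `dot`, and `first` are illustrative names introduced here rather than anything standard:

```python
import numpy as np

def tensor_product(S, m, T, n):
    """Given S taking m vectors and T taking n vectors, return the
    functional of m + n vectors that sends the first m to S, the
    rest to T, and multiplies the results."""
    def ST(*vectors):
        assert len(vectors) == m + n
        return S(*vectors[:m]) * T(*vectors[m:])
    return ST

dot = lambda u, v: float(np.dot(u, v))   # a tensor taking 2 vectors
first = lambda u: float(u[0])            # a tensor taking 1 vector

ST = tensor_product(dot, 2, first, 1)    # a tensor taking 3 vectors

u = np.array([1.0, 2.0])
v = np.array([3.0, 0.0])
w = np.array([5.0, 7.0])
# ST(u, v, w) = (u . v) * w[0] = 3 * 5 = 15
assert ST(u, v, w) == 15.0
```

Multilinearity of `ST` in each argument follows from linearity of `S` and `T` in each of theirs, since the other factor is held fixed.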

The discussion above of tensors and tensor products still leaves a lot of questions unanswered. We haven't considered the most general definition of tensor or tensor product. And we haven't said anything about how tensors arise in applications, or what they have to do with geometry or changes of coordinates. I plan to address these issues in future posts. I also plan to write about other things in between posts on tensors.

Published at DZone with permission of John Cook. See the original article here.
