For example, in the beginning, the author describes tensors as things that behave according to the tensor transformation formula. This is already a very physicist kind of thinking: it assumes that there is some object out there, and we're trying to understand what it is in terms of how it behaves. It also uses the summation notation, which is rather foreign to non-physicist mathematicians. Then, when the article finally reaches the point where this is all related to tensors in the TensorFlow sense, we find that no reference is made to the transformation formula, purportedly so crucial to understanding tensors. How come?
The solution here is quite simple: what the author (and physicists) call tensors is not what TensorFlow (and mathematicians) call tensors. Instead, the author describes what mathematicians call a "tensor bundle", which is a correspondence that assigns to each point of space a unique tensor. That's where the transformation rule comes from: if we describe this mapping in terms of some coordinate system (as physicists universally do), the transformation rule tells you how this description changes under a change of coordinates. This setup, of course, has little to do with TensorFlow, because there is no space that its tensors are attached to; they are just standalone entities.
So what are the mathematician's (and TensorFlow's) tensors? They're actually basically what the author says, after a very confusing and irrelevant introduction about changes of coordinates of the underlying space. It is irrelevant because TensorFlow tensors are not attached as a bundle to some space (manifold), as they are in physics, so no change of space coordinates ever happens. Roughly, tensors are a sort of universal object representing multilinear maps: bilinear maps V x W -> R correspond canonically one-to-one to ordinary linear maps V (x) W -> R, where V (x) W is a vector space called the tensor product of V and W, and tensors are simply vectors in this tensor product space.
Basically, the idea is to replace weird multilinear objects with normal linear objects (vectors), which we know how to deal with using matrix multiplication and the like. That's all there is to it.
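That correspondence can be checked numerically. A minimal numpy sketch (the coefficient matrix M here is an arbitrary illustrative choice): a bilinear map B(v, w) = vᵀMw and the linear functional on V (x) W built from the same coefficients agree on every pure tensor v (x) w.

```python
import numpy as np

# A bilinear map B: V x W -> R, given by a coefficient matrix M,
# B(v, w) = v^T M w.  (M is an arbitrary example.)
M = np.array([[1.0, 2.0], [3.0, 4.0]])

def B(v, w):
    return v @ M @ w

# The same data as an ordinary linear functional L: V (x) W -> R,
# acting on vectors of the tensor product (here: flattened outer products).
def L(t):
    return M.flatten() @ t

v = np.array([1.0, -2.0])
w = np.array([0.5, 3.0])
t = np.outer(v, w).flatten()  # the tensor v (x) w as a plain vector

# The bilinear description and the linear-on-the-tensor-product
# description give the same number.
assert np.isclose(B(v, w), L(t))
```

The point of the sketch: nothing multilinear survives on the right-hand side; L is an ordinary linear map on a bigger vector space.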
The author is perfectly clear in the first sentence that the piece's focus is the usefulness of tensors in a physics context.
Technically, that's a tensor field, which is a section of the tensor bundle. Similarly, a vector field is a section of the tangent bundle (the collection of all the tangent spaces of the points on the manifold). A vector field is just a choice, for each point, of a tangent vector from that point's tangent space.
In grade school it drove me nuts when the homework required us to describe a word without using the word (or its Latinate siblings). And yet as an adult, few enough weeks go by in which some grownup doesn't try to pull that same trick.
If you think developers are guilty of circular logic, check out some of the math pages on Wikipedia. You can get lost in moments.
Is it just me or are we horrible at teaching advanced math? Where are the examples (with actual numbers)? Where is the motivation? Where are the pictures?
E.g. "Numbers (formal) are those objects which behave like numbers (informal)."
It is still a work in progress, but does it help address some of the problems you see in learning mathematics? Any feedback is greatly appreciated. Thanks.
A TensorFlow "tensor" (like most other uses of "tensor" in programmer jargon) is not a tensor at all; it's just a multidimensional array.
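For illustration (using plain numpy rather than TensorFlow, to keep the sketch dependency-free): a programmer's "tensor" is an n-dimensional array with a shape and a dtype, and nothing about it encodes a transformation law.

```python
import numpy as np

# A "tensor" in the programmer's sense: an n-dimensional array of numbers.
# It is purely a data container; no transformation rule is attached.
t = np.arange(24).reshape(2, 3, 4)  # what ML libraries call a rank-3 tensor

assert t.ndim == 3          # "rank" in ML jargon is just the number of axes
assert t.shape == (2, 3, 4)
```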
Matrices (or linear transformations in general) are important examples of tensors. There's a nice adjunction between tensor spaces A(x)B and spaces of linear transformations B=>C, given by:
Hom(A(x)B, C) = Hom(A, B=>C)
In the case of TensorFlow I think they do actually still talk about linear transformations of some kind, so it's perfectly fine to call them tensors.
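The adjunction above is just currying, and can be sketched in numpy (with flattened outer products standing in for elements a (x) b; the dimensions and test vectors are arbitrary illustrative choices):

```python
import numpy as np

# Hom(A (x) B, C) = Hom(A, B => C), sketched concretely.
dimA, dimB, dimC = 2, 3, 2
# F: a linear map A (x) B -> C, as a matrix acting on flattened a (x) b.
F = np.arange(dimC * dimA * dimB, dtype=float).reshape(dimC, dimA * dimB)

def uncurried(a, b):
    # F applied to the pure tensor a (x) b
    return F @ np.outer(a, b).flatten()

def curried(a):
    # The same F viewed as a map A -> (B => C):
    # feed in a, get back a C x B matrix, i.e. a linear map B -> C.
    return np.einsum('cij,i->cj', F.reshape(dimC, dimA, dimB), a)

a = np.array([1.0, 2.0])
b = np.array([3.0, -1.0, 0.5])

# Both sides of the adjunction compute the same thing.
assert np.allclose(uncurried(a, b), curried(a) @ b)
```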
Wald's approach in General Relativity is much better: he treats tensors as multilinear maps from vectors and dual vectors to scalars.
He then derives the underlying coordinate transformation rules for the vector spaces used in differential geometry. But
Most of the article tries to provide some intuition behind why multilinear maps, which sound like a fairly abstract concept, might be relevant in physics. The key link being the importance of coordinate invariance.
I didn’t go into deriving the coordinate transforms from the multilinear map definition as I didn’t feel that it’d provide much better intuition, but I did mention the equivalence near the end.
Yeah, the idea that there are pre-existing things that we're trying to describe is somewhat weird to me when we're trying to come up with a definition of a tensor. The whole point of mathematics is that you come up with the definitions and theorems fall out.
In particular, this comment is funny and speaks to some difference in how I and the author view what we're doing when defining a tensor:
> But why that specific transformation law - why must tensors transform in that way in order to preserve whatever object the tensor represents?
Because we defined it like that! When you make the definition "a tensor is a thing that follows X laws", you don't get to ask why; you just defined it!
Just a funny bit of phrasing, I get what is meant :)
That's just how it's presented in textbooks. It's obviously not how math is actually done.
>[the explanation in the OP] will make sense to physicist, but no sense to most everyone else, including mathematicians
And then went on to describe tensors in a way that is unfriendly to non-mathematicians by saying
> tensors are a sort of universal object representing multilinear maps: bilinear maps V x W -> R correspond canonically one-to-one to ordinary linear maps V (x) W -> R, where V (x) W is a vector space called the tensor product of V and W, and tensors are simply vectors in this tensor product space.
For example, the stress and strain calculations used for computing deformation (say, if you were rolling a sheet of steel in a mill) make use of tensors, and also of something called an "invariant". I assume that also comes from the physics/mathematics world.
Even as a physicist I found it highly confusing when I got told in physics classes that a tensor is "just a thing (or object) that behaves like so under coordinate transformation". Like, what do you mean by "thing"? I have no intuition for this yet; I need concise definitions! Fortunately I took a differential geometry class at the same time, which was really helpful.
I'm sure this has confused a lot of people, especially beginners. Clashing terminology is one of the main difficulties in interdisciplinary work, in my experience. I don't think it's good to shrug it off like that.
(Sort of like how vectors kind of got going via a list of numbers and then they found the right axioms for vector spaces and then linear algebra shifted from a lot of computation to a sort of spare and elegant set of theorems on linearity).
You're getting annoyed that people are confusing the map with the territory [1]. Multidimensional arrays with certain properties can be used to represent tensors, but aren't tensors. In the same way, a diagram of a torus isn't a topological space, a multiplication table isn't a group, and a matrix is not a linear map. Isomorphic, but not literally the thing.
Or you're annoyed that people forget an array representing a tensor needs to satisfy some transformation law and can't just be any big array with some numbers in it.
Or maybe you're a fan of basis-free linear algebra!
Which one is it?
1: https://en.wikipedia.org/wiki/Map%E2%80%93territory_relation
More importantly though, “tensors” as commonly used in machine learning seem to rely on a single special basis, so they really are just multidimensional arrays. A machine learning algorithm isn’t really invariant under a change of basis. For example, the ReLU activation function is not independent of a change of basis.
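That ReLU claim is easy to check numerically. A quick numpy sketch: if ReLU were independent of the basis, it would commute with any rotation of the input; a 45-degree rotation shows it doesn't.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

# A 45-degree rotation as a change of basis.
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, -1.0])

# Basis independence would require rotating-then-ReLU to equal
# ReLU-then-rotating. For this x the two disagree.
assert not np.allclose(relu(R @ x), R @ relu(x))
```

So an ML "tensor" lives in one fixed, privileged basis, which is exactly what distinguishes it from the geometer's tensor.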
One of my old physics professors taught us to think of tensors as "arrays with units." If it's a vector/matrix/higher dimensional array but has physical units, it's probably a tensor. The fact that it has units means it represents something physical which must obey additional constraints (like the coordinate system transformation rule).
https://www.youtube.com/watch?v=f5liqUk0ZTw
Very simple and basic.
Edit: incorrectly wrote vectors instead of tensors.
Definitely not beginner level.
Normally, you first study the distinction between vectors (which can be generalized to tensors) and scalars in a second-year Analytical Mechanics class. You also get a taste of tensors in the later material of Electromagnetism (also probably second-year). And you finally arrive at a rigorous definition of tensors when you take Mathematical Physics (second- or third-year, depending on your skills).
https://grinfeld.org/books/An-Introduction-To-Tensor-Calculu...
The kicker? All of them are tensors. A tensor is just a generalisation of the concept.
I am no licensed mathematician, so this could be off. However, every time I dive into this topic, I have to wade through way too much complex mathnobabble to arrive at that notion. So let's keep it simple: tensors are a mathematician's template for arrays of any dimension.
(I kid, but I think this is true, right?)
For example, if I have a vector x in V and a map T from V to W, then I would like the truth of T(x)=y to be independent of how I represent T and x.
Whether or not the concrete block breaks under that stress obviously does not depend on your choice of basis or units, so your transformation rules had better reflect that reality.
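The representation-independence of T(x)=y mentioned above can be verified numerically. A minimal numpy sketch (T, x, and the change-of-basis matrix P are arbitrary illustrative choices): re-expressing the map and the vector in a new basis preserves the truth of the equation.

```python
import numpy as np

# A linear map T: V -> W and a vector x, written in some chosen basis.
T = np.array([[2.0, 1.0], [0.0, 3.0]])
x = np.array([1.0, 4.0])
y = T @ x

# Change to a different basis via an invertible matrix P.
P = np.array([[1.0, 1.0], [0.0, 1.0]])
P_inv = np.linalg.inv(P)

x_new = P_inv @ x      # coordinates of the same x in the new basis
T_new = P_inv @ T @ P  # the same map T in the new basis
y_new = P_inv @ y      # coordinates of the same y in the new basis

# T(x) = y holds in the new representation too.
assert np.allclose(T_new @ x_new, y_new)
```

The transformation rules for tensors are exactly the bookkeeping that makes this kind of statement representation-independent.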