Talk:Tensor/Archive 7
This is an archive of past discussions about Tensor. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Archive 1 | ← | Archive 5 | Archive 6 | Archive 7 | Archive 8 |
Discussing a new lead
I think that the suggestion for a new lead packs too many diverging intentions together. Chatul rightfully addresses locality and mentions the cross product as leading to another vector space, but that belongs to another ballpark. The same, regarding ballparks, I think about the concepts of manifolds, tensor fields, orientation, volume element and Hodge dual: I prefer not to see these mentioned in the lead of this article, and neither any symmetry groups and their representations. This is not to say that I would not wish to know all about these, but I think that this first-contact article should not venture (in the lead!) into such spheres, unavoidably necessary as they are in physics applications. I also see no advantage in mentioning the details of the tensor algebra, nor the finicky properties of the tensor product, not even of vector spaces and their duals, and also not the categorical aspects of universal properties, fundamental as they are for a mathy view on tensors.
I think that all the above mentioned concepts are necessary to talk about the full-blown tensor experience in physics or math, but I plead for keeping the content of this lead restricted to the introductory level that is sketched in the hat-note, and that I tried to feel bound to. I see no urgent encyclopedic need to discriminate arbitrary number collections from tensors, nor is it to me necessary to have ordered bases for stress tensors, nor to talk more about the non-tensoriality of the cross product than I did. The Levi-Civita symbol is just a nearby example of an order-three tensor. BTW, for obvious reasons, we have no metric tensor and no index juggling either, no mentioning of this being free in Euclidean space, ... There is so much more, sensibly left out of this article.
I take the liberty of explaining my reservations about details along this example snippet:
the Cauchy stress tensor T, a second-order tensor that can take a directional unit vector n as input and map it to the stress vector T(n) across an imaginary surface orthogonal to n -- this relationship between vectors is expressible as a matrix multiplication or tensor product, as shown in the figure (right).
- Tensors have not been introduced as maps, yet now they map. There is no reasoning about the given order of the tensor. There is no "imaginary property" of the plane to which the result refers. The stress, especially with tensors, is not across a plane, but at a point (the caption is still sloppy, but compare the content, please); just the orthogonal decomposition is with respect to the normal/the plane. A product of tensors (more precisely 09:50, 27 September 2018 (UTC)) is distinctly different from matrix multiplication, certainly not simply an alternative ("or"). I'd prefer to explain this example in detail rather than to give that lot of additional ones (WP:no textbook), and of non-tensors too.
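To spell out the difference in a fixed orthonormal basis (an illustrative sketch; the fixed basis is exactly the prerequisite under discussion): the traction vector arises by contraction,
\[ T(\mathbf{n})_i \;=\; \sum_{j=1}^{3} T_{ij}\, n_j , \qquad\text{i.e.}\qquad \mathbf{t} = T\,\mathbf{n} \]
as a matrix-vector product, whereas the tensor product \( T \otimes \mathbf{n} \) is an order-3 object with components \( T_{ij} n_k \) -- a different construction entirely.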
As regards examples, I agree with JRSpriggs that scalars (of any type) mostly make only poor illustrations, but I am incapable of presenting a good example that involves the curvature tensor, beyond the fact that it is constant along all of a sphere, but that would be a non-local claim. Certainly, this would be an excellent example, touching areas also outside of physics. As an aside, I perceive terms like "hypercubic shape" and "tensor exponential" as paradigmata of bad expert lingo (vulgo jargon).
This ended up in a TL;DR, pardon me, please! Just one last suggestion, and then Good Bye:
I suggest leaving the scope of the lead roughly restricted to the current content, and improving the text at places that do not meet an agreed-upon level. Purgy (talk) 14:05, 26 September 2018 (UTC)
- I've been too busy in real life to contribute to this discussion until now, but let me raise two points:
- I don't see any of the proposals being better than the current lede.
- Tensors are one of those classic math topics that can be approached differently by people with differing backgrounds (novice math students, experienced mathematicians, physicists, etc.). For such topics, there tends to be a lot of disagreement over "the right way" to do it. The current lede is already the result of compromises over many years.
- So I agree with Purgy's last statement, that the current lede should not be rewritten massively, even as it is of course revised in places. Mgnbar (talk) 21:50, 26 September 2018 (UTC)
- Purgy, thanks again for your thoughts. I have written a shorter proposal below in a new section, partly to meet your requests. I agree that there is some sort of balance to be reached between 1) too much info and context and 2) not enough info and context. It's a kind of art to provide information that gives a good amount of context and indication of relatedness to various topics, but not too much of it. I'm trying to strike the right balance, as I edit my proposal further. (And as I adjust, perhaps I'll be better prepared if, instead, I end up just tweaking the current lead.)
- Regarding the snippet you quoted, I did introduce tensors (in the first sentence) as objects that can map. And I did define the order of a tensor. There *is* an imaginary property of the plane to which the result refers: it's not a physical surface, it's a mathematical, hypothetical one. The stress *is* a stress over or across a surface: the surface provided by the directional surface-normal vector. (No disrespect, but you may want to review this topic to clear up this confusion. Or maybe I'll need some clarification if I am in fact mistaken.) I had taken the reference to the tensor product directly from what you had written in the caption, but I agree, you were mistaken to write "tensor product" there (and I was mistaken to copy what you wrote). I've changed it to "tensor contraction" now, but kept "matrix multiplication" too, since that's much more familiar to people.
- I think scalars are interesting because they take on a new character in the context of tensors, but I have taken out an example for brevity. I'll try to consider how I could meet my need for clarity and unambiguousness in the current article by editing it, but if you have further thoughts on what I've written, I'd appreciate hearing them.
- @Mgnbar: Thanks for your thoughts, too! I'll try to take to heart what you've said. It would be nice, though, if there were some kind of summary of the conclusions that people have reached about where to draw lines in style and content. (I guess the style guide is the general best attempt at that.)
- @Zeroparallax: as I already tried to hint in my previous comment, I intend not to comment on this topic any further, at least for the time being. This includes valuations of your suggestions, as well as any efforts to explicate who is mistaken where. As a final remark: in your RfCs you seem to largely underestimate the task of sensibly commenting on your suggestions, in the face of already given arguments and your barrage of tangential hints. Sorry, Purgy (talk) 09:50, 27 September 2018 (UTC)
- Purgy, thank you. (I think I'll withdraw even my shorter proposal now.) I appreciate the time and effort you've put into this article and into my somewhat naive attempt at a proposed re-write. I may have more proposals in the future, but I'll try not to sap everyone's energy. Take care.
Thoughts re (short) new lead proposal
To keep things clear, maybe we could put thoughts regarding the (short) new lead proposal here.
Zeroparallax (talk) 23:07, 26 September 2018 (UTC)
- So here's an example. Your proposal suggests that a tensor can be defined as an element of a tensor product. But other editors disagree. (Search for the phrase Not every element of a tensor product would be called a "tensor" by most mathematicians and physicists here.)
- So you need to find reliable sources supporting this assertion and your other assertions. Of course you know that. What I'm trying to convey is that it's not just a little detail that can be ironed out later. You have to do it now, preparing for each of your assertions to be challenged. The page is contentious.
- In a section above, you asked for a succinct summary of the consensus on how this topic should be treated. The answer is: the current lede of this article. If that answer does not satisfy you, then the next answer is: the talk page of this article, including all of its archives.
- I'm not trying to discourage you from improving the article. I'm just trying to convey that it's not going to be a quick task. Regards. Mgnbar (talk) 12:27, 27 September 2018 (UTC)
- Here's another example. Your proposal contains a description of what a "geometric" vector is, which I don't understand, despite having a Ph.D. in geometry. (I mention this not to argue from authority or intimidation --- you can't verify that I have a Ph.D., and even with one I could be wrong --- but just to convey that the average reader may not understand either.) I interpret this passage to be something about how physicists apply tensors only in certain situations. However, it's worth noting that the current lede also suffers from a flaw like this. Mgnbar (talk) 13:07, 27 September 2018 (UTC)
- @Mgnbar: Wow, thank you! Great points and examples! This is very helpful for me. (I've withdrawn my proposals. I may propose edits in the future, but I may just try to write a personal blog post instead in the meantime.)
- Zeroparallax (talk) 19:12, 27 September 2018 (UTC)
- Your posts remain gracious, even in response to pessimism from your fellow editors. Cheers. Mgnbar (talk) 19:36, 27 September 2018 (UTC)
- @Mgnbar: Thanks. I have a question for you, given that you've studied geometry deeply, probably especially some specific sub-topic of geometry. Do the following terms have definitions (even loose ones) that would be commonly accepted by geometers? geometry, geometric space, geometric object
- My understanding would be that "geometry" implies that there is some notion of space (~points) related by distance, curves/lines, and angles, which in higher dimensions leads to higher-level concepts like area and volume. Maybe there is not a need for continuity in the space, but usually there would be continuity. There is not a need for infinite divisibility; there could be continuous and quantized "points" (~"pixels"). And there is some notion of a "coordinate system" as a way of organizing and algebraizing the space (moving into analytic geometry). And a "geometric object" would be something with certain properties that are independent of the choice of coordinate system (such as length and direction, or effect upon other geometric objects).
- From reading some of the article on mathematical spaces, I get the idea that there is a fuzzy boundary between what is geometry and what is not geometry (and what may simply be "algebra"). So this, in my mind, would simply transfer a fuzzy boundary to notions of geometric space and geometric object. However, it wouldn't make these terms meaningless or useless. There are still clear cases. The terms can still be useful, with the acknowledgment that in some cases, it will be up to the discretion of the user/author whether to consider the topic/object to be geometry/geometric.
- Hmm, maybe some confusion could come from the fact that many seemingly "non-geometric" / non-physically-spatial "spaces" can be "geometrized" by creating abstract notions of "distance" and "angle". And then the question is *where* is the fuzzy boundary, and how many non-physically-spatial "spaces" do we accept as "geometric"?
- Although mathematical concepts are precise, the terms that mathematicians use, to talk about the sub-disciplines of mathematics, are not precise. For example, geometry and topology are sister sub-disciplines with a great deal of overlap. For another example, algebraic geometry is better classified as algebra than geometry, despite its name. And there are algebraic geometers who are basically complex analysts.
- Usually mathematicians call something "geometry" when there is some kind of rigidity (as opposed to the squishiness of topology), often because of some kind of metric. For example, you can pursue almost all of an introductory linear algebra course working purely algebraically. But as soon as you introduce inner products, then you have notions of length and angle and therefore geometry.
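- To make that last point concrete (standard formulas, added only as illustration): an inner product \( \langle\cdot,\cdot\rangle \) immediately yields
\[ \|v\| = \sqrt{\langle v, v \rangle}, \qquad \cos\theta = \frac{\langle u, v \rangle}{\|u\|\,\|v\|}, \]
notions of length and angle that a bare vector space does not carry.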
- My problem with the phrase "geometric vector" is that it does not appear to have a mathematical definition that would allow us to make unambiguous statements of the form "this vector is geometric and this vector is not". Rather, it seems to be something about how physicists use vectors to model relationships in space-time.
- I'm not saying that the term "geometric vector" cannot appear in the article. It is clearly important to some people who use tensors. It just seems to me to be a detail of how people apply tensors to physical situations, not something that is essential to what a tensor is, mathematically. So I would like to see it shifted far out of the lede, to the sections on applications. Mgnbar (talk) 23:04, 29 September 2018 (UTC)
- The term Geometry applies to more than metric spaces, e.g., Affine geometry, Projective geometry, Symplectic geometry; see Erlangen program. In particular, Differential geometry deals with many constructs for which a metric doesn't exist or isn't relevant, e.g., Differential forms. Note that continuity does not apply to finite projective space. Shmuel (Seymour J.) Metz Username:Chatul (talk) 18:15, 30 September 2018 (UTC)
Compromising holors
- There is no "tensorial array" without fixing a basis, so I think this should never be swept under the rug, when talking about tensors as arrays.
- There is nothing beyond "arbitrary arrays of quantities" in holors, afaik, therefore I rm the sentence. (This is not intended as a definition, but just to explain removing the duplicate phrase. 10:40, 10 October 2018 (UTC))
- Noting the abuse of "tensor", the "not strictly" is appropriate.
- The applications, to me absolutely alien, still should have refs.
- Why remove most prominent holors, like tensor densities (IIRC, the most prominent holors in Moon's book) and Christoffel's? "Holors" are the main topic in this paragraph.
- Explicitly referring to those unaware of the rare term "holor" might be impolite. (Who knows "merate"?)
- Tensors are not in need of a generalized setting for their algebra and calculus; holors are, compared to tensors.
- What errors were introduced?
- Ping doesn't seem to work in edit summaries.
How about presenting a preview of upcoming compromises on the TP? Purgy (talk) 08:30, 9 October 2018 (UTC)
- So a holor is just an array of numbers? It surprises me that this concept is deep enough to have enough secondary sources to be notable. (But then sometimes math surprises me.) Mgnbar (talk) 13:00, 9 October 2018 (UTC)
- I made my statement above more restricted. Purgy (talk) 10:40, 10 October 2018 (UTC)
- @Purgy: I will follow your suggestion regarding discussing proposed edits (even though you didn't follow your own suggestion with your last edit). Addressing your points above, in order:
- 1. True: There is no "tensorial array" without fixing a basis. But as is often done with "vectors", there can be some useful sloppiness in referring to a particular representation of a vector as, simply, a vector, when the context is clear. It could instead more clearly be called a vectorial array; likewise it depends on some chosen basis. If you want to remain explicit about the context (of some chosen basis) that is fine, although it can make the writing a little more clunky and may distract from the main point of this section. (Side note: Maybe it would be best if, outside of Wikipedia, we start adopting the abbreviated term "vector rep" so we don't have to say "vector representation" or "vectorial array". That could cut down on ambiguous usage of "vector".) In my proposed re-write below, I'll try to keep the context clear and not use ambiguous terms.
- 2. Your first point may be very relevant here... and now I think you are right about this. Holors seem to be only arbitrary arrays. There is just enough ambiguity in the book Theory of Holors to confuse me into believing that holors include more abstract mathematical objects. I was misled by sentences such as "Examples of holors are complex numbers, vectors, matrices, tensors, and other hypernumbers." I think now that when Moon and Spencer write "vector", they mean what I would call "vectorial array" (that is dependent upon a basis). And when they write "tensor", they mean "tensorial array" (that is dependent upon a basis). In my mind, I deduced that the definition of holors (taking into account the full context of the book) was better phrased as "a mathematical entity that *can be represented as* one or more independent quantities". (I think a definition like that would actually be more useful, but that's a separate point.) I think I was mistaken and you are correct.
- 3. Given the point I'm trying to make, this can confuse the issue. See my proposal below.
- 4. Ok, I'll provide a reference, but the reference more properly belongs in the article on holors than in this subsection of the tensors article. (Note that I disagree that any specific citation is necessary beyond what I already provided, because it is a straightforward application of the concept, but I'll provide a direct quote anyway.) "The simplest application of holors specifies a collection of discrete objects -- for instance, machine screws. Each merate gives the number of screws of a particular type. The complete holor designates the inventory of a particular tool room (as regards screws in stock)." -- from Theory of Holors, page 5
- 5. Holors are not the main topic of this subsection: the relationship between tensors and holors, and the way that holors generalize notions of tensorial arrays are the main topics of this subsection. In particular, the confusion about the meaning of "tensor" and how "holors" can clear up that confusion is the main point of this subsection. See my proposal below.
- 6. Ok, if you prefer a re-phrasing, I'll go along.
- 7. I'm not sure what you mean here. I wouldn't claim that tensors have needs. Tensors don't have needs; they're not people. What we can say is that "The concepts of holors and associated terminology provide a general setting for tensorial arrays and the algebra and calculus that they are involved in, including providing names and categories for non-tensorial objects that tensorial arrays interact with."
- 8. The error was mine. I now agree with you that holors are apparently simply arbitrary arrays (with associated useful terminology and notation).
- 9. Ok. (I meant ping me here in the talk pages to discuss edit ideas.)
- Alright, so, with all that in mind, here's my proposal for editing this section:
As discussed above, a tensor can be represented as a (potentially multidimensional, multi-indexed) array of quantities -- a tensorial array -- if, for tensors of order greater than zero, a basis for the related vector space is chosen. A common misconception is that a tensor is simply a multidimensional array -- a kind of generalization of vectors and matrices. But this is not the case (at least in the context of this article), since a tensor, when represented as a multidimensional array, must obey certain transformation properties when changing basis vectors or coordinates. So a tensorial array is an array, but an array is not necessarily a tensorial array. In particular, a tensorial array can be a multidimensional array, but a multidimensional array is not necessarily a tensorial array. (This may more sloppily be said as "a tensor can be a multidimensional array, but a multidimensional array is not necessarily a tensor", where "tensor" here refers to a tensorial array.)
The mathematical term "holor" was coined in part to help clear up this confusion. A holor is an arbitrary array, and includes tensorial arrays as a special case. Holors can be said to be a generalization of tensorial arrays, in particular because the terminology associated with holors provides a general setting for the algebra and calculus that tensorial arrays are involved in, including providing names and categories for non-tensorial objects that tensorial arrays interact with.
So tensorial arrays can be analyzed as a particular type of holor, alongside other non-tensorial holors, such as the Levi-Civita symbol, neural network (node and/or link) values, indexed inventory tables, and so on. And when encountering the term "tensor" generally, it may sometimes be more accurate to substitute inequivalent terms such as "holor" or "arbitrary array" or "multidimensional array", depending on the context and potential misusage. A code sketch of the distinction follows.
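To illustrate the distinction in code (a hypothetical sketch using NumPy; the matrices and the inventory data are inventions of this example):

import numpy as np

# Change-of-basis matrix P: columns are the new basis vectors written in the
# old basis (an assumption of this sketch).
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Components of a covariant order-2 tensor in the old basis.
T_old = np.array([[2.0, 0.0],
                  [0.0, 3.0]])

# A tensorial array must obey the transformation law T'_ij = P^k_i P^l_j T_kl:
T_new = P.T @ T_old @ P

# An arbitrary holor (say, an indexed inventory table) carries no such law;
# it is just data, and "changing basis" is not even defined for it.
inventory = np.array([[5, 7],
                      [1, 0]])

print(T_new)       # components of the same tensor in the new basis
print(inventory)   # an array that is not a tensorial array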
Zeroparallax (talk) 23:18, 9 October 2018 (UTC)
- I would rather that we not mention "holors" at all. Christoffel symbols and tensor densities (including Levi-Civita symbols) depend on a choice of basis and are very similar to tensors, just having a small modification to their transformation laws. So they should be described as tensor-like or pseudo-tensors. Other kinds of holors are entirely different and irrelevant. JRSpriggs (talk) 00:02, 10 October 2018 (UTC)
- @JRSpriggs: I, and the few people who have reviewed the book (Theory of Holors: A Generalization of Tensors) online (see Amazon), have found it to be conceptually quite useful, as have various authors of academic and technical material (as cited in the section on holors currently in the Parry Moon page). I think it's worthwhile to mention, in this section on generalizations of tensors, the book with that subtitle, especially since there is rampant confusion about tensors and this concept (and book) helps to point out clarification for that confusion.
- Sometimes it is most relevant to point out similarities between concepts (cats are like dogs, tensor densities are like tensors) and sometimes it is most relevant to point out dissimilarities (cats are not dogs, tensor densities are not tensors). That's how you gain clarity.
- To avoid similar lengthy developments, like in "Discussing a new lead", I cite my closing remark from there right at the beginning:
"This ended up in a TL;DR, pardon me, please! Just one last suggestion, and then Good Bye"
- It holds here as well. While I only marginally agree with removing the whole "Holor" section, I do think that holors do not help to understand the concept of tensors; rather they possibly provide a generalized setting, in which -- as you like to call it -- tensorial arrays may be embedded, together with other, let's call them pseudo-tensorial, arrays, and where even "indexed inventory tables" may be discussed. Nevertheless, the concept of holors does not seem to be notably accepted in the communities of either abstract or number-crunching mathematicians, or of physicists, and I never read about clerks doing inventories using this term. Holors do (almost?) nothing for the abstract concept of tensors, beyond providing a collective noun for just very specific manifestations (arrays) of several concepts, related by a similar behavior under transformations. Afaik, this property is often even considered as the defining one in physics, but is fully in the background in the math view, and even physicists mostly talk about "abstract indices".
- Summing up: holors do not generalize the abstract mathematical notion of tensors (universal property with multilinearity?), and are not accepted in the physicists' community. Holors do unify, to a certain extent, the treatment of basis-dependent (non-geometrical) representations of tensors, together with a possibly only vaguely specified notion of pseudo-tensors, at the expense of covering non-related terms; but this does not justify promoting the term by glossing over prerequisites and emphasizing unasked-for unifications. Sorry to suggest also here a withdrawal of your proposal, Purgy (talk) 10:40, 10 October 2018 (UTC)
- The non-tensor-like holors might better be called spreadsheets or relational databases. JRSpriggs (talk) 19:47, 10 October 2018 (UTC)
Ok, well, I've placed some of the content of my proposal into the section on holors. I'll accept the general sentiment here, which is to reject my proposal for this subsection in the tensors article. I believe that the subsection, as currently written, is quite awkwardly worded, but I'll leave touch-up edits to others, since my edits keep getting reverted. However, since I've provided the desired reference, I will delete the "citation needed" notices myself; if anyone wants to add the citation/quote I gave above (in bullet point 4), I think it's best placed (if at all) in the holors section of the Parry Moon page. Feel free to add the quote I provided above as a footnote there.
(Or if you really want it done, I can do it if you ask me to.)
Zeroparallax (talk) 22:57, 10 October 2018 (UTC)
Multidimensional arrays
I know little about tensors, but I've got a problem with the statement "A tensor may be represented as a (potentially multidimensional) array ...". My problem is that the array is NOT the tensor (a representation of a tensor) because the transformation properties are (obviously, I think) part of the tensor but are not part of any array. So, a tensor can be represented (in a particular basis, I think?) by a multidimensional array ACCOMPANIED by the transformation rules. I think that is correct? I also think two more things need to be discussed. First, for each index i, j, k, l, is it or is it not true that the "array" is square? I believe it is true. Secondly, is it or is it not true that each array is invertible (has an inverse)? Both are additional requirements which limit the multidimensional array (i.e. impose structure). It might also be worthwhile to comment on whether the components have to be, or usually are, scalars or complex or what. 72.16.99.93 (talk) 16:16, 24 October 2018 (UTC)
- The array is certainly not the tensor, but it is a representation of a tensor in a specific basis. The transformation properties of the representations are derived from the transformation between the bases. There is no need, or even possibility, for additional transformation rules.
- The matrix representing a rank 2 tensor in a particular basis is square, but need not be invertible. The transformation matrix is invertible, and is the tensor product of two matrices, each of which is either the basis transformation or its inverse, depending on whether the tensor is covariant, contravariant or mixed. The components, by definition, are scalars, but those might be, e.g., real, complex, or elements of a finite field. In physics the scalar field is usually the reals.
- As an example of a tensor that need not be invertible, consider the rank 2 energy-momentum-stress 4-tensor in Relativity; not only need it not be invertible, it vanishes in a vacuum. Shmuel (Seymour J.) Metz Username:Chatul (talk) 20:37, 24 October 2018 (UTC)
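- In symbols (the standard transformation laws, stated here only for reference, with P the invertible basis-change matrix): a doubly covariant representation transforms as \( T' = P^{\mathsf T} T P \), a doubly contravariant one as \( T' = P^{-1} T P^{-\mathsf T} \), and a mixed one as \( T' = P^{-1} T P \); only P need be invertible, not T.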
Problem with lead
The lead uses the Cauchy stress tensor as an example of a tensor. OK. The problem is the diagram (figure/illustration). The text describes T(v) as "the force (per unit area) exerted by material on the negative side of the plane orthogonal to v ...". Well, this is not ok. The diagram shows what are apparently 3 components of the tensor T, labelled T(ei) (i=1,2,3), which are normal to the (flat) face of what may or may not be a cube (but if it isn't then the only difference is that its faces don't meet at 90°). Nowhere(!!) is the vector v shown, nor is the relationship between this (arbitrary) cube and the vector v explained. I fail to see why you'd use a diagram which can't be easily related to the text. And clearly whether or not the 3 components of T are orthogonal to one another (which SHOULD also be clear and not a matter of guessing on the reader's part), they can NOT all three be "orthogonal to v" (since in 3D, it is a 2D plane which is orthogonal to a straight line.) Note that a far better diagram can be found in the article about the Cauchy Stress Vector, see figure 2.2 there (but note that the figure numbers are completely detached from any rational ordering sequence). I strongly suggest that that figure 2.2 be used instead of the one currently in place, although there the vector is called n, not v, so that some light editing should also be done. 72.16.99.93 (talk) 15:36, 24 October 2018 (UTC)
- The diagram is wrong, not the words which I put into the lead. There is no cube. The stress can be defined at any point. It has nine components which depend on the choice of an orthonormal basis at that point. JRSpriggs (talk) 21:21, 24 October 2018 (UTC)
- I hesitate to call the diagram flat-out wrong, but it certainly contains some incoherent notations. I think it stems from a traditional derivation of the stress tensor, employing an "infinitesimal cube" at some point of a solid, edges oriented along the basis vectors, which is put into equilibrium. From this the normal stress and two orthogonal shear stresses can be derived, all three orthogonal to each other, and with respect to a surface normal (v) and the associated plane. I am unsure whether another pic, focusing more on a plane that is not expressed within the tensor, might improve the illustrative value. This plane, given by its normal (3 components), is just the input to the stress tensor (9 components with structural constraints). Purgy (talk) 07:20, 25 October 2018 (UTC)
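- For reference, the decomposition alluded to above, in standard notation with unit normal \( \mathbf n \): the traction is \( \mathbf t = T\,\mathbf n \); its normal part is \( (\mathbf t \cdot \mathbf n)\,\mathbf n \), and its shear part \( \mathbf t - (\mathbf t \cdot \mathbf n)\,\mathbf n \) lies in the plane orthogonal to \( \mathbf n \).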
Why? Start with the why.
Hi, I got to this page from an article about Einstein's Theories of Relativity. I haven't studied this.
For new readers, starting with the "known" may be an easier approach to land these "unknown" ideas. My guess is that tensors were probably built up from the ideas of scalars/vectors, instead of reduced to them. I haven't studied it - just a hunch.
I'll throw this out there as an inaccurate, but fixable, introduction:
"Tensors describe the dramatic tension of the physical universe.
Physicists (notably Einstein as he looked for a Unifying Theory of physics) had a bunch of equations about motion, gravity, electricity, and stuff, that all had a similar structure - numbers, units, scalars, vectors, etc. E.g. 55 miles/hour, 110 volts, 30 knots north by northwest.
If we wanted to unify these phenomena (e.g. gravity and electricity) into a single "Physics Theory of Everything", we'd have to create a model to describe "what is a phenomenon like gravity?" or "what is a phenomenon like electricity?". If this model works for gravity and electricity, we can use it to discover new and interesting phenomena (examples of things we discovered).
Tensors were invented to describe these natural phenomena (e.g. gravity and electricity) using a single model. Physicists use tensors to look for places where our current understanding of physics works and (more interestingly) where it doesn't (what can't we describe with tensors? dark matter?, dark energy?, etc)."
Thanks!
Michaeljcole (talk) 19:12, 4 November 2018 (UTC) Michael — Preceding unsigned comment added by Michaeljcole (talk • contribs) 19:10, 4 November 2018 (UTC)
- Thanks for your interest in improving the article. You are right to propose changes to the introduction on the talk page. This article is somewhat contentious, partly because editors with varying backgrounds want to edit it, and partly because readers at various levels of education want to read it.
- As you know, your proposed text would need a lot of polishing. That's fine.
- I don't like one feature of your treatment. It presents tensors as if they exist to serve physics. Mathematicians would disagree. For example, Riemann invented Riemannian geometry many decades before Einstein used it to model gravity.
- Also, readers are continually pressuring us to tell them what the title of the article means, as quickly as possible. Your treatment is weak on defining what tensors are, either intuitively or precisely. Mgnbar (talk) 04:41, 5 November 2018 (UTC)
- The lede should link to Tensor field for applications in Physics, and details of those should be moved there.
- I suggest referring to tensors as algebraic entities and defining geometric vectors as vectors in the tangent or cotangent space for a point on a differentiable manifold. Shmuel (Seymour J.) Metz Username:Chatul (talk) 18:19, 5 November 2018 (UTC)
- I disagree. See earlier refutation of this viewpoint. Sławomir Biały 18:31, 5 November 2018 (UTC)
- I prefer the current version to both suggestions above, and also want to refer to the various refutations in previous, but still valid discussions. Purgy (talk) 08:16, 6 November 2018 (UTC)
Tensors as multilinear forms
Shouldn't tensors be defined as multilinear forms instead of multilinear maps ?
Something like:
"
Let V be a vector space defined over a commutative field of scalars F, and V* its dual space.
Tensors can be defined as multilinear forms from (V*)^p × V^q to F. F is typically (but not necessarily) the set of real or complex numbers.
"
It would be more accurate (precise), even though multilinear forms obviously also are multilinear maps.
I also believe it is better to start from that definition, from which all other definitions can be derived. Btw this is how Godement (from Bourbaki group) defines them.
Dummynerd (talk) 09:10, 15 May 2019 (UTC)
[1]
- A "form" is reserved for the case where the codomain is F and not a vector space thereover. The Cauchy stress tensor is an obvious example of a tensor whose codomain is not F.--Jasper Deng (talk) 09:12, 15 May 2019 (UTC)
Lead paragraph not very useful
"In mathematics, a tensor is a geometric object that maps in a multi-linear manner geometric vectors, scalars, and other tensors to a resulting tensor."
- This is circular: it attempts to describe tensor in terms of tensor.
- "multi-linear" links to "Linear relation" where "multi" does not appear.
- What does "resulting" mean? Resulting from what? Does this just refer to the right-hand side of "map"?
- "geometric" -- what does that mean? A later sentence says " meant to emphasize independence of any selection of a coordinate system". Independence of what? Subsequent paragraphs describe "Changing the basis transforms the values", which sounds like not independent of coordinate system.
This lead sentence appears to say that a tensor is a mapping of a tensor on the left to a tensor on the right.
tensorL --(tensorA)--> tensorR
... where the arrow itself is also tensor, let's call it tensorA.
Applied to the Cauchy stress tensor, this suggests that not only is the CST a tensor, playing the role tensorA, but also the input coordinate system or basis plays the role of tensorL, and the output values' space (ie: a 3x3 matrix of a subset of R) plays the role of tensorR.
Which raises the question -- in what way is a coordinate system a tensor? And how is a matrix of R a tensor? Gwideman (talk) 23:36, 28 May 2019 (UTC)
- A point in a coordinate system is a position vector, and a vector can be considered a tensor. A matrix A over R (not "of R") represents an order-2 tensor, whether it be the bilinear form \( (x, y) \mapsto x^{\mathsf T} A y \) or just the linear map \( x \mapsto Ax \).
- I'm not sure how it could be made clearer, since some tensors indeed are mappings of one type of tensor to another. However, multi-linear should link to multilinear map, not linear relation.--Jasper Deng (talk) 23:47, 28 May 2019 (UTC)
- Thanks for your speedy comment! You say "a vector can be considered a tensor". OK, what qualifies a vector to be considered a tensor? According to this article, to be a tensor, a vector would need to map a tensor to a tensor. What are those two tensors? Gwideman (talk) 00:01, 29 May 2019 (UTC)
- Also you say "some tensors indeed are mappings of one type of tensor to another" -- so some tensors are not? Yet that conflicts with this article's opening sentence. Gwideman (talk) 00:02, 29 May 2019 (UTC)
- I changed the first sentence to "In mathematics, a tensor is a geometric object, either a scalar, a geometric vector, or a multi-linear map from other tensors to a resulting tensor.". I hope you consider this an improvement. JRSpriggs (talk) 00:45, 29 May 2019 (UTC)
- I don't know whether it's an improvement. It maintains the aspect that I was having trouble with -- that apparently a tensor acts on a tensor to produce a tensor. I don't understand what the criteria for being a tensor are such that the thing acted upon, and the thing produced, themselves qualify as tensors. But following up on your wording -- you're saying that a scalar can be a tensor. Do you literally mean a plain number, such as "3.5"? What would be an example? Or do you mean a mapping to scalars, for example from conventional space (x,y,z) to a scalar value at each point? — Preceding unsigned comment added by Gwideman (talk • contribs) 02:35, 29 May 2019 (UTC)
- The intro does not attempt to actually define what tensors are. The definition is presented in later sections. This is not ideal, but it's hard to make an accurate, informative definition in the intro. Mgnbar (talk) 03:34, 29 May 2019 (UTC)
- To Gwideman: The hat note says in part, "This article is about tensors on a single vector space. For tensor fields, see Tensor field.". So for the purposes of this article, 3.5 is a scalar. As for the remaining circularity, it is no worse than defining a natural number as either zero or the successor of a natural number. Each use of the word tensor in the definition should have been previously determined to be a tensor by the definition. So more complicated tensors may be defined in terms of simpler tensors (only). OK? JRSpriggs (talk) 05:31, 29 May 2019 (UTC)
- It has other problems. A tensor is an algebraic object, and there are several equivalent (for finite dimensions) ways of defining a tensor. One way uses the tensor product, another multilinear maps. There is also the issue of mixed tensors, which have additional ambiguity. The lede should be neutral as to how to define a tensor and should explain that the concept is algebraic but that tensor fields have applications to geometry et al. Shmuel (Seymour J.) Metz Username:Chatul (talk) 19:01, 29 May 2019 (UTC)
- I agree that tensors are essentially algebraic (especially if they're not tensor fields), and that bringing geometry into it is a red herring. On the other hand, removing geometry from the lede seems to make the lede even harder to write. Mgnbar (talk) 19:47, 29 May 2019 (UTC)
- I wasn't suggesting that the lede shouldn't mention Geometry, just that it make clear that tensors are algebraic entities used in Geometry and elsewhere. Shmuel (Seymour J.) Metz Username:Chatul (talk) 14:50, 30 May 2019 (UTC)
- I agree that tensors are essentially algebraic (especially if they're not tensor fields), and that bringing geometry into it is a red herring. On the other hand, removing geometry from the lede seems to make the lede even harder to write. Mgnbar (talk) 19:47, 29 May 2019 (UTC)
- To clarify the issue with regard to mixed tensors, the definition assumes that all of the covariant factors are contiguous and that all of the contravariant factors are contiguous; it excludes tensors like those in \( V \otimes V^* \otimes V \). Shmuel (Seymour J.) Metz Username:Chatul (talk) 14:50, 30 May 2019 (UTC)
Tensor Analysis pertains to Tensor fields
The discussion of Tensor Analysis in Tensor#History belongs in Tensor field, not here. Shmuel (Seymour J.) Metz Username:Chatul (talk) 14:45, 5 June 2019 (UTC)
No mention of usage in Machine Learning
There is a huge community of researchers and practitioners in Machine Learning, most notably in Deep learning & Deep Neural Networks, that uses the term "tensor" in a way related, but not identical, to what this page describes. In ML, a tensor is generally described as an array data structure, *not necessarily* imbued with a notion of mapping other tensors to each other, or indeed not necessarily even contextualized within vector spaces. Here are two examples:
- "In some cases we will need an array with more than two axes.In the general case, an array of numbers arranged on a regular grid with avariable number of axes is known as a tensor." -- definition from "Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville, see http://www.deeplearningbook.org/contents/linear_algebra.html
- "A tensor is a generalization of vectors and matrices to potentially higher dimensions. Internally, TensorFlow represents tensors as n-dimensional arrays of base datatypes." -- definition from the TensorFlow package, see https://www.tensorflow.org/guide/tensors
The notion of a tensor on this page seems to be (mostly and primarily) as an operator, whereas the ML sense of it seems to be as a data structure. Somewhat akin to the linear algebra notions of Linear Transformation and Matrix, where the latter is a convenient structure to represent the former, but still possesses a definition of its own.
Evidence that "these tensors are not those tensors":
- The Tensor software page makes no mention of the most popular "tensor-based" Machine Learning packages, e.g. TensorFlow.
- The TensorFlow page does not link to Tensor.
What can be done to either reconcile the two concepts, or merge the apparent ML concept into this existing page? Maybe even a minor edit would go a long way, e.g. adding something like "tensors are often used as data structures in Deep learning to represent large numbers of related parameters, which may or may not possess the algebraic properties of tensors as mathematical operators."
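To make the contrast concrete, here is a minimal sketch (plain NumPy stands in for TensorFlow; the shapes are invented for this example):

import numpy as np

# An ML-style "tensor": a 3-axis array of parameters. No basis, no
# covariant/contravariant indices, and no transformation law is attached.
weights = np.zeros((32, 28, 28))

# Its "order" in the ML sense is simply the number of axes:
print(weights.ndim)  # 3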
Kenahoo (talk) 20:16, 6 June 2019 (UTC)
- From a quick glance at the articles, I could not tell whether they are truly working with tensors or merely doing matrix multiplication and addition. JRSpriggs (talk) 08:46, 7 June 2019 (UTC)
- It is definitely more than just matrix multiplication and addition, in at least two senses: 1) there are often more than two dimensions, often an arbitrary (or unknown) number of dimensions; 2) decomposition into useful components, for example, is a key ingredient in a lot of work.
- I just attended the 2019 IEEE Data Science Workshop, at which it seemed like 90% of the work presented involved tensor algebra. It could be instructive to check out the "Detailed Technical Program" to get a sense of what researchers are doing. For example, the slides for Tutorial 2: Tensors in Data Science are available. Kenahoo (talk) 18:09, 11 June 2019 (UTC)
- Tensors are huge within ML, particularly within Google. Largely because they're being used to support one approach to make these large tasks scalable, and also to break the problem small enough to run it across hardware farms of many cheap, efficient processors, such as the Google TPU.
- However there's also a problem that most people in ML don't understand either ML, the maths underlying it, or tensors. In particular, there's a tendency (we see it in the quote here) to confuse tensors with any other high-dimensional matrix.
- There's also a strand (my own interest) in displaying tensor glyphs, which are used for high-density data displays. A tensor field is mapped onto a two dimensional display, with each discrete point within that used to display a glyph representing the tensor. These are qualitative displays, more than quantitative, often relying on preattentive attributes to convey an 'overall impression' of complex data, at a glance. Andy Dingley (talk) 09:31, 7 June 2019 (UTC)
- I guess the essence of my question is whether Machine Learning researchers & practitioners are "confusing tensors and other high-dimensional matrices", or whether they've essentially created their own new modified definition of the term. At what point does a community's particular definition variant become worthy of its own page, or its own mention on an existing page? There must be precedent for other communities taking a mathematical term and using it for a slightly different meaning, which then becomes very popular. Kenahoo (talk) 18:09, 11 June 2019 (UTC)
- There are a lot of areas that use matrices that do not represent tensors in a particular basis, e.g., Linear regression. Fortunately, the statisticians seem to know the difference. Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:35, 7 June 2019 (UTC)
- Kenahoo: I have skimmed the Tutorial 2 that you linked. I couldn't detect any tensors (beyond the usual calculus used in optimization problems: scalars, vectors, Hessian matrices). Maybe there is a better example? Mgnbar (talk) 18:35, 11 June 2019 (UTC)
- Oops; I read Tutorial 1 by accident. What a waste. Yes, the linked Tutorial 2 does have tensors. With more than two indices. Kronecker products. Decompositions. Other editors would do well to take a look. Mgnbar (talk) 19:16, 11 June 2019 (UTC)
References
- ^ Roger Godement - Cours d'algèbre - Section 21-2
Image illustration isn't too helpful
While the image used in the introduction shows an example of a tensor, I feel that it is far from the ideal image to use to illustrate the concept of a tensor. The image shows only one example of tensors, and many readers may struggle to even properly interpret what it is trying to show. I feel that an example that either shows the more fundamental essence of what a tensor is, or an image that shows the variety of different concepts that may be considered a tensor (or does both) would be a much better aid towards readers' understanding of what a tensor is, than a (somewhat difficult to interpret) illustration of a single example of a tensor. This is a sketch of what that may look like. - Ramzuiv (talk) 22:29, 26 September 2019 (UTC)
Simplifying lead section
User MarkH21 added a template suggesting to shorten the lead paragraph, stating that there are unnecessary technical details that should be included in the main body of the article. I think this would be an acceptable way for the lead to be:
- In mathematics, a tensor is an algebraic object that describes a linear mapping from one set of algebraic objects related to a vector space to another. Objects that tensors may map between include, but are not limited to, vectors and scalars, and, recursively, even other tensors. Tensors can take several different forms – for example: a scalar, a vector, a dual vector, or a multi-linear map between vector spaces.
Note that I moved the mention of the relationship between tensors and vector spaces from the middle of the paragraph to the first sentence. I think this is important information that helps readers anchor the concept of tensors to a more commonly familiar concept. I also removed the "at a point" from the list of examples, since the lead should be about tensors themselves, not tensor fields. We can move these pieces of information to the main body, outside of the lead:
- Euclidean vectors and scalars (which are often used in elementary physics and engineering applications where general relativity is irrelevant) are the simplest tensors.
- While tensors are defined independent of any basis, the literature on physics often refers to them by their components in a basis related to a particular coordinate system.
I'd like to give others a chance to share their opinions before I shorten the lead myself, and I haven't taken the time to decide where the information being removed from the lead should be relocated. -Ramzuiv (talk) 00:47, 5 February 2020 (UTC)
- On further reflection, I realize that MarkH21 was probably not referring to the first paragraph by itself, but rather the first 8 paragraphs, all of which come before the table of contents. I very much agree that this is far too long for an introduction, but it does make most of what I wrote above less relevant to the problem (although I think my proposals might still be worth implementing) -Ramzuiv (talk) 02:27, 5 February 2020 (UTC)
- @Ramzuiv: Yes, indeed I was referring to the full eight paragraphs. But I do think your proposal is an improvement for the first paragraph. Go ahead and make the change! — MarkH21talk 02:49, 5 February 2020 (UTC)
- I've moved one paragraph from the introduction to the Examples section, and I moved some more paragraphs into a new section titled Properties. I will note that the introduction now doesn't make any mention of the transformation rules for tensors, such as the reason why the cross product isn't strictly a tensor. I believe that this detail can be concisely mentioned somewhere in the first paragraph, but I can't implement that right at this instant - Ramzuiv (talk) 02:54, 5 February 2020 (UTC)
Inner product.
I'd like to clarify an issue raised in edits by user:Kamil Kielczewski and user:Jasper Deng. An inner product is a contraction of an outer product, contracting one covariant index and one contravariant index for a vector space V and its dual V*. For tensors in an inner product space, the term includes contraction of two covariant or two contravariant indices using the inner product.
There's a similar concept in Exterior Algebra, but it's complicated by the issue of orientation. Shmuel (Seymour J.) Metz Username:Chatul (talk) 21:14, 20 February 2020 (UTC)
Merge article Tensor (intrinsic definition) into here
The article Tensor (intrinsic definition) should not be a separate page from this page, so I will propose merging it into this page
-Ramzuiv (talk) 23:42, 5 February 2020 (UTC)
- Oppose: The main tensor article is already very long. There used to be three different tensor articles on the en.Wikipedia - with traditional, abstract, and in-between descriptions as three parallel articles (I don't remember the precise names). Including Tensor (intrinsic definition) into the appropriate section here at Tensor would then lead to suggestions to again split apart the three approaches: a bunch of numbers; the geometric approach (multilinear maps); the algebraic approach (with tensor products). (Personally I would put the "bunch of numbers (arrays)" approach second and the geometric approach first, because I like geometry.) My guess is that the integration of the earlier versions took a lot of work and probably nobody would want to do a split. Boud (talk) 20:40, 15 April 2020 (UTC)
- One of the reasons that the article is so long is that it contains a lot of material that properly belongs in Tensor fields. I would like to see how much the article shrinks if that material is moved before considering the merits of the merger. If the size becomes manageable then I would support a merger; if not, not.
- BTW, I would consider both the multi-linear map approach and the quotient space approach to be algebraic, and tensor fields, with any approach, to be geometric. I would put the two intrinsic approaches first and second and put the numbers in terms of a basis approach third. Shmuel (Seymour J.) Metz Username:Chatul (talk) 23:52, 15 April 2020 (UTC)
About Removing the Wikilink for "Covector"
JRSpriggs, I removed the wikilink to covector because it redirected to the page "Linear functional" which was already linked. It seemed unnecessary to have 2 wikilinks that ultimately go to the same page. Proxima Centari (talk) 05:23, 15 February 2020 (UTC)
- The two of us may know that "covector", "linear functional", and "1-form" all mean the same thing (perhaps looked at from a different perspective), but that does not mean that the reader knows that. WP:Easter egg implies that the reader should not be expected to anticipate to what a link will take him. So the link should stay. JRSpriggs (talk) 11:07, 16 February 2020 (UTC)
- How about removing the wikilink but changing the wording to make it clear that they all refer to the same thing. I'd suggest also removing the wikilink to 1-form, except that it's a separate article rather than a redirect to linear functional. Shmuel (Seymour J.) Metz Username:Chatul (talk) 23:40, 15 April 2020 (UTC)
- Feel free to do the re-wording. If it becomes clear enough, I will accept dropping one of the links. JRSpriggs (talk) 10:05, 16 April 2020 (UTC)
- As a first cut, "covector, functional, linear form, 1-form, e.g. dipole moment, gradient of a scalar field". It occurred to me that some approaches to Differential Geometry define the cotangent space (covectors) first[a] and then define the tangent space (vectors) as functionals on that, and yet others define the tangent and cotangent spaces independently[b] and then define a bilinear map.
Does the literature ever regard the total derivative at a point as a tensor?
if U and V are vector spaces, and F : U → V is a differentiable function, then the total derivative of F at a can be considered to be an element of U* ⊗ V. Does the literature ever refer to it as a tensor? Shmuel (Seymour J.) Metz Username:Chatul (talk) 16:52, 12 July 2020 (UTC)
- If V is a space of tensors over U, then so will U* ⊗ V be also. JRSpriggs (talk) 15:50, 13 July 2020 (UTC)
- I'm asking about independent finite dimensional vector spaces. Put more concretely, if U = ℝ^m and V = ℝ^n, S ⊆ U is open and F : S → V is differentiable, does the literature ever refer to F′(a) as a tensor in U* ⊗ V? Shmuel (Seymour J.) Metz Username:Chatul (talk) 18:01, 13 July 2020 (UTC)
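- For reference, the identification being asked about (standard, assuming finite dimensions):
\[ DF(a) \in \operatorname{Hom}(U, V) \;\cong\; U^* \otimes V , \]
whose matrix, once bases are fixed, is the Jacobian \( \bigl( \partial F_i / \partial x_j \bigr)(a) \).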
Examples
I don't understand why my edit has been reverted. The gradient article defines the gradient of a scalar function f as ∇f = (∂f/∂x_1, ..., ∂f/∂x_n), which is clearly a vector field. It states already in the introduction that the gradient is a vector, while the derivative of a scalar function is a covector. 2003:EE:E713:2219:D2E:6199:1FBE:1BAA (talk) 21:17, 13 July 2020 (UTC)
- Your revision replaces gradient in cell (0,1) with Total derivative, which applies to vector valued functions. Did you mean differential of a scalar field?
- As a side note, would it be clearer to write "differential of a scalar field f" and "gradient of a scalar field f"? Shmuel (Seymour J.) Metz Username:Chatul (talk) 01:03, 14 July 2020 (UTC)
- The gradient article uses both "derivative" and "differential" for df, so I thought it doesn't matter which one to use. Writing "differential of a scalar field f" and "gradient of a scalar field f" is fine in my opinion. 2003:EE:E713:2255:A82B:842:F3B9:F069 (talk) 12:59, 14 July 2020 (UTC)
- Yes, but it doesn't use "Total derivative", which is something different: essentially the Jacobian. Shmuel (Seymour J.) Metz Username:Chatul (talk) 13:35, 14 July 2020 (UTC)
- See MTW page 59. Our article Gradient is wrong. The gradient is covariant. When applied to a scalar field, it is the same as the differential. JRSpriggs (talk) 05:19, 15 July 2020 (UTC)
- This is one of many cases where different authors[a] use different definitions[1] for the same term. I suspect that the references in gradient use the definition in the article, or equivalent. Shmuel (Seymour J.) Metz Username:Chatul (talk) 12:36, 15 July 2020 (UTC)
- People often assume that one definition is as good as another, that is, that the choice of a definition is arbitrary, so that no definition is wrong. This is a mistake. Some definitions are definitely inferior to some other definitions.
- In this case, using the MTW definition works even in spaces which lack a metric. While the definition in our article on gradient only works when there is a metric (and a single preferred metric at that). Thus our article's definition is wrong (inferior) compared to MTW's definition. JRSpriggs (talk) 20:49, 15 July 2020 (UTC)
- Do you really consider using a definition from Serge Lang to be arbitrary? The Mathematical literature is full of inconsistent nomenclature, and Lang's reputation[b] in Mathematics is unassailable.
- However, I believe that the gradient article should mention both definitions and cite sources. Shmuel (Seymour J.) Metz Username:Chatul (talk) 21:35, 15 July 2020 (UTC)
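- In coordinates the distinction at issue reads (standard formulas, for reference): the differential \( df = \frac{\partial f}{\partial x^i}\, dx^i \) needs no metric and is covariant, while the gradient \( (\nabla f)^i = g^{ij}\, \frac{\partial f}{\partial x^j} \) raises the index with a metric g and is contravariant; with the Euclidean metric in orthonormal coordinates the two have the same components, which is presumably why the definitions are so often conflated.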
Notes
References
- ^ Serge Lang (2002), Introduction to Differentiable Manifolds, Universitext, Springer, ISBN 0-387-95477-5
Mode
Is there sufficient interest to add a definition of 'modes' of a tensor? Could this go in the Properties section or is there sufficient ground to create a Terminology section? --- Preceding comment by 2.205.16.20.
- I've never heard this term. What does it mean (briefly)? (And please sign your talk page comments with four tildes, like ~~~~.) Mgnbar (talk) 13:15, 10 November 2020 (UTC)
- Likewise, I have never heard of it, and "modes" does not appear in the article. JRSpriggs (talk) 21:26, 10 November 2020 (UTC)
- It's widely used in computing research literature on tensors (e.g., see the frequently cited paper "Tensor Decompositions and Applications" by Kolda et al.). I have come to the understanding that people use it as a synonym for 'dimension'. However, I'm not an expert, so don't know the full story here, including why there's a difference in terminology. Thought it'd be good to raise since it wasn't mentioned in the article. 130.149.224.40 (talk) 14:08, 19 November 2020 (UTC)
- Thanks for that paper. It says, "The order of a tensor is the number of dimensions, also known as ways or modes." And in a footnote it adds that this concept is sometimes called rank, although that term has another meaning. So for example the Cauchy stress tensor has two modes. I can't judge whether this term is common enough to merit inclusion in this article. Mgnbar (talk) 15:47, 19 November 2020 (UTC)
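- A minimal sketch of the terminology (hypothetical NumPy illustration; the shapes are arbitrary):

import numpy as np

# The component array of the Cauchy stress tensor has two modes (ways, axes):
T = np.zeros((3, 3))
print(T.ndim)  # 2

# A three-mode array, the typical object in tensor-decomposition work:
X = np.zeros((4, 5, 6))
print(X.ndim)  # 3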
Disambiguate 'tensor' in the context of machine learning?
In machine learning applications, e.g., https://en.wikipedia.org/wiki/TensorFlow 'tensor' is often used merely to mean 'higher-dimensional array' and there is often no notion of covariant and contravariant indices or a basis. See for example this usage: https://www.tensorflow.org/guide/tensor#ragged_tensors . I found Wikipedia's tensor articles quite confusing when trying to understand what a tensor is in the context of machine learning. — Preceding unsigned comment added by 59.167.182.241 (talk) 13:03, 15 December 2020 (UTC)
- The machine learning usage of tensor is going to be an increasingly big issue for this article. A substantial and growing fraction of readers coming to this article are coming for the machine learning usage. What if we simply added a hatnote directing those readers to Array data type or something like that? Mgnbar (talk) 14:53, 15 December 2020 (UTC)
- I think that the issue is resolved. This article links to a disambiguation page, which mentions that tensors in machine learning are multidimensional arrays. That page now links to Array data type, which in its lede now defines tensors as multidimensional arrays. Good? Mgnbar (talk) 20:21, 15 December 2020 (UTC)
1-dimensional vector space over a field F
In the paragraph Definition: As a multilinear map, It is written that V can be a vector space over an arbitrary field F where the codomain can be a 1-dimensional vector space over the same field F.
A 1d vector space over F is isomorphic to F itself, is it not? I therefore propose simplifying the expression to state that the codomain is the field F itself. Victor bosman (talk) 09:11, 24 June 2021 (UTC)
- It could be argued that a 1-dimensional vector space is slightly more general (because the isomorphism is not unique). But I agree that it's not worth complicating the presentation. So I've changed it. Mgnbar (talk) 11:11, 24 June 2021 (UTC)
- Identifying one dimensional vector spaces over F would greatly complicate the presentation of the general case, since there is no canonical isomorphism between V and V*. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 11:25, 24 June 2021 (UTC)
- To clarify: You agree with the edit that I made today? (Just making sure.) Mgnbar (talk) 13:52, 24 June 2021 (UTC)
- Yes, that edit looks fine and is an improvement. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 18:16, 24 June 2021 (UTC)
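- Spelled out, the non-uniqueness (standard, for reference): if W is any 1-dimensional space over F, each nonzero \( w \in W \) gives an isomorphism \( F \to W, \ \lambda \mapsto \lambda w \), but a different choice \( w' = \mu w \) gives a different isomorphism, so none of them is canonical.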
Written by an academic living in his own world
Poor creature. We all are poor creatures living in our own worlds. — Preceding unsigned comment added by 193.146.80.130 (talk) 14:03, 5 October 2021 (UTC)
language or languages
@Chatul: Hi, after the word "different" we should use a plural noun. Here
describe the same geometric concept using different language and at different levels of abstraction.
is one case of the above rule. Why did you revert my edit? Note that at the same level of abstraction, there may be different languages too. Thanks, Hooman Mallahzadeh (talk) 04:37, 16 March 2022 (UTC)
- It's a peculiarity of English that a word sometimes has distinct meanings, and that the plural form is not used for all of those meanings. In this case, "language" can mean either a specific set of grammar and semantics or a manner of expression; the former may occur as either singular or plural, while the latter only occurs as singular. It's an idiom, and like most idioms doesn't bear close scrutiny.
- I noted with amusement that https://www.wordhippo.com/what-is/the-plural-of/language.html, while briefly describing the matter, used the word "countable", which to me means something very different to how a grammarian uses it.--Shmuel (Seymour J.) Metz Username:Chatul (talk) 08:37, 16 March 2022 (UTC)
- In this specific context, neither "language" nor "languages" is incorrect to me (as a native speaker of English). My gut reaction is to prefer "language". Mgnbar (talk) 10:17, 16 March 2022 (UTC)
- "Languages" would be appropriate if we were talking about, say English, French, German, and Russian, as whole methods of communication. "Language" here is referring to particular texts expressing the same or similar ideas within one language, English. It is being used as a mass noun, so we do not pluralize it (with "s"). JRSpriggs (talk) 17:05, 16 March 2022 (UTC)
Canonical isomorphisms
There are several approaches to defining tensor and tensor product, and in each of them (U ⊗ V) ⊗ W and U ⊗ (V ⊗ W) are canonically isomorphic; conventional usage is to write them both as U ⊗ V ⊗ W and ignore the difference.
There is a related issue when mixed tensors are involved, e.g., a mixed tensor of type (1,1) could be defined as an element of V ⊗ V* or as an element of V* ⊗ V; the two are canonically isomorphic.
I believe that the article should briefly mention this, and am not sure of the best place(s) to put the text. Shmuel (Seymour J.) Metz Username:Chatul (talk) 21:00, 22 December 2022 (UTC)
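For instance (a standard example; the notation is mine), the braiding map
\[ \tau : V \otimes V^* \to V^* \otimes V , \qquad \tau(v \otimes \alpha) = \alpha \otimes v , \]
extended linearly, needs no choice of basis; analogous shuffles identify any two orderings of the same covariant and contravariant factors.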