
Talk:Tensor/Archive 8


Multidimensional array vs. M-way array and M-order tensor

Pinging @Mgnbar:@JRSpriggs:@JayBeeEll:: Multidimensional array is an informal terminology that creates confusion. Using Wikipedia's preferred terminology, a 3-dimensional vector is a 1-dimensional array with 3 dimensions. Does the array have 1 dimension or 3 dimensions?

In CS/ML, the information of a 3-dimensional vector is stored in a 1-way array data structure with 3 dimensions. Here is another example that shows the use of the words order, mode and dimensionality. These words are closely related, but they refer to slightly different properties:

  • The information of a 3rd order tensor is stored in a 3-way array of dimensionality I1 × I2 × I3. The dimensionality of the 2nd mode of the third-order tensor is I2.

The order of a tensor is known in physics as the rank, but that old definition of the word rank is being deprecated. The word rank is now used in physics as one uses it in math.

Would it be best to start a "Tensor CS" wikipage? AlexMov (talk) 15:04, 22 December 2022 (PST)

Rather, I would say that the components of a vector in three-dimensional space may be represented as a one-dimensional array containing three (usually real) numbers. If you are careful about your use of language, there should be no confusion. "M-way" is not standard terminology, so it will cause confusion. What does "way" mean here? Why is "M-" needed; is it a number? JRSpriggs (talk) 23:17, 22 December 2022 (UTC)
@JRSpriggs: @JayBeeEll: I am not sure why you are fond of "Professor Scrotes" terminology, but it is inaccurate.
M-way array or multiway array is standard terminology in math and CS. See the references below:
Saying that a high-dimensional vector (such as a vectorized image) is "represented" by a 1-dimensional array with D numbers is "juvenile", as you like to say. A vector is stored in a 1-way array data structure with D dimensions. Please look at section III of the following paper, which discusses tangentially how tensors and "data tensors" came about in CS/ML and why language is very important:
http://web.cs.ucla.edu/~maov/CausalX_Block_Multilinear_Factor_Analysis_ICPR2020.pdf
Alexmov (talk) 00:45, 23 December 2022 (UTC)
An issue like this came up two years ago on this talk page; see Talk:Tensor/Archive_7#Mode. Machine learning people tend to use different vocabulary from differential geometry/abstract algebra people, who have traditionally edited this article.
At that time, the consensus was that there isn't much to say about the machine learning concept of tensor, that isn't already covered at Array (data type). Do you feel differently?
By the way, C^D is literally the set of D-tuples of complex numbers, but other D-dimensional vector spaces don't have canonical bases, and the representation of vectors as D-tuples depends on the basis. This is why people say "represented". Sorry if I'm telling you stuff you already know. Mgnbar (talk) 01:39, 23 December 2022 (UTC)
Obviously, we are discussing the same topic, but we have different backgrounds that color our perspectives. The terminology on the Tensor page creates confusion. Apparently, we are using the terminology array and dimensionality differently. In CS, an array is a data structure that stores information, the values of a point, vector, matrix or tensor. Both a "data tensor" (a collection of vectorized observations) and a tensor (a multilinear mapping from a set of range vector spaces to a domain vector space) employ an M-way array data structure to store their information. If you have a different definition then I would appreciate hearing it, @Mgnbar:. Isn't the representation relative to a set of basis vectors, or a more generic basis matrix? Hmmm ... are you using the word array as a synonym for the word matrix/tensor, as opposed to referring to it as a data structure that stores information?
Referring to an M-way array as a multidimensional array is strange. Saying that a "vectorized" image (a point in a high-dimensional pixel space) is a 1-dimensional array with N dimensions or N numbers creates an unnecessary discrepancy and confusion.
Yes, I think we need a section that addresses these terminology differences and gives examples of the words rank, order, mode and dimensionality in a sentence. Physicists used to use the word rank to refer to the order of a tensor, but that terminology is being deprecated.
Alexmov (talk) 03:33, 23 December 2022 (UTC)
I am emphatically not using "array" as a synonym for "tensor". On the other hand, yes, I regard a matrix as a kind of array.
It seems that the only examples of vector spaces that you consider are R^D and C^D. These have standard bases. Most vector spaces don't. That's important because a basis is exactly what creates a correspondence between tensors and arrays.
It's not yet clear that we're discussing the same topic. I'm discussing tensors (in the sense of math), and you seem to be discussing arrays. Mgnbar (talk) 03:46, 23 December 2022 (UTC)
Please give an example of how you use the words tensor and array differently and the relationship between these two words.
In CS, an array is a data structure that stores information. It is not a mathematical object. So, I am not certain why you are mentioning a canonical basis. A tensor, a multilinear mapping from a set of vector spaces to a domain vector space, is stored in an M-way array, where M = C + 1. For example, a multilinear mapping from C range vector spaces to a domain vector space is stored in an M-way array whose dimensionality is given by the dimensions of those vector spaces.
A matrix is a linear mapping from a range vector space to a domain vector space. The information is stored in a 2-way array whose dimensionality is given by the dimensions of the two vector spaces.
Alexmov (talk) 06:05, 23 December 2022 (UTC)
The metric tensor of a Riemannian manifold is a tensor field, and its value at a point is a tensor. Neither is an array, and there are no canonical bases for the tangent spaces. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 13:23, 23 December 2022 (UTC)
All of the above can be stored in an array data structure. What is the point? Alexmov (talk) 18:57, 25 December 2022 (UTC)
Suppose that you have an n-dimensional vector space V and an m-dimensional vector space W (over the same field, such as R or C). Let B be a basis for V and C a basis for W. Then, with respect to these bases, you can write vectors, linear transformations, and other tensors as arrays. For example, a linear transformation f from V to W can be written as an m × n matrix A. What's crucial is that if you choose different bases, say B′ and C′, then you get different arrays representing the same tensors. A tensor is not an array. Arrays are (non-canonical, usually) bookkeeping systems for tensors.
And this is not some nitpicking detail. In disciplines such as physics and geometry (which Chatul mentioned above), there is usually no God-given basis. For a piece of math to be meaningful, it needs to be invariant under whatever symmetries the space possesses (e.g. Poincare group). So there's a lot more going on here than arrays of data.
This is why I suspect that it's not useful to try to treat traditional tensors and machine-learning tensors in the same article. They have different audiences with different backgrounds and goals. Mgnbar (talk) 14:17, 23 December 2022 (UTC)
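A minimal NumPy sketch of the basis dependence just described (the linear map and the alternative basis below are invented for illustration): the same map is represented by two different arrays once the basis changes.

```python
import numpy as np

# Matrix of a linear map f : R^2 -> R^2 with respect to the standard basis (illustrative values).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# A hypothetical alternative basis B', whose vectors are the columns of P.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The array representing the *same* map f with respect to B' is P^(-1) A P.
A_prime = np.linalg.inv(P) @ A @ P

print(A)        # representation of f in the standard basis
print(A_prime)  # a different array -- same tensor, different bookkeeping
```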
  1. Mgnbar, you have argued that an array has a basis and I have disagreed. I have asked you to use the word tensor and the word array in a sentence. I am still waiting.
  2. The Array page is mixed up. It uses notions of basis as if an array were a mathematical object, as opposed to a data structure that stores information.
  3. You said: " m-dimensional vector space W (over the same field, such as R or C)." The complex numbers include the reals.
  4. This conversation was regarding a small change from "multidimensional array" to "M-way array" or "multiway array". While the above discussion was interesting, it is irrelevant to the proposed change, which is minor and alleviates a lot of confusion.
Alexmov (talk) 21:42, 23 December 2022 (UTC)
I never said that an array has a basis. That would be nonsense. I said that a finite-dimensional vector space has a basis.
In my last post, in the paragraph beginning "Suppose that you have...", there are four sentences that use both the word "tensor"/"tensors" and the word "array"/"arrays". So your wait, for me to use these words in a sentence, should be long over. Mgnbar (talk) 23:11, 23 December 2022 (UTC)
Oh, and about R vs. C: Yes, the latter contains the former. Consequently, any vector space over C is also a vector space over R. But the converse is not true. Mgnbar (talk) 23:22, 23 December 2022 (UTC)
No, a matrix is simply a rectangular array. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 18:20, 23 December 2022 (UTC)
If you mean to say that a matrix (a mapping between vector spaces) is a 2-way array, but a 2-way array is not necessarily a matrix, then I agree. Alexmov (talk) 01:39, 24 December 2022 (UTC)
A matrix is not the same thing as a mapping between vector spaces. --JBL (talk) 21:02, 24 December 2022 (UTC)
I mean to say that a matrix is not a mapping between vector spaces, it is a 2-dimensional array. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:25, 25 December 2022 (UTC)
Sounds like you are using the words matrix and 2-way array as synonyms. Alexmov (talk) 17:36, 25 December 2022 (UTC)
And Merriam-Webster[1] agrees with me. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 17:57, 25 December 2022 (UTC)
The Merriam-Webster definition does not agree with you. 1. A system of equations is a mapping between vector spaces. 2. A matrix can be stored in a 2-way array and it can even be said that a matrix is a 2-way array, but a 2-way array is not necessarily a matrix. However, none of this is related to the "multiway array" terminology discussion. Alexmov (talk) 18:13, 25 December 2022 (UTC)
The cited text from MW uses the term a rectangular array; how is that not agreement that a matrix is a two-dimensional array?
The matrix of coefficients of a system of linear equations is related to, but is not the same as, the system of equations.
Yes, I'm aware of, e.g., Pascal's triangle, but the point in dispute is not whether every array, or every two-dimensional array, is a matrix, but rather whether every matrix is a 2-dimensional array. Shmuel (Seymour J.) Metz Username:Chatul (talk) 14:00, 28 December 2022 (UTC)

Your comment does not make sense.

  • Let me repeat, I do not dispute that a matrix may be stored in a two-way array or that it may be said that a matrix is a 2-way array.

I dispute that a 2-way array is necessarily a matrix.

  • I question the use of the word represented. The first few sentences keep saying that a tensor is represented by an array. Is a tensor represented by an array, or is it stored in an array data structure? Alexmov (talk) 17:17, 29 December 2022 (UTC)
    Given a choice of basis, a tensor is represented as an array (just as a linear transformation is represented by a matrix); math does not have "data structures", but I suppose that on a computer one might store a matrix as an array data structure. --JBL (talk) 22:19, 29 December 2022 (UTC)
    Your comment is using words that are related, but it does not make sense. You are using the word array in a mathematical sense as opposed to a programming sense. I would appreciate a reference to a math textbook that discusses arrays. Alexmov (talk) 22:57, 29 December 2022 (UTC)


@JRSpriggs: you doubted the terminology "M-way array". I provided references showing that this is well-established terminology. This is a minor change that can alleviate a lot of confusion. Are there any other objections? Alexmov (talk) 06:47, 23 December 2022 (UTC)
It is fine with me if you introduce other terminology for the order of a tensor, as long as it's unobtrusive and cited. Regards. Mgnbar (talk) 23:11, 23 December 2022 (UTC)
Thank you for the conversation. Cheers! Alexmov (talk) 02:09, 24 December 2022 (UTC)
I have added three sentences at the beginning of section Multiway Array and three sentences at the end of the second paragraph. I tried to touch the document as lightly as possible. Alexmov (talk) 14:14, 24 December 2022 (UTC)
Well this is obviously not true, since you (once again) changed every instance of "multidimensional" (a clear and relatively non-jargony term, relative to the present context) for the niche jargon "M-way array". --JBL (talk) 21:03, 24 December 2022 (UTC)
@Mgnbar@Chatul@JayBeeEll@JRSpriggs
We discussed replacing the "multidimensional array" term with the more appropriate terminology "multiway array" or "M-way array", which avoids unnecessary terminology discrepancy or confusion. Therefore,
* I replaced 5 instances of "multidimensional array" with "M-way array" and changed the section title to "Multiway array", which was a given.
* I added three sentences at the beginning of the section Multiway Array and three sentences at the end of the second paragraph explaining the use of the terminology.
By most people's standards, I touched the document very lightly and adhered to the above discussion. @JayBeeEll@JRSpriggs, I do not understand the resistance to the proposed change, or the hostility. Perhaps the terminology "multiway array" or "M-way array" is new to your community, but we are having a very old conversation.
The terminology "multidimensional array" could be replaced with "multiway array (multidimensional array)" or "M-way array (M-dimensional array)" which keeps both terms, but depricates the "multidimesional array" as it was done in this reference:
https://www.sciencedirect.com/science/article/pii/0024379577900696
Alexmov (talk) 22:09, 24 December 2022 (UTC)
No, we discussed things that you believe to be true and that we (TINW) believe to be false. Your beliefs about what is appropriate, confusing or a discrepancy are not objective facts and are not shared by other editors in this discussion. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:25, 25 December 2022 (UTC)
You've mischaracterized a suggested edit that was backed by several references with thousands of citations, published over the last 40+ years. Other editors were already discussing the related "mode" terminology, Talk:Tensor/Archive_7#Mode, which is employed in the literature in lieu of an objectively discrepant use of the word "dimensionality". The second sentence in the Multidimensional Array section on the Tensor page uses two objectively discrepant definitions of the word dimensionality -- "a vector in an n-dimensional space is represented by a one-dimensional array with n components". This can be rewritten as -- an N-dimensional vector is stored in a 1-way array (1-dimensional array) with N components.
Please explain, why are you guys fan boys of @Professor Scrotes and biased in favor of his discrepant terminology?
Alexmov (talk) 17:00, 25 December 2022 (UTC)
Would you rather have someone else do the editing? Is that the issue? Alexmov (talk) 19:53, 26 December 2022 (UTC)
Please stop the argumentum ad hominem and be civil. Assuming that an editor is a fan boy of anybody is rude. The fact that multiple people disagree with you does not indicate that they are in a conspiracy, or even that they have heard of each other. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:46, 27 December 2022 (UTC)
I entered the conversation in good faith, provided references to present-day articles with thousands of citations, and pointed out that the Tensor webpage uses two different meanings of the word dimensionality. I explained the usage of the terminology. However, you chose to continue to mischaracterize a conversation on the merits. This behavior continues to beg the question of why you are biased in favor of @Professor Scrotes's objectively discrepant terminology when an established alternative terminology exists.
Alexmov (talk) 03:25, 28 December 2022 (UTC)
There are several ways in which you are a very difficult person to have a discussion with; one of them is that you make 30 or 40 edits over several days in order to write a short paragraph, but another is that you seem somewhat confused. Whoever User:Professor Scrotes is, they have made a grand total of one edit to this article (this one), which did not change any of the language we are discussing. This article is nearly 20 years old; the language "multi-dimensional arrays" has been in it since at least 2004. The help page Help:Page history has information about how to understand the editing history of a Wikipedia article, which you may find useful. --JBL (talk) 17:40, 28 December 2022 (UTC)
More broadly, it does not seem that you are engaged in the project of trying to understand your interlocutors. Since you are obviously in the minority in this discussion (so far no one has agreed with you beyond being willing to include an unobtrusive mention of the terminology "m-way array" or whatever), you can't simply browbeat the rest of us. Convincing other people to change their views generally involves understanding their views, which unfortunately you don't seem very interested in. As long as that continues, I don't think you're going to have any luck getting what you want. --JBL (talk) 17:40, 28 December 2022 (UTC)
Dear @JayBeeEll,
Please refrain from insulting me and mischaracterizing my behavior. I have stayed civil despite being told that I was "vandalizing" the web page with "juvenile" terminology or that my edits were "disimproving" the page with these changes (https://en.wikipedia.org/w/index.php?title=Tensor&diff=1127640645&oldid=1124797292). I patiently explained that there is a terminology clash in the 2nd sentence in the multidimensional array section,
  • "a vector in n-dimensional space is represented by a one-dimensional array with n components",
that can be averted with the following established terminology
  • "a vector in n-dimensional space is stored in a 1-way array with n components".
I provided references with thousands of citations substantiating the use of the terminology M-way array. The above mistake also touches on the concepts of order, mode and dimensionality of a tensor, in addition to the concept of an M-way array data structure and its dimensionality.

Both JRSpriggs and Chatul have slowly acknowledged that the word dimensionality is used in two different ways, after first denying that there was any problem. Mgnbar and I think that the document needs to be edited. I suggested a compromise, Kruskal's "N-way array (N-dimensional array)" terminology, which keeps both terms in place.
Considering the insults and the level of resistance I encountered for such a small change (https://en.wikipedia.org/w/index.php?title=Tensor&diff=1129287723&oldid=1124797292), it is not surprising that a broken terminology has not been corrected for the last 19 years.
No, I have not and will not acknowledge the false claim that the word dimensionality is used in two different ways in the article. The dimensionality of an array, a tensor or a vector space is the number of parameters needed to specify something. It's the same concept whether it's indices specifying a position in an array, coefficients specifying a vector in terms of a basis or abstract indices.
Now, there are other definitions of dimension in Mathematics, but they are not used in or relevant to this article. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 16:25, 1 January 2023 (UTC)
You said below "It is true ... with slightly orientations." which is a type of acknowledgement. When one says that a "3-dimensional vector is represented by a 1-dimensional array with 3 components (dimensions)", then the term dimensionality is used to refer to two different quantities in the same sentence. Alexmov (talk) 17:58, 6 January 2023 (UTC)

PS. Note that the tensor factor analysis field originally used the informal N-dimensional array terminology and later deprecated it, because it was clashing with the mathematical definition of dimensionality. In the early 2000s, when a crossover took place from psychometrics to CS, De Lathauwer et al. and Vasilescu and Terzopoulos were careful not to make the same mistakes.
  • (4609 citations) L De Lathauwer, B De Moor, J Vandewalle, "A multilinear singular value decomposition" SIAM journal on Matrix Analysis and Applications 21 (4), 1253-1278 https://doi.org/10.1137/S0895479896305696
  • (1167 citations) Vasilescu, M.A.O. and Terzopoulos, D. (2002). Multilinear Analysis of Image Ensembles: TensorFaces. In: Heyden, A., Sparr, G., Nielsen, M., Johansen, P. (eds) Computer Vision — ECCV 2002. ECCV 2002. Lecture Notes in Computer Science, vol 2350. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-47969-4_30
Unfortunately, Kolda and Bader in their review used the informal definition of a tensor in their first sentence and most people latched on to it.
Physics used the terminology rank when referring to the order of a tensor, which clashes with the traditional mathematics definition of rank, but it has been deprecating the rank terminology in favor of tensor order. Alexmov (talk) 02:30, 29 December 2022 (UTC)
It is a shame that you do not acknowledge your errors. It is really rather hard to have a conversation with a person who has so little regard for their interlocutors that they argue they were being "civil" while writing confused inanities like Please explain, why are you guys fan boys of @Professor Scrotes and biased in favor of his discrepant terminology?
Yes, the word "dimension" (not "dimensionality") is used in two different ways in the context of matrices and tensors. This is maybe arguably an unfortunate fact about Mathematical English, but it is not a "problem" to be solved, nor is it a "broken terminology", it is just the way the world is. --JBL (talk) 22:12, 29 December 2022 (UTC)
@JRSpriggs, contrary to your assertion, this is not my terminology. It is established terminology and I have provided evidence of this. Saying that you find established terminology difficult to understand is neither here nor there. Alexmov (talk) 04:48, 24 December 2022 (UTC)
Replacing the terminology "multidimensional array" with the well-established "multiway array" or "M-way array" removes unnecessary confusion. Saying that a 3-dimensional vector is stored in a 1-dimensional array with 3 numbers (dimensions) creates an unnecessary discrepancy. The word dimensionality ends up being used in two different ways in the same sentence. Let me repeat, "M-way array" or "multiway array" is not my terminology. Alexmov (talk) 05:20, 24 December 2022 (UTC)
There was no confusion and no discrepancy to remove. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 15:25, 25 December 2022 (UTC)
That is untrue. The second sentence in the Multidimensional Array section, "Just as a vector in an n-dimensional space is represented by a one-dimensional array with n components", uses two different definitions of the word dimensionality in the same sentence. Other editors were already debating the "mode" terminology, which is employed in the literature to avoid a discrepant use of the word dimensionality; see Talk:Tensor/Archive_7#Mode.
It is true, and it is the same definition for both cases, with slightly different orientations. A one-dimensional array is one dimensional because it takes a single scalar (an integer in this case) to indicate which element you are referring to, while an n-dimensional vector space is n dimensional because it takes n scalars (from a field in this case) to indicate which element you are referring to, in terms of a particular basis. Would you object to "The red paint is in a green can"? --Shmuel (Seymour J.) Metz Username:Chatul (talk) 02:00, 27 December 2022 (UTC)
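A tiny NumPy illustration of the two usages sitting side by side (the values are arbitrary):

```python
import numpy as np

# A vector of a 3-dimensional space, stored in a one-dimensional array.
v = np.array([1.0, 2.0, 3.0])

print(v.ndim)   # 1    -- a single index picks out an entry of the array
print(v.shape)  # (3,) -- that index ranges over three values
```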
"It is true ... with slightly orientations" Full stop. There is an established terminology that fixes the discrepant use of the word dimensionality. I provided references with thousand of citations, and an explanation of the usage.
* An N-dimensional vector is stored in a 1-way array with N components (dimensions).
* If a matrix has 5 rows and 6 columns, are its dimensions 2 or 5x6?
* A 3rd order tensor has 3 modes of dimensionality I1, I2 and I3, respectively. The tensor is stored in a 3-way array of the same dimensionality.
This is a minor fix. Why the resistance?
Alexmov (talk) 06:18, 28 December 2022 (UTC)
  • (4326 citations) Tucker, L. R. (1966). Some mathematical notes on three-mode factor analysis. Psychometrika, 31(3), 279–311. https://doi.org/10.1007/BF02289464
  • (1815 citations) Joseph B. Kruskal, (1977) Three-way arrays: rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics, Linear Algebra and its Applications, Volume 18, Issue 2, 1977, Pages 95-138,ISSN 0024-3795, https://doi.org/10.1016/0024-3795(77)90069-6
  • (1306 citations) CA Andersson, R Bro (2000) "The N-way toolbox for MATLAB" Chemometrics and intelligent laboratory systems 52 (1), 1-4
  • (4609 citations) L De Lathauwer, B De Moor, J Vandewalle, "A multilinear singular value decomposition" SIAM journal on Matrix Analysis and Applications 21 (4), 1253-1278 https://doi.org/10.1137/S0895479896305696
  • (717 citations) P.M. Kroonenberg (2008) Applied Multiway Data Analysis. Wiley Series in Probability and Statistics. Vol. 702. John Wiley & Sons.
Alexmov (talk) 07:35, 28 December 2022 (UTC)
Indeed, the word "dimensions" is widely used in two ways that clash slightly (a matrix is a "two-dimensional" arrangement of scalars but the "dimensions of the matrix" are 5×6). Personally when I'm teaching linear algebra I try to talk about the "shape" or "size" of a matrix to help not confuse my students between the two different uses of the word "dimension". But the fact of the matter is that these dual uses are both completely standard. Apparently you (and maybe some other people) feel this is regrettable and should be changed; this view might even be correct. But the language "m-way array" is nowhere near as common or widespread as "m-dimensional array", and Wikipedia by construction is conservative, not a venue for radical language changes. --JBL (talk) 17:54, 28 December 2022 (UTC)
Misusing terminology is very common, but that is not the same thing as being a standard. You are using the appeal-to-popularity fallacy to justify your opposition to the suggested minor edit. The Wikipedia page is reinforcing two mistakes.
* Saying that "a vector in an n-dimensional space is represented by a one-dimensional array with n components" is two kinds of wrong.
* Suggesting that "A N-dimensional vector is stored in a 1-way (1-imensional) array data structure with N components" is a radical edit is grasping at straws. The mistake in the Tensor webpage touches on several poorly defined concepts and their use - the order, modes and dimensionality of a tensor, in addition to the the concept of a N-way array datastructure and its dimensionality. By most people's standards I touced the document lightly (see https://en.wikipedia.org/w/index.php?title=Tensor&diff=1127640645&oldid=1124797292), and I was in the process of adding references when you reverted all my edits.
Given the insults I've had to endure, it is not surprising that the mistake has lasted unedited for the last 19 years on the Tensor webpage.
PS. You might be underestimating your linear algebra students. Introduce the vocabulary and its proper usage and they might surprise you. Where do you teach?
Alexmov (talk) 23:50, 28 December 2022 (UTC)
Are there any other objections? Alexmov (talk) 17:05, 25 December 2022 (UTC)

I have added the "way" terminology with two references. There is after all no reason to omit this term. However, it is noteworthy that many of the references that Alexmov mentions (Kruskal 1977, Kolda and Bader 2009, etc.) use some version of "multi-dimensional" in addition to some version of "multi-way". So I think that our article, despite its imperfections, is adequately summarizing the literature in this regard. Mgnbar (talk) 01:01, 29 December 2022 (UTC)

Originally, some tensor factor analysis people used the terminology N-dimensional matrices or N-dimensional arrays, which clashes with the traditional mathematical definition of dimensionality, but they deprecated it. Kruskal deprecated the terminology "N-dimensional array" by putting it in a parenthetical, "N-way array (N-dimensional array)". During the crossover from psychometrics to CS, most of us (Rasmus Bro et al., De Lathauwer et al., Vasilescu and Terzopoulos) have been very careful to use the term N-way array. Unfortunately, Kolda and Bader's review used an informal definition of a tensor in the first sentence and most people have latched on to it. This has been causing problems ever since.
Alexmov (talk) 2:50, 29 December 2022 (UTC)
First question: Currently the article prefers "multi-dimensional" and merely mentions "multi-way". Is your view that it should prefer "multi-way" and merely mention "multi-dimensional"? Or that "multi-dimensional" should be removed entirely? Or what?
Second question: How do you feel about "multi-way" vs. "m-way"? The latter option has a problem, in that we have to introduce m as the total order/wayness of the tensor first. The total order is meaningful in some cases (e.g. when we restrict to orthogonal changes of basis of the underlying vector space) but not meaningful in general. That's one of the issues lurking behind the scenes of this debate. Mgnbar (talk) 03:15, 29 December 2022 (UTC)
> Currently the article prefers "multi-dimensional" and merely mentions "multi-way". Is your view that it should prefer "multi-way" and merely mention "multi-dimensional"?
I think the article ought to use the terminology "multi-way" but mention "multi-dimensional", since it is popular. If one wanted to use layman's language, one could use terms such as a string of numbers, a grid of numbers, and a cube or hyper-cube of numbers.
> How do you feel about "multiway" vs. "m-way"?
Both terminologies are equally good.
> The latter option has a problem, in that we have to introduce m as the total order/wayness of the tensor first. The total order is meaningful in some cases (e.g. when we restrict to orthogonal changes of basis of the underlying vector space) but not meaningful in general. That's one of the issues lurking behind the scenes of this debate.
I agree about the lurking issues. An M-order tensor may be stored in an M-way array, but if the tensor is decomposed by tensor ring/train then it is represented as the product of a set of 3rd order tensors, where each third-order tensor is stored in a three-way array. Is that what you are getting at? I have always treated the words vector/matrix/tensor as mathematical concepts and arrays as data structures that store information, as opposed to represent information. I get the impression that it might be different in your area of research.
How do you talk about a tensor ring or tensor train decomposition? Alexmov (talk) 06:53, 29 December 2022 (UTC)
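A rough NumPy sketch of the storage pattern described above (the shape and TT ranks are invented for illustration): an order-3 tensor is recovered by contracting three 3-way cores.

```python
import numpy as np

# Hypothetical tensor-train (TT) cores for an order-3 tensor of shape (4, 5, 6).
# Core k is stored in a 3-way array of shape (r_{k-1}, n_k, r_k), with boundary ranks r_0 = r_3 = 1.
shape, ranks = (4, 5, 6), (1, 2, 3, 1)
cores = [np.random.rand(ranks[k], shape[k], ranks[k + 1]) for k in range(3)]

# Contracting the cores over their shared rank indices recovers the full 3-way array of values.
T = np.einsum('aib,bjc,ckd->ijk', *cores)
print(T.shape)  # (4, 5, 6)
```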
I am referring to the fact that a tensor of type (p, q) has total order/wayness m = p + q, but this number has no significance in general, so it would be odd to focus on it. Mgnbar (talk) 07:45, 29 December 2022 (UTC)
I am ok with using multiway array. It will fix the dimensionality problem. Alexmov (talk) 07:56, 29 December 2022 (UTC)
What do you think of these changes? Where I currently have the word M-way array, I'll replace it with "multiway array (multidimensional array)". https://en.wikipedia.org/w/index.php?title=Tensor&diff=1127640645&oldid=1124797292 Alexmov (talk) 08:07, 29 December 2022 (UTC)
Sorry, but I think that that proposal is a disaster. First, it makes "data tensors" primary, in conflict with the rest of the article. Second, it then breezily asserts an equivalence to multilinear maps, as if the later sections (that establish that equivalence) don't matter at all.
Let's try a different tack. You seem to be interested in decompositions of data tensors and their applications. Based on what I know of the SVD for matrices, I don't doubt that they're important. Is that material covered anywhere on Wikipedia? It doesn't make sense to add it to Array (data type), right? Nor does it make sense to add it to this article. Is this a sign that we need a new article? Mgnbar (talk) 08:19, 29 December 2022 (UTC)
Btw, a tensor train/tensor ring decomposition represents an M-order tensor with a product of several (p,q)-tensors.
Do you object to replacing 6 instances of multidimensional array with "multiway array (multidimensional array)"?
Alexmov (talk) 08:23, 29 December 2022 (UTC)
Unfortunately I'm traveling and don't have access to my books. So I cannot evaluate whether "multi-dimensional" is a standard phrasing, in the math subjects that this article traditionally involves. Perhaps one of the other math editors can address that. If it's not standard phrasing, then it's okay with me to replace it with "multi-way", which is standard in one discipline at least. :)
In general I am worried about your making systemic edits to this article. I wish that you would Wikipedia:Draft an article about tensors in machine learning or whatever your interest is, rather than potentially damaging this article, which already decently (albeit imperfectly) addresses its subject. That would help us all figure out what the division of labor is, whose terminology should prevail where, etc. Regards, Mgnbar (talk) 19:05, 29 December 2022 (UTC)
Btw, you added the terminology "ways" to a tensor. A tensor has multiple modes and an array has multiple ways. I've tried to edit it, but the edit may not survive @JayBeeEll. Are you one and the same person but responding from two different accounts? Alexmov (talk) 23:20, 29 December 2022 (UTC)
Oh good lord. --JBL (talk) 00:11, 30 December 2022 (UTC)
Indeed good lord. I am becoming exasperated with you and Professor Scrotes. I do not appreciate you constantly reverting my edits. Alexmov (talk) 00:16, 30 December 2022 (UTC)
@Mgnbar, I changed following:
  • The total number of indices required to identify each component uniquely is equal to the dimension of the array, and is called the order, degree, rank, or number of ways of the tensor.
to this
  • The total number of indices required to identify each component uniquely is equal to the dimension or the number of ways of an array, and is called the order, degree, rank or modes of the tensor.
Tensors have modes and arrays have ways. However, @JRSpriggs promptly deleted the edit. I reverted the undo, but it may not survive. Alexmov (talk) 23:14, 30 December 2022 (UTC)
Yes, I have seen your edits, I understand their purpose, and they seem fine to me. Mgnbar (talk) 23:27, 30 December 2022 (UTC)
I made the following grammar changes and term introductions.
  • The total number of indices (n) required to identify each component uniquely is equal to the dimensions or the number of ways of an array. This is why an array is referred to as an n-dimensional array or an n-way array. The total number of indices is also called the order, degree, rank or modes of the tensor.
@JRSpriggs if you have a problem with this change. Please state it, rather than just reverting.
Alexmov (talk) 03:14, 31 December 2022 (UTC)

Warning to future readers: Because one editor of this thread repeatedly edited posts after other editors had responded to them, the final version does not exactly capture how the discussion progressed. Mgnbar (talk) 07:33, 5 January 2023 (UTC)

Yes, and without marking the changes with <del>deleted text</del> and <ins>inserted text</ins> or with {{deltext|deleted text}} and {{ins|inserted text}}. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 14:11, 5 January 2023 (UTC)
Several editors engaged in non-linear responses, replying to messages after other editors had responded to them.
What is indisputable is that I was trying to fix the second sentence, which uses the term dimensionality in two different ways. JRSpriggs misrepresented the edit I performed in his call for "help": "@Alexmov: has been repeatedly vandalizing Tensor, replacing standard terminology with infantile expressions, e.g. 'Multidimenional maps' with ' M-way array'. Please help." (https://en.wikipedia.org/wiki/Wikipedia_talk:WikiProject_Mathematics#Tensor). While JRSpriggs has stayed above the fray after the initial call for "help" (i.e. a call to bully), for some unexplained reason several editors volunteered themselves to get dirty despite lacking expertise, arguing for the sake of arguing.
Any edits I would perform would reflect the discussion in Sec. 3 and Sec 2 of this paper (http://web.cs.ucla.edu/~maov/CausalX_Block_Multilinear_Factor_Analysis_ICPR2020.pdf) or the terminology evolution of this literature:
* (4326 citations) Tucker, L. R. (1966). Some mathematical notes on three-mode factor analysis. Psychometrika, 31(3), 279–311. https://doi.org/10.1007/BF02289464
* (1815 citations) Joseph B. Kruskal, (1977) Three-way arrays: rank and uniqueness of trilinear decompositions, with application to arithmetic complexity and statistics, Linear Algebra and its Applications, Volume 18, Issue 2, 1977, Pages 95-138,ISSN 0024-3795, https://doi.org/10.1016/0024-3795(77)90069-6
* J.B. Kruskal. Rank, decomposition, and uniqueness for 3-way and n-way arrays. In R. Coppi and S. Bolasco, editors, Multiway Data Analysis, pages 7–18, Amsterdam, 1989. North Holland.
* (1306 citations) CA Andersson, R Bro (2000) "The N-way toolbox for MATLAB" Chemometrics and intelligent laboratory systems 52 (1), 1-4
* (4609 citations) L De Lathauwer, B De Moor, J Vandewalle, "A multilinear singular value decomposition" SIAM journal on Matrix Analysis and Applications 21 (4), 1253-1278 https://doi.org/10.1137/S0895479896305696
* (717 citations) P.M. Kroonenberg (2008) Applied Multiway Data Analysis. Wiley Series in Probability and Statistics. Vol. 702. John Wiley & Sons.
* (1167 citations) Vasilescu, M.A.O. and Terzopoulos, D. (2002). Multilinear Analysis of Image Ensembles: TensorFaces. In: Heyden, A., Sparr, G., Nielsen, M., Johansen, P. (eds) Computer Vision — ECCV 2002. ECCV 2002. Lecture Notes in Computer Science, vol 2350. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-47969-4_30
Alexmov (talk) 17:52, 6 January 2023 (UTC)
Gathering reliable sources on the talk page, adding your post to the end of the thread, and not editing your post after others have responded to it (I hope) are all great. Thanks. Mgnbar (talk) 04:52, 7 January 2023 (UTC)

References

  1. ^ "Merriam-Webster Dictionary". Retrieved December 25, 2022. 5 a: a rectangular array (see array entry 2 sense 5) of mathematical elements (such as the coefficients (see coefficient sense 1) of simultaneous (see simultaneous sense 2) linear equations) that can be combined to form sums and products with similar arrays having an appropriate number of rows and columns

Tensor defined in terms of an array data structure

The first two sentences in the Multidimensional Array section define a tensor in terms of array data structures.

  • A tensor may be represented as an array (potentially multidimensional). Just as a vector in an n-dimensional space is represented by a one-dimensional array with n components

The first instance of the word "array" and "one-dimensional array" were linked to a CS article defining an array data structures. JayBeEll has removed the first link. We now have a tensor defined in terms of the word "array" that is no longer defined. Alexmov (talk) 06:53, 7 January 2023 (UTC)

The entire first paragraph of the section "As multidimensional arrays" is devoted to the project of explaining what a multidimensional array is (a collection of entries [from an unspecified set of allowable entries], indexed by a fixed number n of indices, such that the ith index takes integer values in an interval {1, ..., d_i} for some positive integers d_i -- in which case we say it is an array or tensor). Certainly I think we can agree that, at present, it does not do a good job of explaining that. However, the irrelevant links to articles about data structures made the explanation worse, not better. --JBL (talk) 17:38, 7 January 2023 (UTC)
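In symbols (an editor's paraphrase of the above, with the set of allowable entries left unspecified), such an array is just an indexed family

    A = \bigl(A_{i_1 i_2 \cdots i_n}\bigr), \qquad 1 \le i_k \le d_k \quad (k = 1, \dots, n),

with one entry for each admissible choice of the n indices; no data structure or computer representation is implied.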

There is a confusion between data types (more specifically abstract data types) and data structures. I agree with the removal by JayBeeEll of all links to array (data structure), as it is not what is discussed in the article. On the other hand, the (abstract) data type array is exactly the formalization of what mathematicians call informally an array. So, I strongly suggest replacing the removed links with links to Array (data type)#Abstract arrays and Array (data type)#Multi-dimensional arrays (IMO, the beginning of this section clarifies well the terminology ambiguities at the origin of the above very lengthy discussion). D.Lazard (talk) 18:13, 7 January 2023 (UTC)

Do you have a reference that discusses vectors, matrices or tensor in terms of array data types? 108.46.155.167 (talk) 11:02, 8 January 2023 (UTC)
I disagree with D.Lazard that a "data type" is relevant here, because I think the word "array" is used in a completely sensible way outside any c.s. context. Here are some references, with quotes.
  • Gelfand, Kapranov, Zelevinsky "By an r-dimensional “matrix” we shall mean an array [formula] of numbers"
  • Bremner, Bickis, Soltanifar "Cayley’s hyperdeterminant is a homogeneous polynomial of degree 4 in the 8 entries of a 2×2×2 array."
  • Bremner title: "On the hyperdeterminant for 2×2×3 arrays"
  • Hillar, Lim: "For the purposes of this article, a 3-tensor A over F is an l × m × n array of elements of F"
  • Stavrou, Low: "An order-n tensor is an element of the tensor product of n vector spaces. By fixing a basis, we can associate these elements with n-dimensional arrays."
  • In Lim we get a (false) assertion that "array" is an undefined term in mathematics, followed immediately by the (contradictory, and true) assertion that it means exactly the same thing as what Lim calls a "multi-indexed object"
The point is not that these individual publications are particularly significant (with the exception of GKZ, I don't think they are), it is that it took me only a few minutes to find them, and what they illustrate is that the use of the word "array" in the way I described is both widespread and totally innocuous; none of these authors aside from Lim even contemplates the idea that readers might not understand this use. --JBL (talk) 17:57, 9 January 2023 (UTC)

Where is v in the picture of stress tensor?

As in the subject line. This should be clarified further. 1.47.151.62 (talk) 13:39, 12 February 2023 (UTC)

I rewrote the caption. What do you think? Is it clear now, that e1, e2, e3 are three examples, of what v could be? Mgnbar (talk) 21:46, 12 February 2023 (UTC)
e1, e2, e3 (possible values of v) are not shown because the picture is cluttered with vectors and they are already defined in the caption. e1 is a vector of unit length in the direction of σ11. Similarly for the other two. JRSpriggs (talk) 16:34, 13 February 2023 (UTC)
To clarify: When the anonymous poster made that post, e1, e2, and e3 were not in the caption. You are commenting on a different version. Regards, Mgnbar (talk) 16:39, 13 February 2023 (UTC)

Reorg of Tensor article

I propose that the Tensor article is very confusing due to the overloading of this term across disciplines. There is the Tensor (disambiguation) page, but this is insufficient, as a beginner-level search for "Tensor" in Wikipedia takes you immediately to this complex mathematical page. I would propose the following restructuring of articles for Tensors:

1. Tensor (mathematics) - current landing page, renamed to make it clear that this discusses the most complex use of the term: as a multilinear map. I understand this is historically/mathematically the most important, but it's also extremely confusing to beginners that may be looking for the other meanings. The multilinear map is mathematical as it defines operators (maps) over scalars, vectors, matrices and higher dimensions.

2. Tensor (field) - this is the "applied" version of the mathematical object, used primarily in physics, chemistry, etc., which considers the tensor object present at every point in a field. There is already a page for it at Tensor field. Just add the use of parens in "Tensor (field)" for clarity, since the word "tensor" alone is often used to refer to the field. I know they have a deeper interpretation as a multilinear map, but separating these pages clearly helps guide the user to their level of detail/abstraction.

3. Tensor (type) - this is the meaning which many people are searching for in machine learning. It is simply the extension of data to multiple dimensions beyond vectors and matrices -- not constrained to the rules of use in fields or multilinear maps, but constrained to specific operators in ML. Currently this page redirects to Array (data type), but this is hardly discoverable, and insufficient since the use of tensors in this field has additional meanings which are specific to machine learning, such as backpropagation and Tensor Trains (TT), which would make sense on a Tensor (type) page but not on the Array (data type) page. This Tensor (type) page could then be extended with derivations of backpropagation over tensors, etc.

My suggestion is to make the "Tensor" page itself a disambiguation page, which then takes you to one of these three. Being directed first to the most complex mathematical object is inherently confusing for people who are searching for the other forms but don't know enough yet. The current redirection link to the disambiguation page is not enough, since the single word Tensor itself is overloaded. People with no experience, searching for "tensor", expect the very broadest level of understanding, which is the outline above as a disambiguation leading to detailed use in each field. Ramakarl (talk) 01:14, 16 February 2023 (UTC)

— Preceding unsigned comment added by Ramakarl (talkcontribs) 19:16, 8 February 2023 (UTC)

First, I have moved your post to the end of the talk page. Also, please sign your posts with four tildes: ~~~~.
I agree that readers interested in machine learning will be an increasingly large audience for tensors on Wikipedia, and I agree that this article is not ideal for them. This issue has come up a couple of times in recent years, including a huge discussion about a month ago (see above).
As it stands, those readers should arrive at this page, realize that it is not what they wanted, follow the hat note, and arrive at Tensor (disambiguation), which ultimately sends them to Array (data type). In your view, should this option be made more prominent?
While we're discussing it, should Wikipedia have more content on tensors as used in machine learning, beyond the simple fact that a tensor is a multi-dimensional array? What would such an article contain? Answering that question can help us decide the proper division of labor, which topics should be primary, etc. Mgnbar (talk) 19:26, 8 February 2023 (UTC)
should Wikipedia have more content on tensors as used in machine learning, beyond the simple fact that a tensor is a multi-dimensional array? Clearly yes; this topic could be its own article. What would such an article contain? Someone should ask some machine learning experts to come give feedback. –jacobolus (t) 23:20, 8 February 2023 (UTC)
Upon re-reading your post, I see that you have already proposed some specific content for an article on tensors in machine learning. Sorry for overlooking it. My advice is to write a Wikipedia:Draft of that article. Then, especially after it is promoted to the main space, we can better evaluate what the high-level organization of these articles should be. Mgnbar (talk) 21:54, 8 February 2023 (UTC)
@Mgnbar Thank you. I will consider writing a draft. Regarding the comment "those readers should arrive at this page, realize that is not what they wanted..." I feel this presumes a lot about the user searching for the term "tensor". Often beginners will use single words as a starting point, ie. "tensor". They may not know enough either about what they want, what they're seeing, or about the 'hat', to know what to do next. This is not simply the case of disambiguation of subsumed or cultural uses. Consider the word Signal, a function that conveys information. The Signal (disambiguation) page presents several subsumed examples of signals (i.e. analog, digital, etc.) along with abbreviations and cultural uses. These are different kinds of signals. The various uses of the word "tensor" however, are not all subsumed under the mathematical or physics definition here. Like it or not the term "tensor" is now overloaded in common use. Hence I feel tensor is more similar to a word like Discrete, whose top level term takes you directly to a referring page as a starting point. An even better example is the word Depression, which is a singular word used differently in various fields of study (e.g. geological vs mental health). Is there a good description of 'tensor' that refers to the mathematical object, the physics object, and the machine learning object? While they can all be represented by multi-dimensional arrays this refers to storage representation, rather than meaning. I believe tensor is now overloaded enough to be considered as a single word with distinct uses by field. Ramakarl (talk) 02:09, 16 February 2023 (UTC)
The text that you've cited from Hongyu Guo describes the math/physics meaning of tensor, which is what this article currently covers. The machine learning meaning is inspired by the math/physics meaning (as far as I can tell) but is different.
I suspect that the machine learning meaning will come to be more popular/prevalent/searched than the math/physics meaning. So I suspect that you are right about the article hierarchy in the long term.
The problem is: It's difficult for us to plan articles abstractly. Sometimes we go through the pain of debating and reorganizing, and the required material never actually shows up, or it shows up and we realize that our plans were wrong anyway. It's much better to have a bunch of actual text, and then figure out how it should be split into articles, what they should be named, which article should be primary, etc.
Editors who support the machine-learning meaning of tensors have not yet provided this text. That's why Wikipedia's current treatment of machine-learning tensors boils down to a redirect to Array (data type). Mgnbar (talk) 02:39, 16 February 2023 (UTC)
@Mgnbar Ok. I have drafted a new article on tensors here: Tensor (machine learning). A few important comments about this draft. My goal was to create a basic article that was structurally complete with the essential points, history and definitions, with the understanding that others in the ML community could continue to improve it. With the sections and references provided, I don't consider it a stub or placeholder, but a completed draft that I hope could stand on its own while allowing community revision. This is a broad topic which couldn't be written by just one person. One reason for discussing here in wiki talk is to avoid writing a big article, and having it disappear immediately upon publishing. I hope this process of drafting, talking with you (and others) could lead to a stable article. Some additional points: The references are there, but I'm not sure how to reuse refs so there may be dups. Also, I put in first authors but I'm not sure how wiki prefers to handle multiple authors. Overall, let me know what you think. Ramakarl (talk) 04:15, 17 February 2023 (UTC)
@User:Ramakarl: This draft is fantastic. I mean, it needs a little polishing, but it's more than enough to serve as a launchpad. I'm going to edit it a bit and then help shepherd it from draft space to main space (hopefully soon). Then we can have a more concrete discussion of disambiguation pages, etc. Mgnbar (talk) 07:54, 17 February 2023 (UTC)
@Mgnbar The page for Tensor (machine learning) is now live, thanks to your help and edits. My suggestion is for the current Tensor page to be moved to Tensor (mathematics) and the Tensor (disambiguation) page to be moved to Tensor. Similar to how the Wikipedia terms are handled for the words depression or discrete. Also, what is the article Tensor (intrinsic definition) and how is it different from Tensor (mathematics)? Ramakarl (talk) 04:35, 21 February 2023 (UTC)
It appears to be a basis-independent description of tensors. The title is misleading because it only describes one of the common definitions. I believe that it needs a description in terms of multi-linear maps as well. It might be a candidate for a merge. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 16:31, 21 February 2023 (UTC)
I tend to agree with Ramakarl, that this article should be moved to Tensor (mathematics) and Tensor should move/redirect to Tensor (disambiguation). But I don't feel very strongly. If we had data saying that an overwhelming fraction of readers search for the machine learning meaning rather than the math/physics meaning, then I would feel more strongly. Mgnbar (talk) 00:39, 22 February 2023 (UTC)
I don't feel the hat/redirect is sufficient for the many novices to ML or wikipedia that will come here. Ramakarl (talk) 16:17, 23 February 2023 (UTC)
I'm agnostic on the move itself, but I believe that Tensor or Tensor (mathematics) should include a hatnote like {{distinguish|Tensors (machine learning)}} in place of #Machine learning, and that the first paragraph of the renamed (from #Applications of tensors of order > 2) #Computer vision and optics should be moved to the top of #Applications or made a separate subsection with the original name and mention, e.g., the Riemann curvature tensor field, which is a tensor of order 4.
I urge liberal use of hatnotes, e.g., {{distinguish}}, to guide the reader who lands on an article for an unintended meaning of tensor. Also, the phrase tensor field is a standard term of art and changing it to tensor (field) would just confuse readers. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 04:37, 16 February 2023 (UTC)
NB. A Tensor field is also a mathematical object, not an application, and is a function on a manifold, not on a field. --Shmuel (Seymour J.) Metz Username:Chatul (talk) 12:02, 22 February 2023 (UTC)

Distinction between math/physics and machine learning tensors

At Talk:Tensor (machine learning)#Merge into Tensor Article, a source has been found that says that machine learning tensors are merely arrays, and that this is a/the key distinction between machine learning tensors and math/physics tensors.

Please chime in there (not here) if you wish to contribute to this particular discussion. Mgnbar (talk) 12:25, 27 June 2023 (UTC)

Swap History and Examples sections

I propose swapping the two sections, then adding a more detailed writeup of physically-relevant examples like the Ricci tensor, curvature forms, curl, grad, ..., and the equations they appear in, which is probably more useful to learn the history than a wordy history section. MeowMathematics (talk) 22:06, 29 September 2023 (UTC)

Adding a more concise section after the introduction

I assume that people come to this article below the introduction to get either 1) the definition of what a tensor is, quickly, or 2) essentially a book chapter on tensors that they can read part or all of. Currently the article serves 1) horrendously, so I made an edit to fix that, which was reverted for reasons I genuinely do not understand. More precisely, one needs to read 1200 words before getting to the basis definition, and 1900 words before getting to the tensor product definition.

I propose putting a concise version of both definitions as near to the top of the article as possible. This will serve the needs of 1), and not harm the needs of 2), since this group will in any case either be scrolling to a specific section or reading through the article in its entirety anyway.

Maybe the particular way I went about doing this was suboptimal, but if so I'd appreciate being told why. -MM — Preceding unsigned comment added by MeowMathematics (talkcontribs) 19:34, 29 September 2023 (UTC)

I thought that your edits were a decent (albeit imperfect) attempt at improving the article. Thanks for the effort.
The current Definition section is highly verbose. At least the first and third definitions are clearly marked. The second definition, in terms of multilinear maps, is vague. Really it exists to motivate the third definition, I guess.
I don't support pushing the definitions into the lede. So I support keeping the definitions where they are --- right after the lede. But maybe the verbose text there is straying into Wikipedia:Wikipedia is not a textbook territory. Maybe it should be more concise.
P.S. Please sign your talk page posts with four tildes to generate your signature: ~~~~. Mgnbar (talk) 19:51, 29 September 2023 (UTC)
Well, the first thing that I see is "There are two equivalent definitions of tensor"; there are more than two definitions, and the definition in terms of bases is rather old-fashioned. Surely multilinear maps and tensor products are the most important, at least in Mathematics.
I believe that the lead has material that belongs elsewhere in the article, e.g., in History. Also, a section on motivation before the definitions might help. -- Shmuel (Seymour J.) Metz Username:Chatul (talk) 20:46, 29 September 2023 (UTC)
Taking your comments into consideration, how about the following approximate sketch for how the Definition part of the article should be rewritten?
1) "A tensor T in a vector space (with a chosen basis) is represented by an array. A k-dimensional array is a collection of n x ... x n (k times) real numbers [example picture for k = 0, 1, 2, 3]. Let V be a vector space and T be a tensor. A tensor is defined as a collection of arrays transforming a certain way under change of basis. [give example of co/variant transformation when k = 1, then give full-on Einstein notation definition of co/variant (after introducing Einstein)] [give basis definition of tensor as in my edit]".

For the current article's paragraphs, I propose:

"A tensor may be repr..." - cut

"The total number of in..." - cut, unecessary

"Just as the compo..." - cut, this is confusing and I think explained better in my sketch above

"Here R ji are the ent..." - cut, as mentioned should move explanation for Einstein later

"This is called a co..." - cut, too long

"As a simple example,..." - move to examples section of (p,q) = (1,1)

"Similarly, a linear op..." - either cut or move to the (p,q) = (1,0)/(0,1) section.

"Here the primed indices denot..." - cut most of it, move list of names for (p,q) under the definition

"Definition" - keep

"The definition of a ten"- move to history

"An equivalent definition of a te" - cut

2) "[explanation of what \otimes^n V and V* is] [give the definition of tensor] [maybe say how \otimes relates to multilinear maps, then give that "other" definition of tensor]"

Finally,

"Tensors in infinite dimensions"- just remark in the definitions that V must be finite dimensional, then move this to another section

"Tensor fields" - keep, but add a coordinate-free brief definition. MeowMathematics (talk) 22:03, 29 September 2023 (UTC)

I have not fully absorbed what you are proposing here, but at a glance it looks like you are hostile to explanatory text, and that seems bad. I reverted your previous edit because it introduced repetition and increased the level of technicality, while losing the nuance necessary to deal with the situation that "tensor" does not have one fixed definition. Broadly, I am concerned that you are trying to write for people at your own level of mathematical sophistication, rather than writing WP:ONELEVELDOWN. (I am happy to defer if it is the consensus of other editors that this would be an improvement, though.) --JBL (talk) 18:17, 30 September 2023 (UTC)
Thanks for the comment. I think you might have misunderstood my intentions (to be clear, I don't intend for most of this rewrite to be anything like my previous edit, which I had made terse because it was surrounded by a lot of existing exposition). I definitely agree that this article should be understandable to someone who's not 100% comfortable with the definition of vector space. However, when both current-me and high-school-me tried to read this article, it's a real strain to understand what it's saying, and from comments I get the impression this is true for others too. I think it's really not too hard to move this article in the direction of being a) much shorter (so making the definition look less scary!), b) still understandable to beginners and c) including more easy-to-understand examples. I'll wait around for a bit to see what people think, but otherwise I might take a stab at writing this in a week or so. MeowMathematics (talk) 16:23, 2 October 2023 (UTC)
Please indent your replies with one more colon than what you are replying to or use the {{od}} template.
I believe that the Definitions section should give the modern, coordinate-free,[a] definitions before the 19th century definition. — Preceding unsigned comment added by Chatul (talkcontribs) 15:15, 1 October 2023 (UTC)
User:MeowMathematics, it seems that you want to cut huge swathes of this article. Maybe that will end up being a good idea, but it's hard to tell. Concretely, I suggest that you draft your new version at User:MeowMathematics/Tensor, so that we all have a better idea of what you propose. Mgnbar (talk) 19:58, 2 October 2023 (UTC)

Notes

  1. ^ The modern definitions use either
    • Tensor products
    • multilinear maps
    There may be others that I am not aware of.