Talk:Invertible matrix/Archive 1
This is an archive of past discussions about Invertible matrix. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
What?
Are large parts of this article copied and pasted from answers.com, or did answers.com take large parts of this article? Can someone explain the statement: "As a rule of thumb, almost all matrices are invertible. Over the field of real numbers, this can be made precise as follows: the set of singular n-by-n matrices, considered as a subset of R^(n×n), is a null set, i.e., has Lebesgue measure zero. Intuitively, this means that if you pick a random square matrix over the reals, the probability that it will be singular is zero."
Does this mean the matrix
(0 2)
(0 0)
is invertible, even though it is not row reducible to the identity? Somehow I don't think so, but for a person not steeped in mathematical know-how, it is misleading and suggests that is indeed the case. Basically, this article messed me over because I used it as a study help. Someone who knows what they're talking about needs to re-write it, or else sue answers.com.
18.251.6.142 08:41, 10 March 2006 (UTC)
- Answers.com copied this article under the GFDL, so all is fine. :)
- That matrix is not invertible, and it is not row reducible to the identity either, so I don't see a problem. That text just says that a given matrix is more likely to be invertible than not; it does not mean that all matrices are invertible.
- I suggest you go over the things which you don't know and read only the parts you understand. The article is written so that it gives some information both to people who know nothing about this stuff and to people who know a lot; it is not tailored specifically to you. If overall this article manages to answer some of your questions, I guess you should be happy with that. Oleg Alexandrov (talk) 17:41, 10 March 2006 (UTC)
- Thank you Oleg. I am sorry if I seemed a bit upset, but other parts of this site have been very helpful in my coursework and this particular statement seemed a bit misleading and cost me a lot of time. 18.251.6.142 17:59, 10 March 2006 (UTC)
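As a side note for later readers, both points above can be checked numerically. The following is a minimal sketch (assuming NumPy; the 10,000-sample size is arbitrary): the matrix from the question has determinant zero, so it is singular, while randomly drawn matrices essentially never are.

```python
import numpy as np

# The matrix from the question: its determinant is 0, so it is singular (not invertible).
M = np.array([[0.0, 2.0],
              [0.0, 0.0]])
print(np.linalg.det(M))             # 0.0 (exactly singular)

# "Almost all matrices are invertible": random matrices are essentially never singular.
rng = np.random.default_rng(0)
dets = [np.linalg.det(rng.normal(size=(4, 4))) for _ in range(10_000)]
print(sum(d == 0.0 for d in dets))  # 0 -- none of the sampled matrices had determinant exactly 0
```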
Invertible != regular ?
The article currently equates invertibility and regularity with the statement beginning "In linear algebra, an n-by-n (square) matrix A is called invertible, non-singular, or regular if..." in the first line. However, this interpretation of "regular" conflicts with the definition "A stochastic matrix P is regular if some matrix power P^k contains only strictly positive entries" given on the Stochastic matrix page. I've not seen the term "regular matrix" before, but it seems that that page uses "regular" where I would use "ergodic", while this page uses it as a synonym for "invertible".
As far as I can see, one of the definitions must be wrong, as each can trivially be shown to exclude the other. In the unlikely event that both meanings are in common use, then it is an error on the Stochastic matrix page that "regular" is linked to this page. 152.78.191.84 12:08, 13 March 2006 (UTC)
- I'm struggling with this seeming contradiction as well. I can't find any references to regular matrices in my (somewhat limited) library, but I found a few conflicting references online. Wolfram's site (http://mathworld.wolfram.com/RegularMatrix.html) just redirects me to the entry for "Nonsingular Matrix", which gives credence to the equivalence of invertibility and regularity. I've found a few other definitions as well which don't seem quite as reputable. A book transcript from Springer-Verlag (http://www.vias.org/tmdatanaleng/hl_regularmatrix.html) gives the definition found on the Stochastic matrix page. Finally, Thinkquest (http://library.thinkquest.org/28509/English/Transformation.htm) gives the definition of a regular matrix as a matrix whose inverse is itself (i.e. A^(-1) = A). This definition seems pretty off to me; wouldn't that imply that the only "regular" matrix is the identity or a permutation matrix?
- I looked around for references on the Stochastic Matrix Theorem cited on the Stochastic matrix page but was unsuccessful as the page gives no references. Perhaps someone with more knowledge could point to a source for this theorem, which likely would clarify the confusion? Mateoee 17:54, 17 November 2006 (UTC)
- I never heard of invertible matrices being called regular. I will remove that from the article. Oleg Alexandrov (talk) 02:45, 18 November 2006 (UTC)
Rephrase a sentence?
I find the sentence "The equation Ax = 0 has infinitely many the trivial solutions x = 0 (i.e. Null A = 0)." very confusing. Why not rephrase it to "The equation Ax = 0 has only the trivial solution x = 0."? If the former sentence is more correct I apologize for my lack of knowledge in this subject. :/ Karih 18:18, 3 December 2006 (UTC)
- You're absolutely right, it is very confusing to put it mildly. I fixed it; thanks for bringing this to our attention. -- Jitse Niesen (talk) 02:10, 4 December 2006 (UTC)
Inversion of 4 x 4 matrices
please!
- Although it is possible to derive equations for the inversion of 3x3 and 4x4 matrices like the one for the 2x2 matrix, they will be huge (and therefore not really suitable for the article). The generalised analytic form is already given (i.e. in terms of determinant and co-factors), and furthermore, inversion may be achieved more practically using an algorithm such as Gaussian elimination. Oli Filth 09:01, 19 January 2007 (UTC)
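To make that concrete, here is a small sketch (assuming NumPy; the 4x4 matrix below is made up) comparing the generalised cofactor/adjugate form with a library routine that uses an elimination-based factorisation internally:

```python
import numpy as np

def adjugate_inverse(A):
    """Inverse via the cofactor/adjugate formula; fine for 3x3 or 4x4, impractical for large matrices."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    cof = np.empty_like(A)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return cof.T / np.linalg.det(A)          # adj(A) is the transposed cofactor matrix

A = np.array([[4., 7., 2., 0.],
              [3., 6., 1., 5.],
              [2., 5., 3., 8.],
              [1., 0., 4., 6.]])
print(np.allclose(adjugate_inverse(A), np.linalg.inv(A)))   # True (np.linalg.inv uses an LU factorisation)
```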
Account for other systems than R
in "Inversion of 2 x 2 matrices", 1/(ad-bc) is used. Should it be (ad-bc)^-1 to account for other systems such as rings ? I'm in no way a mathematician so i ask someone more knowledgeable to consider. Dubonbacon 18:17, 23 February 2007 (UTC)
- I'd guess that people sufficiently advanced to know about such stuff will readily convert between these notations. I think that one would probably need to assume that the entries come from a field instead of a ring. But I'm in no way a pure mathematician so all my matrices have numbers in them. -- Jitse Niesen (talk) 12:34, 26 February 2007 (UTC)
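For what it is worth, here is a minimal sketch of what the (ad-bc)^-1 form buys you (my own example, not taken from the article): the same 2x2 formula applied over the integers modulo 7, where there is no literal division but the determinant has a multiplicative inverse whenever it is a unit.

```python
# The 2x2 inverse written with (ad - bc)^-1 rather than division,
# over the ring of integers modulo m (here m = 7); it works whenever ad - bc is a unit mod m.
def inverse_2x2_mod(a, b, c, d, m=7):
    det = (a * d - b * c) % m
    det_inv = pow(det, -1, m)                # multiplicative inverse of the determinant mod m
    return [[( d * det_inv) % m, (-b * det_inv) % m],
            [(-c * det_inv) % m, ( a * det_inv) % m]]

print(inverse_2x2_mod(2, 3, 1, 4))           # [[5, 5], [4, 6]]  (det = 5 and 5^-1 = 3 mod 7)
```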
Statement equivalence
The formulae given for inverting 2x2 and 3x3 matrices are not valid for matrices with non-commutative elements and maybe this should be made clear. On the other hand, there are many articles on matrices and to qualify every statement with words like 'where the matrix elements are real, complex, or multiply commutatively' would make the pages cumbersome for most readers. Perhaps a footnote?
I removed these two lines because neither of them on its own is equivalent to the invertibility of a matrix; only the two together are, and that is already given in the next line (..exactly one solution..).
- The equation Ax = b has at most one solution for each b in K^n.
- The equation Ax = b has at least one solution for each b in K^n.
I also removed
- The linear transformation x |-> Ax from K^n to K^n is one-to-one.
- The linear transformation x |-> Ax from K^n to K^n is onto.
because these are not equivalent to each other, and also not equivalent to the other statements.
- What? I can prove that the transformation is both onto and one-to-one for square matrices. These two statements are, in fact, equivalent iff A is a square matrix. The onto statement is equivalent to saying that Span{Col A} = K^n, whereas the one-to-one statement implies that Nul A = {0}, both of which are given in the parts that are listed. If you want the full formal proof, you'll have to wait until I get my Linear Algebra textbook out. IMacWin95 01:36, 28 April 2007 (UTC)
- Agreed. Oli Filth 11:56, 28 April 2007 (UTC)
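For reference, the usual one-line version of that argument goes through rank-nullity: for an n-by-n matrix A over a field K,
$$\dim \operatorname{Nul} A + \operatorname{rank} A = n,$$
so Nul A = {0} holds exactly when rank A = n, i.e. exactly when Col A = K^n; hence x |-> Ax is one-to-one if and only if it is onto. (For non-square matrices the two properties can indeed come apart.)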
Gaussian elimination example
I've reverted the addition of an example of Gaussian elimination, because that is already covered in the Gaussian elimination article. That article would be the appropriate place to add an example. This article is concerned with the mathematics of inverse matrices, not the numerical intricacies of how to obtain them. (Otherwise, for parity, we'd need step-by-step numerical examples of Newton's method and LU decomposition as well, which would hideously bloat the article.)
Adding an example that has no explanation of the steps involved is certainly not helpful! Oli Filth 23:44, 3 May 2007 (UTC)
Note
I think making a statement like "As a rule of thumb, almost all matrices are invertible" is vague and not accurate. There may be more invertible matrices than not, but a statement like that will certainly confuse many readers, especially those that are new to the subject. 69.107.60.124 18:31, 28 January 2007 (UTC)
- I agree. I will remove that. Oleg Alexandrov (talk) 23:15, 28 January 2007 (UTC)
- Actually, after reading the text, I disagree. That statement is definitely not precise, but it is made precise in the next sentence, and that rather vague statement is used to motivate the numerical issues below.
- I don't much like the current intro, but it has its good points and I can't think of anything better to replace it with. Oleg Alexandrov (talk) 23:19, 28 January 2007 (UTC)
- I think that "As a rule of thumb" is misleading. (At least one of my students was slightly confused by it.) I deleted that and rephrased/reordered parts of the paragraph. Hopefully, the new version is less confusing. Fgdorais 21:02, 17 September 2007 (UTC)
- I would like to add that "almost all square matrices are invertible" can also be interpreted in the sense of category, i.e., the set of invertible matrices is open dense. This is more intimately connected with perturbation of coefficients and numerical considerations. Perhaps this material should be removed from the introduction and have its own paragraph where these two (and perhaps other) interpretations can be discussed. I may decide to write such a paragraph when I'm less busy, but I would be very happy if someone else were to volunteer. Fgdorais 14:55, 23 September 2007 (UTC)
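One way the open-dense reading can be made precise (a sketch of the standard argument, not tied to any particular source): the set of invertible matrices
$$GL_n(\mathbb{R}) = \det\nolimits^{-1}\bigl(\mathbb{R} \setminus \{0\}\bigr)$$
is open because the determinant is a polynomial in the entries, hence continuous; and it is dense because for any square matrix A the polynomial t |-> det(A + tI) is not identically zero, so A + tI is invertible for all sufficiently small t ≠ 0.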
Invertible matrices are common
After the discussion at #Note above, Silly Rabbit moved the paragraph about almost all matrices being invertible to a separate section. Recently, an IP editor added the sentence "Informally speaking, invertible matrices are common whereas singular matrices are rare (the precise meaning of this statement is given below)." I think that even with all the qualifiers, this sentence is misleading. For instance, singular matrices are not rare in exams. I replaced it with the statement that random matrices are singular with probability zero, which is a formulation that I hope most people can understand. -- Jitse Niesen (talk) 12:46, 18 December 2008 (UTC)
- I rephrased it using the words almost surely, which are both formally defined and informally understandable. If anyone thinks that's not OK, feel free to revert or change it. Oliphaunt (talk) 22:59, 18 December 2008 (UTC)
- I understand the purpose behind your edit - "rare" and "common" are general ideas and do not have an exact mathematical meaning in this context. However, there needs to be a balance between being precise and conveying the idea to a layman. The point of deferring the exact meaning of the statement to later sections is so that we can be more colloquial in the introduction. In fact, there is more than one sense in which singular matrices are rare (e.g. singular matrices are nowhere dense), and as it currently stands the introduction simply picks one precise explanation and ignores the other. For these reasons we should leave the technical details about Lebesgue measure, almost surely picking random matrices, density, and so forth until later, and keep the tone of the introduction informal. 67.9.148.47 (talk) 10:33, 20 December 2008 (UTC)
- No, my problem is the probable interpretation of laymen when reading that sentence is wrong. Most laymen (or, at least, a lot of them) will interpret it as saying that you will rarely encounter a singular matrix in practice, and I believe that in fact quite a lot of the matrices that are encountered in practice are singular. Oliphaunt, I like the "almost surely". -- Jitse Niesen (talk) 16:34, 20 December 2008 (UTC)
The basics?
I actually came here to check my memory that A · A^-1 = I. After checking the whole page and the page for Matrix, I gave up looking and started working through the example there. They give an A and an A^-1, and after the first row it's clear that it is true.
But it's not in the wiki entry, so I checked the German page, and sure enough there it is. Sure, it's not for all matrices, but it is the basic idea, isn't it? —Preceding unsigned comment added by 137.248.1.11 (talk) 10:01, 3 November 2009 (UTC)
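For reference, the defining property being asked about, for an n-by-n matrix A with inverse A^-1, is
$$A A^{-1} = A^{-1} A = I_n,$$
where I_n is the n-by-n identity matrix.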
Matrix inverses in real-time simulations ?
I'm not against the paragraph, but the sentence "Compared to matrix multiplication or creation of rotation matrices, matrix inversion is several orders of magnitude slower" looks quite unfounded. As far as I can tell, you need 8 muls & 4 adds for a matrix multiplication in the 2x2 case, compared to 1 reciprocal, 6 muls & 1 add for an inversion. For 3x3, it's 27 muls & 18 adds compared to 1 reciprocal, 30 muls & 11 adds. Doesn't look like "several orders of magnitude" (which I interpret as a factor of at least 50, but more like >100) to me. For 4x4 matrices, it's 64 muls & 48 adds versus 1 reciprocal, 164 muls & 83 adds. If every operation counts as 1 FLOP, that's 112 FLOPs versus 247 FLOPs - still only a factor of less than 3.
Could the original author please explain what he/she meant? Catskineater (talk) 22:40, 21 February 2010 (UTC)
- Maybe the author is referring to how the operations for a matrix product can be done in parallel, more so than the inverse operations. Either way it needs a citation needed tag, which I'll add. Antares5245 (talk) 22:21, 15 June 2010 (UTC)
- I removed the sentence (and the sentence before it) because as far as I know there is no performance problem if you use the equation for the 3x3 inverse. One might point out that computing the inverse of rigid body transformations in homogeneous coordinates can be implemented by a transposition of a 3x3 (rotation) matrix and a negation of a (translation-)vector, which is significantly faster than a general 4x4 inverse (maybe even orders of magnitude), but it is trivial that you avoid more expensive computations if there is a cheaper alternative. --Martin Kraus (talk) 13:56, 20 July 2010 (UTC)
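To illustrate the rigid-body case mentioned above, here is a minimal sketch (assuming NumPy; the rotation angle and translation are made up): for a homogeneous transform built from a rotation R and translation t, the inverse is obtained by transposing R and rotating-and-negating t, with no general 4x4 inversion needed.

```python
import numpy as np

def rigid_inverse(T):
    """Inverse of a 4x4 homogeneous rigid-body transform [[R, t], [0, 1]]: returns [[R^T, -R^T t], [0, 1]]."""
    R, t = T[:3, :3], T[:3, 3]
    Tinv = np.eye(4)
    Tinv[:3, :3] = R.T
    Tinv[:3, 3] = -R.T @ t
    return Tinv

# Example: a 30-degree rotation about the z-axis plus a translation (made-up values).
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
T = np.array([[c, -s, 0, 1.0],
              [s,  c, 0, 2.0],
              [0,  0, 1, 3.0],
              [0,  0, 0, 1.0]])
print(np.allclose(rigid_inverse(T), np.linalg.inv(T)))   # True
```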
Etymology of "Singularity"
Could someone please point out what it is that is "single" in "singular matrix"? Thanks Gwideman (talk) 00:24, 20 September 2010 (UTC)
- Probably nothing very close. It's the same "singular" as in "singularity", which is more like "exceptional". 94.255.156.147 (talk) 21:31, 26 January 2011 (UTC)
Analytic solution?
For some reason, the subsection about Cramer's rule (and special cases thereof) is called "Analytic solution", which feels very confusing as this method has nothing whatsoever to do with analysis. I suppose "closed form formulas" might be what the editor wanted to express, but simply Cramer's rule is probably most in line with its sibling subsection headings. 94.255.156.147 (talk) 21:27, 26 January 2011 (UTC)
- As explained in analytical expression, the terms closed-form expression and analytical expression are almost equivalent. Here, "analytic solution" might be preferable to emphasize the difference to a "numerical solution". Cramer's might be too specialized since there probably are other forms of analytic solutions that are equivalent to Cramer's rule but not covered by it. --Martin Kraus (talk) 17:35, 24 August 2011 (UTC)
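For reference, the closed-form (adjugate/Cramer-type) expression being discussed is
$$A^{-1} = \frac{1}{\det A}\,\operatorname{adj}(A), \qquad \bigl(A^{-1}\bigr)_{ij} = \frac{(-1)^{i+j} M_{ji}}{\det A},$$
where M_ji denotes the (j, i) minor of A; it is valid whenever det A ≠ 0 (or, over a commutative ring, whenever det A is a unit).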
"regular" revisited
As I found in a former discussion (2006), the term "regular" was removed from the introduction because it does not seem to be equivalent to "invertible". But "Regular matrix", as it is referred to in "Irregular matrix", for example, still redirects to this article. First, I was quite confused that the opposite of a matrix with "a different number of elements in each row" should be an invertible matrix. Second, I was even more confused when I could not find the term "regular" anywhere in the whole article to which I had been redirected from "Regular matrix"...
--chiccodoro —Preceding unsigned comment added by 131.152.34.104 (talk) 09:29, 26 May 2008 (UTC)
- I'm not sure what the best thing to do is here. Firstly, the term "regular matrix" is used in the meaning of "invertible matrix", so I initially added this to the article. However, reading the 2006 discussion made me realize that this usage is very rare and that it would be misleading to add it as a synonym in the first sentence, so I reverted myself. Secondly, the article stochastic matrix no longer mentions the meaning of "regular matrix" that the 2006 discussion refers to. The best solution I could think of is to turn regular matrix into some kind of disambiguation page. I think that should help against any confusion. -- Jitse Niesen (talk) 12:38, 18 December 2008 (UTC)
- The disambiguation page is already done. Should this comment be removed from here?-- Arauzo (talk) 11:57, 1 November 2011 (UTC)
- As I understand it, talk pages are never pruned down. Austinmohr (talk) 19:13, 1 November 2011 (UTC)
'Integral representation' method
With regards to this addition, after some discussion on my talk page it's clear it has a number of problems.
Mostly, it's not in the source, or anything like it. The source does discuss an integral involving a matrix, but not any of the formulas here, and the matrix in the source is not a general square matrix but a symmetric one. The source contains nothing identified as a method for inverting matrices, which is the point of that section of the article. Further, the notation is very unclear and non-standard: it looks like the integral of a scalar (Ax being a vector), so it cannot be matrix-valued. With so many issues it definitely needs a source giving it as a method for inverting general matrices; otherwise it is original research.--JohnBlackburnewordsdeeds 08:47, 5 January 2012 (UTC)
- OK, I will try again to explain. The integral consists of a product of two factors: 1) an exponential of a scalar value, exp(-0.5*(Ax)^2) = exp(-0.5*x'A'Ax) (I use x' and A' to denote the transposes of x and A); I hope you see that x'A'Ax, which is (Ax)'(Ax), is a scalar; and 2) a rank-1 matrix x(Ax)'. Thus the result of the integral is clearly matrix-valued. The only other fact needed to see it is that the expectation value of xx' for a multivariate normal distribution with mean 0 is its covariance matrix (this is line one of the contribution). OK man, this was my last try; if you do not understand now, then I am just sorry.. but it is not a good reason to remove it. Thank you. — Preceding unsigned comment added by 131.246.191.181 (talk) 08:18, 6 January 2012 (UTC)
- Please sign your talk page messages with four tildes (~~~~). Thanks.
- The idea is not that you explain, but that you provide a source. Please do read the policies as explained in wp:NOR, wp:V and wp:BURDEN. That is the way things go here. We don't have to understand anything — you have to provide a source, so we can directly verify it, and then, per wp:consensus, talk about whether it is sufficiently wp:notable to be included here. It's all just basic Wikipedia policy. - DVdm (talk) 08:56, 6 January 2012 (UTC)
- Yes, OK, unfortunately I am not able to name a source.. I thought it was elementary enough, but this is of course relative.. bye — Preceding unsigned comment added by 131.246.191.182 (talk) 11:02, 6 January 2012 (UTC)
- Looks like a classic case of wp:OR. There are way too many of those here, especially in the math articles, where we always have to fight that "But, it's trivial. Every amateur mathematician can verify it." I wouldn't even look at these things, let alone verify or even discuss them: it's up to them to get a source, not up to us to beg for one. - DVdm (talk) 21:43, 5 January 2012 (UTC)
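For the record, the identity described above can at least be checked numerically. A minimal Monte Carlo sketch follows (assuming NumPy, with a real A drawn at random so that A'A is positive definite; this is only a consistency check, not a practical inversion method, since the sampler itself already uses an inverse).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))           # a generic real matrix; almost surely invertible
Sigma = np.linalg.inv(A.T @ A)        # covariance of the Gaussian with density proportional to exp(-x'A'Ax / 2)

# Average of the rank-1 matrices x (Ax)' over samples x ~ N(0, Sigma); should approximate A^-1.
x = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)       # shape (N, 3)
estimate = (x[:, :, None] * (x @ A.T)[:, None, :]).mean(axis=0)

print(np.round(estimate, 2))
print(np.round(np.linalg.inv(A), 2))  # should agree up to Monte Carlo error
```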
blockwise inversion
Currently, 2-by-2 blockwise inversion is provided; can someone write down the 3-by-3 blockwise inversion? Jackzhp (talk) 23:39, 18 April 2010 (UTC)
- I'm not sure there actually is a "3x3" block inverse. You can come close by partitioning one of the 2x2 blocks further into another 2x2 block. What's more interesting is doing the block inverse by calculating the inverse of B or C, instead of A or D. This is possible because the inverse of a horizontal reversal of matrices is equal to the vertical reversal of the inverse, and it can be useful when A or D is singular. Probably not interesting enough to add to the article though. Antares5245 (talk) 22:25, 15 June 2010 (UTC)
- As another note on this section of the article, I think there needs to be some clarification of what is meant when it is said that the matrix is invertible if and only if A and D - CA^{-1}B are invertible. The decomposition into the submatrices is arbitrary. It's easy to come up with an invertible matrix for which no upper left square submatrix is invertible (think of the identity with the first and last columns switched - any block decomposition gives a singular "A" matrix). Clearly the statement cannot be true then; negate it: the matrix is singular if and only if either A or D - CA^{-1}B is singular. But my example contradicts that. Is there some extra condition that I am missing and which is implied? 18.63.6.219 (talk) 19:20, 22 August 2011 (UTC)
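As a concrete check of the stated condition in the case it does cover (A invertible and the Schur complement S = D - CA^{-1}B invertible), here is a small sketch with random blocks (assuming NumPy; the block sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
A, B = rng.normal(size=(2, 2)), rng.normal(size=(2, 3))
C, D = rng.normal(size=(3, 2)), rng.normal(size=(3, 3))
M = np.block([[A, B], [C, D]])

Ainv = np.linalg.inv(A)
S = D - C @ Ainv @ B                        # Schur complement of A in M
Sinv = np.linalg.inv(S)
Minv = np.block([[Ainv + Ainv @ B @ Sinv @ C @ Ainv, -Ainv @ B @ Sinv],
                 [-Sinv @ C @ Ainv,                   Sinv]])

print(np.allclose(Minv, np.linalg.inv(M)))  # True whenever A and S are both invertible
```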
- In its current form, the formula given for the block elements of the inverse does not display the symmetry with respect to permutation of the diagonal blocks or of the off-diagonal blocks which the problem should have. So far as I can tell, this may not even be correct. A correct example can be found here:
- http://www.cs.nthu.edu.tw/~jang/book/addenda/matinv/matinv/
- Unless there is a good reason for this form I may change it in a few days. — Preceding unsigned comment added by 130.207.66.72 (talk) 14:58, 29 August 2012 (UTC)