Talk:Kernel (matrix)

untitled

Can somebody make a multi-dimensional example here? (more than one free variable...)

I am for the idea of merging this article (Null Space) with Kernel -- minghan 15:23, 26 January 2006 (UTC)[reply]

I would prefer to have kernel (mathematics) continue to focus on the more general case while this article addresses the restricted case of kernels of matrix operators in linear algebra. Deco 02:33, 27 January 2006 (UTC)[reply]
Changed my mind. This article presents itself as the same concept as kernels, so a merge is warranted. Deco 02:35, 27 January 2006 (UTC)[reply]
A slight philosophical difference is that a matrix is not a map, whereas a linear transformation is, even though the two are commonly identified. The null space of a matrix refers to what would be the kernel if we thought of it as a linear transformation. Whether the articles are written this way, I don't know, but I think that at least in theory, the two should be considered separate concepts, that you realize are the same after a little bit of thinking.

Hmmm. I still suggest that the merge discussion be centralized in kernel (mathematics). — Arthur Rubin | (talk) 02:57, 17 March 2006 (UTC)[reply]

Division by Zero

"Nullity" should no longer redirect here, since it appears that someone has managed to solve the "divide by zero problem"... and the solution is called Nullity. (http://www.bbc.co.uk/berkshire/content/articles/2006/12/06/divide_zero_feature.shtml) Medevilenemy 19:24, 6 December 2006 (UTC)[reply]

No offense, but that "solution" is total bullcrap. --Wooty Woot? contribs 09:54, 7 December 2006 (UTC)[reply]
It's not Wikipedia's job to decide whether a theory is correct or not. Leave that to discussion within the mathematics community and related journals. See the note at the bottom of WP:NOR. --Tjohns 12:03, 7 December 2006 (UTC)[reply]
I think discussion regarding this should take place over at Talk:Nullity. --Tjohns 12:00, 7 December 2006 (UTC)[reply]

Major revision

I just posted a major revision to this page, and I added the WikiProject Mathematics template above. Jim 03:06, 9 September 2007 (UTC)[reply]

That is not an "expansion" as your edit summary says, but a "limitation" to the special case of a finite dimensional linear operator. The definition is much wider than for matrices only. −Woodstone 06:05, 9 September 2007 (UTC)[reply]

I don't agree that the introduction to the article should focus on the general case like this. Here's why:

  1. The null space of a matrix is an important idea, and most of the article is devoted to it. Null spaces of matrices are central to elementary linear algebra, and I think they deserve their own article.
  2. Introducing null spaces in the context of general operators makes the article less accessible to a general audience.
  3. For general linear operators, the null space is more often referred to as the kernel.

In an attempt to resolve this disagreement, I've created an article called kernel (linear algebra) that discusses the general case, and added a disambig template to the top of this article. Jim 18:45, 9 September 2007 (UTC)[reply]

You are missing the point again. Linear algebra is not the general case. Also, non-linear operators can have a null space. Did you notice that there are already quite a few "kernel (something)" articles? Why add another one? In my opinion the titles should be reversed from your idea. Null space is generic. The "matrix" case is quite specific and should have an appropriate title if you insist on splitting it from the generic one. −Woodstone 07:09, 10 September 2007 (UTC)[reply]
I'm not sure I've ever heard anyone talk about the null space of a nonlinear operator. Most of the time people use the word "operator" to mean "linear operator", and even in that case the "null space" is most commonly called the "kernel". My experience has been that the word "null space" is used most of the time to talk about the null space of a matrix.
Google backs me up on this. There are 476,000 results for "null space", of which 347,000 involve the word "matrix", and 413,000 involve either the word "matrix" or the word "linear". The remaining hits don't particularly seem to be about the null space of a nonlinear operator.
The MathWorld article on null spaces restricts to "linear transformations", as does the Merriam-Webster definition. I looked at the Wikipedia articles that link to "Null space", and I didn't notice any that have to do with nonlinear operators.
I like the current organization of the articles on null space and kernel (linear algebra), but if you strongly object, then we could describe our disagreement on Wikipedia talk:WikiProject Mathematics and ask what the other members think. Jim 17:17, 10 September 2007 (UTC)[reply]

Since you've reverted the introduction again, I've asked for an outside opinion from Wikipedia talk:WikiProject Mathematics. (Those coming from the outside should take a look at the current version and previous version of the introduction.) Jim 01:18, 14 September 2007 (UTC)[reply]

I don't believe the set of solutions of f(x) = 0 for a general f is called a null space. It's only called a null space if f is linear. I think it's called a kernel if f preserves the relevant algebraic structure, and the expression for the general situation is zero set.
I'm not so sure about having separate articles on null space (of matrices) and kernel (linear algebra) (of linear operators). There is a big overlap and I'm not convinced that the article would be less accessible if we were to treat linear operators here, provided that we do introduce null spaces for matrices first and that we delineate the more abstract parts which rely on identifying matrices with linear operators. If we do decide to have separate articles, then I think they should be linked more than by just a note at the top. -- Jitse Niesen (talk) 02:24, 14 September 2007 (UTC)[reply]

The introduction in terms of a matrix looks better to me, more accessible. It's OK to have a separate matrix-oriented article and a separate operator-oriented article, with more of a functional analysis and algebra flavor (not lower-division college linear algebra). These really serve distinct audiences. I am not sure if the kernel vs. nullspace nomenclature has anything to do with this distinction - they are exactly the same to me - so perhaps some renaming is in order. And, yes, I agree with Jitse Niesen that both refer to linear operators only. By the way, computing the basis of the nullspace of a matrix by means of elementary transformations, as done in the article now and in undergraduate textbooks, is unfortunate and of (perhaps) didactic value only, because of numerical stability problems; practical computation should be done via QR decomposition or, better, SVD, as in the Matlab function null. Jmath666 04:34, 14 September 2007 (UTC)[reply]

I agree that there's a problem with presenting only the row reduction perspective, and I've been trying to figure out how to deal with this problem for many different linear algebra articles (cf. column space, row space, Euclidean subspace, and system of linear equations). On the one hand, row reduction is the standard algorithm given in most introductions to linear algebra; on the other hand, it apparently has serious numerical stability problems, making it relatively useless for practical applications. See Talk:System of linear equations for more discussion about this issue. Jim 20:45, 14 September 2007 (UTC)[reply]
I did not say there is a problem with row reduction as opposed to column reduction; there is a problem with the reduction approach in general. This results in unfortunate engineers who try to implement those naive methods out of the book in software. But the educational establishment deems it important, so students are subjected to it. Hence it has a place on Wikipedia, where those students may look. If I have time I may add a paragraph here about computing the nullspace in a numerically sound manner. Yes, the issues in solving systems are similar, but simpler. See also the paragraph on numerical computation in Moore–Penrose pseudoinverse. For example, Trefethen–Bau ISBN 978-0898713619 do not even consider Gaussian elimination (a.k.a. reduction) for anything and start with QR. Matlab switched from QR to SVD (more expensive but more reliable) for things like the nullspace many years ago; reduction methods have not been even a remote consideration for decades where reliability is important. Jmath666 04:48, 15 September 2007 (UTC)[reply]
Sorry, we seem to be miscommunicating. I was agreeing with you—by "row reduction" I mean Gaussian elimination, or any method that involves elementary row and/or column operations. I'm calling it "row reduction" just because it's usually done with rows in introductory courses. I know about the numerical problems with reduction, and I've been trying to figure out how this should be reflected in the articles on elementary linear algebra. See my last post. Jim 09:07, 15 September 2007 (UTC)[reply]
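
For readers following this thread, here is a minimal NumPy sketch of the SVD route described above. The helper name null_basis and the tolerance rule are my own choices, loosely in the spirit of MATLAB's null; this is not code from the article.

```python
import numpy as np

def null_basis(A, rtol=1e-12):
    """Orthonormal basis of null(A) from the SVD (hypothetical helper, not from the article)."""
    _, s, Vh = np.linalg.svd(A)
    tol = rtol * s[0] if s.size else 0.0      # relative cutoff on the singular values
    rank = int(np.sum(s > tol))
    return Vh[rank:, :].conj().T              # right singular vectors belonging to (near-)zero singular values

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])                  # rank 1, so the nullity is 2
N = null_basis(A)
print(N.shape[1], np.allclose(A @ N, 0))      # 2 True
```
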
I'm not sure we need a separate article kernel (linear algebra) next to kernel (algebra)#Linear operators, but if kept, there should be cross references.  --Lambiam 08:58, 14 September 2007 (UTC)[reply]
See also Wikipedia:Reference desk/Archives/Mathematics/2007 August 12#Null space and Kernel (algebra).  --Lambiam 09:05, 14 September 2007 (UTC)[reply]
Even if I understand that matrices are the more common use, the nullspace does not only concern finite-dimensional spaces, and not only real vector spaces (in particular, complex matrices and matrices over finite fields are also widely used). Hence the definition, at least, should not restrict the term to such a particular situation. Moreover, in the article, the term matrix is used in place of "real matrix", leading to strange assertions like "The null space of an m × n matrix is a subspace of R^n". Also, the introduction of the term "Euclidean space" is misleading: why should the natural metric be taken into account? This is particularly funny if A is used to define a quadratic form associated with a non-Euclidean distance. Should we consider that the isotropic subspace of a quadratic form is a Euclidean subspace??? pom 21:33, 18 September 2007 (UTC)[reply]
What's happening is that I'm trying to make this article as accessible as possible to non-mathematicians (e.g. second-year students in an introductory linear algebra course). The most basic case of a matrix is a real matrix, so this is what gets talked about in the article. Jmath666 has now implemented a partial solution to this problem (see below).
Also, "Euclidean space" is used here as a general synonym for Rn, primarily because Euclidean space is currently the main article on n-dimensional space. It's possible that "coordinate space" would be better terminology, but the article on coordinate space is currently written from a more general standpoint, and real coordinate space is a redirect to Euclidean space. (There's also a very odd article entitled n-dimensional space which should probably be merged with Euclidean space, but could alternatively be developed into a main article on n dimensions.) Jim 05:19, 19 September 2007 (UTC)[reply]

I agree with Taxipom. Thanks to the references to Euclidean space, it is proving very difficult indeed to work out how much of the article applies to, say, matrices containing complex values, or over finite fields. This is most certainly not "as accessible as possible", it is extremely confusing. 92.30.174.205 (talk) 23:09, 27 September 2011 (UTC)[reply]

Renaming proposed

In view of the discussion above I propose there should be

  • kernel (linear algebra), including elementary linear algebra and numerical linear algebra aspects (this article)
  • kernel (functional analysis), or operator theory (relating/including Fredholm theory etc)
  • kernel (algebra) (as is now, perhaps incorporating kernel (operator theory))

with proper redirects from the matching variants of nullspace, and a dab page. Each with a distinct audience and at a distinct level. For example, referring to the post by Taxipom, the words "natural metric" and all the considerations that follow will make no sense whatsoever to a typical 2nd-year US college student who is just taking a first course in linear algebra and wants to use Wikipedia to look something up, while it would be all fine and of interest to a grown mathematician or a student in a classical, pure-math-oriented program. Or... maybe I'll just be bold, do it, and see if anyone objects. Jmath666 04:12, 19 September 2007 (UTC)[reply]

Done. Kernel (mathematics) is the main article now and everything links to/from there. Hopefully it makes more sense now. This article, now called Kernel (matrix), is about matrices only; more abstract stuff belongs elsewhere. Jmath666 04:56, 19 September 2007 (UTC)[reply]

I really like this suggestion. However, I think that null space (matrix) would be a better title for this article than kernel (matrix). "Null space" is the terminology used in most 2nd-year linear algebra books, and "null space of a matrix" returns twenty times as many Google hits as "kernel of a matrix". Would you object if I moved the page? Jim 05:03, 19 September 2007 (UTC)[reply]
There are Null space (matrix) and also Nullspace (matrix), which are redirects to Kernel (matrix). If you want to switch it around, you will need the help of an admin to delete the redirect page first. Also, Kernel (matrix) is consistent with Kernel (mathematics). I do not think it is a big deal either way, so do what you think is right. Jmath666 07:37, 19 September 2007 (UTC)[reply]
(edit conflict)
There seem to have been some renames/redirects already, but I lost track. However, redirecting "null space" to "kernel (matrix)" is very wrong. The concept of null space is much wider than as only applied to matrices. I agree with redirecting all "null space" articles to corresponding "kernel" articles. But the bare "kernel" article should have a fully generic definition and can then descend into or refer to specific cases, e.g. for matrices. Especially so, because the general definition is in no way more complex than the specialised one. −Woodstone 05:05, 19 September 2007 (UTC)[reply]
Yes, there is already a bare "kernel" article, namely the main article Kernel (mathematics), and it does exactly as you say. Perhaps Nullspace should redirect there, so I have done that now too. Thanks for noticing. Feel free to change whatever you like. Jmath666 07:37, 19 September 2007 (UTC)[reply]

This is a mistake. A linear transformation has a kernel. A matrix is not a linear transformation, so it does not have a kernel. There is a lot of slop in the literature about this, but if you go back to basics you will find this to be true. In particular, a matrix may represent a linear transformation in more than one way, so it is intrinsically nonsensical to refer to its "kernel". There is a perfectly good term for the null space, namely "null space". This page should at least have a correct definition and only then cite Kernel (linear algebra). Zaslav (talk) 07:53, 23 February 2020 (UTC)[reply]

Proof that A Q2 = 0

I am sure this is very obvious to most of you but I was curious to see at the end of the article a method for computing the null space using QR factorization. I am not a mathematician myself but was wondering what is the proof that A Q2 = 0? It may be staring me in the face but I don't see it at the moment, I am sure it is correct as I tried the method using SciLab and it worked very nicely. --Rhoddydog (talk) 17:35, 14 August 2008 (UTC)[reply]

PS I just realized why A Q2 = 0, for those who might be interested: multiply both sides of A^T P = [Q1 Q2] [R 0]^T by [Q1 Q2]^T (since Q is orthogonal) and multiply out the resulting expression; one of the results is A Q2 = 0. --Rhoddydog (talk) 10:05, 14 August 2008 (UTC)[reply]

" multiply both sides of " --> "left multiply both sides of ..." --Mangledorf (talk) 15:29, 20 August 2012 (UTC)[reply]

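For later readers, here is a compact version of that argument in the notation used above. I am assuming P is the permutation from column pivoting in the QR factorization of A^T; with no pivoting the same steps apply with P = I.

```latex
A^{\mathsf T} P
  = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix}
    \begin{bmatrix} R \\ 0 \end{bmatrix}
\;\Longrightarrow\;
\begin{bmatrix} Q_1^{\mathsf T} \\ Q_2^{\mathsf T} \end{bmatrix} A^{\mathsf T} P
  = \begin{bmatrix} R \\ 0 \end{bmatrix}
\;\Longrightarrow\;
Q_2^{\mathsf T} A^{\mathsf T} P = 0
\;\Longrightarrow\;
P^{\mathsf T} A Q_2 = 0
\;\Longrightarrow\;
A Q_2 = 0 .
```

The first implication is the left multiplication by [Q1 Q2]^T together with Q^T Q = I, and the last one uses that P is an invertible permutation; hence the columns of Q2 lie in the null space of A.
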
Split Left Nullspace

Can I remove the paragraph from this page and put it on a separate "left nullspace" page? It's nice when you search for something if you just get the information you want instead of a whole bunch of other stuff. daviddoria (talk) 15:46, 19 September 2008 (UTC)[reply]

Relation to eigenvalues

Hi there,

I thought it would be nice if there were a section explaining the relation between the nullspace and the eigenvalues/eigenvectors of the matrix.

As far as I know, the dimension of the nullspace (the nullity?) of a matrix is equal to the number of linearly independent eigenvectors with eigenvalue 0. And the nullspace is spanned by these vectors, too. Drugbird (talk) 07:56, 27 April 2010 (UTC)[reply]
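
For what it's worth, the connection can be stated in one line (for a square matrix A):

```latex
A v = 0 \quad\Longleftrightarrow\quad A v = 0 \cdot v ,
```

so the null space of A is precisely the eigenspace belonging to the eigenvalue 0, and the nullity equals the geometric multiplicity of 0 as an eigenvalue; in particular, A is singular if and only if 0 is an eigenvalue.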

Using RRE to calculate Nullspace

The current algorithm given using RRE to find a null space has a step "Interpreting the reduced row echelon form as a homogeneous linear system" and then "determine which of the variables are free". That's a bit of a stretch for someone who is just learning how to calculate a null space. I'd suggest a more concrete algorithm for it, like the following:

  1. If the input matrix is wider than it is tall, pad zero rows to make it square.
  2. Put the matrix in RRE form keeping the pivots along the diagonal.
  3. If the matrix is taller than it is wide, crop the lower rows to make it square (they will all be zero rows if the RRE was done correctly).
  4. Subtract the identity matrix.

That produces a null space basis from the nonzero columns of the result. It is simple, uses RRE, and doesn't require the "now write out tons of expressions and modify them" step. Antares5245 (talk) 06:37, 28 August 2010 (UTC)[reply]
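
For concreteness, here is a small SymPy sketch of the recipe above. The function name rre_null_basis and the use of SymPy's rref are my own choices, not from the article; steps 1 and 3 are handled implicitly by building an n × n matrix with each pivot row placed so its pivot sits on the diagonal.

```python
import sympy as sp

def rre_null_basis(A):
    """Null space basis via the pad / RREF-with-diagonal-pivots / subtract-identity recipe."""
    A = sp.Matrix(A)
    _, n = A.shape
    R, pivots = A.rref()                  # reduced row echelon form and its pivot columns
    S = sp.zeros(n, n)                    # n x n: pivot row k goes to row pivots[k]
    for row, col in enumerate(pivots):
        S[col, :] = R[row, :]
    N = S - sp.eye(n)                     # step 4: subtract the identity
    return [N[:, j] for j in range(n) if any(N[:, j])]   # nonzero columns span null(A), up to sign

A = [[1, 2, 3],
     [2, 4, 6]]                           # rank 1, so two basis vectors are expected
for v in rre_null_basis(A):
    print(v.T, sp.Matrix(A) * v)          # the second factor is the zero vector
```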

LQ Factorization

It seems that using the LQ factorization would be faster at finding the null space than the QR factorization, since we can apply it directly to the matrix $A$ instead of to $A^T$. LAPACK has routines for computing the LQ factorization as well.Vinzklorthos (talk) 15:13, 6 February 2012 (UTC)[reply]

Also, I have heard that QR is faster than SVD (presumably, LQ would be faster as well). The reason to use SVD relates to the possibility that the matrix does not have full row rank, i.e., at least one row is a linear combination of other rows; if the matrix is a constraint matrix in an optimization problem, then the matrix can be reduced and still yield an equivalent problem. SVD would catch this by having more zero singular values, but QR or LQ would only indicate this by having a degenerate R or L matrix, and this would have to be checked. Using SVD allows the user to not worry about this. In conclusion, as I understand it, if the user knows the matrix has full row rank, then it is faster to use an LQ factorization of $A$ (or a QR factorization of $A^T$).Vinzklorthos (talk) 15:13, 6 February 2012 (UTC)[reply]
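
To make the comparison concrete, here is a NumPy sketch of the QR-of-$A^T$ route for the full-row-rank case (the function name is mine, not from the article; NumPy exposes QR but not LQ directly, and an LQ factorization of $A$ is just the transpose of a QR factorization of $A^T$, which is why the two are interchangeable here):

```python
import numpy as np

def null_basis_qr(A):
    """Assumes A (m x n, m <= n) has full row rank.
    With A^T = [Q1 Q2] [R; 0], the n - m columns of Q2 span null(A)."""
    m, n = A.shape
    Q, _ = np.linalg.qr(A.T, mode='complete')   # full n x n orthogonal factor
    return Q[:, m:]                             # trailing columns are orthogonal to the row space

A = np.array([[1., 0., 2.],
              [0., 1., 3.]])                    # full row rank, nullity 1
N = null_basis_qr(A)
print(np.allclose(A @ N, 0))                    # True
```

If full row rank cannot be assumed, the SVD-based variant sketched earlier in this discussion is the safer choice, for the reasons given above.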

Numerical computation of null space

I have completely rewritten this section and renamed it "Computation of the null space in a computer". In fact, the previous version did not respect WP:NPOV by supposing that every numerical computation is a floating-point computation, which is a blatant mistake. On the other hand, I have removed the large part of the section devoted to specific numerical algorithms, because it is outside the scope of this article. In fact, null space computation is a special instance of solving a homogeneous linear system. Deciding which algorithm is the best in each case is a highly technical matter, whose answer depends on various parameters of the input matrix (its structure) as well as on the architecture (cache(s), vectorization, number of cores, size of the memory, ...) of the computer. Implementing a high-performance linear algebra package is still a research project for a whole team! Thus the questions that were addressed in the part I removed do not belong here, but in an article on high-performance linear algebra, yet to be written. Also, the removed part presented a single algorithm as the state of the art. This is WP:OR. --D.Lazard (talk) 16:42, 19 October 2012 (UTC)[reply]

Assessment comment

The comment(s) below were originally left at Talk:Kernel (matrix)/Comments, and are posted here for posterity. Following several discussions in past years, these subpages are now deprecated. The comments may be irrelevant or outdated; if so, please feel free to remove this section.

Tried to make this better. Probably needs more references. Jim 03:09, 9 September 2007 (UTC)[reply]

Last edited at 01:43, 1 January 2012 (UTC). Substituted at 02:16, 5 May 2016 (UTC)