Talk:QR algorithm
If I remember correctly, in general the complexity is 2/3 n^3 + O(n^2).
"In this crude form the iterations are relatively expensive. This can be mitigated by first bringing the matrix A to upper Hessenberg form with a finite sequence of orthogonal similarity transforms, much like a QR decomposition. For symmetric matrices this replaces a O(n^3) procedure by one that is O(n^2)."
1) You might want to say somewhere that A0:=A :)
2) So, if A is symmetric => procedure costs O(n^2)? Is that also the case for Hermitian matrices?
"If the original matrix is symmetric, then the Hessenberg matrix is also symmetric and thus tridiagonal, and so are all the Ak. This decreases the computational cost of the algorithm because now each QR decomposition is only O(n), not O(n^2)."
Is this not the same assumption as above, i.e. A is symmetric? So, when is the cost O(n) and when O(n^2)? 131.155.54.18 13:58, 14 November 2005 (UTC), Wilbert Dijkhof
- Hi Wilbert. After consulting Golub & Van Loan, I think the costs are as follows. A single QR decomposition costs 4/3 n^3 flops in the general case (omitting lower-order terms). However, a QR decomposition of a Hessenberg matrix costs 6n^2 flops, while it costs 10/3 n^3 flops to reduce a matrix to Hessenberg form using Householder transformations. If your matrix is symmetric, then the Hessenberg form is in fact tridiagonal. In this case, it costs 4/3 n^3 flops to bring it into tridiagonal form, and I think that every QR decomposition then costs O(n) flops, but I'm not sure about that. Warning: I probably have some constants wrong.
- I have no idea how much of this generalizes to Hermitian matrices. Please do change the article to make it clearer. Cheers, Jitse Niesen (talk) 15:26, 14 November 2005 (UTC)
- Hi Jitse. Still in England? I clarified some things a bit, especially about the computational complexity, also using Golub & Van Loan as a source. I hope it's good now. One thing is not clear to me: a general Householder orthogonalization (without using a Hessenberg form) costs 2/3 n^3 (Golub, page 148). This is even less than first bringing the matrix to Hessenberg form (5/3 n^3) and then doing a QR decomposition. Is the Hessenberg form only used for numerical stability? Wilbert Dijkhof, 131.155.54.18 14:55, 16 November 2005 (UTC)
- Well, to be precise, I'm still in Scotland. People here are not too keen on it when you say that Edinburgh is in England (just as some people in Eindhoven don't like to hear that Eindhoven is in Holland).
- If all you want to do is compute a single QR decomposition, then it is indeed not useful to bring the matrix into Hessenberg form. However, in the QR algorithm for computing eigenvalues, you have to do many QR decompositions, each with a cost of 2/3 n^3. In that case it is better to reduce A to Hessenberg form and incur the 5/3 n^3 cost up front, because in the subsequent iterations you can then do each QR decomposition for O(n^2) cost. Does that make sense? I should stress that numerical linear algebra is not my speciality; in fact, it is one of the fields I really should know better. -- Jitse Niesen (talk) 23:29, 16 November 2005 (UTC)
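For concreteness, here is a minimal NumPy sketch of the plain (unshifted) iteration being discussed above. This is only an illustration, not the article's pseudocode: it omits the Hessenberg reduction and the shifts that make the practical algorithm fast.

```python
import numpy as np

def qr_iteration(A, iters=200):
    """Unshifted QR iteration: A_0 = A; factor A_k = Q_k R_k, then
    form A_{k+1} = R_k Q_k, which is a similarity transform of A_k."""
    Ak = np.array(A, dtype=float)
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)  # A_k = Q_k R_k
        Ak = R @ Q               # A_{k+1} = R_k Q_k = Q_k^T A_k Q_k
    return Ak

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(np.diag(qr_iteration(A)))  # ~[3.618, 1.382], the eigenvalues of A
```

Each step here pays for a full QR decomposition; the point of the Hessenberg (or tridiagonal) reduction discussed above is to cut that per-step cost to O(n^2) (or O(n)).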
Some possible improvements
Some day this article could use a little fine-tuning. To that end, here are a few notes.
The eigenvalues are roots of the characteristic polynomial, and (by Abel–Ruffini) roots of polynomials of degree five or more admit no general closed form, so in general we cannot compute them by a fixed predetermined sequence of operations. That's why we use an open-ended iteration to achieve convergence.
This also implies that we cannot directly bring a matrix into upper triangular (or diagonal, for symmetric) form. However, we can use a fixed sequence to bring a matrix into upper Hessenberg (or tridiagonal, for symmetric) form. This gives us a mass of known zeros that will be preserved by the iterations to follow.
Clever as it is that reversing the QR factors effects a similarity transform, it is not at all obvious that doing this repeatedly will produce the convergence we seek. And by itself, it may not.
Success depends critically on the use of shifts. That we choose to do the shifts implicitly is a numerical nicety; but decent convergence requires dealing with close eigenvalues somehow. The article really must explain shifting (or link to an explanation); a rough sketch follows these notes.
The most important special case is that of real symmetric matrices, whose eigenvalues are all real. For a general real matrix, the characteristic polynomial has real coefficients, so any complex eigenvalues occur in complex conjugate pairs. The algorithm may then not be able to achieve a real triangular form, and provision must be made for recognizing the 2×2 blocks giving a conjugate pair.
In counting costs, we must distinguish between a QR step, which is iterated an uncertain number of times, and a fixed cost like reduction to upper Hessenberg form. Also, the QR factorization gets cheaper as we succeed in isolating roots, because we can ignore more and more of the matrix. Note that for symmetric matrices we can store the matrix as just the diagonal and subdiagonal; this is why the QR factorization is O(n) at worst. Without symmetry, we must touch half the matrix, so we can't beat O(n^2). If we can count on a bounded number of iterations per eigenvalue, then the total cost of the iteration phase is n times the cost of an iteration: O(n^3) in general, and O(n^2) in the symmetric case.
The speed of convergence is one of the most remarkable aspects of the QR algorithm. It was a tremendous improvement over its predecessors. However, today we know that a Jacobi algorithm can give better numeric results, and is more amenable to parallelization. Some mention should be made of these issues, even though QR deservedly remains a workhorse.
Of all these points, the need for an extended discussion of shifting seems most urgent. --KSmrqT 13:13, 19 August 2006 (UTC)
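Since shifting seems the most urgent point, here is a rough NumPy sketch of a single-shift iteration with deflation, as promised above. It uses the simplest possible shift (the bottom-right entry, a Rayleigh-quotient shift), assumes all eigenvalues are real, and skips both the initial Hessenberg reduction and the 2×2 conjugate-pair blocks mentioned above.

```python
import numpy as np

def shifted_qr(A, tol=1e-12, max_iter=500):
    """Single-shift QR iteration with deflation: shift by the current
    bottom-right entry, iterate until the last subdiagonal entry is
    negligible, record an eigenvalue, shrink the active matrix."""
    H = np.array(A, dtype=float)
    n = H.shape[0]
    eigenvalues = []
    while n > 1:
        for _ in range(max_iter):
            mu = H[n-1, n-1]                    # shift
            Q, R = np.linalg.qr(H[:n, :n] - mu * np.eye(n))
            H[:n, :n] = R @ Q + mu * np.eye(n)  # still similar to A
            if abs(H[n-1, n-2]) < tol * (abs(H[n-2, n-2]) + abs(H[n-1, n-1])):
                break                           # eigenvalue isolated
        eigenvalues.append(H[n-1, n-1])         # deflate
        n -= 1
    eigenvalues.append(H[0, 0])
    return eigenvalues
```

With a good shift, each eigenvalue typically takes only a handful of iterations, which is what makes the bounded-iterations cost estimate above plausible.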
Termination
Is it possible to decide whether this algorithm terminates, i.e. whether the given matrix actually has eigenvalues? If so, it should be mentioned here, I suppose. Nulli 10:28, 15 November 2006 (UTC)
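For what it's worth: every square complex matrix has n eigenvalues (counting multiplicity), so existence is never the obstacle; the iteration may simply converge slowly, and for real matrices with complex eigenvalues it converges to 2×2 blocks rather than a triangle. In practice one doesn't test termination in an exact sense but stops when the subdiagonal entries are negligible. A small sketch of such a test (my illustration, assuming the iterate H is upper Hessenberg):

```python
import numpy as np

def is_converged(H, tol=1e-12):
    """True when every subdiagonal entry of the Hessenberg iterate H is
    negligible relative to its diagonal neighbours, i.e. when H is
    numerically upper triangular. (Real matrices with complex eigenvalue
    pairs never pass this; their 2x2 blocks must be accepted instead.)"""
    n = H.shape[0]
    return all(abs(H[i+1, i]) <= tol * (abs(H[i, i]) + abs(H[i+1, i+1]))
               for i in range(n - 1))
```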
QR decomposition
Why don't we merge this page into the QR decomposition one, and then link this one to QR decomposition?
--Fbianco 22:14, 15 April 2007 (UTC)
QR decomposition should have a link here, but the two are different: QR iteration (this page) uses the QR decomposition repeatedly to find eigenvalues.
- If anything, this article could be expanded into a series of articles including multishift versions and the modern forms of the algorithm that do the QR decomposition only in a very hidden implicit form.--LutzL (talk) 07:36, 16 July 2008 (UTC)
incorrect operation counts in article
The claimed operation counts for reduction to Hessenberg form in the article seem to be off by a factor of two.
The Householder method for reduction to Hessenberg form requires 10/3 n^3 arithmetic operations for general square matrices, and 4/3 n^3 for Hermitian matrices (where Hessenberg = tridiagonal). See, for example, Trefethen and Bau, Numerical Linear Algebra, or Demmel, Applied Numerical Linear Algebra.
There is another, non-Householder method based on a Gaussian-elimination-like procedure that takes fewer arithmetic operations. However, this algorithm (which I believe was used in EISPACK, but not in LAPACK) seems to have fallen out of favor because it is sometimes unstable (Businger, 1969). The best reference for that method seems to be Wilkinson's 1965 book, which I don't have in front of me; my recollection is that it requires half as many operations as the Householder method, which may explain the numbers in the article (which were incorrectly attributed to Householder). However, it would be good to check this against a good reference. (Numerical Recipes uses the Gaussian-elimination method, but I think the operation count it gives is erroneous; it should be 5/3 n^3.)
—Steven G. Johnson (talk) 02:44, 6 November 2008 (UTC)
- Okay, I've changed the article to correctly give the Householder-algorithm counts. It seems to me that a further discussion of this algorithm belongs in the Hessenberg matrix article. —Steven G. Johnson (talk) 02:57, 6 November 2008 (UTC)
- Your memory is fine. Golub & van Loan says that reduction to Hessenberg form via Gaussian elimination costs 5/3 n^3 operations, just like you say. -- Jitse Niesen (talk) 14:26, 6 November 2008 (UTC)
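For reference, here is a toy NumPy version of the Householder reduction whose operation count (10/3 n^3) is discussed above. This is only a sketch to show where the work goes; LAPACK's dgehrd is the production implementation.

```python
import numpy as np

def hessenberg(A):
    """Reduce A to upper Hessenberg form by Householder reflections.
    Each step k zeroes column k below the subdiagonal and applies the
    same reflection from the right, so the result stays similar to A."""
    H = np.array(A, dtype=float)
    n = H.shape[0]
    for k in range(n - 2):
        x = H[k+1:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])  # Householder vector
        norm_v = np.linalg.norm(v)
        if norm_v == 0:
            continue  # column already zeroed
        v /= norm_v
        # Apply P = I - 2 v v^T from the left, then from the right
        H[k+1:, k:] -= 2.0 * np.outer(v, v @ H[k+1:, k:])
        H[:, k+1:] -= 2.0 * np.outer(H[:, k+1:] @ v, v)
    return H
```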
Q_0, R_0?
This page does not make it clear what the starting Q and R are. It says we let A_0 = A and then A_{k+1} = R_k Q_k, which I guess implies that A_k = Q_k R_k, but what's Q_0 and R_0? —Ben FrantzDale (talk) 10:59, 6 June 2011 (UTC)
- Qk and Rk are the QR factors of Ak. So Q0 and R0 are the QR factors of A0 = A, i.e. A = Q0 R0. See QR decomposition. — Steven G. Johnson (talk) 12:55, 6 June 2011 (UTC)
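To make that concrete, a tiny NumPy check of the relations in question (my example matrix, just for illustration):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
A0 = A                                  # A_0 := A
Q0, R0 = np.linalg.qr(A0)               # Q_0 and R_0: the QR factors of A_0
A1 = R0 @ Q0                            # A_1 = R_0 Q_0
print(np.allclose(Q0 @ R0, A0))         # True: A = Q_0 R_0
print(np.allclose(A1, Q0.T @ A0 @ Q0))  # True: A_1 is similar to A_0
```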
What about eigenvalues and vectors of non-square matrices?
As of 2016-04-20, the opening paragraph ends, "The basic idea is to perform a QR decomposition, writing the matrix as a product of an orthogonal matrix and an upper triangular matrix, multiply the factors in the reverse order, and iterate."
This will NOT work for a matrix that is not square, because Q and R are not conformable in the reverse order.
??? Thanks, DavidMCEddy (talk) 17:54, 20 April 2016 (UTC)
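Right — a quick shape check makes the conformability problem explicit (toy 3×2 example):

```python
import numpy as np

A = np.ones((3, 2))            # a non-square matrix
Q, R = np.linalg.qr(A)         # reduced QR: Q is 3x2, R is 2x2
print(Q.shape, R.shape)        # (3, 2) (2, 2)
# R @ Q would multiply a 2x2 by a 3x2 matrix: the inner dimensions
# (2 and 3) don't match, so the reverse-order product is undefined.
# The same mismatch occurs with the full QR (Q 3x3, R 3x2).
```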
External links modified
Hello fellow Wikipedians,
I have just modified one external link on QR algorithm. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:
- Added {{dead link}} tag to http://www.alglib.net/matrixops/general/svd.php.
- Added archive http://web.archive.org/web/20081209042103/http://math.fullerton.edu:80/mathews/n2003/QRMethodMod.html to http://math.fullerton.edu/mathews/n2003/QRMethodMod.html
When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).
This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).
- If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
- If you found an error with any archives or the URLs themselves, you can fix them with this tool.
Cheers.—InternetArchiveBot (Report bug) 10:21, 21 July 2016 (UTC)
Added visualizations and even discussion surrounding computability
The code (right now pretty terrible) can be found here: https://github.com/ogogmad/la-vis. I don't know if the discussion around computability is clear or of interest to people. --Svennik (talk) 17:36, 9 August 2021 (UTC)
- The following YouTube video is by me, based on the visualization I provided here: https://www.youtube.com/watch?v=XGAaHoppcvE --Svennik (talk) 20:09, 16 March 2022 (UTC)