Talk:Eigendecomposition of a matrix


Not every basis is orthogonal

"Any eigenvector basis for a real symmetric matrix is orthogonal" --- really? The zero matrix is symmetric, and all vectors are its eigenvectors. Yes, the orthogonal basis can be chosen; but the given formulation is inaccurate. Boris Tsirelson (talk) 20:06, 15 April 2009 (UTC)[reply]

I have reverted the edit of 143.167.67.175. It is far from enough to require that the matrix be nonzero; the zero matrix is only the simplest example. Please do not add statements unless you can find them in a source. Boris Tsirelson (talk) 15:16, 21 April 2009 (UTC)

Just out of interest, can you specify a counterexample? Oli Filth(talk|contribs) 16:59, 21 April 2009 (UTC)
Sure. It is a matter of multiplicity. If at least one eigenvalue has multiplicity 2 or more, then we can choose non-orthogonal basis vectors from the corresponding eigenspace. Boris Tsirelson (talk) 17:39, 21 April 2009 (UTC)
Then can we add in parentheses: (If there are n distinct eigenvalues, the eigenvectors must be orthogonal.)? 132.67.97.20 (talk) 13:39, 28 November 2010 (UTC)

The eigenspaces are orthogonal. If they are all one-dimensional, then the corresponding eigenvectors are orthogonal. LMSchmitt 21:25, 26 May 2021 (UTC)
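A concrete instance of the multiplicity point above (my illustration, not part of the original exchange): for the 2×2 identity matrix every nonzero vector is an eigenvector, so one can pick

    A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad v_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \qquad v_1^\top v_2 = 1 \neq 0,

and {v_1, v_2} is an eigenvector basis of a real symmetric matrix that is not orthogonal.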

Regarding "Eigendecomposition of a matrix"[edit]

Is the beginning of the "Eigendecomposition of a matrix" paragraph correct?

I tend to think it should be phrased

"Let A be a square (N×N) matrix with N linearly independent eigenvectors, q_i columns, a_i \,\, (i = 1, \dots, N). Then A can be factorized as \mathbf{A}=\mathbf{Q}\mathbf{\Lambda}\mathbf{Q}^{-1} where Q is the square (N×N) matrix whose ith column is the eigenvector q_i of A ..."

Chlopisko (talk) 04:52, 12 July 2012 (UTC)

Correct. N linearly independent columns is merely equivalent to A being invertible, which is insufficient to make A diagonalizable (for example, a defective matrix can be invertible). N linearly independent eigenvectors, however, means that A cannot have any strictly generalized eigenvectors, so its Jordan form has to be diagonal. I'll change it. 68.102.164.94 (talk) 03:17, 2 October 2013 (UTC)
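A minimal numpy sketch of the corrected statement (my illustration; the matrix is an arbitrary choice with three distinct eigenvalues, hence three linearly independent eigenvectors):

    import numpy as np

    # Arbitrary example: lower-triangular, so its eigenvalues 2, 3, 5 are distinct
    A = np.array([[2.0, 0.0, 0.0],
                  [1.0, 3.0, 0.0],
                  [0.0, 0.0, 5.0]])

    eigenvalues, Q = np.linalg.eig(A)   # columns of Q are the eigenvectors q_i
    Lam = np.diag(eigenvalues)

    # With N independent eigenvectors, Q is invertible and A = Q Lam Q^{-1}
    assert np.allclose(A, Q @ Lam @ np.linalg.inv(Q))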

Dimension (Generalized eigenvalue problem section)

Hello,

In the Generalized eigenvalue problem section, do the v_i vectors have to be of dimension n? Shouldn't we write

I see no reason why the number of generalised eigenvectors should be equal to their dimension, but as this is a bit beyond my math skills, I prefer to ask before modifying.

cdang|write me 12:49, 2 January 2013 (UTC)

Mmmm, I found a source telling me that A and B are n×n matrices. This is consistent with the beginning of the article, but as this section is about a "generalised" case, it is not obvious that this limitation also applies here. So, let me reformulate: would it be a good idea to remind readers that we are still in the case of square matrices?
cdang|write me 13:15, 2 January 2013 (UTC)
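To the dimension question itself: with A and B both n×n, there are n generalized eigenvalues and the eigenvectors v_i are of dimension n, so the count equals the dimension. A small scipy sketch (my own, with arbitrarily chosen matrices; scipy is not mentioned in the thread):

    import numpy as np
    from scipy.linalg import eig

    n = 3
    A = np.array([[2.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 4.0]])        # n x n, symmetric
    B = np.eye(n) + 0.1 * np.ones((n, n))  # n x n, symmetric positive definite

    w, V = eig(A, B)   # solves A v = lambda B v
    print(V.shape)     # (3, 3): n generalized eigenvectors, each of dimension n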

Confusion about (in?) Eigendecomposition of a matrix: Example

I am trying to understand this topic/example. In it, Λ (the matrix with the eigenvalues on the diagonal) is surrounded by Q and Q^{-1}, while A is on the other side of the equation. But in the example, it is the other way around. What am I missing? Or what is the article missing? Orehet (talk) 11:24, 13 November 2013 (UTC)
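The two forms are equivalent, which may resolve the confusion: multiplying A = QΛQ^{-1} by Q^{-1} on the left and by Q on the right gives

    \mathbf{\Lambda} = \mathbf{Q}^{-1} \mathbf{A} \mathbf{Q},

so the definition and the example are the same identity written in opposite directions.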

Functional Calculus and orthogonality condition for Q

In the section "Eigendecomposition of a matrix" it is mentioned that the eigenvectors are usually normalized, but they need not be. While that's true, we should note that in the "Functional calculus" section, for the formulas to hold, we need Q to be an orthogonal matrix. I just found that it's never mentioned in the text, which could be misleading. 136.159.49.121 (talk) 21:24, 2 December 2015 (UTC)
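To spell out the condition (my summary of the standard fact, not a quotation from the article): with a general eigenvector matrix the functional calculus reads

    f(\mathbf{A}) = \mathbf{Q}\, f(\mathbf{\Lambda})\, \mathbf{Q}^{-1},

and replacing Q^{-1} by Q^T (or Q^* in the complex case) is valid only when Q^{-1} = Q^T, i.e. when the eigenvector columns are orthonormal, so that Q is orthogonal (or unitary).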

Example correction

Shouldn't we add the conditions c ≠ 0 and d ≠ 0 in the Example? Pokyrek 8. March 2016 —Preceding undated comment added 07:37, 8 March 2016 (UTC)

Conclusion from Eigenproblem to matrix decomposition is missing parts

"The decomposition can be derived from the fundamental property of eigenvectors:" This section needs to be improved with the matrix diagonalization theorem for which even no wikipedia article exists. Without this theorem the result is plainly wrong, since the vectors may not be linear independent etc. Possible references/how to fix this: https://nlp.stanford.edu/IR-book/html/htmledition/matrix-decompositions-1.html [Just using/stating the theorem] http://mathworld.wolfram.com/EigenDecompositionTheorem.html

Proofs may not be feasible in Wikipedia, though.

One could at least link to the spectral theorem, hinting that this does not follow in general, since this may be confusing: https://en.wikipedia.org/wiki/Spectral_theorem — Preceding unsigned comment added by 134.61.83.68 (talk) 13:45, 4 January 2018 (UTC)
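A small numerical illustration of the failure mode (my sketch, not taken from the cited sources): for a defective matrix the computed eigenvector matrix is numerically singular, so the factorization cannot be formed.

    import numpy as np

    # Jordan block: eigenvalue 1 with algebraic multiplicity 2 but only one eigenvector
    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    w, Q = np.linalg.eig(A)
    print(np.linalg.matrix_rank(Q))  # 1: the returned columns are (numerically) parallel
    # Q is not invertible, so A = Q Lam Q^{-1} is not available for this A.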

Only v_1^* B v_2 = 0, or also more generally v_i^* B v_j = 0?

The article contains the text:

"If A and B are both symmetric or Hermitian, and B is also a positive-definite matrix, the eigenvalues λi are real and eigenvectors v1 and v2 with distinct eigenvalues are B-orthogonal (v1*Bv2 = 0).[1]"

This way of writing suggests that something is stated only about v_1 and v_2.

I presume the statement is true for general v_i and v_j (but still with distinct eigenvalues).

If my understanding is correct, I propose the text be changed in a way that reflects this greater generality more clearly. Redav (talk) 20:04, 26 May 2021 (UTC)
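For the record, the general statement holds, and the standard argument is short (my paraphrase of the usual proof): from A v_i = λ_i B v_i and A v_j = λ_j B v_j, with A and B Hermitian and the λ real,

    v_j^* \mathbf{A} v_i = \lambda_i\, v_j^* \mathbf{B} v_i \qquad \text{and} \qquad v_j^* \mathbf{A} v_i = (\mathbf{A} v_j)^* v_i = \lambda_j\, v_j^* \mathbf{B} v_i,

so (λ_i - λ_j) v_j^* B v_i = 0, and distinct eigenvalues force v_j^* B v_i = 0 for any pair i, j. The article's v_1 and v_2 are evidently just placeholder names.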

References

  1. ^ Parlett, Beresford N. (1998). The symmetric eigenvalue problem (Reprint ed.). Philadelphia: Society for Industrial and Applied Mathematics. p. 345. doi:10.1137/1.9781611971163. ISBN 978-0-89871-402-9.

Regarding the section titled "Useful Facts"

This section title doesn't seem very encyclopedic in my opinion. Would a better name be "Properties of Eigendecompositions"? Nbennett320 (talk) 02:45, 3 November 2021 (UTC)

Normal Matrices

The discussion on unitary and normal matrices contains the line "If A is restricted to be a Hermitian matrix (A = A*)".

This implies that a Hermitian matrix is equal to its own conjugate rather than to its own conjugate transpose. That would limit Hermitian matrices to having only real-valued entries and, arguably worse, any matrix with only real-valued entries would then be Hermitian, i.e. it implies a matrix need not even be a square matrix to be Hermitian.

I've never edited a Wikipedia page in my life and am not starting now, but I stopped by the talk page to say that this warrants a correction, so somebody else can do the honors. 2605:6440:300A:3003:0:0:0:5732 (talk) 06:52, 15 September 2023 (UTC)
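For reference, the intended reading (the standard definition, stated here for clarity rather than quoted from the article) is that the star denotes the conjugate transpose:

    \mathbf{A} = \mathbf{A}^{*}, \qquad (\mathbf{A}^{*})_{ij} = \overline{A_{ji}},

which forces A to be square and permits complex entries; it reduces to plain symmetry A = A^T exactly when all entries are real.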