05-05-2004, 02:45 PM | #1
Insane
Location: Mexico
Matrix Algebra (AGAIN)
Ok, so here goes. I got some really great help from you guys last time, so maybe you can help me once more.
Suppose that A is an invertible matrix and that A can be diagonalized orthogonally. Explain why A^-1 may also be diagonalized orthogonally. Any help? Somebody, anybody...
__________________
Ignorance Is Bliss
05-06-2004, 11:34 AM | #3
pigglet pigglet
Location: Locash
Off the top of my head, without a real answer, I would think that if you looked up something like Singular Value Decomposition, you might find something that essentially answers this question. What I'm about to say could be very wrong, but:
If A = LDU (or something like that), where L and U are the matrices that orthogonalize A, leaving behind only a diagonal matrix D, then taking the inverse of A gives A_inv = inv(LDU) = U_inv D_inv L_inv (the ordering reverses when you invert a product). You can see that D_inv will be a matrix whose diagonal entries are 1/d_i, where the d_i were the diagonal entries of the original matrix D. Assuming that D isn't singular (and it can't be, since A is invertible), something along these lines should point towards A_inv being diagonalized orthogonally. I think. At least it might be food for thought.
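For what it's worth, here's a quick numerical sanity check of that idea. It's a sketch, not a proof; the small symmetric matrix is something I made up, and numpy is assumed:
[code]
# Quick numerical check of the idea above (illustration only, not a proof).
# The 2x2 symmetric matrix A is just an example; numpy is the only dependency.
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])        # symmetric, so it can be diagonalized orthogonally

d, Q = np.linalg.eigh(A)          # A = Q @ diag(d) @ Q.T, with Q orthogonal

# The inverse keeps the same Q; only the diagonal entries flip to 1/d_i.
D_inv = np.diag(1.0 / d)
print(np.allclose(np.linalg.inv(A), Q @ D_inv @ Q.T))   # True
[/code]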
__________________
You don't love me, you just love my piggy style
05-06-2004, 02:31 PM | #4
Insane
Location: Mexico
Quote:
(P^-1)*A*P = D, in which D is a diagonal matrix that contains the eigenvalues of matrix A. For the diagonalization to be orthogonal, P must be an orthogonal matrix, so P^-1 = P^T.
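A tiny numerical illustration of that definition (the 2x2 symmetric matrix is just an arbitrary example, and numpy is assumed):
[code]
# Illustration of (P^-1)*A*P = D for an orthogonally diagonalizable A.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric example matrix

eigvals, P = np.linalg.eigh(A)      # columns of P are orthonormal eigenvectors of A

D = np.linalg.inv(P) @ A @ P        # should be the diagonal matrix of eigenvalues
print(np.allclose(D, np.diag(eigvals)))     # True
print(np.allclose(np.linalg.inv(P), P.T))   # True: P is orthogonal, so P^-1 = P^T
[/code]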
__________________
Ignorance Is Bliss
05-06-2004, 10:19 PM | #5
Location: Waterloo, Ontario
I believe that, in order for D to be invertible, it must have no zero columns (a diagonal matrix in general may have zero columns, but D can't here, since A is invertible). If so, then it's a simple matter to prove that D has a multiplicative inverse and that this inverse is also a diagonal matrix.
Knowing this, if (1/P)AP = D is an orthogonal diagonalization of A (so P is orthogonal), then 1/( (1/P)AP ) = (1/P)(1/A)P = 1/D and, because 1/D is also diagonal and the same orthogonal P does the job, 1/A must also be orthogonally diagonalizable. QED. ...edited for correctness... Last edited by KnifeMissile; 05-07-2004 at 12:27 AM..
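A hedged numerical check of this argument (the invertible symmetric matrix below is just an example, not anything from the original question; numpy assumed):
[code]
# Check that the same P that diagonalizes A also diagonalizes 1/A, giving 1/D.
import numpy as np

A = np.array([[5.0, 2.0],
              [2.0, 3.0]])          # invertible, symmetric example

d, P = np.linalg.eigh(A)            # (1/P) A P = D, with P orthogonal
D = np.diag(d)

lhs = np.linalg.inv(P) @ np.linalg.inv(A) @ P    # (1/P)(1/A)P
print(np.allclose(lhs, np.linalg.inv(D)))        # True: it equals 1/D, still diagonal
[/code]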
05-09-2004, 02:02 AM | #6
Wehret Den Anfängen! (Resist the beginnings!)
Location: Ontario, Canada
You might want to do what Knife did in a few more steps.
Let A be a matrix, and let D be the orthogonal diagonalization of A. Then there exists a P such that
1/P * A * P = D.
Let 1/A be the inverse of A. Examine C = 1/P * 1/A * P. Then
C * D = 1/P * 1/A * P * 1/P * A * P
      = 1/P * 1/A * (P * 1/P) * A * P
      = 1/P * (1/A * A) * P
      = 1/P * P
      = I
Thus, C = 1/D.
As D is diagonal, its inverse is diagonal. Thus, C is diagonal.
Then, as C = 1/P * 1/A * P, by the definition of orthogonal diagonalization, 1/A can be orthogonally diagonalized.
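Here's the same chain of steps checked numerically, just a sketch with a made-up invertible symmetric matrix and numpy assumed:
[code]
# Numerical walk-through of the C * D = I argument above.
import numpy as np

A = np.array([[6.0, 2.0],
              [2.0, 4.0]])           # invertible, symmetric example

d, P = np.linalg.eigh(A)
D = np.diag(d)                       # 1/P * A * P = D

C = np.linalg.inv(P) @ np.linalg.inv(A) @ P   # C = 1/P * 1/A * P

print(np.allclose(C @ D, np.eye(2)))          # True: C * D = I
print(np.allclose(C, np.linalg.inv(D)))       # True: so C = 1/D, which is diagonal
[/code]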
__________________
Last edited by JHVH : 10-29-4004 BC at 09:00 PM. Reason: Time for a rest.
Tags |
algebra, matrix |