Well, actually, it's going to be the same thing about eigenvalues and eigenvectors, but I'm going to use matrix notation. So, you remember I have a matrix A, 2 by 2 for example. So I could write the eigenvalue world that way. I want to create an eigenvector matrix by taking the two eigenvectors and putting them in the columns of my matrix. If I have n of them, that allows me to give one name. The eigenvector matrix, maybe I'll call it V for vectors.

And now, just bear with me while I do that multiplication of A times the eigenvector matrix. What's the first column? The first column of the output is A times the first column of the input. And what is A times x1? Well, A times x1 is lambda 1 times x1. And A times the second column is Ax2, which is lambda 2 x2. So I'm seeing lambda 2 x2 in that column.

But I can look at this a little differently. I can say, wait a minute, that is my eigenvector matrix, x1 and x2, those two columns, times a matrix. Right there I did a matrix multiplication. I'll go back and do that preparation in a moment. But when I multiply a matrix by a vector, I take lambda 1 times that one, 0 times that one. Taking this first column, lambda 1 times x1, plus 0 times x2, I get lambda 1 x1, which is what I want. Can you see what I want in the second column here? The result I want is lambda 2 x2. So I want no x1's, and lambda 2 of that column. So that's 0 times that column, plus lambda 2 times that column.

So, what do I have now? I have the whole thing in a beautiful form, as this A times the eigenvector matrix equals, there is the eigenvector matrix again, V, and here is a new matrix that's the eigenvalue matrix. And everybody calls that capital lambda, because those are lambda 1 and lambda 2, so the natural letter is a capital lambda. That's a capital Greek lambda there, the best I could do. So do you see that the two equations written separately, or the four equations or the n equations, combine into one matrix equation.

But now that I have it in matrix form, I can mess around with it. If I multiply both sides by V inverse I discover, well, shall I multiply on the left by V inverse? Yes, I'll do that. If I multiply on the left by V inverse, that's V inverse A V. I diagonalize A by taking the eigenvector matrix on the right, its inverse on the left, multiplying those three matrices, and I get this diagonal matrix. This is matrix multiplication, and my next video is going to recap matrix multiplication.

Or other times I might want to multiply both sides here by V inverse coming on the right. So that would give me A, because V V inverse is the identity. So this is just A, and this is the V, and the lambda, and now the V inverse. So that's a way to see how A is built up, or broken down, into the eigenvector matrix, times the eigenvalue matrix, times the inverse of the eigenvector matrix. Just so you see how it connects with what we already know about eigenvalues and eigenvectors.

So I'll copy that great fact, that A is V lambda V inverse. Oh, what do I want to do? I want to look at A squared. So if I look at A squared, that's V lambda V inverse times another one. Well, you may say I've made a mess out of A squared, but not true, because the V inverse times V is just the identity sitting in the middle. So the V is at the far left, then I have the lambda, and then I have the other lambda, lambda squared, and then the V inverse at the far right. That's A squared. And if I did it n times, I would have A to the n-th: that would be V, lambda to the n-th power, V inverse.

What is this saying? This is A squared. How do I understand that equation? To me that says that the eigenvalues of A squared are lambda squared. I'm just squaring each eigenvalue. And the eigenvectors? What are the eigenvectors of A squared? They're the same vectors, x1, x2, that went into V. They're also the eigenvectors of A cubed, of A to the n-th, of A inverse.

So that's the point of diagonalizing a matrix. Diagonalizing a matrix is another way to see that when I square the matrix, which is usually a big mess, looking at the eigenvalues and eigenvectors is the opposite of a big mess: the eigenvectors stay the same, and the eigenvalues are squares of the eigenvalues of A.
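The identities in this talk are easy to check numerically. Here is a short sketch with NumPy; the particular 2 by 2 matrix is my own illustrative choice, not one from the lecture:

```python
import numpy as np

# An example diagonalizable matrix (illustrative choice, not from the lecture).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of V are the eigenvectors x1, x2; Lam is the capital-Lambda
# eigenvalue matrix with lambda 1, lambda 2 on the diagonal.
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)

# The n eigenvector equations combine into one matrix equation: A V = V Lambda.
assert np.allclose(A @ V, V @ Lam)

# Multiply on the left by V inverse: V^-1 A V = Lambda (the diagonalization).
assert np.allclose(np.linalg.inv(V) @ A @ V, Lam)

# Or multiply on the right by V inverse: A = V Lambda V^-1.
assert np.allclose(A, V @ Lam @ np.linalg.inv(V))

# A^2 = V Lambda^2 V^-1: same eigenvectors, squared eigenvalues.
assert np.allclose(A @ A, V @ Lam**2 @ np.linalg.inv(V))

# And A^n = V Lambda^n V^-1 for any power, here n = 5.
n = 5
assert np.allclose(np.linalg.matrix_power(A, n),
                   V @ Lam**n @ np.linalg.inv(V))
print("all identities check out")
```

Note that `Lam**2` squares a diagonal matrix entry by entry, which is exactly squaring each eigenvalue; the V inverse V pairs in the middle of V Lambda V⁻¹ V Lambda V⁻¹ collapse to the identity, just as described above.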