Transcript for:
Introduction to Matrix Powers and Properties

Hey folks, my name is Nathan Johnston. Welcome to lecture 11 of Introductory Linear Algebra. Today's lecture is going to be a short one, because all we have to do is introduce matrix powers, and they're actually fairly intuitive, so we're not going to have to say too much about them.

Okay, so a power of a matrix is just what you get if you multiply that matrix by itself a whole bunch of times. That's what this notation over here means: A to the power k means A times A times ... times A, where there are k copies of A in total. So for example, A squared, A to the power two, just means A times A.

A cubed, A to the power three, means A times A times A, and so on. Okay, and this is in direct analogy with powers of real numbers; it's the same thing as for real numbers. x cubed means x times x times x if x is a real number.
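This definition is easy to check numerically. Here's a minimal sketch using numpy and a small made-up matrix (the specific entries are just for illustration); `numpy.linalg.matrix_power` implements exactly this repeated-multiplication definition.

```python
import numpy as np

# A hypothetical 2x2 matrix, chosen arbitrarily for illustration.
A = np.array([[2, 0],
              [1, 3]])

A_squared = A @ A        # A^2 = A * A
A_cubed = A @ A @ A      # A^3 = A * A * A

# numpy's built-in matrix power agrees with the repeated-product definition:
assert np.array_equal(A_squared, np.linalg.matrix_power(A, 2))
assert np.array_equal(A_cubed, np.linalg.matrix_power(A, 3))
```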

Same thing here for matrices. Okay, one slightly weird thing that you have to be a little bit careful about, though, is that we define A to the power zero to be the identity matrix. There are two reasons we do this. One of them is, again, to be in analogy with real numbers: for real numbers, x to the power zero is one, and remember, we think of the identity matrix as our matrix version of the number one. It's the matrix with the property that multiplying by it doesn't change anything, so it's our matrix version of the number one. Another reason we do it, though, is so that matrix exponentiation has nice properties. In particular, if we define A to the power zero to be the identity matrix, then we get theorems like the following one, which tell us that the properties of matrix exponentiation that we want to hold actually do hold.
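As a quick sanity check, numpy follows this same A to the power zero convention (the matrix used here is the one computed with later in the lecture):

```python
import numpy as np

A = np.array([[1, -1],
              [2, 1]])

# By convention, A^0 is the identity matrix, mirroring x^0 = 1 for numbers.
assert np.array_equal(np.linalg.matrix_power(A, 0), np.eye(2))
```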

Okay, so here's the setup: suppose you've got some square matrix A, and k and r are non-negative integers. Then A to the power k, times A to the power r, is just A to the power k plus r. So this is very much like our exponentiation rules for real numbers, right? If you replace A by a real number, then this is a property that you already know about exponentiation. We're just saying the same thing holds if the base is a matrix now.

And similarly, A to the power k, all to the power r, is just A to the power k times r, okay? And again, this is just a matrix version of a property that you already know about powers of real numbers. And the proofs of both of these facts are just immediate, right?

Like, what does A to the power k mean? Well, it means you take k copies of A and multiply them together, and A to the power r means you take r copies of A multiplied together. So altogether I've got k copies of A, and then r more copies of A, which is k plus r copies of A. Okay, and that's it, that's the whole proof. A similar thing happens for the second property: you just count how many A's there are on each side, and they match up, so the two sides are the same thing. So in general, for matrix powers, if you have an exponentiation rule for real numbers, it's probably going to work for matrices, as long as there's only one thing in the base. Notice that in both of these rules it's the same matrix A in the base everywhere, and that's what lets it work.
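Both power rules can be verified numerically. This sketch uses an arbitrary matrix and exponents (any square matrix and non-negative integers would do):

```python
import numpy as np

# A hypothetical matrix and exponents, just for illustration.
A = np.array([[1, 2],
              [0, 1]])
k, r = 2, 3

mp = np.linalg.matrix_power

# Rule 1: A^k A^r = A^(k+r)
assert np.array_equal(mp(A, k) @ mp(A, r), mp(A, k + r))

# Rule 2: (A^k)^r = A^(kr)
assert np.array_equal(mp(mp(A, k), r), mp(A, k * r))
```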

There are also some exponentiation rules with two different things in the base. For example, if x and y are real numbers, then one exponentiation rule that you know says that x times y, all to the power k, equals x to the power k times y to the power k. That's true as long as x and y are real numbers.

Okay, but if you replace those by matrices, then it's not true. If you take A times B, all to the power k, in general that's not the same thing as A to the k times B to the k. Okay, and why is that? Well, what gets in the way is the fact that matrix multiplication is not commutative.

Okay, if you were to expand this out, the left-hand side means you take AB and multiply it by itself k times: AB times AB times AB, and so on. But over here on the right, this means A multiplied by itself k times, and then B multiplied by itself k times. So it's AAA...A times BBB...B.

So there's the same number of a's on the left hand side and on the right hand side and the same number of b's on the left hand side and on the right hand side, but they're in the wrong orders. Okay, and you can't commute them past each other to show that they're equal. In general they're not equal.
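A quick numerical counterexample, using two small hypothetical matrices chosen so that AB and BA differ:

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
B = np.array([[1, 0],
              [1, 1]])

lhs = np.linalg.matrix_power(A @ B, 2)                              # (AB)^2 = ABAB
rhs = np.linalg.matrix_power(A, 2) @ np.linalg.matrix_power(B, 2)   # A^2 B^2

# The two expressions genuinely differ, because AB != BA here.
assert not np.array_equal(lhs, rhs)
```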

Okay, even when k is 2, these two expressions are not the same. The one on the left means ABAB; the one on the right means AABB. Okay, and in general those are different things.

Okay, so if you have an exponentiation rule with just one thing in the base, it's probably fine; if you have an exponentiation rule with two or more things in the base, be really careful, because it might not be true. All right, and one other thing that I haven't dwelled on enough yet, and that I want to dwell on a little bit, is that matrix powers only make sense for square matrices. You can't do this for rectangular matrices, and the reason is just that the matrix multiplication doesn't make sense if you're working with a non-square matrix. You can't do A times A unless the inner dimensions of A and A match up; in other words, you need the number of rows of A to equal the number of columns of A, so it's got to be square. All right, so you can only take powers of square matrices. All right, so let's just compute some powers.

All right, let's make sure that we understand this definition and how to do the computations. So I'm just making up a random square matrix A here: its entries are 1, minus 1 in the first row, and 2, 1 in the second row. All right, let's compute A squared. And for that you just do your usual matrix multiplication rule.

It's just that both of the matrices you're multiplying are now the same. All right, so it's rows of A dotted with columns of A. Top-left entry: first row dotted with first column. Okay, so 1, minus 1 dotted with 1, 2. Well, that's 1 minus 2, which is minus 1 altogether.

The top-right entry will be the first row with the right column. So 1, minus 1 dotted with minus 1, 1; altogether, that's minus 2. Bottom-left entry, where does that come from?

The bottom row dotted with the left column. The bottom-right entry comes from the bottom row with the right column. All right, you do all that and you get A squared. All right, well, what if we want to go higher, and compute A cubed now?

Well, you can think of A cubed in either of two ways: either you do A times A squared, or you do A squared times A. And even though matrix multiplication in general is not commutative, powers of the same matrix do commute with each other, so you can do this either way, A times A squared or A squared times A, and you'll get the same answer. Okay, so in this particular case, this is the answer that you're going to get when you compute A cubed, and you can do it again either as rows of A dotted with columns of A squared, or rows of A squared dotted with columns of A. You'll get the same answer either way.
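These computations can be checked numerically with the lecture's matrix. The bottom-row entries of A squared aren't read out in the audio, so they're computed here from the same row-dot-column rule:

```python
import numpy as np

A = np.array([[1, -1],
              [2, 1]])

# A^2: top-left 1*1 + (-1)*2 = -1, top-right 1*(-1) + (-1)*1 = -2,
#      bottom-left 2*1 + 1*2 = 4, bottom-right 2*(-1) + 1*1 = -1.
A2 = A @ A
assert np.array_equal(A2, np.array([[-1, -2],
                                    [4, -1]]))

# A^3 both ways: powers of a single matrix commute with each other.
assert np.array_equal(A @ A2, A2 @ A)
```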

All right, and of course you can go even farther. You can go to A to the power 4 if you want, and based on what we've computed so far, there are a lot of different ways we could have gotten A to the power 4. We could have multiplied A by A cubed, so that would be rows of A with columns of A cubed. Or we could have done A cubed times A, which would be rows of A cubed dotted with columns of A. Or we could have done A squared times A squared: rows of A squared with columns of A squared. All of those will give us the same thing, so as long as the powers add up to the right number, you get the right answer. Okay, so there are a lot of different ways that you can think about this matrix here, but no matter which one you use to compute it, you get the same thing. All right, and now I'm just going to do a brief sidebar about how to compute large powers of matrices a little bit more efficiently than this naive way of increasing the power by one every time. What if I asked you to compute A to the power 8?
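Before getting to that, the three routes to A to the power 4 just described can be checked numerically with the lecture's matrix:

```python
import numpy as np

A = np.array([[1, -1],
              [2, 1]])
mp = np.linalg.matrix_power

# Three routes to A^4; as long as the exponents add up to 4, they all agree.
via_A_A3 = A @ mp(A, 3)           # rows of A with columns of A^3
via_A3_A = mp(A, 3) @ A           # rows of A^3 with columns of A
via_A2_A2 = mp(A, 2) @ mp(A, 2)   # rows of A^2 with columns of A^2

assert np.array_equal(via_A_A3, via_A3_A)
assert np.array_equal(via_A_A3, via_A2_A2)
```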

Okay, well, I mean, you could compute A to the power 5, then A to the power 6, then the power 7, then the power 8, just by multiplying by A every single time. But if we use our exponentiation laws, right, our power rules, then we can do this a little more quickly by noticing that, hey, A to the power 8 is the same as A to the power 4, squared. So I just take this A to the power 4 that I already have and square it; I multiply it by itself, and that gives me A to the power 8. Anyway, I do that multiplication, and I mean, it's kind of ugly because the numbers are getting bigger.

But it's just the same thing again from the definition, and this is what I get. And even though the numbers are big, that's certainly much quicker than multiplying by A an additional 4 times. Okay, and the savings get even greater when you go to larger powers of A.

Okay, so if you want to compute large powers of a matrix, don't just multiply by that matrix over and over and over again; rather, compute powers by squaring as much as you can. So, for example, if you want A to the power 8, you just square a bunch of times. If you want A to the power 9, you square all the way up to A to the power 8, and then multiply by just one more copy of A. Okay, and this sort of trick is called exponentiation by squaring.
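Here's a minimal sketch of the trick as a function (the function name is my own; the standard iterative version works through the binary digits of the exponent, which also handles cases like A to the power 9 automatically):

```python
import numpy as np

def matrix_power_by_squaring(A, k):
    """Compute A^k by repeated squaring; k is a non-negative integer."""
    result = np.eye(A.shape[0], dtype=A.dtype)   # A^0 = I, the identity
    base = A.copy()
    while k > 0:
        if k % 2 == 1:         # current binary digit of k is 1:
            result = result @ base   # multiply this power of A into the result
        base = base @ base     # square the base: A, A^2, A^4, A^8, ...
        k //= 2
    return result

A = np.array([[1, -1],
              [2, 1]])

# A^8 this way takes 3 squarings instead of 7 multiplications by A.
assert np.array_equal(matrix_power_by_squaring(A, 8),
                      np.linalg.matrix_power(A, 8))
```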

And it's a much quicker way of computing large powers of matrices. The same trick works for large exponents of numbers as well. All right.

So that basically does it for powers of matrices. Later on in this course, we're going to return to powers of matrices and see how to generalize this definition. So all we've done so far is we've learned how to compute powers of matrices where the exponent is some non-negative integer. So a to the power of 0, 1, 2, 3, 4, 5, and so on.

Later on in this course, we're going to learn how to define things like a to the power of minus 3. What does the minus third power of a matrix mean? That seems kind of weird, but it's something that we'll be able to do. And even more weirdly, we'll be able to talk about things like a to the power of root 2, or a to the power of pi, or a to the power of 7 fifths, stuff like that.

So we'll be able to do arbitrary real number powers as well, once we develop a little bit more theory. All right, so that'll do it for today. We just have one more lecture in week three about matrices, and then we'll move on to bigger and better things.

So I will see you next class when we start talking about block matrices.