Okay, so today we are going to learn the last method for solving eigenvalue and eigenvector problems. In previous classes we learned the power iteration method, the normalized power iteration method, and the inverse power iteration method. The concept in all of them is the same: iteration. Normalized power iteration is an improved iteration method where, during each step, we normalize the vector so that no entry in the vector is larger than one, which helps simplify the computation. Inverse power iteration finds the smallest eigenvalue and its corresponding eigenvector, while power iteration finds the largest eigenvalue and its corresponding eigenvector.

Today the method we are going to learn is called the Rayleigh quotient. Why do we want to develop or learn this method? Because, as you can see from the examples, whether it is power iteration, normalized power iteration, or inverse iteration, we need quite a few steps, sometimes ten or twelve, before we get very close to the eigenvalue and eigenvector, and that takes some time. Is there any way to get the eigenvalue in fewer steps, say five or six? That would be much better: less computation, less time. This is why we have the Rayleigh quotient method; it can accelerate the convergence of the power iteration method.

Now let's see what the Rayleigh quotient is. Given an approximate eigenvector x for a matrix A, determine the best estimate for the corresponding eigenvalue λ. This can be considered as an n-by-1 linear least squares problem. In this description, we regard the eigenvalue problem as a least squares problem: instead of writing A x = λ x, we put x in front of λ and treat λ as the one unknown, x λ ≅ A x. Recall that when we learned least squares problems, the fundamental method was the normal equation: for A x ≅ b we premultiply both sides by A^T. Here x is regarded as the matrix, so we premultiply both sides by x^T, which gives

x^T x λ = x^T A x.

Since we are trying to get a solution for λ, we divide both sides by x^T x:

λ = (x^T A x) / (x^T x).

This Rayleigh quotient method has many useful purposes; the most important one is that it accelerates the convergence of the power iteration method. In other words, it gives faster convergence.
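To make the formula concrete, here is a minimal sketch in Python with NumPy; this is my own illustration, not the lecturer's code, and the helper name rayleigh_quotient is made up. It uses the example matrix that appears in the next part of the lecture:

```python
import numpy as np

def rayleigh_quotient(A, x):
    # Best least-squares estimate of the eigenvalue for an approximate
    # eigenvector x:  lambda = (x^T A x) / (x^T x)
    return (x @ A @ x) / (x @ x)

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
x = np.array([1.0, 0.9])          # rough guess near the true eigenvector (1, 1)
print(rayleigh_quotient(A, x))    # ~3.9945, close to the true eigenvalue 4
```

One useful property worth noting: the quotient is scale-invariant, so multiplying x by any nonzero constant leaves λ unchanged. That is why it does not matter whether we plug in y_k or the normalized x_k in the iteration below.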
Let's use an example to show the convergence speed of this Rayleigh quotient method. In order to compare with the previous methods, we use the same example. When we learned normalized power iteration, we used the matrix A = [3 1; 1 3] and the initial vector x0 = (0, 1)^T. Let's double check: yes, x0 = (0, 1)^T. Now let's use the Rayleigh quotient method to find the dominant eigenvalue and the corresponding eigenvector. We know the Rayleigh quotient is meant to accelerate the power iteration method, and here we combine it with normalized power iteration, so we use the normalized power iteration update rules: first update y_k, then x_k,

y_k = A x_{k-1},
x_k = y_k / ||y_k||_∞,

and at each step we also compute the Rayleigh quotient

λ_k = (x_k^T A x_k) / (x_k^T x_k).

First, let's build a table. We have the step index k, the vector x_k, and the infinity norm ||y_k||_∞, which I put here because it is the eigenvalue estimate used by power iteration and normalized power iteration, so we can compare. The last column is the Rayleigh quotient. (A short code sketch after the final table reproduces all of these steps.)

Step 0: x0 = (0, 1)^T, as given.

Step 1: update y1 and x1. We have y1 = A x0 = (1, 3)^T. The infinity norm is ||y1||_∞ = 3, the maximum magnitude, so x1 = y1 / 3 = (1/3, 1)^T. Now calculate the quotient term, which is λ. Here x1^T is a row vector, so λ = (x1^T A x1) / (x1^T x1). For the numerator, A x1 = (3·(1/3) + 1, 1/3 + 3)^T = (2, 10/3)^T, so x1^T (A x1) = (1/3)·2 + 1·(10/3) = 4. The denominator is x1^T x1 = 1/9 + 1 = 10/9. So the final result is λ = 4 / (10/9) = 36/10 = 3.6. Let's add this to the table: at step 1, x1^T = (1/3, 1), ||y1||_∞ = 3, and the quotient term is 3.6.
Step 2: update y2 and x2. We have y2 = A x1 = (3·(1/3) + 1, 1/3 + 3)^T = (2, 10/3)^T, and ||y2||_∞ = 10/3, so x2 = y2 / (10/3) = (0.6, 1)^T. Then the quotient term for this step is λ = (x2^T A x2) / (x2^T x2). Please follow along and calculate with me at the same time; if I make a mistake, tell me. We get A x2 = (3·0.6 + 1, 0.6 + 3)^T = (14/5, 18/5)^T, so the numerator is x2^T (A x2) = (3/5)(14/5) + 1·(18/5) = 132/25, and the denominator is x2^T x2 = 9/25 + 1 = 34/25. The result is λ = 132/34 ≈ 3.88. For this kind of computation you can use a calculator. Add to the table: at step 2, x2^T = (0.6, 1), ||y2||_∞ = 10/3 ≈ 3.33, and the quotient term is 3.88.

Step 3: y3 = A x2 = (3·(3/5) + 1, 3/5 + 3)^T = (14/5, 18/5)^T. Its infinity norm is ||y3||_∞ = 18/5, so x3 = y3 / (18/5) = (7/9, 1)^T ≈ (0.78, 1)^T. Now calculate λ: A x3 = (3·(7/9) + 1, 7/9 + 3)^T = (10/3, 34/9)^T, so the numerator is (7/9)(10/3) + 1·(34/9) = 172/27, and the denominator is x3^T x3 = 49/81 + 1 = 130/81. The result is λ = 258/65 ≈ 3.97. Add to the table: at step 3, x3^T ≈ (0.78, 1), ||y3||_∞ = 18/5 = 3.6, and the quotient term is 3.97.

Let's do just one more step. Step 4: y4 = A x3 = (3·(7/9) + 1, 7/9 + 3)^T = (10/3, 34/9)^T, with ||y4||_∞ = 34/9, so x4 = y4 / (34/9) = (15/17, 1)^T ≈ (0.88, 1)^T. For the quotient term, A x4 = (3·(15/17) + 1, 15/17 + 3)^T = (62/17, 66/17)^T, so the numerator is (15/17)(62/17) + 1·(66/17) = 2052/289, and the denominator is x4^T x4 = 225/289 + 1 = 514/289. The result is λ = 2052/514 ≈ 3.99. Add to the table: at step 4, x4^T ≈ (0.88, 1), ||y4||_∞ = 34/9 ≈ 3.78, and the quotient term is 3.99.

Let's stop here. I can tell you that at step 6, x6 ≈ (0.97, 1)^T, ||y6||_∞ ≈ 3.94, and the quotient term λ is 4. The table so far:

k    x_k^T          ||y_k||_∞        Rayleigh quotient
0    (0, 1)         —                —
1    (1/3, 1)       3                3.6
2    (0.6, 1)       10/3 ≈ 3.33      3.88
3    (0.78, 1)      18/5 = 3.6       3.97
4    (0.88, 1)      34/9 ≈ 3.78      3.99
6    (0.97, 1)      ≈ 3.94           4.00

This is the computation process for this example using the Rayleigh quotient method, and you can see that after six steps the quotient term, which estimates the dominant eigenvalue, reaches 4.
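If you want to check the whole table, here is a short sketch of this procedure in Python with NumPy. Again, this is my own illustration under the lecture's setup, not code from the course:

```python
import numpy as np

# Normalized power iteration with a Rayleigh-quotient eigenvalue estimate,
# reproducing the worked example: A = [[3, 1], [1, 3]], x0 = (0, 1)^T.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])
x = np.array([0.0, 1.0])

print("k    x_k              ||y_k||_inf    quotient")
for k in range(1, 7):
    y = A @ x                                # y_k = A x_{k-1}
    norm = np.linalg.norm(y, ord=np.inf)     # infinity norm of y_k
    x = y / norm                             # x_k = y_k / ||y_k||_inf
    lam = (x @ A @ x) / (x @ x)              # Rayleigh quotient at step k
    print(f"{k}    {np.round(x, 2)}    {norm:.2f}    {lam:.4f}")
```

Running this reproduces the rows above: by step 6 the quotient column prints 3.9995, already 4 to two decimal places, while the infinity-norm column is still at about 3.94.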
So the quotient reaches 4 after six steps. If you remember when we did the same example with normalized power iteration, at step 6 the estimate ||y6||_∞, which serves as the eigenvalue there, was still short of 4 (about 3.94), and it took until about step 12 to reach the eigenvalue 4. But with the Rayleigh quotient, you can see that we reach the value 4 at step 6. You can see the difference: if you compare, step by step, the ||y_k||_∞ column against the quotient column, the quotient term is always larger than the infinity norm and gets closer and closer to 4, reaching it faster. That is why, when we introduced this Rayleigh quotient method, I said it converges faster than normalized power iteration and power iteration: it accelerates the convergence.

Notice that although the matrix A and the initial vector x0 are not that complex, the values are simple, the hand computation is still somewhat involved. The purpose here is to help you practice and be able to use this method to solve this kind of problem. In reality, you can use a calculator or programming tools: for example, you can enter these formulas and matrices in something like MATLAB and automatically get the computational result, so it will not take much time. For handwritten computation it does take some time, but working through the process lets you understand the method and be able to use it.

So far we have covered power iteration, normalized power iteration, inverse iteration, and the Rayleigh quotient method, so we can use these four new methods to solve eigenvalue and eigenvector problems. This is different from what we learned before, so try to understand the new knowledge and use the new methods to solve the same kind of problem.

Okay, we have finished this chapter, so next class, next Monday, we will have the exam. We call it the midterm exam because we do not have a final exam; I scheduled this midterm in the tenth week. The midterm exam will cover the knowledge and content we have already learned, which means everything up to and including this lecture, lecture 13.
It is very likely there will be one question about these four methods. For the exam, we only have three types of questions. The first type is short answer questions, which will basically focus on some concepts, definitions, and properties. For example, there may be a question like "What is an overdetermined system?" where you just use one or two sentences to answer. The second type is true or false, which is similar to short answer but focuses more on properties: for example, I give you a matrix and make the statement "this matrix is singular," and you tell me whether that is true or false. The third type will be computation questions. As you can see, we have been doing a lot of matrix and vector computations, so this is the major component of the exam: computations with vectors and matrices, and especially you are required to solve problems using the newly learned methods, like elementary elimination matrices and Gaussian elimination. You should use these methods to solve linear systems, not just ad hoc row operations.

How much time? Ninety minutes, one and a half hours, and you submit your answers through Blackboard. I will make the exam available at the class time, so it will be available at four o'clock.

How to prepare for this exam? One important resource is the lectures, because in the lectures we did many examples together, step by step, so the slides, and especially the examples on the slides, are very important. Another source is the homework, where we did extra practice; there may be questions similar to the homework questions. The next source is the labs, especially the most recent lab, which is about a graph and calculating its adjacency matrix. It is very interesting: you can see that the powers of the adjacency matrix have physical meanings for the graph. So the labs are also very important for the exam.

How many questions? For the short answer questions, maybe just four; for true or false, maybe five; and for the computation questions, I think maybe five to six. Each computation question may have sub-questions, like in the homework where one question has parts a, b, and c. I can give you some information about what the computation questions will cover. The first one is vector norms: we have different kinds of norms for vectors, and you should be able to calculate them. Another computation question may be about matrices:
maybe doing some matrix computation, or writing down some physical meanings of a matrix power; if you already did the lab, you should know what I am talking about. The next one, of course, is elementary elimination matrices and Gaussian elimination. The fourth one should be the least squares problem, an overdetermined system: how to solve it using the normal equations or the Householder transformation. And the last question should be about the eigenvalue and eigenvector problem. So basically we have five or six computation questions.

About the time: the short answer and true or false questions are actually very simple, so they will not take much time. For the computation questions, I will try to make the computations very simple; I will design the values to be very small and simple, even though a question may have sub-questions. Also, if we have a question about the Householder transformation, it will be very simple, maybe just the first step, something like that; and if we have a power iteration question, maybe it only asks you to calculate two steps. I will not require you to calculate ten steps, so that will save some time.

I will post the solutions for the homework and the labs. The solutions should include both the work and the answers, but the work, the procedure, may not be very detailed. Any other questions?