Transcript for:
Understanding Linear Dependence and Independence

hey folks, my name's Nathan Johnson, and welcome to lecture 4 of advanced linear algebra. Today we're going to be talking about linear dependence and independence. Roughly speaking, the idea behind dependence and independence is that we want to capture the notion of a set of vectors containing redundancies, or not containing any redundancies.

So remember back to last lecture, when we talked about the span of a set of vectors. We talked about this idea that if you have, say, a two-dimensional plane in three-dimensional space, you can span that plane by taking just two vectors: pick any two vectors on that plane that are not collinear, and their span is the whole plane. So in a sense you can describe that plane via just those two vectors. Well, you could also describe that plane via, say, five vectors: pick any five vectors on that plane, and as long as they're not all collinear, the plane is still going to be the span of those five vectors. But this seems somehow kind of silly; it seems redundant, right? You don't need five vectors to describe a plane; it suffices to use just two. And that's the idea that dependence and independence is going to get at: five vectors, yeah, it works, but there are redundancies there. You could throw away some of those vectors and the span would still be the same.

So that's the idea, and the way we formalize it mathematically is via linear combinations. We say that a set of vectors B in a vector space is linearly dependent, in other words that it contains redundancies, if there is some nonzero linear combination of the vectors in that set that equals the zero vector. If there's some way of taking a linear combination of the vectors in B, not all coefficients zero, and getting the zero vector as a result, that's linearly dependent: it contains redundancies. If there's no way to do this, if there's
no nonzero linear combination that gives you the zero vector, then we call the set linearly independent, and in that case we think of it as not containing any redundancies. In a sense, all the vectors really point in different directions from each other; each one gives us a new dimension, in a sense.

There are a bunch of notes I want to make on linear dependence and independence before we get to any examples. The first is how you check this: you set up a linear system straight from the definition. You write down the linear combination c₁v₁ + c₂v₂ + ... + cₖvₖ = 0, where the coefficients in the linear combination are the variables in the linear system. The set is linearly independent if that linear system has only one solution, the all-zeros solution c₁ = c₂ = ... = cₖ = 0. You solve the linear system, and if you only find the zero solution, then great, it's independent. On the other hand, if you find infinitely many solutions, then it's dependent: there is some nonzero linear combination that gives you the zero vector.

Another really important point is that we could have defined dependence and independence a slightly different way. We could have said that a set of vectors is linearly dependent if and only if at least one vector in the set is a linear combination of the other vectors in the set. The definition we gave above is a version of this one that doesn't single out any particular vector. The bridge between the two definitions is this: if you start with the definition above, then because at least one of the scalars is nonzero, you can always move that term over to the other side and solve for that
vector. Right: at least one of the c_j's is nonzero, so move that term to the other side, divide by c_j, and you'll find that v_j is a linear combination of the other vectors. So that's another equivalent way of defining linear dependence: you can write at least one of the vectors in the set as a linear combination of the other ones. And again, that highlights the idea that if the set is linearly dependent, then there's a redundancy: some vector in the set doesn't really give us a new direction, because it's just a combination of the other vectors in there. There's sort of nothing new about it.

There's one final note I want to make before we do some examples: if you have a set containing just two vectors, it's really easy to determine whether the set is dependent or independent. It's dependent if and only if the vectors are multiples of each other, and it's independent otherwise, that is, if they're not multiples of each other. The reason is that, again, you're just asking whether one vector in the set is a linear combination of the other vectors in the set. When there are only two vectors, you're asking: is this vector a linear combination of the other one? In other words, is it a scalar multiple of the other one? A linear combination of just a single vector is nothing but a scalar multiple of it.

Okay, so let's do a couple of examples to see how we determine whether a set is dependent or independent. Let's start off with a set of just two vectors; in this case the vectors are polynomials of degree two. We want to know: is this set dependent or independent? Well, we use the test I just mentioned: are these two vectors multiples of each other? No, they're not, so we know right away that set is linearly
independent. In other words, there are no redundancies here; these vectors really point in different directions. Maybe we don't have great geometric intuition for this space, but that's the idea: they're not just multiples of each other. So sets of two vectors are easy.

Now let's go up to a set with three vectors; in this case the vectors happen to be matrices. Is this set linearly dependent or independent? This time we don't have a quick and easy check. We've actually got to set up a linear system, and it's exactly the linear system from the definition of dependence and independence: c₁ times one vector from the set, plus c₂ times another vector from the set, plus c₃ times the last vector from the set, set equal to zero. The question is: is the only solution to this linear system the all-zeros solution? We set up linear systems based on matrices back when we were talking about spans, and the idea here is very much the same: each entry of the matrices gives you one linear equation. Looking at the top-left entries, I get 3c₁ + 2c₂ + 0c₃ = 0, so that's my first equation. From the bottom-left entries I get 2c₁ + 0c₂ + c₃ = 0, and similarly you get two more equations from the top-right and bottom-right entries. So you get four equations in three unknowns, and you go through and solve that linear system using Gaussian elimination. You're going to find that for this particular linear system, there is only the zero solution: the only way to satisfy all these equations simultaneously is if c₁, c₂, and c₃ are all equal to zero. So that tells us that this set really is linearly independent: the only linear
combination of them that equals zero is the all-zeros linear combination, so it's a linearly independent set. In other words, all those matrices really point in different directions, even though, again, we don't quite have a great geometric sense of what the space of matrices looks like. But that's the idea.

Now let's do one more example, this time with the vector space of functions; remember, that's what that funky script F means. We've got the set {sin²(x), cos²(x), cos(2x)}, and we're asking again: is that a linearly dependent or independent set? We start off the exact same way we always do: c₁ sin²(x) + c₂ cos²(x) + c₃ cos(2x), set equal to zero, and we ask whether that implies c₁, c₂, and c₃ are all equal to zero. Is that the only solution? The way we're going to tackle this problem is a little different from the previous ones, because these three functions are all trig functions, and we remember all sorts of trig identities from previous courses. In particular, there's a trig identity relating cos(2x) to the two functions sin²(x) and cos²(x): cos(2x) = cos²(x) − sin²(x). And that's exactly the type of thing we want, because we can rearrange this equation: move everything over to the left-hand side and we get sin²(x) − cos²(x) + cos(2x) = 0, the zero function. Once we've moved everything to the left-hand side, this is exactly an equation of the type above, with particular scalars chosen: c₁ = 1, c₂ = −1, c₃ = 1. So that tells us that no, this set is not linearly independent, because c₁, c₂, and c₃ don't all have to equal 0. Right: they
could equal 1, −1, and 1; they could also equal 2, −2, and 2; they could equal any multiple of these three scalars. But we found at least one nonzero solution that works, and that's the important point. So this set is linearly dependent. In other words, there are redundancies in it: one of these functions, one of these vectors, can be built out of the other ones using just our vector space operations, scalar multiplication and vector addition. In particular, the trig identity tells you exactly how: take cos²(x), subtract sin²(x), and you get cos(2x). So there's a redundancy there; there's nothing new when you introduce that third function. That's why it's linearly dependent.

Okay, so that basically does it for this week. Next week we're going to combine these ideas and start talking about bases of arbitrary vector spaces, and we'll see what made this final example a little different in flavor from the previous ones. The basic idea is that this vector space F doesn't have a nice basis for us to work with, so it's a little trickier to turn the linear independence equation into a system of linear equations. This vector space is somehow more exotic than the other ones we've been looking at, like polynomials, matrices, and Rⁿ. So I will see you then.
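The two-vector shortcut from the lecture (dependent if and only if the vectors are scalar multiples of each other) is easy to turn into code. Here's a minimal sketch in Python; the polynomials are encoded by their coefficient tuples, and these particular polynomials are made up for illustration, since the lecture doesn't show its exact example:

```python
def scalar_multiples(u, v, tol=1e-12):
    """Two-vector dependence test: True iff u and v are scalar multiples."""
    # A set containing the zero vector is always dependent (0 = 0 * v),
    # so treat the zero vector as a multiple of anything.
    if all(abs(x) <= tol for x in u) or all(abs(x) <= tol for x in v):
        return True
    # Read off the scale factor from the first nonzero entry of v,
    # then verify it works for every entry.
    i = next(j for j, x in enumerate(v) if abs(x) > tol)
    if abs(u[i]) <= tol:
        return False
    c = u[i] / v[i]
    return all(abs(a - c * b) <= tol for a, b in zip(u, v))

# Degree-2 polynomials a + b*x + c*x^2 encoded as coefficient tuples (a, b, c).
p = (1, 2, 3)    # 1 + 2x + 3x^2
q = (2, 4, 6)    # 2 + 4x + 6x^2 = 2 * p  -> {p, q} is dependent
r = (1, 0, 1)    # 1 + x^2                -> {p, r} is independent

print(scalar_multiples(p, q))  # True
print(scalar_multiples(p, r))  # False
```

Encoding a polynomial by its coefficients like this works because, as with the matrix example, checking dependence only ever uses the vector space operations.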
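The three-matrix example boils down to the Gaussian elimination test described above: flatten each matrix into a vector, row-reduce, and the set is independent exactly when the homogeneous system c₁v₁ + c₂v₂ + c₃v₃ = 0 has only the zero solution, i.e. when the rank equals the number of vectors. The lecture only reads off a few entries of its matrices, so the 2×2 matrices below are hypothetical stand-ins; the procedure itself is the one from the lecture:

```python
def rank(rows):
    """Row-reduce a list of rows (lists of numbers) and count the pivots."""
    rows = [list(map(float, r)) for r in rows]
    m, n = len(rows), len(rows[0]) if rows else 0
    r = 0  # index of the next pivot row
    for col in range(n):
        # Find a row at or below r with a nonzero entry in this column.
        pivot = next((i for i in range(r, m) if abs(rows[i][col]) > 1e-12), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        # Eliminate this column from every other row.
        for i in range(m):
            if i != r and abs(rows[i][col]) > 1e-12:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def linearly_independent(vectors):
    """Only the all-zeros combination gives 0 iff rank equals the count."""
    return rank(vectors) == len(vectors)

# Hypothetical 2x2 matrices, flattened row by row into length-4 vectors.
A = [3, 1, 2, 0]   # [[3, 1], [2, 0]]
B = [2, 0, 0, 1]   # [[2, 0], [0, 1]]
C = [0, 1, 1, 2]   # [[0, 1], [1, 2]]

print(linearly_independent([A, B, C]))              # True: rank 3
D = [a + b for a, b in zip(A, B)]                   # D = A + B, a redundancy
print(linearly_independent([A, B, D]))              # False: rank 2
```

Flattening is harmless here because matrix addition and scalar multiplication act entry by entry, so the linear system is the same either way, just as in the lecture where each entry contributed one equation.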
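Finally, the function-space example found the nonzero coefficients (c₁, c₂, c₃) = (1, −1, 1) via the identity cos(2x) = cos²(x) − sin²(x). A quick numerical spot-check of that dependence relation (evidence at sample points, not a proof; the proof is the trig identity itself):

```python
import math

# The dependence relation from the lecture:
# 1*sin^2(x) + (-1)*cos^2(x) + 1*cos(2x) = 0 for every x.
c1, c2, c3 = 1.0, -1.0, 1.0

def combo(x):
    return c1 * math.sin(x) ** 2 + c2 * math.cos(x) ** 2 + c3 * math.cos(2 * x)

# Arbitrary sample points; the combination should vanish at all of them.
samples = [0.0, 0.5, 1.0, math.pi / 3, 2.7, -4.2]
print(all(abs(combo(x)) < 1e-12 for x in samples))  # True
```

This is also why this vector space is trickier, as noted above: with no obvious coordinates for functions, we can't mechanically convert the independence equation into finitely many linear equations the way we did for matrices, so we lean on identities (or sampling) instead.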