Transcript for:
Jan 5 Pt 1 - Understanding Joint and Marginal Probability

Right, so this joint pmf tells you all the information about these random variables, and we saw that the marginal pmf can be found from it — we had the example with the different colors and the different numbers one to six. For the marginal pmf of X, you just sum over all possible small y: you fix your small x and you vary y over all possible values, because you want to see what is the chance that the random variable X takes the value small x. So x is fixed, we vary y, and we look at the joint pmf at (x, y): p_X(x) = Σ_y p_{X,Y}(x, y). And we can do the same thing for Y: if you want to see what is the chance that Y takes the value small y, this time you fix y and sum over small x, and that gives you the chance that Y is going to be exactly small y: p_Y(y) = Σ_x p_{X,Y}(x, y).

The same thing works with continuous random variables — it's just more complex to write. Everywhere you have a sum you have an integral, and the joint pmf is replaced by a density, because with a continuous random variable — it could be temperature, it could be a stock price, or any other example — the chance of being at any single point is zero, so we represent it by a density. So what does the joint density of two random variables mean? It means that if I want to see, for example, what is the chance that (X, Y) is inside some set A — say this axis is y, this is x, and A is some region in the plane — then to find the chance you are in there, you just look at the integral of the density over that area:

P((X, Y) ∈ A) = ∬_A f_{X,Y}(x, y) dx dy.

We don't need to actually calculate the integral here, but that's how it's done — this was the last example we did yesterday, so we're not going to repeat it, but it's good to understand conceptually what it means. For continuous random variables you often integrate numerically, or sometimes you cannot solve it in closed form, but the density tells you exactly this: if I want the chance that these two random variables land somewhere in the plane, I just integrate the density over x and y inside that region — if A is a circle, you integrate over the circle. That's what we mean by a joint density. And similarly you can find the marginals from the joint density: fix x and integrate the joint density over all y, and that gives you the marginal for X; integrate over all x, and that gives you the marginal for Y. Okay, any question about that?

Then we can define the expected value, because we want to define covariance — we want to see what the covariance of two random variables means. For one random variable, you have seen that the variance is Var(X) = E[X²] − (E[X])². And when we said "expected X squared" — how do you calculate it for a continuous random variable? First, E[X] = ∫ x f(x) dx over the whole line; the density could be zero in some places, but you integrate over everything, with x in the integrand. For E[X²] you put x² in the integrand instead. If you want the expected value of the constant 1, you put 1 there, and since the integral of the density from negative infinity to infinity is one, you get 1 — the expected value of a constant is itself. If you want E[e^X], you put e^x there. Whatever function of the random variable you want the expectation of, you put that function in: E[g(X)] = ∫ g(x) f(x) dx. For the variance we just need x².

And it's the same thing with two random variables. Say you want to find E[XY], which we need for the covariance. This time you have a double integral, over x and over y, and you have the joint density: E[XY] = ∬ x y f_{X,Y}(x, y) dx dy. You vary both variables, put xy in the integrand, and that gives you E[XY]. Does that make sense?
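To make the summing-out concrete: the colors-and-numbers table from class isn't reproduced in the transcript, so this is a minimal sketch with a made-up joint pmf (exact fractions avoid floating-point noise).

```python
from fractions import Fraction as F

# Made-up joint pmf (not the colors/numbers table from class):
# X in {0, 1}, Y in {0, 1, 2}; entries sum to 1.
joint = {
    (0, 0): F(1, 10), (0, 1): F(1, 5), (0, 2): F(1, 10),
    (1, 0): F(3, 20), (1, 1): F(1, 4), (1, 2): F(1, 5),
}

def marginal_x(joint, x):
    # p_X(x) = sum over all y of p_{X,Y}(x, y): fix x, vary y
    return sum(p for (xi, yi), p in joint.items() if xi == x)

def marginal_y(joint, y):
    # p_Y(y) = sum over all x of p_{X,Y}(x, y): fix y, vary x
    return sum(p for (xi, yi), p in joint.items() if yi == y)

print(marginal_x(joint, 0))  # 1/10 + 1/5 + 1/10 = 2/5
print(marginal_y(joint, 1))  # 1/5 + 1/4 = 9/20
```

The continuous case is the same recipe with the sum replaced by an integral over the other variable.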
Okay, so for any function of random variables, if you want to find its expectation, it's good to know the definition; afterwards we'll work with well-known random variables where we already know the variances and covariances. To define covariance we need to know what E[XY] means, for example. Here we talk about two random variables, but it could be more than two — maybe you have n random variables X1, X2, …, Xn — and you want the expected value of some function of them, for example the product of all of them. How do you calculate it? If they are discrete, you sum over all possible values and multiply the function g by the joint pmf; if they are continuous, you put the function times the joint density and integrate over all the variables:

E[g(X1, …, Xn)] = ∫ ⋯ ∫ g(x1, …, xn) f(x1, …, xn) dx1 ⋯ dxn.

We won't go beyond two, but it generalizes: if Y is a function of all these random variables, you put that function, multiply by their joint PDF, and integrate over all of them. Does that make sense? It looks complex, but the idea is just this.

So say you have X, Y, Z, three random variables, with their joint density, and you want E[X²Y]. How do you calculate it? A triple integral over all x, all y, all z of x²y times the density, dx dy dz. If you integrate just the density by itself, that's one; if you want this expectation, you put x²y — the function of the variables — inside and integrate. And if instead the random variables are discrete, you have sums instead of integrals — three sums, over all possible values of x, y, z — of x²y times the joint pmf. So we can find the expectation of any function of random variables this way; it's just the definition.

[Student: Would it be possible to find an example problem in the book and actually work it?] Yes — here it's just the definition, and the example will come when we define the covariance of two random variables. We want covariance because, similar to variance, it tells you something about two random variables that could represent two different things — for example the stock price of one asset and the stock price of another — and you want to see how these two move with respect to each other: if one goes up, what happens to the other one?

The variance, as we said, is Var(X) = E[X²] − (E[X])². The covariance — anyone? — is

Cov(X, Y) = E[(X − E[X])(Y − E[Y])].

There are two ways to write it. One way is this one: you take the difference between X and its average E[X], or μ. If you square it, (X − E[X])², and take the expectation, that's the variance of X. But if instead you multiply by (Y − E[Y]), that's the covariance of X and Y: you see how far X is from its average, how far Y is from its average, you multiply these two, and you take the expectation of that.

[Student: Is that the same as E[XY]?] No, not quite — but if you multiply these out and simplify the terms, you can see it's the same as

Cov(X, Y) = E[XY] − E[X] E[Y].

Now, when X equals Y, what happens — what is the covariance of X with itself? It's exactly E[X²] − (E[X])², the variance of X. The covariance of a random variable with itself is just its variance; but usually the two random variables are different.

And when X and Y are independent, what happens — what is Cov(X, Y) in that case? Anyone? Zero. When two random variables are independent, their joint pmf or PDF factors, and you can show the expectation also factors, so E[XY] = E[X] E[Y], and the covariance is zero. That means there is no covariance — no correlation, which we'll define later — between the two: even if you know X is going up, we don't know what happens to Y; Y is still independent of X.

So, to repeat: two random variables are independent if their joint PDF or joint pmf factors — the joint density is the PDF of X multiplied by the PDF of Y, and likewise the joint pmf factors. That's the definition of independence, and it's much more than just the expectation factoring: the expectation is just the average, but here everything factors — all the densities, all the pmfs. The only difference between the two cases is the same as before: every time we write a PDF it's continuous and a pmf is discrete. Everything we say works for both, but every time it's continuous you have an integral and a density, and every time it's discrete you have a pmf and a sum.
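The two facts above — Cov(X, X) = Var(X), and zero covariance when the joint pmf factors — can be checked directly from the definition. The joint pmfs below are made-up toy tables, not examples from class.

```python
from fractions import Fraction as F

def ev(joint, g):
    # E[g(X, Y)] = sum over (x, y) of g(x, y) * p_{X,Y}(x, y)
    return sum(g(x, y) * p for (x, y), p in joint.items())

def cov(joint):
    # Cov(X, Y) = E[XY] - E[X] E[Y]
    exy = ev(joint, lambda x, y: x * y)
    ex = ev(joint, lambda x, y: x)
    ey = ev(joint, lambda x, y: y)
    return exy - ex * ey

# Independent case: the joint pmf factors, p(x, y) = p_X(x) * p_Y(y).
px = {0: F(1, 2), 1: F(1, 2)}
py = {0: F(1, 3), 1: F(2, 3)}
indep = {(x, y): px[x] * py[y] for x in px for y in py}
print(cov(indep))   # 0

# Fully dependent case: Y always equals X, so Cov(X, Y) = Var(X).
same = {(0, 0): F(1, 2), (1, 1): F(1, 2)}
print(cov(same))    # 1/4, the variance of a fair 0/1 variable
```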
Okay, so independence of random variables means that the probabilities — the density or the pmf — factor. If this is true, then what is E[XY]? We said that in general, to calculate E[XY], you write a double integral: you put xy in the integrand along with the joint PDF — one integral is for x, the other for y. When they are independent, the joint PDF splits into a piece for x and a piece for y, so you can write it as two separate integrals: one is ∫ x f_X(x) dx — and what is that one? E[X] — and the other is ∫ y f_Y(y) dy, which is E[Y]. So E[XY] = E[X] E[Y].

And it's the same for other functions: if you want, say, E[X²Y³], you put x²y³ in the double integral with the joint PDF — that's the definition, for any function of X and Y you put that function times the joint density and take the integral. But when the two are independent, this factors, so it becomes two separate integrals that you can calculate separately: E[X²Y³] = E[X²] E[Y³]. So when you have two independent random variables, you don't need to do the double integral — you take one expectation for X, one for Y, and multiply them.

Okay, let's go back to the example of rolling a die. When you roll a die, what was X? X is 1, 2, 3, 4, 5, 6, each with probability 1/6. How do you calculate E[X]? It's a sum — you can write it term by term, or you can say E[X] = Σ_{k=1}^{6} k · (1/6): you're going to have the value k with probability 1/6. That's exactly the probability each time, and it's the same idea as the integral: each time, the sum for a discrete variable becomes an integral when it's continuous — we're not going to calculate the integrals, but the integral is exactly the sum when it's continuous.

So if you want E[X], you do it this way, and if you want E[X²], you say: I'm going to have k² with probability 1/6. The probability doesn't get squared — it doesn't change, because you still have a 1/6 chance of getting 1², a 1/6 chance of getting 2², a 1/6 chance of getting 3², and so on. The probabilities are fixed, but the values change: for X² you have 1², 2², up to 6²; for X³ you have 1³, 2³, up to 6³. It's the values that change, while the weights — the probabilities, or in the continuous case the density — stay the same. Okay, any other questions?

So it's good to understand the definition of covariance, and then we'll do more examples, because as a definition it's a bit abstract. Think of the integral as the same thing as the sum — the integral is just for continuous random variables. Each time you have a function of two random variables, you put their joint density, multiply by that function, and integrate; and if the two random variables are independent of each other, everything factors, and you just have one expectation for X and another for Y.

Okay, so when two random variables are independent, E[XY] = E[X] E[Y]. But the reverse is not true: just having E[XY] = E[X] E[Y] is not enough for the two random variables to be independent. Independence is much more than that — independence is all the probabilities factoring, not just the average. So when two random variables are independent this is true, but if this is true, it doesn't mean they are independent.
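The die computation above, written out with exact fractions — the 1/6 weights stay fixed while only the values get squared:

```python
from fractions import Fraction as F

faces = range(1, 7)      # a fair die: values 1..6
p = F(1, 6)              # the same weight for every face

ex  = sum(k * p for k in faces)       # E[X]   = (1+2+...+6)/6
ex2 = sum(k**2 * p for k in faces)    # E[X^2]: same weights, squared values
var = ex2 - ex**2                     # Var(X) = E[X^2] - (E[X])^2

print(ex, ex2, var)   # 7/2 91/6 35/12
```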
It could be just that the averages factor, but not every probability.

Do we have some properties of expectation already? Yes — expectation is linear. Say you have a random variable X with E[X] = 10: you play a game and on average you earn $10. Now you double your investment in this game and add $3 — what is E[2X + 3]? We're never going to need the double-integral definition for this: it's just 2 · E[X] + 3 = 2 · 10 + 3 = 23. For any linear function aX + b, E[aX + b] = a E[X] + b.

Now say you have another random variable Y: you invest in two different stocks, one with expected value 10, the other with expected value 5, and you buy two shares of X and three shares of Y. What is the expected value you get? Expectation is linear, so E[2X + 3Y] = 2 E[X] + 3 E[Y]. Does that make sense? If X gives you $10 on average and Y gives you $5 on average, and you hold two shares of X and three of Y, you get 2 · 10 + 3 · 5 = 20 + 15 = 35.

[Student: Does this need independence?] No — here you don't need anything, and that's the good thing to remember about expectations. X and Y can be correlated with each other. Take, for example, the maximum correlation: say X = 2Y, so whatever happens to Y happens to X — correlation equal to one. The formula is still true: when X = 2Y, what is E[2X + 3Y]? Replace X by 2Y: you get E[4Y + 3Y] = E[7Y] = 7 E[Y] = 7 · 5 = 35. Still 35. So you don't need independence — even when X and Y are completely correlated with each other, you get the same expected value. The thing about averages is that you don't need independence, and you can easily look at the expectation of a sum of different random variables. We'll do more examples with sums — it doesn't need to be two random variables, you could have many of them, and the expectation of the sum is the sum of the expectations.

Okay, let's do this example of a matching problem. Say we have n people — how many of you are in this room? Say 35 — and there are 35 gifts which are assigned at random, and X is the number of people who receive their own gift. So each of you has a personal object — a gift — and we redistribute them at random, and we want X, the number of people who get back their own gift. The question is: what is E[X] — on average, how many people will receive their own gift? Say each of you has a cell phone; we randomly redistribute the cell phones and we want the average number of people in this room who get back their own cell phone. What is your guess?

If there is one person, what's the average number of people getting their own? One — 100% of the time he gets his own, so the average is one. How about two people, person one and person two? What is the chance that person one gets his own cell phone? One half. That person two gets his own? One half. And what is the average? One: a one-half chance for person one plus a one-half chance for person two. But why is it one — why can we just add these two numbers? That's a good question. Let's see — that's why we have to model it and write it down correctly.
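Before modeling the matching problem, a quick check of the linearity claim from above, including the fully correlated case X = 2Y. The two-point distribution for Y (4 or 6 with equal chance) is made up just to give E[Y] = 5 and E[X] = 10 as in the lecture:

```python
from fractions import Fraction as F

# Fully correlated pair: X is forced to equal 2Y, so the joint
# distribution puts mass only on pairs of the form (2y, y).
joint = {(8, 4): F(1, 2), (12, 6): F(1, 2)}   # (x, y): probability

ex = sum(x * p for (x, y), p in joint.items())            # E[X] = 10
ey = sum(y * p for (x, y), p in joint.items())            # E[Y] = 5
e_2x_3y = sum((2*x + 3*y) * p for (x, y), p in joint.items())

print(e_2x_3y)        # 35
print(2*ex + 3*ey)    # 35: linearity holds with no independence at all
```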
So how can we model this problem? This kind of situation comes up in many settings: for any person there are two possibilities — either he gets his own cell phone or he doesn't — so we can write it with what's called an indicator function. Define X1 for person one: X1 is 1 if person one gets his own gift. In general, for person i,

Xi = 1 if person i gets his own gift, and 0 otherwise.

Either he gets his own gift or not, so Xi is zero or one: one if he gets it, zero if he doesn't. X1 is the indicator for person one, X2 is the indicator for person two. Then what is the total number of people who get their own gift? If there are two people in this room, it's X1 + X2: X1 is 0 or 1, X2 is 0 or 1, and X = X1 + X2.

Now, X1 and X2 are not independent. With two people, if person one takes his own gift, person two is going to get his own too. It's not like the binomial, which we will see later, where the trials are independent — here they are not, and because they're not independent we can't model X that way. But we don't need to model the correlation between them, because the question only asks for E[X] — on average, how many people get their own gift — and expectation is linear and doesn't need independence:

E[X] = E[X1] + E[X2].

So what is E[X1]? X1 is a random variable that takes the value 1 with probability one half — with two people, there's a one-half chance you get your own cell phone — and 0 with probability one half. So E[X1] = 0 · (1/2) + 1 · (1/2) = 1/2, and the same for X2. In fact X1 and X2 have the same distribution — they can be correlated with each other, but individually they are identically distributed. So E[X] = 1/2 + 1/2 = 1: on average, one person gets his own gift.

How about 100 people? Everything is the same, except this time E[X] = E[X1] + ⋯ + E[X100]. Does that make sense? And what is E[Xi] now? If there are 100 people — 100 different cell phones — what is the chance you get your own? One over n, which here is 1/100: with 100 different cell phones, the chance you get your own is one divided by 100. So E[X] = 1/100 + 1/100 + ⋯ + 1/100, a hundred times, which is again one — each person contributes 1%, and there are 100 of them.

So again one: it doesn't matter how many people you are, the average number who get their own cell phone is always one. And this is not saying "at least one person" — it is exactly the average: the average number who get their own cell phone is exactly one. This works because of the property of expectation: even though the indicators are correlated, you can write X as a sum and take the expectation term by term, which is very practical.
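The matching problem can also be checked by simulation — shuffle the gifts and count the fixed points. This Monte Carlo sketch (not something done in class) should hover near 1 for any number of people n:

```python
import random

def avg_matches(n, trials=100_000, seed=0):
    # X = number of fixed points of a random permutation:
    # person i gets their own gift exactly when perm[i] == i,
    # which is the indicator X_i from the lecture.
    rng = random.Random(seed)
    total = 0
    people = list(range(n))
    for _ in range(trials):
        perm = people[:]
        rng.shuffle(perm)
        total += sum(perm[i] == i for i in range(n))
    return total / trials

# E[X] = n * (1/n) = 1, independent of n.
print(avg_matches(2))     # close to 1
print(avg_matches(100))   # still close to 1
```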
On average, it's just the expectation of the first one, plus the expectation of the second, up to the expectation of the last — and this property of expectation is very practical: you don't need to know their correlations, you just look at the expectation of each of them and sum up. So if you have n people, the chance that you get your own cell phone — your own gift — is 1/n, and when you sum up n of them it's again one. Yes, question?

[Student: I'm still lost on how you got E[Xi] = 1/n.] So how many cell phones are in the room? There are n in total. What is the chance that you get your own cell phone? Because it's random — I hand you one cell phone at random out of n in total — the chance that you get your own is one out of n.

[Student: To differentiate where the n came from — people versus gifts: say there were additional gifts in the mix that didn't belong to anybody. How does that change the formula?] Good question. If there are other gifts in the room, the probabilities are still the same for each person, but they're not 1/n anymore — maybe instead of 1/100 it's 1/200, say, if there are another 100 cell phones that don't belong to anyone. Then for each person the chance is 1/200, and when you sum up, instead of one you get 1/200 + 1/200 + ⋯, a hundred times, which is 100/200 = 1/2. So it still works even when the probabilities are different: you just add up those numbers again.

[Student: So in this particular case, E[X] is the number of people divided by the number of gifts?] Yes, exactly — the number of people divided by the number of gifts.

And remember, here the Xi weren't independent of each other. It's nice that we could use the linearity of expectation, because we don't need to write the whole distribution of X. For example, with two people, what is the distribution of X — what are the possible values? It could be 0 — no one gets his own gift; it could be 1 — one of the two gets his own; or it could be 2 — both get their own. But is it possible to have exactly one person getting his own gift? No: if one gets his own, the other one does too, so the value 1 never happens — its probability is zero. There are only two possibilities, 0 or 2. What is the chance of 0? One half. The chance of 2? One half. And the average is 0 · (1/2) + 2 · (1/2) = 1. So you could write out the distribution and compute the mean directly, but it becomes harder when there are n people — writing the distribution of X in general is hard, because the indicators are not independent of each other. For this question we didn't need it: we were asked only about the expectation, so we just wrote a sum of expectations, knowing each person gets his own gift with probability 1/n. So we can use the linearity of expectation to find E[X]. Any questions?

Next: as we said, the covariance of two random variables is defined as Cov(X, Y) = E[XY] − E[X] E[Y]. Let's try to find, here, E[X1] and E[X1 X2] — let's see if you understand. E[X1] is
going to be 1/2 — and what do you think Cov(X1, X2) is going to be? The covariance, as we said, can be calculated this way: for two random variables X and Y, Cov(X, Y) = E[XY] − E[X] E[Y]. Here X and Y are X1 and X2, so Cov(X1, X2) = E[X1 X2] − E[X1] E[X2]. With two people, one random variable is for person one — whether he's getting his own gift or not — and the second is for the second person.

Okay, let's do the easy part first — you already know the answer. What is E[X1]? Each indicator is 1 with probability 1/2 and 0 with probability 1/2 — there is a one-half chance you get your own gift and a one-half chance you don't — so E[X1] = E[X2] = 1/2, and E[X1] E[X2] = (1/2) · (1/2) = 1/4.

What is E[X1 X2]? That one is harder, but we can write it with the definition — we need the joint pmf of (X1, X2). Let's look at X1 and X2 together: what are the possibilities for the pair? It could be (0, 0); (1, 0) — you can write it, but it never happens; (0, 1), which likewise never happens; and (1, 1). So the joint pmf at (1, 0) is zero, the joint pmf at (0, 1) is zero, and at (0, 0) it is 1/2.
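The transcript cuts off here; this is a short sketch finishing the calculation it sets up, using the joint pmf of the two indicators just written down for the two-person case:

```python
from fractions import Fraction as F

# Joint pmf of (X1, X2) when n = 2: either both people get their own
# gift or neither does, so (1, 0) and (0, 1) have probability zero.
joint = {(0, 0): F(1, 2), (1, 1): F(1, 2), (1, 0): 0, (0, 1): 0}

e_x1   = sum(x1 * p for (x1, x2), p in joint.items())        # 1/2
e_x2   = sum(x2 * p for (x1, x2), p in joint.items())        # 1/2
e_x1x2 = sum(x1 * x2 * p for (x1, x2), p in joint.items())   # 1/2

print(e_x1x2 - e_x1 * e_x2)   # Cov(X1, X2) = 1/2 - 1/4 = 1/4
```

The positive covariance matches the lecture's point that the indicators are dependent: knowing X1 = 1 forces X2 = 1.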