All right, we're on the last two chapters of the course, the last two lectures. Monday's lecture is models of decision-making, and then on Wednesday we'll talk about evaluating decisions. So let's get right into it.

We've spent the vast majority of this quarter learning how to evaluate various types of scientific episodes, and where we've gotten so far is that, in a wide variety of situations, you should now be able to conclude whether or not there is support for some scientific hypothesis, be it a causal hypothesis, a statistical hypothesis, and so on. If you conclude there is strong evidence for some hypothesis, that means there's new information, something you should believe. The question is: how do you then apply that information?

For example, say you conclude there's strong evidence that saccharin is a causal factor in bladder cancer, but not a particularly effective one. You certainly aren't guaranteed bladder cancer if you put a packet of Sweet'N Low in your coffee. So how should you weigh this information against other information? Maybe there are other benefits to saccharin: if you're trying to keep a low-sugar diet and it helps you do that, there could be benefits to using it, even though there is a causal relationship between the Sweet'N Low and the cancer, just not a very effective one. And everyone's values are different. Some people maybe don't care if they get cancer. Particularly with smoking, some people love smoking so much that they will forgo years at the end of their life if it means they get to smoke for the years they have. That's your own personal value, and so long as you aren't hurting anybody else, there's nothing wrong with that. The point is that the scientific information on its own makes no prescription about how you should behave. So what we're going to learn today is, given certain scientific information, a framework for deciding how to act. There is a rigorous family of models of decision-making out there called decision theory, closely related to what you may have heard of as game theory.

In our model of decision-making we'll be using a game as the model, because many of the features of a game are similar to everyday life decisions. The first part of a decision, and the first part of our decision-making model, is a variety of options. If you don't have more than one option, you don't have a decision to make: a decision is the act of choosing from a number of options, at least two. So, back to our old friend the jar of marbles. Suppose the jar contains red, blue, and green marbles, and suppose we're playing a game: a marble will be selected at random from the jar and you're going to bet on which color. Here the options are bet on red, bet on green, bet on blue; you have three options. Again, if there's not more than one option, it's not a decision. To keep the model rigorous, and to make sure these questions have definite answers, we're going to require that only one option can be chosen and that the options are mutually exclusive and exhaustive. That may not always be the case in real life, but we want to start with something that's simple and mathematically rigorous; if we need to expand it or adapt it to real life, we can do that later.
We've used "exclusive" and "exhaustive" numerous times before in this course, but just as a reminder, in this context to say the options are exclusive means you bet that the marble is red, or that it's green, or that it's blue; you can't have both a bet that the marble is either red or green and a bet that it's red, because the bet on red is included in the bet on red-or-green. Now, depending on how the game is set up, there might conceivably be a bet that the marble is either red or green. If any of you have ever bet on horse races, that's one of the bets you can make: you can bet on a horse to place or to show. If you bet to place, the horse has to come in either first or second; if you bet to show, either first, second, or third. Those are conceivable bets. What you certainly can't do is have both a bet on "either red or green" and a bet on red, because, again, those aren't exclusive: red is included in both bets. That's what exclusive means. Exhaustive means that the options cover all the possible outcomes: no color other than red, green, or blue is going to be drawn.

The second part of the model is the different states of the world. The possible states of the world in our game are red being drawn, blue being drawn, or green being drawn. Be careful to make sure that the states of the world and the options are completely separate. For example, winning and losing: even though those are possible outcomes, they are not states of the world, because winning depends on what your bet was, that is, on which option you chose. We want the options and the states of the world to be completely separate. States of the world must also be exclusive and exhaustive, and again, we've covered what exclusive and exhaustive mean.

OK, the third part of the model: outcomes. Take our options and the possible states of the world; if you put one on the x-axis and one on the y-axis, you create a little matrix that specifies every possible outcome in our game. This is figure 9.1 in the text. All the outcomes in our matrix are option-state pairs, so obviously the number of outcomes will be the product of the number of options and the number of states of the world. Here we have three and three, so nine total. Three of these outcomes correspond to winning and six correspond to losing: winning is when you bet on red and red is drawn, bet on green and green is drawn, or bet on blue and blue is drawn; the other six are losers.

Those are the three parts of the model that we need to create our matrix, but there's one important part missing, which is the values of the various outcomes. If I bet on red and red is drawn, I win, but big deal, unless I'm winning something of value, or potentially losing something of value. So we need to somehow assign values to the various outcomes. Once we've assigned these values, decision theory is a model for deciding how to maximize the value of the outcome; that's how we'll make decisions. There are a number of ways of assigning values. The simplest way is to just rank the outcomes, what we would call ordinally, in some order from most preferred to least preferred. That is certainly an adequate basis for choosing an option.
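If it helps to see the machinery so far as code, here is a minimal Python sketch of the outcome matrix just described: three betting options crossed with three states of the world give the nine outcomes of figure 9.1, three of which are wins. The option and state names are just labels for the marble game; no payoffs are attached yet.

```python
# Sketch of the decision matrix for the marble game:
# options x states = 9 outcomes, 3 of which are wins.
options = ["bet red", "bet green", "bet blue"]
states = ["red drawn", "green drawn", "blue drawn"]

# Build every option-state pair and note which ones count as a win:
# you win exactly when the color you bet on is the color drawn.
outcomes = {
    (option, state): option.split()[1] == state.split()[0]
    for option in options
    for state in states
}

for (option, state), win in outcomes.items():
    print(f"{option:10s} | {state:11s} -> {'win' if win else 'lose'}")

# 3 options x 3 states = 9 outcomes, 3 of them winning
assert len(outcomes) == 9 and sum(outcomes.values()) == 3
```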
A ranking alone, though, is not the most informative way to assign values. The important thing it doesn't do is tell us how much one outcome is preferred to another; we can't assume, for example, that our second preference is twice as good as our fourth without also assigning what we call a measure to these values. A standard measure, certainly the one used in betting games, is money: we know that an outcome that nets you $10 is twice as good as one that nets you $5, and that's helpful for certain calculations we may have to make later. You could also just assign numerical values to outcomes and call those utilities; an outcome with a utility of 100 would be twice as good as an outcome with a utility of 50. Later we'll find that you can do the same job by assigning numbers between zero and one.

A decision strategy also depends on a lot of outside knowledge that helps inform you about which state of the world is more likely to actually obtain, and this is the part where we apply our scientific knowledge to decision-making. Again, we choose the option that we think is going to line up with the state of the world and give us the best outcome. Scientific knowledge gives us some clues as to what the actual state of the world is, and that can help us make the choice that gives us the best outcome. Of course it depends on the decision problem; how much we know can vary widely. We might know nothing whatsoever about what's going to obtain; that's certainly the case with the jar of marbles, which is purely a game of chance, so no scientific knowledge is going to tell us which marble will be drawn. But there are other situations where we can know exactly what the state of the world is, in which case the correct option is typically much easier to choose. So there are going to be different strategies for making decisions depending on how much you know and what kind of knowledge you have, and we're going to go through those now. There are three basic types.

The first we'll talk about is called decision-making with certainty. Now, this course so far has never allowed you to claim anything with absolute certainty; all of our evaluation strategies, at best, said, "well, there's strong evidence for this hypothesis." We never said, "this hypothesis is true, for sure." That's just the nature of science: there's always the possibility that a hypothesis could be falsified. But for certain everyday decisions we can treat some hypotheses as effectively certain. I don't think the theory of gravity is going to be overturned anytime soon, so when I'm making everyday choices I'm going to assume that the acceleration of gravity is 9.81 meters per second squared; I don't think that's going to get me into any trouble. So we can treat some decision problems as cases of decision-making with certainty. What does that mean? It means we know exactly what state of the world obtains. We could conceive of a marble situation where that's the case: if we knew all of the marbles in the jar were red, then we would know for sure that the state of the world is the one where red is drawn. Decision-making with certainty is not gambling; it would make for a very boring game of chance.
Instead, the whole trick to decision-making with certainty is correctly calculating the values of your options relative to the state of the world, which is known. So let's look at our matrix here; my head is blocking part of the slide, so pull out your book and look at figure 9.2. Let's say we know the state of the world is that red will be drawn; we're looking at the first column, the world where red is drawn. We have three options: betting on red makes us $10, betting on green we lose $10, and betting on blue we lose $10. Pretty obvious here: betting on red produces the outcome with the highest value. This suggests a strategy for decision-making with certainty: presuming that you have certain knowledge of the actual state of nature, choose the option associated with the highest-valued outcome compatible with that known state. Again, the state of nature we know in this case is that red will be drawn, because the jar is full of red marbles; betting on red gives us the highest value, so that's the option to choose. Notice that in this case we don't even need the whole matrix; we didn't use the second and third columns at all. Now, this is a possible situation, but few real-life cases are like this, and you probably wouldn't need a class in decision-making to know to bet on red if you knew all the marbles in the jar were red. But we start slow and build up.

At the other end of the spectrum is something we call decision-making with complete uncertainty. That's exactly the situation in the jar of marbles: it's supposed to be a game of chance, with no information whatsoever about whether red, blue, or green will be drawn. In that case you're decision-making with complete uncertainty: no information whatsoever about the state of the world. Now, betting games of chance are about the only time when you'll have absolutely no information about the state of the world; in most everyday situations — should I buy this car, should I take this job, should I enroll in this graduate program — there will be information available about the various options. But, again, it's another possible situation, and it's useful to model it. If we had no idea which state of the world obtains, the only information we would have to guide the decision would be the matrix itself that we made. That is actually a fair amount of information; it's not information about the world, but it's nonetheless useful information for making a decision. For example, even when we have no knowledge about the state of the world, we can still make some informed choices just from our matrix. Look at this matrix, which is figure 9.3 in the text, and compare betting on green and betting on blue as options. If red is drawn, the outcomes are identical: you lose five bucks either way. If blue is drawn, identical outcomes again: you make five bucks whether you bet on green or on blue. However, if green is drawn, you win ten bucks if you bet on green and only five bucks if you bet on blue. So no matter what the state of the world, betting on green will give you at least as good an outcome as betting on blue, and there's a chance you could do even better if you bet on green.
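Here is a small Python sketch of that comparison, using the green and blue payoffs read off figure 9.3 in the lecture; the point is the "at least as good in every state, strictly better in some" check.

```python
# Payoffs for the two options being compared, keyed by state of the world
# (dollar amounts as read from figure 9.3).
bet_green = {"red drawn": -5, "green drawn": 10, "blue drawn": 5}
bet_blue  = {"red drawn": -5, "green drawn": 5,  "blue drawn": 5}

# Green is at least as good as blue in every state...
never_worse = all(bet_green[s] >= bet_blue[s] for s in bet_green)
# ...and strictly better in at least one state (when green is drawn).
sometimes_better = any(bet_green[s] > bet_blue[s] for s in bet_green)

print(never_worse and sometimes_better)  # True: you lose nothing by preferring green
```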
So it looks like betting on green is the better option. Let's define "better option" a little more rigorously. For any two options A1 and A2: if there is at least one state for which A1 has a higher-valued outcome than A2 — for us, betting on green gives you ten bucks if green is drawn and betting on blue only gives you five — and there is no state for which A2 has a higher-valued outcome — there's no state where betting on blue does any better than betting on green — then we say option A1 is the better option compared to A2. Note that this definition applies to purely ranked outcomes: even if we hadn't assigned a measure, hadn't assigned dollar values, and you had just ranked the outcomes by preference, you could still use it. And if A1 is better than A2, we can also say that A2 is worse. Those are our terms.

Now, looking at the same matrix from figure 9.3, notice that betting on red is neither better nor worse than betting on green. If you bet on red and red is drawn, you do better than betting on green; if green is drawn, you do better betting on green; and if blue is drawn, you also do better betting on green. So there are some states where red does better than green and some where green does better than red, which means neither is better nor worse than the other.

Given this definition of a better and a worse option, and since there's nothing ever to be gained by choosing a worse option, we can come up with a general rule for decision-making even under complete uncertainty, the eliminate worse options rule: if any options are worse than some others, eliminate the worse options from further consideration. So we can rule out blue; there's no advantage to betting on blue, so don't do it. And given the better/worse relation we've just defined, we can also define a best option: if one option in the decision problem is better than every other option, that option is the best option available for the problem. That suggests a strategy, the best option strategy: if a decision problem contains a best option, choose it. In other words, if there's an option compared to which no other option offers any chance of doing better, and with which you can in fact do worse, choose the one that is literally better than every other. Now, those situations may be rare, but you'd be surprised. Part of why I want to teach decision theory here is that we often make decisions without considering this stuff at all; we just go with our gut. Just taking a moment to think about how much you actually value the outcomes, and whether there's any advantage to choosing a given option at all — you'd be surprised how often you can rule out some useless options, or find yourself in a best-option scenario where the choice is easy. Note that this figure doesn't contain a best option: as on the previous slide, green is a better option than blue and blue is a worse option than green, but red and green are neither better nor worse than each other.
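Here is a sketch of the eliminate-worse-options rule and the best-option check on a figure 9.3-style matrix. The green and blue rows are the values read out in the lecture; red's row is only partially described (it wins $10 when red is drawn and at worst breaks even), so those particular numbers are a plausible fill-in, not the book's exact figure.

```python
# Figure 9.3-style payoff matrix: option -> {state: payoff in dollars}.
# Green and blue rows follow the lecture; red's row is an illustrative fill-in.
matrix = {
    "bet red":   {"red drawn": 10, "green drawn": 0,  "blue drawn": 0},
    "bet green": {"red drawn": -5, "green drawn": 10, "blue drawn": 5},
    "bet blue":  {"red drawn": -5, "green drawn": 5,  "blue drawn": 5},
}

def better(a, b):
    """a is better than b: at least as good in every state, strictly better in some."""
    return all(a[s] >= b[s] for s in a) and any(a[s] > b[s] for s in a)

def eliminate_worse(matrix):
    """Drop any option that is worse than some other option."""
    return {
        name: row
        for name, row in matrix.items()
        if not any(better(other, row) for other in matrix.values())
    }

def best_option(matrix):
    """Return the option that is better than every other one, if there is one."""
    for name, row in matrix.items():
        if all(better(row, other) for other_name, other in matrix.items() if other_name != name):
            return name
    return None

print(list(eliminate_worse(matrix)))  # blue is worse than green, so it drops out
print(best_option(matrix))            # None: red and green are neither better nor worse
```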
Sometimes an option, even if it isn't the best, can be good enough: you're happy with the outcome, so you'll choose that option. We can define this a little more rigorously. It just requires deciding whether there's some payoff that is the lowest you would be happy with; if there's an option that would guarantee at least that level of payoff, then you might say, OK, I'll pick that one. For example, if you're playing a betting game and you're thinking, you know what, I enjoy this game, I'm having a good time, and if I can just break even, play for a few hours, and have fun, I'll be happy — I just don't want to lose money — well, if there's an option that guarantees there's no way you'll lose money, and you could even potentially win some, but at the very worst you walk away having had a good time, then that would be an appropriate option for you.

So let's define these terms. Call the satisfaction level the minimum value (or rank, if you're using ranked outcomes) that the decision maker regards as a satisfactory payoff for a given decision problem. That's the satisfaction level for that particular decision maker, and it will differ from person to person. We'll call an option a satisfactory option if every outcome associated with that option has a value at least as great as the decision maker's satisfaction level; if so, that option is a satisfactory option for that decision maker in that problem. Again, if you're someone who likes to gamble but doesn't really like to lose money — you just enjoy having fun, having a few drinks at the table, and getting into the banter with the other players for a few hours — then maybe you want to bet on red, where the worst you can do is break even; that would be a satisfactory option for you if your satisfaction level is zero. This suggests a strategy, the satisfactory option strategy: if a decision problem with no best option contains a satisfactory option, choose it. As I've just said, in figure 9.3 betting on red is a satisfactory option if you're satisfied with breaking even. Now, you might be left with more than one satisfactory option; if there were more than one option where the worst you could do is break even, you'd potentially have to use a different strategy to choose among those. But it still lets you rule some things out — maybe one of them has a higher potential payout at the same satisfaction level, so choose that one, or maybe they're essentially equivalent and it's a coin flip. It's certainly possible to have more than one satisfactory option.

Here's figure 9.4. It has no best option, but it does have a satisfactory option if you're satisfied with zero, which would be betting on blue. In this one, if you bet on red you have the possibility of losing five bucks if blue is drawn; same with betting on green, you might lose five bucks; but if you bet on blue, the worst you can do is zero, if red is drawn. Now, you can always play this strategy if you're willing to lower your satisfaction level enough; go low enough and you can always find some satisfactory option. So let's define the security level of an option: it's the lowest-valued (or lowest-ranked) outcome associated with that option. For red the security level is minus five; that's the lowest value you can get. For blue the security level is zero; that's the lowest you can get if you bet on blue. The option with the greatest security level is the option with the highest attainable satisfaction level.
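Here is a minimal sketch of security levels and the satisfactory-option check on a figure 9.4-style matrix. The red and blue rows follow the payoffs mentioned in the lecture; green's row is a plausible fill-in consistent with what is said (it can win $10, and its worst case is losing $5).

```python
# Figure 9.4-style payoffs: option -> {state: dollar payoff}.
# Red and blue rows are as described in the lecture; green's row is a plausible fill-in.
matrix = {
    "bet red":   {"red drawn": 10, "green drawn": 0,  "blue drawn": -5},
    "bet green": {"red drawn": 5,  "green drawn": 10, "blue drawn": -5},
    "bet blue":  {"red drawn": 0,  "green drawn": 5,  "blue drawn": 5},
}

def security_level(row):
    """The lowest-valued outcome an option can leave you with."""
    return min(row.values())

def satisfactory_options(matrix, satisfaction_level):
    """Options whose every outcome is at least the decision maker's satisfaction level."""
    return [
        name for name, row in matrix.items()
        if security_level(row) >= satisfaction_level
    ]

for name, row in matrix.items():
    print(name, "security level:", security_level(row))
# bet red: -5, bet green: -5, bet blue: 0

# If breaking even is good enough for you, only betting on blue qualifies.
print(satisfactory_options(matrix, satisfaction_level=0))  # ['bet blue']
```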
This suggests a general strategy, for any matrix with any set of values, that we can call the play-it-safe strategy: choose the option with the greatest security level. Given the values, there doesn't even have to be a break-even option; maybe the best you can do is lose five bucks with any of the options. Whatever the values are, you can apply the general strategy of choosing the option with the greatest security level, also known as the maximin strategy. This strategy just aims to minimize losses and pays essentially no attention to possible gains; if there were an option with no way to win and no way to lose, but it happened to have the highest security level, you would choose it all the same.

Now, if you don't care about minimizing losses and only care about maximizing potential gains — the opposite of the play-it-safe strategy — you could pursue a different strategy. Every value matrix has at least one highest-valued outcome; for now let's assume there's only one, although obviously there could be more than one with the same highest value. If there's only one, the gambler strategy is to choose the option associated with the highest-valued outcome, also called maximax. (Kenny Rogers, "The Gambler" — if you don't know what that is, Google it.) If there's more than one outcome with the highest value — say betting on red could win you 10, but betting on green could also win you 10 — then look at the second-highest values, take those into account, and pick the option with the highest value and also the highest second value, and so on.

So what should we do? What's the right strategy: gamble, or play it safe? In figure 9.4 the gambler strategy recommends betting on green, because you could win 10, the highest value on the board. Of course, betting on red can also win you 10 bucks, so since the highest values are tied we look at the second highest: green's second-highest is 5, while red's second-highest is 0, so green is the one to play if you're running the gambler strategy. The play-it-safe strategy would recommend blue, because the lowest you can get is zero, whereas with red and green the lowest you can get is losing five bucks. So which one should we choose? Is one better than the other? The general consensus is that there's no rigorous way to choose between them; it really comes down to your personality and your own goals. Still, that's a fair amount of progress given that we know nothing about the state of the world: we now have a number of strategies we can use to choose an option, given our own personality and what we're comfortable with as far as potential wins, potential losses, and so on.
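Here is a sketch of the two strategies side by side on the same figure 9.4-style matrix used above (green's row is again a plausible fill-in). The gambler's tie-break — comparing second-highest payoffs when the top payoffs match — falls out of comparing each option's payoffs sorted from best to worst.

```python
# Same figure 9.4-style matrix as above (green's row is a plausible fill-in).
matrix = {
    "bet red":   {"red drawn": 10, "green drawn": 0,  "blue drawn": -5},
    "bet green": {"red drawn": 5,  "green drawn": 10, "blue drawn": -5},
    "bet blue":  {"red drawn": 0,  "green drawn": 5,  "blue drawn": 5},
}

def play_it_safe(matrix):
    """Maximin: choose the option with the greatest security level (highest worst case)."""
    return max(matrix, key=lambda name: min(matrix[name].values()))

def gambler(matrix):
    """Maximax: choose the option with the highest-valued outcome, breaking ties by
    the second-highest outcome, then the third, and so on."""
    return max(matrix, key=lambda name: sorted(matrix[name].values(), reverse=True))

print(play_it_safe(matrix))  # 'bet blue'  (worst case: break even)
print(gambler(matrix))       # 'bet green' (can win 10, and its second-best is 5 vs red's 0)
```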
Really, in order to do much better than this, we need some information about the world if we want to say anything more rigorous about the right way to make a decision. That situation — where we have some information about the world — is decision-making with risk. We don't know with certainty what the state of the world is (that would be decision-making with certainty), but we do have some information, so it's not decision-making with complete uncertainty either. What we know are the probabilities of the various states of the world obtaining.

Now, in this course, notice that often the best we can do is estimate some probability with 95% confidence: we can't say for sure what the probability of something is in the population, but with 95% confidence we can say it falls in some interval. That is still true, but just so we can get through this chapter without making it inordinately complex, let's set that aside for a moment and pretend we know the exact probabilities. You can still make the calculations with intervals, but we'll simplify for now and say we know the exact probabilities with total confidence. This kind of decision-making just can't be done when the outcomes are merely ranked; we're going to have to perform some mathematical operations on the values, namely multiply them by the probabilities. So you'll need some measure — some numerical value assigned to the outcomes; a bare ranking won't do — though we will see a way later on to turn a ranking into numerical values.

OK, so decision-making with risk involves another concept called expected value. Suppose you're playing the game again and we're looking at figure 9.4 — the same values we drew up before we knew anything about the states of the world — but now we have a little more information: suppose we know the jar contains 30% red marbles, 30% green marbles, and 40% blue marbles. Now, should we bet on red? Well, we have more information, so we can make a much better decision, and it's going to involve the concept of expected value. Expected value means: imagine you were to play the game a large number of times. If you did, a bet on red would win, on average, three times in ten plays, because 30% of the marbles are red. So if you were betting on red over ten games, you would win $10 three times, since red would be drawn about three times out of ten. You would break even three times, because green is 30% of the marbles, so you'd expect green to be drawn on average three times out of ten. And you would lose $5 four times, because 40% of the marbles are blue, so over the long run about four out of ten draws would be blue, and if you bet on red and they draw blue, you lose five bucks. So over ten games you would net $10: $30 from the three wins, plus $0 from the three break-evens, minus $20 from the four losses — 30 plus 0 minus 20 — which is $10 over ten games, or an average of $1 per game. In fact, you can do the same calculation without imagining ten or a hundred plays in your head: do the math over one game and just multiply each outcome by its probability. That's 0.3 × 10 plus 0.3 × 0 plus 0.4 × (−5), and you come up with a $1 expected value for betting on red.

More rigorously: the expected value (EV) of an option is the weighted sum of the values of its possible outcomes, the weights being the probabilities of the corresponding states. By weighted sum we just mean that we multiply the value of each outcome by the probability that it occurs: a 0.3 probability of winning ten bucks, a 0.3 probability of winning zero, a 0.4 probability of winning minus five bucks, and so on.
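Here is a sketch of that calculation for the jar with 30% red, 30% green, and 40% blue marbles, using the bet-on-red payoffs from the lecture (win $10 on red, break even on green, lose $5 on blue).

```python
# Probabilities of each state of the world (the marble proportions in the jar).
probs = {"red drawn": 0.3, "green drawn": 0.3, "blue drawn": 0.4}

# Payoffs for betting on red, as described in the lecture.
bet_red = {"red drawn": 10, "green drawn": 0, "blue drawn": -5}

def expected_value(payoffs, probs):
    """Weighted sum of an option's payoffs, weighted by the state probabilities."""
    return sum(probs[state] * payoff for state, payoff in payoffs.items())

print(expected_value(bet_red, probs))  # 0.3*10 + 0.3*0 + 0.4*(-5) = 1.0 dollars per play
```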
This suggests a strategy, the expected value strategy: choose the option with the greatest expected value. You can run the numbers yourself; we found the expected value of betting on red was $1, and the book tells us this strategy should lead us to bet on blue — do the math: 0.3 × 0 plus 0.3 × 5 plus 0.4 × 5. If there happens to be a tie for the greatest expected value, just eliminate the other options and treat the remaining choices as a case of decision-making under uncertainty; even with a tie we still have further ways to choose between the two — maybe one is a better option, maybe you decide on the basis of security levels, and so on.

So is the expected value strategy the best strategy? Consider figure 9.5. In this chart, betting on red has the lowest expected value; they've run the numbers for us, and the last column is the expected value: betting on red 3.5, betting on green 4, betting on blue 7, so blue has the highest expected value. But betting on red has the highest security level: the worst you can do betting on red is zero dollars, you can't lose anything, whereas betting on blue the worst you can do is lose ten bucks. So over the long run betting on blue will make you the most money, but if you're only playing one game, it's certainly possible you could lose ten bucks betting on blue. In that case, is it irrational to want to play it safe and just bet on red, because you might lose ten bucks and you don't want to, even though decision theory tells you betting on blue has the highest expected value? Maybe not; maybe that's not so irrational. Certainly if you were going to play all day you should keep betting on blue, because over the long run it pays off; but if you're only betting once, it may be rational to play a different sort of strategy. It also might be the case that raw dollar amounts just don't fully capture a person's values. If you are very rich, losing $10 doesn't really matter to you; if you're totally broke and that $10 decides whether you get to eat, it means a lot. So it might be rational for a very rich person to choose blue and roll the dice, and totally rational for a broke person to choose red, because at the very least they'll still get to eat that day. Even though we have the expected value strategy, it doesn't necessarily mean it's irrational to play the safe option; there's a lot more to it than just the expected value, but the chart tells us a lot.
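To see how the expected-value strategy and the play-it-safe strategy can pull in different directions, here is a sketch with made-up payoffs and probabilities — these are not the figure 9.5 values, which aren't fully given in the lecture — chosen so that one option has the highest expected value while a different option has the highest security level.

```python
# Hypothetical payoffs and probabilities, chosen so the two strategies disagree;
# illustrative numbers only, not the values from figure 9.5.
probs = {"red drawn": 0.3, "green drawn": 0.3, "blue drawn": 0.4}
matrix = {
    "bet red":   {"red drawn": 0,   "green drawn": 5,  "blue drawn": 5},
    "bet green": {"red drawn": -5,  "green drawn": 10, "blue drawn": 5},
    "bet blue":  {"red drawn": -10, "green drawn": 20, "blue drawn": 10},
}

def expected_value(payoffs):
    return sum(probs[s] * v for s, v in payoffs.items())

def security_level(payoffs):
    return min(payoffs.values())

best_ev = max(matrix, key=lambda name: expected_value(matrix[name]))
safest  = max(matrix, key=lambda name: security_level(matrix[name]))

print(best_ev, expected_value(matrix[best_ev]))  # bet blue, EV = 7.0 (best over many plays)
print(safest, security_level(matrix[safest]))    # bet red, worst case = 0 (can't lose)
```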
OK, so hopefully that was clear enough; if not, definitely read the chapter, review the slides, or see me in office hours and we can talk about it. For the rest of the lecture we're going to talk a bit about a strategy developed by Frank Ramsey — this is him here. He was a philosopher who died relatively young, but he had insights in a lot of different areas, including decision theory. He came up with a way of assigning values in any given decision problem. It requires a little terminology, but it's not too tricky: something called an ethically neutral proposition. A proposition here just means a sentence, or, maybe more strictly, some state of affairs in the world; don't worry too much about what a proposition is. An ethically neutral one is just one that nobody has any interest in being true or false, apart from its effect on some other proposition they do care about. What's an example? Flip a coin: the proposition that the coin lands heads up is something no one cares about at all, unless you're betting a bunch of money on it. On its own, "the coin lands heads up" — who cares? And by "ethically" we don't exactly mean morals here; we just mean, do you care about it at all, does it matter to you? So it's something you would not care about on its own, but you would care whether the coin lands heads up if another proposition were true, like "if the coin lands heads up, I win ten bucks." Now all of a sudden I'm interested. That's the idea: there are ethically neutral propositions, which are basically what gets gambled on, and then there are the non-ethically-neutral propositions, the ones you care about, like winning money.

So: we assign preferences to the outcomes that are not ethically neutral, and we're going to place them on a scale between zero and one. The idea is that you'll be able to take merely ranked outcomes and convert them into numerical values. Take your favorite outcome and give it a value of one; take your least favorite outcome and give it a value of zero. Then, for any intermediately ranked outcome, you figure out when you would be indifferent between two choices: getting that intermediate outcome guaranteed, or taking a gamble that gives you the outcome you like more with some probability. It'll make more sense when we see the example. But say I've got two outcomes that are ranked: if I'm indifferent between getting one of them for sure and getting the other with, say, a 0.83 probability, then that probability is the utility, the value I assign to that outcome. Again, we'll see an example.

I expect that was a little confusing, so let's work through one. Here's Kurt, a high school football player offered scholarships from two programs, State University and Out-of-State University; we'll call them SU and OSU. Neither scholarship guarantees that he'll play: he'll get to go to the school, but they can't promise he'll actually be in the games. After some research, Kurt estimates he has about a 0.75 probability of playing if he goes to OSU and a 0.6 probability of playing if he goes to SU; SU is just a more competitive program with better players. So again: at OSU, a 75% chance he plays and a 25% chance he rides the bench; at SU, a 60% chance he plays and a 40% chance he rides the bench. And here are his ranked preferences. Ideally he'd love to attend SU and play football; SU has the best team, so that would be great. Second choice: attend OSU and play football. Third choice: attend OSU and don't play football. Fourth choice: attend SU and don't play football.
So basically, what we're looking at here is that Kurt, more than anything, wants to play: the two outcomes where he plays are ranked one and two. But he prefers OSU academically, so if he doesn't get to play, he'd rather be at OSU. That's how he ranks his choices.

OK, so let's apply Ramsey's method. We assign a value of one to the top choice, attend SU and play football, and a value of zero to the bottom choice, attend SU and don't play football. Now we need to figure out the utility values of the other two choices, and for that we need some ethically neutral proposition. Here's the one we'll use to calculate the intermediate value of "attend OSU and play football": rolling a die and getting either a one, two, three, four, or five — a five-in-six chance, so the only way the gamble is lost is rolling a six. That's a good probability, about 0.83. Kurt thinks he would be indifferent between being guaranteed to play at OSU, his second choice, and getting to play at SU, his first choice, with a probability of 0.83. Those two are basically the same to him: a guarantee of playing at OSU, or an 83% chance of playing at SU. That's the idea of the method. At any lower probability he'd choose OSU, the safe bet — if it were a 75% chance of attending and playing at SU, he'd just choose OSU — and at any higher probability he'd take the gamble. But at exactly 0.83 he's indifferent. So, according to Ramsey, Kurt assigns the outcome "attend OSU and play football" a utility of 0.83.

I hope that helps. I think that, in general terms, the Ramsey method is sound; it's a little confusing, but hopefully working through it makes it clearer. Granted, it's a weird way to think about your options — would I rather have this one guaranteed, or a 75% chance of that one, or a 0.83 chance of that one? — and the Ramsey method does require you to think that way. But if you can wrap your head around it, we have a nice, rigorous way of assigning values. I skipped over the fact that he also assigns a value to "attend OSU and don't play football" in the same way; there the indifference point is 0.25 — I think the book uses flipping a coin some number of times or something, but anyway, it's in the text, and it's the same basic idea: the ethically neutral proposition can be anything, rolling a die, flipping a coin however many times, whatever. It comes out that the value he assigns to attending OSU and not playing football is 0.25; he has a strong preference for playing football, as you can see.
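Here is a small sketch of the bookkeeping in Ramsey's method as applied to Kurt: the top- and bottom-ranked outcomes are pinned at 1 and 0, and each intermediate outcome gets the probability at which Kurt is indifferent between having it for certain and a gamble on the top outcome. The 0.83 and 0.25 indifference points are the ones from the example; the helper function name is just for illustration.

```python
def ramsey_utilities(ranked_outcomes, indifference_probs):
    """Assign utilities to outcomes ranked from most to least preferred.

    ranked_outcomes: list of outcomes, best first, worst last.
    indifference_probs: for each intermediate outcome, the probability p at which
    the decision maker is indifferent between that outcome for certain and a gamble
    that yields the best outcome with probability p (and the worst outcome otherwise).
    That indifference probability is taken as the outcome's utility.
    """
    best, *middle, worst = ranked_outcomes
    utilities = {best: 1.0, worst: 0.0}
    for outcome in middle:
        utilities[outcome] = indifference_probs[outcome]
    return utilities

# Kurt's ranking and his two indifference points from the example.
kurt = ramsey_utilities(
    ranked_outcomes=[("SU", "play"), ("OSU", "play"), ("OSU", "no play"), ("SU", "no play")],
    indifference_probs={("OSU", "play"): 0.83, ("OSU", "no play"): 0.25},
)
print(kurt)
# {('SU', 'play'): 1.0, ('SU', 'no play'): 0.0, ('OSU', 'play'): 0.83, ('OSU', 'no play'): 0.25}
```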
OK, so given these utility values and the probabilities he estimated after doing some research on these programs, we can now work out the decision matrix and figure out the expected utility of each option, and then just maximize expected utility. For attending SU there are two possible outcomes. There's "attend SU and play football": a 0.6 probability times a utility of one, so that's the first term. Add to that the other possible outcome if he chooses to attend SU, which is not playing football: a 0.4 probability times a utility of zero. Add those up, 0.6 plus 0, and you get 0.6; that is the expected utility of attending SU. Now for OSU we do the same thing. Take the preferred outcome, attend and play — one way it could go if he chooses OSU — which is a 0.75 probability times a 0.83 utility, and add the other possible outcome if he attends OSU: a 0.25 probability times a 0.25 utility. That's about 0.62 plus 0.06, so roughly a 0.68 expected utility. It turns out that the option with the highest expected utility is actually attending OSU, so he should go to OSU. That may be surprising; you might not have predicted it without taking the time to run the numbers. This is where decision theory can really pay off, if you take the time to think through these things somewhat rigorously. Of course, the expected utility here is an average over an imaginary hundred times you might make this choice; in reality he's only making it once, and only one of the four outcomes will happen. So he might choose the option with the highest expected utility, attending OSU, and still end up with his third choice, attending OSU and not playing football. That's certainly a possibility. But that's life.
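Putting the probabilities and the Ramsey utilities together, here is a sketch of the expected-utility calculation for Kurt's two options, matching the numbers worked out above (the 0.68 in the lecture is the rounded value of 0.685).

```python
# Expected utility of each scholarship option, using Kurt's estimated probabilities
# of playing and his Ramsey utilities for the four outcomes.
utilities = {
    ("SU", "play"): 1.0, ("SU", "no play"): 0.0,
    ("OSU", "play"): 0.83, ("OSU", "no play"): 0.25,
}
prob_play = {"SU": 0.6, "OSU": 0.75}

def expected_utility(school):
    p = prob_play[school]
    return p * utilities[(school, "play")] + (1 - p) * utilities[(school, "no play")]

for school in ("SU", "OSU"):
    print(school, round(expected_utility(school), 3))
# SU  0.6    (0.6*1 + 0.4*0)
# OSU 0.685  (0.75*0.83 + 0.25*0.25) -> the higher expected utility, so choose OSU
```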