In this lecture, we're going to cover human judgment, decision making, and bias.

Humans make many judgments and decisions each day, and these vary in their importance and their consequences, both for the individuals themselves and for other people. So what is judgment and decision making, or JDM for short? It refers to the process of selecting among alternative courses of action after some degree of deliberation, either at the individual level or at the group or collective level. Judgment and decision making thus involves predicting or anticipating the possible consequences of alternative courses of action, as well as our possible evaluative reactions to those consequences. Using data encourages us to make better decisions and come to better judgments, and this is really at the heart of HR analytics; after all, a big part of HR analytics is acquiring, analyzing, and interpreting data.

The rational decision-making model is one rigorous approach we can take, and it's one way to ensure that we make good, or at least better, decisions in high-stakes situations. The scientific process is an important way to inform rational decision making, because it is based on the philosophy of empiricism: the idea that knowledge is based on evidence, or data. The scientific process goes as follows: first, you formulate a hypothesis; then you design a study to test that hypothesis; you collect data to test the hypothesis; you analyze those data; you interpret the results; and you report the findings, possibly repeating the cycle again. The scientific process can be thought of as one type of rigorous problem solving, because it is built around formulating a hypothesis, which is much like identifying a problem, and then collecting data to evaluate that problem and the different decision alternatives you might pursue to address it.

Again, the rational decision-making model is especially useful for high-stakes decisions, so let's walk through its steps. First, we identify the problem. After that, we establish decision criteria; these criteria will be used to evaluate the different decision alternatives we generate in subsequent stages. Once we've established the decision criteria, we're ready to weight them, which essentially means ranking them or assigning some value to them based on their importance to you and to your organization or team. After that, we generate different decision alternatives, which are the different courses of action we might take. We then evaluate those alternatives, choose the best one, implement the decision, and finally evaluate the decision itself. This last phase of evaluating the decision might lead you to identify new problems, at which point the rational decision-making cycle starts over again.
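To make the weighting and evaluation steps more concrete, here is a minimal sketch in Python of how weighted decision criteria might be combined to score and rank alternatives. The criteria, weights, alternatives, and scores are all hypothetical placeholders, and the simple weighted-sum scoring shown here is just one way an analyst might formalize these steps.

```python
# Minimal sketch of the "weight the criteria" and "evaluate the alternatives"
# steps of the rational decision-making model. All criteria, weights, and
# scores below are hypothetical placeholders, not real data.

# Decision criteria and their weights (higher weight = more important).
weights = {"effectiveness": 0.5, "cost": 0.3, "ease_of_implementation": 0.2}

# Each decision alternative is scored on every criterion (e.g., a 1-5 scale).
alternatives = {
    "alternative_a": {"effectiveness": 4, "cost": 3, "ease_of_implementation": 4},
    "alternative_b": {"effectiveness": 5, "cost": 2, "ease_of_implementation": 3},
    "alternative_c": {"effectiveness": 3, "cost": 5, "ease_of_implementation": 5},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Combine an alternative's criterion scores into one weighted total."""
    return sum(weights[criterion] * score for criterion, score in scores.items())

# Rank the alternatives from best to worst by weighted score.
for name, scores in sorted(alternatives.items(),
                           key=lambda item: weighted_score(item[1], weights),
                           reverse=True):
    print(f"{name}: {weighted_score(scores, weights):.2f}")
```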
It's important to remember that this model is really designed for high-stakes decision making. We make a lot of decisions on a day-to-day basis that rely mostly on intuition or are guided by our emotional experiences and reactions, and it would be very time consuming to engage in rational decision making for every little decision we make in a given day. But when it comes to high-stakes decisions, especially those intended to help us achieve strategic objectives within the organization, we want to make sure we take a very rigorous approach to decision making, and HR analytics can be really helpful here.

So let's look at an example of how this rational decision-making model might unfold within the selection context in human resource management. Starting with identifying the problem: imagine we have new employees who are not performing well, and we've determined that we likely need better selection tools to help identify those applicants who are likely to be high potential, or better performers, should they be hired. Next, we establish decision criteria. In the context of selection, we might decide that criterion-related validity, the extent to which scores on the selection tools predict future job performance, will be one of our criteria, followed by the cost of the tools and applicants' reactions to the tools themselves. After that, we apply weights to those decision criteria; perhaps we decide that criterion-related validity is the most important criterion, followed by cost and then applicant reactions.

Next, we generate the decision alternatives, or courses of action. Perhaps during this phase we identify and develop structured interview questions, a work sample, and a personality test or inventory, in order to help us find more talented individuals who are likely to perform better on the job should they be hired. Now we're ready to evaluate these different courses of action, and this is where we acquire, analyze, and interpret data. In other words, we'll likely run a criterion-related validation design, whether a concurrent or a predictive design, for each of these selection tools, as well as assess the costs of the tools and perhaps survey applicants to gauge their reactions, that is, their evaluative judgments about how much they liked or disliked the tools they experienced.

After that, we choose the best alternative. Based on the data we analyzed when evaluating the decision alternatives, perhaps we find that the work sample was the best predictor of future job performance, and so we decide to go with that one. We then implement our decision, rolling out the work sample as the new selection tool we apply to subsequent job candidates. Of course, we'll then need to evaluate that decision: when we cross-validate the findings with an entirely new sample of candidates, how well do they hold up? Does the work sample still predict job performance in a new group of applicants? This is really about acquiring, analyzing, and interpreting an additional set of data, and it might lead us to find that, while the work sample is working quite well, there are issues with its cost, or that applicants don't react well to it, perhaps because it's too long or too invasive. If so, we might need to redesign it, which is a new problem that restarts the cycle of rational decision making.
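To make the evaluation step of this example more concrete, here is a minimal sketch of how a criterion-related validity coefficient might be estimated for the work sample, assuming we have work sample scores and later job performance ratings for the same people. The numbers are hypothetical placeholders, and a real validation study would of course involve a much larger sample and a carefully planned design.

```python
# Minimal sketch of estimating criterion-related validity for a selection tool:
# the correlation between work sample scores and later job performance ratings.
# The data below are hypothetical placeholders, not results from a real study.
from scipy.stats import pearsonr

work_sample_scores = [72, 85, 90, 60, 78, 95, 66, 88, 70, 82]               # predictor
job_performance    = [3.1, 4.0, 4.4, 2.8, 3.6, 4.7, 3.0, 4.2, 3.2, 3.9]     # criterion

validity, p_value = pearsonr(work_sample_scores, job_performance)
print(f"Criterion-related validity: r = {validity:.2f} (p = {p_value:.3f})")

# In a predictive design, predictor scores are collected at hiring and the
# criterion later; in a concurrent design, both are collected from current
# employees at the same time. Either way, the validity coefficient would feed
# into the weighted evaluation of the decision alternatives.
```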
Now let's talk about the different biases and errors humans make in their judgments and decisions. One of the biggest ones we might fall victim to is what's referred to as anchoring and adjustment bias. As the name implies, we tend to anchor on, or focus on, certain bits of information at the expense of others. The information we anchor on may have more or less truth to it, so we should be cautious; certain things may simply be more salient to us because we've heard about them more recently. For instance, imagine you're trying to identify the drivers of the high turnover rate you're currently experiencing among your company's customer service representatives. You might recall employee engagement most readily, perhaps because you've seen advertisements or heard people talking about it, but that doesn't necessarily mean engagement is one of the top drivers of those representatives' decisions to leave. So we want to be very thoughtful about anchoring on one piece of information over another, and this is where the rational decision-making model can come in.

Next is the availability bias, which humans are also quite susceptible to. Information that is more readily available to us tends to be more salient, and for that reason we tend to attend to it more, simply because it comes to mind more easily. We want to be cautious about this as well.

We also want to be very sensitive to what is referred to as the escalation of commitment bias. This is the idea that once you've gone down a path and invested resources along the way, at some point you might tell yourself that you need to keep going, say, continuing to use a particular selection tool or training program because you've already invested so much in it, even though you're now seeing conflicting evidence suggesting that it isn't such a great tool or program after all. We want to be very careful not to continue down a path when the evidence suggests it's not the best one.

Another bias is hindsight bias. After an event has occurred, we tend to see things differently than we did before it occurred; with the outcome in hand, we have a hard time putting ourselves back in the position of what we were thinking beforehand. So if something big happens in your workforce, say a sudden wave of turnover among a group of employees in a particular job, in hindsight we might look back and say it should have been obvious: there was a lot of discontent, people weren't engaged, and so forth. But perhaps it wasn't so obvious before the fact; maybe you hadn't measured those things ahead of time. This is why targeted evidence and data collection can be so valuable: hopefully we can gather most of that information before the event occurs, instead of sitting in our armchairs afterward and reflecting that it should have been obvious to us. We should remember that it's easy to do these post-mortems and to think that someone should have made a different decision or recognized the problem ahead of time, but that's easier said than done, because we often didn't have full information before the event occurred.
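Returning to the customer service representative example, here is a minimal sketch of how targeted data collection might counter anchoring, availability, and hindsight: rather than relying on whichever driver comes to mind most readily, we compare several candidate drivers against who actually left. The variables and values are hypothetical placeholders, and a point-biserial correlation is just one of many ways to examine these relationships.

```python
# Sketch: compare candidate turnover drivers against actual turnover, rather
# than anchoring on whichever driver happens to be most salient. All data are
# hypothetical placeholders for a group of customer service representatives.
from scipy.stats import pointbiserialr

left_company = [1, 0, 1, 0, 0, 1, 1, 0, 0, 1]  # 1 = left, 0 = stayed

candidate_drivers = {
    "engagement":         [2.5, 4.2, 3.0, 4.5, 3.9, 2.8, 3.1, 4.4, 4.0, 2.6],
    "pay_satisfaction":   [3.8, 3.5, 2.2, 4.1, 3.0, 2.0, 2.4, 3.9, 3.6, 2.1],
    "supervisor_support": [3.0, 4.0, 3.5, 4.2, 3.8, 3.1, 2.9, 4.3, 4.1, 3.2],
}

# Point-biserial correlation between each driver (continuous) and turnover
# (binary). These are associations only; they do not establish causation.
for driver, scores in candidate_drivers.items():
    r, p = pointbiserialr(left_company, scores)
    print(f"{driver}: r = {r:.2f} (p = {p:.3f})")
```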
Another type of error we tend to make as human beings is equating correlation with causation. If two events tend to co-occur, we often assume that one event is leading to, or causing, the other. Even when we're engaging in HR analytics and analyzing data, for instance when looking at the correlation between two variables such as job satisfaction and job performance, a significant, positive relationship doesn't necessarily mean that job satisfaction causes job performance. Theoretically and conceptually, you could argue that people who are more satisfied with their job are more likely to perform better in that job; conversely, you might argue the other way around, that people who perform at a higher level receive more positive feedback about their work and perhaps develop better relationships, which then makes them more satisfied with their job. So again, we want to be careful not to equate correlation with causation.

Another bias we should be careful about is what's referred to as sampling bias, and this one is very important in the context of HR analytics. When we're acquiring data from a population, say the population of people working in a given job within our company, we want to make sure we get a representative sample. To the extent we don't, it's going to bias the inferences we make. So we want to either randomly select people or use targeted sampling methods that still yield a representative sample of the population we're trying to make inferences about.

Finally, another bias humans are very susceptible to is overconfidence bias. People tend to believe they're better at predicting the future than they actually are, and this is why a lot of people rely on their gut or their intuition to make major decisions: they believe they can predict the future, and they may ignore evidence from past instances in which their predictions weren't entirely accurate, or weren't accurate at all. This is where data can be used to help predict future events through predictive analytics and other techniques. At the same time, we want to be cautious even with predictive analytics: these models are going to have error in their predictions, and we're never likely to perfectly predict future human behavior, because humans are inherently complex beings. For that reason, we want to be careful when making predictions, not saying that something is 100% going to happen, but that there is a greater likelihood of it happening in the future. This is one example of how we can help mitigate overconfidence bias.
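To show what this looks like in practice, here is a minimal sketch of a predictive model that expresses its forecast as a likelihood rather than a certainty. The features, data, and model choice (a simple logistic regression) are hypothetical placeholders; a real application would require far more data, a held-out sample, and careful validation.

```python
# Sketch: predict the likelihood of turnover rather than claiming certainty.
# All data are hypothetical placeholders; a real model would require much more
# data, a held-out test set, and careful validation.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features for each employee: [engagement score, tenure in years]
X = np.array([[2.5, 1.0], [4.2, 3.5], [3.0, 0.8], [4.5, 5.0], [3.9, 2.0],
              [2.8, 0.5], [3.1, 1.2], [4.4, 4.0], [4.0, 2.5], [2.6, 0.7]])
y = np.array([1, 0, 1, 0, 0, 1, 1, 0, 0, 1])  # 1 = left, 0 = stayed

model = LogisticRegression().fit(X, y)

# Express the prediction for a new employee as a probability, not a certainty.
new_employee = np.array([[3.2, 1.5]])
prob_leaving = model.predict_proba(new_employee)[0, 1]
print(f"Estimated likelihood of leaving: {prob_leaving:.0%}")
```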
So, in this lecture, we talked about human judgment and decision making, as well as the different biases and errors humans make regularly. Engaging in the rational decision-making model can be a great way to ensure that we make good decisions in high-stakes settings and situations. In addition, we want to be very cognizant of the different biases and errors that can creep into human decision-making processes. Thank you very much.