Transcript for:
Understanding Smart Decision-Making Flaws

Why do smart people make dumb decisions? Why do conspiracy theorists think that we didn't land on the moon or that Hillary Clinton is a space alien? And why won't Bernice admit that the new Superman movie just isn't very good? We've talked about cognition before. We usually refer to it as the process that we use to think and solve crossword puzzles and stuff. But really, cognition involves knowing, remembering, understanding, communicating, and, to a certain extent, learning. And as truly wonderful as our brains are, we can be spectacularly bad at ALL of these things. We used to think our cognition worked like a computer -- logically processing information. But that cabbage-sized chunk of pink, wet brain matter in your skull can do a lot more than math, and the things that it does are certainly not always logical. Many experts argue that it's cognition that makes us truly human, and that everything that comes with it -- our preferences, prejudices, fears, and intuitions -- is what makes us the individuals that we are. We're not the only animals that show some evidence of cognition, of course: chimps and gorillas exhibit insight and planning; crows use tools; elephants teach each other. But our capacity as humans to figure stuff out is matched only by our ability to totally misjudge stuff. As prone as we are to brilliance and insight, we're equally likely to succumb to irrational thinking and false intuition. So, to borrow a riff from René Descartes: you think, therefore you are. Which means you're brilliant a lot of the time. And sometimes, you're just going to look stupid. [INTRO]

We all want to make sense of the world. And one of the major ways our cognition allows us to do that is by forming concepts -- mental groupings of similar objects, people, ideas, or events. We like to lump things together. Concepts simplify our thinking in such a fundamental way that we usually don't have to stop and think about using them; they're just there. And yet without concepts, we'd need a unique name for everything. You couldn't just ask me to shake the anglerfish -- because there'd be no concept of shake or fish, let alone stuffed, blue anglerfish. And if I told you I was devastated that I lost my anglerfish -- which I probably would be -- I'd also have to explain my emotions, their intensities, even the words themselves that I had to use. So basically, without concepts, no one would ever get anything done. We'd all be like a bunch of Ents taking all morning to say "Hey, what's up?"

We often organize our concepts by forming prototypes -- mental images or best examples of a certain thing. For example, if I say "bird," the general shape of a songbird probably pops into your head before, like, a penguin or chicken or emu, because robins and cardinals more closely resemble our bird prototype. Still, if I show you a picture of some crazy creature you've never seen before, and you note that it has feathers and a beak, you'll probably file it under the bird category, because it more closely resembles your concept of bird than your concept of rodent or overcoat or footstool. Concepts and prototypes speed up our thinking, but they can also box in our thinking, and lead to prejudice when we see something that doesn't fit our prototypes. A hundred years ago, the sight of a female doctor might have caused some heads to explode, because in people's tiny minds, the prototypes of "doctor" and "woman" didn't have any overlap. And actually, some people today still feel that way. Haters gonna hate.
So it's important to actively keep an open mind, to make room for evolving concepts, and to remember that concepts may sometimes hurt as much as they help.

One of the biggest ways our cognition works to our benefit, though, is through our ability to solve problems. We use our problem-solving skills all the time: how to assemble Scandinavian furniture, bake muffins with a missing ingredient, or handle the crushing disappointment of the new Superman movie. And we approach problem-solving in different ways -- sometimes we value speed; other times, accuracy. Some problems we figure out using trial and error -- you know, you try something, and if it doesn't work, you try it a different way, and keep at it until something works. Trial and error is slow and deliberate -- which may be good or bad, depending on the problem. We can also use algorithms and heuristics to come up with solutions. Algorithms are logical, methodical, step-by-step procedures that guarantee an eventual solution, though they may be slow to work through. Heuristics, on the other hand, are more like mental shortcuts -- simple strategies that allow us to solve problems faster, although they're more error-prone than algorithms. Say you're at the store, looking for a family-sized bottle of Sriracha. You could use an algorithm and methodically check every shelf and aisle in the store. Or you could use a heuristic and first search the Asian or condiment sections -- the places that make the most sense based on what you already know. The heuristic may be way faster, but the algorithmic approach guarantees you won't overlook the sauce along the way, because they stuck it in the deli or whatever dumb thing they did this week.

So algorithms, heuristics, and trial and error are problem-solving strategies that involve a plan of attack. But sometimes we get lucky while puzzling out a problem, and -- Aha! -- out of nowhere, a sudden flash of insight solves it. I'll use orange in the muffin recipe instead of lemon! Or, Sriracha lives in the Mexican section! For some reason! Neuroscientists have actually watched that kind of sudden, happy brain flash on neuroimaging screens. In one experiment, they gave subjects a problem to solve: what word can be added to the three words CRAB, PINE, and SAUCE to create a new compound word? Then they asked the subjects to press a button when they had the answer. While the subjects thought about it, scans showed activity in their frontal lobes, the areas involved in the focused attention of typical problem-solving. But right at the Aha! moment, just as they pushed the button, there was a clear burst of activity just above the ear in the right temporal lobe, which, among many other things, is involved with recognition. As for the answer -- well, we already gave you a hint earlier in the episode. Where's my fish?

Those sudden bursts of insight are awesome, but you can't count on them to solve all your problems. And just because something feels right doesn't mean it's truly correct. Because as inventive and smartypants as we may be, our cognition often leads us astray in all kinds of ways. For instance, we often look for, and favor, evidence that verifies our ideas, while we're more likely to avoid or ignore contradictory evidence -- a tendency known as confirmation bias. This is really similar to the overconfidence we've talked about, when you're basically more confident than you are correct.
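To make the Sriracha example above a bit more concrete, here's a minimal sketch in Python of the algorithm-versus-heuristic trade-off. The store layout and item lists are invented purely for illustration; the only details taken from the episode are the ideas of an exhaustive aisle-by-aisle check versus a check-the-likely-sections-first shortcut, and the joke about the sauce ending up in the deli.

```python
# A minimal sketch of the algorithm-vs-heuristic contrast.
# The store layout and item lists below are made up for illustration.

STORE = {
    "produce": ["apples", "kale"],
    "deli": ["ham", "sriracha"],            # they stuck it in the deli this week
    "asian": ["soy sauce", "rice noodles"],
    "condiments": ["ketchup", "mustard"],
    "baking": ["flour", "sugar"],
}

def algorithmic_search(item):
    """Check every aisle in order: slow, but guaranteed to find the item if it's there."""
    checks = 0
    for aisle, shelf in STORE.items():
        checks += 1
        if item in shelf:
            return aisle, checks
    return None, checks

def heuristic_search(item, likely_aisles):
    """Check the most plausible aisles first: usually faster, but it can miss."""
    checks = 0
    for aisle in likely_aisles:
        checks += 1
        if item in STORE.get(aisle, []):
            return aisle, checks
    return None, checks  # gives up without ever looking in the deli

print(algorithmic_search("sriracha"))                         # ('deli', 2)
print(heuristic_search("sriracha", ["asian", "condiments"]))  # (None, 2) -- the shortcut failed
```

The point of the sketch is just the trade-off described above: the exhaustive check never misses the item, while the shortcut is usually faster but can fail whenever the world doesn't match your expectations.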
When this kind of cognitive bias takes hold, you might cling to your initial conceptions in a kind of belief perseverance, even in the face of clear proof to the contrary. This happens all the time, and it can be maddening for people watching it happen. People still think that the earth is flat! It's like... WHAT? HOW? There are space pictures! I probably don't need to tell you -- people can really get weird and defensive when they evade facts and choose to see only the information that confirms their beliefs. They may even become functionally fixed, unable to view a problem from a new perspective. Instead, they just keep approaching a situation with the same mental set, especially if it's worked in the past. Say you've got a nail sticking out from a board, and you're like, "I need to take care of that!" There are rocks and bricks all around you. But because of your functional fixedness on the idea that only hammers work on nails, you don't even consider hitting it with the brick. Instead, you waste a bunch of time in the garage looking for a hammer, and you're angry and frustrated, and there's still a nail sticking up from the board. So, our mental set predisposes how we think, just as, you'll remember, our perceptual set predisposes how we perceive. This is what makes heuristics -- those super-convenient mental shortcuts that we all use -- so easily fallible.

In the 1970s, cognitive psychologists Amos Tversky and Daniel Kahneman researched how we make snap judgments, and discovered one way smart people make dumb decisions. They found that people believe an event will be more likely to occur if they can conjure up examples or memories of it, especially if those examples are particularly vivid, scary, or awesome. So, say you're in a casino and you win two dollars at a slot machine. Suddenly every flashing light and ringing bell in the place goes off. But when you lose -- which is the vast majority of the time -- it's just... crickets. With all their lights and noise-making, casinos make sure that wins are super vivid and memorable, while losses just go away unacknowledged. That way, the next time you're standing there with 100 bucks in your pocket, you're more likely to overestimate your chances of winning, because the memories of winning are more striking. The more mentally available those memories are, the more it seems that it's going to happen again. This is known as the availability heuristic. And it can warp our judgments of people, too. If we keep remembering news footage that shows people of a given group shooting guns, that can shape our impression of the entire group -- even if what we saw was only a tiny minority within that group.

Essentially, we are great at fearing the wrong things. We worry about being killed in a plane crash, or getting bitten in half by a shark, or accidentally choking on a dumpling. Thanks to our brain's b-roll of horrific images, we come to fear what's actually very rare, instead of worrying about much more common, but less memorable, ends like car accidents, cancer, and heart failure. Our thinking can also be swayed by framing, or how an issue is presented. Imagine you're considering climbing Everest, or getting a nose job, or eating a bowl of raw blowfish. I can frame the risks in different ways. Telling you that you've got a 95 percent chance of survival sounds a lot different than saying five out of a hundred people die doing this activity, though the information is exactly the same.
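The framing example is just arithmetic, but it's worth seeing the two statements reduce to the same number. A tiny sketch, in Python, purely as an illustration of the point above:

```python
# "95 percent chance of survival" and "5 out of 100 people die"
# describe the same risk, just framed differently.
survival_rate = 0.95
deaths_per_hundred = round(100 * (1 - survival_rate))  # rounded to avoid floating-point noise

print(f"{survival_rate:.0%} survive")      # 95% survive
print(f"{deaths_per_hundred} in 100 die")  # 5 in 100 die
```

Same number, different gut reaction, which is exactly what framing trades on.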
Our cognitive minds are capable of incredible intellectual feats and tremendous failures. We can solve problems better than any organism on the planet, but given the chance, we can also mess up a pretty simple judgment every day of the week. But if we're mindful of our capacity for error -- and if we honor our ingenuity and intellect -- I think our ability to solve any problem is nearly infinite. And that gives me a lot of hope. Seriously, though, where is my fish?

Today you learned how we use concepts, prototypes, and our mental sets to think and communicate, and how algorithms, heuristics, and insight help us solve problems. You also learned about how fixation, the availability heuristic, fear, overconfidence, and belief perseverance can get in the way of good decision-making and thinking.

Thank you for watching, especially to our Subbable subscribers, who make this whole channel possible. If you'd like to sponsor an episode of Crash Course, get a special Laptop Decal, or even be animated into an upcoming episode, just go to Subbable.com/crashcourse. This episode was written by Kathleen Yale, edited by Blake de Pastino, and our consultant is Dr. Ranjit Bhagwat. Our director and editor is Nicholas Jenkins, the script supervisor is Michael Aranda, who is also our sound designer, and the graphics team is Thought Café.