Transcript for:
Understanding Clinical Reasoning Systems

There are several component skills of clinical reasoning. In this short video we will discuss one of them: thinking fast and slow, or System One and System Two.

System One is thinking fast. When thinking fast, we tend to rely on intuition and previous experience, and this type of thinking is usually involuntary. Daniel Kahneman uses the term heuristics to describe thinking methods that rely on fast thinking. These heuristics can be beneficial, efficient, and appropriate in many daily activities; however, an over-reliance on fast thinking can also be problematic and can lead to errors in decision making. System Two thinking is slow and effortful. It is better suited to complex issues requiring more conscious and intentional thinking, and it tends to occur in situations that are new and unfamiliar. System Two tends to be more reliable.

We all rely on both System One and System Two thinking, but let's look at their role in clinical reasoning. Experienced clinicians build capacity for System One thinking through clinical experience, constructing scripts that develop pattern recognition. Instead of thinking about each sign, symptom, or piece of clinical information individually, they begin to chunk this information together and recall how the current situation is similar to or different from previous ones, making thinking more efficient. Experts rely on System One thinking until a situation does not fit a script or pattern; in other words, they notice that something is different. This noticing should shift them into System Two thinking, where they slow down, reflect on what is different, and gather more information. Novice clinicians tend to rely more on System Two thinking because they do not yet have substantial clinical experience on which to build scripts. As they gain experience, they too will begin to rely more on System One thinking, but the triggers to slow down and switch to System Two thinking are likely to occur more frequently than in expert practice.

Biases can be both positive and negative, but we need to be aware of them to balance and check our thinking. There are too many types of bias to describe them all here, but a few of the more common ones are availability bias, recency bias, confirmation bias, and premature closure.

Availability bias refers to giving priority to the first thought that comes to mind. This type of bias may be stronger in clinicians with a narrower experience base, who have had less exposure to alternative diagnoses or treatment approaches; it is very difficult to consider things we have no knowledge of. To minimize the influence of availability bias, carefully research your decisions and ask experts for guidance and feedback.

Recency bias is the tendency to give more weight to events that happened recently. An example might be a physical therapist who detects a wider-than-normal pulse of a patient's aorta during abdominal palpation, and it turns out to be an aortic aneurysm; for the next month or so, the physical therapist palpates every patient's aorta regardless of risk factors.

Confirmation bias occurs when we gather information selectively, or interpret it in a way that supports our favored conclusion while ignoring alternatives. When clinicians fall into the confirmation bias trap, they are no longer following the scientific process but are instead building a case for why they are right.

Premature closure accounts for a large proportion of misdiagnoses. It is the tendency to end the decision-making process early and accept a diagnosis even though it has not been completely explored and verified, potentially resulting in ineffective care. This is similar to the parable of the blind villagers who each encounter only a portion of the elephant and come to a conclusion about what they are experiencing that does not take into account the entirety of the situation.

Elimination of bias is not a realistic goal, and may not be helpful anyway. What is more important is to recognize common bias traps and to create checkpoints to control for bias when it leads us astray. As we have already discussed, both System One and System Two are valuable, so how do we train ourselves to shift appropriately from System One to System Two when we encounter warning signs?

There are several things we can do to control for our biases. One is to develop an awareness of the different types of bias: simply by educating ourselves on the many ways our minds can trick us, we become more mindful and can avoid, or at least recognize, these mental traps. A clinical approach that includes planned bias checkpoints is recommended. Another effective strategy is systematic reflection. For example, a clinician may take a minute after a patient interview to quickly ask themselves what they are thinking in the moment and to consider alternatives that do not come immediately to mind; this may allow them to plan their physical exam more thoughtfully. This type of reflection in action can be instilled in novice and even experienced clinicians through the use of guided clinical reasoning, helping them build a systematic approach to recognizing bias. Reflection on action is also very helpful in developing the ability to shift between System One and System Two effectively. It is particularly useful after a challenging case or when things did not go as planned. Reflection on action essentially helps us build a better System One for future action.

In summary, System One and System Two thinking are both very useful in driving our clinical reasoning: to quickly recognize patterns, and also to remind us to slow down, notice, and check biases before we get off course. Experts typically spend more time in System One, but when they notice warning signs, they shift to System Two seamlessly. We are all susceptible to bias, and a systematic approach to checking our biases will help us shift back and forth between System One and System Two effectively, allowing for improved clinical reasoning.