By Stuart R. Levine
Published in The Credit Union Times
Emotional intelligence, self-awareness and a commitment to understanding others are traits of a good leader. These leaders make sound decisions based on the best available data. But what if the natural, automatic workings of the mind interfere with these traits? The research of Daniel Kahneman, a psychologist and 2002 Nobel Prize winner in economics, offers useful insights into avoiding the cognitive traps that weaken decision-making.
Kahneman’s best-selling book, “Thinking, Fast and Slow”, provides a simple model describing two systems of the brain. Fast-thinking is impulsive, automatic and intuitive. Slow-thinking is thoughtful, deliberate and deeply analytical. Fast-thinking is a legacy of human evolution with inherent survival advantages: it allowed humans to take rapid action without in-depth analysis. Slow-thinking activates when the mind faces a situation it can’t automatically comprehend, and it involves conscious mental activities such as self-control, choice and deliberate, focused attention. You can improve strategic decision-making and increase self-awareness by appreciating, and avoiding, fast-thinking’s cognitive traps, several of which we discuss here.
Snap judgments are the domain of fast-thinking. We tend to oversimplify our analyses of situations without appreciating that it is happening. Heuristics, which are shortcuts or “rules of thumb”, allow for quick decisions, but we often overuse these helpful processes. With the trap of the substitution heuristic, we answer an easier question instead of the one we need to answer. In recruiting, for example, the tough question, “Will this person be successful in the job?”, which requires significant study of the candidate’s background and record of success, is replaced by the easier question, “Does this person interview well?”
The availability heuristic leads us to overestimate the importance or probability of whatever is most personally relevant, recently heard or vividly remembered. Managers conducting performance appraisals from memory are more likely to recall exceptional instances of an employee’s performance, positive or negative, than general behavior, and they weight those instances more heavily. They also give more weight to performance during the three months before the evaluation than to the previous nine months.
Confirmation bias is our natural, unconscious tendency to seek and rely on information that confirms our beliefs and to downplay or dismiss information that might change our minds. An effective group decision-making tool is to have the person proposing an option argue against it; an opponent of the proposal can in turn, in good faith, argue for it. Moreover, research shows that unconsciously biased decisions can fit an individual’s circumstances rather than benefit the organization as a whole. For example, managers who rotate quickly through positions tend to favor projects with short-term paybacks, even when longer-term projects would create greater value.
In the endowment effect, merely owning something makes it feel more valuable to the owner. In the related loss-aversion effect, people prefer to leave a situation as is rather than risk a loss. Strategists are generally better at identifying the risks of new businesses than at appreciating the risks of failing to change. Analyzing existing businesses, products and operations with the same scrutiny applied to a new investment will help avoid this trap.
We all too often make avoidable statistical mistakes that negatively impact decision-making; even statisticians do. One mistake, base-rate neglect, is judging probability without taking all relevant data into account. This medical-test question illustrates the point: A serious but rare disease affects 1 in 1,000 people. How worried should someone be about a positive result from a 95% accurate, generally administered test? Most would believe they had a 95% chance of having the disease, as did half the Harvard medical students who answered this question. However, because the base rate is so low (1 in 1,000), roughly 50 of every 1,000 healthy people tested will receive false positives alongside the one person who is actually sick. The actual likelihood of having the disease is therefore roughly 2%, and the chance that a positive result is a false positive is about 98%.
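For readers who want to check the arithmetic, here is a minimal sketch of the Bayes’ rule calculation, assuming “95% accurate” means the test both detects 95% of true cases (its sensitivity) and correctly clears 95% of healthy people (its specificity):

```python
# Bayes' rule for the rare-disease test.
# Assumption: "95% accurate" means 95% sensitivity and 95% specificity.

prevalence = 1 / 1000        # base rate: 1 in 1,000 people has the disease
sensitivity = 0.95           # P(positive | disease)
false_positive_rate = 0.05   # P(positive | healthy) = 1 - specificity

# Probability of testing positive at all (law of total probability).
p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)

# P(disease | positive) by Bayes' rule.
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"Chance of disease given a positive test: {p_disease_given_positive:.1%}")
# Prints about 1.9% -- roughly 98% of positive results are false alarms.
```

The intuition is that false positives from the large healthy population swamp the true positives from the tiny sick one.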
Leaders must discern true cause and effect. Kahneman’s 1960s analyses for the Israeli Air Force demonstrated how considering regression toward the mean can help. People generally have an average (mean) level of skill at any given point. Through continuous learning and effective coaching, that skill level should improve over time. There are, however, always differences between the trend line and each individual performance of that skill, some better, some worse, and this variability follows an expected probability distribution. Kahneman helped Air Force instructors realize that the variability in pilot performance from flight to flight followed expected statistical variation: an unusually good or bad flight tended to be followed by a more ordinary one, regardless of what the instructor said. Moreover, to the instructors’ surprise, their flight-by-flight feedback had no real effect. Kahneman’s work helped them take a more insightful, longer-term view of feedback, coaching and training.
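A small simulation makes the instructors’ error concrete; the skill and variability numbers below are hypothetical, chosen only for illustration:

```python
# Regression toward the mean, simulated.
# Hypothetical model: each flight score = stable underlying skill + random noise.
import random

random.seed(42)
skill = 70.0   # a pilot's underlying mean skill (hypothetical units)
noise = 10.0   # flight-to-flight variability around that mean

flights = [random.gauss(skill, noise) for _ in range(10_000)]

# Find flights in the bottom 10% -- the "terrible" flights an instructor
# would have criticized -- and look at the very next flight.
cutoff = sorted(flights)[len(flights) // 10]
next_flights = [flights[i + 1] for i in range(len(flights) - 1)
                if flights[i] <= cutoff]

print(f"Mean score right after a bottom-10% flight: "
      f"{sum(next_flights) / len(next_flights):.1f}")
print(f"Overall mean score: {sum(flights) / len(flights):.1f}")
# Both print roughly 70: a bad flight is typically followed by an
# ordinary one, with or without the instructor's criticism.
```

Because each flight is an independent draw around the same underlying skill, an unusually bad flight is typically followed by a more ordinary one whether or not the instructor intervenes, which is exactly what the instructors misread as the effect of their criticism.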
Kahneman’s research is extremely valuable to leaders as they seek continuous improvement in strategic decision-making. Vigilance about our own cognitive processes can make us better managers, better decision-makers and, indeed, better leaders.