We base most important decisions on our beliefs about the likelihood of uncertain events such as the outcome of an election, the guilt of a defendant, or the future success of a stock. We typically express these beliefs with statements like “I think that,” “Chances are,” and “It’s unlikely that.”
What determines such beliefs?
How Do People Evaluate the Likelihood of an Uncertain Event or the Value of an Inexact Quantity?
Research tells us that people rely on a limited number of heuristic principles to reduce the complex tasks of assessing likelihoods and predicting the future. Heuristics, our mental shortcuts, allow us to solve problems and make judgments quickly and efficiently. These rule-of-thumb strategies shorten decision-making time and allow us to function without constantly stopping to think about our next course of action. Usually, we find these mental operations useful, but they can lead to problems if we rely on them too heavily. A better understanding of these heuristics and the biases to which they lead can improve judgment in situations of uncertainty—especially when that uncertainty involves risk, as it usually does.
When evaluating possibilities, therefore, we do well to treat the status quo as just one option among several, not as the default. We put aside biases and improve the effectiveness of decisions when we ask: “If we weren’t already doing this, would we now choose this alternative?” Often, we exaggerate the risk that selecting something else would entail, or we magnify the desirability of staying the course, forgetting that the future may well present something different.
Our fears, perceptions, and biases tell one story, but the facts tell a different one. As it turns out, the world, for all its imperfections, is in better shape than we might think, even though we are struggling to emerge from a pandemic. We have real problems, but when we spend our energies worrying about the future or feeling guilty about the past, we lose our focus and exhaust our abilities to solve problems, make high-caliber decisions, and take the necessary risks to grow and change.
We can effect change if we are willing to take a chance on our ideas.
It won’t happen automatically, however. Five psychological forces influence high-stakes decision-making: beliefs, cognition, emotions, motivation, and resilience. When we leverage all five, we develop a Disruptive Mindset™, one that builds resilience by recognizing that challenges aren’t permanent, that talented people can figure things out, and that even failure isn’t fatal.
This mindset allows us to learn from past mistakes so we can move past them. But our minds don’t always work the way we think they do. We think we see ourselves as we really are and the world as it is, but we’re missing quite a bit.
The Invisible Gorilla
In their book The Invisible Gorilla, Christopher Chabris and Daniel Simons explained that intense focus on a task can make people effectively blind, even to stimuli that normally attract attention. To test their hypothesis, they constructed a short film of two teams passing basketballs, one team wearing white shirts, the other wearing black. Viewers of the film were instructed to count the number of passes made by the white team, ignoring the black-shirted players. See for yourself by searching “selective attention test” on YouTube before you read further.
Halfway through the video, a woman wearing a gorilla suit appeared, crossed the court, thumped her chest, and moved on. The gorilla was in view for nine seconds. Many thousands of people have watched the video, but about half of them fail to spot the gorilla. The instructions to count passes and ignore the black-shirted players induce this blindness.
The authors noted that the most surprising aspect of the experiment was that viewers who failed to see the gorilla were initially sure it hadn’t been there. People who watch the video without the instructions don’t miss the gorilla, so they wonder how anyone could.
The experiment demonstrates two things: we can be blind to the obvious, and we are also blind to our blindness.
Opportunism or Dereliction of Duty?
Nothing feels as painful as staying stuck where you don’t belong, yet many people, not realizing they’re blind, confuse opportunism with dereliction of duty. When we don’t prepare for the unexpected, we can’t move quickly when surprises occur.
Remember, you don’t have to be faster than the bear. But you must be faster than the others in the jungle.