The Book of Beautiful Questions excerpt
The Book of Beautiful Questions focuses on four key areas: How to Decide, Create, Connect, and Lead.
This excerpt is from the “Decide” section of the book.
How to question your own decisions
Don’t just “go with your gut.” When faced with important choices, asking a few key questions can lead to better results.
Every day we are confronted with questions that cry out for a decision. Some of the questions are relatively insignificant: What should I have for breakfast? Shall I read this news article, or skip to the next? Others are more important: Should I take on that new project? Talk to my boss about a problem that has come up at work? Is it time for our family to start looking for a new house?
It is perfectly natural, and reasonable, to want to answer such questions as soon as possible. Why waste time overthinking it? Might as well choose what feels right in the moment or, to put it another way, “go with your gut.”
But a growing body of research has concluded that our instincts—our natural tendencies to think or react in certain ways when faced with a decision—aren’t as trustworthy as we might believe. We’re subject to inherent biases, false confidence, irrational risk aversion, and any number of decision-making pitfalls. As the psychologist and decision-making expert Daniel Levitin puts it, if you make decisions based on instinct, “your gut is going to be wrong more than it is right.”
So what can we do about that? When it comes to important decisions, we can put less trust in feelings and more in evidence. We can seek input from outside sources and differing perspectives—to try to see past our own biases and limited views. We can generate more options to choose from when making a decision (which experts say is a key factor in arriving at better decisions).
But we can’t do any of that unless we’re willing to think about—and ask questions about—the decisions we make, as we’re making them.
Think of your innate questioning skills as a flashlight, and the decision ahead of you as a dark room. Each question you ask yourself illuminates a new area—and the better the question, the more light it casts. Asking yourself questions such as, What am I really trying to decide here? What’s most important? What critical information do I have and not have? gradually enables you to see a little more clearly, and move forward in the midst of uncertainty.
It isn’t easy to slow down, think more, and spend time gathering and weighing evidence before deciding. “It’s going against evolution,” Levitin notes; it seems we are wired to make quick, instinctive decisions. Blame it on our jungle ancestors. Our skill sets developed to make quick judgment calls based on limited information—such as a rustling in the leaves. We’re also a bit lazy, cognitively speaking. In their research on decision-making habits, the professors Katherine Milkman, Jack Soll, and John Payne concluded that humans resort to snap judgments because “we’re cognitive misers—we don’t like to spend our mental energy entertaining uncertainties.”
But unlike our ancestors, who often had to rely on hunches to survive the jungle, we have more time and a wealth of information (too much, it sometimes seems) at our fingertips that can help inform our decisions. If we don’t use the available time and tools to make better decisions, then that, in itself, is a decision—and not a good one.
When we make snap judgments, we’re relying on a limited or distorted view of a situation, while thinking we have a more complete and accurate view. Based on his research into this phenomenon, the psychologist Daniel Kahneman developed a name (and acronym) for it: “What you see is all there is,” or WYSIATI. We form a story in our heads based on what little we know, without allowing for all we do not know, Kahneman explains.
Interestingly, Kahneman’s research found that some people are able to make good snap judgments in certain situations, but only because they know more than most about that particular situation (usually based on past experience). Kahneman points out that a chess master, for instance, may have reliable gut instinct when deciding on a move—because there is so much experience from similar past decisions to draw upon.
Most of us aren’t like the chess master—though we may think we are. “Overconfidence arises because people are often blind to their own blindness,” Kahneman writes. They “sincerely believe they have expertise, act as experts and look like experts.” But all the while, “they may be in the grip of an illusion.”
Rather than ask, Should I trust my gut instincts?, the better question is, How can I override those instincts?
And this is where questioning becomes important. As you’re making decisions, you can attempt to override instincts and become less “blind to your own blindness” by asking more questions. If, as Kahneman suggests, we make poor decisions because of our limited field of view, then what if we could open up a wider view—using our questioning flashlight to do so?
The first thing to do with that flashlight is turn it on yourself. The path to better decision-making begins by questioning one’s own beliefs, biases, and assumptions. It’s something people rarely do—and it’s certainly not easy to do (some biases are likely to remain invisible to us no matter how hard we search for them). It may be more difficult than ever in these “echo-chamber” times. If one is predisposed to believe something or hold a certain view, it is easier today to seek out information that confirms that view while avoiding information that challenges it.
When the Nobel Prize–winning physicist Arno Penzias was asked what led to his success, he explained that he made a daily habit of asking what he called “the jugular question.” Penzias said, “The first thing I do each morning is ask myself, ‘Why do I strongly believe what I believe?’” Penzias felt it was critical to “constantly examine your own assumptions.” And this is important to do whenever making decisions—because our assumptions and preconceived notions can greatly influence decisions (assumptions, and the tendency to want to confirm them, are among the most perilous “decision traps,” according to research).
To take a more holistic view of your own assumptions about a particular issue you’re deciding on, break Penzias’ jugular question into three parts—the What, the Why, and the What if. The first part involves simply trying to identify some of your biases or assumptions. The initial question to ask yourself would be: What am I inclined to believe about this particular issue? For example, if I am trying to decide on a job offer in Seattle, this initial question can help ferret out feelings and assumptions I might have about Seattle, about the challenges of changing jobs, about moving to a new location, and so forth.
Moving from What to Why, we return to Penzias’ original question—which tries to get at the basis for whatever feelings or beliefs you might have on this subject. By thinking about this (and perhaps researching or talking to others about it), we can begin to see if the belief or gut feeling holds up to scrutiny. We may realize that it has little evidence to support it. It may be a viewpoint that made sense once but not anymore (this is such a common problem that the author Daniel Pink recommends regularly asking, What did I once believe that is no longer true?).
In questioning why you believe what you believe, don’t overlook the “desirability bias,” which, researchers are finding, is quite powerful (perhaps even stronger than the much-discussed confirmation bias). To figure out what your desirability bias is on any given issue, ask yourself this simple question: What would I like to be true? Just because you’d like something to be true doesn’t mean it is. Wishful thinking can get in the way of critical thinking.
After considering what and why, move to what if—as in, What if my beliefs or assumptions on this issue are just plain wrong? In exploring this possibility, there’s a simple and effective strategy you can use: think of whatever you believe about a particular issue, and then consider the possibility that the opposite might be true. Richard Larrick, a Duke University professor and a leading researcher on the subject of “debiasing,” says the “consider the opposite” approach “consists of nothing more than asking oneself, ‘What are some reasons that my initial judgment might be wrong?’” Larrick says it works because “it directs attention to contrary evidence that would otherwise not be considered.”
All of which means that there is at least some scientific basis for the “Opposite George” strategy once employed by the Seinfeld character George Costanza. In a 1994 episode of the show, George (with advice from Jerry) has an epiphany: Since his gut instincts had always seemed to lead him astray in the past, he decides that, henceforth, he will do the opposite of whatever he’s inclined to do in a given situation—in other words, let “Opposite George” take over.
In the show, automatically following the “opposite” approach works wonders for George’s dating life and career. But in a real-life situation, the “consider the opposite” strategy is not meant to provide a clear and reliable solution; rather, it’s designed to open up your thinking to possibilities beyond your first instinct. The opposite choice might turn out to be a good option, but it could also show you that your first instinct was correct—or, perhaps, you’ll realize the best path lies somewhere in between.
Am I thinking like a soldier or a scout?
To question your own thinking—so that you can make room for other ideas and views that might conflict with yours—you must be “humble enough to admit that you don’t know something or that you might be wrong about what you think you know,” says Levitin. This goes against a natural tendency in many people to defend what they believe. When you’re trying to consider multiple perspectives, evaluate evidence, and make thoughtful decisions, that defensiveness can get in the way.
To illustrate this point, Julia Galef, co-founder of the Center for Applied Rationality, offers up a clarifying metaphor in the form of a beautiful question. Galef suggests we ask ourselves this question: “Am I a soldier or a scout?” She explains that there is a very different mindset for a soldier as opposed to a scout. A soldier’s job is to protect and defend against the enemy, whereas the scout’s job is to seek out and understand. These two distinct attitudes can also be applied to the ways in which all of us process information and ideas in our daily lives. “Making good decisions is largely about which mindset you’re in,” Galef says.
The mindset of a scout (or any type of explorer) is rooted in curiosity. “Scouts are more likely to say they feel pleasure when they learn new information or solve a puzzle,” Galef says. “They’re more likely to feel intrigued when they encounter something that contradicts their expectations. And scouts are grounded: their self-worth as a person isn’t tied to how right or wrong they are about any particular topic.”
In other words, scouts have “intellectual humility,” to use a term popularized in the past few years by a number of articles, blog posts, and books (and by the Google executive Laszlo Bock, who publicly announced that intellectual humility is one quality the company looks for when hiring).
Defined as “a state of openness to new ideas, a willingness to be receptive to new sources of evidence,” intellectual humility is seen by one of its champions, the author and University of Virginia professor Edward Hess, as the key to thriving in the days ahead.
We can’t compete with artificial intelligence unless we humans keep learning, experimenting, creating, and adapting, Hess says. And we can’t do any of that unless we assume the lifelong role of humble inquirer. As Hess declares in the title of his book, “Humility Is the New Smart.”
Though humility is often associated with meekness, Hess says we should think of it as “being open to the world.” In that regard, it can be seen as courageous because “I’ve got to overcome my reflexive ways of thinking—my ego, my fears, my fight-or-flight responses.” If we embrace intellectual humility, he believes, it can help with everything from innovation to civil discourse because “it’s no longer about who’s right—it’s about what is accurate.”
Overcoming the urge to “be right” takes a conscious effort, it seems. The venture capitalist Christopher Schroeder says he uses the following question to remind himself to keep an open mind: Would I rather be right… or would I rather understand?
“If you’re adamant about being right,” Schroeder says, “you lock yourself in your own echo chamber—and that can cause you to make bad decisions.”
The “need to be right” can affect all kinds of decisions, as well as personal relationships—it can keep arguments and feuds going far too long. There’s no doubt that pride plays a big part in all of this: It feels good to think you’re in the right, and to be told by like-minded others that, yes, you’re right and you’ve been right all along. But it doesn’t do much to improve learning, understanding, decision-making, or progress in general.
“If we really want to improve our judgment as individuals and as societies,” Galef says, we should endeavor to change the way we feel about being right—and being proven wrong. “We may need to learn how to feel proud instead of ashamed when we notice we might have been wrong about something, or to learn how to feel intrigued instead of defensive when we encounter some information that contradicts our beliefs,” she says. Galef has her own version of the “need to be right” question; she advises people to ask themselves, “What do you most yearn for—to defend your own beliefs or to see the world as clearly as you can?”
If you can commit to striving for the latter, then you’re in a position to begin making decisions with a more open and informed mind.