Sometimes we need to react fast and automatically: when we see a large truck speeding towards us as we stand at the edge of the street waiting for a traffic light to change, or when we observe the subtle cues of a very dissatisfied client. At other times, we may find ourselves totally engrossed in the deep work1 of a seemingly intractable problem, and then our thoughts and actions need to proceed at a slower pace.
Daniel Kahneman addresses the systems that the brain uses for this thinking in his 2011 book Thinking, Fast and Slow.2 Kahneman is a psychologist noted for his work on the psychology of judgment and decision making, and in behavioral economics. He received the 2002 Nobel Memorial Prize in Economic Sciences for his work in behavioral economics. Today, he is the Eugene Higgins Professor of Psychology and Professor of Psychology and Public Affairs, Emeritus, at the Woodrow Wilson School of Public and International Affairs at Princeton University.
The key take-away from Thinking, Fast and Slow is that we have two modes of thinking, System 1 which “operates automatically and quickly, with little or no effort and no sense of voluntary control,” and System 2 which “allocates attention to the effortful mental activities that demand it, including complex computations.”
Kahneman2 tells us that “when we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do. Although System 2 believes itself to be where the action is, the automatic System 1 is the hero in the book. I [Kahneman] describe System 1 as effortlessly originating impressions and feelings that are the main sources of the explicit beliefs and deliberate choices of System 2.”
System 2 thinking requires your relatively undivided attention. If you do not pay attention, you will perform less well or not at all. Think about making a complex math calculation or completing a tax form.
Kahneman also wrote that “Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine – usually.
“When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer… System 2 is activated when an event is detected that violates the model of the world that System 1 maintains.”
Ameet Ranadive, Director of Product at Twitter, tells us in his essay, What I Learned from “Thinking Fast and Slow,”3 that “System 1 is continuously creating impressions, intuitions, and judgments based on everything we are sensing. In most cases we just go with the impression or intuition that System 1 generates.” System 1 is heavily influenced by our unconscious drivers, such as in-group/out-group bias, similarity bias, and confirmation bias, which shape our judgment and decision making. We discussed biases in two Tuesday Readings in early March.4
A reasonable question to ask that we did not address in our earlier readings on bias is this: Just how are our biases formed? The simple answer is that we don’t have a good answer. We do know that biases are pervasive; we all have them. We know that our biases do not necessarily align with our declared beliefs, and that they generally favor our own in-group. We also know that our biases are malleable: our brains are exceedingly complex, and we can learn and adopt new associations and unlearn others.
The mental process behind this learning and unlearning is the target of television advertisements that are repeated in every ad block within a program known to be watched by a target audience. By repeating the message, the ad’s sponsor seeks to bias the target audience to take an action favorable to the advertiser (e.g., buying a specific product). When something is repeated enough times, we tend to believe it. Our brains have difficulty distinguishing between familiarity and truth. We are all gullible.
This same approach is used by political organizations, including those involved in the reported Russian interference in the 2016 U.S. Presidential Election, to influence a target audience to vote a favorable way. The audience is selected based on data about individual preferences, beliefs, etc., available from social media.5
The “Cognitive Reflection Test”6,7 devised in 2005 by Shane Frederick, Professor of Marketing, Yale School of Management, provides a set of simple examples illustrating how System 1 and System 2 thinking works. The test has three questions that provide a simple indication as to whether a person is using System 1 or System 2 thinking to answer the question.
I urge you to answer each question as you read it, before you read the comments that immediately follow the test:
- A bat and a ball cost $1.10. The bat costs $1.00 more than the ball. How much does the ball cost? _____¢
- It takes five machines five minutes to make five widgets. How long will it take for 100 machines to make 100 widgets? _____ minutes
- In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to grow and cover the entire lake, how many days will it take for the patch to cover half of the lake? _____ days
What is particularly interesting here is that these questions tempt you, really your brain, to respond using System 1 thinking, that is, your human intuition. And, here, your intuition fails you. Indeed, of the 3,428 university students who completed Frederick’s initial studies, 33% got all three questions wrong and 83% got at least one question wrong. At MIT, 48% of students answered all three questions correctly, the highest percentage for any university surveyed. (Frederick was on the MIT faculty when he conducted this research.)
The intuitive, though incorrect, answers are 1) 10¢, 2) 100 minutes, and 3) 24 days. The correct answers are 1) 5¢, 2) 5 minutes, and 3) 47 days. Frederick observed from the test that “Your intuition is not as good as you think.”
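For readers who want the arithmetic spelled out, here is a short Python sketch (mine, not part of the original essay or Frederick’s test) that works through the deliberate, System 2 reasoning behind each answer:

```python
# 1) Bat and ball: the ball costs b, the bat costs b + 1.00,
#    and together they cost 1.10:
#    b + (b + 1.00) = 1.10  ->  2b = 0.10  ->  b = 0.05
ball = (1.10 - 1.00) / 2
print(f"ball costs {ball * 100:.0f} cents")   # 5 cents, not the intuitive 10

# 2) Widgets: 5 machines make 5 widgets in 5 minutes, so one machine
#    makes one widget in 5 minutes. 100 machines working in parallel
#    make 100 widgets in that same 5 minutes.
minutes_per_widget_per_machine = 5 / (5 / 5)  # 5 minutes per widget
minutes_for_100 = minutes_per_widget_per_machine * (100 / 100)
print(f"{minutes_for_100:.0f} minutes")       # 5, not the intuitive 100

# 3) Lily pads: the patch doubles every day and covers the whole lake
#    on day 48, so it covered half the lake one doubling earlier.
half_covered_day = 48 - 1
print(f"day {half_covered_day}")              # 47, not the intuitive 24
```

The intuitive answers all come from a pattern-matching shortcut (subtracting round numbers, scaling linearly, halving); writing the relationships down explicitly is exactly the kind of effortful check that System 2 performs.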
System 2 thinking when intensely focused on a task can make you effectively blind, even to strong visual stimuli that you would otherwise see. For an example of this effect, you may want to watch the short video, The Invisible Gorilla. Instructions for watching the video ask you to focus on a specific, demanding task (provided at the beginning of the video) while watching it. [Watch now.] We’ve all experienced this phenomenon to some degree in our regular lives. I work with the windows in my study facing a not-that-busy residential street. If I’m focused on my work, I miss people walking their dogs, cars, the school buses, the trash truck, etc. And, if my focus is less, I see them all.
So, what are some of the takeaways from our understanding of System 1 and System 2 thinking:
- It is very easy to fall into a System 1 trap, thinking about an issue that requires detailed analysis in terms of your intuition, impressions, biases or feelings. If there is hard data, it’s time for System 2.
- System 1 thinking often leads us to jump to conclusions, make overly quick judgments, and reach bad decisions. In other words, we can fool ourselves into thinking all the parts fit when they don’t.
- We can be blind to the obvious. (As an example, recall the gorilla video.)
- Our minds choose the familiar instead of the true. People evaluate the relative importance of an issue by the ease with which they can retrieve it from memory. Frequently mentioned topics populate the mind as others slip away from awareness. Similarly, the media chooses to report what they believe is currently on the public’s mind. In an authoritarian regime, pressure can be exerted to drive what is reported.
- Most of the time we rightly go with System 1’s recommendations. However, don’t suppress your brain’s dis-ease, the sense that something isn’t quite right, that something unexpected has popped up, that something needs some critical thought. Putting in the extra time will be worth it.
- System 1 works to produce a coherent, believable story from the available information. This may lead to a what-you-see-is-all-there-is (WYSIATI) conclusion: there is only limited information to work with, and much is missing. You may rely on others’ well-meaning but incorrect judgments and impressions, and not seek out what’s missing.
- It is easy to become overconfident about the future and to let our view of the future directly affect short-term decisions.
- And, a fun fact: choosing bad fonts and harsh colors in a document will trigger System 2 thinking, leading people to work harder and more carefully on it.
- Studying your own weaknesses is extremely difficult. It is far easier to see others’ mistakes than our own.
There are many important ideas here that can, and most likely should, change how you think. My hope is that you’ll latch onto one or two ideas that are particularly important to you at this time and run with them. Then come back later for another two. I believe doing so will pay significant dividends. Just learning to routinely ask for the data and the reasoning associated with a decision may be worth the cost of learning how to think better.
Make it a great week for you and your team!
. . . . jim
Jim Bruce is a Senior Fellow and Executive Coach at MOR Associates, and Professor of Electrical Engineering, Emeritus, and CIO, Emeritus, at the Massachusetts Institute of Technology, Cambridge, MA.
- Cal Newport, Deep Work: Rules for Focused Success in a Distracted World, Grand Central Publishing, January 2016.
- Daniel Kahneman, Thinking, Fast and Slow, Farrar, Straus, and Giroux, LLC, 2011.
- Ameet Ranadive, What I Learned from “Thinking Fast and Slow,” medium.com, February 2017.
- Tuesday Reading, Bias, March 6, 2018, and Mitigating Bias, March 13, 2018.
- Dipayan Ghosh and Ben Scott, Facebook’s New Controversy Shows How Easily Online Political Ads Can Manipulate You, Time, March 19, 2018.
- Shane Frederick, Cognitive Reflection Test, Wikipedia.
- Tara Kadioglu, Why Slow Thinking Wins, The Boston Globe, July 2015.