Good Thinking: Seven Questions for Denise Cummins
Hear from Denise Cummins about the making of Good Thinking: Seven Powerful Ideas That Influence the Way We Think, and don’t forget to check out this month’s installment of the Cambridge Book Club for additional content and 30% off related titles. As always, we invite you to join the discussion on Facebook, or chime in on Twitter using the hashtag #cambridgeideas.
Why did you decide to write Good Thinking?
Over the course of my teaching career, I became aware of something rather alarming: Science majors know all about hypothesis-testing, philosophy majors know all about argumentation, and business students know about economic theories. But they know very little about research in disciplines outside of their majors. And then these bright and well-educated people are asked to evaluate proposed economic, legal, or medical policies, and sometimes even to vote on them. How are they to do that when there are holes in their knowledge bases where crucial pieces of information should be?
You’re a professor of psychology, but in this book you explore decision-making in many different fields: economics, philosophy, science, etc. How did you go about discovering more about their methods?
Most people equate psychology with psychotherapy. In my first book, The Other Side of Psychology, I introduced audiences to the science of psychology: psychologists who study how we perceive, learn, think, and decide. Experimental psychologists conduct scientific investigations into these questions, and we are well-trained in scientific methodology. Daniel Kahneman is an experimental psychologist who won the Nobel Prize in Economics for his research and theorizing on decision-making. My own training has included cognitive science, philosophy, computer science, and business.
Was there anything that surprised you while you were researching and writing the book? Is there something you learned from it that you’ve incorporated into your own thinking?
The controversy over mammogram screening surprised me: women were paying no attention to the science underlying the recommendations and were instead outraged over what they saw as a corner-cutting decision made at the expense of their health. I incorporated that controversy into the book so that patients would better understand evidence-based medicine, how it is done, and what it means for their own lives and their thinking about their health.
When have you used some of these seven decision-making tools?
I use hypothesis-testing in my research on human reasoning and decision-making. I keep Bayes firmly in mind when evaluating medical screening recommendations or investment choices. And I seek a Nash equilibrium when choosing weekend activities with my husband; that is, I would rather do something together that is not my top choice than do my top choice by myself.
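For readers curious what "keeping Bayes firmly in mind" looks like in the screening case, here is a minimal sketch. The prevalence, sensitivity, and false-positive figures below are illustrative placeholders, not numbers from the book:

```python
# Bayes' rule for a screening test, with illustrative (made-up) numbers:
# a 1% base rate, an 80% true-positive rate, and a 9.6% false-positive rate.
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.01, sensitivity=0.80, false_positive_rate=0.096)
print(f"P(disease | positive test) = {p:.1%}")  # roughly 7.8%
```

The striking part is how small the posterior stays when the base rate is low: even a positive result from a fairly accurate test leaves the probability of disease well under 10% in this toy setup.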
Some of these seven ideas have become hot topics in the last decade, like rational choice or game theory. But you also devote attention to the power of analogy, calling it “the core of cognition.” Why does it have so much power?
The simple answer is that our minds seem to be wired that way. We tend to notice similarities among people and events, and then assume that what is true of one is true of everything that looks or seems the same. This is a very powerful strategy, but it can lead to disastrous consequences (as in stereotyping). The upside is that analogy is a powerful means of helping people understand things, because it lets them see something unfamiliar in familiar terms. For example, when Ben Bernanke persuaded us to approve the bailout of the financial industry, he did so by telling us the banking industry was like an irresponsible neighbor who smoked in bed and set fire to his house in your neighborhood of wooden houses. This was a very powerful and very persuasive analogy. And only time will tell whether it was the right analogy to draw.
You illustrate many times when our brains lead us to different conclusions than experts do. Why are we so prone to error?
One of the messages of the book is that when we reach a different conclusion than experts, we have not necessarily made an error. The market now is awash in books bemoaning human irrationality and stupidity. But I think that vastly undersells human intelligence.
You can see this in the idea of rational choice. Most people don’t appreciate that the core of economic theory is the concept of a rational agent, where “rational” means “self-interested.” When people “make mistakes” in economic game-based research, it is usually because they don’t focus solely on how they will benefit from a course of action but also on how it will affect others. This is counted as “error” according to standard treatments of rational choice. Experimental economists have had to introduce the notions of fairness and inequity aversion into economic theory in order to predict and explain how people actually behave. I don’t think these concepts are evidence of “faulty” thinking, and I am not alone in that.
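The gap between the rational-agent prediction and observed behavior can be sketched in a few lines; the ultimatum-game setup and the 30% rejection threshold below are illustrative assumptions, not results reported in the book:

```python
# Illustrative ultimatum game: a proposer splits a pie, a responder
# accepts or rejects. Rejection leaves both with nothing.
def accepts(offer, pie, fairness_threshold=0.0):
    """Responder accepts a positive offer that meets their fairness
    threshold (expressed as a fraction of the pie). A purely
    self-interested agent has threshold 0."""
    return offer > 0 and offer / pie >= fairness_threshold

pie = 10
# Standard rational-choice prediction: any positive offer is accepted,
# so the proposer should offer the minimum.
print(accepts(1, pie))                          # True
# What experiments tend to find: lowball offers get rejected by
# responders who care about fairness, not just payoff.
print(accepts(1, pie, fairness_threshold=0.3))  # False
```

On the standard account the second responder is making an "error", since rejecting leaves them with nothing; fairness and inequity-aversion models were introduced precisely to predict that rejection.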
Are there ways that we can use our “faulty” reasoning for the better?
The best way to improve decision-making is to be aware of built-in biases that can be disastrous if exploited by others. Politicians often use analogy to persuade us to support their plans because analogy is a very powerful means of making people think about something new in terms of something they already understand. So we can be lured into believing something is simpler than it really is. Be suspicious of simplistic analogies. Because we tend to accept conclusions that jibe with our beliefs, we tend to ignore the quality of the argument leading up to them. This is called belief bias, and we can arm ourselves against it by evaluating the argument, not just its conclusion. Finally, we avoid risk when there is a chance to gain money, but we become risk-seeking when there is a chance that we will lose money. If you know that, then you are less likely to, for example, hold on to a bad investment hoping it will regain value (the risky choice) rather than cut your losses short (the safer choice).
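That gain/loss asymmetry can be made concrete with a prospect-theory-style value function. The curvature and loss-aversion parameters below are the commonly cited Kahneman–Tversky estimates, used here purely as an illustration rather than as figures from the book:

```python
# A sketch of a prospect-theory value function: concave for gains,
# convex and steeper for losses (alpha = 0.88, loss aversion = 2.25
# are the oft-quoted Kahneman & Tversky parameter estimates).
def value(x, alpha=0.88, loss_aversion=2.25):
    return x**alpha if x >= 0 else -loss_aversion * (-x)**alpha

# Gains: a sure $50 feels better than a 50% shot at $100,
# even though the expected dollar amounts are identical.
assert value(50) > 0.5 * value(100)   # risk-averse for gains

# Losses: a 50% shot at losing $100 feels better than a sure $50 loss,
# which is exactly the pull toward holding a sinking investment.
assert 0.5 * value(-100) > value(-50) # risk-seeking for losses
```

The two assertions capture the reversal in one place: the same person who plays it safe with gains will gamble to avoid a certain loss.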