This is a review of two books that deal with human thought processes related to communications measurement and evaluation. For a sidebar to this article, see “7 Psychological Reasons Why Media Analysis Tells Only Part of the Measurement Story.”
Psychologists Daniel Kahneman and Amos Tversky asked the following question of a great many experimental subjects:
If a random word is taken from an English text, is it more likely that:
1. The word starts with the letter K, or that
2. the letter K is the third letter of the word?
If you, like most of the thousands of economists, statisticians, and students who have considered it, answered 1, then congrats. You’re wrong. There are actually about three times as many English words with K as the third letter as with K as the first. But most people find it easier to think of words that begin with K, so they jump to the wrong conclusion based on the easier answer. It’s an example of the availability heuristic, a common mental shortcut that leads to fairly predictable errors in judgment. The question above is just one of many that the two researchers used to study how humans err in their decisions.
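If you want to test the claim rather than trust your intuition, a few lines of code will do it. This is a minimal sketch: the tally function is general, but the sample list below is invented purely for illustration; point it at a real word list (such as `/usr/share/dict/words`) to check the roughly three-to-one ratio yourself.

```python
# Tally words that start with "k" versus words with "k" as the third
# letter. The sample list is a placeholder; run this against a real
# dictionary file to verify Kahneman and Tversky's three-to-one claim.

def k_position_tally(words):
    first = sum(1 for w in words if len(w) >= 1 and w[0].lower() == "k")
    third = sum(1 for w in words if len(w) >= 3 and w[2].lower() == "k")
    return first, third

sample = ["kite", "king", "take", "like", "bike", "acre"]
first, third = k_position_tally(sample)
print(f"first-letter k: {first}, third-letter k: {third}")
```

Even on this toy list, third-position Ks outnumber first-position ones, yet the first-letter words are the ones that spring to mind.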
Daniel Kahneman’s Thinking, Fast and Slow and Michael Lewis’ The Undoing Project both explore the quirks, biases, and irrationalities of human decisions and predictions. Thinking, Fast and Slow is a research-based summation of the state of cognitive psychology seen through Kahneman’s theoretical perspective. The Undoing Project is the broader story of the amazingly productive friendship between Kahneman and Tversky. Kahneman was awarded the Nobel Prize in 2002; it would have gone to both of them had Tversky still been alive.
Both books cover a lot of ground, both are very readable, and both are bestsellers. Both have received many glowing reviews: for “Thinking, Fast and Slow” see The New York Times, Scientific American, and The Guardian; for “The Undoing Project” see Forbes, The Economist, and The New Yorker. I encourage you to go out and read them yourself.
These books are important for two reasons. First, they focus on some of the fundamental disconnects of our current political climate: they speak to the techniques of persuasion and propaganda that are the essence of our post-truth times. Second, they concern the mission of our particular field of communications measurement. The goal of measurement is to use data in a logical way to enable decisions that avoid the irrationalities of human decision-making. When do humans base their thinking on facts and reality, and when is that thinking irrational? And why? How can we avoid such errors?
A little background. Economists and psychologists in the last century based their theories of human decision-making and learning on the assumption that human beings were logical and rational. No theory made more scientific hay with this than behaviorism. From Pavlov’s dog to the Skinner Box, the relatively straightforward concept that reward-based learning was the basis of behavior promised an easily quantifiable approach to research.
But by the 1950s, researchers came to understand that people actually think while making up their minds. It turns out that “What goes on in there?” was a valid and vital question. Early cognitive researchers did their best to prove that human brains worked like logical, rational computing machines. It soon became clear that what was really interesting was when our brains did not work like computers. The errors we make tell us a lot about how our brains work.
That’s where Kahneman and Tversky come in. Their fruitful friendship and prolific research laid the foundation for what we now call behavioral economics, which—just in case you’re not acquainted with Freakonomics (read our review of the book)—is the study of the various interesting and often bizarre ways in which we humans fail to be rational in our judgments, predictions, and economic decisions. In short, it’s a big reason why the field of measurement and evaluation exists. If people weren’t biased, pig-headed, and ego-driven in their decision making, we’d need to do a lot less measurement and data-driven evaluation of the effectiveness of communications. The data of measurement helps us avoid our irrationalities.
Both these books review the surprising number of biased and irrational ways in which humans think. We draw sweeping conclusions from tiny samples and see cause and effect where none exists (the law of small numbers). Our judgment is biased by whatever comes to mind most easily (availability) and by seemingly irrelevant context (prospect theory and framing). People often respond to probabilities with emotion rather than rationality, which underlies the popularity of both insurance and lottery tickets. See the sidebar article “7 Psychological Reasons Why Media Analysis Tells Only Part of the Measurement Story” for more examples that can be expected to affect media analysis in particular.
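The insurance-and-lottery point has a precise form in prospect theory: people overweight small probabilities and underweight near-certain ones. A minimal sketch of the probability weighting function from Tversky and Kahneman’s 1992 cumulative prospect theory paper makes the distortion visible (the parameter value 0.61 is their median estimate for gains; this is an illustration, not a model of any particular person):

```python
# Prospect theory's probability weighting function (Tversky & Kahneman,
# 1992): w(p) = p^g / (p^g + (1-p)^g)^(1/g), with g ~ 0.61.
# Tiny probabilities get inflated (why a lottery ticket feels worth it);
# near-certainties get deflated (why insuring against a rare loss feels
# worth it too).

def weight(p, gamma=0.61):
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

print(weight(0.001))  # noticeably larger than 0.001
print(weight(0.99))   # noticeably smaller than 0.99
```

A one-in-a-thousand chance “feels” more than ten times likelier than it is, which is exactly the emotional response to probability the books describe.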
One of the most important findings in the study of human judgment is that experts are not very good at it. Over decades of research in dozens of fields, it has been demonstrated again and again that simple decision-making models outperform human experts at expert judgments. Understandably unwelcome to professionals everywhere, this result has strong implications for the future of measurement. You probably think your measurement insight job is secure because of your experience and knowledge of the field. Research suggests that, on the contrary, an algorithm can probably do your job better. There are companies out there right now that are automating data analysis.
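The “simple models” in that research are often startlingly simple. One classic variety, associated with Robyn Dawes’s work on improper linear models, standardizes each predictor and adds them up with equal weights, with no fitting and no expert tuning. A toy sketch, with candidate data invented purely for illustration:

```python
# An equal-weight ("improper") linear model in the spirit of Robyn
# Dawes: convert each predictor to a z-score, sum them with equal
# weights, and rank candidates by the total. The data is invented.
from statistics import mean, stdev

def equal_weight_scores(rows):
    """rows: one list of predictor values per candidate."""
    cols = list(zip(*rows))
    mus = [mean(c) for c in cols]
    sds = [stdev(c) for c in cols]
    return [sum((x - m) / s for x, m, s in zip(row, mus, sds))
            for row in rows]

# Three hypothetical job candidates scored on two predictors
# (say, interview rating and test score).
candidates = [[3.0, 70.0], [4.0, 85.0], [2.0, 60.0]]
scores = equal_weight_scores(candidates)
best = scores.index(max(scores))  # candidate at index 1 ranks highest
```

Models this crude routinely match or beat expert rankings in the studies both books describe, largely because the model applies the same rule every time while the expert does not.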
You can read The Undoing Project and Thinking, Fast and Slow separately or together. And you can read them, well, fast or slow. If you studied psychology or economics, then you will find the detailed discussions fascinating. On the other hand, you can thoroughly enjoy the books by just touching on the highlights. Most of the more interesting cognitive biases in Kahneman are also covered by Lewis, and Lewis is a much faster read; more story, less literature review. If you’re out to impress your boss and co-workers, just leave Kahneman sitting around on your desk. The halo effect is bound to improve your image.
The lesson to take away from both these books is that humans very naturally see cause and effect where none exists. We’d like to think of ourselves as logical and rational, but we often aren’t. Or at least the logic is well hidden. Our biases and apparently irrational decisions have important consequences for our lives and our societies. These are two fabulous books for anyone who wants to understand the science behind our times and our industry.
Bill Paarlberg