We all know that human beings can be biased in how they make decisions and understand messages. The purpose of this article is to point out a few biases that might be expected to affect a person’s understanding of messages in the media. This article is a sidebar for 2 Books on How Thinking Shapes Measurement: ‘Thinking Fast and Slow’ and ‘The Undoing Project.’
The practice of communications measurement uses research and analysis to improve organizations’ communications. A major tool in the measurement quiver is media analysis, the study and interpretation of media messages and content. We use data about content to infer what the public understands from the media.
That part about “infer” is tricky. It turns out there are many things that can affect a consumer’s understanding of what they experience in the media. There are many reasons why one person’s experience of a newspaper or social media post is different from another person’s.
Measurement and evaluation sometimes assume that human beings are empty boxes to be filled with impressions and messages. To assume that communication ends with exposure to a message makes research easier, but often less accurate. Here in The Measurement Standard we have written about this before. “Public Relations Measurement’s Empty Head” notes that people are not empty vessels to be filled. They take an active role in accepting or rejecting the messages they are exposed to in the media.
And it’s more complex than that. People unconsciously filter and interpret what they experience as well. They are easily biased and sometimes irrational in their understanding. You’ll find few better introductions to the psychology of this than Daniel Kahneman’s book “Thinking Fast and Slow,” a survey of the cognitive psychological literature on the way people experience the world and make decisions. You can read our review of the book here. One of Kahneman’s major points is that normal people often use intuitive shortcuts and rules of thumb, known as heuristics, to make decisions, thereby introducing systematic errors, or biases, in their predictions, decisions, and understanding. As a sidebar to our review, and to point out some of the limitations of measurement’s common approaches to understanding how media affects people, we’ve gathered seven results from “Thinking Fast and Slow” that illustrate some of the built-in biases, errors, and inconsistencies in human thinking. Enough to suggest that, if we wish to improve communications, then we need to understand how people process and understand the messages that they are exposed to.
Caveat: The list below is a sample from one book, deliberately chosen to provoke the reader to think about how people experience the media. Beware of over-simplified psychological results. As Kahneman himself says, “Many psychological phenomena can be demonstrated experimentally, but few can actually be measured.” The results noted below, although they have been studied and replicated in various ways, seem like they should be important, but may or may not have actual significance for real-life media consumption.
(And if you’d like to read some research conducted specifically to explore cognitive effects on message communication, start with the Institute for Public Relations’ Behavioral Insights Research Center. See, for instance, Terry Flynn’s paper “How Narratives Can Reduce Resistance and Change Attitudes: Insights From Behavioral Science Can Enhance Public Relations Research and Practice.”)
#1. An attractive package makes us unduly appreciate what’s inside.
It’s called the halo effect. If we like a spokesperson’s voice and appearance, then we will be more likely to like what they are presenting to us as well. And vice versa. Similarly, what appears first in a list biases our experience of what appears later. The implication for media analysis is that a message will be interpreted differently depending on its context.
#2. The more often or easily we think of something, the more readily we think it’s important or true.
People tend to assess the relative importance of things based on the ease with which they think of them. The more frequently a falsehood is repeated, the more likely someone is to believe it is true. This is called the availability heuristic. A variant of it is the mere exposure effect: the more frequently a word is perceived, the more people like the word. Frequently mentioned topics tend to be ranked as more important than less frequently mentioned topics. That’s one reason why most communications programs want more mentions: the more frequently people experience a message, the more important it seems, regardless of its actual nature or the relative importance of other messages. When the media provide abundant coverage of terrorism deaths in the U.S., media consumers tend to think terrorism deaths are a bigger problem than they actually are, which, in turn, leads the media to cover them more often.
#3. We let our likes and dislikes determine our beliefs about the world.
Our judgments and decisions are sometimes guided by our emotions rather than rational deliberation. This, the affect heuristic, means that the arguments we find compelling are biased by our existing beliefs. If, for instance, we already like a particular brand or political candidate, our judgment of new information about them is skewed toward the positive. And toward the negative if we don’t like them.
#4. Our ideas affect our actions, and vice versa.
Ideas, whether conscious or not, can affect our actions. Likewise, our actions can affect how we experience ideas. It’s called the priming effect. Experimental subjects are more likely to agree with a message if they are nodding their heads when they hear it, as compared to when they are shaking their heads side to side. If a polling place is in a school, then voters will be more likely to vote in support of school funding. People presented with financial imagery unconsciously act with greater independence and selfishness. People who have lied verbally will prefer mouthwash over soap, and people who have lied in an email will prefer soap over mouthwash.
#5. The more easily we can read it, the more readily we believe it’s true.
A false statement is more likely to be believed true if it is easier to read. It’s called the illusion of truth. A message, for instance, will be more often believed if it is printed in bolder type, or on paper that provides better contrast. Likewise, a simpler or more memorable statement will be more readily believed as the truth. Or, expressed in the medium of our times, “It’s going to be amazing. True!” Next time you write a press release, remember that a source you quote will be more readily believed if the name is easier to pronounce.
#6. Our judgment can be influenced by obviously irrelevant information.
If you are asked whether the height of the tallest redwood is more or less than X feet, then your subsequent estimate will be influenced by the number X. Taller if X is larger, shorter if X is smaller. This anchoring effect is well known by sellers of real estate, artwork, and most anything else: The higher your initial asking price, the higher your eventual sale price. This very robust effect biases the judgment of pros and amateurs alike.
#7. We jump to conclusions based on limited data.
It is important for us human beings to make up our minds about something, even if we have very limited data. This is often a valuable characteristic, but it can also lead to the acceptance of one-sided arguments and stereotypes. Kahneman calls this “What You See Is All There Is” (WYSIATI), and uses it to explain a number of biases of judgment and choice.