
6 Measurement Pros Explore the Value of Failure: Everybody Wants to Learn, but Nobody Likes to Make a Mistake

A fundamental limitation of measurement is our normal human reluctance to admit failure. To learn what does work, it is vital to expose what doesn’t. The fear of failure lies behind the PR world’s broad resistance to measurement, and it motivates faulty techniques like AVEs and vanity metrics. In this discussion, six top measurement thinkers explore the value of failure: Richard Bagnall, David Geddes, Chip Griffin, Jim Macnamara, Tina McCorkindale, and Mark Weiner.

By Bill Paarlberg—Half my life ago I went to work as a measurement newbie. With the cocky optimism of youth and a graduate degree in cognitive psychology, I was fired up to show the world of communications measurement how to do proper research.

I was so naive.

One of my first meetings at my new job was with the Director of Research. I’d already reviewed some reports that looked a little hinky, and so I posed some very pointed questions. Mr. Director had to set me straight: “Look, Bill, every project we do has someone’s job or budget or career on the line. We do the research, and people have to use, and sometimes suffer, the results. We have to take that into account when we write our reports.”

That burst my bubble. I was appalled to learn that real-life measurement wasn’t always just about doing proper research. Business is business, and sometimes the measurement business is about doing research that makes the client look good.

Everybody says communications measurement is about learning what works and what doesn’t. But how often are we brave enough to make the mistakes necessary to learn?

So I asked a few longtime measurement pros to comment. To start, Richard Bagnall, CEO of PRIME Research UK and incoming Chair of AMEC, provides a perfect anecdote:

One of my favourite stories is actually something that happened to my friend, colleague, and opposite number in the USA, Mark Weiner [Chief Executive Officer of PRIME Research]. Mark tells how he’d always assumed that his clients had measurable objectives and that they simply chose not to share them with him. Then, the CEO of a well-known mid-sized New York agency told him, “I’d rather forego being a proven success in exchange for never being proven a failure.” That confession changed the way Mark viewed the public relations and research process forever.

The ideal vs. the reality: Does ego trump research?

The thing is, there’s doing the research, and then there’s using the results. And the human research environment means that sometimes you get one but not the other.

Communications measurement is, ideally, an iterative process of learning and improvement. We gather data about the effectiveness of our efforts, wrestle out some insight about what works and what doesn’t, make changes accordingly, and then go ’round again: gather more data, look for insight, make changes, and so on. The purpose is to learn and improve.

But the beautiful logic of this improvement cycle fails to account for the reluctance of many to acknowledge errors. As Tina McCorkindale, President and CEO of the Institute for Public Relations, points out:

While we share quotes and stories of how failure makes you stronger and tells you what doesn’t work so you become a better and stronger person, in reality, we are not very tolerant of failure or mistakes.

Making errors is normal and necessary

Apropos, this morning my email included the following book excerpt:

“Error followed by correction and instruction is the fundamental process of schooling. You get it wrong, and then you get it right. If getting it wrong and then getting it right is normal, teachers should Normalize Error and respond to both parts of this sequence as if they were totally and completely normal. After all, they are.” —From “Teach Like a Champion: 49 Techniques that Put Students on the Path to College” by Doug Lemov (thanks to the delightful Delanceyplace.com)

It’s human nature to not want to be wrong. And especially to not want to be seen making a mistake. Whether it’s in the classroom or in a measurement report, failure carries a stigma. Maybe you aren’t learning your ABCs. Or you aren’t doing your job. It’s a matter of ego and image. And sometimes budget or career.

But, as Mr. Lemov says, making errors is a normal part of making progress. Wasn’t that Charles Darwin’s big point? It’s not just that you can’t win them all, it’s more that you’re not supposed to win them all. You can’t improve without discovering what doesn’t work.

Says Mark Weiner:

It’s human nature to be curious, to seek success, and to avoid risk. Given these contradictions, it’s no wonder why learning is one of life’s most difficult tasks. Mistakes are some of the most powerful—if difficult—learning opportunities available to us. Life isn’t about winning or losing, it’s about winning or learning. Public relations is no different.

Embrace failure

There are people who accept and even embrace the prospect of failure. Chip Griffin, CARMA’s CEO, North America, says:

Don’t fear failure, learn from it. Whether your own mistakes or those of others, there’s lots to learn. Unless you are extraordinarily lucky, you will experience failure… Treat it as an education rather than a disaster and you’ll be that much stronger for it… The only real failure is not learning from your past mistakes.

Here are 17 inspirational quotes on the value of failure, from everyone from Aristotle to Donald Trump. Read them and you just might convince yourself that the terribly bad day you are having is actually progress. (You’ll notice that these quotes about failure generally come from people who have become so successful that they no longer have to worry about fessing up to a few mistakes.) The subtext—now venturing into the wide world of sports metaphors—is that you won’t win many without losing some. The thrill of victory and the agony of defeat are flip sides of the same coin of progress.

Yet communications measurement often suffers from a refusal to embrace the value of failure. It’s not just that agencies or vendors sometimes sweep bad news under the rug, or that unflattering reports get conveniently buried in a back room somewhere. A refusal to embrace failure is sometimes built into why we do measurement. We all know that measurement is often done with the goal of justifying someone’s decisions or existence. And you can catch a rotten whiff of that motive in such self-aggrandizing techniques as AVEs, multipliers, and inflated impression counts; more is better, so lots more is lots better.

Here’s Tina McCorkindale again:

One part seems to be lack of education about appropriate research techniques: sometimes we like to chase the shiny red ball because it has a bigger wow factor, rather than choosing the best measurement process that is both valid (measures what we intend it to measure) and helps us achieve our overall goals and objectives. Also, we get caught up in these unrealistic numbers, and have that big number bias (300 million impressions? That’s fantastic!). We engage in “success theater,” as Gary Sheffer calls it.

Mistakes are worth the risk

So can we adjust the way we do measurement to more readily learn from failure and remove its stigma? Doug Lemov suggests, in an academic setting:

…if wrong answers are truly a normal and healthy part of the learning process, they don’t need much narration at all… It’s better, in fact, to avoid spending a lot of time talking about wrongness and get down to the work of fixing it as quickly as possible…

In the measurement world, Chip Griffin points out that measurement programs are, by their nature, complex and difficult to design. So learning from failure is vital:

With measurement, there’s no single best approach, so you need to find the one that works best for your organization.

There are some creative techniques for identifying and learning from mistakes. Here’s Mark Weiner again:

“When Delahaye and Medialink became one company in 1999, I became aware of a strange Delahaye custom, instituted by Delahaye CEO Katie Paine, of celebrating mistakes at monthly meetings. I, like so many executives, viewed performance within a narrow framework which rewarded success but which inadvertently discouraged innovation, growth, and experimentation. I learned then that mistakes are worth the risk and the lessons we learn should be shared and even celebrated.”

Looking for a learning experience

The world of manufacturing has been working on techniques to learn from failure for decades, as David Geddes, Principal at Geddes Analytics LLC, points out:

When I was at Sprint in the early 1990s, the company had a resolute focus on total quality management. The “14 Points for Total Quality Management” of the great W. Edwards Deming were a constant source of reference, and, indeed, apply to our measurement domain today.

At Sprint, and other companies driving to catch up with the Japanese on quality, we were relentlessly seeking out opportunities to improve, that is, looking for areas where performance was the weakest (Deming’s Point #3), identifying internal and external customers, developing process metrics, and turning the measurement information into action plans. This had to be a relentless company-wide process, driven from the top (Points #1 and #2). Failure itself was not the issue; failure to develop action plans was the issue.

One of the most important, Point #6, “Drive out fear,” directly addresses the need for courage to allocate time to the areas of weakest performance. Think about it. We generally know when we succeed. We see the media hits. We see the attendance at a museum exhibit. But we have to dig to find out the whys of our weak performances.

But what about in the world of measurement? Here at The Measurement Standard we devote a lot of time and energy to the nuts and bolts and best practices of doing research. Yet we devote little attention to how to finesse the awkwardness of failure and just get on with learning. The same is true for most publications on measurement and evaluation.

Replies Jim Macnamara, Professor of Public Communication and Associate Dean at the University of Technology, Sydney:

Yes, it’s true. The published case studies in public relations and evaluation reports overwhelmingly sing a song of praise. It is hard to find information about what does not work. At the 2016 AMEC Summit on Measurement in London I presented two case studies: one that was very successful, and one that resulted in a campaign being pulled and some negative feedback. This was seen as very unusual, but was welcomed by delegates.

The key thing… is that evaluation should provide learning to inform future strategy and tactics. And learning means insights into what works and what doesn’t work… The industry needs to be a little more open-minded and less defensive. Every day is a learning experience.

And isn’t that really the whole goal of measurement: to have “a learning experience”?

As a final thought, here’s another amusing and spot-on story from Richard Bagnall:

The points you have made and the comments you have received all speak to the lack of planning and objective setting that is too common in PR still. It reminds me of a time a few years ago when I took a call late at night in the office. A PR account manager from a large PR agency was on the phone.

“Can you help me?” he asked. “I need my campaign measured.”

“Sure,” I said. “Tell me a bit more about the campaign and what you’re after.”

“I need a report on 300 clips by next week.”

“OK,” I said, “that’s manageable. Let’s start to understand your brief… What was it that you were trying to achieve with the campaign?”

Back came the answer: “Get press coverage. Beyond that I don’t know: isn’t that what your measurement report is going to tell me?”

###

Bill Paarlberg

Bill Paarlberg co-founded The Measurement Standard in 2002 and was its editor until July 2017. He also edits The Measurement Advisor newsletter. He is editor of the award-winning "Measuring the Networked Nonprofit" by Beth Kanter and Katie Paine, and editor of two other books on measurement by Katie Paine, "Measure What Matters" and "Measuring Public Relationships." Visit Bill Paarlberg's page on LinkedIn.