Advancing communications measurement and evaluation

Interview with Jim Macnamara: The Theory and Practice Behind the New AMEC Integrated Evaluation Framework

The Measurement Standard is pleased to welcome back Jim Macnamara, Professor of Public Communication, University of Technology Sydney. This time we’re talking to him about the new AMEC Integrated Evaluation Framework, for which he provided a theoretical underpinning and evaluation taxonomy. We’ll also find out about his recent move to London and his wins at the 2016 AMEC Awards.

The Measurement Standard: Welcome back, Jim. We understand you’ve been quite busy lately. Tell us something about that.

Jim Macnamara: My wife and I have just packed up and moved to London, where we’ll be for the rest of this year. I’m on sabbatical from the University of Technology Sydney (UTS) and will spend six months doing research with the UK Cabinet Office and UK government departments. I have been appointed Visiting Professor at the London School of Economics and Political Science (in its Media and Communications Department). Plus I am a member of the Evaluation Council of the UK Government Communication Service. And on top of that, I’ve just been contracted by Routledge to write a book on evaluation of communication. The book I am planning will be quite different from what is on the market, so I’m looking forward to hopefully doing a good job on that.

TMS: Very busy indeed. We expect a little preview of your book, as soon as you can leak it to us. OK, let’s talk about the new AMEC Integrated Evaluation Framework. To what extent does academic research inform it?

JM: The new AMEC Integrated Evaluation Framework is a great example of senior PR and communication practitioners, academic researchers, and commercial social and media researchers working together. The framework is referred to as ‘integrated’ because it:

  • Integrates academic, communication practitioner, and commercial research perspectives;
  • Integrates other models, matrices, and frameworks already in existence and builds on them; and
  • Allows integrated evaluation of paid, earned, shared, and owned media and communication.

I believe it is fair to say that academic research made a significant contribution by basing the framework on best-practice program evaluation knowledge. I prefer to use the word ‘knowledge’ rather than theory because many practitioners have an incorrect view that ‘theoretical’ means ‘hypothetical’ and esoteric. The fact is that there is a whole world of program evaluation knowledge in disciplines where evaluation is widely practiced, such as public administration, management (e.g., performance management), and organizational development. It is important for PR and communication to stop re-inventing the wheel and making up new models and terms.

What we did in developing the new AMEC Integrated Evaluation Framework was base the approach on program evaluation concepts, principles, and terminology, such as program logic models that are widely used across many industries, as well as social science insights into human communication, such as the communication-persuasion matrix drawn from social psychology, and then adapt these specifically to PR and corporate communication practice.

Ultimately, the framework has to be a practical tool for practitioners to use, and I believe it achieves that by offering an online interactive application. But behind the tool is a whole range of information and resources that identifies and describes the various terms and guides users on which metrics and methods are applicable to each type of activity and each stage. (Note: For more on the new Framework, read Jim’s Introduction to it here.)
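To make the program logic model idea concrete, here is a minimal sketch, in Python, of how the broad stages of a generic logic model and some example metrics might be laid out. The stage names and metrics shown are illustrative assumptions, not the official AMEC taxonomy or the online tool itself:

```python
# Illustrative sketch only: a generic program logic model of the kind the
# framework builds on. The stage names and example metrics below are
# hypothetical placeholders, not the official AMEC stages or metrics.

logic_model = {
    "inputs":     ["budget", "staff time", "formative research"],
    "activities": ["media relations", "events", "content production"],
    "outputs":    ["items published", "reach", "share of voice"],
    "outcomes":   ["awareness", "attitude change", "intent to act"],
    "impact":     ["behaviour change", "organizational objectives met"],
}

def report(model: dict[str, list[str]]) -> None:
    """Print each stage with the example metrics attached to it."""
    for stage, metrics in model.items():
        print(f"{stage:>12}: {', '.join(metrics)}")

if __name__ == "__main__":
    report(logic_model)  # walks the stages in order, inputs through impact
```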

TMS: You’ve said that academic research and practice are “mutually informative.” How does that work, particularly how does industry practice inform academic research?

JM: This is a really important issue that relates back to the first question. The interaction of practitioners and academics is a two-way street. It is not only a matter of practitioners having to learn from academics. There is a huge body of research findings in the academy that can inform practice—and this is ongoing and growing every day. So to me it is incomprehensible that practitioners would not tap into that.

On the other hand, academics need to engage with industry for at least three reasons:

  • First, engaging with industry and professional practice is important to identify problems and gaps in knowledge that research can explore. While some academic research explores questions raised by academics themselves, another major reason for research is to examine problems and issues encountered in industry and practice. That leads to applied research rather than pure or critical research, but it is very important.
  • Second, academics want their research to have impact, not simply be published in niche academic journals. In some countries, such as the UK, academic research funding is now awarded and assessed based on impact, not just publication. Impact can be changing government policy, providing solutions for social issues, or improving practices in an industry or profession. So many if not most academics want their research taken up and used.
  • Third, even if academics are not carrying out applied research (i.e., research to solve a practical problem), they nevertheless need their research to be grounded in and relevant to fields of practice. There is little point in coming up with findings and recommendations that are based on misunderstandings of practice or that are impossible to implement because they are hugely costly or too complex for users to understand. So consultation and relationships with industry and the professions are important.

In short, we learn from each other.

TMS: Academic research has been neglected in relation to industry standards for measurement and evaluation, and generally in PR and strategic communication practice. Why?

JM: There are usually at least two sides to every story and this is the case with the gap between academic research and practice in the fields of PR and communication. Both sides have to take some responsibility.

Practitioners are missing out on a lot of valuable insights and research data by not engaging with academic researchers. For instance, very few practitioners read the findings of academic research, and many industry organizations do not invite academic researchers to their conferences as presenters. In my home country, the Public Relations Institute of Australia (PRIA) runs its national conference as two separate events, one for academics and one for practitioners, on different days. That kind of segregation is counter-productive and very short-sighted.

On the other hand, some academics publish in a narrow range of academic journals, only attend academic conferences, and write in such a dense style, with such a high Fog Index, that it takes a PhD to understand them.

I have called on practitioners to open their minds and reach out to academic research. Equally, I regularly call on academics to reach out to the industry and engage with it through conferences, professional publications, and participating in industry organizations. I am pleased to say that many now do.

Also, in terms of bringing practitioners and academics together, I must commend two recent initiatives:

  1. The Task Force set up in the US to review models and develop standards for evaluation. This is chaired by Fraser Likely and has brought together practitioners and academics from a number of countries.
  2. In late 2015 AMEC established an Academic Advisory Group, which I am honored to chair.

These initiatives extend the work of the IPR Measurement Commission in terms of both the number of academics involved and international representation.

TMS: Tell us about your recent wins at the 2016 AMEC Awards for communication effectiveness.


JM: There is a back story here, and it’s important to understand that these awards focus on research for planning and evaluation. At the 2015 AMEC Summit on Measurement in Stockholm one of the delegates expressed appreciation for my paper but asked whether I ever did any evaluation, implying that there is a gap between academic research and practice. In fact, I do do research to inform planning of communication campaigns and evaluation—as do many academics who carry out contract research and who are often called in by organizations as independent advisers.

So this year for the first time I entered two research projects in the AMEC awards and was delighted and very honored to win two Gold awards and an overall Platinum Grand Prix award. I think that makes a point very clearly: robust, reliable research using sound methodology can be applied in practice and informs best practice.

TMS: Congrats—but wait a minute. Are you implying that “robust, reliable research using sound methodology” is not typically applied in practice? Are you commenting about the quality of most communication research?

JM: I think we already have the answer to that question. Typically, no. While there are pockets of excellence, robust evaluation research is not used in PR practice generally. That has been shown in studies in the US, UK, Europe, and Asia Pacific, for instance studies published on the Institute for Public Relations (IPR) website, the European Communication Monitor, the Asia Pacific Communication Monitor, and so on. Almost every survey of practitioners finds acknowledgement that we need to do better in this area. I look on it positively: this is an area for improvement that can show the value of practice. That is the key to future growth.

TMS: Thank you, Jim, for another informative interview. All the best in London and with the new book.

###


Bill Paarlberg

Bill Paarlberg co-founded The Measurement Standard in 2002 and was its editor until July 2017. He also edits The Measurement Advisor newsletter. He is editor of the award-winning "Measuring the Networked Nonprofit" by Beth Kanter and Katie Paine, and editor of two other books on measurement by Katie Paine, "Measure What Matters" and "Measuring Public Relationships." Visit Bill Paarlberg's page on LinkedIn.
