The Measurement Standard

June 22, 2014 | Category: Theory

The MAIE Model: A New Model for Public Relations Measurement and Evaluation


The MAIE Model of Public Relations Measurement and Evaluation

At the AMEC Measurement Summit a couple of weeks ago, Jim Macnamara unveiled his MAIE model, a new conceptualization of what public relations measurement can and should be. The MAIE model provides insights into several of the major difficulties that measurement faces, including why there is continuing resistance to measurement, and why it has been so difficult to use measurement to demonstrate the value of PR. Here Jim presents a condensed version of his AMEC presentation.

 

by Jim Macnamara—With numerous models, principles, guidelines and tools for measurement and evaluation advocated by academics, professional industry bodies, and measurement service providers, it could be asked ‘does the PR and corporate communication sector need yet another model?’

The answer is yes. This new MAIE model offers an approach that overcomes four largely unrecognized and under-researched obstacles to effective measurement and evaluation by undertaking deep analysis of a range of internal and external quantitative and qualitative data. This analysis provides organization management with insights to productively inform future strategy and achieve organizational outcomes, as well as guide future communication planning. Such a forward-looking, value-adding approach changes the timing, scope, and methodology of evaluation in productive ways for both practitioners and organizations.

To begin, let’s examine several obstacles to measurement and evaluation of outcomes (as opposed to outputs) and demonstration of value to an organization that are not addressed in other models.

1. The rear-view mirror approach

The first is that most M&E research looks backwards, producing reports that provide a retrospective performance review of what happened in the past. In some cases, M&E is seen by senior management as little more than an exercise in post-rationalization and self-justification by communication practitioners.

This explains a troubling contradiction at the heart of the measurement dilemma – that is, despite demands for results and accountability, employers often will not pay for research and sometimes do not seem to want PR and corporate communication to conduct it at all. Analysis reveals that many employer/client organizations feel that they intuitively or anecdotally know enough about what was done in the past, or they simply feel that ‘what’s done is done’. Many do not want to pay for what they feel they already know and for what cannot be changed. However, senior managers are interested in the future and will pay for what they don’t know – factors that point to the need for a new forward-looking approach.

Organization management sees value in terms of contribution to organizational outcomes and, perhaps even more importantly, contribution to future organizational strategy (e.g., identifying opportunities). However, as the European Communication Monitor shows, linking PR and corporate communication to organizational outcomes remains a major challenge for up to 75 per cent of practitioners. And while communication professionals are regularly called on for advice on organization strategy, studies show that few directly contribute to the formation of strategy.

2. The blurring of measurement and evaluation

The second obstacle preventing demonstration of the value of PR and corporate communication, and justifying a new approach, is that measurement and evaluation are commonly blurred into a single process and conceived to be broadly the same thing. This is problematic because measurement and evaluation are distinctly different conceptually and methodologically. Measurement is the taking of measures such as counting items, assigning ratings on a scale, or recording comments in interviews, and analyzing these data. Evaluation, as defined in most dictionaries, is “to judge” or “make a judgement” about the value of something.

Measurement involves metrics and analytics, focussed on data and numbers. Value, by contrast, while informed to some extent by metrics, is a perception. It is a human judgement made from a perspective and in a context. In the case of organizations, value is determined from the perspective of their goals, objectives and needs. Likewise, in the case of stakeholders, the value of communication is determined from the perspective of their interests.

By conflating measurement and evaluation, communication practitioners present arbitrary metrics such as the volume and tone of media coverage and the volume of social media ‘likes’, ‘follows’ and retweets, as purported ‘evaluation’. While useful ‘measurements’ of outputs and outtakes, these do not demonstrate value.


Figure 1. Traditional M&E approaches which blur and conflate measurement and evaluation.

3. A narrow data base

A further obstacle, arising from the blurring of evaluation with measurement and from conducting evaluation concurrently with measurement or as a conjoined linear process, is that evaluation is based on a relatively narrow range of data. In most cases, evaluation is based solely on the metrics collected by the organization in its own measurement which, given the budget restrictions that commonly exist, is often limited. Additional sources of relevant data are largely ignored.

4. Over-emphasis on numbers and ‘scientific’ approaches

The inability of many practitioners to demonstrate the value of PR and corporate communication is further compounded by a fourth major obstacle – the over-emphasis on quantitative data and ‘scientific’ research that is characteristic of modernist research generally and of PR and communication management in particular. The focus on taking measurements and the search for a ‘magic metric’ such as an ROI of PR have ignored the previous point that value is a perception (not a number), as well as the fact that many if not most outtakes and outcomes of communication are humanistic rather than scientific. For example, PR and corporate communication typically seek to create or influence awareness, attitudes, opinion, engagement, trust, loyalty, reputation, and relationships, as well as behaviours. Evaluating these outcomes requires interpretative qualitative research because they are based on human interpretation and feelings, which numbers describe in only arbitrary and superficial ways.

A new four-stage approach – the MAIE model

A new four-stage model of measurement and evaluation was presented to and discussed at the 2014 International Summit on Measurement in Amsterdam in June as an approach to overcome these obstacles and demonstrate the value of PR and communication.

Stage 1: Measurement
As in traditional approaches, the MAIE model begins with measurement – the collection of data to provide relevant metrics (numbers) and analytics, which are meaningful patterns of data. The one difference from traditional approaches is that, in the MAIE model, qualitative as well as quantitative data are advocated as essential.
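
To make this concrete, the following is a minimal sketch, in Python, of how quantitative metrics and coded qualitative data might sit side by side in a Stage 1 data set. The metric names, themes, and comments are hypothetical placeholders and are not part of the model itself.

    from collections import Counter

    # Quantitative metrics (hypothetical figures, e.g., from media monitoring
    # and social media analytics)
    quantitative = {
        "media_items": 142,
        "positive_tone_pct": 61.0,
        "retweets": 3250,
    }

    # Qualitative data: stakeholder comments manually coded into themes
    coded_comments = [
        ("The new policy finally addresses our concerns", "trust"),
        ("Still waiting for answers on pricing", "transparency"),
        ("Good to see local communities consulted", "engagement"),
        ("Communication felt one-way this year", "engagement"),
    ]

    # A simple analytic: how often each theme occurs in the qualitative data
    theme_counts = Counter(theme for _, theme in coded_comments)

    print(quantitative)
    print(theme_counts.most_common())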

From this point, the MAIE model departs even more fundamentally from traditional M&E approaches by inserting two additional stages into the process.

Stage 2: Analysis
The first of these is in-depth analysis. This differs from the data analysis conducted as part of measurement. While data analysis focuses specifically on data collected in a measurement activity (e.g., survey responses), analysis in the MAIE model advocates accessing and examining a range of relevant external as well as internal data that can provide contextualization and, sometimes, triangulation for comparison and complementation of findings. External data can include:

  • Published research reports in the public domain (e.g., by universities, research institutes, government, and consultancy firms such as PwC, KPMG, and Deloitte);
  • Databases to access publicly available information and even data mining to access ‘Big Data’;
  • Case studies;
  • Historical records (e.g., to identify trends over time, cultural factors, etc.); and
  • Published theories and models, noting that, despite practitioner aversion to ‘theory’, textbooks, research monographs and articles comprise a source of published best practice knowledge.

Such sources provide an expanded data pool which can be used to compare, contrast, question and contextualize findings, rather than relying only on analysis of bespoke metrics. This analysis stage can also incorporate the techniques of market analysis, competitor analysis, business analysis, and academic approaches of critical analysis. This expanded approach to analysis is undertaken for two reasons in the MAIE model, as outlined in stages three and four.
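
As a simple illustration of this expanded analysis, the sketch below compares internal metrics against external benchmark figures of the kind that might be drawn from a published sector study. All figures and the benchmark source are hypothetical; the point is the contextualization, not the numbers.

    # Contextualizing internal metrics against external benchmark data
    # (all figures below are hypothetical)

    internal = {"positive_tone_pct": 61.0, "share_of_voice_pct": 18.0}

    # External data, e.g., drawn from a published industry study
    sector_benchmark = {"positive_tone_pct": 55.0, "share_of_voice_pct": 25.0}

    for metric, value in internal.items():
        benchmark = sector_benchmark[metric]
        gap = value - benchmark
        direction = "above" if gap >= 0 else "below"
        print(f"{metric}: {value} is {abs(gap):.1f} points {direction} the sector benchmark")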

Stage 3: Insights
Before evaluation is undertaken, in-depth analysis should be conducted to identify insights that can inform future organization strategy. Whereas traditional evaluation findings are mostly backwards-looking and descriptive, insights are forward-looking and involve inferences, predictions, suggestions, and recommendations.

For example, insights could include identification of a gap left by competitors, an opportunity to seize thought-leadership on an emerging issue, a likely legislative initiative identified based on patterns of political comment, or an opinion shift among stakeholders that can be productively addressed at an early stage.

Such insights contribute to future business or organization strategy and outcomes, as well as inform PR and communication performance management and planning and, as such, bridge the gap between PR/corporate communication and organizational strategy and outcomes.

Producing insights requires focussed time and effort as well as knowledge of research and analysis. However, the benefits are substantial and there is a range of techniques used by researchers and analysts that can be learned and applied. These include:

  • Triangulation, which involves collecting and comparing two, three or more data sets related to the same issue to identify consistent findings as well as those that vary;
  • Data reduction and display, which is essential to reduce and make sense of data, particularly when processing large and multiple data sets and ‘Big Data’. Data reduction is done by summarizing data in lists such as rankings and tables, while display can include diagrams, charts, graphs, infographics, and maps such as network maps. Qualitative data in text form, such as interview transcripts, can be condensed through coding and categorizing and presented in visualizations such as ‘word clouds’ (a brief illustrative sketch follows this list);
  • Avoiding what researchers call the “rush to theorize” – that is, jumping to conclusions about findings or insights without adequate data or analysis. Deploying the other analysis techniques summarized here will help ensure findings, conclusions and insights are valid;
  • Team analysis to bring different perspectives and expertise to examining the data;
  • Reflectivity, which allows time to reflect on findings. A ‘cooling off’ or ‘gestation’ period often results in new perspectives or insights missed in initial analysis;
  • The ‘so what’ question, which analysts advocate as an essential question to ask repeatedly throughout analysis in relation to each finding. Rather than present numbers or simple descriptions, ask ‘so what’? What does this mean? What should the organization do? What should the organization not do?
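
The sketch below illustrates two of these techniques in miniature: data reduction of qualitative text into word frequencies (the kind of summary that feeds a word cloud) and a simple triangulation comparing themes found in interview transcripts with themes from a separate survey. The transcript excerpts and themes are hypothetical.

    import re
    from collections import Counter

    # Hypothetical interview transcript excerpts
    transcripts = [
        "Trust in the organisation has improved since the town hall meetings",
        "People still question transparency around pricing decisions",
        "The town hall meetings built real engagement and trust",
    ]

    stopwords = {"the", "in", "has", "since", "still", "around", "and", "a", "real"}

    # Data reduction: condense the transcripts into word frequencies,
    # which could feed a 'word cloud' visualization
    words = re.findall(r"[a-z]+", " ".join(transcripts).lower())
    word_freq = Counter(w for w in words if w not in stopwords)
    print(word_freq.most_common(5))

    # Triangulation: compare themes surfaced in transcripts with themes
    # from a separate (hypothetical) stakeholder survey
    survey_themes = {"trust", "pricing", "reputation"}
    transcript_themes = {"trust", "transparency", "engagement", "pricing"}
    print("Consistent across both data sets:", sorted(survey_themes & transcript_themes))
    print("Appearing in only one data set:", sorted(survey_themes ^ transcript_themes))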


Figure 2. A new MAIE model of measurement and evaluation.

Stage 4: Evaluation
Finally, in the MAIE model, evaluation is undertaken as the fourth stage. Evaluation undertaken after the identification, presentation and application of insights is able to capture the value added to the organization through those insights, and is therefore likely to reflect a much higher level of appreciation and perceived value among internal stakeholders (e.g., management) than retrospective reports. Similarly, insights which recognize external stakeholders’ perspectives and lead to organization actions to improve communication and relationships are likely to lead to increased value in the eyes of external stakeholders.

###
