The details of a survey’s methodology are critical to the scope and validity of its results. In this article, Jeffrey Henning discusses methodological aspects important to cover when presenting the results of a survey. While his focus is on newsmaker surveys, these points are important to include whenever survey results are presented.
By Jeffrey Henning—A good friend of mine has been a freelance journalist for over 20 years. He’s seen little increase in the amount he makes per word or per article over all that time: what gains he has made have come through increased productivity. And if an idea is going to take too much work to develop into a story, he’ll go on to the next idea instead.
Don’t let your story idea be the one he passes over.
If you are pitching a newsmaker survey conducted by your organization or by one of your clients, make sure you have answered up front the most common questions a journalist will have about that research. That way they won’t need to call or email you for details they consider important, or worse, decide that’s too much trouble and move on to the next pitch.
Fortunately, it’s easy to know the survey methodology questions that reporters will want answered. Since 2007, the American Association for Public Opinion Research and the Poynter Institute have trained reporters to ask key questions about survey research results. And the National Council on Public Polls (NCPP) has published its own list of questions for journalists to ask about surveys, now in its third edition: 20 Questions A Journalist Should Ask About Poll Results.
Comparing the two documents shows the most common questions journalists are advised to ask about surveys:
1. Who paid for the poll and why was it done?
2. Who ran the poll?
3. How many people were interviewed?
4. How were those people chosen? (Probability or nonprobability sample?)
5. What area or what group were people chosen from? (Adults, online consumers, marketing staff?)
6. Are the results based on the answers of all the people interviewed?
7. When were the interviews conducted?
8. How were the interviews conducted? (Online, by telephone, face-to-face?)
9. What’s the margin of sampling error, if applicable?
10. What questions were asked?
11. What order were the questions asked in?
While most news releases include a paragraph about the survey methodology, in the interests of space, such statements are often short and don’t always answer all the questions reporters are trained to ask. Make sure yours do.
Here’s an example of the boilerplate statement we at Researchscape International use for our most common PR surveys (internet surveys). Feel free to adapt this to your own projects as appropriate:
On behalf of Acme, Researchscape International surveyed 652 respondents using an online study fielded from February 25 to February 28, 2017, to better understand attitudes towards cybersecurity. Respondents were recruited from an internet panel and were quota-sampled using 32 different cells (gender by age by region) to closely match the overall national population aged 18 and older.
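To make the quota-sampling language concrete, here is a minimal sketch of how 32 cells (gender by age by region) and their quotas might be derived. The specific age bands, region names, and the uniform allocation are hypothetical illustrations, not Researchscape’s actual design; a real study would allocate cells in proportion to census figures.

```python
# Hedged sketch: building 32 demographic quota cells and allocating
# 652 completed interviews across them. Category breakdowns below are
# hypothetical; the 2 x 4 x 4 = 32 structure matches the methodology
# statement above.
from itertools import product

genders = ["Male", "Female"]                         # 2 categories
ages = ["18-34", "35-49", "50-64", "65+"]            # 4 categories
regions = ["Northeast", "Midwest", "South", "West"]  # 4 categories

cells = list(product(genders, ages, regions))        # 2 * 4 * 4 = 32 cells
total_respondents = 652

# Uniform allocation for illustration, distributing the remainder so
# the quotas sum exactly to the total.
base = total_respondents // len(cells)
remainder = total_respondents % len(cells)
quotas = {cell: base + (1 if i < remainder else 0)
          for i, cell in enumerate(cells)}

assert len(cells) == 32
assert sum(quotas.values()) == total_respondents
```

The point of the cell structure is that fieldwork stops recruiting a cell once its quota fills, which is what keeps the completed sample close to the national population profile.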
As this was not a probability-based sample, calculating the theoretical margin of sampling error is not applicable. However, as with probability surveys, it is important to keep in mind that results are estimates and typically vary within a narrow range around the actual value that would be calculated by completing a census of everyone in a population. One estimate of this precision is the credibility interval; for this survey, the credibility interval is plus or minus 6 percentage points for questions answered by all respondents (the interval is larger for questions answered by fewer respondents). Again, as with probability surveys, on occasion the results from a particular question will be completely outside a typical interval of error.
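For readers who want to see the arithmetic behind interval statements like the one above, here is a small sketch of the classical 95% margin of sampling error for a simple random sample of 652. As the statement itself notes, this formula does not strictly apply to a non-probability quota sample; it is shown only for comparison, and the quoted credibility interval (plus or minus 6 points) is deliberately wider because it accounts for additional sources of error.

```python
# Hedged sketch: classical 95% margin of sampling error for a simple
# random sample, for comparison only. Not applicable as-is to the
# non-probability sample described above.
import math

n = 652   # completed interviews, from the methodology statement
p = 0.5   # worst-case proportion, which maximizes the interval
z = 1.96  # z-score for 95% confidence

moe = z * math.sqrt(p * (1 - p) / n) * 100  # in percentage points
print(f"+/-{moe:.1f} points")  # about +/-3.8 points for n = 652
```

Seeing both numbers side by side makes the boilerplate more credible: the credibility interval is not a rebranded margin of error but a more conservative estimate.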
There are many types of survey errors that can limit the ability to generalize to a population. Throughout the research process, Researchscape followed a Total Survey Quality approach designed to minimize error at each stage. Researchscape is confident that the information gathered from this survey can be used to make important business decisions related to this topic.
We then compile an appendix showing the order and wording for every reported question, along with the topline results for each.
The more survey methodology questions you can answer in advance for journalists, the more likely they are to write about your survey rather than pass it over.