The Problem With Polls

"Two out of three Americans say they try to eat as little fat as possible."

"Two-thirds of Americans report they have given some thought to whether the foods and beverages they purchase or consume are produced in a sustainable way."

"Three out of four Americans say they choose products that are lower in total fat at least sometimes."

Obviously the pollsters who spew statements like these didn't survey every single American to come up with these percentages. Nevertheless, too many consumers see quotable "facts" (I'm using that term loosely) and are happy to share them with friends, family members and anyone else who will listen, especially if the "facts" happen to support their own personal viewpoints.

The media's growing reliance on public-opinion polls is troubling. Harvard University's Theodore H. White Seminar on Press and Politics featured a panel of experts who discussed the use of opinion polls by the media. While some panelists felt polls had value, others were critical of polls and how they were used. All panelists, however, felt change was necessary to make polls more valid and objective.

Unless polls are conducted and explained with understanding and objectivity, we will continue to see skewed, unreliable results that can mislead an uninformed public.

An article at The Journalist’s Resource by Denise-Marie Ordway offers suggestions for journalists who want to quote a poll. The information is helpful to anyone who wants to be more discerning about the information they read. She suggests you ask the following questions.

1. Who conducted the poll?  “It’s important to know whether it was conducted by a polling organization, researcher, non-expert, political campaign or advocacy group,” Ordway writes.

2. Who paid for it? Ordway suggests you find out if the poll was “funded by an individual or organization that stands to gain or lose something based on the results.”

3. How were people chosen to participate? The best polls rely on randomly selected participants, Ordway says. “Keep in mind that if participants were targeted in some way — for example, if pollsters went to a shopping mall and questioned people they encountered there — the results may be very different than if pollsters posed questions to a random sample of the population they’re interested in studying,” she says.

4. How was the poll conducted? “It’s important to find out if participants filled out a form, answered questions over the phone or did in-person interviews,” Ordway writes. “The method of collecting information can influence who participates and how people respond to questions. For instance, it’s easier for people to misrepresent themselves in online polls than in person — a teenager could claim to be a retiree.”

5. What’s the margin of error? Look for the margin of error, “an estimate of how closely the views of the sample reflect the views of the population as a whole,” Ordway suggests. “When pollsters ask a group of people which presidential candidate they prefer, pollsters know the responses they will get likely won’t match the responses they’d get if they were to interview every single voter in the United States. The margin of error is reported as plus or minus a certain number of percentage points,” she says, and gives this example:

“If a poll shows that one candidate is 2 percentage points ahead of another in terms of public support but the margin of error is plus or minus 3 percentage points, the second candidate could actually be in the lead. The Pew Research Center offers a helpful explainer on the margin of error in election polls.”
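The arithmetic behind the margin of error is simple enough to sketch in a few lines. The formula below is the standard textbook calculation for a simple random sample; the 1.96 multiplier (for 95% confidence) and the worst-case assumption p = 0.5 are conventional defaults, not figures taken from Ordway's article:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a polled proportion p from a simple random
    sample of n people, at 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of 1,000 respondents:
print(round(100 * margin_of_error(1000), 1))  # roughly +/- 3.1 points
```

This is why a 2-point lead inside a 3-point margin, as in the example above, tells you nothing about who is actually ahead: both candidates' true support falls somewhere inside overlapping ranges.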

6. Were participants compensated? “Offering people money or another form of compensation can also affect who participates and how,” Ordway writes. “Such incentives might encourage more lower-income individuals to agree to weigh in. Also, participants may feel compelled to answer all questions, even those they aren’t sure about, if they are paid.”

7. Who answered questions? Find out the demographics of the people surveyed, as that can have a significant impact on responses, Ordway says in her article.

8. How many people responded to the poll? While there isn’t a perfect number of participants, Ordway says higher numbers generally result in more accurate representations. “If pollsters want to know if the American public supports an increase in military funding, interviewing 2,000 adults will likely provide a more accurate measurement of public sentiment than interviewing 200,” she says.
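Ordway's 2,000-versus-200 comparison reflects how the margin of error shrinks as the sample grows. A quick back-of-the-envelope calculation (again assuming a simple random sample, 95% confidence, and a worst-case split of p = 0.5) makes the effect concrete:

```python
import math

# Margin of error scales with 1 / sqrt(n):
for n in (200, 2000):
    moe = 1.96 * math.sqrt(0.25 / n)  # 95% confidence, p = 0.5
    print(f"n = {n:4d}: +/- {100 * moe:.1f} points")
```

Note that ten times the sample shrinks the margin only by a factor of √10 ≈ 3.2 (from about ±6.9 points to about ±2.2), not by a factor of ten, which is why pollsters face diminishing returns on ever-larger samples.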

9. Can results be generalized to the entire public? Ordway says journalists should be clear in their coverage whether the results of a poll apply only to a segment of the population or can be generalized to the population as a whole. As a reader, you should be looking at this too.

10. What did pollsters ask? Knowing which questions were asked can help you check whether claims made about poll results are accurate, writes Ordway. “It also can help [you] spot potential problems, including vague terms, words with multiple meanings and loaded questions, which are biased toward a candidate or issue. Cornell University’s Roper Center for Public Opinion Research offers an example of a loaded question,” she says in her article.

Request a copy of the questions in the order they were asked, because participants’ answers can also differ according to question order. Dan Murphy showed how that likely happened in a recent poll; he wrote about it here.

11. What might have gone wrong with this poll? Find out whether biases or shortcomings may have influenced the results, says Ordway. Often, the fine print at the end of a survey reveals interesting notes about how the poll was conducted.

Do Your Homework
Although journalists and authors who share poll results should be performing due diligence on many of these questions, that might not be happening. Whether poll questions relate to animal agriculture, food, political candidates, or any other important issue, we can't afford irresponsibility, especially when opinions impact the way we do business. The long-term ramifications can be devastating.

For More Information
The Journalist’s Resource suggests these additional resources to help you understand more about polls:

  • Journalist’s Resource has written an explainer on polls.
  • The Poynter Institute offers a free online course on understanding and interpreting polls.
  • FiveThirtyEight, a news site that focuses on statistical analysis, has updated its pollster rankings in time for the 2018 midterms. It gave six election pollsters a grade of A-plus: Monmouth University, Selzer & Co., Elway Research, ABC News/Washington Post, Ciruli Associates and Field Research Corp.
  • The American Association for Public Opinion Research provides numerous resources, including information on poll and survey response rates, random sampling and why election polls sometimes get different results.