The Problem With Polls

"Two out of three Americans say they try to eat as little fat as possible."

"Two-thirds of Americans report they have given some thought to whether the foods and beverages they purchase or consume are produced in a sustainable way."

"Three out of four Americans say they choose products that are lower in total fat at least sometimes."

Obviously the pollsters who spew statements like these didn't survey every single American to come up with these percentages. Nevertheless, too many consumers see quotable "facts" (I'm using that term loosely) and are happy to share them with friends, family members and anyone else who will listen, especially if the "facts" happen to support their own personal viewpoints.

The media's growing reliance on public-opinion polls is troubling. Harvard University's Theodore H. White Seminar on Press and Politics featured a panel of experts who discussed the media's use of opinion polls. While some panelists felt polls had value, others were critical of polls and how they were used. All panelists, however, felt change was necessary to make polls more valid and objective.

Unless we have understanding and objectivity in the way polls are conducted and explained, we will continue to have skewed, unreliable results that can potentially mislead an uninformed public.

An article at The Journalist’s Resource by Denise-Marie Ordway offers suggestions for journalists who want to quote a poll. Her advice is just as helpful to anyone who wants to be more discerning about the poll results they read. She suggests you ask the following questions.

1. Who conducted the poll?  “It’s important to know whether it was conducted by a polling organization, researcher, non-expert, political campaign or advocacy group,” Ordway writes.

2. Who paid for it? Ordway suggests you find out if the poll was “funded by an individual or organization that stands to gain or lose something based on the results.”

3. How were people chosen to participate? The best polls rely on randomly selected participants, Ordway says. “Keep in mind that if participants were targeted in some way — for example, if pollsters went to a shopping mall and questioned people they encountered there — the results may be very different than if pollsters posed questions to a random sample of the population they’re interested in studying,” she says.

4. How was the poll conducted? “It’s important to find out if participants filled out a form, answered questions over the phone or did in-person interviews,” Ordway writes. “The method of collecting information can influence who participates and how people respond to questions. For instance, it’s easier for people to misrepresent themselves in online polls than in person — a teenager could claim to be a retiree.”

5. What’s the margin of error? Look for the margin of error, “an estimate of how closely the views of the sample reflect the views of the population as a whole,” Ordway suggests. “When pollsters ask a group of people which presidential candidate they prefer, pollsters know the responses they will get likely won’t match the responses they’d get if they were to interview every single voter in the United States. The margin of error is reported as plus or minus a certain number of percentage points,” she says, and gives this example:

“If a poll shows that one candidate is 2 percentage points ahead of another in terms of public support but the margin of error is plus or minus 3 percentage points, the second candidate could actually be in the lead. The Pew Research Center offers a helpful explainer on the margin of error in election polls.”

6. Were participants compensated? “Offering people money or another form of compensation can also affect who participates and how,” Ordway writes. “Such incentives might encourage more lower-income individuals to agree to weigh in. Also, participants may feel compelled to answer all questions, even those they aren’t sure about, if they are paid.”

7. Who answered questions? Find out the demographics of the people surveyed, as that can have a significant impact on responses, Ordway says in her article.

8. How many people responded to the poll? While there isn’t a perfect number of participants, Ordway says higher numbers generally result in more accurate representations. “If pollsters want to know if the American public supports an increase in military funding, interviewing 2,000 adults will likely provide a more accurate measurement of public sentiment than interviewing 200,” she says. (A quick calculation after this list shows why.)

9. Can results be generalized to the entire public? Ordway says journalists should be clear in their coverage whether the results of a poll apply only to a segment of the population or can be generalized to the population as a whole. As a reader, you should be looking at this too.

10. What did pollsters ask? Knowing which questions were asked can help you check whether claims made about poll results are accurate, writes Ordway. “It also can help [you] spot potential problems, including vague terms, words with multiple meanings and loaded questions, which are biased toward a candidate or issue. Cornell University’s Roper Center for Public Opinion Research offers an example of a loaded question,” she says in her article.

Request a copy of the questions in the order they were asked; participants’ answers can differ depending on question order. Dan Murphy clearly showed how that likely happened in a recent poll he wrote about.

11. What might have gone wrong with this poll? Find out whether biases or shortcomings may have influenced the results, says Ordway. Often, the fine print at the end of a survey reveals interesting notes about how the poll was conducted.
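To put numbers on items 5 and 8 above, here is a minimal Python sketch of the standard normal-approximation margin-of-error formula for a simple random sample. It assumes a 95% confidence level and the worst-case proportion of 0.5; real polls often apply weighting and design adjustments this simple formula ignores, so treat it as an illustration rather than how any particular pollster calculates its published margin.

    import math

    def margin_of_error(sample_size, proportion=0.5, z=1.96):
        # Normal-approximation margin of error for a simple random sample:
        # z * sqrt(p * (1 - p) / n), with z = 1.96 for a 95% confidence level.
        return z * math.sqrt(proportion * (1 - proportion) / sample_size)

    # Item 5: a 2-point lead is not meaningful when the margin is +/- 3 points.
    # Item 8: a larger sample shrinks the margin of error.
    for n in (200, 2000):
        print(f"n = {n:>5}: margin of error = +/- {margin_of_error(n) * 100:.1f} points")
    # Prints roughly +/- 6.9 points for 200 respondents and +/- 2.2 points for 2,000.

Because the margin shrinks roughly with the square root of the sample size, quadrupling the number of respondents only cuts the margin of error in half, which is why the jump from 200 to 2,000 respondents matters so much.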

Do Your Homework
Although journalists and authors who share poll results should be doing this due diligence themselves, that doesn't always happen. Whether poll questions relate to animal agriculture, food, political candidates or any other important issue, we can't afford irresponsibility, especially when opinions affect the way we do business. The long-term ramifications can be devastating.

For More Information
The Journalist’s Resource suggests these additional resources to help you understand more about polls:

  • Journalist’s Resource has written an explainer on polls.
  • The Poynter Institute offers a free online course on understanding and interpreting polls.
  • FiveThirtyEight, a news site that focuses on statistical analysis, has updated its pollster rankings in time for the 2018 midterms. It gave six election pollsters a grade of A-plus: Monmouth University, Selzer & Co., Elway Research, ABC News/Washington Post, Ciruli Associates and Field Research Corp.
  • The American Association for Public Opinion Research provides numerous resources, including information on poll and survey response rates, random sampling and why election polls sometimes get different results.
 
