Ask Pollsters Some Questions
FAYETTEVILLE, Ark. — The closer it gets to November 2008, the more polls will confront — and sometimes confound — U.S. voters. To weigh the results of any poll, Molly Longstreth, director of the Survey Research Center at the University of Arkansas, suggests voters consider four w’s and an h: who, what, when, where and how.
“As a consumer of information, you should expect the pollster or the news report about the poll to give you the information you need to understand how the poll was conducted,” Longstreth said. “The more you know about who, what, when, where and how, the more you know how seriously you can take the poll’s results. Can you rely on the results as hard news? Or will you just treat the results as interesting information?”
Who
When voters ask “Who?” about a poll, Longstreth said, they should be asking who sponsored the poll, who conducted it and who answered its questions.
Longstreth advised voters to assess whether the poll was sponsored by a relatively unbiased group, such as a university or news organization. Is the polling organization considered objective, and did it use paid, professional interviewers rather than volunteers or students fulfilling a class assignment?
“The best-planned poll can be undermined by poor data collection,” Longstreth said.
Poll results should specify the number of people interviewed and give details about who was interviewed, such as approximate ages. To be a reliable reflection of the opinion of the overall population or of any particular group — new voters, for instance — poll results must come from a randomly selected, representative sample of the population.
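To make “randomly selected” concrete, here is a minimal sketch in Python of the idea behind a simple random sample. The phone-number frame, helper name and sample size are hypothetical for illustration, not the Survey Research Center’s actual procedure; the key property is that every member of the frame has an equal chance of being chosen.

```python
import random

def draw_simple_random_sample(frame, n, seed=None):
    """Return n members of `frame`, each with an equal chance of selection."""
    rng = random.Random(seed)
    # sample() draws without replacement, so no respondent is selected twice.
    return rng.sample(frame, n)

# Hypothetical sampling frame: 10,000 phone numbers a pollster might dial.
frame = ["555-%04d" % i for i in range(10000)]
respondents = draw_simple_random_sample(frame, 600, seed=42)
print(len(respondents), respondents[:3])
```

A sample drawn this way can speak for the whole frame; a self-selected group of volunteers who happened to respond cannot.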
What
Pollsters should also explain what methods they used to survey respondents and what questions they asked.
Longstreth explained that the most reliable political polls are conducted by telephone, not on the Internet. In most cases, Internet polls cannot ensure a random sample of the general population and tend to be skewed toward people who have Internet access and an interest in the poll. With current technology, an Internet poll can be valid when it is conducted among members of an organization or at a workplace where the pollsters know every individual e-mail address and can ensure that everyone has an equal chance to respond.
What questions are asked and how they are asked can make a big difference in a poll’s results. Longstreth suggested that news reports should specify what questions were asked, and voters should be wary of comparing results of polls when the questions are not identical. She also cautioned that the wording of questions should be clear and unbiased.
“At the Survey Research Center, whenever possible, we ask ‘to what degree do you favor or oppose’ a policy,” Longstreth said. “This form of question does not tacitly assume that either favoring or opposing is more acceptable. It can be a freeing way of asking a question that allows individuals to be more honest in their responses.”
When
A report on a poll should be explicit about when it was conducted. In general, a poll conducted over a longer period is more likely to reach a representative cross-section of the population than one conducted over a few days, such as a holiday weekend when many people are unavailable.
“Polls conducted over very short periods of time probably are less accurate than polls conducted over longer periods of time,” Longstreth said. “If you are reading polls weekly in your newspaper, don’t overreact to any one poll. Look at the trends that emerge when the same question is asked over a period of time.”
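As a rough illustration of why trends beat single readings, the short Python sketch below averages hypothetical weekly results for the same question; the numbers are invented for the example. Any one week swings by several points, while the moving average changes slowly.

```python
# Hypothetical weekly results (percent favoring a policy) for the same question.
weekly_polls = [48, 53, 47, 51, 52, 49, 54, 50]

# A three-week moving average smooths single-poll swings, making the
# underlying trend easier to judge than any one week's headline number.
window = 3
trend = [sum(weekly_polls[i:i + window]) / window
         for i in range(len(weekly_polls) - window + 1)]
print([round(t, 1) for t in trend])  # [49.3, 50.3, 50.0, 50.7, 51.7, 51.0]
```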
News events that come up while a poll is being conducted can also affect the results, and Longstreth suggested keeping that in mind.
Where
Poll results should note what geographic region the poll covered so that the results can be more fully understood. Proximity to major events may bias responses in ways that would not apply in other regions. For example, Longstreth suggested, a poll about disaster-preparedness policies conducted in the Gulf Coast states would likely elicit stronger opinions than the same poll conducted in Midwestern states.
How
How the interview participants were selected is an important question that doesn’t always make it into general news reports, Longstreth said.
“Even if the news account doesn’t give much information about how the sample of respondents was selected, there’s often a clue to whether they used a valid, random sample,” Longstreth said. “If an estimated sampling error is provided — usually it’s phrased as something like ‘plus or minus 5 percent margin of error’ — that’s a good indication they used a random sample.”
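For a simple random sample, that reported margin is driven mostly by the sample size. The Python sketch below uses the standard textbook 95 percent approximation, z * sqrt(p * (1 - p) / n) with the worst case p = 0.5, which is not necessarily any particular pollster’s calculation; under that formula, a margin near plus or minus 5 percent corresponds to roughly 400 respondents.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample of size n.

    Textbook normal approximation z * sqrt(p * (1 - p) / n); p = 0.5 is the
    worst case, which reported margins usually assume.
    """
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 400, 1000):
    print("n=%d: +/- %.1f percentage points" % (n, 100 * margin_of_error(n)))
# n=100: +/- 9.8 percentage points
# n=400: +/- 4.9 percentage points
# n=1000: +/- 3.1 percentage points
```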
When voters think about a poll’s questions, Longstreth suggests they keep in mind what questions were not asked, particularly in cases where the poll’s sponsors might not be considered unbiased. Thorough poll results will also note any possible sources of error that could influence the outcome, such as the wording of questions or the order in which they are asked. Longstreth said such issues should be noted as factors that “can lead to somewhat different results.”
Longstreth cautioned against reacting to the headlines about a poll’s results before learning more about who, what, when, where and how.
“The more information that is provided about a poll’s research methods, the more confident you can be about its results,” she said.
Contacts
Molly Longstreth, director,
Survey Research Center
(479) 575-4222, mlongstr@uark.edu
Barbara Jaquish, science and research communications officer
University Relations
(479) 575-2683, jaquish@uark.edu