October 18, 2012

Pollsters face growing troubles in reaching, persuading potential respondents

For two weeks, senior Antwon Martin spent hours on the phone at The Galesburg Register-Mail. His objective? Talking to voters. Unfortunately, it was rarely that simple.

(Andrei Papancea/TKS)

“You could call 100 people, and if you get a couple of people who actually take the survey, that’s a good thing,” he said. “The people who are willing to take the survey are very helpful, like they can feel your plight. They know that most people don’t … take the survey.”

Martin was one of eight students in Visiting Professor of Journalism Sue Deans’ In-Depth Reporting course who conducted a pre-election poll this fall. The project’s goal was to delve into the views of Galesburg voters, but it also showed students the difficulties that journalists and pollsters alike face in collecting this sort of information.

A 2012 study by the Pew Research Center found that response rates to polls have been steadily declining since at least 1997, when 36 percent of the people pollsters reached went on to complete a survey. Today, that figure is 9 percent.

According to Masahiko Aida, Director of Analytics at Greenberg Quinlan Rosner Research in Washington, D.C., response rates in the 1960s and ’70s could be as high as 85 percent. Today’s low rates often produce samples that are not representative of the general population, creating problems for political analysts.

“Nowadays, your survey responses are somewhat skewed,” he said. “When you have a 9 percent or a 5 percent response rate, no amount of weighting can fix that.”
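Aida’s point about weighting can be made concrete with a small sketch. Pollsters commonly reweight respondents so that the sample’s demographic mix matches the population’s, a technique known as post-stratification; the age groups and percentages below are hypothetical, chosen only to show why a badly skewed sample forces extreme weights.

# Illustrative post-stratification sketch; all groups and shares are hypothetical.

# Share of each group in the target population (e.g., from census data).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Share of each group among the people who actually completed the poll.
# Low response rates tend to skew this mix toward the most reachable groups.
sample_share = {"18-34": 0.08, "35-54": 0.27, "55+": 0.65}

# Each respondent in a group gets the same weight: population share / sample share.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

for group, w in weights.items():
    print(f"{group}: weight = {w:.2f}")
# 18-34: weight = 3.75
# 35-54: weight = 1.30
# 55+:   weight = 0.54

When only a handful of respondents must stand in for a large slice of the population, their answers are amplified and the estimate gets noisy, which is the sense in which Aida says weighting cannot rescue a single-digit response rate.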

Deans, who gained experience with polling in her previous roles as an editor at several community newspapers in Colorado, worked with Assistant Professor of Political Science Andrew Civettini and Professor of Economics Rich Stout to randomly select 3,000 registered voters in Galesburg to call for the 17-question poll. She hoped to get 300 completed polls, a 10 percent response rate.

Soon, however, Deans and her students encountered their first hurdle: people were not picking up the phone. Frequently, busy signals or voicemail messages awaited students at the other end of the line. When someone did answer, he or she was often reluctant to talk.

Having begun the poll on Sept. 26, the class had planned to have their 300 responses collected before the first presidential debate on Oct. 3. But low contact rates led them to continue their work for another week. On Oct. 10, they finally reached 200 responses, which is enough to generate a statistically acceptable margin of error.
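For readers wondering what “statistically acceptable” means here, a standard back-of-the-envelope margin of error can be computed for a simple random sample. The sketch below uses the conventional 95 percent confidence level and a worst-case 50/50 split; both are our assumptions for illustration, not figures from the class.

import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"n = 300: +/- {margin_of_error(300):.1%}")  # about +/- 5.7 points
print(f"n = 200: +/- {margin_of_error(200):.1%}")  # about +/- 6.9 points

By that rule of thumb, 200 responses give a margin of error of roughly plus or minus 7 percentage points, compared with about 5.7 points for the 300 responses the class originally hoped to collect.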

“With volunteer students who have other things to do and people who can’t be reached, I think we did the best we could,” Deans said.

The next step is for each student to write one or more stories expanding on the poll results. These will cover whom citizens plan to vote for in November, what issues factored into that decision and demographic data. Respondents were also asked whether they would be willing to sit down with a reporter at a later date for a more in-depth interview.

This extra reporting is crucial, Deans believes, because questions without follow-ups can lead to ambiguity. She used the issue of health care as an example.

“If you don’t ask a follow-up question like, ‘Are you in favor of Obamacare or opposed to it, or do you want to see something else,’ you really don’t know why it [the issue of health care] is important to them,” Deans said.

Despite difficulties reaching voters and the lack of explanatory power of some of the results, students were still able to observe interesting patterns. Senior Sydney Stensland, for example, noticed that far more women than men completed the poll. She plans to investigate how other aspects of the poll, including questions that asked respondents how important various issues were to them, were affected by gender.

“I could just tell that [women] were just more willing to talk to us, at least me personally,” she said. “I just thought it was really interesting. I’m interested in seeing the ratios with certain things like income level and most important issue.”

For Martin, the most rewarding part of the process was speaking with people who were able to break through his “mechanical, non-judgmental” tone, which he cultivated to keep his own biases out of the poll. Occasionally, however, respondents would engage him in conversation, leading him to think more about what he was doing.

Stensland also interacted with respondents who impressed her with their willingness to talk, and she called the project “really rewarding” in spite of its trials.

“[Respondents] were really enthusiastic about helping us out and sharing what they thought with the community,” she said.

Anna Meier

