Online political polling on the rise amid skepticism of results


In 1980, a local radio station was interested in getting an early read on Iowa's Democratic caucuses.

As Hugh Winebrenner recounted in "The Iowa Precinct Caucuses: The Making of a Media Event," an enterprising disc jockey enlisted the Emmetsburg water department in a unique assessment of voter strength -- the Cess Poll.

"KEMB-FM, broadcasting from the city water plant, reported how much the water level dropped in the town's water tower after residents flushed their toilets successively for [President Jimmy] Carter, [Sen. Edward] Kennedy, and 'undecided/don't care,' " the author explained.

"Don't care," ended up flushed with victory, but between the candidates, Mr. Carter ran well ahead of his challenger, a finish that presaged the actual results.

Despite that success, pollsters have not embraced the Cess Poll methodology. Telephone polls based on random digit dialing remain the gold standard for political surveys and for the broader, multibillion-dollar market research industry.

But as pollsters take the pulse of the 2012 electorate, that established approach faces soaring costs, the public's migration to cell phones and a growing reluctance among people to answer them at all.

Growth and controversy

Internet polling has grown rapidly as an alternative, but it's deeply controversial among scholars in the field. Some contend it can never meet tests of accuracy rooted in generations of statistical research.

Proponents of newer methodologies argue that experience and their ever-improving algorithms allow them to weight their results to replicate the accuracy of traditional polls at much lower costs.

And while this dispute is far from resolved, some practitioners see it as yesterday's argument as they experiment with surveys rooted in social media, text messaging and other technology.

"The probability sample paradigm that we have all lived by for 60, 70 years has been shown to work over and over again," said Reg Baker, president of Market Strategies International. "It has science behind it; it has a mathematical basis. ... The problem is [traditional phone surveys] cost too much and it takes too long."

Mr. Baker, who chaired a panel that examined online polling research for a 2010 report by the American Association of Public Opinion Research, noted that cost questions aside, traditional polling's eroding response rates already threaten the goal of achieving truly random samples.

"The question is, are we smart enough as an industry to invent a new paradigm for nonprobability sampling? What can we do to make it representative?"

Even the most traditional polling remains counter-intuitive to some people. Pollsters and people who write about the results are used to the skeptical questions: How can a few hundred respondents determine the views of millions? I vote all the time; how come I've never been polled?

But the statistical science behind surveys is rooted in established theory and generations of research.
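
On the first of those skeptical questions, the key fact from sampling theory is that the error of a random sample depends on the sample's size, not the population's. For a simple random sample of n respondents, the conventional 95 percent margin of error on an estimated proportion is roughly

\[
\text{MOE} \approx 1.96\,\sqrt{\frac{\hat{p}\,(1-\hat{p})}{n}} \;\leq\; \frac{0.98}{\sqrt{n}},
\]

which for n = 1,000 works out to about plus or minus 3 percentage points, whatever the size of the electorate. The harder problem is drawing a genuinely random sample in the first place.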

Some political polling is based on random samples drawn from publicly available voter lists. Perhaps more typically, according to Terry Madonna, who directs the Franklin & Marshall College Poll, survey organizations purchase lists of phone exchanges chosen to reflect the target population, then complete each number with a final four digits generated at random.

That technique is designed to give every member of the population an equal chance of being selected -- the core requirement of a statistically valid sample.
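
As a concrete illustration, a bare-bones version of that random-digit-dialing step might look like the sketch below. The exchange prefixes are invented, and a production system would sample exchanges in proportion to the number of lines they serve rather than uniformly.

```python
import random

# Invented area-code-plus-exchange prefixes, standing in for a purchased
# list drawn to reflect the target population.
EXCHANGES = ["412555", "412622", "724941", "878200"]

def random_digit_dial(n_numbers):
    """Build phone numbers by appending four random digits to a randomly
    chosen exchange, so every line in the covered exchanges has an equal
    chance of being dialed."""
    numbers = []
    for _ in range(n_numbers):
        exchange = random.choice(EXCHANGES)
        numbers.append(exchange + f"{random.randint(0, 9999):04d}")
    return numbers

print(random_digit_dial(5))
```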

Cell vs. land line

It never worked perfectly, but it functioned pretty well in a nation where nearly every household had a land line and cell phones did not yet exist. In 2003, fewer than 5 percent of Americans relied solely on cell phones, according to a 2010 study by researchers Stephen Ansolabehere of Harvard University and Brian Schaffner of the University of Massachusetts.

By 2010, more than a fifth of the population was cell phone only. And another 15 to 20 percent reported that they had land lines but made almost all of their calls on cell phones, meaning that well over a third of the population was out of land-line reach.

In a political context, if the cell and land-line folks were otherwise identical, it wouldn't make much difference. For years, scholars and polling professionals were aware of the theoretical problem posed by the migration to cell phones but didn't find much divergence in survey results.

That may have been because the voting patterns of the two groups helped to disguise the polling differences: The cell-only group is disproportionately young; the land-line group is older than the overall population; and older people show up at the polls more reliably than younger citizens.

But a Pew Research Center study last year found that small but significant differences had begun to emerge in polling results even after they were weighted to reflect the demographics of the population.

In one widely noted example, the Pew researchers found that land-line-only samples tended to underestimate support for Democratic candidates.

Perhaps the most significant difference was age, with 41 percent of the cell-only group between 18 and 29 while only 7 percent of the land-line cohort was under 30.

In addition, Pew noted that "overall, the land-line sample included more white, non-Hispanics than the cell-only group (79 percent vs. 61 percent) while minorities make up a larger share of the cell-onlys."

Recognizing those differences, many polling organizations have begun to compensate by including cell phone numbers in their surveys.

When they won't answer

But that has not cured their problems. Polling organizations have found it is tougher to get people to answer the phone, and tougher still to keep them engaged through a multi-question survey.

Pollster John Zogby, whose firm conducts phone and Internet surveys, recalled an interview early in his career interrupted by the respondent hushing her children with the admonition that she was "on a long-distance call discussing important information."

"That doesn't happen anymore," he said.

Surmounting those hurdles adds to costs. Pollsters have to spend more for cell phone number lists than for land lines. And Mr. Madonna of Franklin & Marshall College noted that the completion rates for cell phone interviews lag behind even the declining rates for land-line interviews.

One answer to this cost pressure was the rise of polls based on automated computer calling, used by firms such as SurveyUSA, Rasmussen Reports and Public Policy Polling. Many traditionalists remain wary of results gleaned by computer calls, arguing that trained interviewers provide an added layer of reliability.

Whatever their limits, some automated calling operations have matched more traditional polling firms in accuracy ratings, but they face the same increasingly resistant population.

Internet surveys

This difficult environment set the stage for Internet surveys. In the political arena, the most widely known purveyors of Internet surveys include Zogby International, YouGov/Polimetrix and Harris Interactive. Many more firms use some form of Internet surveying for commercial clients of every description. Among the most widely known is Knowledge Networks.

In Pittsburgh, one firm pressing the bounds of the new polling technology is CivicScience. [Disclosure: CivicScience has a commercial relationship with the Post-Gazette.]

Knowledge Networks, using methods rooted in traditional survey research, has recruited a panel of scores of thousands of polling subjects. While its surveys are administered over the Internet, the Knowledge Networks panelists were recruited using the random probability principles of traditional polling.

Other firms recruit their respondents in varying ways that do not attempt to mimic probability-based sampling.

Firms using these opt-in panels (so called because participants opt in to the sample group rather than being randomly selected) rely on statistical methods to balance their survey targets and findings against the demographic and other characteristics of the target population.
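
In its simplest form, that balancing is post-stratification: each respondent is weighted by the ratio of a group's share of the target population to its share of the sample. A minimal sketch follows; the shares are invented, and real weighting typically balances many characteristics at once.

```python
# Invented population and sample shares for a single characteristic.
population_share = {"18-29": 0.22, "30-49": 0.34, "50+": 0.44}
sample_share = {"18-29": 0.11, "30-49": 0.34, "50+": 0.55}

# Weight for each group = its population share / its sample share.
weights = {g: population_share[g] / sample_share[g] for g in population_share}
print(weights)  # approximately {'18-29': 2.0, '30-49': 1.0, '50+': 0.8}

def weighted_yes_share(respondents):
    """respondents: list of (group, answered_yes) pairs.
    Returns the weighted proportion answering yes."""
    total = sum(weights[g] for g, _ in respondents)
    yes = sum(weights[g] for g, answered in respondents if answered)
    return yes / total

# Underrepresented young respondents count roughly double in the estimate.
sample = [("18-29", True), ("30-49", False), ("50+", False), ("50+", True)]
print(round(weighted_yes_share(sample), 2))  # about 0.61
```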

Mr. Zogby explained that his firm uses myriad devices to recruit an overall panel of roughly 475,000 members. Some are recruited through ads on websites; other names are drawn from the mailing lists of interest groups of various political persuasions.

Once the names are gathered, Mr. Zogby invites those people to join the panel and classifies them according to demographic and other factors. Then, for an actual poll, his firm questions a random sample of the prescrubbed panel.
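
A schematic version of that two-step process -- classify the panel, then draw a demographically targeted random subsample for a given poll -- might look like the following sketch, with an invented trait and invented quotas.

```python
import random

# Invented panel: each member carries a trait recorded at classification.
panel = [
    {"id": i, "region": random.choice(["NE", "MW", "S", "W"])}
    for i in range(475_000)
]

def draw_poll_sample(panel, quotas):
    """Randomly sample panel members so the subsample's regional counts
    match the quotas, e.g. 1,000 interviews split across four regions."""
    sample = []
    for region, count in quotas.items():
        members = [m for m in panel if m["region"] == region]
        sample.extend(random.sample(members, count))
    return sample

poll = draw_poll_sample(panel, {"NE": 180, "MW": 210, "S": 370, "W": 240})
print(len(poll))  # 1000
```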

CivicScience, according to its president, John Dick, places short, three-question polls on websites, including the Post-Gazette's. From the answers, it can over time develop an increasingly extensive profile of respondents.

Help from cookies

For example, the first time a person encounters one of the firm's mini-polls, he or she might be asked a political or cultural question, then one that provides a specific demographic fact about the person answering -- his or her age or race, for example.

When that person returns to the site, an Internet cookie identifies him or her as a repeat participant, and a new demographic question is posed, so that over time an extensive though anonymous picture of the respondent is lodged in the firm's computers.

In theory, that database grows more robust over time as more profiles are added and more characteristics are attached to each one. That, in turn, allows the firm to weight its survey findings to reflect the broader population.
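
A toy version of that accumulation logic, with hypothetical field names and storage, might look like this:

```python
# Cookie-keyed profile store: cookie id -> accumulated traits.
# Field names and storage are hypothetical.
profiles = {}

DEMOGRAPHIC_QUESTIONS = ["age_bracket", "race", "gender", "income_bracket"]

def next_question(cookie_id):
    """Pick the first demographic trait not yet recorded for this
    (anonymous) repeat visitor; None means the profile is complete."""
    known = profiles.setdefault(cookie_id, {})
    for field in DEMOGRAPHIC_QUESTIONS:
        if field not in known:
            return field
    return None

def record_answer(cookie_id, field, answer):
    profiles.setdefault(cookie_id, {})[field] = answer

# Over repeated visits, the profile fills in one trait at a time.
record_answer("abc123", "age_bracket", "30-49")
print(next_question("abc123"))  # -> "race"
```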

The polling marketplace and the academic community are struggling to assess the validity of these innovative approaches with no clear consensus in sight. But practitioners aren't waiting for theoretical endorsements.

The AAPOR report chaired by Mr. Baker cited an estimate that $2 billion was spent on online research in 2009, with about 85 percent of that spending devoted to work that only a decade ago would have been performed through face-to-face or phone surveys.

A 2009 study by Stanford University researchers David Yeager and Jon A. Krosnick raised red flags about the increasingly pervasive methodology.

They found that Internet opt-in panels produced results that were consistently less reliable than probability samples. They warned that statistical weighting didn't help and in fact sometimes magnified the flaws.

But the 2010 paper by Mr. Ansolabehere and Mr. Schaffner came to a distinctly different conclusion.

After reviewing parallel surveys administered by mail, by phone with live interviewers and through the Internet, " ... we demonstrate that well-executed opt-in surveys are in fact as reliable as phone surveys," the authors assert.

They speculate that earlier findings questioning Internet surveys could be explained in part by growth and improvement in the field.

The AAPOR panel's 2010 review of online research urges caution in the use of Internet panels for estimates of overall population statistics.

But the study also states that Internet panels are appropriate for some kinds of research and urges continued study along with greater transparency by polling firms on their methods.

For all of its caution, however, the AAPOR report notes that in the realm of election polling, "studies using nonprobability panels sometimes have yielded results that are as accurate as or more accurate than some surveys using probability samples."

But the debate is certain to continue.

"I think they are fundamentally flawed," Andrew Smith, the director of the University of New Hampshire Survey Center, said of most Internet polls. "The key is a randomization. If you don't start with a random sample, you can't generalize the results."

John Dick, the CEO of CivicScience, has no problem with the observation that the firm's mini-polls stray from polling orthodoxy.

"We honestly do not care what methodology purists think so long as the information and insights we produce provide value for our clients," he said. "To the inside baseball players, we're seen as radically different from what else is out there."

But he argued that the vast number of interviews CivicScience has amassed gives it a unique ability to mine its data.

Mr. Madonna, whose state and national polls rely on cell and land-line phone interviews, said he is open-minded about the potential for Internet polling but isn't ready to abandon a proven methodology.

"I think most of us look at the Internet and see that, at some point, it's going to be the future."

But maybe not for long.

"A lot of people argue that online itself is gradually going to be replaced by mobile -- sending invitations and getting people to do surveys on smartphones ... but of course smartphones pose a whole lot of other problems," said Mr. Baker of Market Strategies International.

Mr. Zogby said he foresees Internet polling becoming more dominant but also expects greater use of social networks and text messaging as survey vehicles.

"We're in a period of creative destruction," he said, "and a period of tremendous experimentation."


Politics editor James O'Toole: jotoole@post-gazette.com.

