What Pollsters Have Changed Since 2016 — And What Still Worries Them About 2020

If you ask Americans whether they trust the polls, many seem unable to let go of what happened in 2016. Polls taken since then have generally found that a majority of Americans have at least some doubts about what polls say. But as FiveThirtyEight wrote in the run-up to the 2016 election, Donald Trump was always a normal polling error behind Hillary Clinton.

And that’s essentially what happened in 2016: Trump beat his polls by just a few points in just a few states. The presidential polls simply weren’t that far off. State-level polling was less accurate, although as editor-in-chief Nate Silver wrote after the election, it was “still within the ‘normal’ range of accuracy.”

That doesn’t mean there weren’t plenty of polling lessons to be gleaned from 2016, though. The importance of education in predicting a person’s political preferences was a big one. And so to better understand those takeaways, we contacted 21 well-known pollsters to find out how they adjusted their methodologies, if at all, and what concerns them most about polling in 2020. In the end, 15 got back to us — a 71 percent response rate that pollsters only dream of in this day and age.1 Here’s what they had to say.

More pollsters are weighting by education and using new ways to reach respondents

Nearly every pollster we talked to has made some kind of modification since the last general election. Some changes were precipitated by what happened in 2016, while others were driven by the challenges facing the polling industry, such as low response rates to phone calls and the greater cost of high-quality polling.

But one thing came up again and again in our interviews: Pollsters told us they were now weighting their samples by education, because one key takeaway from 2016 was just how important someone’s level of educational attainment was in predicting their vote. “In mid-2016, we changed our weights by education, moving the percentage of high school or less respondents up while dropping the college-plus down,” said Jeff Horwitt, a senior vice president at Hart Research, one of the pollsters for the NBC News/Wall Street Journal poll. It was the middle of the election cycle, but already Horwitt and his team were concerned that they might be underestimating the share of the electorate who didn’t have a four-year college degree, and therefore, missing some of Trump’s support. They were right to be concerned, too. A real problem for the polling industry writ large was the underrepresentation of voters with little or no college education.

Some pollsters, such as Ipsos and the Pew Research Center, have taken weighting by education a step further by weighting for educational attainment within racial groups. This change could be especially important in state-level polling in 2020, as Trump primarily outperformed polls in states that had large populations of white Americans who didn’t have a bachelor’s degree. “This year we are ensuring that we include the combination of education and race/ethnicity in our sampling,” said Cliff Young, president of U.S. public affairs at Ipsos.
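
For readers curious about the mechanics, here is a minimal sketch of cell weighting of this kind, with interlocked race-by-education cells like the ones Ipsos and Pew describe. All of the category names and target shares below are hypothetical, and this is not any pollster’s actual code; in practice, targets are derived from sources such as Census Bureau surveys.

```python
# Minimal sketch of cell weighting: each respondent gets a weight equal to
# the population share of their cell divided by the sample share of that
# cell. All target shares here are hypothetical, for illustration only.
from collections import Counter

respondents = [
    {"race": "white", "educ": "no_degree"},
    {"race": "white", "educ": "college_plus"},
    {"race": "black", "educ": "no_degree"},
    # ... more respondents
]

# Hypothetical population targets for interlocked race x education cells.
targets = {
    ("white", "no_degree"): 0.40,
    ("white", "college_plus"): 0.25,
    ("black", "no_degree"): 0.08,
    # ... remaining cells, summing to 1.0
}

counts = Counter((r["race"], r["educ"]) for r in respondents)
n = len(respondents)

for r in respondents:
    cell = (r["race"], r["educ"])
    sample_share = counts[cell] / n
    # Cells underrepresented in the sample (e.g., whites without a college
    # degree in 2016) get weights above 1; overrepresented cells get less.
    r["weight"] = targets[cell] / sample_share

# Weighted share of respondents without a four-year degree, for illustration:
no_degree = sum(r["weight"] for r in respondents if r["educ"] == "no_degree")
print(no_degree / sum(r["weight"] for r in respondents))
```

A weighted estimate of, say, a candidate’s support is then the sum of the weights of that candidate’s supporters divided by the sum of all weights, so responses from underrepresented cells count for more.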

Monmouth University Polling Institute director Patrick Murray cautioned, however, that weighting by education isn’t a silver bullet. In a postmortem analysis of Monmouth’s own 2016 polling, he noted, weighting by education had only “a small impact on accuracy and on its own [did] not explain the supposed polling miss in 2016.” Still, weighting by education was, by far, the most common methodology change pollsters reported.

As for other changes since 2016, Marist College Institute for Public Opinion director Lee Miringoff told us they’re paying closer attention to where their respondents live — that is, are they mostly concentrated in a city or outside a metropolitan area? “This has resulted in not only a better balance of geography,” said Miringoff, “but has solved the need to weight by education in most cases.” Given that population density is strongly correlated with how Democratic or Republican an area is (the denser a place, the more Democratic it tends to be), this is an important consideration for pollsters as well.

NBC News/Wall Street Journal polls are even weighted by the share of respondents from urban, suburban and rural areas. “This helps to make sure we are fully representing rural Americans,” said Horwitt, adding that it also “removes another factor which can contribute to poll-to-poll variation.”
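
When a pollster wants the sample to match several targets at once, such as an education mix and the urban/suburban/rural split, a standard tool is raking (iterative proportional fitting), which matches each dimension’s margins without requiring the full joint distribution. The sketch below is illustrative only, with hypothetical targets; it is not the NBC News/Wall Street Journal poll’s actual procedure.

```python
# Minimal raking (iterative proportional fitting) sketch: repeatedly rescale
# weights so the sample matches marginal targets on each dimension in turn.
# Every target share below is hypothetical.
from collections import defaultdict

respondents = [
    {"educ": "no_degree", "area": "urban", "weight": 1.0},
    {"educ": "no_degree", "area": "suburban", "weight": 1.0},
    {"educ": "no_degree", "area": "rural", "weight": 1.0},
    {"educ": "college_plus", "area": "urban", "weight": 1.0},
    {"educ": "college_plus", "area": "suburban", "weight": 1.0},
    {"educ": "college_plus", "area": "rural", "weight": 1.0},
]

marginal_targets = {
    "educ": {"no_degree": 0.6, "college_plus": 0.4},
    "area": {"urban": 0.3, "suburban": 0.4, "rural": 0.3},
}

for _ in range(50):  # a fixed iteration count is enough for a sketch
    for dim, targets in marginal_targets.items():
        total = sum(r["weight"] for r in respondents)
        shares = defaultdict(float)
        for r in respondents:
            shares[r[dim]] += r["weight"] / total
        for r in respondents:
            r["weight"] *= targets[r[dim]] / shares[r[dim]]

# After raking, the weighted sample matches both sets of margins at once.
```

Once the iterations converge, the weighted education margins and the weighted geography margins both hit their targets, which is what lets a poll “fully represent rural Americans” while simultaneously holding the education mix in place.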

A number of pollsters have also changed the way they recruit respondents to make sure they are reaching every pocket of the population. Courtney Kennedy, Pew Research Center’s director of survey research, explained that Pew has moved away from live phone polls that use random-digit dialing and toward an address-based approach, in which Pew first gets in touch with respondents by snail mail to recruit them. Horwitt also told us that NBC News/Wall Street Journal no longer uses random-digit dialing; instead, they draw their samples from lists of registered voters, which allows them to “calibrate the mix of respondents between Republicans, independents and Democrats on each survey.”
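
As a rough illustration of that registered-voter-list approach, the sketch below draws a phone sample from a voter file while fixing the partisan mix. The file fields and the party targets are invented for the example; they are not the actual NBC/WSJ numbers or procedure.

```python
# Minimal sketch of sampling from a registered-voter file with a calibrated
# party mix. The voter file and the target mix here are hypothetical.
import random

random.seed(2020)

# Hypothetical voter-file records: a phone number and a party registration.
voter_file = [
    {"phone": f"555-{i:04d}", "party": party}
    for i, party in enumerate(["R", "D", "I"] * 100)
]

sample_size = 60
party_mix = {"R": 0.30, "D": 0.33, "I": 0.37}  # hypothetical targets

sample = []
for party, share in party_mix.items():
    pool = [v for v in voter_file if v["party"] == party]
    sample.extend(random.sample(pool, round(sample_size * share)))

# `sample` now holds 60 records whose partisan mix matches the targets,
# which damps poll-to-poll swings driven by who happened to pick up.
```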

Pollsters that reach respondents by phone are also relying more on cellphones. Some older voters still primarily use a landline, but 96 percent of Americans reported owning a cellphone in 2019, according to the Pew Research Center. So some pollsters, like Suffolk University, are upping the share of people they reach by cellphone. Suffolk’s Political Research Center director David Paleologos told us, for example, that they have increased the share of their samples contacted by cellphone from 80 percent to 88 percent.

However, calling more cellphones isn’t without its downsides. For starters, it’s expensive, as federal law prevents pollsters from using auto-dialing technology to reach mobile devices. So some pollsters like Cygnal, Public Policy Polling, Emerson College and SurveyUSA also use SMS texting to reach respondents. “We find that younger people, but also men and people in urban areas, would much rather answer a poll by text than be called on the phone,” said Tom Jensen, PPP’s director.

In fact, many pollsters are using a combination of approaches to reach the widest slice of voters. “We now operate, on any given day, with four different methodologies in use,” explained SurveyUSA CEO Jay Leve: live telephone calls, automated phone calls, online surveys with pre-recruited respondents and text messaging. The rise of online polling is also part of this story. Spencer Kimball, Emerson College’s polling director, said they have moved away from just using automated phone calls and now use a mixed approach with an online panel component.

Will 2020 polls reveal a new problem?

We also asked pollsters what, if anything, they were still worried about in 2020, regarding either their own polls or the polling industry writ large. Interestingly — but perhaps unsurprisingly, given all the work they’ve put into avoiding the errors of 2016 — only one pollster, Gravis Marketing’s president, Doug Kaplan, told us he is worried about missing “the so-called hidden Trump vote.”

In fact, Marist’s Miringoff is worried about the opposite: “I’m concerned that the industry may be fighting the last war.” To Miringoff, the obsession with weighting polls by education has obscured other underlying problems, such as a heavier reliance on listed telephone numbers or online methods rather than random-digit dialing, the traditional approach to polling, which Miringoff and many other established pollsters believe produces a more representative sample.

Two other pollsters had a different view, though. “What I worry about as a whole for the polling industry is this continued belief that live phone polls are the gold standard,” Cygnal’s CEO, Brent Buchanan, told us. SurveyUSA’s Leve agreed. “I make no case that the online research studies that SurveyUSA conducts are superior to those conducted by a different methodology,” he said. “But I do argue that they are not inherently inferior.” (FiveThirtyEight’s own research has found that, while live-caller polls face undeniable challenges, they remain more accurate than online polls.)

The most common worries for 2020 polling, though, stemmed from the pandemic. Several pollsters said they worried about getting turnout wrong. “This is a perennial difficulty for pollsters and survey researchers, which the pandemic is making even thornier,” CBS News’s Kabir Khanna explained. Quinnipiac University Poll director Doug Schwartz offered an example: “With the coronavirus, there may be voters who tell pollsters that they’re voting but then their area experiences a spike in cases around Election Day, and they no longer feel safe going to the polls.” Emerson’s Kimball and Morning Consult chief research officer Kyle Dropp both pointed out that voters’ increased access to mail voting makes turnout extra unpredictable. And since polls are only as good as their turnout model, this could lead to some polling misses this fall. (In fact, FiveThirtyEight’s model even built in an extra layer of uncertainty this year because of the possibility that the pandemic will disrupt usual turnout patterns.)
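
To see why the turnout model matters so much, consider a minimal sketch, with invented numbers, of how a probabilistic likely-voter model feeds into a topline: each respondent’s stated choice is weighted by an estimated probability of voting, so any pandemic-driven shift in those probabilities moves the horse-race number directly.

```python
# Minimal sketch of a probabilistic turnout model: weight each respondent's
# stated vote choice by an estimated chance of actually voting. All numbers
# are invented for illustration.
respondents = [
    {"choice": "A", "p_vote": 0.9},
    {"choice": "B", "p_vote": 0.5},
    {"choice": "A", "p_vote": 0.4},
    {"choice": "B", "p_vote": 0.8},
]

def support(candidate, people):
    """Turnout-weighted share of the expected electorate backing a candidate."""
    total = sum(r["p_vote"] for r in people)
    return sum(r["p_vote"] for r in people if r["choice"] == candidate) / total

print(round(support("A", respondents), 3))  # 0.5 with the numbers above

# If a COVID spike cut the second respondent's p_vote from 0.5 to 0.1,
# candidate A's estimated share would rise, with no one changing their mind.
```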

Similarly, multiple pollsters expressed concern that, as Pew’s Kennedy put it, “2020 polls might ultimately look ‘off’ not because they were, but because something went awry with the counting of the votes.” (Pollsters aren’t alone in fearing this; according to a recent YouGov/The Economist survey, only 13 percent of registered voters have a great deal of confidence that the election will be held fairly; 18 percent have quite a bit of confidence, and 34 percent have a moderate amount of confidence.)

Monmouth’s Murray made the point that plenty of factors are outside of pollsters’ control. “Our polls might be accurate in terms of how the election would turn out if everyone who actually votes has their vote counted,” he explained. “But what happens if a large number of mail ballots are rejected, or polling places are closed or lines are so long that voters go home?” Quinnipiac’s Schwartz and PPP’s Jensen had similar concerns, although they were cautiously optimistic that the effect would be minimal. “These kinds of things haven’t had a major impact on polls in the past,” said Schwartz. “But with so many possible factors, and so many states so close, it’s possible that the accuracy of polls may be impacted.”

But plenty of pollsters, including Jensen and Gary Langer of Langer Research, who conducts some polls for ABC News/Washington Post, also said that this year’s polling wasn’t really keeping them up at night. “I feel pretty good about the polling in 2020 largely because the polling was so accurate in 2018, and I believe we are still fundamentally in the same political climate that we were then,” Jensen said. “The low level of undecideds in the presidential race also greatly reduces the chances of a dramatic late shift in the numbers.”

Suffolk’s Paleologos pointed to the fact that national polls were actually quite accurate in 2016 and that the states with the biggest polling errors were not polled by high-quality pollsters in the final week of the campaign; if they had been, perhaps we’d have seen Trump’s win coming. By contrast, “there is more polling in battleground states this year, and that is especially true for some of the Upper Midwest states that proved decisive in 2016,” said Pew’s Kennedy. “There has also been an uptick in the volume of state polling done by major, reputable polling organizations that use more rigorous methods.”

So, perhaps, after four years of hand-wringing, the polls will show they were all right after all.

CORRECTION (Oct. 14, 2020, 2:20 p.m.): A previous version of this article described Langer Research as the primary pollster for ABC News/Washington Post. The Washington Post has its own polling staff, so we have updated the article to better reflect Langer Research’s relationship with the Post.

Footnotes

  1. No, we didn’t commission a scientific poll, but we did ask each pollster the same set of questions.

Geoffrey Skelley is a senior elections analyst at FiveThirtyEight.

Nathaniel Rakich is a senior editor and senior elections analyst at FiveThirtyEight.
