The 'failure' of election polling was about 3 key things

Before voting began on Election Day, nearly every major poll was predicting a Hillary Clinton win by 2-4 percentage points. When the smoke cleared Wednesday morning, Donald Trump had won.

In the wake of Trump’s surprise win, arguably the biggest fascination has been the failure of the polls. Politico asked, “How did everyone get it so wrong?” Fusion asked how it went “so, so, so wrong?” Harvard Business Review wrote that pollsters were “completely and utterly wrong.”

Yes, the polling was wrong, but the reasons are numerous and nuanced, and will take a long time to fully parse and understand. And it wasn't just the polls that went wrong, but also the media's interpretation of the polls.

1. Polls did not fully account for the Shy Trump Voter

One of the biggest theories about what the polls missed is the idea of "shy Trump voters," who didn't want to tell pollsters they were planning to vote for Trump but intended to all along.

White women, in particular, proved to be a surprise: 53% of them voted for Trump overall, led by those without a college degree, who went for Trump by a 2-1 margin. White women with a college degree went for Clinton, but only barely, by six percentage points. “There’s your shy Trump vote,” tweeted Kristen Soltis Anderson, a pollster at Echelon Insights.

Anderson later added that a bigger problem than secret Trump voters was "a phony mirage of a Clinton vote." Based on early vote tallies, Trump received fewer votes than McCain did in 2008 or Romney did in 2012 and won anyway, because too many Democrats didn't vote.

Indeed, polling also struggles to account for turnout, which early estimates put at its lowest overall level since 2000. (Latino turnout was up from 2012 and skewed toward Clinton, but not by enough to beat Trump.) Majorities of all non-white ethnic groups went for Clinton, as did millennials, but not enough of them voted.

As Harvard Business Review points out, “People tend to say they’re going to vote even when they won’t… the failure of a complex likely voter model is why Gallup got out of the election forecasting business.”

2. Polling methods need to change

As much as big data (and the technology to sift through it) has advanced, our methods of gathering data are still dated. Most of the national polls are still done by landline telephone. And that has been a problem for over a decade now.

In 2003, Gallup wrote a post about falling response rates in polls. If you start with a target sample of 1,000 households, Gallup wrote, at least 200 fall out because they are businesses or non-working numbers. Of the 800 left, another 200 "may be unreachable in the time frame allocated by the researcher… household members at these numbers may use caller ID or other screening devices and refuse to answer." Now you're down to 600, and of those, another 200 may pick up the phone but refuse to participate. Suddenly, the sample has shrunk from 1,000 households to a mere 400. Declining to pick up the phone, or declining to participate once reached, may have been a particular problem in this election's polling.
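The shrinking-sample arithmetic Gallup describes can be sketched in a few lines. The stage labels and the 200-household losses come from Gallup's illustration; the margin-of-error calculation is the standard formula for a proportion at 95% confidence, and the function name `margin_of_error` is just for this sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Gallup's illustrative attrition stages for a target sample of 1,000
stages = [
    ("businesses / non-working numbers", 200),
    ("unreachable or call-screened households", 200),
    ("answered but refused to participate", 200),
]

target = 1000
n = target
for reason, lost in stages:
    n -= lost
    print(f"after {reason}: {n} households remain")

# The survey's uncertainty grows as the sample shrinks
print(f"margin of error at n={target}: +/-{margin_of_error(target):.1%}")
print(f"margin of error at n={n}: +/-{margin_of_error(n):.1%}")
```

The point of the comparison: the attrition doesn't just shrink the sample from 1,000 to 400, it widens the margin of error from roughly ±3 points to roughly ±5, and that is before accounting for whether the 400 who do respond still resemble the electorate.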