THE biggest winners on Election Day weren’t politicians; they were numbers folks.
Computer scientists, behavioral scientists, statisticians and everyone who works with data should be proud. They told us who was going to win, but they also helped to make many of those victories happen.
Three groups of geeks deserve the love they rarely receive: people who run political polls, those who analyze the polls and those who figure out how to help campaigns connect with voters.
Many people doubted the accuracy of political polling this year. Part of the skepticism was based on the wide range of predictions, with some showing President Obama in the lead, and others Mitt Romney. But there were additional, structural reasons to worry whether pollsters would be able to find representative samples of voters.
One problem is that people are harder to reach on the telephone these days. About a third of voters no longer have a land line, and many of those who do don’t pick up calls from strangers. So modern polling companies have to work harder to find voters willing to answer questions, and then guess which of those respondents will actually show up and vote.
So it may come as a surprise that, collectively, polling companies did quite well during this election season. Although there was a small tendency for the pollsters to overestimate Mr. Romney’s share of the vote, a simple average of the polls in swing states produced a very accurate prediction of the Electoral College outcome. Notably, the most accurate polls tended to be done via the Internet, many by companies new to this field. That’s geek victory No. 1.
This relatively accurate polling data provided the raw material for the second group of election pioneers: poll analysts like Nate Silver, who writes the FiveThirtyEight blog for The New York Times, as well as Simon Jackman at Stanford, Sam Wang at Princeton and Drew Linzer at Emory University.
What do poll analysts do? They are like the meteorologists who forecast hurricanes. Data for meteorologists comes from satellites and other tracking stations; data for the poll analysts comes from polling companies. The analysts’ job is to take the often conflicting data from the polls and explain what it all means.
Worry about the reliability of the polling data led to widespread skepticism, or even outright hostility, toward poll analysts. The phrase “garbage in, garbage out” was one of the more polite criticisms bouncing around the Internet in the days before the election.
Because the polls were not, in fact, garbage, the first job of a poll analyst was quite easy: to average the results of the various polls, weighing more reliable and recent polls more heavily and correcting for known biases. (Some polls consistently project higher voter shares for one party or the other.)
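To make that first step concrete, here is a minimal sketch of a weighted poll average in Python. The polls, reliability scores and “house effect” adjustments are invented for illustration; they are not any analyst’s actual inputs or weighting formula.

```python
from datetime import date

# A minimal sketch of the idea, not any analyst's actual model. The polls,
# weights and house-effect adjustments below are invented for illustration.
ELECTION_DAY = date(2012, 11, 6)

polls = [
    # (pollster, end date, Obama share, Romney share, reliability 0-1,
    #  house effect in points; positive means the pollster leans toward Obama)
    ("Pollster A", date(2012, 11, 4), 49.0, 47.5, 0.9, +0.5),
    ("Pollster B", date(2012, 11, 1), 47.0, 48.0, 0.7, -1.0),
    ("Pollster C", date(2012, 10, 28), 48.5, 46.5, 0.8, 0.0),
]

def weighted_margin(polls):
    """Average the Obama-minus-Romney margin, weighting recent and
    historically reliable polls more heavily and subtracting each
    pollster's known lean."""
    total_weight = 0.0
    total = 0.0
    for name, end, obama, romney, reliability, house_effect in polls:
        days_old = (ELECTION_DAY - end).days
        recency = 1.0 / (1.0 + days_old)        # newer polls count more
        weight = reliability * recency
        adjusted_margin = (obama - romney) - house_effect
        total += weight * adjusted_margin
        total_weight += weight
    return total / total_weight

print(f"Weighted Obama margin: {weighted_margin(polls):+.1f} points")
```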
A harder but more valuable task is to help readers translate the polling data into forecasts of the probability of victory. In Florida, where the final polls, weighted by Mr. Silver’s method, showed essentially a tie, it’s easy to see why he put the chance of either candidate winning the state at 50 percent. Ultimately, President Obama carried the state by a very narrow margin.
But what about North Carolina, where Mr. Silver projected that Mitt Romney would get 50.6 percent of the vote and President Obama, 48.9 percent? Looking at that very small difference, what probability would you have assigned to a Romney victory in that state?
Most people would guess something very close to 50-50. But not a good numbers guy. By looking back at previous elections with polling data this close, Mr. Silver estimated that Mr. Romney’s chances of winning North Carolina were 74 percent, a number that may seem surprisingly high. (Mr. Romney won the state.)
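Here is a rough sketch of the arithmetic behind that kind of estimate, assuming the error in a state’s final poll average is roughly normally distributed. The 2.6-point standard deviation is an illustrative guess, not a published figure; it is used here because it happens to turn a 1.7-point lead into a probability near 74 percent.

```python
from math import erf, sqrt

# A back-of-the-envelope sketch, not Mr. Silver's actual model. The 2.6-point
# standard deviation of polling error is an assumed, illustrative value.

def win_probability(margin, error_sd=2.6):
    """Probability the leader actually wins, given a polling lead of
    `margin` points and normally distributed polling error."""
    return 0.5 * (1.0 + erf(margin / (error_sd * sqrt(2.0))))

# Mr. Romney's projected 1.7-point lead in North Carolina (50.6 - 48.9)
print(f"P(Romney wins N.C.) = {win_probability(50.6 - 48.9):.0%}")
```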
The slightly larger but still seemingly tiny lead that the president held in Ohio, another swing state, led poll analysts to put the chance of an Obama victory there at around 90 percent. And because Mr. Romney would have needed to win several such states with small Obama leads in order to prevail in the Electoral College, the analysts had similarly high confidence in an overall Obama victory. Their prediction of the Electoral College outcome was almost exactly right, especially if you consider the final outcome in Florida to be a virtual tie, as they had projected.
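A crude way to see how several such state probabilities compound is to simulate the Electoral College many times, as in the sketch below. The safe-state totals and swing-state probabilities are rough, invented inputs, not the analysts’ data.

```python
import random

# An illustrative Monte Carlo sketch of the last step: turning state-by-state
# win probabilities into an overall probability. The "safe" totals and the
# swing-state probabilities below are rough, assumed inputs.
random.seed(0)

OBAMA_SAFE = 237          # electoral votes assumed safely in the Obama column
swing_states = {
    # state: (electoral votes, assumed probability Obama wins it)
    "Ohio": (18, 0.90),
    "Florida": (29, 0.50),
    "Virginia": (13, 0.80),
    "Colorado": (9, 0.80),
    "Iowa": (6, 0.85),
    "Nevada": (6, 0.90),
    "New Hampshire": (4, 0.85),
    "Wisconsin": (10, 0.95),
    "North Carolina": (15, 0.26),
}

def simulate(trials=100_000):
    """Fraction of simulated elections in which Obama reaches 270 votes,
    treating each swing state as an independent coin flip."""
    obama_wins = 0
    for _ in range(trials):
        votes = OBAMA_SAFE
        for electoral_votes, p_obama in swing_states.values():
            if random.random() < p_obama:
                votes += electoral_votes
        if votes >= 270:
            obama_wins += 1
    return obama_wins / trials

print(f"P(Obama wins the Electoral College) = {simulate():.0%}")
```

Because this sketch treats each state as an independent coin flip, it ignores the possibility that the polls were all off in the same direction and so overstates the analysts’ confidence; allowing for that kind of correlated error is part of why their stated confidence stayed around 90 percent rather than approaching certainty.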
Pundits making forecasts, some of whom had mocked the poll analysts, didn’t fare as well, and many failed miserably. George F. Will predicted that Mr. Romney would win 321 electoral votes, which turned out to be very close to President Obama’s actual total of 332. Jim Cramer from CNBC was nearly as wrong in the opposite direction, projecting that the president would win 440 electoral votes.