Between the Reddit AMAs, the donation apps, the polls and the failed pundits, the 2012 presidential election was one built and consumed by all things digital and data. The headline, as you’ve no doubt read countless times this week, is that technology won this election. So how did this happen, to what degree is it true, and where do we go from here?
Long live Nate Silver, king of math
Obama may have won his way to another four years, but the real winner Tuesday night was Nate Silver. A little background on Silver: he’s a big stats guy. Silver developed PECOTA, the Player Empirical Comparison and Optimization Test Algorithm, a system for forecasting the performance of Major League Baseball players.
He took that same mindset and applied it to predicting elections, blogging anonymously at FiveThirtyEight (now licensed and published under The New York Times umbrella). In 2008, his presidential election predictions were off by only one state.
“The pundits – on both sides of the aisle – claimed the race was going to be decided by razor-thin margins,” ElectNext CEO and founder Keya Dannenbaum tells me. “Nate Silver predicted, by Election Day, a 90 percent chance of an Obama victory. The pundits pounced, claiming that such a margin was impossible and deriding Silver’s credibility, and him personally, in the process.” But clearly, Silver had the right idea.
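To see why a forecast can say “90 percent chance” rather than “razor thin,” it helps to remember that a probabilistic model simulates the election many times. The sketch below is a toy Monte Carlo illustration of the idea, not Silver’s actual model: the state win probabilities, electoral vote counts, and the 237 “safe” votes are invented numbers for demonstration only.

```python
import random

# Hypothetical swing-state win probabilities and electoral votes --
# illustrative numbers only, not FiveThirtyEight's 2012 inputs.
states = {
    "Ohio": (0.75, 18),
    "Florida": (0.50, 29),
    "Virginia": (0.60, 13),
    "Colorado": (0.65, 9),
}
SAFE_VOTES = 237      # electoral votes assumed already locked up (made up)
VOTES_TO_WIN = 270

def simulate(n_trials=10_000, seed=538):
    """Estimate the overall win probability by simulating many elections."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_trials):
        total = SAFE_VOTES
        for prob, votes in states.values():
            if rng.random() < prob:   # candidate carries this state
                total += votes
        if total >= VOTES_TO_WIN:
            wins += 1
    return wins / n_trials

print(f"Estimated win probability: {simulate():.1%}")
```

The point of the exercise: even when every individual state is close, the distribution of simulated outcomes can lean heavily to one side, which is how a “close” race and a lopsided win probability coexist.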
And now he’s risen to new levels of fame thanks to his perfect prediction of the 2012 election: he correctly called all 50 states. It’s likely the praise being heaped his way is so heavy because pundit-sourced “news” has reached unforeseen, horrifying levels.
In response to this unprecedented level of reporting bias, it appears we’re swinging hard in the other direction. Silver’s flawless analysis has been heralded as a win for big data and a turn in how we measure public opinion and trends. Relying on an expert’s two cents is no longer necessary: there are so many signals we can use instead to find these answers. It’s math instead of manipulation, if you want to look at it from a 10,000-foot view.
Still, it’s not as if Silver’s – and data’s – victory means punditry and old media are officially castoffs. “Similarly in politics and political analysis, it is not the case that one side will trump the other and that punditry will die,” says Dannenbaum. “Numbers need stories just as stories need numbers. Punditry is not dead, but the quants are here to stay.”
Twitter didn’t break
While Facebook is still arguably the place people turn to before and after events, Twitter solidified itself Tuesday night as the go-to for all things real-time. And better yet: it didn’t break.
“[During election night] Twitter averaged about 9,965 [Tweets per second, TPS] from 8:11pm to 9:11pm PT, with a one-second peak of 15,107 TPS at 8:20pm PT and a one-minute peak of 874,560 TPM,” Twitter announced, via its Engineering Blog. “Seeing a sustained peak over the course of an entire event is a change from the way people have previously turned to Twitter during live events.”
“Over time, we have been working to build an infrastructure that can withstand an ever-increasing load,” Twitter explains. “For example, we’ve been steadily optimizing the Ruby runtime. And, as part of our ongoing migration away from Ruby, we’ve reconfigured the service so traffic from our mobile clients hits the Java Virtual Machine stack, avoiding the Ruby stack altogether.”
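Twitter’s TPS and TPM figures are just peak counts over one-second and one-minute windows. As a hedged illustration (not Twitter’s actual monitoring code), here is how such peaks can be computed from a stream of tweet timestamps by bucketing them:

```python
from collections import Counter

def peak_rates(timestamps):
    """Given tweet timestamps in seconds, return the one-second and
    one-minute peak counts -- the same style of metric Twitter reported
    as TPS and TPM."""
    per_second = Counter(int(t) for t in timestamps)
    per_minute = Counter(int(t) // 60 for t in timestamps)
    return max(per_second.values()), max(per_minute.values())

# Tiny invented stream: a burst of five tweets inside one second,
# then a trickle a minute later.
ts = [0.1, 0.2, 0.5, 0.7, 0.9, 1.5, 61.0, 61.2]
tps, tpm = peak_rates(ts)
print(tps, tpm)  # prints 5 6
```

The interesting detail in Twitter’s post is the distinction between a momentary spike and a sustained peak: a burst can be absorbed by buffering, but an hour-long plateau near peak rate is what actually stresses the infrastructure.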
And Twitter didn’t stutter. We’ve become relatively used to the Fail Whale, especially during national events; it was expected. Hitting capacity and going down has plagued the network, partly because for much of its early days Twitter was both buggy and flooded with users, a combination that resulted in downtime.
Improving the service’s reliability has been a big priority at Twitter HQ, for a couple of reasons. For starters, users are more and more dependent on it. During the Arab Spring, Twitter earned a reputation as a real-time news wire, a network for connecting in disconnected (and harrowing) situations. Secondly, it’s now an advertising platform, and in order to convince marketers and brands to shell out for its ad products, it needs to stay up.
The Obama campaign’s love affair with datasets and digital outreach – and where Romney went wrong
While Silver has inarguably stolen the spotlight (please, I urge you to check out these hilarious Nate Silver memes. Oh Internet, never stop being you.), the Obama campaign’s dedication to data and digital outreach deserves serious commendation.
We saw this start in 2008, when the team turned to social networks to connect and motivate voters, and 2012 was no exception. Jake Levy-Pollans, the Digital Director for the Minnesota chapter of Obama for America, says online outreach only became more important. “The number of people on Twitter today is roughly the number of people on Facebook in 2008, so it was clearly important to us.”
He also mentions that Facebook is dominated by 35-year-old women; they are the network’s fastest-growing demographic. And according to a recent Twitter study, the average user is a 27-year-old woman. “It’s been well-reported that this age bracket was an important voter demographic this election.” It’s just one example of the type of data the campaign analyzed and applied in its efforts to win the White House.
“Campaigns had an unprecedented ability to know everything about voters – from social data to commercial data to public data to their own internal records from past campaigns, especially in Obama’s case – and the ones who used this data to build prediction models were the most successful in fundraising, voter persuasion, and turnout, which are the three essential elements of any campaign,” Dannenbaum says.

A “data-miner” and ad expert with the Obama campaign backs this up. In a Reddit AMA, a user by the handle slobmarley who worked with the team says they were able to read voters the way Google or Facebook reads its users. “They know everything Google and Facebook know about you, pretty much,” he writes. “They know what music you like, which Harry Potter book is your favorite, your voting habits, etc. It’s all in databases, you’re just a number in a database with a name attached.” While that might sound cold and calculated, he adds that the motivation was to do whatever it took to get voters to the polls.
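The “prediction models” Dannenbaum describes reduce, at their simplest, to scoring each voter record and ranking people for outreach. The following is a minimal sketch of that idea, with entirely made-up voters and hand-picked weights standing in for a fitted model; it is not the campaign’s actual system.

```python
import math

# Toy voter records -- every name, age, and vote-history flag here is
# invented for illustration.
voters = [
    {"name": "A", "age": 27, "voted_2008": 1, "voted_2010": 0},
    {"name": "B", "age": 63, "voted_2008": 1, "voted_2010": 1},
    {"name": "C", "age": 35, "voted_2008": 0, "voted_2010": 0},
]

# Hand-picked illustrative weights standing in for a trained model.
WEIGHTS = {"age": 0.02, "voted_2008": 1.5, "voted_2010": 2.0}
BIAS = -2.0

def turnout_score(v):
    """Logistic-style score: squash a weighted sum into a 0..1 probability."""
    z = BIAS + sum(WEIGHTS[k] * v[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# A field office would sort by score to prioritize contacts.
ranked = sorted(voters, key=turnout_score, reverse=True)
for v in ranked:
    print(v["name"], round(turnout_score(v), 2))
```

In practice a campaign would fit weights like these from past election data and feed in far richer features (the commercial, social, and public data mentioned above), but the workflow, score then rank then contact, is the same shape.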
So why was the Obama campaign so much more effective at using data and digital means to win the election? Didn’t Team Romney have all the same tools and information at its disposal? “From a 10,000 foot point of view, I can tell you that they were doing things we did in 2008, but weren’t engaging with people at the local level,” says Levy-Pollans. For instance, he mentions that Romney’s campaign largely relied on national, not local, social media accounts.
Romney campaigner John Ekdahl also spilled on the Republican candidate’s missed opportunities. The campaign’s effort to connect volunteers and operatives via smartphone was called Project Orca, and apparently, it was a bust. “The entire purpose of this project was to digitize the decades-old practice of strike lists. The old way was to sit with your paper and mark off people that have voted and every hour or so, someone from the campaign would come get your list and take it back to local headquarters,” Ekdahl writes on his blog. “Then, they’d begin contacting people that hadn’t voted yet and encourage them to head to the polls. It’s worked for years.”
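The strike-list process Ekdahl describes is conceptually simple, which is part of what makes Orca’s failure striking. A minimal sketch of the digitized version, with invented names and numbers, looks like this:

```python
# A toy digitized "strike list": mark voters off as they vote, then pull
# the remaining names for get-out-the-vote calls. All data is invented.
strike_list = {
    "Alice Smith": "555-0101",
    "Bob Jones": "555-0102",
    "Carol White": "555-0103",
}
voted = set()

def mark_voted(name):
    """A poll watcher strikes a voter off the list."""
    if name in strike_list:
        voted.add(name)

def contact_queue():
    """Voters still to be called -- the hourly pickup in the old paper flow."""
    return {n: p for n, p in strike_list.items() if n not in voted}

mark_voted("Bob Jones")
print(contact_queue())  # Alice and Carol remain to be contacted
```

The hard part was never this logic; it was the operational load Ekdahl goes on to describe: thousands of volunteers hitting the same system at once on Election Day, with no stress testing or redundancy behind it.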
“Working primarily as a web developer, I had some serious questions. Things like ‘Has this been stress tested?’, ‘Is there redundancy in place?’ and ‘What steps have been taken to combat a coordinated DDOS attack or the like?’, among others. These types of questions were brushed aside (truth be told, they never took one of my questions). They assured us that the system had been relentlessly tested and would be a tremendous success.”
Suffice it to say, the system’s inadequacies led to its demise and, as Ekdahl believes, to Romney’s loss.
“They underestimated the scale of the project of replacing a working structural model,” Ekdahl tells me. “Someone from Missouri went to their local GOP victory center, and no one there had any knowledge of the project. There was a serious lack of communication.”
Where do we go from here?
Over the past week, much ado has been made about the fact that elections, campaigning, and political discourse have been changed forever. And in part, they have been: no interested candidate is going to see this campaign and not want to replicate what the Obama team was able to do by taking the mountains of information the Internet holds and turning it into deliverables.
“Everyone will jump on the data train,” says Dannenbaum. “Much like Obama pioneered campaigning on social media and now all politicians are there, so too it will be with big data.”
There is cause for concern as well. Just because politicians can so aptly pinpoint us doesn’t necessarily mean they should, and an explosion of micro-targeting could lead to campaigns promising too many things to too many specific demographics in order to get the vote. “We already live in political silos, and no one thinks this is a good thing for governance,” Dannenbaum explains. “So if, after this election, the big takeaway is that big data wins, I worry that we may all lose.”
For all of the important things we learned about politics’ relationship to data and technology in this election, it’s also important to note that this upheaval coincides with the fact that we as users and voters have more information on, and access to, our candidates than ever. It’s a two-way street: they don’t just have the ability to know everything about us; we can know everything about them. This is only going to become increasingly true – the odds that in 20 or 30 years we’ll have a president who had a Facebook account in college are pretty damn good.
With great power – and data – comes great responsibility, and from now on, we’re going to see both of them (hopefully) applied to future elections. It’s a game changer, for sure, and fingers crossed it’s a good one.