
Raw Data Podcast, Episode Five: Life, Liberty, and the Pursuit of Data

When you ask a relative or friend about their political leanings, you know that what you're hearing is filtered through their own interests: Grandma and Grandpa care about preserving Social Security; Annie works at a hospital and wants more funding for Medicare. That association is preserved when you see someone post a piece of political news on Facebook, or mention that they voted. Facebook has found that users who see a friend's "I Voted" sticker in their timeline are more likely to vote themselves: unsurprising social reinforcement, and generally a net positive for believers in democracy.

Campaigners and politicians have embraced the way our social ties can encourage us to vote, from get-out-the-vote efforts that encourage driving neighbors to the polls, to "talk to your grandparents about Obama" viral videos suggesting that more familial discussion of politics could have a positive effect. Campaigns have gone heavily digital, looking at contributors' Facebook friends and tapping into their social networks, going well beyond the gentle exhortations of decades past to "invite your friends to the pancake breakfast." As election researcher Nate Persily points out, TV isn't the only screen anymore, and political campaigns are looking at three others (mobile, tablet, and laptop) to target potential supporters with ads and messages.

Political ads are moving to the internet, where the FCC is slowly following

Facebook and other digital platforms have developed their own rules around advertising: how many ads of one type should we show a single user? How do we vary ads and avoid targeting individuals so specifically that they get spooked? For the most part, these platforms apply the same rules to political advertising; an ad for Hillary is treated the same as an ad for Hilltop Steakhouse. That's not how the government views political advertising, though, and television and radio broadcasters have been held to the higher standard of offering equal time to opposing sides. If ABC runs an ad for "Yes on 1" and the campaign for "No on 1" wants a similar ad in a similar timeslot, ABC has to provide it. That's why Saturday Night Live could run into trouble for having Hillary Clinton and Donald Trump on the show: other candidates may merit equal air time if they are campaigning in a state where Trump has a substantial showing. Hillary's three minutes and twelve seconds of airtime were also reported to NBC affiliates in case any other Democratic candidate wanted to claim equal time. The rule doesn't apply to news shows, including news interview shows, but it does apply to advertising and entertainment, leaving the door open for complaints against web platforms that don't provide advertising opportunities equally.

Search results have huge effects on our perceptions, though we think of them as unbiased

The elephant in the room here isn't a Republican on SNL, though: it's Google. While many may not think of Google's search services as a "platform," Google provides vital information about political candidates and issues, and relies on its status as a trusted purveyor of information to do so. Many young people are unable to distinguish which search results are ads, and bias in search result rankings can be equally hard to detect; as you hear in this episode from researcher Robert Epstein, many people don't read below the top three search results. This leaves searchers vulnerable to list effects: we're more likely to remember the items at the top of a list, and to some extent those at the bottom. Practitioners of SEO (search engine optimization) know that positions in rankings translate to financial gains and losses. The World Wrestling Federation became the WWE in part to stop misdirection to the World Wildlife Fund when people typed WWF. The same is true for candidates: in one of Epstein's experiments, participants used a Google-like search service to research candidates with whom they were unfamiliar, and candidates whose positive search results ranked higher engendered much more favorable impressions than those whose positive results sat lower on the page.

Google maintains that its algorithm is unbiased: it's not going to show a swing voter a technology-friendly candidate's results first, for example. But it would be possible for that algorithm to become biased, and even to display biased results only to users in contested districts. That kind of manipulation would be very hard to detect, and it could have a huge effect on election results. Expect to hear more from the FCC about online political advertisements and search results as regulation catches up to our reading, viewing, and searching habits.

Read More:

Nate Persily at Stanford

Robert Epstein's research (links to all of his papers starting on p.14)

Ian Morris at Stanford
