TECHNOLOGY

Is Facebook a ‘con’ or not?

In trying to downplay the impact of Russian-planted divisive political posts on its platform, Facebook prompts questions about its business model

Image courtesy Facebook.

In late October, Facebook (along with its platform contemporaries, Twitter and Google) sent a representative to testify before the U.S. Senate Judiciary Subcommittee on Crime and Terrorism. The topic at hand was the possible Russian influence during the 2016 U.S. presidential election via misinformation spread online. For Facebook, that meant fraudulent posts disseminated on its platform. And, in describing the impact of that misinformation, Facebook may have unwittingly forced a serious question about its entire business model.

MORE: The walls are closing in on Facebook

In its written statement presented to the committee, Facebook added context to its earlier revelations that a Russian troll farm, the Internet Research Agency, had spent $100,000 on roughly 3,000 ads that focused on “amplifying divisive social and political messages across the ideological spectrum — touching on topics from LGBT matters to race issues to immigration to gun rights.” According to CNN, Facebook’s testimony this week clarified that “29 million people were served content directly from the Internet Research Agency, and that after sharing among users is accounted for, a total of ‘approximately 126 million people’ may have seen it.”

However, Facebook also sought to downplay the implications of that figure, which roughly equals the total number of people who voted in the 2016 election.

“This equals about four-thousandths of one percent (0.004%) of content in News Feed, or approximately 1 out of 23,000 pieces of content,” Facebook said in its statement. “Put another way, if each of these posts were a commercial on television, you’d have to watch more than 600 hours of television to see something from the IRA.”
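As a quick back-of-envelope check (our own arithmetic, not part of Facebook's statement), the two figures the company cites are at least internally consistent:

\[
\frac{1}{23{,}000} \approx 4.3 \times 10^{-5} \approx 0.004\%,
\]

that is, one IRA-linked item per roughly 23,000 pieces of News Feed content is the same claim, restated, as four-thousandths of one percent.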

RELATED: Google uncovers ads placed by Russian operatives, says report

There are at least two problems with framing the debate this way.

First, as any casual user knows, the algorithms that control what appears in Facebook’s News Feed (the stream of posts that greets us when we open the app or sign in) are at least partially geared toward prioritizing content based on what we’ve previously seen, clicked on, or shared. In other words, the issue is probably less how many people in total may have seen material generated or paid for by foreign influencers than how many users were exposed repeatedly to similar content once they’d initially engaged with a fraudulent post.
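To make that feedback loop concrete, here is a toy sketch in Python of engagement-weighted feed ranking. It is purely illustrative: the rank_feed function and the topic labels are invented for this example, and this is not Facebook’s actual News Feed code, whose real signals and weights are not public.

from collections import Counter

def rank_feed(candidate_posts, engagement_history):
    # candidate_posts: list of (post_id, topic) pairs eligible for the feed.
    # engagement_history: topics of posts the user previously clicked, liked, or shared.
    topic_affinity = Counter(engagement_history)  # more past engagement -> higher weight
    # Sort posts so that topics the user has engaged with before float to the top.
    return sorted(candidate_posts, key=lambda post: topic_affinity[post[1]], reverse=True)

# A user who has engaged twice with divisive "immigration" posts sees similar
# material ranked first on the next refresh, ahead of everything else.
print(rank_feed(
    candidate_posts=[("p1", "sports"), ("p2", "immigration"), ("p3", "weather")],
    engagement_history=["immigration", "immigration", "sports"],
))
# [('p2', 'immigration'), ('p1', 'sports'), ('p3', 'weather')]

Even in this crude form, the point at issue is visible: a single initial engagement with a fraudulent post is enough to tilt what that user is shown next.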

That is to say, it’s the repetition for each user that counts, not the overall share of Russian posts across the entire platform. Further, it likely matters where those users lived: in swing states, for instance.

The second problematic facet of Facebook’s defence—that few people overall may have actually seen the Russian troll farm’s content—is what it implies about Facebook’s impact on its users. Facebook boasts that, by using its platform and targeting people based on its data — which is, by all accounts, quite rich — advertisers and political parties can deliver their messages effectively. Yet squaring that claim with Facebook’s suggestion that the Internet Research Agency’s posts had little impact on users creates an interesting quandary.

MORE: What will happen when we fall out of love with tech?

As Dylan Byers at CNN tweeted: “FACEBOOK timeline: didn’t happen — happened, but was small — ok, semi-big — ok, it reached 126 million, but no evidence it influenced them”. Pithy though that assessment may be, Facebook arguing that $100,000 worth of ads has little effect on their intended targets implies that its vaunted targeting abilities might not amount to much. Which should be a wake-up call for legitimate advertisers.

In a talk she delivered last month at TEDGlobal in New York City, techno-sociologist Zeynep Tufekci summarized the effects algorithms are having on our lives. In doing so, she inadvertently captured the current dilemma: “Either Facebook is a giant con of a half trillion dollars and ads don’t work on the site – that it doesn’t work as a persuasion architecture — or its power of influence is of great concern. It’s either one or the other.”
