On Aug. 26, Facebook published a blog post titled “Search FYI: An Update to Trending.” The post explained that Facebook was making some changes to its “trending topics” widget, a square that appears to the right of users’ news feed when they sign in to Facebook. In theory, the trending topics widget is designed to keep users up to date with the hottest discussions on the social site.
Facebook decided to change two things about trending topics. First, rather than showing users a topic followed by a description of why that topic was trending (e.g. “Mars: NASA Rover Captures Incredible 360 View of Mars”), Facebook has decided to simply state the topic and the number of people discussing it (e.g. “Mars: 4.2k people talking about this”). Now, instead of seeing a descriptor, users can click or hover and will be shown “a search results page” listing “the news sources that are covering it, posts discussing it and an automatically selected original news story with an excerpt pulled directly from the top of the article itself.”
The second thing Facebook said it was going to do was get rid of the humans—or most of them. While humans might still be around to confirm “that a topic is tied to a current news event in the real world” (that is, to ensure things like #lunch are kept off the list), they will no longer write the summaries that once appeared next to the trending topics as descriptors. The trending topics section, in other words, would be left to an algorithm, monitored by a few engineers. In conjunction with this change, Facebook reportedly laid off its entire editorial staff—some 15 to 18 people.
It took only two days for these changes to prove disastrous.
On Sunday, Facebook’s trending topics widget featured a story about Fox News anchor Megyn Kelly that claimed she had been fired for secretly backing Hillary Clinton. The story was completely false, created by a right-wing site called endingthefed.com, which has just over 200,000 followers on Facebook. Eventually, Facebook removed the “story” and issued an apology. In its apology, it explained that a “topic is eligible for Trending if it meets criteria for being a real-world news event and there are a sufficient number of relevant articles about that topic.” The Megyn Kelly hoax piece, said Justin Osofsky, vice-president of global operations for Facebook, met those initial qualifications. “We then re-reviewed the topic based on the likelihood that there were inaccuracies in the articles,” he said in a statement. Only then did Facebook take the story out of its trending widget.
It has been a difficult year for Facebook’s trending news section, especially its human overseers. The detail left mostly unmentioned in Facebook’s update on the 26th (alluded to only via hyperlink) was that in May, its trending topics editorial team—the people who were reportedly let go last week—was accused of anti-conservative political bias. The fallout from that accusation, originally reported by Gizmodo, is now being felt. As it turns out, humans are still better judges of what makes something newsworthy than a computer program.
But Facebook’s disbanding of its human editorial team might speak to something else beyond mere hope of inoculating itself against allegations of bias; it might also be a way that Facebook feels it can distance itself from another allegation: that it has become the world’s largest news media organization.
This is something Facebook’s CEO, Mark Zuckerberg, denies. In fact, he did so Monday while speaking to university students in Italy. “No, we are a tech company, not a media company,” he reportedly said when asked whether Facebook will become a news editor. According to Reuters, Zuckerberg said that, as a technology company, “We build the tools, we do not produce any content.”
Yet the difference between production and distribution of news may increasingly be too fine a hair to split, thanks entirely to Facebook’s own actions.
Facebook has built upon its massive audience by incorporating more and more news content into itself, thereby bringing a more literal sense to its “news feed,” the cascade of posts users see when they first sign in. The result of this strategy is staggering. A Pew Research poll released in May showed that 66 per cent of Facebook users get news on the site. Pew contextualized that figure thus: “Facebook is by far the largest social networking site, reaching 67 per cent of U.S. adults. The two-thirds of Facebook users who get news there, then, amounts to 44 per cent of the general population.”
That access has only increased in the last year, since Facebook introduced Instant Articles, a feature that allows news organizations, including Maclean’s, to post stories directly into Facebook’s app, eliminating the need for users to be redirected to the news site after clicking a headline. News content, in other words, is now increasingly difficult to physically, as well as cognitively, separate from Facebook itself. Facebook wants it that way, in order to build and retain engagement, and thereby market share. It just doesn’t seem to want to suffer any of the residual consequences.
“We care about creating a product that people want,” Will Cathcart, who oversees product management of Facebook’s news feed (i.e. not the trending section specifically), told The Verge in May. “Whether or not we can do that entirely with automated systems, or it’s helpful to have people help, is actually just a detail. What’s more important is the product principle, which is that we want to show you what you’re most interested in.”
But what about the details of the things people are most interested in?
Presciently, The Verge’s Casey Newton asked Cathcart: “What happens when everybody’s saying, ‘Obama was born in Kenya.’ Is there someone who comes in and says no, actually he was born in America?”
Cathcart replied: “I think you already see that happen on the platform today. It doesn’t have anything to do with us—people post a lot of this stuff and talk about it, and other people post different points of view. And the nitty-gritty of the details of how we should be involved I actually think is less important than building a platform where if people want to talk about that, it’s really easy to talk about that and find different points of view.”
Of course, it is on this last point that Facebook cannot step away, claiming it is merely a technology company. For along with all the other changes that Facebook has brought to news distribution, the most notable is its creation of a personalized echo chamber—a feature so defined and self-reinforcing that the Wall Street Journal can create sample feeds of “Liberal Facebook and Conservative Facebook” to highlight the silos.
In other words, Cathcart’s vision of what Facebook actually does with news and information (that it allows for a cross-ideological discussion) seems like pure fantasy. Facebook is a news aggregator and distributor like no other in history. It is a media company that, thanks to big data, targets information to individuals. Crucially, it does not do this benignly, just so that people can see it. Facebook targets online news content so that the information can be further disseminated, and so, consequently, Facebook’s hold on readership and traffic can increase. Facebook is in the news business, whether it likes it or not.
Moreover, just because information is distributed and a discussion ensues does not mean the process is inherently, or democratically, pure, as Cathcart suggests. For that to be the case, the information upon which that discussion takes place has to first be based in reality. As it turns out, technology—bots and algorithms—doesn’t know what that reality is, yet. Uncomfortably for Facebook, human news editors do.