Technology

What the Facebook, Google and Twitter algorithms hide from you

It’s not just about what we’re missing on our social media feeds, but the invisible ways we are being manipulated as we travel across the internet

In late October, Reddit user yooston, a sports fan, complained on the site’s Instagram forum that their feed was “showing posts from 4-7 days ago.”

“I’m going f–king crazy cause my feed is filled with crap from last week,” yooston complained. “I follow a bunch of sports accounts, and there is nothing more annoying than seeing a post of highlights from a game from last week in your feed. I miss the chronological feed so much.”

Fellow Reddit users sympathized. “Welcome to Instagram,” one replied. Welcome, in fact, to being online.

A cursory glance around the internet—whether in comment forums or via a simple Google search—reveals a virtually endless array of complaints about the way information, particularly social media posts, is presented to users.

People are increasingly becoming aware of the ghosts in their social media streams: that what they see when they search Google, or scroll through Facebook and Twitter—the news stories, updates, tweets, cooking videos, photos, etc.—has been filtered and reordered for them. In other words, they’re not seeing the internet, but a curated facsimile of it.

In an innocuously titled 2013 blog post on its news page (“News Feed FYI: A Window Into News Feed”), Facebook announced a major change to the screen its billions of users see the moment they log in to the site or access its mobile app. “Now organic stories that people did not scroll down far enough to see can reappear near the top of News Feed if the stories are still getting lots of likes and comments,” Lars Backstrom, a Facebook engineer, wrote. The move was necessary, Backstrom wrote, because “with so many stories, there is a good chance people would miss something they wanted to see if we displayed a continuous, unranked system of information.”

Backstrom also noted that “the number of stories people read and the likes and comments they make decrease” when Facebook showed unranked news feeds rather than feeds in which posts were curated by the platform. It was to be an important discovery.
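To make the mechanics concrete, here is a toy sketch, in Python, of the difference Backstrom describes between a chronological feed and a ranked one in which unseen, high-engagement stories resurface near the top. Every field, weight and threshold below is invented for illustration; Facebook’s actual ranking system is proprietary and far more elaborate.

```python
from dataclasses import dataclass

@dataclass
class Story:
    author: str
    age_hours: float    # how long ago the story was posted
    likes: int
    comments: int
    seen: bool = False  # whether the user already scrolled past it

def chronological(feed):
    """The old model: newest first, nothing resurfaces."""
    return sorted(feed, key=lambda s: s.age_hours)

def ranked(feed):
    """A toy version of the change: unseen stories that keep attracting
    likes and comments float back toward the top, even days later."""
    def score(s):
        engagement = s.likes + 2 * s.comments   # invented weighting
        freshness = 1.0 / (1.0 + s.age_hours)   # older stories score lower
        resurface = 1.5 if (not s.seen and engagement > 50) else 1.0
        return engagement * freshness * resurface
    return sorted(feed, key=score, reverse=True)

feed = [
    Story("team_highlights", age_hours=120, likes=400, comments=90),
    Story("a_friend", age_hours=2, likes=3, comments=0),
]
print([s.author for s in chronological(feed)])  # ['a_friend', 'team_highlights']
print([s.author for s in ranked(feed)])         # week-old highlights come first
```

Under this kind of scoring, last week’s game highlights can outrank a friend’s fresh post, which is exactly the behaviour yooston was complaining about.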

Fast-forward four years, and the same idea has been applied to Facebook-owned Instagram, and similarly to Twitter. Meanwhile, in 2016, Google updated its mobile app to present users with a more personalized search screen, showing not only upcoming calendar events but also curated news and information for each user. The platforms through which we see the world don’t just show us more of the same kind of content we saw before; they predict which of it we are most interested in seeing.

It is all the work of algorithms, computer programs that operate on Boolean logic—“if this, then that”—and scan vast datasets, either compiled or purchased by companies, that record what each of us has done online: what we’ve liked or disliked, commented on, purchased, or asked Google.
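As a purely illustrative example of that “if this, then that” idea, the short Python snippet below scans a made-up log of one user’s actions and fires a Boolean rule once a threshold is met. The action types, topics and threshold are all assumptions for the sketch; real platforms rely on far more sophisticated, and secret, statistical models.

```python
# A deliberately crude "if this, then that" rule over a log of actions.
actions = [
    {"type": "like",    "topic": "sports"},
    {"type": "comment", "topic": "sports"},
    {"type": "search",  "topic": "sports"},
    {"type": "like",    "topic": "cooking"},
]

# Scan the dataset: count signals of interest in one topic.
sports_signals = sum(1 for a in actions if a["topic"] == "sports")

# If this (enough signals), then that (boost matching content).
boosted_topics = ["sports"] if sports_signals >= 3 else []

print(boosted_topics)  # ['sports']
```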

In an immediate sense, that means there are plenty of things we don’t see, such as updates from friends or news items that, for one reason or another, an algorithm has determined interest us less than other content. But the Facebook posts or Google searches we don’t see are merely the tip of the iceberg when it comes to what’s become invisible to us online. As consumers, we also know little about how the algorithms actually work.

“For the average person who doesn’t know a lot about technology, there’s no way for them to make informed choices about the systems that they’re using and the kinds of behaviours that they’re doing on those systems, because there’s no visibility about what’s being collected and what’s not,” says Emilee Rader, an associate professor at Michigan State University whose research focuses on sociotechnical systems (those in which people, information, and technology interact).

That’s because the algorithms are proprietary. While we have some idea of how social media ranking systems function, the details are locked away deep inside each company’s vaults.

The sign outside the main entrance to Facebook HQ. Image via Facebook.

It might not seem like a big deal that posts from particular people are shown to us more than others, or that ads for items on your Amazon wish list follow you around the internet as you browse. But there have already been unexpected consequences of allowing our interactions with technology, and with each other, to be increasingly dictated by invisible algorithms.

For one, algorithms have helped make the big tech companies incredibly powerful.

Our information is valuable, and algorithms help monetize it. The simple reason Facebook was concerned that people spent less time engaging with posts in a chronological feed than with one ranked and edited to match inferred preferences is that the better the company can show it knows its users, the more valuable it becomes to advertisers. So far, what Facebook knows about us has made the company a lot of money. The overwhelming majority of internet ad dollars flow through Google and Facebook, and ads make up the lion’s share of revenue for both companies: 87 per cent of Google’s $90 billion in sales and 98 per cent of Facebook’s $27 billion.

Online platforms like Facebook feel like a “nice happy social place where you’re talking to your friends,” says Rader. “But that’s not what it exists for. It exists to make Facebook money. Google search exists to make Google money.”

In short, our data, given willingly for free, has effectively helped create a duopoly in the global advertising business, and elevated Google and Facebook, along with other tech giants like Amazon and Apple, to a place of such dominance in the economy that trust-busting is being seriously debated.

What we also know little about are the intricate patterns algorithms draw between datasets, as the information collected about us all online is increasingly merged with data gathered offline by other companies and organizations.

Again, the implications may go well beyond simply being shown an ad that you can ignore. In fact, ignoring ads is no longer the problem, Zeynep Tufekci, a leading techno-sociologist, said recently.

In a September talk, Tufekci offered the example of ads promoting tickets to Las Vegas. We might assume they were targeted based on demographic data, such as young men or people with high credit card limits, but algorithms might make a different kind of connection, Tufekci speculated. “What if the system that we do not understand was picking up that it’s easier to sell Vegas tickets to people who are bipolar and are about to enter the manic phase?” she asked. “Such people tend to become over-spenders, compulsive gamblers,” she said. The algorithms could make such a connection “and you’d have no clue that’s what they were picking up on.”

The point is this: The algorithms that decide what you see on your social media stream and in search results are not limited to that space. What we do on Facebook or Twitter or Google can inform how we experience the internet as a whole—an increasingly crucial contact point we have with the world.

The internet in whatever form (platform, app, etc.) is not benign. It is not a passive medium, but one that is incredibly, if invisibly, active. The frustration we feel when we scan our social media feeds and realize we’re not seeing everything is all the more visceral because of how personal the content feels (why shouldn’t we be allowed to see what our friends are trying to show us?). But there is a reason: it’s just hidden, and the algorithm’s decision to hide that content was informed by you.

“Everything that you do communicates something to the system. Every action you take is used to make guesses about what you want to see in the future,” Rader says. “Thinking about whether you want to like that page or not, thinking about whether you want to like that cute dog Halloween photo or not—those things, you’re training the system about what you like.”
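As a minimal sketch of Rader’s point, the Python snippet below keeps a running guess of a user’s interests and nudges it with every action. The nudge values and the scoring are invented for illustration; real recommender systems are proprietary and vastly more sophisticated.

```python
from collections import defaultdict

preferences = defaultdict(float)  # topic -> inferred interest

def record(action, topic):
    """Each interaction 'trains the system' a little."""
    nudge = {"like": 1.0, "comment": 2.0, "hide": -3.0}.get(action, 0.1)
    preferences[topic] += nudge

record("like", "dogs")       # liking that cute dog Halloween photo...
record("like", "dogs")
record("hide", "politics")   # ...or hiding a post: both leave a trace

def score(topic):
    """Future posts are ordered by what past actions implied."""
    return preferences[topic]

print(score("dogs"), score("politics"))  # 2.0 -3.0
```

Like or hide, every recorded action shifts the weights that decide what surfaces the next time you log in.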
