Mark Zuckerberg is sorry again.
The founder of Facebook—which had revenue of $40 billion last year thanks to its more than two billion monthly active users—will humble himself, a giant on his knees, when he apologizes to a congressional committee today in his first-ever appearance before lawmakers.
Facebook, he will say, has allowed its tools to be used to do harm.
“That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy,” he will say. “We didn’t take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I’m sorry.”
Zuckerberg will need to use all his nerdy charm to sell the apology. The problem, he will tell the congressmen, is that Facebook is an “idealistic and optimistic company.”
I wouldn’t buy it, but the congressmen might. It will be a carefully crafted message from the leader of one of the great success stories of 21st-century American capitalism. And Facebook has donated $381,000 to committee members over the years, Democrats and Republicans alike, which is the kind of thing congressmen are said to take note of.
Also, Zuckerberg has had a lot of practice doing this kind of thing.
As Zeynep Tufekci explained recently in Wired, Zuckerberg has been issuing apologies since 2003, a year before he started Facebook, when he violated the privacy of Harvard students with Facemash, a crude hot-or-not website built from pilfered photos of his classmates.
In the 15 years since he was a sexually frustrated undergrad with a modest gift for writing code and a killer instinct for clickbait, Zuckerberg has triumphed, growing his company until 30 per cent of the people on the planet visit his site every month. He has grown fabulously wealthy by capturing people’s attention, holding it as long as possible, gathering personal information about them and then selling that information.
Advertisers are the customers. Users are the product. He sells us, our information, our attention, to anyone who has anything to sell. When, as happens fairly often, he is shown to have misused that information, he apologizes and promises to do better.
It has been a winning formula, and if Zuckerberg gets through the political and regulatory gauntlet ahead of him, he can likely continue to violate users’ privacy and keep apologizing for it for the rest of his life.
But first, he must run the gauntlet, responding to questions from lawmakers and regulators empowered like never before.
Zuckerberg has been forced into this most recent apology, and all the attendant scrutiny, by the continued fallout over Russian interference in the election that brought Donald Trump to the White House.
It was brought to a head by Canadian pink-haired voter-contact-guy-turned-whistleblower Christopher Wylie, who revealed last month that Cambridge Analytica had harvested personal information from 87 million Facebook users and used the data to build pseudo-scientific psychographic voter profiles, the better to convince them to vote for Trump.
Working with Trump aide Steve Bannon and his billionaire benefactors the Mercers, Cambridge Analytica specialized in “dark advertising,” using data from Facebook to find people who might respond to a particular message, and delivering those ads to them secretly. For example, they sent ads to African Americans reminding them that in a 1996 speech Hillary Clinton had used the racially charged term “super predator.” The goal? Not to get them to vote for Trump but to discourage them from voting at all.
The same strategy appears to have been behind the Russian bots backing Bernie Sanders, not because the Russians wanted Sanders in the White House, but because they wanted Trump, and wanted to give the impression online that Bernie bros disliked Clinton more than they disliked Trump.
And, of course, Facebook sold ads to the Russians, made light of their impact, apologized for that too, and declined to release the ads until forced by Congress, eventually admitting that as many as 126 million Americans saw content from the Russian Internet Research Agency.
The picture that has slowly emerged since the election is disquieting, each revelation like another piece in a puzzle, and the thing is we don’t know how many puzzle pieces are still missing. The pieces, like the ads, are secret.
For instance, we learned recently that the Mercers backed a group that used Facebook to secretly spread anti-Muslim videos to voters in swing states, and produced other anti-Muslim videos in concert with Canada’s own Rebel, seemingly in a sophisticated attempt to stampede people into voting for Trump’s Muslim ban.
And Facebook was used by fly-by-night fake news sites, many based in eastern Europe, to spread fake stories, claiming, for instance, that Denzel Washington supported Trump.
As a former longtime newspaper reporter, I watched all this unfold with queasy astonishment. At every paper where I ever worked, a reporter would have expected to be immediately fired for knowingly reporting a falsehood.
Those newspapers are much reduced now, and in their place is Facebook, which has won the attention of our former readers and, with it, the revenue that sustained the kind of plodding, fact-based journalism that filled the spaces between the ads and kept the politicians on their toes.
Now, newspapers are on the way out, replaced by a social network that is reluctant to accept that there is a difference between truth and lies. That difference, which was everything in newspaper culture, doesn’t matter to the advertisers, and Facebook can deliver exactly what advertisers need: carefully sorted demographic target audiences.
It was journalists—likely as irked as I am by this harsh new information economy—who brought about the reckoning that Zuckerberg now faces.
Last month, after the Guardian revealed the Cambridge Analytica data breach, other reporters exposed disturbing information about the tactics the firm used in elections around the world. Facebook responded by complaining that it was being unfairly accused of a data breach, part of a long series of PR blunders as the company slowly came to realize that its normal Silicon Valley happy talk—engagement! connection!—would not make the issue go away.
Facebook could have avoided all of this if it had listened to Elizabeth Denham, the former Assistant Privacy Commissioner of Canada, who investigated the company in 2009 after law professor David Fewer and some University of Ottawa students filed a complaint about its data-harvesting practices, an early, crucial step in the struggle to make Facebook account for itself.
Fewer and the law students convinced Denham and her boss, commissioner Jennifer Stoddart, that Facebook was not getting meaningful consent for its data collection, in particular as regards third-party applications—games and quizzes—which were then a huge source of “engagement” for Facebook.
A third-party application is how Cambridge Analytica got its hands on personal data that it used in the Trump election, a foreseeable breach that seems to have shocked poor idealistic Zuckerberg.
In 2009, Denham warned Facebook that it was not protecting data gathered by dodgy companies running third-party applications: “I am not satisfied that contractual arrangements in themselves with the developers constitute adequate safeguards for the users’ personal information in the Facebook context.”
Facebook lawyers objected angrily, boasting of a “well-designed structure that allows identification and removal of potentially problematic applications.”
Denham was right and Facebook was wrong. So now Zuckerberg is having to apologize and establish the kinds of safeguards that Fewer and his students wanted in 2009.
And maybe that will be the end of it.
Facebook does now seem to know that it can’t let anyone with a few rubles secretly run election ads in Florida anymore. It is testing a transparency tool in Canada that lets everyone see who is running what political ads, something it should have realized long ago is a necessary part of healthy election debate in a democracy, if that were the kind of thing it cared about at all.
But Professor Fewer, who made the complaint that Facebook failed to heed back in 2009, thinks that politicians will have to give regulators, like Canada’s privacy commissioner, the power to issue orders and fine companies, and establish stricter rules: “I think we’re recognizing some of the market failures inherent to the surveillance economy online, and I think that we’re now starting to recognize that a more hands-on regulatory approach is appropriate.”
Denham, who is now the U.K. Information Commissioner, is seeking a warrant to investigate Cambridge Analytica. Canada’s ethics committee is going to hold hearings. The Canadian and B.C. privacy commissioners are looking into the business. And next month the European Union’s new data-protection framework, the General Data Protection Regulation, comes into force, which will inevitably have legal implications in North America, where regulators have not been similarly empowered.
From the beginning of the internet age, politicians have been reluctant to regulate the internet giants, because they didn’t want to restrain innovative economic activity, and often sought to have some of the tech glamour rub off on them.
Now, like Lilliputians with many tiny ropes, lawmakers have the motive and the opportunity to bind the Gulliverian internet giants and peg them to the ground with rules and regulations.
It is about time. Apologies are well and good, but laws are more likely to protect our privacy, and nobody—including Zuckerberg—can argue that Facebook has been doing that.