In the aftermath of Donald Trump’s election, Facebook founder Mark Zuckerberg spoke publicly about the role Russian trolls and fake news on Facebook played in shaping public perception and influencing the presidential election. The company has since changed its mission statement from “making the world more open and connected” to “give people the power to build community and bring the world closer together.” The timing is no coincidence. The slogan’s also hogwash. Facebook is concerned with its brand: with two billion monthly users (out of 7.4 billion people on Earth) and an 18% growth rate, Zuckerberg does not want bad publicity to disrupt the lucrative company’s continued expansion, which is built on acquiring free content from users and then using it to target those same users with advertising. Calling Facebook users ‘users’ is fitting, since it was always the public that was being used.
At the London Review of Books, John Lanchester examines three books about the company to look closely at what Facebook really is on the inside and how it goes about its data-collecting business. It’s essentially an advertising business, which means, in Lanchester’s words, “Facebook is in the surveillance business.”
Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality. Note that the company’s knowledge about its users isn’t used merely to target ads but to shape the flow of news to them. Since there is so much content posted on the site, the algorithms used to filter and direct that content are the thing that determines what you see: people think their news feed is largely to do with their friends and interests, and it sort of is, with the crucial proviso that it is their friends and interests as mediated by the commercial interests of Facebook. Your eyes are directed towards the place where they are most valuable for Facebook.
Now that the public knows that Facebook’s fake election stories generated more reader engagement than top New York Times stories, Zuckerberg has a social responsibility to use his powerful platform in a way that doesn’t further erode the society its users live in. Instead of factoring in the social costs of social media, though, Facebook remains committed solely to growth and monetization. Google’s public maxim is “Don’t be evil.” Even if you doubt how faithfully that maxim is followed, as consumers we have to ask ourselves: when a company cares more about monetizing users’ data than about protecting users from a Russian misinformation campaign, why should anyone use its service? In capitalist America, too many people see it as un-American to say that businesses have a social responsibility. But when it comes to capitalism, we consumers ultimately wield the most power: we can choose not to spend our money or time on businesses that ignore the social costs of their operations. If you’ve been on the verge of deactivating Facebook, now is a good time.
The fact is that fraudulent content, and stolen content, are rife on Facebook, and the company doesn’t really mind, because it isn’t in its interest to mind. Much of the video content on the site is stolen from the people who created it. An illuminating YouTube video from Kurzgesagt, a German outfit that makes high-quality short explanatory films, notes that in 2015, 725 of Facebook’s top one thousand most viewed videos were stolen. This is another area where Facebook’s interests contradict society’s. We may collectively have an interest in sustaining creative and imaginative work in many different forms and on many platforms. Facebook doesn’t. It has two priorities, as Martínez explains in Chaos Monkeys: growth and monetisation. It simply doesn’t care where the content comes from. It is only now starting to care about the perception that much of the content is fraudulent, because if that perception were to become general, it might affect the amount of trust and therefore the amount of time people give to the site.
Zuckerberg himself has spoken up on this issue, in a Facebook post addressing the question of ‘Facebook and the election’. After a certain amount of boilerplate bullshit (‘Our goal is to give every person a voice. We believe deeply in people’), he gets to the nub of it. ‘Of all the content on Facebook, more than 99 per cent of what people see is authentic. Only a very small amount is fake news and hoaxes.’ More than one Facebook user pointed out that in their own news feed, Zuckerberg’s post about authenticity ran next to fake news. In one case, the fake story pretended to be from the TV sports channel ESPN. When it was clicked on, it took users to an ad selling a diet supplement. As the writer Doc Searls pointed out, it’s a double fraud, ‘outright lies from a forged source’, which is quite something to have right slap next to the head of Facebook boasting about the absence of fraud. Evan Williams, co-founder of Twitter and founder of the long-read specialist Medium, found the same post by Zuckerberg next to a different fake ESPN story and another piece of fake news purporting to be from CNN, announcing that Congress had disqualified Trump from office. When clicked-through, that turned out to be from a company offering a 12-week programme to strengthen toes. (That’s right: strengthen toes.) Still, we now know that Zuck believes in people. That’s the main thing.