Facebook Executives Shut Down Efforts to Make the Site Less Divisive
A Facebook Inc. team had a blunt message for senior executives. The company’s algorithms weren’t bringing people together. They were driving people apart.
“Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide from a 2018 presentation. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”
Facebook had shelved its own research into how the social-media platform shaped user behavior and how the company might address the potential harms of polarizing its users, people familiar with the matter told The Wall Street Journal.
Chief Executive Mark Zuckerberg had expressed concern, both publicly and privately, about “sensationalism and polarization.” The research was launched after media reports that posts on the platform, which reaches billions of users, were promoting bogus Covid-19 cures and conspiracy theories about the origins of the coronavirus, despite efforts by the social-media giant to crack down on misinformation.
But in the end, Facebook’s interest was fleeting. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to the company’s products.
Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were “paternalistic,” said people familiar with his comments.
Another concern, company officials said, was that some proposed changes would have disproportionately affected conservative users and publishers, at a time when the company faced accusations from the right of political bias.
In essence, Facebook is under fire for making the world more divided. Many of the company’s own experts appeared to agree, and to believe Facebook could mitigate many of the problems. The company chose not to.
Kaplan said in a recent interview that he and other executives had approved certain changes meant to improve civic discussion. In other cases where proposals were blocked, he said, he was trying to “instill some discipline, rigor and responsibility into the process” as he vetted the effectiveness and potential unintended consequences of changes to how the platform operated.
Americans were drifting apart on fundamental societal issues well before the creation of social media, decades of Pew Research Center surveys have shown. But 60% of Americans think the country’s biggest tech companies are helping further divide the country, while only 11% believe they are uniting it, according to a Gallup-Knight survey in March.
Fixing the polarization problem would be difficult, requiring Facebook to rethink some of its core products. Most notably, the project forced Facebook to consider how it prioritized “user engagement”–a metric involving time spent, likes, shares and comments that for years had been the lodestar of its system.
Data scientists involved with the effort found some interest groups–often hobby-based groups with no explicit ideological alignment–brought people from different backgrounds together constructively. Other groups appeared to incubate impulses to fight, spread falsehoods or demonize a population of outsiders.
In keeping with Facebook’s commitment to neutrality, the teams decided Facebook shouldn’t police people’s opinions, stop conflict on the platform, or prevent people from forming communities. The vilification of one’s opponents was the problem, according to one internal document from the team.
“We’re explicitly not going to build products that attempt to change people’s beliefs,” company officials said in the document.
“We’re focused on products that increase empathy, understanding, and humanization of the ‘other side’,” they added.