Social Networks: is it right that platforms don’t bear responsibility for content that they carry?
Following the controversy sparked by the recent #StopHateForProfit campaign, we consider the role of social media in the information landscape.
Since its inception, Facebook has garnered billions of fans, but it has also been at the centre of numerous controversies. It faced issues around copyright infringement while still a university project, drew global outrage with the Cambridge Analytica scandal, and today is embroiled in the advertising boycott campaign.
The platform, born from the mind of the then 19-year-old Mark Zuckerberg in 2004, has grown exponentially to form a true ecosystem of its own. In fact, in addition to the original social network, the company also owns Instagram, Messenger and WhatsApp and, according to a report by Ansa, has more than three billion monthly users.
Although Facebook continues to make headlines around the world, albeit often negative ones, Zuckerberg’s latest slip-up could cost him a lot in terms of income. Several large companies, from Unilever to The North Face, have decided to boycott the social network under the banner of the already-trending hashtag #StopHateForProfit.
This initiative emerged in the United States following the #BlackLivesMatter protests and was launched by several civil rights groups, including the NAACP and the Anti-Defamation League. Together they urge large companies to “pause” their purchases of advertising space on Facebook, resulting in substantial lost revenue for the company.
The motivation is simple: the company’s “repeated failure” to deal with “the significant and vast proliferation of hatred” on its platforms. The campaign’s website (https://www.stophateforprofit.org/demandchange) details how Facebook has allowed incitement to violence against protesters fighting for racial justice in America, in the wake of the murders of George Floyd, Breonna Taylor, Ahmaud Arbery and many others. It argues that this allows hatred and extremism to spread faster and further than ever before, causing real damage in the world.
The problem for Facebook seems to be that it does not want to take sides: it continues to assert its independence from the content its users publish, and rejects any direct responsibility for it.
But is that really so?
Social networks have changed the way we communicate at both a personal and professional level, and for many they have also changed the way people learn about what’s happening in the world. Is it therefore right to expect more proactivity, as seen from Twitter, in the fight against misinformation?
Twitter recently won praise for introducing a label to counteract false information circulating in tweets about the Covid-19 pandemic. The warning messages flag “disputed or misleading information” and link to a page curated from “reliable sources” that provides additional context on the subject. In addition to combating fake news and promoting fact-checking, Twitter has long had a strict company policy of blocking or removing accounts that incite hatred or harm to others.
Facebook, on the other hand, has never implemented a significant policy of this sort, and the unflattering comparison with Twitter and the choices of its founder, Jack Dorsey, is now becoming uncomfortable.
This could be one reason Facebook recently made a move towards promoting more reliable information, announcing that it will fully integrate a News section, in development since October, into the platform. Under this initiative Facebook will pay media outlets, including the local press, for a licence to republish quality articles on the platform, a notable departure from the approach of another web giant, Google.
The regulatory disparity that exists in the US between newspaper publishers and the big social networks stems from the Communications Decency Act of 1996. This law offers online platforms and search engines a legal shield from liability for the content they deliver, while publishers remain responsible for what they print.
Facebook’s choice to pay newspapers for their content would bring great advantages both for readers, who would receive information whose accuracy publishers are legally accountable for, and for publishers, who would be fairly paid for the work they have done.
It would be interesting to see whether similar initiatives follow, easing the existing pressure on the news industry, whose expertise is more essential than ever.
It goes without saying that social networks born and headquartered on American territory are governed by U.S. laws and agencies. But the content hosted on their platforms reaches much further, appearing in Europe and beyond. And so we, too, will come to experience the changes that follow as these events unfold.