“Are social media companies publishers or platforms?” Juliet Oosthuysen, who was recently banned from Twitter for expressing an opinion on the UK’s Gender Recognition Act, asked at a panel discussion held on 20 June to launch Article 19’s Missing Voices campaign.
Oosthuysen was joined by Jennifer Robinson, a barrister who specialises in international media law, Paulina Gutierrez, an international human rights lawyer who has worked on developing the digital rights agenda for Article 19, and Pavel Marozau, an online activist whose satirical films have been removed from YouTube. The event was co-sponsored by Index on Censorship.
Article 19 is an organisation devoted to protecting freedom of expression. Missing Voices is its campaign to call for more transparency and accountability from the likes of Twitter, Facebook and YouTube over content removal. The aim is to protect online free expression in the complex web of intellectual property laws, community standards, algorithms and government censorship mandates that regulate what can and cannot be posted on social media platforms.
As described in Article 19’s 2018 policy brief, Missing Voices’ mission is to “Call on social media platforms to respect due process guarantees in the content moderation and account suspension or removal processes, create clear and transparent mechanisms to enforce such guarantees, and at the same time, call for them to align their policies with their responsibility to respect human rights, set out in the Guiding Principles of Business and Human Rights.”
As social media now spans hundreds of countries, each with its own laws on censorship, companies must either craft universally applied community standards that fit within every country’s laws or impose standards unique to each country. Either approach can produce overly strict restrictions and new barriers to free expression. Robinson said: “If any one country can determine that their takedown requirements based on their own free speech standards can be applied globally then we are going to see a race to the bottom of what is available online.”
In the breakout session that followed the panel, groups discussed the difficulty of protecting opinions expressed online while fighting the rampant harassment faced by ethnic, racial, sexual or gender minorities. The line between what is and is not acceptable is often blurry, multiple panellists argued, and all the more so when decisions about which content to remove and which users to ban are increasingly made by artificial intelligence or algorithms. Speaking about her own experience of being banned and her multiple fruitless attempts to regain her Twitter account, Oosthuysen said: “A person made the decision to terminate my account, and I would like to speak with a person to get it reinstated, not an algorithm.”
Community standards are difficult to navigate. One audience member jokingly suggested that social media platforms institute a “cooling-off period”, so that users would be protected from censorship for posts made in the heat of the moment following a tragedy. This is not, in fact, a new suggestion: human Facebook content moderators are encouraged to consider recent personal events, such as romantic upheaval, when deciding whether to remove a piece of content that expresses hatred towards a gender, for example. However, the idea that circumstances could excuse content that would otherwise be inexcusable is difficult to enshrine in community standards that are supposed to be universally implemented. Algorithms, and even human censors, are not always able to determine when a piece of content is intended as a joke or whether it is condoned by its perceived target.
Marozau said that when he attempted to understand which community standards he had broken, it became obvious that content-sharing platforms “can’t say clearly what they’re against”. Marozau’s satirical film attacked Belarusian president Alexander Lukashenko; he was persecuted by the Belarusian government, and the film, which he does not believe violated any community standards, was removed shortly afterwards. It can often be difficult, Robinson noted, to determine why a piece of content was removed, and whether a removal was driven by political ends rather than by a legitimate response to online harassment.
There have been instances, some quite recent, in which content-sharing platforms were criticised for censorship after barring high-profile users whose content was controversial. American far-right conspiracy theorist Alex Jones, for example, was censored and banned by Facebook, YouTube, Instagram and Twitter, platforms that have not pursued many accounts with fewer followers but more violent rhetoric. Community standards, it seems, are applied most often to send a message rather than to punish consistently.
The Missing Voices campaign seeks to counteract censorship and consolidate laws and community standards wherever possible. It will lobby social media companies by spreading the message of free speech through influencers, marginalised groups, company employees and shareholders.
According to Gutierrez: “If we put all the processes together, then we can… find inconsistencies between the actual responses and what [social media companies] are publishing in their transparency reports.” Gutierrez hopes that clarifying the rules around posting will lead to better awareness of when community standards and laws are applied fairly and when they are used for political ends, and that this will make social media platforms more conducive to free speech.