The dichotomy of Turkey

On 1 August, a significant prisoner swap between the USA and Russia took place in Turkey’s capital, Ankara, and 26 prisoners were freed, including the peerless American reporter Evan Gershkovich. In playing a central role in the most extensive prisoner exchange since the end of the Cold War, Turkey’s National Intelligence Organization (MIT) won accolades. The operation reminded the world that NATO membership has been the cornerstone of Turkey’s defence and security policy since the country joined the bloc in 1952.

Yet within the next 24 hours, Turkey’s Information and Communication Technologies Authority barred access to Instagram without providing a specific reason. Reports suggested the ban was a response to Instagram removing posts related to the death of Hamas leader Ismail Haniyeh, a close ally of Turkey’s strongman president.

During his 21-year reign, Recep Tayyip Erdoğan has established himself as the most relentless implementer of censorship in Turkish history. Twitter, Wikipedia, OnlyFans, YouTube, Google Sites, Blogger, Blogspot, Google Docs, SoundCloud, WordPress, Facebook, Reddit, Google Drive, Dropbox, WhatsApp, Voice of America, Deutsche Welle, and Roblox have been among the victims of Erdoğan’s censorship.

Erdoğan has always oppressed free voices by tagging them as fascists. He has attacked and imprisoned people from all sectors of Turkish society under that accusation – except for Turkey’s actual fascistic groups, which are part of his far-right governing coalition.

On 5 August, Erdoğan accused Mark Zuckerberg’s Meta of “digital fascism.” But five days later, Turkey restored access to Instagram. The nine-day block reminded people of the arbitrary nature of Erdoğan’s regime, which is built on macho posturing to audiences at home and bullying “foreign powers” in the name of the Turkish nation.

Turkish users were able to access Instagram again after the country’s minister of transport and infrastructure claimed Instagram had accepted that “our demands… will be met”. Yet Instagram continues to remove posts mourning the death of Haniyeh: nothing has changed.

Three days after Instagram was reinstated, a woman who criticised Erdoğan’s ban in a YouTube interview was arrested for “insulting Turkey’s President”. She was sent to prison, where she remains at the time of writing.

For some, Erdoğan’s Instagram ban was but a pointless act. I see it as part of a more ominous tactic. Banning Instagram solidifies the idea that censorship in Turkey is all about Erdoğan’s whims. The strongman can cut access to Google, Amazon, Netflix, iCloud, and other vital internet services if and when he feels like it. He’s all-powerful: no legal entity can stop him from doing whatever he wants.

Facebook policies put human rights defenders at risk

If Priscilla Chan, an American citizen and the wife of Facebook CEO Mark Zuckerberg, were passing through Cairo International Airport and was stopped by a police officer who illegally searched her phone, would she file a lawsuit? Possibly.

If Priscilla and her husband were Egyptian, the answer is definitely not. It is common knowledge that in Egypt the police are above the law. If this hypothetical situation actually came to pass, I would advise Mark Zuckerberg not to run any social media campaigns publicising what happened to his wife, because he would either be arrested or forcibly disappeared. Even if our hypothetical Egyptian Mark Zuckerberg managed to flee the country, he wouldn’t be able to create a campaign to help those in similar danger – Facebook only allows political campaigns to be run by those physically inside the country. And if he managed to seek help from a friend or family member inside Egypt, they would likely be arrested immediately, because Facebook’s policy now requires someone’s full name in order to run a political campaign or advertisement. Thus, my advice to you, my friend, would be to internalise your anger. Facebook’s policies aid and abet tyrants. That is what Egyptians must face.

In 2017, the executive boards of Facebook, Twitter and Google all announced they had found that Russian hackers had bought ads on their platforms the previous year, using fake names to create controversial stories and spread fake news ahead of the 2016 American presidential election. The companies handed over to the US Congress three thousand divisive ads which they believed had been bought by Russian parties in the months leading up to the election in order to influence the outcome.

Between them, the tech companies appointed more than a thousand employees to review ads, to ensure they were consistent with their terms and conditions and to prevent misleading content. This was intended to deter Russia and others from using their social networks to interfere in other nations’ elections.

This led to Mark Zuckerberg announcing steps to help prevent network manipulation, including imposing more transparency on the political and social ads that appear on Facebook. Advertisers would have to provide identifying documentation for political, social and election-related ads. Each advertisement would also have to bear the name of the person who funded it, and the predominant funder would have to reside in the country in question, with the financing done locally.

There is a significant gap between what is truly happening in the Middle East and what the West understands about it. What is happening in Egypt specifically is not comparable to anything happening in the USA or Europe, and so the international policies of such companies cannot be developed based on the desires and needs of the American public alone.

These laws were supposed to help American society become more transparent. Instead, they are being used as a weapon by the Egyptian regime to crack down on people’s rights and freedoms, and they put human rights activists in Egypt in further danger.

Revealing the full names of those creating political or human rights campaigns leaves these individuals constantly under threat – of their posts being taken down, and of a government crackdown on them. As a result, these laws become a means of control for the government to further silence the voices of the masses. We, as human rights defenders in Egypt, need security and privacy, as the nature of our work exposes violations within systems and governments. We are already exposed to a large number of risks daily because of our activity, and we can be monitored in many ways, including digitally: through such transparency laws, the regime can currently trace all of our activity.

We are not looking for equal rights or to enter elections; rather, we are merely attempting to possess our own humanity, preserve our dignity and stand up for our rights. We dedicate our lives to equality and to preventing infringements of the rights of those in our society. Now that our activism has been made nearly impossible by your laws, Mr Zuckerberg, you truly leave us with no option, as we cannot put our families and communities at risk of imprisonment by having our names, and the names of those helping us, made public.

I urge you to make the digital privacy and security of human rights defenders a top priority, as these activists are now truly vulnerable to repressive tactics: the Egyptian regime uses your laws as a loophole to remove opposition.

We have already had bad experiences with your laws.

Human rights defender Sherif Alrouby has been imprisoned by the Egyptian regime for years, and we attempted to campaign for his release. We tried to circulate a song entitled ‘Sherif Alrouby is imprisoned, oh country’ but were impeded by Facebook’s policies. We had no option but to stop our campaign in order to prevent any security issues for the individuals who funded our advertisement, as their full names were displayed.

Facebook’s policies impede our work as human rights defenders. We recognise that you support freedom of speech and desire increased transparency, but you do not realise the severity of what is happening in Egypt. A prime example of the severity of the situation is the killing of activist Shady Habash inside prison, merely for making a song criticising the regime’s policies during the reign of El-Sisi. Likewise, my friend Galal El Behiery has spent more than six years in prison for writing the song’s lyrics – he has been on hunger strike in prison for more than 14 days.

I urge you all to understand the differences between nations. Egypt is not a transparent nation. Rather, it is an oppressive nation that exploits transparency to kill and dispose of opposition.

Joint statement on Facebook’s draft charter for content oversight board


Mark Zuckerberg at TechCrunch Disrupt 2012. Credit: JD Lasica

We, the undersigned, welcome the consultation on Facebook’s draft charter for the proposed oversight board. The individuals and organisations listed below agree that the following six comments highlight essential aspects of the design and implementation of the new board, and we urge Facebook to consider them fully during their deliberations.

The board should play a meaningful role in developing and modifying policies: The draft charter makes reference to the relationship between the board and Facebook when it comes to the company’s content moderation policies (i.e. that “Facebook takes responsibility for our (…) policies” and “is ultimately responsible for making decisions related to policy, operations and enforcement”, but that (i) Facebook may also seek policy guidance from the board, (ii) the board’s decisions can be incorporated into Facebook’s policy development process, and (iii) the board’s decisions “could potentially set policy moving forward”). As an oversight board, and given that content moderation decisions are ultimately made on the basis of the policies which underpin them, it is critical that the board has a clear and meaningful role when it comes to developing and modifying those underlying Terms of Service/policies. For example, the board must be able to make recommendations to Facebook and be consulted on changes to key policies that significantly impact the moderation of user content. If Facebook declines to adopt the board’s recommendations, it should set out its reasoning in writing. Providing the board with such policy-setting authority would also help legitimize the board, and ensure it is not viewed as simply a mechanism for Facebook to shirk responsibility for making challenging content-related decisions.

To ensure independence, the board should establish its own rules of operation: Facebook’s final charter is unlikely to contain all of the details of the board’s internal procedural rules and working methods. In any event, it should be for the board itself to establish those rules and working methods, if it is to be sufficiently independent. Such rules and working methods might include how it will choose which cases to hear, how it will decide who will sit on panels, how it will make public information about the status of cases and proceedings, and how it will solicit and receive external evidence and expertise. The final charter should therefore set out that the board will be able to develop and amend its own internal procedural rules and working methods.

Independence of the board and its staff: The draft charter makes reference to a “full-time staff, which will serve the board and ensure that its decisions are implemented”. This staff will therefore have a potentially significant role, particularly if it is in any way involved in reviewing cases and liaising between the board and Facebook when it comes to implementation of decisions. The draft charter does not, however, set out much detail on the role and powers that this staff will have. The final charter should provide clarity on the role and powers of this staff, including how Facebook will structure the board to maintain the independence of the board and its staff.

Ability for journalists, advocates and interested citizens to raise issues of concern: At present, issues can only be raised to the board via Facebook’s own content decision-making processes and “Facebook users who disagree with a decision”. This suggests that only users who are appealing decisions related to their content can play this role. However, it is important that there also be a way for individuals (such as journalists, advocates and interested citizens) to influence problematic policy and raise concerns directly to the board.

Ensuring diverse board representation: According to the Draft Charter, “the board will be made of experts with experience in content, privacy, free expression, human rights, journalism, civil rights, safety and other relevant disciplines” and “will be made up of a diverse set of up to 40 global experts”. While it is important for this board to reflect a diversity of disciplines, it is also essential that it reflect a diversity of global perspectives, including different regional, linguistic and cultural perspectives from the various countries in which Facebook operates. The exact board composition will also be dependent upon the agreed scope of the board.

Promoting greater transparency around content regulation practices: Given that the board is a new mechanism for regulating content on Facebook and enforcing the company’s content policies, it should similarly seek to demonstrate transparency and be held accountable for its content-related practices. According to the Draft Charter, panel “decisions will be made public with all appropriate privacy protections for users” and “the board will have two weeks to issue an explanation for each decision.” In addition to providing transparency around individual board decisions, Facebook should issue a transparency report that provides granular and meaningful data, including statistical data on the number of posts and accounts removed and otherwise impacted.

Organisational Signatories

AfroLeadership
Center for Democracy & Technology
Center for Studies on Freedom of Expression (CELE)
Centre for Communication Governance at National Law University Delhi
Committee to Protect Journalists
Derechos Digitales
Digital Empowerment Foundation
Fundación Karisma
Global Partners Digital
Index on Censorship
International Media Support
Internet Sans Frontières
Internews
IPANDETEC
New America’s Open Technology Institute
Paradigm Initiative
PEN America
R3D: Red en Defensa de los Derechos Digitales
Ranking Digital Rights
SMEX
Software Freedom Law Center, India
Trillium Asset Management, LLC

Individual Signatories

Jessica Fjeld
Meg Roggensack
Molly Land

Notes to editors

For further information, please contact Charles Bradley, Executive Director at Global Partners Digital ([email protected]).

An open letter to Mark Zuckerberg

Dear Mark Zuckerberg:

What do the Philadelphia Museum of Art, a Danish member of parliament, and a news anchor from the Philippines have in common? They have all been subject to a misapplication of Facebook’s Community Standards. But unlike the average user, each of these individuals and entities received media attention, was able to reach Facebook staff and, in some cases, received an apology and had their content restored. For most users, content that Facebook removes is rarely restored, and some users may be banned from the platform even in the event of an error.

When Facebook first came onto our screens, users who violated its rules and had their content removed or their account deactivated were sent a message telling them that the decision was final and could not be appealed. It was only in 2011, after years of advocacy from human rights organizations, that your company added a mechanism to appeal account deactivations, and only in 2018 that Facebook initiated a process for remedying wrongful takedowns of certain types of content. Those appeals are available for posts removed for nudity, sexual activity, hate speech or graphic violence.

This is a positive development, but it doesn’t go far enough.

Today, we, the undersigned civil society organizations, call on Facebook to provide a mechanism for all of its users to appeal content restrictions and, in every case, to have the appealed decision re-reviewed by a human moderator.

Facebook’s stated mission is to give people the power to build community and bring the world closer together. With more than two billion users and a wide variety of features, Facebook is the world’s premier communications platform. We know that you recognize the responsibility you have to prevent abuse and keep users safe. As you know, social media companies, including Facebook, have a responsibility to respect human rights, and international and regional human rights bodies have a number of specific recommendations for improvement, notably concerning the right to remedy.

Facebook remains far behind its competitors when it comes to affording its users due process.1 We know from years of research and documentation that human content moderators, as well as machine learning algorithms, are prone to error, and that even low error rates can result in millions of silenced users when operating at massive scale. Yet Facebook users are only able to appeal content decisions in a limited set of circumstances, and it is impossible for users to know how pervasive erroneous content takedowns are without increased transparency on Facebook’s part.2

While we acknowledge that Facebook can and does shape its Community Standards according to its values, the company nevertheless has a responsibility to respect its users’ expression to the best of its ability. Furthermore, civil society groups around the globe have criticized the way that Facebook’s Community Standards exhibit bias and are unevenly applied across different languages and cultural contexts. Offering a remedy mechanism, as well as more transparency, will go a long way toward supporting user expression.

Earlier this year, a group of advocates and academics put forward the Santa Clara Principles on Transparency and Accountability in Content Moderation, which recommend a set of minimum standards for transparency and meaningful appeal. This set of recommendations is consistent with the work of the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, who recently called for a “framework for the moderation of user-generated online content that puts human rights at the very center.” It is also consistent with the UN Guiding Principles on Business and Human Rights, which articulate the human rights responsibilities of companies.

Specifically, we ask Facebook to incorporate the Santa Clara Principles into their content moderation policies and practices and to provide:

Notice: Clearly explain to users why their content has been restricted.

  • Notifications should include the specific clause from the Community Standards that the content was found to violate.
  • Notice should be sufficiently detailed to allow the user to identify the specific content that was restricted and should include information about how the content was detected, evaluated, and removed.
  • Individuals must have clear information about how to appeal the decision.

Appeals: Provide users with a chance to appeal content moderation decisions.

  • Appeals mechanisms should be easily accessible and easy to use.
  • Appeals should be subject to review by a person or panel of persons that was not involved in the initial decision.
  • Users must have the right to propose new evidence or material to be considered in the review.
  • Appeals should result in a prompt determination and reply to the user.
  • Any exceptions to the principle of universal appeals should be clearly disclosed and compatible with international human rights principles.
  • Facebook should collaborate with other stakeholders to develop new independent self-regulatory mechanisms for social media that will provide greater accountability.3

Numbers: Issue regular transparency reports on Community Standards enforcement.

  • Present complete data describing the categories of user content that are restricted (text, photo or video; violence, nudity, copyright violations, etc.), as well as the number of pieces of content that were restricted or removed in each category.
  • Incorporate data on how many content moderation actions were initiated by a user flag, a trusted flagger program, or by proactive Community Standards enforcement (such as through the use of a machine learning algorithm).
  • Include data on the number of decisions that were effectively appealed or otherwise found to have been made in error.
  • Include data reflecting whether the company performs any proactive audits of its unappealed moderation decisions, as well as the error rates the company found.

Article 19, Electronic Frontier Foundation, Center for Democracy and Technology, and Ranking Digital Rights

Fundación Ciudadano Inteligente
7amleh – Arab Center for Social Media Advancement
Access Now
ACLU Foundation of Northern California
Adil Soz – International Foundation for Protection of Freedom of Speech
Africa Freedom of Information Centre (AFIC)
Albanian Media Institute
American Civil Liberties Union
Americans for Democracy & Human Rights in Bahrain (ADHRB)
Arab Digital Expression Foundation
Artículo 12
Asociación Mundial de Radios Comunitarias América Latina y el Caribe (AMARC ALC)
Association for Progressive Communications
Brennan Center for Justice at NYU School of Law
Bytes for All (B4A)
CAIR San Francisco Bay Area
CALAM
Cartoonists Rights Network International (CRNI)
Cedar Rapids, Iowa Collaborators
Center for Independent Journalism – Romania
Center for Media Studies & Peace Building (CEMESP)
Child Rights International Network (CRIN)
Committee to Protect Journalists (CPJ)
Digital Rights Foundation
EFF Austin
El Instituto Panameño de Derecho y Nuevas Tecnologías (IPANDETEC)
Electronic Frontier Finland
Elektronisk Forpost Norge
Foro de Periodismo Argentino
Foundation for Press Freedom – FLIP
Freedom Forum
Fundación Acceso
Fundación Datos Protegidos
Fundación Internet Bolivia.org
Fundación Vía Libre
Fundamedios – Andean Foundation for Media Observation and Study
Garoa Hacker Club
Gulf Center for Human Rights
HERMES Center for Transparency and Digital Human Rights
Hiperderecho
Homo Digitalis
Human Rights Watch
Idec – Brazilian Institute of Consumer Defense
Independent Journalism Center (IJC)
Index on Censorship
Initiative for Freedom of Expression – Turkey
Instituto Nupef
International Press Centre (IPC)
Internet without borders
La Asociación para una Ciudadanía Participativa ACI Participa
MARCH
May First/People Link
Media Institute of Southern Africa (MISA)
Media Rights Agenda (MRA)
Mediacentar Sarajevo
New America’s Open Technology Institute
NYC Privacy
Open MIC (Open Media and Information Companies Initiative)
OpenMedia
Pacific Islands News Association (PINA)
Panoptykon Foundation
PEN America
PEN Canada
Peninsula Peace and Justice Center
Portland TA3M
Privacy Watch
Raging Grannies
ReThink LinkNYC
Rhode Island Rights
SFLC.in
SHARE Foundation
SMEX
South East Europe Media Organisation
Southeast Asian Press Alliance (SEAPA)
SumOfUs
Syrian Archive
Syrian Center for Media and Freedom of Expression (SCM)
t4tech
Techactivist.org
The Association for Freedom of Thought and Expression
Viet Tan
Vigilance for Democracy and the Civic State
Visualizing Impact
Witness


1 See EFF’s Who Has Your Back? 2018 report, https://www.eff.org/who-has-your-back-2018, and Ranking Digital Rights, Indicator G6, https://rankingdigitalrights.org/index2018/indicators/g6/.

2 See Ranking Digital Rights, Indicators F4, https://rankingdigitalrights.org/index2018/indicators/f4/, and F8, https://rankingdigitalrights.org/index2018/indicators/f8/, and New America’s Open Technology Institute, “Transparency Reporting Toolkit: Content Takedown Reporting”, https://www.newamerica.org/oti/reports/transparency-reporting-toolkit-content-takedown-reporting/.

3 For example, see Article 19’s policy brief, “Self-regulation and ‘hate speech’ on social media platforms”, https://www.article19.org/wp-content/uploads/2018/03/Self-regulation-and-%E2%80%98hate-speech%E2%80%99-on-social-media-platforms_March2018.pdf.