Members of Twitter’s Trust & Safety Council Resign in Protest Against Elon Musk’s ‘Safety’ Messages to Date

The following is the resignation letter of three members of Twitter’s Trust and Safety Council:

We are announcing our resignation from Twitter’s Trust and Safety Council because it is clear from research evidence that, contrary to claims by Elon Musk, the safety and wellbeing of Twitter’s users are on the decline. The question has been on our minds: Should Musk be allowed to define digital safety as he has defined freedom of expression? Our answer is a categorical “no.”

Eirliani Abdul Rahman and Anne Collier have been members of Twitter’s Trust & Safety Council since its inception in 2016. Eirliani was the first female representative from Asia and had served on the Council’s Child Sexual Exploitation (CSE) Prevention advisory group. Anne has been working with social media platforms on youth digital safety for more than 20 years and served on the Twitter Council’s Online Safety and Harassment Prevention group.

We know that, even after the resignations and dismissals of thousands of employees, there are people working at Twitter who care about reducing hate speech and protecting users on the platform. We are deeply saddened by this decision because Twitter has been a place of joy in many ways: our work with fellow Council members, interacting with our professional networks, and supporting the public discussion about our respective passions.

Though Twitter’s new ownership has offered no such recognition, we would like to acknowledge the hard work of all members of its Trust and Safety Council over the past six years. The establishment of the Council represented Twitter’s commitment to moving away from a US-centric approach to user safety, to stronger collaboration across regions, and to having deeply experienced people on the safety team. That last commitment is no longer evident, given Twitter’s recent statement that it will rely more heavily on automated content moderation. Algorithmic systems can only go so far in protecting users from ever-evolving abuse and hate speech; they cannot catch new forms of abuse until detectable patterns have developed.

Anne: “Having followed the research on youth online risk since 1999, I know how hard it is for platforms to get it right, honoring young users’ rights of protection, participation and privacy simultaneously. But some progress has been made in the industry. Tragically, the research shows that Twitter is going in the opposite direction, and I can no longer find a reason to stay in tacit support of what Twitter has become.”

Eirliani: “I have watched with, dare I say, trepidation the negotiations over Elon Musk’s purchase of Twitter. I had written down some commitments to myself at the time. Should Musk step over those thresholds, I told myself I would resign. Those red lines have been crossed. We know from research by the Anti-Defamation League and the Center for Countering Digital Hate that slurs against Black Americans and gay men have jumped 195 percent and 58 percent respectively since Musk’s takeover. Antisemitic posts have soared more than 61 percent in the two weeks after Musk’s acquisition of Twitter. Another red line for me was when previously banned accounts such as those on the far right, and those who had incited others to violence, such as then US President Donald Trump’s, were reinstated.”

We fear a two-tiered Twitter: one for those who can pay and reap the benefits, and another for those who cannot. This, we fear, will undermine the credibility of the system and the beauty of Twitter, the platform where anyone could be heard regardless of their number of followers.

We cannot, therefore, in good conscience remain on Twitter’s Trust and Safety Council for the reasons above. A Twitter ruled by diktat is not a place for us. Content moderation is a nuanced business that requires full transparency, adherence to policies informed by best practices, advice from trusted partners on the ground, and dedicated resources. This is in no way a disavowal of our friends who remain on the Council. They choose to do so for their own reasons, including continued safeguarding and the hope that reason will prevail.

Eirliani Abdul Rahman
Co-Founder
YAKIN (Youth, Adult survivors & Kin In Need)
Twitter: @eirliani

Anne Collier
Founder and Executive Director
The Net Safety Collaborative
Twitter: @annecollier

Lesley Podesta
Young and Resilient Research Center
Western Sydney University
Twitter: @podesta_lesley
