Twitter is now without a group dedicated to keeping the platform safe from hate speech, suicide and self-harm content, and other dangerous activity.
The platform recently disbanded its Trust and Safety Council, deeming it no longer the best way to bring external insights into its product and policy development work. The full implications of the council's dissolution remain unclear as of this writing.
Twitter Trust and Safety Council Disbandment Details
Council members received an email from Twitter on Dec. 12 stating that the council has no place in the app's future under The Big Twit himself, Elon Musk.
According to a screenshot of the company's email shared by Anthony DeRosa, Twitter's leadership believes that the work to make Twitter a "safe, informative place" will move faster and more aggressively than ever before, and that the Trust and Safety Council is not the right structure to keep pace with it.
You may remember that three prominent members resigned from the council, with one of them, Anne Collier, writing in a letter that Musk ignored the group despite his promises on content moderation and user safety. Collier and the two other members also said that the "safety and well-being" of Twitter's users had declined since Musk took over the company almost two months ago.
The council's disbandment comes at a bad time for Twitter. According to Business Insider, the platform has seen a steady increase in hate speech on its site. Normally, Twitter would take down hate speech at the behest of the council's members. With the council dissolved, however, it is unlikely that hate speech on the platform will be dealt with promptly.
For those unaware, Twitter's Trust and Safety Council, which was formed in 2016, provided input on various content and human-rights-related issues. These included the removal of Child Sexual Abuse Material, hate speech, suicide prevention, online safety, and other problems on the platform, per AP News.
Twitter originally created the council to move away from a US-centric approach to user safety, foster stronger collaboration across regions, and bring experienced people onto its safety team. The council consisted of roughly 100 independent researchers and human rights activists.
With automated content moderation in place, however, Twitter now considers having these experts around redundant, saying that its AI has vastly improved since the days when outside expertise was needed.
Twitter's Content Moderation Future
It is unknown whether Twitter's Trust and Safety Council will be replaced by Musk's Content Moderation Council, the same one he promised in the early days of his takeover of the company.
Whether or not Musk replaces the council with his own, experts agree that the platform needs its content moderation back, and fast.
The Center for Countering Digital Hate and the Anti-Defamation League reported a sharp increase in hate speech, racist and homophobic slurs, and antisemitic posts like the one Ye made.