Ankita Tripathy
X Platform Comes Up With New Data, Showcasing Its Efforts To Eliminate CSAM
X has made another move that invites skepticism from the press and media: it published new content moderation updates in the middle of the holiday season, when audience attention is at its lowest thanks to the Christmas and New Year festivities.
On 28th December 2023, X published an update on changes to its content moderation policy. In particular, it claimed that tackling child sexual exploitation had been the platform's top priority over the past year. It added that the platform has zero tolerance for the exploitation of children and will not allow anyone attempting to exploit minors in any way.
Detailing the platform's progress in 2023, X stated:
From January to November of 2023, X permanently suspended over 11 million accounts for violations of our CSE policies. For reference, in all of 2022, Twitter suspended 2.3 million accounts. In the first half of 2023, X sent a total of 430,000 reports to the NCMEC CyberTipline. In all of 2022, Twitter sent over 98,000 reports.
Not only are we detecting more bad actors faster, we’re also building new defenses that proactively reduce the discoverability of posts that contain this type of content. One such measure that we have recently implemented has reduced the number of successful searches for known Child Sexual Abuse Material (CSAM) patterns by over 99% since December 2022.
However, doubts remain about how X will sustain its stance against child sexual abuse material, or CSAM. Today, when X identifies and blocks a particular hashtag, peddlers simply switch to another hashtag on the platform to keep circulating the material.
So, with X's update arriving at a time when everyone is busy with the festivities, third-party researchers and watchdogs remain unconvinced that X will tackle the problem sincerely.