Musk’s Twitter is Failing to Remove Child Porn, Even Some of the ‘Easiest to Detect and Eliminate’: NY Times Analysis

“Removing child exploitation is priority #1,” Musk tweeted in November, one of many comments he has made declaring an intention to devote resources to keeping child sexual abuse material (CSAM) from being disseminated and amplified on Twitter.

But CSAM has continued to proliferate on Twitter, according to the Times' analysis, "including widely circulated material that the authorities consider the easiest to detect and eliminate."

The Times created an individual Twitter account and an "automated computer program that could scour the platform for [CSAM] without displaying the actual images, which are illegal to view," then reviewed what content was available on Twitter, how it was presented, and what actions were taken against it.

“The material wasn’t difficult to find,” the Times reported, and in many cases Twitter was actually promoting it through the platform’s “recommendation algorithm — a feature that suggests accounts to follow based on user activity.”
