Politics

Non-Profit Ends Partnership with X Amid Concerns Over Child Abuse Content

Chong Wei Liew
Junior Editor
Updated
June 18, 2025 2:41 PM

Ben Goggin / NBC News: Accounts peddling child abuse content flood some X hashtags as safety partner cuts ties  —  Thorn, a non-profit that provides detection and moderation software related to child safety, said it canceled its contract with X after the …


Why it matters
  • A surge in accounts sharing child abuse material raises alarms about safety on social media.
  • Thorn's decision to sever ties with X highlights growing concerns over the platform's ability to manage harmful content.
  • This incident underscores the urgent need for effective moderation tools to protect vulnerable users online.
In a significant move against the proliferation of child abuse content on social media, Thorn, a prominent non-profit organization dedicated to child safety, has announced the termination of its contract with X, formerly known as Twitter. This decision comes amid escalating concerns regarding the presence of accounts disseminating abusive material under various hashtags on the platform.

Thorn has developed specialized software for detecting and moderating content related to child exploitation. Its technology is designed to help social media companies create safer environments for their users, particularly children. However, recent developments have raised serious questions about the efficacy of X's content moderation practices, prompting Thorn to reevaluate the partnership.

The decision to cut ties with X appears to be a direct response to a noticeable increase in accounts actively promoting child abuse content. Reports indicate that these accounts have used specific hashtags to share disturbing images and narratives, alarming child safety advocates and parents alike. Thorn's leadership expressed deep concern over the harm such content poses to vulnerable children and the broader implications for online safety.

The impact of this decision is multifaceted. For one, it represents a growing recognition among non-profit organizations and advocacy groups that social media platforms must be held accountable for the content that is allowed to circulate on their sites. Thorn's exit from the partnership sends a clear message to X and other platforms: merely having moderation software is not enough; effective implementation and active management of that software are crucial.

Thorn's departure may also signal a broader trend within the tech industry, where organizations are increasingly unwilling to associate with platforms that fail to prioritize user safety. This could lead to more stringent expectations for content moderation practices and a push for comprehensive reforms in how social media handles sensitive content.

Critics have long pointed out that the rapid growth of social media has outpaced the development of adequate safety measures. The situation surrounding X is a case in point, where the platform's ability to manage harmful content has come under fire. Thorn's decision to withdraw its support highlights the pressing need for social media companies to invest in better technologies and strategies to combat the spread of abusive content.

In recent years, the conversation around child safety in the digital age has grown increasingly urgent. With children spending more time online than ever before, protecting them from exploitation and abuse has become a critical issue. Thorn's software not only identifies harmful content but also helps prevent such material from being disseminated in the first place. The organization's commitment to child safety has been unwavering, and its recent actions reflect a proactive stance toward the ongoing challenges in this area.

As Thorn moves forward without X, it is expected to seek partnerships with platforms that demonstrate a genuine commitment to safeguarding users. This may involve collaborating with companies willing to invest in robust moderation practices and to prioritize the well-being of their users over profit. The hope is that such partnerships will lead to more effective solutions that combat child abuse content and create a safer online environment.

The implications of Thorn's decision extend beyond just one organization and one platform; they resonate throughout the entire social media landscape. As awareness of online child exploitation grows, so too does the expectation that platforms take decisive action to protect their users. This incident serves as a reminder that the responsibility for ensuring online safety lies not only with non-profits and advocacy groups but also with the tech companies that create and maintain these platforms.

In conclusion, Thorn's withdrawal from X marks a critical juncture in the fight against online child abuse. It underscores the need for social media platforms to take significant steps to strengthen their content moderation and safety protocols, ensuring that no child is left vulnerable in the digital space.

Boston Never Sleeps, Neither Do We.

From Beacon Hill to Back Bay, get the latest with The Bostonian. We deliver the most important updates, local investigations, and community stories—keeping you informed and connected to every corner of Boston.