Elon Musk's social media platform, X, has announced plans to hire 100 content moderators for a new office in Austin, Texas, with a focus on combating child abuse online. The initiative is part of X's efforts to address child sexual exploitation (CSE) on the platform. X's CEO, Linda Yaccarino, stated in a blog post that while the platform is not intended for children and minors, they are taking steps to make it more difficult for bad actors to share or engage with CSE material while simplifying the reporting process for users. Yaccarino also mentioned that X is improving its detection mechanisms to identify more reportable content to share with the National Center for Missing and Exploited Children (NCMEC).
In 2023, X suspended 12.4 million accounts for violating its CSE policies, up sharply from 2.3 million suspensions in 2022. The platform also sent 850,000 reports to NCMEC in 2023, including its first-ever fully automated report — more than eight times the number sent in the period before Elon Musk's ownership group acquired Twitter. Musk has faced criticism for content moderation policy changes since the acquisition, leading some advertisers to reconsider their spending on the platform.
X's efforts to combat CSE come as the platform's value has reportedly decreased by 71% since Musk purchased it. Despite this, X is committed to expanding its content moderation team and establishing a Trust and Safety center of excellence in Austin. Joe Benarroch, X's head of business operations, stated that the team is currently being built, and the goal is to fill the positions by the end of the year, depending on finding the right talent. The new Austin facility will not only focus on combating child abuse but also address other forms of harmful content on social media.
X's new content moderation push, and its focus on countering child sexual exploitation in particular, signals an effort to create a safer environment on the platform. The sharp rise in account suspensions and NCMEC reports offers one measure of that effort, and the planned Trust and Safety center in Austin suggests the company intends to tackle harmful content beyond CSE as well.