Users on X, formerly Twitter, continue to report arbitrary account shadowbanning and reach restrictions, casting doubt on Elon Musk’s self-proclaimed status as a champion of free speech. Despite promises of increased transparency, the platform remains mired in opaque content moderation practices that lack clear accountability for those affected by sudden, unexplained visibility sanctions.
The Broken Promise of Transparency
Last August, Musk pledged to resolve the platform’s lack of transparency regarding shadowbanning. At the time, he cited a complex codebase—bloated with layers of “trust and safety” software—as the primary obstacle to identifying why accounts were being suspended or restricted. He promised a ground-up rewrite to simplify these processes. More than six months later, however, the complaints persist, suggesting little progress has been made.
Lilian Edwards, an Internet law academic at Newcastle University, is among the latest to experience these draconian restrictions. Her replies to threads have been hidden—even from her own followers—replaced by a “this post is unavailable” notice. She also reported that her entire DM history temporarily disappeared and that the platform stopped sending notifications for private messages, severely hindering her ability to communicate.
Users Caught in a “Spam” Net
The impact of these automated flags is extensive. Users report that replies are hidden, mentions fail to appear in notification tabs, and push notifications are suppressed. Some users, including the account @gateklons, believe these issues stem from newly deployed, error-prone spam detection algorithms. In some cases, the platform appears to rely on crude signals, such as IP addresses from travel locations, to trigger these flags, regardless of the user’s actual behavior.
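To see why an IP-based signal misfires for travellers, consider a deliberately simplified sketch. This is purely illustrative—nothing is known about X’s actual detection logic, and the function, threshold, and country codes here are invented for the example—but it shows how a rule keyed to login location alone flags exactly the holiday scenario Edwards describes:

```python
# Illustrative sketch only — NOT X's actual system. A crude heuristic that
# flags an account purely because a recent login came from outside the
# user's usual country, the kind of signal users suspect is misfiring.

def is_flagged(login_countries, home_country="GB"):
    """Return True if any recent login originated outside the home country."""
    return any(country != home_country for country in login_countries)

# A user on holiday trips the flag despite posting less than usual:
print(is_flagged(["GB", "GB", "ES"]))  # True — one login from Spain
print(is_flagged(["GB", "GB"]))        # False — all logins from home
```

A rule like this has no notion of the account’s actual behavior, which is precisely the complaint: the trigger is where the user is, not what the user posts.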
When reached for comment, X’s automated press system provided no substantive response, leaving users in the dark. As Edwards noted, being accused of “platform manipulation” while posting less than usual—because she was on holiday—is both baffling and infuriating.
The EU’s Digital Services Act: A Regulatory Turning Point
While global users remain frustrated, those in the European Union may soon see relief. The EU’s Digital Services Act (DSA) has designated X as a “very large online platform” (VLOP), subjecting it to strict transparency and content moderation requirements. The European Commission has already opened a formal investigation into X, specifically citing concerns over its content moderation policies and transparency.
Article 17 of the DSA is particularly critical here, as it requires platforms to provide a “clear and specific statement of reasons” whenever they restrict the visibility of a user’s content. This must include the facts behind the decision, the involvement of automated tools, and a clear path for appeal. Currently, X’s generic claims of “spam” or “platform manipulation” fail to meet these legal standards, potentially exposing the company to fines of up to 6% of its global annual turnover.
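To make the gap concrete, here is a hedged sketch of the elements Article 17 requires a statement of reasons to contain. The field names and values below are illustrative inventions, not an official schema or anything X actually produces; they simply map the article’s requirements—the facts behind the decision, whether automated tools were involved, the ground relied on, and the available redress—onto a data structure:

```python
# Hypothetical statement of reasons covering the elements Article 17 DSA
# lists. Field names are illustrative, not an official EU or X schema.
statement_of_reasons = {
    "decision": "visibility_restricted",   # the specific restriction imposed
    "facts_and_circumstances": (
        "Replies hidden after an automated spam classifier flagged the account"
    ),
    "automated_means_used": True,          # whether automated tools made or informed the decision
    "ground_relied_on": "Platform rules prohibiting spam and platform manipulation",
    "redress": [                           # avenues of appeal Article 17 requires
        "internal complaint-handling system",
        "out-of-court dispute settlement",
        "judicial redress",
    ],
}

# A bare "spam" or "platform manipulation" label supplies none of this:
for field, value in statement_of_reasons.items():
    assert value, f"Article 17 element missing: {field}"
```

Seen this way, X’s current one-word labels omit every element on the list, which is what exposes the company to the enforcement risk described above.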
A Path Toward Compliance or Continued Chaos?
Whether this failure to provide transparency is a deliberate attempt to circumvent the law or simply a byproduct of post-acquisition technical debt remains unclear. Former Twitter head of trust and safety Yoel Roth previously explained that the platform’s reliance on non-machine-readable “free-text notes” makes modernizing enforcement a significant hurdle. Furthermore, the massive headcount cuts initiated by Musk have likely exacerbated the difficulty of untangling these legacy systems.
EU Internal Market Commissioner Thierry Breton has already signaled that “arbitrarily suspending accounts” is unacceptable. As the Commission continues its probe, the pressure on X to resolve its “Gordian Knot” of content moderation is mounting. Balancing the desire for a “free speech” platform with the legal requirements of the DSA will be the ultimate test for Musk’s leadership of the company.
