In recent days, Instagram and Threads users have experienced widespread moderation issues, including locked accounts, disappearing posts, and other frustrating problems. On Friday, Instagram’s head, Adam Mosseri, addressed these concerns, acknowledging that the platform had “found mistakes.” Surprisingly, the company attributed the errors not to its automated systems or AI, as many might expect, but to human content reviewers.
Mosseri took to Threads to explain the situation, admitting that human moderators had made errors while reviewing flagged content. He clarified that these reviewers were missing key context about how conversations played out on the platform, which led to inaccurate decisions. “The mistakes were made by people, not our automated systems,” he said. He added that Instagram is actively working to fix the problem by giving reviewers the context they need to make more informed decisions and reduce the error rate.
However, Mosseri’s explanation left some questions unanswered. Some users’ accounts were mistakenly labeled as belonging to individuals under the age of 13 and were subsequently disabled. It’s unclear how a human reviewer, even without context, would arrive at such an assumption. This situation has raised doubts about whether human error alone can account for the full scope of the issues users are experiencing.
Tool Malfunctions & Ongoing Investigations
In response to a user comment on Threads, Mosseri elaborated further, revealing that part of the problem stemmed from a malfunctioning tool that failed to provide reviewers with the proper context for making moderation decisions. “One of the tools we built broke, and so it wasn’t showing them sufficient context,” he said, acknowledging that this was Instagram’s oversight.
Instagram later confirmed that not all user complaints were tied directly to human moderator errors. The platform pointed to the aforementioned tool failure as a contributing factor, while also admitting that other issues, such as accounts being flagged as belonging to underage users, were still under investigation. While the company is exploring various explanations, it’s unclear whether Instagram will ever provide a detailed breakdown of the full range of problems users have encountered.
Accounts Disabled, Posts Disappearing, and Engagement Drops
The issues extended beyond just account suspensions. Many users, including prominent figures, noticed a sharp decline in post engagement and follower growth. Posts were being downranked, flagged as spam, or even removed, despite users having no history of spam-like behavior.
One such case was brought to light by former Wall Street Journal tech columnist Walt Mossberg. Mossberg, who had previously enjoyed significant engagement on Threads, saw his likes plummet from an average of 100–1,000 per post to 0–20 within just 24 hours. This dramatic drop in engagement left him baffled and concerned about the platform’s handling of content visibility.
Social media strategist Matt Navarra echoed these concerns, noting that not only had he encountered moderation issues himself, but he had also observed a growing number of users reporting steep declines in follower growth and engagement. “It’s like it fell off a cliff,” Navarra remarked, highlighting the abruptness of the drop-off.
Rivals Seizing the Opportunity
While Instagram and Threads grappled with these ongoing moderation problems, competitors wasted no time in capitalizing on the platform’s woes. One such rival was Bluesky, a social networking startup positioning itself as an alternative to both X (formerly Twitter) and Threads. In a savvy marketing move, Bluesky created an account on Threads to drive frustrated users toward its platform, promoting its features and touting a safer, more reliable user experience.
Bluesky’s opportunistic move underscores the fierce competition in the social media landscape, where even a brief stumble by a major player like Instagram can give newer, smaller platforms an opening to lure users away.
Mosseri’s Commitment to Improvement
In his statement on Threads, Mosseri acknowledged the gravity of the situation and the need for Instagram to improve its moderation processes. “We’re trying to provide a safer experience, and we need to do better,” he said. Despite his assurances, Mosseri hinted that the problem may not be fully resolved anytime soon, urging users to remain patient and continue providing feedback as the company works through these issues.
For many users, however, this explanation may not be enough to restore confidence in the platform’s ability to handle moderation fairly and effectively. As Instagram continues to address these problems, it remains to be seen whether the company can fully regain the trust of its user base — especially those who have already been affected by the recent moderation missteps.
The Bigger Picture: Moderation Challenges in Social Media
The situation Instagram is facing shines a spotlight on the broader challenges of content moderation across all social media platforms. As platforms like Instagram, Threads, X, and others grapple with increasing amounts of user-generated content, they rely on a combination of automated systems and human moderators to manage the flow of information. However, as this recent episode demonstrates, even the best-intentioned systems and reviewers can make mistakes — especially when they lack the necessary context to make nuanced decisions.
In the age of AI and automation, users often assume that the root of moderation problems lies with faulty algorithms. But as Mosseri’s comments make clear, the role of human moderators remains a critical (and sometimes flawed) component of the process. Striking the right balance between human oversight and machine learning will be an ongoing challenge for social media platforms as they strive to create safer, more equitable spaces for users to engage.
Until Instagram resolves these issues, many users may remain cautious about their engagement on the platform, while competitors like Bluesky stand ready to offer alternatives. Mosseri’s call for patience will undoubtedly be tested as the company works to regain its footing in the eyes of its users.
Conclusion
Instagram’s latest moderation challenges have sparked frustration across its user base, with a mix of human error and broken tools causing significant disruptions. While Instagram is working to rectify the situation, the full scope of the problem remains unclear. Competitors, such as Bluesky, have taken advantage of the chaos to promote their platforms. As Instagram navigates these challenges, its success will hinge on how quickly and effectively it can address both human and systemic issues in its moderation process.