The board reviewed the cases on an expedited basis, citing "exceptional circumstances" following Venezuela's presidential election.
In a groundbreaking and controversial decision released on Thursday, Meta's Oversight Board ruled that users can post "aspirational" statements calling for the death of Venezuelan paramilitary groups known as colectivos. The ruling follows intense debate surrounding two specific posts on Facebook and Instagram, made in the aftermath of Venezuela's highly disputed presidential election. Both posts contained strong language advocating violence against the colectivos.
The board determined that these posts, rather than being explicit calls to action, were merely “statements expressing a hope that violent actors will be killed.” As such, the board concluded that they did not violate Meta’s community standards. This decision has ignited discussions around the balance between freedom of expression and the incitement of violence, particularly in politically volatile contexts.
The Context of the Decision
Venezuela has been grappling with a severe crisis, and the role of colectivos in violently suppressing dissent has been well-documented. The board acknowledged the “limited outlets for free expression” in the country, noting that these paramilitary groups have played a significant role in the brutal crackdown on protesters challenging the election results. The urgency and complexity of the situation led the board to expedite its decision, recognizing the potential for “urgent real-world consequences.”
One of the posts in question was a Facebook video showing a group of men on motorbikes, presumed to be members of a colectivo. The caption bluntly urged people to "kill those damn colectivos," leading Meta's moderators to remove the post for violating the platform's policies against inciting violence. The second post, an Instagram video, featured a woman expressing her disdain for the colectivos, stating, "Go to hell, I hope they kill you all." That post was left up, a decision the board upheld, interpreting it as a "conditional or aspirational statement against a violent actor" rather than a direct call for violence.
A Nuanced Approach to Moderation
The Oversight Board’s decision to uphold the Instagram post and overturn the removal of the Facebook post underscores the nuanced approach that Meta is now adopting towards content moderation in volatile political environments. According to the board, Meta’s policies make a distinction between “permitted ‘statements expressing a hope that violent actors will be killed,’” and “prohibited ‘calls for action against violent actors.’” The board admitted that drawing this line is “particularly difficult” when dealing with threats against colectivos, yet ultimately concluded that both posts fell within the realm of allowable speech.
This ruling is not without precedent. The Oversight Board has previously allowed users to post statements wishing harm upon high-profile political figures, including Russian President Vladimir Putin and Iran's Supreme Leader Ayatollah Ali Khamenei. Similarly, in December 2023 the board overturned Meta's decision to remove a video depicting people storming a police station in Haiti, a ruling that prompted Meta to refine its policies. The updated rules now allow for exceptions in cases where "threats are shared in an awareness-raising or condemning context" or when they target "violent actors, like terrorist groups."
Implications and Controversy
The decision is likely to fuel ongoing debates about the role of social media platforms in managing content that teeters on the edge of violence and free expression. Critics may argue that permitting any form of violent rhetoric, even if aspirational, could lead to real-world harm, especially in regions already destabilized by conflict. Supporters, however, may view this as a necessary allowance for those living under oppressive regimes with few other means of expressing their frustration and anger.
As Meta continues to refine its content moderation policies in response to the Oversight Board’s rulings, the company faces the daunting task of balancing the protection of free speech with the need to prevent harm—a challenge that grows more complex as political tensions around the world continue to rise.