Graphic violence and disturbing content flooded Instagram users’ feeds this week after a technical error in the platform’s recommendation system.
Meta confirmed and fixed the issue on February 27 after thousands of users saw videos of murders, shootings, and animal abuse despite having content filters turned on.
“We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” a Meta spokesperson told CNBC. “We apologize for the mistake.”
Users around the world saw their normal feeds suddenly fill with graphic videos. One Reddit user described a Reels page inundated with “school shootings and murder.” Many reported seeing videos, often labeled “sensitive content,” of people being beaten or killed.
CNBC reporters viewed several Instagram posts showing dead bodies, injuries, and violent attacks during the glitch. The videos appeared even for users who had Instagram’s “Sensitive Content Control” set to its strictest level.
The timing has raised questions about Meta’s recent content policy changes. In January, the company cut back its automated content removal to focus mainly on “illegal and high-severity violations” like terrorism and child exploitation. Meta also said it would swap third-party fact-checking for a user-based “Community Notes” system.
When announcing these changes, CEO Mark Zuckerberg admitted the company would “catch less bad stuff” but said it would allow more free speech on its platforms.
A Meta spokesperson, however, told reporters the error had nothing to do with the recent content policy changes.
The glitch also came after Meta cut nearly 25% of its staff in 2022 and 2023, including many members of its trust and safety teams. Critics suggest the January policy changes came as the company tried to improve its standing with the Trump administration.
Jason Koebler of 404 Media tested the glitch by logging into an affected account. Many of the violent videos he saw had thousands of likes and hundreds of comments.
“When we talk about ‘content moderation,’ the vast majority of the job is deleting videos of terrorism, murder, horrific violence,” Koebler wrote. “Meta has signaled that it intends to do less content moderation overall.”
While Meta has resolved this particular error, the incident highlights challenges in balancing algorithmic content moderation with the company’s shifting stance on free expression across its platforms.