Instagram announced on Thursday that it is officially rolling out its content-restriction policy for teen users worldwide. The policy uses a film-rating-style (13+) filtering standard to automatically block or limit access to content involving extreme violence, nudity, and substance abuse.
Instagram previously piloted these standards in Australia, Canada, the UK, and the US beginning in October 2025. With this global expansion, all teen users on the platform are now subject to the content-filtering system.
Strengthening Protections for Teens
According to an official blog post from Meta, the system will automatically identify and reduce recommendations for posts containing strong language, high-risk stunts, and marijuana-related content. Additionally, Instagram has introduced a "Limited Content" setting, which further tightens filtering thresholds and restricts teens from viewing or engaging with comments on certain posts.
In its announcement, Meta stated: "While we cannot guarantee the system is 100% perfect, we will do our utmost to reduce the chances of teens encountering this type of content and will continue to optimize our algorithms."
This policy shift comes amid significant legal pressure on Meta over teen safety. Last month, courts in New Mexico and Los Angeles ruled against the company, holding it liable for failing to adequately protect the physical and mental well-being of its younger users.
Notably, Meta had previously promoted the mechanism under a "PG-13 film rating" marketing label. The Motion Picture Association (MPA) responded with a cease-and-desist letter, arguing that social media content cannot be equated with the film-rating system and demanding that Meta stop using the analogy. Meta has since downplayed the branding in its latest official statements, instead emphasizing that the mechanism is an independently developed system built on its own content-safety logic.