User Beware: Disturbing Report Reveals How Much Sexual Content Is Fed to Minors on Instagram
A disturbing new report is shedding potentially damning light on one of the world’s most popular social media platforms.
In a joint effort with Laura Edelson, a computer science professor at Northeastern University, The Wall Street Journal reported that Meta-owned Instagram aggressively pushes sexual content to users as young as 13 years old.
“Instagram regularly recommends sexual videos to accounts for teenagers that appear interested in racy content, and does so within minutes of when they first log in,” the news outlet reported.
WSJ collected a number of examples, and they go well beyond what one might expect from “Instagram models.”
WARNING: The following contains descriptions that some readers will find offensive or disturbing.
Some examples of the sexual content WSJ reported on included:
- A video of a woman pulling her underwear off (from a first-person perspective, so you see nothing but feet and underwear), alongside a caption of “When he says he’s into small Asian white sophomore nursing students.”
- Two images making coded references to masturbation (alongside an image of a woman in lingerie) and unprotected sex (alongside an image focused on a woman’s buttocks), respectively.
- A short video of a woman in a skirt provocatively spreading her legs (nothing is shown) to the caption “I’ve learned that 40 is…”
- A young woman in lingerie taking a selfie on a bed.
- A provocatively zoomed-in image of a woman in a red dress.
Edelson and the WSJ investigated this concerning phenomenon by creating new dummy Instagram accounts and setting the profile age to 13. Those dummy accounts would then simply “watch” Instagram Reels, the platform’s algorithmically curated short-video feed. The tests were performed primarily from January through April, and the Journal performed an additional test in June.
“Instagram served a mix of videos that, from the start, included moderately racy content such as women dancing seductively or posing in positions that emphasized their breasts,” The Journal reported. “When the accounts skipped past other clips but watched those racy videos to completion, Reels recommended edgier content.
“Adult sex-content creators began appearing in the feeds in as little as three minutes.
“After less than 20 minutes watching Reels, the test accounts’ feeds were dominated by promotions for such creators, some offering to send nude photos to users who engaged with their posts.”
That is a swift and aggressive escalation of sexually charged offerings.
For Instagram parent company Meta (formerly Facebook), these are familiar issues.
As CNBC noted back in January, Meta said it would restrict the sort of content minors see on its platforms. That promise came as the company faced scrutiny over its platforms being addictive and harmful to the mental health of young people.
Those restrictions, if they were ever actually put in place, clearly aren’t working, and this very much appears to be a Meta problem.
As the Wall Street Journal noted, similar tests conducted on Snapchat and TikTok did not yield results anywhere near as aggressive as those on Instagram.
Meta, for its part, shrugged off this report as much ado about nothing.
“This was an artificial experiment that doesn’t match the reality of how teens use Instagram,” Meta representative Andy Stone told the outlet.
Even if these findings are the nothingburger Stone makes them out to be, the report should still serve as a sobering reminder of how important it is for parents to be aware of what sort of content their children are consuming.