Instagram still posing serious risks to children, campaigners say

Tony Smith and Angus Crawford
BBC News Investigations

Young Instagram users could still be exposed to "serious risks" even if they use new Teen Accounts brought in to provide more protection and control, research by campaigners suggests.

Researchers behind a new report said they were able to set up accounts using fake birthdays, after which they were shown sexualised content and hateful comments and were recommended adult accounts to follow.

Meta, which owns Instagram, says its new accounts have "built-in protections" and it shares "the goal of keeping teens safe online".

The research, from online child safety charity 5Rights Foundation, is released as Ofcom, the UK regulator, is about to publish its children's safety codes.

They will outline the rules platforms will have to follow under the Online Safety Act. Platforms will then have three months to show that they have systems in place which protect children.

That includes robust age checks, safer algorithms which don't recommend harmful content, and effective content moderation.

Instagram Teen Accounts were set up in September 2024 to offer new protections for children and to create what Meta called "peace of mind for parents".

The new accounts were designed to limit who could contact users and reduce the amount of content young people could see.

Existing users would be transferred to the new accounts and those signing up for the first time would automatically get one.

But researchers from 5Rights Foundation were able to set up a series of fake Teen Accounts using false birthdays, with no additional checks by the platform.

They found that immediately on sign up they were offered adult accounts to follow and message.

Instagram's algorithms, they claim, "still promote sexualised imagery, harmful beauty ideals and other negative stereotypes".

The researchers said their Teen Accounts were also recommended posts "filled with significant amounts of hateful comments".

The charity also had concerns about the addictive nature of the app and exposure to sponsored, commercialised content.

Baroness Beeban Kidron, founder of 5Rights Foundation, said: "This is not a teen environment."

"They are not checking age, they are recommending adults, they are putting them in commercial situations without letting them know and it's deeply sexualised."

Meta said the accounts "provide built-in protections for teens limiting who's contacting them, the content they can see, and the time spent on our apps".

"Teens in the UK have automatically been moved into these enhanced protections and under 16s need a parent's permission to change them," it added.

Baroness Beeban Kidron, founder of 5Rights Foundation (image: UK Parliament)

In a separate development, BBC News has also learned of the existence of groups dedicated to self-harm on X.

The groups or "communities", as they are known on the platform, contain tens of thousands of members sharing graphic images and videos of self-harm.

Some of the users involved in the groups appear to be children.

Becca Spinks, an American researcher who discovered the groups, said: "I was absolutely floored to see 65,000 members of a community."

"It was so graphic, there were people in there taking polls on where they should cut next."

X was approached for comment, but did not respond.

But in a submission to an Ofcom consultation last year X said: "We have clear rules in place to protect the safety of the service and the people using it."

"In the UK, X is committed to complying with the Online Safety Act," it added.