Meta launched Instagram Teen Accounts on Tuesday, offering a more restricted experience for younger users of the platform, the tech company’s latest attempt to ease concerns about the impact of social media on kids.
Meta will automatically migrate all Instagram users under 16 to the new service, which offers built-in protections through parent-controlled settings. The move aims to address growing criticism that social media can harm young people’s mental health and to reassure parents who want to know what their children are exposed to and who they can interact with.
Teen Account profiles are automatically made private, so they can only be viewed by followers the teen approves. The new tool also restricts who can message teens, lets parents see who their children are communicating with, and mutes notifications at night. These features can be disabled, but only with parental permission.
“We know parents want to feel confident that their teens can use social media to connect with their friends and explore their interests, without having to worry about unsafe or inappropriate experiences,” Meta said in a statement Tuesday. “We understand parents’ concerns, and that’s why we’re reimagining our apps for teens with new Teen Accounts.”
According to Meta, a new “Explore” feature on Instagram lets teens choose which topics they want to see more of, while giving parents and carers more control over their child’s usage.
Legal pressure for changes
Meta’s global head of safety, Antigone Davis, told CBS News that the company consulted with parents of teens to develop Teen Accounts, and that the changes will affect tens of millions of Instagram users. Meta has made incremental changes over the years, but the new service “standardizes the experience,” she said.
“It gives parents peace of mind. Their teens are in a certain set of protections,” Davis said, adding that Meta is seeking to “reimagine how parents and teens interact online.”
In 2023, dozens of states sued Meta, alleging that the company intentionally designed Instagram and Facebook to make young users addicted in order to increase profits. The lawsuits also accused Meta of violating federal law by collecting data from children under the age of 13 without parental consent.
Meta has denied the allegations, saying it is focused on providing teens with a “positive online experience” and has introduced numerous tools aimed at making social media safer for them.
How do Teen Accounts work?
Meta said that Teen Accounts require parental permission before users under 16 can loosen the restrictions. Additional supervision features let parents see who their teens are messaging and how much time they spend on the platform. Parents can also block teens from accessing Instagram at certain times of the day.
To keep teenagers from misrepresenting their age, Meta appears to be asking them to upload an ID or verify their age using a tool called Yoti, which analyzes a person’s facial features to estimate whether they are over or under 18.
Teens will be notified that their accounts will be migrated to Teen Accounts. The migration is expected to happen within 60 days in the US, UK, Canada and Australia.