Snapchat says it is aiming to make its teen-heavy platform even safer.
A tool for removing age-inappropriate content is among the new features and policies that parent company Snap announced Thursday it is implementing to help protect users between the ages of 13 and 17. Other changes include limits on friend suggestions. Along with an updated website outlining its teen safety and parental control policies, the company also released a series of YouTube videos explaining the features to parents.
The launch of these features coincides with mounting pressure on social media platforms from policymakers, educators, and parents to shield young users from objectionable material, unwanted adult attention, the sale of illegal drugs, and other dangers. At a fall 2021 Senate committee hearing on youth safety on social media, a Snap executive testified alongside officials from TikTok and YouTube, pledging new tools to help parents keep their teens safe. Like other platforms, Snapchat has since rolled out a number of new teen safety and parental control options.
Thursday's announcement follows last year's introduction of Snapchat's Family Center, which gives parents more insight into whom their teens are communicating with on the messaging service. Other teen safety features in the app include barring users under 13 from creating public accounts and turning off the location-sharing feature Snap Map by default for teens.
In an effort to keep teens from adding people they don't know in real life, Snapchat will now require users aged 13 to 17 to have a greater number of mutual friends with an account before it will show up in search results or as a friend suggestion. Teens who try to add an account with no mutual Snapchat friends or phone book contacts will also receive a pop-up warning from the app.
“When a teen becomes friends with someone on Snapchat, we want to be confident it is someone they know in real life — such as a friend, family member, or other trusted person,” the company said in a blog post.
In its Stories and Spotlight sections, where users can post content publicly, Snapchat will also enforce a new strike system for accounts promoting content unsuitable for teens. If the company receives a report about, or itself detects, inappropriate content, it will remove it immediately and give the poster’s account a strike. The platform says that if a user receives “too many strikes over a defined period of time, their account will be disabled,” though it is unclear exactly how many strikes would trigger a suspension.
Teen users will also begin seeing in-app content designed to educate them about online risks such as catfishing and financial sextortion, in which a victim is coerced into sharing nude photos and then blackmailed for money. This content will also list hotlines users can call for help. The PSA-style content will be promoted on Snapchat’s Stories platform and in response to particular search terms or keywords.