MarketWatch

Instagram's new teen protections 'do not go far enough.' Here's what else is needed.

By Weston Blasi

The 'new features do not impact what we believe to be the most significant risk of teen Instagram use'

Facebook parent Meta (META) is making a series of modifications it says will help protect children on its social-media platform Instagram.

These new protections will affect users under age 18 in the U.S., U.K., Canada and Australia:

- Default private accounts: Private accounts mean that users have to accept new followers, limiting who can interact with their content.
- Messaging restrictions: Users can be messaged only by people they follow or are already connected to.
- Sensitive-content restrictions: Teenagers' accounts will have the most restrictive setting for sensitive content, which limits material such as depictions of people fighting.
- Interaction limits: Teenage users can be tagged or mentioned only by people they follow, and a hidden-words blocker restricts offensive words and phrases in comments and direct-message requests.
- Time-limit reminders: Teens will be automatically notified if they've been on the Instagram app for more than 60 minutes in a 24-hour period.
- Sleep mode: Notifications will be turned off between 10 p.m. and 7 a.m., during which time the app will send automatic replies to DMs.

The new rules arrive as social-media use by teens has come under growing scrutiny in recent years. Spending large amounts of time on social media has been linked to mental-health issues including depression and anxiety, and numerous academic studies have shown that time spent on these platforms can disrupt sleep, expose teens to bullying and give young people unrealistic views of other people's lives.


But will these new protections make a difference?

"We appreciate any social-media platform that makes design changes to improve the safety of their child users," Marc Berkman, CEO of the nonprofit Organization for Social Media Safety, told MarketWatch. "Unfortunately, we are used to the social-media industry making announcements of seemingly impactful safety changes, only to see that the execution of such policies, usually implemented months after the announcement, does not have a material effect on the risk of harm to adolescents."

No matter what features Meta or any other social-media company might add for its youngest users, Berkman believes it will be difficult to remove some of the obvious dangers social media can pose to children.

"Most importantly, Instagram's new features do not impact what we believe to be the most significant risk of teen Instagram use," he said. "The platform inherently promotes social comparison and consequently negatively impacts teens' body image and self-confidence. That is why Instagram's own research found that the platform makes body-image issues worse for one in three teen girls."

Mental-health issues tied to social-media use are particularly pronounced among young girls. Frequent social-media use was a predictor of worse sleep, online harassment, poor body image, low self-esteem and higher depressive-symptom scores, with a larger association for girls than for boys, according to a 2023 advisory from the U.S. surgeon general.

See: TikTok heads to court over U.S. law that could lead to a ban on the popular platform

Members of other online-safety groups agreed that while the rules are a step in the right direction, they don't fully address the biggest risks of social media.

"We commend the recent efforts to address privacy concerns, but it is crucial that we push for more comprehensive action," Ronn Nozoe, CEO of the National Association of Secondary School Principals, told MarketWatch.

And Lisa Honold, director of the Center for Online Safety, told MarketWatch: "These new Instagram policies do not go far enough."

Several online-safety organizations contacted by MarketWatch stated their support for various pieces of federal legislation that aim to protect kids, rather than relying on social-media companies to police themselves.

One of those bills is COPPA 2.0, which has passed the Senate and awaits consideration in the House of Representatives. Among other things, the bill would ban targeted advertising to children and create an "eraser button" allowing parents to delete children's personal information online. Other bills include the Kids Online Safety Act and Sammy's Law (HR 5778).

All users under age 18 will be automatically transitioned to the new teen accounts and will fall under the new protections. Users ages 16 and 17 will have the ability to turn these features off, while those under 16 will need a parent's permission to change any of these settings.

These new teen accounts will be rolled out within the next 60 days, Meta said.

"Teen Accounts have built-in protections which limit who can contact them and the content they see, and also provide new ways for teens to explore their interests. We consulted with parents, teens, and experts to inform these changes," Meta spokesperson Dani Lever told MarketWatch.

Meta has publicly supported some legislation regarding teens' online activity, including a proposal that would require parental approval for app-store downloads by users under 16 years of age.

Some public officials lauded the move by Meta. New York Attorney General Letitia James called the new rules an "important first step," and Sen. Mitt Romney, a Utah Republican, said it's a "positive step, but more must be done."

Read on: U.S. Surgeon General urges tobacco-like warning label for social media's effect on youths

-Weston Blasi

This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.


(END) Dow Jones Newswires

09-19-24 1534ET

Copyright (c) 2024 Dow Jones & Company, Inc.
