A person walks past a sign bearing Meta's logo at the company's Menlo Park headquarters in October 2021. The company is introducing new safety measures for younger Instagram users after facing widespread scrutiny. / AFP via Getty Images

Facebook parent company Meta is rolling out additional parental supervision measures for Instagram and its virtual reality headset, expanding on a suite of tools released in the U.S. in recent months.

The changes come on the heels of a year of intense public scrutiny for the company, with significant criticism focused on child safety and Instagram's detrimental effects on younger users, particularly teenage girls.

Last fall, a Wall Street Journal investigation reported that the company's studies had repeatedly confirmed the harmful effects of the photo-sharing app on teenage girls' mental health, even as Meta proceeded with a controversial plan to develop a version of the social media platform for kids under 13. (That project has since been put on hold.)

The ensuing months brought additional revelations from whistleblower Frances Haugen, a congressional inquiry about child safety and an investigation by several states' attorneys general into how Instagram recruits and affects children.

The company announced in December that it would be releasing new safety tools aimed at teens and their parents, which it started rolling out in March.

Instagram says users must be at least 13 years old in order to create an account — a rule that's easy to skirt because the app has no age verification process.

Antigone Davis, Meta's head of safety, told Morning Edition that the company is working on specific safeguards — like developing artificial intelligence to better identify underage users — but it remains a challenge.

"There really is no panacea for solving that problem," she said. "That's a problem that the industry faces, and we're trying to come up with multiple ways to address that issue."

In the meantime, Meta is taking steps to give parents and guardians more oversight of their kids' activities in virtual reality and on Instagram — implementing some of the changes that it had first teased back in March.

Meta announced on Tuesday that it is rolling out parental supervision tools to all of its Oculus Quest virtual reality headsets, and expanding certain parental controls on Instagram in the U.S. before launching others in more than half a dozen countries.

The new features will allow parents to approve or deny requests to purchase certain apps for the Quest, to block apps that may be inappropriate for younger users and to view their child's apps, headset screen time and list of Oculus Friends. Parents also can prevent their teen from accessing content from their PC on their Quest headset by blocking Link and AirLink.

The teen must initiate the process, and both parties have to agree before a parent can link to the teen's Quest account, Meta added.

On Instagram, parents and guardians can now invite their teens to initiate supervision tools (a process that previously worked only in the other direction), set limits on their teen's use of Instagram during specific times of the day or days of the week, and see more information when their teen reports a post or account.

Instagram also will launch new "nudges" for teen users in certain countries, encouraging them to switch to a different topic if they're repeatedly looking at the same type of content on their Explore page.

"We designed this new feature because research suggests that nudges can be effective for helping people — especially teens — be more mindful of how they're using social media in the moment," Meta explained. The company cited internal research showing from a one-week testing period, which showed that one in five teens who saw the new nudges switched to a different topic.

The company says it soon will launch reminders for teens to turn on its existing Take a Break feature when they've been scrolling through Reels for a certain length of time.

As part of this new suite of updates, Meta is also working to provide parents and guardians with more information and resources. It says it's adding new articles — including tips for talking to teens about various online topics — to its Family Center education hub, and launching a parent education hub for virtual reality.

"This is just a starting point, informed by careful collaboration with industry experts, and we'll continue to grow and evolve our parental supervision tools over time," it adds.

The announcement came shortly after the company was hit with eight lawsuits across the country, all of which accused it of deliberately making Instagram and Facebook addictive to young people in order to boost Meta's profits, as Bloomberg reported.

A Meta spokesperson declined to comment to Bloomberg on the litigation, but noted the time limits and other parental controls it has developed for Instagram.

Editor's note: Meta pays NPR to license NPR content.

Copyright 2022 NPR. To see more, visit https://www.npr.org.