By Emma Nitzsche 

On Tuesday, Instagram announced it would be changing its advertising and privacy policies to protect teenagers and other underage users from sexual predators and bullying. The new guidelines come after Instagram faced pressure from lawmakers, regulators, parents, and child-safety advocates worried about the impact of social media on kids’ safety, privacy, and mental health.

“Instagram has been on a journey to really think thoughtfully about the experience that young people have,” said Karina Newton, Instagram’s head of public policy, in an interview ahead of the announcement. “We need to get this right.”

Now, when a user under 16 joins Instagram, their account will be set to private automatically, meaning their posts are visible only to approved followers. Younger users who already have established public accounts will see notifications about the benefits of a private account and how to switch their settings. Instagram said it expects about 80 percent of young people to keep the private settings.

In addition to the privacy settings, Instagram is working to prevent underage users from experiencing unwanted contact from adults. The new software will limit accounts that have shown potentially suspicious behavior.

Instagram cites being repeatedly blocked or reported by young people as an example of such suspicious behavior. The targeted accounts won’t see teenagers’ posts recommended in the Explore and Reels sections, and Instagram won’t suggest they follow teenagers’ accounts. Even if restricted adult users search for specific teens by username, they will be barred from following them and blocked from commenting on their posts.

Both Facebook and Instagram have been working on better ways to verify users’ ages so they can prohibit children under 13 from using the app. Before 2019, Instagram simply asked users to confirm whether they were at least 13, but now all users are required to enter their birthdate before they can create an account. Newton said both apps use artificial intelligence to scan profiles for indicators that a user is under 18, picking up on subtle signals such as what people say in comments wishing users a happy birthday.

One of the groups pushing Instagram to extend the protections is Fairplay, a children’s advocacy nonprofit. Josh Golin, the executive director of Fairplay, said the changes appear to be a step in the right direction.

“There has been such a groundswell to do more to protect teen safety, but also around manipulative behavioral advertising. So it’s good that they’re responding finally to that pressure,” he told NPR.

Alongside Instagram, Facebook also developed new technology to stop accounts with potentially suspicious behavior from seeing or interacting with people under 18 on Instagram. Despite opposition from attorneys general in 44 states and jurisdictions and an international coalition of children’s and consumer groups, the social media platform still intends to create a version of Instagram just for younger users. Facebook CEO Mark Zuckerberg has defended the idea, arguing that younger users are already on Instagram, so it would be better to provide them with a tailored version.