Snapchat introduces parental controls to help teens manage social media use

Snapchat is rolling out new parental controls that allow parents to see their teen’s contacts and confidentially report to the social media company any accounts that may concern them. A child lies on a bed lit by the glow of a cell phone.

Elva Etienne / Getty Images

Snapchat is rolling out parental controls that allow parents to view their teen’s contacts and report to the social media company, without their child’s knowledge, any accounts that may concern them.

The goal, company officials say, is to enable parents to monitor their child’s connections without compromising the teen’s autonomy. Named Family Center, the new suite of tools released Tuesday requires both caregiver and teen to opt in.

“It allows parents to see who’s in their teen’s universe,” said Nona Farahnik, director of platform policy for Snap, the company that created Snapchat. “It gives parents the ability to ask who someone might be, how they might know a contact, which prompts those kinds of real-time conversations about who teens are talking to.”

Farahnik says the Family Center is based on real-life parenting.

“If your teen is going to the mall, you might ask who they’re going with. ‘How do you know them? Are you guys on a sports team together? Do you go to school together?’” Farahnik said. “But you wouldn’t be sitting at the mall with them listening to their conversations.”

Likewise, parents can’t see the content their teens are sending or receiving on Snapchat. They can only see who their child has communicated with in the last seven days. Snapchat is popular among young people partly because messages on the platform are designed to disappear after 24 hours.

The company says it consulted with security experts and academics and conducted focus groups with parents to develop the Family Center and plans to launch more features in the coming months. This tool is only for parents of children under the age of 18.

With Family Center, Snap is following other social media platforms, including Instagram, which recently stepped up its parental controls. By at least one survey, Snapchat is the second most popular social network among teens. The first, TikTok, offers “Family Pairing,” which gives parents a few ways to limit the videos shown to their kids.

A promotional screengrab of Snapchat’s new Family Center, which the company shared ahead of the rollout.

Snapchat

“I think these platforms want to show that they can take steps to protect children, that they can self-regulate and that they are able to do it themselves without the government getting involved,” said Irene Ly of Common Sense Media, which reviews apps, games and media for families.

Bipartisan legislation in Congress would require more comprehensive changes targeted at protecting children on social media, though lawmakers have yet to vote on the measures.

Advocate: Social media networks should be ‘safe by design’ for kids

Parental controls can be helpful for some families, says Josh Golin, executive director of Fairplay, an advocacy group focused on improving online safety for children. But they require parents to have the time, energy and commitment to figure out the tools each platform offers and to use them regularly.

“Are you going to spend 20 minutes a day figuring out what’s going on in Snap and 20 minutes on TikTok and 20 minutes on Instagram?” he said. “I don’t think parents particularly want to spend their time like this. What they would love to see is these platforms take real steps to be safe by design.”

For example, Golin says, it should be easy for kids to put down their phones and take a break from social media.

“At 12, you might feel like, ‘Oh my god, my life is going to end if I don’t communicate with my friend on Snapchat today,'” Golin said. “I don’t think we should be giving kids rewards and badges and things for using online platforms more. It’s not promoting intentional, thoughtful use. I think it’s promoting compulsion and only the company benefits.”

Snap’s terms of use require a child to be 13 or older to sign up for the service. Snap says it screens for underage users in compliance with the Children’s Online Privacy Protection Act.

“There are already millions of young people on Snap, millions of whom are under the age of 13 and shouldn’t even be there in the first place,” Golin said.

He says companies could do a better job of verifying their users’ ages rather than simply taking users at their word.

Common Sense’s Ly says companies can also look at how their algorithms amplify content that is harmful to children.

For example, Ly said, a child might interact with a post that encourages healthy eating for a fitness routine. But algorithms built to show users more of what they like can quickly lead that child down a rabbit hole of misinformation about disordered eating or other harmful eating habits.
