
Meta is begging the government for new rules on age verification. But is it really trying to stick it to TikTok?

Nov 18, 2023, 02:42 IST
Business Insider
Meta suggests parents should be the ones to approve any app downloads for 13-to-16-year-olds. rbkomar / Getty
  • Meta published a blog inviting legislation that would require parental approval to download apps.
  • Thirty-three states are suing Meta, alleging it ignored warnings that Instagram harmed teens.

Meta is asking for federal rules that would require kids and younger teens who want to download apps — like Facebook and Instagram, for instance — to get permission from their parents first.

Yes, you read that right: Meta is asking for more government regulation. But its motivation might not be totally altruistic. More on that in a minute.

Mark Zuckerberg's company made the call for the new rules in a blog post earlier this week. Antigone Davis, Meta's global head of safety, described its hopes for US government regulation in the post titled: "Parenting in a Digital World Is Hard. Congress Can Make It Easier."

Sounds amazing, right? Well, maybe not so fast.


For starters, making it harder for 13-to-16-year-olds to download social apps would probably hurt TikTok far more than it would Meta. It would be a nice upside for Meta if this regulation just so happened to slow the growth of a main competitor, one that dominates the younger audience.

Although Meta's apps are somewhat rebounding, TikTok's chokehold on younger kids is stark compared with Meta's.

A recent study showed that kids 11 to 17 who use TikTok spend a median of nearly 2 hours daily on the app — compared to just one minute on Meta's Facebook (LOL) and 16 minutes on its Instagram.

TikTok has faced its own challenges with possible government regulation, although for a different set of reasons — mainly concerns over its ties to China.

And under Meta's proposed rules, the nuts-and-bolts of age verification wouldn't be the responsibility of social media companies. Punting verification to Apple's App Store and the Google Play Store would likely make things easier for Meta.


It would appear to turn Meta's problem into Apple's and Google's problem.

(And keep in mind that all these social platforms can also be accessed on a browser; you don't need an app.)

Most crucially, Meta happens to be in the middle of a lawsuit over teen safety on its apps. Thirty-three states, including New York and California, are suing Meta, alleging it purposely designed its apps to be addictive to children and teenagers — knowing the social apps risked the mental health and wellbeing of young people.

And last week, buried in a Massachusetts filing, was a potential bombshell: the lawsuit claimed Mark Zuckerberg repeatedly ignored another executive's requests to invest more resources in improving teen wellbeing, and that he ignored other internal warnings about the harmful effects of Instagram beauty filters on teen girls.

Meta said at the time it has a "robust central team overseeing youth well-being efforts across the company," and said the legal complaint was "filled with selective quotes from handpicked documents" that don't provide the full context.


As for its call for federal legislation on age verification, a Meta spokesperson told me it's to make sure there's one national standard for every app.

"We've always said that we support internet regulation, particularly when it comes to young people," said Meta spokesperson Faith Eischen. "That said, we're concerned about what is amounting to a patchwork of divergent laws across a variety of US states. Laws that hold different apps to different standards in different states will leave teens with inconsistent online experiences."

Meanwhile, a bipartisan bill working its way through Congress would update the Children's Online Privacy Protection Act, or COPPA, the 1998 law that regulates privacy for users under 13. The beefed-up version, "COPPA 2.0," would also prevent data collection on users under 16 and ban targeted advertising to kids and teens.

Meta might not like that regulation quite as much.

Danny Weiss, chief advocacy officer of Common Sense Media, an organization that provides parental resources for navigating technology, told me: "Meta is really not in a position to be recommending what is good or not good for children and teens online, given that at the highest levels, Meta has sought to prevent steps that would protect children online."


Weiss also said Meta's proposed regulation shifts too much of the burden onto parents, who are already overwhelmed trying to keep up with the constantly changing digital media their kids use.

Correction: November 17, 2023 — An earlier version of this story misstated the last name of Danny Weiss of Common Sense Media.
