
Meta Calls for New Legislation That Would Force App Stores To Implement Age Restrictions

One of the key challenges for social apps is protecting younger users, and ensuring that teens, or even younger kids, aren’t exposed to harmful material as they work their way around the various safeguards.

Because kids want to see that controversial content. They want to keep up with the musicians and comedians they like, whose material, of course, sometimes includes adult references.

That’s difficult to police, but Meta thinks it may have a new solution to help address it: make the app stores do it.

As outlined by Meta’s Global Head of Safety Antigone Davis, Meta has proposed that the app stores themselves take on a bigger role in keeping young kids out of adult-focused apps, or at the least, in ensuring that parents are aware of those apps before their kids download them.

Which Meta says would address several key problems.

As per Davis:

“US states are passing a patchwork of different laws, many of which require teens (of varying ages) to get their parent’s approval to use certain apps, and for everyone to verify their age to access them. Teens move interchangeably between many websites and apps, and social media laws that hold different platforms to different standards in different states will mean teens are inconsistently protected.”

A better solution, according to Davis, is to get the app stores themselves to implement tighter controls and processes to stop teens from downloading apps without a parent’s approval.

“We support federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps. With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase. Parents can decide if they want to approve the download.”

Which is a good suggestion.

Right now, as Davis notes, it’s the apps themselves that are held accountable for policing user ages, and detecting when teens try to cheat the system. But the app stores are the broader gatekeepers, which would mean that any solution that applies to them would have far wider-reaching impacts.

Davis suggests that app stores should implement their own age verification elements to gate certain apps, which would then negate the need for each individual platform to verify user ages.

And if an app does include adult elements, it would require parental approval.

“This way parents can oversee and approve their teen’s online activity in one place. They can ensure their teens are not accessing adult content or apps, or apps they just don’t want their teens to use. And where apps like ours offer age-appropriate features and settings, parents can help ensure their teens use them.”

Though as Davis notes, many apps offer some level of adult content alongside age-appropriate experiences. Facebook itself, for example, would likely be age-gated, even though it also has its own internal settings to protect younger users. In such cases, Davis says, age-gating these apps with variable offerings would mean that parents are aware of the apps their kids are using, and would be able to help them set up kid-safe measures.

The apps themselves could also then provide pointers to guide parents on this, and ensure that more kids are using apps in a safe way.

It’s not a foolproof plan; kids will always find ways to side-step protective measures, and they’re always going to try to get to the more controversial content.

But maybe this could add another layer of protection, at a far more broad-reaching level, and improve safety overall.

Davis says that an “industry-wide solution, where all apps are held to the same, consistent standard” is the best way to address this element, which she and Meta will be presenting to lawmakers as part of a new legislative push.

And if this proposal is adopted, the same approach could also apply to other elements, including content moderation and identity verification.

Right now, the app stores do have their own content management requirements that apps must meet to maintain their listings. But with more advanced measures, this could be a path towards more comprehensive, industry-wide approaches to similar challenges.

The app stores are already the gatekeepers in many respects. Maybe it’s time they used that power for a broader purpose.

It’s an interesting proposal, which could lead to a new shift in social platform policy.


