'Instagram Protects Children': App Bans Minors, Plans New Version Just for Kids

(Photos: Bruce Mars and Solen Feyissa on Unsplash)

Popular photo-sharing app Instagram reveals plans to build a version catering to children under the age of 13, Business Insider reports.

No More Minors on Instagram

On Thursday, March 18, Instagram's Vice President of Product Vishal Shah wrote on an employee board to announce the news.

"I'm excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list," he wrote.

Shah highlighted that the project's goal is to build a new youth pillar within the Community Product Group.

The group's focus will be twofold: accelerating the app's integrity and privacy work to ensure the safest possible experience for teens, and building a version of Instagram that lets children under 13 use the app safely for the first time.

Instagram Head Adam Mosseri confirmed the news but admitted that the company does not yet have a detailed plan.

Mosseri noted that the company is exploring a version of Instagram for children in which parents have transparency or control.

He also acknowledged rising demand from children to use apps like Instagram but noted that verifying users' ages remains a challenge.

Under Instagram's Terms of Use, children under 13 are not allowed to use the social networking service.

Protecting the Youth

The internal announcement comes two days after the Facebook-owned app said it needs to do more to protect its youngest users.

In a blog post published on March 16, the company highlighted the importance of protecting young people.

While the post did not mention plans for a kid-friendly version of the app, it promised updates on new features and resources aimed at keeping the youngest community members safe.

Guarding Children's Online Safety

The announcement also lays the groundwork for Facebook to expand its user base; Instagram sees children under 13 as a viable growth segment.

However, targeting children under 13 with ads has long raised privacy concerns and legal issues.

A notable example came in 2019, when the Federal Trade Commission fined Google $170 million after the search giant tracked children's viewing histories to serve them ads on YouTube.

The Children's Online Privacy Protection Act (COPPA) forbids collecting personal information from viewers of child-directed channels without their parents' knowledge and consent.

In 2017, Facebook launched Messenger Kids, an ad-free version of its Messenger chat platform aimed at children between the ages of 6 and 12.

Children's health advocates criticized the move, arguing that the app was harmful to children and urging Facebook CEO Mark Zuckerberg to shut it down.

Eventually, a bug was discovered in Messenger Kids that allowed children to join group chats with strangers, leaving thousands of children in conversations with unauthorized users.

While Facebook claimed that the bug affected "a small number" of users, the social media giant quietly closed down those group chats and alerted its users.

This article is owned by Tech Times

Written by Lee Mercado

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.