YouTube is reportedly finalizing plans to stop running targeted advertisements on videos that underage viewers are likely to watch.
The move comes a month after the Federal Trade Commission reached a settlement with YouTube's parent company Google, following an investigation that found the popular video-sharing site had violated the Children's Online Privacy Protection Act of 1998.
YouTube, FTC Clash Over Children's Data Collection
It is still unclear whether the new policy on targeted ads, which has yet to be officially announced, is a result of the settlement. In July, The Washington Post reported that the FTC had reached a settlement with Google over the improper collection of children's data on YouTube. However, the exact terms of the settlement, including a fine said to run to several million dollars, were not disclosed to the public.
Targeted ads, also called behavioral ads, rely on collecting a massive trove of user data, a practice at the center of Google's, and therefore YouTube's, business. However, the Children's Online Privacy Protection Act prohibits websites and online services from collecting personal data from users under the age of 13 without parental consent.
YouTube is not meant to be used by children under 13. The company maintains a separate platform, YouTube Kids, which does not run targeted ads. However, the main site still hosts content, such as nursery rhymes, that is clearly aimed at young viewers.
A recent study by the Pew Research Center also found that YouTube videos featuring children received up to three times as many views as other videos.
Bloomberg, which first reported the plans, likewise said it is unclear whether the proposed change stems from Google's settlement with the FTC, and noted that nothing has been finalized.
If implemented, the end of targeted ads on videos aimed at children could put a serious dent in YouTube's ad sales.
Neither YouTube nor the FTC agreed to comment.
YouTube Criticized For Disturbing Content
YouTube has also previously been criticized for surfacing disturbing content alongside videos meant for young viewers. In response, the site hired 10,000 moderators to police content and pulled ad revenue from inappropriate videos. It also added stronger parental controls and disabled comments on videos featuring minors.