India has ordered X, YouTube, and Telegram to make sure there are no child sexual abuse materials on their platforms, the government said on Friday.
Reuters reported that the country's Ministry of Electronics and Information Technology (MEITY) has sent notices to the three companies informing them that they must immediately remove child sexual abuse material from their platforms.
In a press statement, the government noted that X, YouTube, and Telegram could be stripped of their protection from legal liability if they do not comply. The MEITY said any child sexual abuse material on these platforms must be immediately and permanently removed.
The agency also asked X, YouTube, and Telegram to moderate content and to put reporting mechanisms in place to prevent the spread of child sexual abuse material in the future.
Rajeev Chandrasekhar, the junior minister for information technology, said in the statement that if the tech companies do not act swiftly, "their safe harbour under section 79 of the IT Act would be withdrawn and consequences under the Indian law will follow."
India's Present Regulations on X, YouTube, and Telegram Content
Section 79 of the Information Technology Act of 2000 (IT Act) exempts intermediaries such as social media platforms from legal liability in certain circumstances. It specifies that X, YouTube, Telegram, and others are not liable for user-generated content posted on their platforms, provided certain conditions are met.
"The Act extends 'safe harbor protection' only to those instances where the intermediary merely acts a facilitator and does not play any part in creation or modification of the data or information," the law noted.
With the recent order, the Indian government said it is determined to build a safer and "trusted internet" under Prime Minister Narendra Modi. X, YouTube, and Telegram have yet to respond to the order.
According to Reuters, India also told Netflix, Disney, and other streaming platforms last July to review their content before it is shown online. While a government-appointed board reviews and certifies all films shown in Indian cinemas, streamed content is not subject to such scrutiny.
Representatives from the streaming platforms reportedly objected to a proposal that their content be independently reviewed for obscenity and violence before being released online.
Despite the objections, government officials asked the industry to consider setting up an independent panel to review its content.
The proposal came as streaming giants protested a government order to insert 50-second tobacco health warnings into each piece of content. It also came two years after the government ordered streaming services to set up self-regulatory bodies to handle complaints about their content.
Content moderation and the proposed warnings have been welcomed by various activists; in 2021, the Truth Initiative claimed that streaming shows exposed around 25 million adolescents to smoking.
Global Call for Content Moderation
Hindustan Times reported that last April, India was among the top countries requesting that X remove content related to "abuse or harassment, child sexual exploitation, hacked materials, hateful conduct..." and other illegal or sensitive content.
France, Japan, and Germany were among the other top requesters alongside India. India's government and the public have previously scrutinized content from streaming platforms that they deemed too "vulgar or offensive to religious sentiments."
Different national and international bodies, including the EU and UN, have long proposed content moderation on social media platforms.