Google Escalates Fight Against Terrorism: Here Are The 4 Things That It Will Do

Google has escalated its fight against terrorism, pledging four new measures to curb the spread of extremist content online.

The Alphabet unit's plans were detailed in a blog post written by Kent Walker, general counsel for Google, as the company works with governments, law enforcement authorities, and civil society groups to eliminate terrorist content on its services.

Google Ramps Up Fight Against Online Terrorism

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now," Walker wrote in his post, which also appeared as an editorial in the Financial Times.

Google has already implemented several measures to remove extremist content online. These include hiring thousands of employees to review and maintain control over its platforms, developing technology that uses image-matching tools to prevent terrorist content from being re-uploaded, investing in systems using content-based signals to identify videos that need to be taken down, and forging partnerships with other groups, agencies, and companies to support these initiatives.
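Google has not published technical details of these image-matching tools, but the general idea of blocking re-uploads by fingerprinting previously removed content can be illustrated with a minimal, hypothetical sketch. The hashing scheme and function names below are illustrative only and are not Google's actual system; production systems rely on perceptual hashes that survive re-encoding, not the exact-match hash used here.

```python
import hashlib

# Hypothetical registry of fingerprints for content already taken down.
known_bad_fingerprints = set()

def fingerprint(video_bytes: bytes) -> str:
    """Return a fingerprint for a video file.

    Real systems use perceptual hashing that tolerates re-encoding and
    cropping; a plain SHA-256 only catches byte-identical re-uploads.
    """
    return hashlib.sha256(video_bytes).hexdigest()

def register_removed_content(video_bytes: bytes) -> None:
    """Record the fingerprint of content removed for policy violations."""
    known_bad_fingerprints.add(fingerprint(video_bytes))

def should_block_upload(video_bytes: bytes) -> bool:
    """Block an upload if it matches previously removed content."""
    return fingerprint(video_bytes) in known_bad_fingerprints

# Example: an identical copy of a removed clip is rejected on re-upload.
clip = b"...binary video data..."
register_removed_content(clip)
assert should_block_upload(clip)
```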

In Walker's post, Google announced four new steps that the company will take to fight against terrorism and the spread of extremist content online.

First, Google is devoting more resources to applying its machine learning technology to identifying and removing terrorist content online. The challenge is distinguishing between videos of attacks uploaded by legitimate news sources and those uploaded by terrorists to incite violence, but Google believes its video analysis models can get the job done.
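Google does not disclose how its video analysis models work. As a rough illustration only, this kind of classification can be framed as supervised learning over signals associated with a video, with borderline cases escalated to human reviewers; the toy features, labels, and library choice below are assumptions made for the sketch, not Google's approach.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: video titles/descriptions labeled by human reviewers.
# 1 = violates policy, 0 = legitimate (e.g. news reporting on an attack).
texts = [
    "graphic propaganda calling for attacks",
    "recruitment video glorifying violence",
    "news report: authorities respond to attack",
    "documentary on the history of the conflict",
]
labels = [1, 1, 0, 0]

# A simple text pipeline: TF-IDF features fed into logistic regression.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Probability scores for new uploads; uncertain cases would go to humans.
print(model.predict_proba(["breaking news coverage of the attack"]))
```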

Second, Google will invite more independent experts into YouTube's Trusted Flagger program. These experts will complement the automated systems, as humans still play a significant role in deciding whether a video constitutes extremist content. Google will add 50 expert NGOs to the 63 organizations already in the program and provide them with operational grants for their contributions.

Third, Google will take a stricter stance on videos that do not explicitly violate its policies, such as those featuring inflammatory religious or supremacist content. While such videos will not be taken down, they will appear behind a warning and will not be monetized or open to user endorsements and comments. The goal is to make these videos harder to find while still respecting the right to free expression.

Fourth, YouTube will expand its counter-radicalization efforts, using targeted advertising to redirect potential Islamic State, or ISIS, recruits toward anti-terrorism content in the hope of changing their minds about taking up arms.

Google Teams Up With Tech Companies Against Terrorism

Google also committed to teaming up with its peers in the tech industry, including Microsoft, Facebook, and Twitter, to create an international forum where companies can jointly develop and share anti-terrorism technology and accelerate the industry's response to terrorist content.

"Together, we can build lasting solutions that address the threats to our security and our freedoms," Walker said to end his post, adding that Google is committed to playing its part in the fight.

Google and other social media companies have faced increasing pressure to ramp up efforts to remove extremist content online after a rash of recent terrorist attacks. Hopefully, the four steps that the company outlined will help kickstart a worldwide initiative to solve the issue.
