How a 3D Mask Can Access Your Accounts Using Facial Recognition

(Photo: fotografierende on Unsplash)

Facial recognition is often described as 'safer' than other biometrics such as fingerprint scanning when it comes to security. A market research report, "Facial Recognition Market by Component," even projects that the facial recognition market will reach $7.0 billion by 2024 in the United States alone. Given that footprint, it is safe to say that a large number of Americans already use the technology. But is facial recognition really safe to use?

According to the findings of an artificial intelligence company based in San Diego, facial recognition still has significant security loopholes, and worse, users of the technology could become fraud victims if those loopholes are not closed in time.

Why Facial Recognition Is Unsafe, According to This Experiment

Kneron, a San Diego-based artificial intelligence company, recently ran an experiment to find out whether facial recognition can be tricked using technology itself. As reported by The Verge, Kneron commissioned 3D masks of a person's face and had a different person wear them at payment and transit terminals in China. The goal was to see whether the masks could fool facial recognition systems, and whether a fraudster could effectively use someone else's face to do something as simple as paying for a ride.

In the first video, a tester wearing a 3D mask of a different person approached AliPay and WeChat terminals, China's most popular online payment platforms. After the system scanned the masked face, the purchase went through and the payment was transferred instantly.

In another test at a train station, the tester put on the 3D mask and fed his ID card into the station turnstile. After the system scanned the 3D mask, it allowed the tester through to board the train.
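To make concrete why a mask can pass such a check, the sketch below shows a bare-bones verification step built on the open-source `face_recognition` Python library. This is not Kneron's, AliPay's, or WeChat's actual pipeline; it is a generic illustration with placeholder file names of a system that compares only 2D appearance embeddings, and therefore has nothing that distinguishes a live face from a convincing replica.

```python
# Minimal sketch (not any vendor's real pipeline): verification by comparing
# 2D appearance embeddings only. Nothing here checks that the face is a live
# person, so a sufficiently accurate 3D mask of the enrolled user can produce
# a matching embedding and pass. File names are placeholders.
import face_recognition

# "Enrollment": embedding computed from the legitimate user's reference photo.
enrolled_image = face_recognition.load_image_file("enrolled_user.jpg")
enrolled_encoding = face_recognition.face_encodings(enrolled_image)[0]

def verify(frame_path: str, tolerance: float = 0.6) -> bool:
    """Return True if the face in the captured frame matches the enrolled user.

    Appearance-only check: there is no liveness, depth, or texture test, so a
    photo, screen replay, or 3D mask that reproduces the enrolled face closely
    enough will also return True.
    """
    frame = face_recognition.load_image_file(frame_path)
    encodings = face_recognition.face_encodings(frame)
    if not encodings:
        return False  # no face detected in the captured frame
    match = face_recognition.compare_faces(
        [enrolled_encoding], encodings[0], tolerance=tolerance
    )
    return bool(match[0])

# A frame of the real user and a frame of a high-quality 3D mask of that user
# can both fall inside the distance tolerance and be "verified".
print(verify("capture_at_terminal.jpg"))
```

In this kind of setup, the decision rests entirely on how close two embeddings are, which is exactly the shortcut the Kneron experiment exploited.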

What Does the Facial Recognition Experiment Explain?

"This shows the threat to the privacy of users with sub-par facial recognition that is masquerading as "AI".The technology is available to fix these issues, but firms have not upgraded it. They are taking shortcuts at the expense of security," said Kneron's CEO Albert Liu.

Although Kneron's findings may sound alarming for facial recognition users around the world, the company also clarified that fraudsters cannot replicate the attack overnight: the 3D masks were produced by specialty mask makers in Japan, so pulling off the trick would require a significant amount of money. Even so, the experiment shows that facial recognition is not fully secure, and that with a simple, if expensive, 3D mask, a person's identity can be compromised through the very technology meant to protect it.
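As a rough illustration of the kind of fix Liu is alluding to, the sketch below adds a liveness gate in front of the appearance match. The depth thresholds and function names are assumptions invented for this example, not parameters of any deployed product; real anti-spoofing typically relies on trained models that combine depth, infrared, and skin-texture cues rather than hand-set rules.

```python
# Illustrative only: one class of fixes is to require a liveness signal
# (e.g., depth from a structured-light or time-of-flight sensor, or a
# blink/motion challenge) before the embedding match is even attempted.
# The thresholds below are made-up placeholders for this sketch.
import numpy as np

def plausible_live_depth(face_depth_mm: np.ndarray,
                         min_relief_mm: float = 15.0,
                         max_relief_mm: float = 80.0) -> bool:
    """Crude depth-based liveness gate over a cropped face depth map (in mm).

    A flat photo or screen replay has almost no depth relief, while a live
    face shows a few centimetres of nose-to-cheek variation. A well-made 3D
    mask can still defeat this simple rule, which is why stronger systems
    combine depth with skin-texture and reflectance analysis.
    """
    relief = float(np.percentile(face_depth_mm, 95) - np.percentile(face_depth_mm, 5))
    return min_relief_mm <= relief <= max_relief_mm

def verify_with_liveness(face_depth_mm: np.ndarray, embedding_match: bool) -> bool:
    # Accept the appearance match only if the liveness gate also passes.
    return plausible_live_depth(face_depth_mm) and embedding_match
```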

Facial Recognition and Facebook in 2016

Although the experiment tells us that this kind of fraud cannot be pulled off instantly, it is clearly possible, and researchers had already demonstrated the risk back in 2016. Much like Kneron, researchers from the University of North Carolina easily tricked facial recognition using 3D face replicas and gained access to devices and accounts, on Facebook and elsewhere, that relied on the technology. With that in mind, are you sure your face is still safe from fraud?

