Warning! AI Systems Claiming to 'Read' Emotions Are Dangerous


Experts warn that various tech companies are using artificial intelligence (AI) to 'read' your facial expressions. According to research, the technology risks being unreliable and discriminatory.

Lisa Feldman Barrett, a professor of psychology at Northeastern University, said that such technologies run counter to a growing body of evidence undermining the notion that facial expressions are universal across cultures.

Barrett told The Guardian that companies continue to justify what they are doing even though it is abundantly clear what the evidence shows. "There are some companies that just continue to claim things that can't possibly be true," she said.

Companies develop technology to read facial muscle movements

Her warning comes as some companies have begun developing technology to recognize facial muscle movements and assign emotion or intent to those movements.

HireVue's AI system scans candidates' facial expressions, body language, and word choice, and cross-references them with traits considered to be correlated with job success.

Amazon claims its own facial recognition system, Rekognition, can detect seven basic emotions: happiness, sadness, anger, surprise, disgust, calmness, and confusion. In October, Unilever claimed that it had saved 100,000 hours of human recruitment time last year by deploying such software to analyze video interviews.
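For context, Rekognition's face-analysis API returns, for each detected face, a list of emotion labels with confidence scores, and client code typically just picks the top-scoring label. The sketch below illustrates that step against a fabricated response of that shape; it does not call AWS, and the sample scores are invented for illustration only.

```python
# Illustrative only: selects the highest-confidence emotion label from a
# response shaped like Rekognition's DetectFaces output. A real call would
# use boto3, e.g. boto3.client("rekognition").detect_faces(Image=...,
# Attributes=["ALL"]); the sample_response below is fabricated.

def top_emotion(face_detail):
    """Return the (label, confidence) pair with the highest confidence."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "HAPPY", "Confidence": 88.2},
                {"Type": "CALM", "Confidence": 9.1},
                {"Type": "CONFUSED", "Confidence": 2.7},
            ]
        }
    ]
}

label, score = top_emotion(sample_response["FaceDetails"][0])
print(label, score)  # HAPPY 88.2
```

Note that even a high-confidence label here is only a claim about the configuration of facial muscles in one image; as the researchers quoted in this article argue, it is not evidence of what the person actually feels.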

The EU is reported to be trialing software that purportedly can detect deception by evaluating micro-expressions, in an attempt to bolster border security.

"Based on the published scientific evidence, these technologies shouldn't be rolled out and used to make consequential decisions about people's lives," said Barrett.

Professor Aleix Martinez of Ohio State University said computer algorithms try to understand social cues. Martinez said deciphering a person's intent goes beyond their facial expression, and it is important that people, and the computer algorithms they create, recognize this.

Professor Martinez said: "The challenge is how this might be used in cities like London, which has a large number of security cameras. If in the future this is used to single out individuals based on how they behave, that would be very dangerous."


Facial expressions don't fully reflect innermost feelings, research suggests

It may seem obvious that AI could work out someone's mood based on whether they are smiling or frowning. But facial signals are, in fact, poor indicators of our inner feelings, a study has found.

Martinez underscored that one cannot reliably tell a person's feelings from their facial expressions.

Some technologies claim to be able to detect whether someone is guilty of a crime, whether a person is paying attention, or whether a customer is satisfied after a purchase.

The results of the research, according to Martinez, showed that those claims are "complete baloney." "There's no way you can determine those things, and such claims can be dangerous," Martinez added.

The danger, Martinez said, lies in the possibility of misreading another person's real emotion or intent and making decisions that affect that person's future or abilities.

It is also true, Martinez said, that people sometimes smile simply out of an obligation to social norms. People are, after all, entitled to put on a smile for the rest of the world, he said.

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.