Eyes on the Ears: Scientists Can Now Know Where You're Looking Just by Listening to Your Ears

Researchers have discovered that faint sounds emanating from the ears can reveal where a person is looking.

In a fascinating exploration of the interplay between vision and hearing, scientists from Duke University have uncovered a remarkable link between the movement of eyes and sounds produced by the ears.

The researchers found that these subtle ear sounds, too faint to be heard without sensitive microphones, can be used to determine where a person is looking.


Ear and Eye Connection

Led by senior author Jennifer Groh, PhD, a professor in the departments of psychology & neuroscience and neurobiology at Duke University, the study builds upon the team's earlier finding in 2018 that ears produce inaudible noises when the eyes move.

The latest research demonstrates that these sounds can reveal the direction of a person's gaze. Remarkably, this relationship works both ways. By knowing where someone is looking, the researchers were able to predict the waveform of the subtle ear sounds.

While the exact purpose of these ear sounds remains uncertain, Groh speculated that they may contribute to sharpening people's perceptions.

"We think this is part of a system for allowing the brain to match up where sights and sounds are located, even though our eyes can move when our head and ears do not," Groh said in a statement.

"If each part of the ear contributes individual rules for the eardrum signal, then they could be used as a type of clinical tool to assess which part of the anatomy in the ear is malfunctioning," added Stephanie Lovich, one of the study's lead authors and a graduate student at Duke.

The study involved 16 adults with unimpaired vision and hearing who completed a simple eye test: they tracked a green dot on a computer screen with their eyes as it moved to different positions.

The researchers recorded the faint sounds the ears produced during these eye movements using earbuds with embedded microphones.
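
The article does not describe the acquisition setup in detail, but a recording like this could, in principle, be captured with ordinary audio tooling. The sketch below is a hypothetical, minimal Python example using the sounddevice library to grab a short two-channel clip from earbud microphones; it is not the researchers' actual recording pipeline, and the device setup and timing window are assumptions.

```python
# Hypothetical sketch: capture a short two-channel (left/right ear) recording
# around the moment the target dot moves. Assumes microphone-embedded earbuds
# appear to the OS as a stereo input device; this is not the study's code.
import sounddevice as sd

SAMPLE_RATE = 48_000   # Hz, assumed sampling rate
WINDOW_S = 0.2         # record 200 ms around each dot movement (assumption)

def record_ear_sounds():
    """Record a short stereo clip from the earbud microphones."""
    clip = sd.rec(int(WINDOW_S * SAMPLE_RATE),
                  samplerate=SAMPLE_RATE,
                  channels=2)
    sd.wait()          # block until the recording finishes
    return clip        # shape: (samples, 2), left/right ear channels

if __name__ == "__main__":
    audio = record_ear_sounds()
    print("captured", audio.shape[0], "samples per ear")
```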

Unique Signatures

Analysis of the ear sounds revealed unique signatures for each direction of movement, allowing the researchers to decode the soundwaves and accurately calculate where the participants were looking.
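
The study's analysis pipeline is not reproduced here, but the basic idea of decoding gaze from ear-canal waveforms can be illustrated with a simple regression sketch. The example below is hypothetical: it generates fake two-channel recordings whose shape loosely depends on gaze angle, then uses scikit-learn's ridge regression with cross-validation to predict gaze from the waveforms. It stands in for, and does not replicate, whatever method the researchers actually used.

```python
# Hypothetical sketch: decode horizontal gaze angle from ear-canal recordings.
# `recordings` has shape (n_trials, n_samples, 2), holding two-channel
# (left/right ear) microphone waveforms; `gaze_deg` is the known horizontal
# target angle for each trial. All data here is simulated.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_trials, n_samples = 200, 1000            # e.g. 100 ms at 10 kHz per trial
gaze_deg = rng.uniform(-18, 18, n_trials)  # dot positions, as in a gaze task

# Fake recordings whose amplitude weakly tracks gaze, plus noise, so the
# example runs end to end without real data.
t = np.linspace(0, 1, n_samples)
carrier = np.sin(2 * np.pi * 30 * t)
recordings = (gaze_deg[:, None, None] * carrier[None, :, None]
              + rng.normal(0, 20, (n_trials, n_samples, 2)))

# Flatten each trial's left/right waveforms into one feature vector.
X = recordings.reshape(n_trials, -1)

# Cross-validated prediction of gaze angle from the waveforms.
model = Ridge(alpha=1.0)
predicted = cross_val_predict(model, X, gaze_deg, cv=5)
r = np.corrcoef(predicted, gaze_deg)[0, 1]
print(f"correlation between predicted and true gaze angle: {r:.2f}")
```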

This discovery opens up intriguing possibilities, including the clinical hearing tests Lovich described, which would use the eardrum signal to assess which part of the ear's anatomy is malfunctioning.

Groh is now exploring whether these ear sounds play a role in perception, particularly in individuals with hearing or vision loss. The team is investigating whether people without sensory impairments who generate consistent ear signals may perform better on tasks that require mapping auditory information onto a visual scene.

The study offers a unique insight into the intricate connections between the senses and paves the way for further investigations into the role of these ear sounds in human perception. The findings were published in the journal Proceedings of the National Academy of Sciences.

Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.