Computer scientists at the University of Maryland have unveiled a new camera inspired by the mechanics of the human eye. It is designed to enhance how robots perceive and respond to their surroundings. 

New Camera Inspired by the Human Eye

Event cameras are a relatively recent innovation. They excel at tracking moving objects but still struggle to capture clear, blur-free images in dynamic environments.

This limitation poses challenges for applications like robotics and autonomous vehicles that rely on precise visual data to navigate and interact effectively in real-time scenarios, according to study lead author Botao He, a PhD student in computer science at UMD.

He explained that their camera, called the Artificial Microsaccade-Enhanced Event Camera (AMI-EV), was inspired by observing how the human eye maintains focus through microsaccades: small, involuntary eye movements that stabilize vision on a target despite motion.

These microsaccades enable humans to perceive details like color, depth, and shadows. The team integrated a rotating prism into the AMI-EV to mimic this biological mechanism.

This prism redirects light captured by the camera lens, simulating the continuous movement observed in human eyes during microsaccades. Software algorithms were then developed to compensate for the prism's motion, ensuring stable image capture even amidst dynamic scenes.
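To illustrate the idea of software compensation, the sketch below shows one simple way such a correction could work; it is not the authors' published algorithm, and the rotation rate and shift radius are hypothetical values chosen for the example. A rotating prism sweeps the image in a small circle, so each event's pixel coordinates can be mapped back to scene coordinates by subtracting the prism-induced shift at that event's timestamp:

```python
import math

# Hypothetical parameters (not from the paper): the prism's rotation rate
# and the radius of the circular image shift it induces, in pixels.
ROTATION_HZ = 100.0
SHIFT_RADIUS_PX = 5.0

def prism_offset(t):
    """Image shift induced by the rotating prism at time t (seconds)."""
    phase = 2.0 * math.pi * ROTATION_HZ * t
    return SHIFT_RADIUS_PX * math.cos(phase), SHIFT_RADIUS_PX * math.sin(phase)

def compensate(event):
    """Map a raw event (x, y, t) back to scene coordinates by removing
    the known prism-induced shift at the event's timestamp."""
    x, y, t = event
    dx, dy = prism_offset(t)
    return x - dx, y - dy, t

# A stationary scene point seen through the rotating prism traces a circle
# in raw sensor coordinates; compensation recovers a fixed position.
raw_events = [(320 + prism_offset(t)[0], 240 + prism_offset(t)[1], t)
              for t in (0.000, 0.001, 0.002, 0.003)]
stabilized = [compensate(e) for e in raw_events]
for x, y, _ in stabilized:
    print(round(x, 6), round(y, 6))  # every event maps back to (320, 240)
```

Because the prism's motion is fully known in advance, the correction is deterministic; the camera keeps the event sensor's speed while the software removes the artificial motion, which is the core of the stabilization described above.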


Features of AMI-EV Camera

Professor Yiannis Aloimonos, co-author of the study and director of UMIACS' Computer Vision Laboratory, underscored the significance of this innovation in advancing robotic perception.

He likened the camera's role in robots to that of human eyes and emphasized that better cameras translate directly into improved perception and decision-making in robotics. Beyond robotics, the researchers foresee broad applications across industries that rely on high-quality image capture and analysis.

According to the study, AMI-EV's unique features, including superior performance in challenging lighting conditions, low latency, and minimal power consumption, make it suitable for applications ranging from virtual reality to security monitoring and space exploration.

Research scientist Cornelia Fermüller, senior author of the paper, highlighted the potential of event sensors such as the AMI-EV in smart wearable technologies, where rapid computation of movement is crucial for a seamless user experience.

In initial testing, the team reported that AMI-EV accurately captures rapid movements, such as detecting human pulses and swiftly identifying moving shapes.

It captured tens of thousands of frames per second, far surpassing the 30 to 1,000 frames per second typical of conventional commercial cameras.


Broad Applications

The team added that this advancement's implications extend to fields like augmented reality, where realistic motion depiction enhances user immersion, and astronomy, where precise image capture in dynamic environments is critical.

Aloimonos emphasized that AMI-EV's potential applications are vast, from enhancing the capabilities of autonomous vehicles to improving smartphone camera performance. The team believes their innovation could aid in developing advanced systems capable of addressing complex challenges in various technological domains. 

The findings of the team were published in the journal Science Robotics.



ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.