Researchers from the National University of Singapore (NUS) have announced that they have officially begun work on giving robots a sense of touch through "artificial skin," Digital Trends reported.
Two of the researchers, who are also members of Intel's Neuromorphic Research Community (INRC), presented work demonstrating the promise of robots that can feel through artificial skin. Specifically, the system pairs event-based touch sensing with neuromorphic processing from Intel.
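To make "event-based" concrete, here is an illustrative sketch rather than the team's actual implementation: each sensing element in such a skin, often called a taxel, reports only when its pressure reading changes meaningfully, instead of being polled at a fixed frame rate, which is where the latency and data savings come from. All names and thresholds below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TactileEvent:
    taxel_id: int      # which sensing element fired
    timestamp: float   # seconds
    polarity: int      # +1 pressure rose, -1 pressure fell

def to_events(taxel_id, samples, threshold=0.05):
    """Convert raw pressure samples into sparse change events.

    `samples` is a list of (timestamp, pressure) pairs; an event is
    emitted only when pressure drifts more than `threshold` from the
    level at the last event, so idle taxels produce no data at all.
    """
    events = []
    _, last_level = samples[0]
    for t, p in samples[1:]:
        if abs(p - last_level) >= threshold:
            events.append(TactileEvent(taxel_id, t, 1 if p > last_level else -1))
            last_level = p
    return events

# A brief press and release on one taxel yields just two events,
# versus six frames under fixed-rate polling of the same readings.
readings = [(0.00, 0.0), (0.01, 0.0), (0.02, 0.4),
            (0.03, 0.4), (0.04, 0.0), (0.05, 0.0)]
print(to_events(taxel_id=7, samples=readings))
```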
This is a step beyond what current robots can do, as most are limited to visual processing and lack anything resembling the human sense of touch.
Detecting touches
The Singaporean researchers are refining the artificial skin, which the university says can detect touches more than 1,000 times faster than the human sensory nervous system. The skin, according to the university, can also identify an object's hardness, texture, shape, and more.
It could be "10 times faster than the blink of an eye," the NUS research added.
Mike Davies, director of Intel's Neuromorphic Computing Lab, said the research offers a glimpse of the future of robotics, where information is both sensed and processed in an event-driven manner.
Davies noted, "The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture."
For the researchers in Singapore, giving robots a sense of touch would significantly expand what they can currently do. In commercial settings, the researchers noted, touch-enabled robots could adapt more easily to manufacturing tasks in factories.
NUS added, "The ability to feel and better perceive surroundings could also allow for closer and safer human-robotic interaction, such as in caregiving professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch that they lack today."
Chip inside the robot
Intel, the researchers' partner, is also supplying chips that can be deployed inside the robots, allowing them to draw more precise conclusions while monitoring and detecting in real time.
Using the neuromorphic chip, the researchers tested a robotic hand covered in the artificial skin to see whether it could read Braille. The tactile data was passed to the chip through the cloud, which converted the micro-bumps felt by the hand into their "semantic meaning," that is, the letters being read.
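As a rough illustration of what recovering "semantic meaning" involves here: the actual system runs a spiking neural network on Intel's neuromorphic hardware, but the final step amounts to mapping which of the six Braille dot positions registered contact to a letter. The dot encoding below is standard Braille; the taxel-to-dot mapping and threshold are simplifying assumptions for the sketch.

```python
# Dot positions 1-6 of a Braille cell mapped to the first few letters.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def decode_cell(pressures, threshold=0.2):
    """pressures: dict of dot position (1-6) -> peak pressure felt there."""
    active = frozenset(pos for pos, p in pressures.items() if p >= threshold)
    return BRAILLE.get(active, "?")

# Dots 1 and 5 pressed firmly -> the letter "e".
print(decode_cell({1: 0.9, 2: 0.05, 3: 0.0, 4: 0.1, 5: 0.8, 6: 0.0}))
```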
"Using the same tactile and vision sensors, they also tested the ability of the perception system to identify rotational slip, which is important for stable grasping," the National University of Singapore explained.