Researchers at the National University of Singapore (NUS) have used Intel's neuromorphic computing technology to build robots with a sense of touch and vision for use in the healthcare, logistics, and food manufacturing industries.

Two researchers from the National University of Singapore, members of the Intel Neuromorphic Research Community, presented new findings demonstrating the combination of vision and touch sensing with neuromorphic processing in robotics.

The human sense of touch is sensitive enough to feel the difference between surfaces that differ by just one layer of molecules, yet most of today's robots operate on visual processing alone.

Enabling a human-like sense of touch in robotics could significantly improve current functionality and even lead to new use cases.

The team used a robotic hand covered with artificial skin to read Braille, passing the tactile data through the cloud to Intel's Loihi neuromorphic chip, which translated the micro-bumps felt by the hand into their semantic meaning.
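The pipeline above can be sketched at a high level: pressure readings from the skin are converted into discrete spike events, and those events are matched against known Braille dot patterns. This is a minimal, hypothetical illustration; the function names, the threshold encoding, and the nearest-template classifier are assumptions standing in for the actual spiking neural network running on Loihi.

```python
# Hedged sketch: event-based tactile encoding and a toy Braille classifier.
# The layout mapping and classifier are illustrative assumptions, not the
# actual NUS/Loihi pipeline, which uses spiking neural networks.
import numpy as np

# Braille cells are 3x2 dot grids; map a few letters to their dot patterns.
BRAILLE = {
    "A": [[1, 0], [0, 0], [0, 0]],
    "B": [[1, 0], [1, 0], [0, 0]],
    "C": [[1, 1], [0, 0], [0, 0]],
}

def encode_events(pressure, threshold=0.5):
    """Event-based encoding: emit a spike (1) only where pressure exceeds
    the threshold, mimicking an asynchronous tactile sensor."""
    return (np.asarray(pressure, dtype=float) > threshold).astype(int)

def classify(events):
    """Nearest-template classifier over spike patterns (a stand-in for
    the spiking network that would run on the neuromorphic chip)."""
    best, best_dist = None, float("inf")
    for letter, pattern in BRAILLE.items():
        dist = int(np.sum(np.abs(events - np.asarray(pattern))))
        if dist < best_dist:
            best, best_dist = letter, dist
    return best

# Noisy pressure readings for the letter "B" (two dots, left column).
reading = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.0]]
spikes = encode_events(reading)
print(classify(spikes))  # -> B
```

The key design point mirrored here is the event-based representation: only threshold crossings are transmitted, which is what lets neuromorphic hardware process touch with very low power.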

In classifying the Braille letters, Loihi achieved over 92% accuracy while using 20 times less power than a standard processor.

The chip can draw accurate conclusions in real time from the skin's sensory data while operating at a highly power-efficient level.

The robot is thus equipped with an extremely fast artificial skin sensor.

Building on this work, the NUS team tasked a robot with classifying opaque containers holding different amounts of liquid, using sensory inputs from the artificial skin and an event-based camera.

This year, researchers from Intel Labs and Cornell University published a paper in Nature Machine Intelligence demonstrating Loihi's ability to learn and recognize hazardous chemicals in the presence of significant noise and occlusion.

Read the original article "Singapore scientists invent a robotic system with skin and vision sensors that are driven by AI" at