
NUS develops e-skin system in neuromorphic computing breakthrough

The National University of Singapore has developed an electronic skin system that could give robots and prosthetic devices a sense of touch

A team of researchers at the National University of Singapore (NUS) has developed an electronic skin system that could give robots and prosthetic devices a sense of touch in the future.

Dubbed Asynchronous Coded Electronic Skin (Aces), the “e-skin” system comprises an artificial nervous system that analyses data transmitted from a network of independent sensors using the Intel Loihi neuromorphic chip.

The system mimics the human sensory nervous system, and is said to detect touches more than 1,000 times faster than its biological counterpart.

According to NUS, Aces can differentiate physical contact between different sensors in less than 60 nanoseconds – the fastest ever achieved for an electronic skin technology – even with large numbers of sensors.

It can also accurately identify the shape, texture and hardness of objects within 10 milliseconds, 10 times faster than the blinking of an eye.

And when fitted on robots, the e-skin could enable machines to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.

The ability to feel and better perceive surroundings could also facilitate closer and safer human-robot interactions in caregiving, or help automate surgical tasks by giving surgical robots the sense of touch that they lack today.


While the creation of artificial skin brings this vision one step closer to reality, it also requires a chip that can draw accurate conclusions from the skin’s sensory data in real time, while operating at a power level efficient enough to be deployed inside the robot.

“Making an ultrafast artificial skin sensor solves about half the puzzle of making robots smarter,” said Benjamin Tee, an assistant professor from the NUS Department of Materials Science and Engineering and NUS Institute for Health Innovation and Technology.

“They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle. Our unique demonstration of an AI [artificial intelligence] skin system with neuromorphic chips such as the Intel Loihi provides a major step towards power efficiency and scalability,” he added.

The NUS researchers went one step further by tasking a robot with classifying various opaque containers holding different amounts of liquid using sensory inputs from the e-skin and a camera, effectively combining the use of two senses: sight and touch.


They also tested the robot’s ability to identify rotational slip, which is important for stable grasping. The results showed that combining event-based vision and touch using a spiking neural network enabled 10% greater accuracy in object classification compared with a vision-only system.
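The article does not disclose the network the NUS team used, but the idea of event-based sensor fusion with spiking neurons can be illustrated in a few lines. The sketch below is purely hypothetical: a single leaky integrate-and-fire (LIF) neuron receives timestamped events from a notional touch stream and vision stream; near-coincident events from both modalities reinforce each other and push the membrane potential past its firing threshold, while isolated events leak away. All names, weights and time constants are invented for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class LIFNeuron:
    """A minimal leaky integrate-and-fire neuron (illustrative values)."""
    tau: float = 0.02        # membrane time constant in seconds (assumed)
    threshold: float = 1.0   # firing threshold (assumed)
    v: float = 0.0           # membrane potential
    last_t: float = 0.0      # timestamp of the previous event

    def receive(self, t: float, weight: float) -> bool:
        # Apply exponential leak for the time elapsed since the last
        # event, then integrate the new event's weight.
        self.v *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        self.v += weight
        if self.v >= self.threshold:
            self.v = 0.0     # reset after emitting a spike
            return True
        return False

def fuse(touch_events, vision_events, w_touch=0.6, w_vision=0.5):
    """Merge two streams of event timestamps and count output spikes."""
    neuron = LIFNeuron()
    merged = sorted([(t, w_touch) for t in touch_events] +
                    [(t, w_vision) for t in vision_events])
    return sum(neuron.receive(t, w) for t, w in merged)

# Touch and vision events arriving within a millisecond of each other
# drive the neuron over threshold; the lone event at t=0.5s does not.
spikes = fuse(touch_events=[0.010, 0.011, 0.500],
              vision_events=[0.0105, 0.012])
```

Because computation happens only when an event arrives, rather than on every clock tick, this event-driven style is what gives neuromorphic systems such as Loihi their latency and power advantages.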

Harold Soh, an assistant professor from NUS School of Computing, said the team’s research demonstrates the promise of neuromorphic systems in combining multiple sensors to improve robot perception.

“It’s a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations,” he added.

Mike Davies, director of Intel’s neuromorphic computing lab, said NUS’s work “adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms and hardware architecture”.
