Robots could soon feel pain as scientists develop artificial skin

It may sound a little unsettling and perhaps more akin to a dystopian sci-fi thriller.

But robots could soon feel pain thanks to the development of a new electronic skin which can mimic uncomfortable sensations.

The scientists behind the invention say a mechanical hand fitted with the smart skin showed a remarkable ability to learn to react to external stimuli such as a jab in the palm.

It uses a new type of processing system based on ‘synaptic transistors’, which mimic the brain’s neural pathways in order to learn to feel pain.

Experts have been working for decades to build artificial skin with touch sensitivity, with one widely explored method featuring an array of contact sensors across an electronic skin’s surface to allow it to detect when it comes into contact with an object.

Handy: Scientists believe robots could soon feel pain after developing an electronic skin which can mimic uncomfortable sensations (pictured)

The scientists behind the invention say a mechanical hand fitted with the smart skin showed a remarkable ability to learn to react to external stimuli, such as a sharp jab in the palm

But these sensors typically produce a large volume of data that takes time for a computer to process and respond to, causing delays that would reduce the skin’s effectiveness in real-world tasks. 

A team of engineers from the University of Glasgow have now come up with a new prototype ‘e-skin’ which they believe marks a significant advance in touch-sensitive robotics. 

They drew inspiration from how the human peripheral nervous system interprets signals from skin in order to eliminate the delays and power consumption of previous concepts. 

As soon as human skin receives an input, the peripheral nervous system begins processing it at the point of contact, reducing it to only the vital information before it is sent to the brain. 

That reduction of sensory data allows efficient use of communication channels needed to send the data to the brain, which then responds almost immediately for the body to react appropriately. 
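One way to picture that kind of reduction is an event-based filter that only passes on readings which differ meaningfully from the last one sent. The short Python sketch below is purely illustrative, not the team’s method, and the 5 per cent change threshold is an assumption chosen for the example:

```python
# Illustrative event-based filter: pass on a reading only when it changes enough.
# The 5% change threshold is an assumption for this example, not a value from the study.

def reduce_stream(readings, change_threshold=0.05):
    """Yield only readings that differ from the last forwarded one by more than the threshold."""
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > change_threshold:
            last_sent = value
            yield value

print(list(reduce_stream([0.10, 0.11, 0.12, 0.50, 0.51, 0.10])))
# -> [0.1, 0.5, 0.1]  (small fluctuations are dropped, only meaningful changes are forwarded)
```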

To build an electronic skin capable of a computationally efficient, synapse-like response, the researchers printed a grid of 168 synaptic transistors made from zinc-oxide nanowires directly onto a flexible plastic surface. 

They then connected the synaptic transistors to a skin sensor over the palm of a human-shaped robot hand.

The sensor registers a change in its electrical resistance when it is touched, with a light touch corresponding to a small change and a harder touch creating a larger change.
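As a rough illustration of that principle, a touch intensity could be derived from the change in the sensor’s resistance along the following lines. The baseline resistance, scaling and function name here are hypothetical rather than figures from the study:

```python
# Hypothetical sketch: mapping a change in sensor resistance to a touch intensity.
# The baseline resistance and scaling are illustrative values, not taken from the paper.

BASELINE_RESISTANCE_OHMS = 10_000.0  # assumed resistance when the skin is untouched

def touch_intensity(measured_resistance_ohms: float) -> float:
    """Return a 0-1 intensity: a light touch gives a small value, a harder press a larger one."""
    change = abs(measured_resistance_ohms - BASELINE_RESISTANCE_OHMS)
    return min(change / BASELINE_RESISTANCE_OHMS, 1.0)

print(touch_intensity(10_500.0))  # light touch  -> 0.05
print(touch_intensity(18_000.0))  # hard press   -> 0.8
```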

Scientists said this input was designed to mimic the way sensory neurons work in the human body.

In earlier prototypes of electronic skin, that input data would be sent to a computer to be processed, where there would often be delays.

The electronic skin uses a new type of processing system based on ‘synaptic transistors’, which mimic the brain’s neural pathways in order to learn

But in this design, a circuit built into the skin acts as an artificial synapse, reducing the input down into a simple spike and speeding up the process of reaction.

The team used the varying output of that voltage spike to teach the skin appropriate responses to simulated pain, which would trigger the robot hand to react. 

By setting a threshold of input voltage to cause a reaction, the researchers found they could make the robot hand recoil from a sharp jab in the centre of its palm.

In other words, it learned to move away from this source of simulated discomfort through a process of onboard information processing that mimics how the human nervous system works.
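The threshold behaviour described above can be sketched in a few lines of Python. This is a simplified illustration with an assumed voltage threshold, not the hardware circuit the Glasgow team built:

```python
# Simplified sketch of threshold-triggered recoil, with illustrative values.

PAIN_THRESHOLD_VOLTS = 0.5  # assumed threshold; the real circuit's value is set in hardware

def respond_to_spike(spike_volts: float) -> str:
    """Recoil only when the synapse-like voltage spike exceeds the pain threshold."""
    if spike_volts > PAIN_THRESHOLD_VOLTS:
        return "recoil"        # sharp jab: move the hand away
    return "no reaction"       # gentle touch: ignore

print(respond_to_spike(0.2))   # light touch -> "no reaction"
print(respond_to_spike(0.9))   # sharp jab   -> "recoil"
```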

The development of the electronic skin is the latest breakthrough in flexible, stretchable printed surfaces from the University of Glasgow’s Bendable Electronics and Sensing Technologies (BEST) Group, led by Professor Ravinder Dahiya. 

‘We all learn early on in our lives to respond appropriately to unexpected stimuli like pain in order to prevent us from hurting ourselves again,’ he said.

‘Of course, the development of this new form of electronic skin didn’t really involve inflicting pain as we know it — it’s simply a shorthand way to explain the process of learning from external stimulus.

A team of engineers from the University of Glasgow, led by Professor Ravinder Dahiya (pictured), have come up with a new prototype ‘e-skin’

‘What we’ve been able to create through this process is an electronic skin capable of distributed learning at the hardware level, which doesn’t need to send messages back and forth to a central processor before taking action. 

‘Instead, it greatly accelerates the process of responding to touch by cutting down the amount of computation required.

‘We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli.’

Fengyuan Liu, a member of the BEST group and a co-author of the paper, added: ‘In the future, this research could be the basis for a more advanced electronic skin which enables robots capable of exploring and interacting with the world in new ways, or building prosthetic limbs which are capable of near-human levels of touch sensitivity.’

The new research has been published in the journal Science Robotics.

WILL YOUR JOB BE TAKEN BY A ROBOT? PHYSICAL JOBS ARE AT THE GREATEST RISK

Physical jobs in predictable environments, including machine-operators and fast-food workers, are the most likely to be replaced by robots.

Management consultancy firm McKinsey, based in New York, focused on the number of jobs that would be lost to automation, and which professions were most at risk.

The report said collecting and processing data are two other categories of activities that increasingly can be done better and faster with machines. 

This could displace large amounts of labour – for instance, in mortgages, paralegal work, accounting, and back-office transaction processing.

Conversely, jobs in unpredictable environments are least at risk.

The report added: ‘Occupations such as gardeners, plumbers, or providers of child- and eldercare – will also generally see less automation by 2030, because they are technically difficult to automate and often command relatively lower wages, which makes automation a less attractive business proposition.’