Revolutionary touch-sensing synthetic skin developed in Stanford research

© DARPA / Handout via Reuters
A breakthrough in optogenetics research marks a huge leap toward synthetic skin capable of sensing. The prototype can currently distinguish pressure, but the decade-long Stanford research is far from over.

Not long ago, cybernetic limbs alone seemed amazing. That could change very quickly: a decade-long research project by Stanford University scientists has just made a huge breakthrough in mimicking human skin’s ability to feel everything from temperature to pain. And they’ve created a plastic skin to prove it.

The design involves two layers. The top layer holds the sensing mechanism, while the bottom layer acts as a conductor, translating the electric signals into biochemical stimuli that can be transmitted to nerve cells. For now, it’s all about pressure: the latest model has a built-in sensor that can detect pressure changes over the same range as human skin.

This simple communication “is the first time a flexible, skin-like material has been able to detect pressure and also transmit a signal to a component of the nervous system,” says Andre Berndt, one of the 17 researchers.

It’s the brainchild of Zhenan Bao, whose final objective isn’t just a skin that can feel, but one that can heal the way a human’s does. And Stanford says that one aspect of our ability to sense touch has just been achieved: the ability to distinguish things like a limp handshake from a firm one.

The researchers exploited and improved a discovery they made five years ago, when they first managed to use plastic as a pressure sensor by mimicking the natural springiness of human skin. Now they’ve fitted the skin with billions of carbon nanotubes. As pressure is applied to the skin, the tubes squeeze closer together, letting more current flow. The result is a stream of tiny electrical pulses whose frequency rises with pressure, combining to create the sensation of feeling, a bit like Morse code, according to Bao and the team.
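The encoding described above can be illustrated with a short sketch. This is not the team’s actual signal chain; the function names, pressure range, and pulse rates are hypothetical, chosen only to show the idea that firmer pressure produces a faster train of identical pulses.

```python
def pressure_to_pulse_rate(pressure_kpa, min_rate_hz=1.0,
                           max_rate_hz=200.0, max_pressure_kpa=100.0):
    """Map a pressure reading to a pulse frequency (illustrative numbers only)."""
    fraction = min(max(pressure_kpa / max_pressure_kpa, 0.0), 1.0)
    return min_rate_hz + fraction * (max_rate_hz - min_rate_hz)

def pulse_train(pressure_kpa, duration_s=1.0):
    """Return timestamps of evenly spaced pulses over the given duration.

    The pulses themselves are identical; only their rate carries the
    pressure information, like the timing of dots in Morse code.
    """
    rate = pressure_to_pulse_rate(pressure_kpa)
    n_pulses = int(rate * duration_s)
    return [i / rate for i in range(n_pulses)]
```

A firm press (say, 100 kPa) would then yield many more pulses per second than a light touch, which is the distinction the prototype exploits to tell a firm handshake from a limp one.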

The most complicated part was getting the signals recognized by a biological neuron. The team did this with optogenetics, which does exactly what the name says: it combines optics and genetics. The researchers had to specially engineer cells that responded to specific frequencies of light. Using light pulses, they could then switch things on and off, either the cells themselves or specific processes inside them.

But optogenetics was only a proof-of-concept stage. Bao’s team believes that as the research matures, a different method will be used in actual prosthetic devices. They are confident because work has already been done to prove that direct stimulation of neurons with electrical pulses is possible.

In the end, Bao hopes to make sensors that, upon touch, could distinguish between silk and corduroy. The two-layer method works well at this stage, however, since it allows new sensations to be added as new mechanisms are developed.

And there are a total of six bio-sensing mechanisms in the human hand; the current research replicates only one. "We have a lot of work to take this from experimental to practical applications," Bao says. "But after spending many years in this work, I now see a clear path where we can take our artificial skin."

That is not to say researchers elsewhere aren’t trying to do the exact same thing. After all, science is one big race: a mid-September report from the Defense Advanced Research Projects Agency (DARPA) claimed it had not only produced a prosthetic limb that could be controlled by thought, but that the limb granted the wearer a “near-natural” sense of touch.


This adds to the already impressive list of recent research that wants the human brain to talk to robotic body parts.