“We found our 3D-printed tactile fingertip can produce artificial nerve signals that look like recordings from real, tactile neurons,” according to Bristol’s professor of robotics Nathan Lepora. “Human tactile nerves transmit signals from various mechanoreceptors, which can signal the pressure and shape of a contact. Work by Phillips and Johnson in 1981 first plotted electrical recordings from these nerves to study tactile spatial resolution using a set of standard ridged shapes. We tested our artificial fingertip as it felt those same ridged shapes and discovered a startlingly close match to the neural data.”
The fingertip is essentially a thin-walled hollow elastomer dome around 25mm across (shown cut in half, right). Through the rubber are printed short rigid rods in a contrasting colour (white in the photo). The whole thing is printed in one go on a multi-material printer.
A camera inside the dome views the ends of the rods, and image processing infers how the dome is being distorted by contact on its outer surface.
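The published papers describe the actual processing pipeline; as a minimal sketch of the general idea only, the printed pin tips could be tracked as bright blobs in the camera image and their displacement measured against a rest-state reference frame. The function and parameter names below are illustrative, not taken from the Bristol work:

```python
import cv2
import numpy as np

def detect_markers(frame):
    """Find the white pin tips in a camera frame and return their (x, y) centres."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Threshold: the printed pins are light against the dark interior of the dome.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centres = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centres.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return np.array(centres)

def marker_displacements(reference, current):
    """Match each rest-state marker to its nearest detected marker and
    return the displacement vectors produced by contact."""
    displacements = []
    for ref in reference:
        nearest = current[np.argmin(np.linalg.norm(current - ref, axis=1))]
        displacements.append(nearest - ref)
    return np.array(displacements)
```

The pattern of these displacement vectors across the pin array is what stands in for the deformation signals picked up by nerve endings in real skin.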
The structure is similar to that of a human fingertip, where a matrix of bumps just under the skin, called papillae, each containing a nerve ending, generates the signals that are processed to produce the sensation of touch. It was these papillae signals that Phillips and Johnson recorded in 1981.
“Our work helps uncover how the complex internal structure of human skin creates our human sense of touch,” said Lepora. “This is an exciting development in soft robotics – being able to 3D-print tactile skin could create robots that are more dexterous or significantly improve the performance of prosthetic hands by giving them an in-built sense of touch.”
The artificial structure is not as sensitive to fine detail as human skin, said Lepora, which he suspects is because the printed skin is thicker than the real thing; efforts are now being made to print structures on the same scale as the natural ones.
The Journal of the Royal Society Interface has published two papers on this work: ‘Artificial SA-I, RA-I and RA-II/vibrotactile afferents for tactile sensing of texture’ and ‘Artificial SA-I and RA-I afferents for tactile sensing of ridges and gratings’.
From the texture paper, the team said:
We find: spatially encoded frictional cues provide a salient representation of texture; a simple transformation of spatial tactile features to model natural afferent responses improves the temporal coding; and the harmonic structure of induced vibrations provides a pertinent code for speed-invariant texture classification.
Just as human touch relies on an interplay between slowly adapting, rapidly adapting and vibrotactile channels, this tripartite structure may be needed for future robot applications with human-like dexterity.
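The papers define the artificial afferent models in detail; purely as a hedged illustration of this tripartite idea (not the authors' implementation), the three channels could be derived from the marker displacement signals roughly as follows: a sustained, slowly varying displacement for an SA-I-like response, its rate of change for an RA-I-like response, and the spectrum of the residual high-frequency vibration for the vibrotactile channel, whose harmonic structure underlies the speed-invariant texture code mentioned above. All names and filter choices here are assumptions for the sketch:

```python
import numpy as np

def tripartite_channels(displacement_ts, fs):
    """Split a marker-displacement time series into three illustrative channels.

    displacement_ts : (T, N) array of marker displacement magnitudes over time
    fs              : sampling rate in Hz
    """
    # SA-I-like: slowly adapting, follows the sustained (low-frequency) displacement.
    sa1 = displacement_ts.mean(axis=1)

    # RA-I-like: rapidly adapting, responds to the rate of change of displacement.
    ra1 = np.abs(np.diff(sa1, prepend=sa1[0])) * fs

    # Vibrotactile-like: spectrum of the high-frequency vibration left after
    # removing a ~50 ms moving average; the relative heights of its harmonic
    # peaks change little with sliding speed, which is the basis of the
    # speed-invariant texture classification described in the quote above.
    window = int(fs * 0.05)
    vibration = sa1 - np.convolve(sa1, np.ones(window) / window, mode="same")
    spectrum = np.abs(np.fft.rfft(vibration * np.hanning(len(vibration))))
    freqs = np.fft.rfftfreq(len(vibration), d=1.0 / fs)
    return sa1, ra1, (freqs, spectrum)
```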