Research by the University of Pisa and the University of California, Santa Barbara receives the Best Paper Award at the IEEE Haptics Symposium.

The vibrations transmitted through the hand and arm when we touch an object play an essential role in shaping the tactile sensations we experience, but measuring and modeling them is extremely complex.

Researchers at the University of Pisa and the University of California, Santa Barbara have developed SkinSource, an open-source tool capable of measuring and modeling skin vibrations along the upper limb in response to a very large set of localized tactile stimuli.
Potential applications range from the design of new robotic arms and hands to tactile augmented reality.

The toolbox has generated great enthusiasm in the haptics research community, which spans disciplines as diverse as neuroscience, medicine, and engineering. The work received the Best Paper Award at the Haptics Symposium, one of the major international conferences in the field, held recently in Long Beach, California.

“The sense of touch,” explains Matteo Bianchi, professor of robotics at the Department of Information Engineering of the University of Pisa, “is the most complex of our senses and the most widely distributed across the body. Our tactile sensations depend not only on local receptors in the skin, but also on several other factors, such as the vibrations that spread through the hand and arm when we touch an object. Touch is a key sense both for proprioception, that is, the ability to perceive the position of our body in space, and for exteroception, that is, the ability to understand the physical properties of objects, such as roughness and stiffness. It is through touch that we know whether we are in a stable position, or whether we are holding a paper cup firmly enough not to tip it over yet gently enough not to crush it. Without touch we could not walk, sit, consciously hold a position, or interact with, explore, and modify the world around us. For this reason, the ability to model the dynamics of our body when we touch an object, so as to reproduce them, for example, in a prosthesis, a robotic hand, or an augmented or virtual reality application, takes on great importance. Our tool can accurately predict the extent and intensity of the vibrations transmitted along the upper limb in response to different types of localized force stimuli.

“The toolbox is open-source and therefore available to the entire scientific community, freely downloadable at https://doi.org/10.5281/zenodo.10547601. Given the complexity and difficulty of reliably reproducing tactile sensations, it is precisely this aspect that has been welcomed with great enthusiasm.”
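To give a feel for the kind of prediction described above, here is a minimal sketch in Python. It illustrates one common way to model touch-elicited vibration propagation, as a linear time-invariant system in which the vibration at each point on the limb is the convolution of the input force with an impulse response measured for that location. All names and data here are hypothetical and synthetic; this is an illustration of the general technique, not the actual interface or data of the SkinSource toolbox, which is available at the link above.

```python
import numpy as np
from scipy.signal import fftconvolve

def predict_vibrations(force, impulse_responses):
    # For each measurement location, convolve the input force with that
    # location's impulse response, truncated to the input length.
    return np.stack([fftconvolve(force, h)[: force.size] for h in impulse_responses])

fs = 2000                                  # sampling rate in Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)              # 100 ms time axis, 200 samples
force = np.where(t < 0.01, np.sin(np.pi * t / 0.01), 0.0)  # 10 ms half-sine tap

# Synthetic, exponentially decaying impulse responses standing in for
# measured data at three points along the forearm:
rng = np.random.default_rng(0)
irs = rng.standard_normal((3, 64)) * np.exp(-np.arange(64) / 8.0)

vib = predict_vibrations(force, irs)
print(vib.shape)  # (3, 200): one predicted vibration trace per location
```

Under this simplifying linearity assumption, responses to a complex stimulus can be composed by superposing the responses to its localized components, which is what makes prediction over a large set of stimuli tractable.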

Indeed, beyond being a tool for studying the mechanical bases of tactile perception, SkinSource can be used to plan and design interfaces for sensory restoration in upper-limb prostheses and in teleoperated robotic hands, where tactile feedback is essential, as well as in robotic surgery and in many augmented reality applications. The Pisa engineers, within FoReLab, the Department of Information Engineering's laboratory dedicated to Industry 5.0, are already developing augmented reality devices that integrate visual and tactile perception.

“With SkinSource,” Bianchi concludes, “we can exploit the study and modeling of how mechanical energy propagates along the skin to build distributed wearable systems that deliver an immersive experience integrating visual and tactile stimuli. The system can operate alongside the augmented reality interfaces we are developing in the FoReLab laboratory, which let users see or touch real objects while their visual or tactile perception is modified through artificial stimuli.”
