A. BICCHI*^, P. Pietrini*, E. Ricciardi*, M. Santello^^
*Università di Pisa, IT – ^Istituto Italiano di Tecnologia, IT – ^^Arizona State University, US
bicchi@ing.unipi.it

Antonio Bicchi is Professor of Robotics at the University of Pisa, a Senior Scientist at the Italian Institute of Technology in Genoa, and an Adjunct Professor at Arizona State University. His main research interests are in Robotics, Haptics, and Control Systems. He has published more than 400 papers and is the recipient of several awards and honors, including, in 2012, an individual Advanced Grant from the European Research Council for his research on human and robot hands.
In the last decade, in collaboration with Pietro Pietrini and Emiliano Ricciardi of the University of Pisa and Marco Santello of Arizona State University, he has built a multidisciplinary group that draws on the interaction between neuroscience and robotics to study human perception and motor control and to develop assistive technologies.
Vision and touch, two very complex and apparently very different human senses, have more in common than it may appear. Both are eminently active senses that dynamically control the motion of their organs, the eye and the hand, to elicit information. Moreover, the elementary information each provides is so complex that a great deal of built-in abstraction capability must be postulated to explain their astounding real-time performance.
Although vision usually plays a central role in how we represent and interact with the world around us, several observations on congenitally blind and sighted individuals indicate that the lack of visual experience may have only limited effects on the perception and mental representation of the surrounding world. These studies suggest that much of the cortical functional architecture of the brain appears to be programmed to develop even in the absence of any visual experience, and to be able to process non-visual sensory information, a property that can be defined as supramodality. This more abstract nature of brain organization may explain how congenitally blind individuals acquire knowledge of, form mental representations of, and effectively interact with an external world that they have never seen.
Thus, for instance, the well-studied perception of optic flow, which plays a fundamental role in the assessment of dynamic motion, has a counterpart in touch: the cortical correlates of tactile flow have been observed to share supramodal patterns with those of optic flow.
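To make the computational notion of "flow" concrete, the short Python sketch below estimates a single global optic-flow vector from two synthetic frames using the standard brightness-constancy equation I_x*u + I_y*v + I_t = 0, solved by least squares. The synthetic frames, parameter values, and the global-motion simplification are illustrative assumptions for this sketch only; they are not the methods used in the studies discussed here.

    # Minimal, self-contained sketch of optic-flow estimation under the
    # brightness-constancy assumption I_x*u + I_y*v + I_t = 0, solved for a
    # single global motion (u, v) by least squares. Synthetic, illustrative
    # data; not the procedure used in the studies described above.
    import numpy as np

    # Smooth synthetic frame and a copy shifted one pixel down and one right.
    coords = np.arange(64)
    rows, cols = np.meshgrid(coords, coords, indexing="ij")
    frame0 = np.sin(2 * np.pi * cols / 32) + np.cos(2 * np.pi * rows / 32)
    frame1 = np.roll(frame0, (1, 1), axis=(0, 1))   # true flow: (u, v) = (1, 1)

    # Spatial gradients of the first frame and the temporal difference.
    grad_rows, grad_cols = np.gradient(frame0)
    grad_t = frame1 - frame0

    # Each pixel contributes one constraint I_x*u + I_y*v = -I_t.
    A = np.stack([grad_cols.ravel(), grad_rows.ravel()], axis=1)
    b = -grad_t.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    print(f"estimated flow: u = {u:.2f}, v = {v:.2f} pixels/frame")

Running the sketch approximately recovers the imposed one-pixel displacement; the same constraint, applied locally rather than globally, underlies dense optic-flow estimators, and an analogous notion of flow can be defined for the deformation patterns sensed by the fingertip in tactile flow.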
Not only have these and other findings produced a better understanding of our brain, they have also proved very useful for developing innovative technology, such as haptic interfaces that enable visually impaired people to increase their independence and social inclusion, and prosthetic devices that afford people deprived of their hands simpler and better artificial replacements and render their control more natural.