Robotic hand illusion with tactile feedback: Unravelling the relative contribution of visuotactile and visuomotor input to the representation of body parts in space.

Journal: PLOS ONE
Published Date:

Abstract

The rubber hand illusion describes a phenomenon in which participants experience a rubber hand as part of their body when synchronous visuotactile stimulation is applied to the real and the artificial limb. In the recently introduced robotic hand illusion (RobHI), a robotic hand is incorporated into one's body representation through the integration of synchronous visuomotor information. However, no setup so far has combined visuotactile and visuomotor feedback, which is expected to unravel mechanisms that cannot be detected in experimental designs applying these inputs in isolation. We developed a robotic hand, controlled by a sensor glove and equipped with pressure sensors, and systematically and separately varied the synchrony of motor feedback (MF) and tactile feedback (TF). In Experiment 1, we implemented a ball-grasping task and assessed the proprioceptive drift of one's own hand as a behavioral measure of the spatial calibration of body coordinates, as well as explicit embodiment experiences by questionnaire. Results revealed significant main effects of both MF and TF on proprioceptive drift, but only a main effect of MF on perceived embodiment. Furthermore, for proprioceptive drift we found that synchronous feedback in one factor compensated for asynchronous feedback in the other. In Experiment 2, with a new sample of naïve participants, we further explored this finding by adding unimodal conditions in which we manipulated the presence or absence of MF and/or TF. These findings replicated the results of Experiment 1, and we further found evidence for a super-additive multisensory effect on spatial body representation caused by the presence of both factors. Results on conscious body perception were less consistent across the two experiments. The findings indicate that sensory and motor input contribute equally to the representation of spatial body coordinates, which in turn are subject to multisensory enhancement effects. The results outline the potential of human-in-the-loop approaches and might have important implications for clinical applications such as the future design of robotic prostheses.
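
The abstract does not detail the control software, but a minimal sketch may help illustrate the two manipulated channels: motor feedback (MF) from the sensor glove to the robotic hand, and tactile feedback (TF) from the hand's pressure sensors back to the participant, each of which can be delayed independently to create synchronous or asynchronous conditions, along with the conventional computation of proprioceptive drift as post- minus pre-trial judged hand position. All interfaces (`glove`, `robot_hand`, `pressure_sensors`, `vibrotactors`), the vibrotactile return channel, and the delay and loop-rate values are hypothetical placeholders, not the authors' implementation.

```python
import time
from collections import deque

# Hypothetical sketch (not the authors' setup): forward glove motion to the
# robotic hand (MF) and fingertip pressure back to the real hand (TF), with an
# independently configurable delay per channel to manipulate synchrony.

MOTOR_DELAY_S = 0.0     # 0.0 = synchronous MF; e.g. 0.5 = asynchronous MF (assumed value)
TACTILE_DELAY_S = 0.0   # 0.0 = synchronous TF; e.g. 0.5 = asynchronous TF (assumed value)
LOOP_DT_S = 0.01        # assumed 100 Hz control loop


def delayed(buffer: deque, sample, delay_s: float, now: float):
    """Push a timestamped sample; return the newest sample at least delay_s old."""
    buffer.append((now, sample))
    out = None
    while buffer and now - buffer[0][0] >= delay_s:
        out = buffer.popleft()[1]
    return out


def control_loop(glove, robot_hand, pressure_sensors, vibrotactors):
    """Forward MF and TF, each through its own delay line."""
    mf_buf, tf_buf = deque(), deque()
    while True:
        now = time.monotonic()

        # Motor feedback: glove joint angles -> robotic hand command
        cmd = delayed(mf_buf, glove.read_joint_angles(), MOTOR_DELAY_S, now)
        if cmd is not None:
            robot_hand.set_joint_angles(cmd)

        # Tactile feedback: robotic fingertip pressure -> stimulation of the real hand
        touch = delayed(tf_buf, pressure_sensors.read(), TACTILE_DELAY_S, now)
        if touch is not None:
            vibrotactors.set_intensity(touch)

        time.sleep(LOOP_DT_S)


def proprioceptive_drift(pre_cm: float, post_cm: float) -> float:
    """Drift = post- minus pre-trial judged hand position; positive values
    indicate a shift of the perceived own hand toward the robotic hand."""
    return post_cm - pre_cm
```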

Authors

  • The Vu Huynh
    Work and Engineering Psychology Research Group, Technische Universität Darmstadt, Darmstadt, Germany.
  • Robin Bekrater-Bodmann
    Department of Cognitive and Clinical Neuroscience, Central Institute of Mental Health, Medical Faculty Mannheim, Heidelberg University, Mannheim, Germany.
  • Jakob Fröhner
    Chair of Information-oriented Control, Department of Electrical and Computer Engineering, Technical University of Munich, Munich, Germany.
  • Joachim Vogt
    Work and Engineering Psychology Research Group, Technische Universität Darmstadt, Darmstadt, Germany.
  • Philipp Beckerle
    Institut für Mechatronische Systeme im Maschinenbau, Technische Universität Darmstadt, Otto-Berndt-Straße 2, 64287, Darmstadt, Germany.