Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.

Journal: Journal of NeuroEngineering and Rehabilitation
Published Date:

Abstract

BACKGROUND: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task, and the signals extracted from the brain may not always drive these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control in which the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.
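To make the shared-control idea concrete, the sketch below (Python, not from the paper) shows one common arbitration scheme: the BMI-decoded velocity is linearly blended with an autonomous controller's velocity, with the assistance weight increasing as the hand approaches a vision-detected object. The function and parameter names (blend_velocities, assist_radius, max_assist) are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def blend_velocities(bmi_velocity, auto_velocity, hand_to_object_dist,
                         assist_radius=0.15, max_assist=0.7):
        """Blend user (BMI-decoded) and autonomous velocity commands.

        The arbitration weight ramps up as the hand nears the object, so the
        user keeps full control far from the target and receives positioning
        help during the final approach. (Hypothetical parameters.)
        """
        # Assistance grows linearly from 0 (outside assist_radius) to
        # max_assist (at the object); the user always retains some authority.
        alpha = max_assist * max(0.0, 1.0 - hand_to_object_dist / assist_radius)
        return (1.0 - alpha) * np.asarray(bmi_velocity) + alpha * np.asarray(auto_velocity)

    # Example: far from the object the command is essentially the user's;
    # close to it, the autonomous controller contributes more.
    user_cmd = np.array([0.05, 0.00, -0.02])   # m/s, decoded from neural activity
    robot_cmd = np.array([0.02, 0.03, -0.05])  # m/s, toward a vision-detected grasp pose
    print(blend_velocities(user_cmd, robot_cmd, hand_to_object_dist=0.30))  # user-dominated
    print(blend_velocities(user_cmd, robot_cmd, hand_to_object_dist=0.05))  # assistance ramps in

The distance-dependent weight is one simple design choice; other shared-control schemes gate assistance on decoded intent or confidence rather than proximity alone.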

Authors

  • John E Downey
    Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA.
  • Jeffrey M Weiss
    Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA.
  • Katharina Muelling
    Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
  • Arun Venkatraman
    Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
  • Jean-Sebastien Valois
    Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
  • Martial Hebert
    Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
  • J Andrew Bagnell
    Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA.
  • Andrew B Schwartz
    Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA.
  • Jennifer L Collinger
    Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA. collinger@pitt.edu.