A.Eye Drive: Gaze-based semi-autonomous wheelchair interface.
Journal:
Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Published Date:
Jul 1, 2019
Abstract
Existing wheelchair control interfaces, such as sip-and-puff or screen-based gaze-controlled cursors, are challenging for severely disabled users to operate safely and independently, because they demand continuous interaction with the interface throughout navigation. This places a significant cognitive load on users and prevents them from engaging with the environment in other ways while driving. We have combined eye-tracking/gaze-contingent intention decoding with context-aware computer vision algorithms and autonomous navigation techniques drawn from self-driving vehicles to let paralysed users drive by eye, simply by decoding natural gaze to infer where the user wants to go: A.Eye Drive. Our "Zero UI" driving platform allows users to look at and visually interact with an object or destination of interest in their visual scene; the wheelchair then autonomously takes the user to the intended destination, continuously updating the computed path to account for static and dynamic obstacles. This intention-decoding technology empowers the end user by promising more independence through their own agency.
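The abstract gives no implementation details, so purely as an illustration of the pipeline it describes (a sustained gaze fixation is decoded as the intended destination, after which the chair drives autonomously and replans around obstacles every control step), here is a minimal, hypothetical Python sketch. All names and parameters (decode_goal, plan, DWELL_FRAMES, the occupancy grid) are invented for illustration and are not from the A.Eye Drive system.

```python
# Hypothetical sketch of a gaze-to-navigation loop; names, thresholds, and
# the BFS planner are illustrative stand-ins, not the paper's method.
from collections import deque

DWELL_FRAMES = 30  # assumed: ~1 s fixation at 30 Hz counts as an intended goal

def decode_goal(gaze_points):
    """Decode intent from natural gaze: a sustained fixation on one grid
    cell is taken as the user's intended destination."""
    if len(gaze_points) < DWELL_FRAMES:
        return None
    recent = list(gaze_points)[-DWELL_FRAMES:]
    return recent[0] if all(p == recent[0] for p in recent) else None

def plan(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle);
    stands in for the autonomous-navigation planner."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:                      # reconstruct start-to-goal path
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if (0 <= nx < len(grid) and 0 <= ny < len(grid[0])
                    and grid[nx][ny] == 0 and nxt not in came_from):
                came_from[nxt] = cur
                frontier.append(nxt)
    return None  # destination unreachable

# Drive loop: decode the goal once from gaze, then replan on every step so
# the path is continuously updated for static and dynamic obstacles.
grid = [[0] * 5 for _ in range(5)]
gaze = deque([(4, 4)] * DWELL_FRAMES, maxlen=60)  # user fixates the far corner
pos, goal = (0, 0), decode_goal(gaze)
while goal and pos != goal:
    grid[2][2] = 1                 # a sensed obstacle appears mid-drive
    path = plan(grid, pos, goal)
    if path is None:
        break                      # blocked; wait for a new fixation
    pos = path[1]                  # take one step along the fresh path
print("arrived at", pos)
```

In this toy version, replanning from scratch on every step is what lets the path adapt to dynamic obstacles; the real system would presumably use richer sensing and a continuous planner, but the control flow (decode intent once, then navigate autonomously) mirrors the "Zero UI" idea in the abstract.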