Integration Of An Eye Gaze Interface And BCI With Biofeedback For Human-Robot Interaction

This paper presents a combined eye gaze and brain-controlled interface in which eye gaze is used to select a target and motor imagery is used to drive a mobile robot toward the target. The system provided vibrotactile haptic feedback about where the user's eye gaze was being tracked and kinesthetic haptic feedback about the brain activity associated with movement intention. The system was tested with five non-disabled adults and one individual with physical impairments. Participants performed a robotic task of knocking down a pile of blocks with and without the haptic feedback, and task completion times were compared. All six participants completed the robotic task faster with the haptic feedback than without it, and five participants felt that the task with the haptic feedback required less workload than the task without it. These results suggest that haptic feedback can be a feasible component of eye gaze and brain-controlled interfaces.