1. Gaze-Based Control of a Robot Arm in Three-Dimensional Space
- Author
- Alsharif, Shiva, Gräser, Axel, and Anheier, Walter
- Subjects
eye-gaze interaction, 7 degrees of freedom (DOF), gaze gesture, gaze-gesture recognition, 620 Engineering, inactive-zone gaze gesture recognition, dynamic command area, gaze gesture-based human-robot interaction (HRI), robot arm, hands-free human-robot interface, dynamic time warping (DTW), ddc:620, solution for the Midas touch problem, complex gaze gestures
- Abstract
Eye-tracking technology has opened up a new communication channel for people with very restricted body movements. Such devices have already been applied successfully as human-computer interfaces, e.g. for writing text or for controlling devices such as a wheelchair. This thesis proposes a Human-Robot Interface (HRI) that enables the user to control a robot arm in 3-dimensional space using only the 2-dimensional gaze direction and the states of the eyes. The proposed interface provides all commands required to translate and rotate the gripper and to open or close it, organized into different control modes. In each mode, different commands are available, and the user's gaze direction is applied directly to generate continuous robot commands. To distinguish natural inspection eye movements from eye movements intended to control the robot arm (the Midas touch problem), dynamic command areas are proposed. The dynamic command areas are defined around the robot gripper and are updated as it moves. To provide direct interaction, gaze gestures and states of the eyes are used to switch between the control modes. For this thesis, two versions of the above-introduced HRI were developed. In the first version, only two simple gaze gestures and two eye states (closed eyes and eye winking) are used for switching. In the second version, four complex gaze gestures are applied instead of the two simple gestures, and the positions of the dynamic command areas were optimized. The complex gaze gestures enable the user to switch directly from the initial mode to the desired control mode. These gestures are flexible and can be generated directly in the robot environment. For the recognition of the complex gaze gestures, a novel algorithm based on Dynamic Time Warping (DTW) is proposed. The results of the studies conducted with both HRIs confirmed their feasibility and showed the high potential of the proposed interfaces as hands-free interfaces. Furthermore, the results of subjective and objective measurements showed that the usability of the interface with simple gaze gestures was improved by the integration of complex gaze gestures and the new positions of the dynamic command areas.
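The abstract names Dynamic Time Warping as the basis for recognizing complex gaze gestures. As an illustration only — the thesis's actual recognition algorithm is not reproduced here — the following minimal Python sketch shows how a recorded 2-D gaze trajectory could be matched against stored gesture templates using a plain DTW distance. The function names, the template dictionary, and the acceptance threshold are assumptions made for this example.

```python
import numpy as np

def dtw_distance(traj_a, traj_b):
    """Dynamic Time Warping distance between two 2-D gaze trajectories.

    traj_a, traj_b: sequences of (x, y) gaze samples.
    Returns the accumulated warping cost between the trajectories.
    """
    a = np.asarray(traj_a, dtype=float)
    b = np.asarray(traj_b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local point-to-point distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_gesture(observed, templates, threshold=50.0):
    """Match an observed gaze trajectory against gesture templates.

    templates: dict mapping gesture name -> reference trajectory.
    Returns the best-matching gesture name, or None if no template
    is closer than the (assumed) acceptance threshold.
    """
    best_name, best_cost = None, np.inf
    for name, template in templates.items():
        c = dtw_distance(observed, template)
        if c < best_cost:
            best_name, best_cost = name, c
    return best_name if best_cost < threshold else None
```

Because DTW aligns trajectories nonlinearly in time, such a matcher tolerates gestures performed at different speeds, which is one reason it is a common choice for gaze- and stroke-gesture recognition.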
- Published
- 2018