
EXGbuds

Authors :
Hsiao-Wei Tung
Ker-Jiun Wang
Prakash C. Thakur
Zhi-Hong Mao
Zihang Huang
Ming-Xian You
Source :
HRI (Companion)
Publication Year :
2018
Publisher :
ACM, 2018.

Abstract

Current assistive technologies require complicated, cumbersome, and expensive equipment that is not user-friendly, not portable, and often demands extensive fine motor control. Our approach addresses these problems by developing a compact, non-obtrusive, and ergonomic wearable device that measures signals associated with human physiological gestures and generates useful commands for interacting with the environment. Our innovation uses machine learning and non-invasive biosensors placed on top of the ears to identify eye movements and facial expressions with over 95% accuracy. Users can control different applications, such as a robot, powered wheelchair, cell phone, smart home, or other Internet of Things (IoT) devices. Combined with a VR headset and hand gesture recognition devices, users can apply our technology to control a camera-mounted robot (e.g., a telepresence robot, drone, or any robotic manipulator) and navigate the environment in a first-person view simply through eye movements and facial expressions, enabling a human-intuitive, entirely touch-free mode of interaction. The experimental results show satisfactory performance across different applications, suggesting the device can serve as a universal controller and health monitoring tool that helps disabled people interact with the environment and measures other physiological signals.
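The abstract describes a pipeline of biosensor signals, machine-learning classification of gestures, and mapping of predictions to device commands, but it does not specify the features or model used. The following is a minimal, purely illustrative sketch of such a pipeline; the window sizes, channel count, statistical features, SVM classifier, and command mapping are all assumptions and not the authors' actual implementation.

```python
# Minimal sketch of a biosignal gesture-classification pipeline in the spirit of the abstract.
# All signal shapes, features, and the SVM model are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel statistics over one window of biosignal samples."""
    return np.concatenate([
        window.mean(axis=0),                            # mean amplitude per channel
        window.std(axis=0),                             # variability per channel
        np.abs(np.diff(window, axis=0)).mean(axis=0),   # mean absolute first difference
    ])

# Hypothetical dataset: 400 windows of 250 samples from 2 sensor channels,
# labeled with 4 gesture classes (e.g., look-left, look-right, blink, smile).
rng = np.random.default_rng(0)
windows = rng.normal(size=(400, 250, 2))
labels = rng.integers(0, 4, size=400)

X = np.stack([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0
)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Each predicted class would then be translated into a device command,
# e.g. {0: "turn_left", 1: "turn_right", 2: "select", 3: "stop"}.
```

In a real system, the classifier would run on streaming windows from the wearable and forward the mapped commands to the target device (robot, wheelchair, or IoT endpoint).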

Details

Database :
OpenAIRE
Journal :
Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
Accession number :
edsair.doi...........2afb3c5b5e4625bfdf5ceddaa5849b2a
Full Text :
https://doi.org/10.1145/3173386.3177836