
EXGbuds: Universal Wearable Assistive Device for Disabled People to Interact with the Environment Seamlessly

Published: 01 March 2018

Abstract

Current assistive technologies rely on complicated, cumbersome, and expensive equipment that is not user-friendly or portable and often requires extensive fine motor control. Our approach addresses these problems with a compact, non-obtrusive, and ergonomic wearable device that measures signals associated with human physiological gestures and generates useful commands to interact with the environment. Our innovation uses machine learning and non-invasive biosensors placed on top of the ears to identify eye movements and facial expressions with over 95% accuracy. Users can control different applications, such as a robot, powered wheelchair, cell phone, smart home, or other Internet of Things (IoT) devices. Combined with a VR headset and hand-gesture recognition devices, users can apply our technology to control a camera-mounted robot (e.g., a telepresence robot, drone, or robotic manipulator) and navigate the environment in first-person view simply through eye movements and facial expressions, enabling an intuitive, completely touch-free mode of interaction. Experimental results show satisfactory performance across applications, suggesting the device can serve as a universal controller and health-monitoring tool that helps disabled people interact with the environment and measures other physiological signals.
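The abstract describes a pipeline of windowed biosignals, a learned gesture classifier, and a mapping from recognized gestures to device commands. The following is a purely illustrative sketch of that kind of pipeline; the gesture names, the synthetic signal generator, and the nearest-centroid classifier are all assumptions for the sketch, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): classify short windows of a
# 1-D biosignal into gesture labels, then map each label to a device command.
import random

random.seed(0)  # deterministic synthetic data for the example

GESTURES = ["look_left", "look_right", "blink", "smile"]
COMMANDS = {"look_left": "TURN_LEFT", "look_right": "TURN_RIGHT",
            "blink": "STOP", "smile": "GO_FORWARD"}

def synth_window(gesture, n=32):
    """Synthetic signal window; each gesture gets a distinct mean level."""
    base = GESTURES.index(gesture) * 2.0
    return [base + random.gauss(0, 0.3) for _ in range(n)]

def features(window):
    """Two toy features per window: mean and mean absolute deviation."""
    m = sum(window) / len(window)
    mad = sum(abs(x - m) for x in window) / len(window)
    return (m, mad)

# "Train" a nearest-centroid classifier on labelled synthetic windows.
centroids = {}
for g in GESTURES:
    feats = [features(synth_window(g)) for _ in range(20)]
    centroids[g] = tuple(sum(c) / len(c) for c in zip(*feats))

def classify(window):
    """Assign the window to the gesture with the nearest feature centroid."""
    f = features(window)
    return min(centroids,
               key=lambda g: sum((a - b) ** 2 for a, b in zip(f, centroids[g])))

def to_command(window):
    """Translate a classified gesture into a device command."""
    return COMMANDS[classify(window)]

print(to_command(synth_window("blink")))  # prints STOP
```

In a real system the window would come from the ear-mounted biosensors, the features would be richer (spectral bands, signal energy), and the classifier would be a trained model rather than centroids; the command table is the part an application (wheelchair, drone, IoT device) would customize.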

References

[1] W. Jia, J. Wu, D. Gao, and M. Sun. Characteristics of Skin-Electrode Impedance for a Novel Screw Electrode. Proceedings of the IEEE Annual Northeast Bioengineering Conference, 1--2, 2014.
[2] B. Luan, W. Jia, P. D. Thirumala, J. Balzer, D. Gao, and M. Sun. A feasibility study on a single-unit wireless EEG sensor. The 12th IEEE International Conference on Signal Processing (ICSP 2014), Chicago, USA, 2282--2285, October 2014.
[3] K.-J. Wang, L. Zhang, B. Luan, H.-W. Tung, Q. Liu, J. Wei, M. Sun, and Z.-H. Mao. Brain-computer interface combining eye saccade two-electrode EEG signals and voice cues to improve the maneuverability of wheelchair. Proceedings of the 15th IEEE/RAS-EMBS International Conference on Rehabilitation Robotics (ICORR 2017), London, UK, 1073--1078, July 2017.
[4] Myo Gesture Control Armband website. https://www.myo.com/, 2017.
[5] D. H. Lee and A. K. Anderson. Reading what the mind thinks from how the eye sees. Psychological Science, 28(4), 494--503, 2017.
[6] Lots of people dislike voice assistants. https://www.cnet.com/news/lots-of-people-dislike-voiceassistants-blame-siri/, 2017.




Published In

HRI '18: Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction
March 2018
431 pages
ISBN: 9781450356152
DOI: 10.1145/3173386
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. brain-computer interface
  2. human-robot interaction
  3. teleoperation
  4. vr/ar
  5. wearable devices

Qualifiers

  • Abstract

Conference

HRI '18

Acceptance Rates

HRI '18 paper acceptance rate: 49 of 206 submissions, 24%
Overall acceptance rate: 192 of 519 submissions, 37%


Article Metrics

  • Downloads (last 12 months): 21
  • Downloads (last 6 weeks): 3
Reflects downloads up to 10 Dec 2024


Cited By

  • (2022) Artificial Intelligence of Things Applied to Assistive Technology: A Systematic Literature Review. Sensors 22(21), 8531. DOI: 10.3390/s22218531. Online publication date: 5-Nov-2022.
  • (2021) Computer Vision applied to improve interaction and communication of people with motor disabilities: A systematic mapping. Technology and Disability 33(1), 11--28. DOI: 10.3233/TAD-200308. Online publication date: 24-Feb-2021.
  • (2021) Wearable Interactions for Users with Motor Impairments: Systematic Review, Inventory, and Research Implications. Proceedings of the 23rd International ACM SIGACCESS Conference on Computers and Accessibility, 1--15. DOI: 10.1145/3441852.3471212. Online publication date: 17-Oct-2021.
  • (2021) Designing Telepresence Drones to Support Synchronous, Mid-air Remote Collaboration: An Exploratory Study. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1--17. DOI: 10.1145/3411764.3445041. Online publication date: 6-May-2021.
  • (2020) A Review of Smart Design Based on Interactive Experience in Building Systems. Sustainability 12(17), 6760. DOI: 10.3390/su12176760. Online publication date: 20-Aug-2020.
  • (2019) Inferring Static Hand Poses from a Low-Cost Non-Intrusive sEMG Sensor. Sensors 19(2), 371. DOI: 10.3390/s19020371. Online publication date: 17-Jan-2019.
  • (2019) Hand- and gaze-control of telepresence robots. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1--8. DOI: 10.1145/3317956.3318149. Online publication date: 25-Jun-2019.
  • (2019) Using Inferred Gestures from sEMG Signal to Teleoperate a Domestic Robot for the Disabled. Advances in Computational Intelligence, 198--207. DOI: 10.1007/978-3-030-20518-8_17. Online publication date: 16-May-2019.
