DOI: 10.1145/3544793.3560363 · UbiComp conference proceedings · Poster

Tongue Gestures for Hands-Free Interaction in Head Worn Displays

Published: 24 April 2023

Abstract

Head worn displays are often used in situations where users’ hands are occupied or otherwise unusable due to permanent or situational movement impairments. Hands-free interaction methods such as voice recognition and gaze tracking enable accessible interaction with fewer constraints on user ability and environment. Tongue gestures offer an alternative method of private, hands-free and accessible interaction. However, past tongue gesture interfaces have relied on intrusive or otherwise inconvenient form factors, preventing their integration into head worn displays. We present a multimodal tongue gesture interface that uses existing commercial headsets and sensors located only in the upper face. We consider design factors for choosing robust and usable tongue gestures, introduce eight gestures based on these criteria, and discuss early work towards tongue gesture recognition with the system.


Cited By

  • TongueTap: Multimodal Tongue Gesture Recognition with Head-Worn Devices. In Proceedings of the 25th International Conference on Multimodal Interaction (2023), 564–573. https://doi.org/10.1145/3577190.3614120 (Published: 9 October 2023)
  • Exploring user-defined gestures for lingual and palatal interaction. Journal on Multimodal User Interfaces 17, 3 (2023), 167–185. https://doi.org/10.1007/s12193-023-00408-7 (Published: 10 August 2023)


Published In

UbiComp/ISWC '22 Adjunct: Adjunct Proceedings of the 2022 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2022 ACM International Symposium on Wearable Computers
September 2022, 538 pages
ISBN: 9781450394239
DOI: 10.1145/3544793

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. BCI
      2. hands-free
      3. multimodal interface
      4. non-intrusive
      5. tongue gestures
      6. tongue interface

      Qualifiers

      • Poster
      • Research
      • Refereed limited

      Conference

      UbiComp/ISWC '22

      Acceptance Rates

      Overall Acceptance Rate 764 of 2,912 submissions, 26%

Article Metrics

  • Downloads (last 12 months): 45
  • Downloads (last 6 weeks): 4

Reflects downloads up to 13 January 2025.

