Research article · Open access
DOI: 10.1145/3411764.3445197

SoloFinger: Robust Microgestures while Grasping Everyday Objects

Published: 07 May 2021

Abstract

Using microgestures, prior work has successfully enabled gestural interactions while holding objects. Yet, these existing methods are prone to false activations caused by natural finger movements while holding or manipulating the object. We address this issue with SoloFinger, a novel concept that allows the design of microgestures that are robust against movements that naturally occur during primary activities. Using a data-driven approach, we establish that single-finger movements are rare in everyday hand-object actions and infer a single-finger input technique that is resilient to false activation. We demonstrate this concept’s robustness using a white-box classifier on a pre-existing dataset comprising 36 everyday hand-object actions. Our findings validate that simple SoloFinger gestures can alleviate the need for complex finger configurations or delimiting gestures and that SoloFinger is applicable to diverse hand-object actions. Finally, we demonstrate SoloFinger’s high performance on commodity hardware using random forest classifiers.
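The abstract mentions recognizing microgestures with random forest classifiers on commodity hardware. As a rough illustration of that kind of pipeline (not the authors’ implementation), the sketch below trains a scikit-learn RandomForestClassifier to separate a deliberate single-finger microgesture from natural whole-hand motion using simple per-finger motion statistics; the window length, feature set, and synthetic data are illustrative assumptions.

```python
# Minimal sketch, not the authors' pipeline: a random-forest classifier that
# separates a deliberate single-finger microgesture from the natural finger
# motion that occurs while grasping an object. The window length, feature
# set, and synthetic data below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(window):
    """Summarize one motion window (frames x 5 fingertips x 3 axes) with
    simple per-finger statistics: mean speed, peak speed, and the range of
    motion along each axis."""
    velocity = np.diff(window, axis=0)          # frame-to-frame displacement
    speed = np.linalg.norm(velocity, axis=2)    # (frames-1) x 5 fingertips
    return np.concatenate([
        speed.mean(axis=0),                     # mean speed per finger
        speed.max(axis=0),                      # peak speed per finger
        np.ptp(window, axis=0).ravel(),         # motion range per finger/axis
    ])

# Placeholder data: 400 windows of 60 frames for 5 fingertips in 3D. In a
# real system these would come from a motion-capture glove or hand tracker.
windows = rng.normal(size=(400, 60, 5, 3))
labels = rng.integers(0, 2, size=400)           # 0 = natural motion, 1 = microgesture

X = np.stack([window_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```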





      Published In

      CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
      May 2021
      10862 pages
      ISBN: 9781450380966
      DOI: 10.1145/3411764
      This work is licensed under a Creative Commons Attribution-NonCommercial International 4.0 License.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 07 May 2021


      Author Tags

      1. everyday objects
      2. false activation
      3. grasping
      4. microgesture

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      CHI '21

      Acceptance Rates

      Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


      Bibliometrics & Citations

      Article Metrics

      • Downloads (last 12 months): 590
      • Downloads (last 6 weeks): 65
      Reflects downloads up to 11 Dec 2024

      Cited By
      • (2024) Studying the Simultaneous Visual Representation of Microgestures. Proceedings of the ACM on Human-Computer Interaction 8 (MHCI), 1–34. https://doi.org/10.1145/3676523 (online 24 Sep 2024)
      • (2024) Understanding Gesture and Microgesture Inputs for Augmented Reality Maps. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 409–423. https://doi.org/10.1145/3643834.3661630 (online 1 Jul 2024)
      • (2024) Stick-To-XR: Understanding Stick-Based User Interface Design for Extended Reality. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 168–179. https://doi.org/10.1145/3643834.3661627 (online 1 Jul 2024)
      • (2024) GraV: Grasp Volume Data for the Design of One-Handed XR Interfaces. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 151–167. https://doi.org/10.1145/3643834.3661567 (online 1 Jul 2024)
      • (2024) GraspUI: Seamlessly Integrating Object-Centric Gestures within the Seven Phases of Grasping. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1275–1289. https://doi.org/10.1145/3643834.3661551 (online 1 Jul 2024)
      • (2024) Designing Stick-Based Extended Reality Controllers: A Participatory Approach. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–6. https://doi.org/10.1145/3613905.3650925 (online 11 May 2024)
      • (2024) ecSkin: Low-Cost Fabrication of Epidermal Electrochemical Sensors for Detecting Biomarkers in Sweat. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–20. https://doi.org/10.1145/3613904.3642232 (online 11 May 2024)
      • (2024) A survey of deep learning methods and datasets for hand pose estimation from hand-object interaction images. Computers and Graphics 116 (C), 474–490. https://doi.org/10.1016/j.cag.2023.09.013 (online 4 Mar 2024)
      • (2023) VibAware: Context-Aware Tap and Swipe Gestures Using Bio-Acoustic Sensing. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1–12. https://doi.org/10.1145/3607822.3614544 (online 13 Oct 2023)
      • (2023) Studying the Visual Representation of Microgestures. Proceedings of the ACM on Human-Computer Interaction 7 (MHCI), 1–36. https://doi.org/10.1145/3604272 (online 13 Sep 2023)
