Research Article
DOI: 10.1145/3173574.3173813

KeyTime: Super-Accurate Prediction of Stroke Gesture Production Times

Published: 21 April 2018

Abstract

We introduce KeyTime, a new technique and accompanying software for predicting the production times of users' stroke gestures articulated on touchscreens. KeyTime employs the principles and concepts of the Kinematic Theory, such as lognormal modeling of stroke gestures' velocity profiles, to estimate gesture production times significantly more accurately than existing approaches. Our experimental results, obtained on several public datasets, show that KeyTime predicts user-independent production times that correlate at r=.99 with ground truth from just one example of a gesture articulation, while delivering an average error in the predicted time magnitude that is 3 to 6 times smaller than that of CLC, the best prediction technique to date. Moreover, KeyTime reports a wide range of useful statistics, such as the trimmed mean, median, standard deviation, and confidence intervals, providing practitioners with unprecedented accuracy and sophistication for characterizing their users' a priori time performance with stroke gesture input.
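
To give a sense of the underlying idea: the Kinematic Theory's Sigma-Lognormal model describes the velocity magnitude of a gesture as a sum of lognormal components, each parameterized by an amplitude D, an activation time t0, and log-time parameters mu and sigma; once such parameters are extracted from an articulation, the gesture's production time can be estimated from when the last component effectively ends. The sketch below illustrates only this general principle; it is not the authors' KeyTime implementation, and the parameter values and the 3-sigma end-of-component cutoff are assumptions for demonstration.

```python
# Illustrative sketch of Sigma-Lognormal timing, NOT the authors' KeyTime
# implementation. Each lognormal component models one ballistic stroke;
# parameters would normally be extracted from a recorded articulation.
import math
from dataclasses import dataclass
from typing import List

@dataclass
class LognormalComponent:
    D: float      # amplitude: distance covered by this component
    t0: float     # activation time of the motor command (seconds)
    mu: float     # log-time delay of the neuromuscular system
    sigma: float  # log-response time of the neuromuscular system

def velocity(components: List[LognormalComponent], t: float) -> float:
    """Velocity magnitude at time t: a sum of lognormal pulses."""
    v = 0.0
    for c in components:
        dt = t - c.t0
        if dt <= 0:
            continue  # this component has not started yet
        v += (c.D / (c.sigma * math.sqrt(2.0 * math.pi) * dt)
              * math.exp(-((math.log(dt) - c.mu) ** 2) / (2.0 * c.sigma ** 2)))
    return v

def production_time(components: List[LognormalComponent]) -> float:
    """Estimate production time as the end of the last component, taking
    each lognormal pulse to be effectively over at t0 + exp(mu + 3*sigma)
    (~99.9% of its area), a cutoff assumed for this sketch."""
    return max(c.t0 + math.exp(c.mu + 3.0 * c.sigma) for c in components)

# Hypothetical two-component gesture (synthetic parameter values).
gesture = [LognormalComponent(D=120.0, t0=0.05, mu=-1.6, sigma=0.25),
           LognormalComponent(D=80.0, t0=0.30, mu=-1.8, sigma=0.20)]
print(f"Predicted production time: {production_time(gesture):.3f} s")
```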

Supplementary Material

suppl.mov (pn2482-file5.mp4)
Supplemental video

References

[1]
Johnny Accot and Shumin Zhai. 1997. Beyond Fitts' law: Models for trajectory-based HCI tasks. In Proc. CHI '97.
[2]
Abdullah Almaksour, Eric Anquetil, Réjean Plamondon, and Christian O'Reilly. 2011. Synthetic handwritten gesture generation using Sigma-Lognormal model for evolving handwriting classifiers. In Proc. IGS '11.
[3]
Lisa Anthony, Radu-Daniel Vatavu, and Jacob O. Wobbrock. 2013. Understanding the consistency of users' pen and finger stroke gesture articulation. In Proc. GI '13.
[4]
Lisa Anthony and Jacob O. Wobbrock. 2010. A lightweight multistroke recognizer for user interface prototypes. In Proc. GI '10.
[5]
Lisa Anthony and Jacob O. Wobbrock. 2012. $N-Protractor: A fast and accurate multistroke recognizer. In Proc. GI '12.
[6]
Caroline Appert and Shumin Zhai. 2009. Using strokes as command shortcuts: Cognitive benefits and toolkit support. In Proc. CHI '09.
[7]
Xiaojun Bi, Yang Li, and Shumin Zhai. 2013. FFitts law: Modeling finger touch with Fitts' law. In Proc. CHI '13.
[8]
Rachel Blagojevic, Samuel Hsiao-Heng Chang, and Beryl Plimmer. 2010. The power of automatic feature selection: Rubine on steroids. In Proc. SBIM '10.
[9]
Xiang Cao and Shumin Zhai. 2007. Modeling human performance of pen stroke gestures. In Proc. CHI '07.
[10]
Stuart K. Card, Thomas P. Moran, and Allen Newell. 1980. The keystroke-level model for user performance time with interactive systems. Commun. ACM 23(1).
[11]
Steven J. Castellucci and I. Scott MacKenzie. 2008. Graffiti vs. Unistrokes: An empirical comparison. In Proc. CHI '08.
[12]
Moussa Djioua and Réjean Plamondon. 2009. Studying the variability of handwriting patterns using the Kinematic Theory. Hum. Mov. Sci. 28(5).
[13]
Paul M. Fitts. 1954. The information capacity of the human motor system in controlling the amplitude of movement. J. Exp. Psychol. 47(6).
[14]
Tamar Flash and Neville Hogan. 1985. The coordination of arm movements: an experimentally confirmed mathematical model. J. Neurosci. 5(7).
[15]
Javier Galbally, Réjean Plamondon, Julián Fierrez, and Javier Ortega-García. 2012. Synthetic on-line signature generation. Part II: Experimental validation. Pattern Recogn. 45(7).
[16]
Poika Isokoski. 2001. Model for unistroke writing time. In Proc. CHI '01.
[17]
Shaun K. Kane, Jeffrey P. Bigham, and Jacob O. Wobbrock. 2008. Slide rule: Making mobile touch screens accessible to blind people using multi-touch interaction techniques. In Proc. ASSETS '08.
[18]
Shaun K. Kane, Jacob O. Wobbrock, and Richard E. Ladner. 2011. Usable gestures for blind people: Understanding preference and performance. In Proc. CHI '11.
[19]
Per Ola Kristensson and Shumin Zhai. 2004. SHARK2: A large vocabulary shorthand writing system for pen-based computers. In Proc. UIST '04.
[20]
Luis A. Leiva. 2017. Large-scale user perception of synthetic stroke gestures. In Proc. DIS '17.
[21]
Luis A. Leiva, Vicent Alabau, Verónica Romero, Alejandro H. Toselli, and Enrique Vidal. 2014. Context-aware gestures for mixed-initiative text editing UIs. Interact. Comput. 27(1).
[22]
Luis A. Leiva, Daniel Martín-Albo, and Réjean Plamondon. 2016. Gestures à Go Go: Authoring synthetic human-like stroke gestures using the kinematic theory of rapid movements. ACM Trans. Intell. Syst. Technol. 7(2).
[23]
Luis A. Leiva, Daniel Martín-Albo, and Réjean Plamondon. 2017a. The Kinematic Theory produces human-like stroke gestures. Interact. Comput. 29(4).
[24]
Luis A. Leiva, Daniel Martín-Albo, and Radu-Daniel Vatavu. 2017b. Synthesizing stroke gestures across user populations: A case for users with visual impairments. In Proc. CHI '17.
[25]
A. Chris Long, Jr., James A. Landay, Lawrence A. Rowe, and Joseph Michiels. 2000. Visual similarity of pen gestures. In Proc. CHI '00.
[26]
Daniel Martín-Albo and Luis A. Leiva. 2016. G3: bootstrapping stroke gestures design with synthetic samples and built-in recognizers. In Proc. MobileHCI '16.
[27]
Daniel Martín-Albo, Réjean Plamondon, and Enrique Vidal. 2014. Training of on-line handwriting text recognizers with synthetic text generated using the Kinematic Theory of rapid human movements. In Proc. ICFHR '14.
[28]
Daniel Martín-Albo, Réjean Plamondon, and Enrique Vidal. 2015. Improving sigma-lognormal parameter extraction. In Proc. ICDAR '15.
[29]
Jörg Müller, Antti Oulasvirta, and Roderick Murray-Smith. 2017. Control theoretic models of pointing. ACM Trans. Comput.-Hum. Interact. 24(4).
[30]
Christian O'Reilly, Réjean Plamondon, Mohamed K. Landou, and Brigitte Stemmer. 2013. Using kinematic analysis of movement to predict the time occurrence of an evoked potential associated with a motor command. Eur. J. Neurosci. 37(2).
[31]
Réjean Plamondon. 1995a. A kinematic theory of rapid human movements. Part I: Movement representation and control. Biol. Cybern. 72(4).
[32]
Réjean Plamondon. 1995b. A kinematic theory of rapid human movements. Part II: Movement time and control. Biol. Cybern. 72(4).
[33]
Réjean Plamondon, Adel M. Alimi, Pierre Yergeau, and Franck Leclerc. 1993. Modelling velocity profiles of rapid movements: a comparative study. Biol. Cybern. 69(1).
[34]
Réjean Plamondon and Moussa Djioua. 2006. A multi-level representation paradigm for handwriting stroke generation. Hum. Mov. Sci. 25(4--5).
[35]
Réjean Plamondon, Chunhua Feng, and Anna Woch. 2003. A kinematic theory of rapid human movement. Part IV: A formal mathematical proof and new insights. Biol. Cybern. 89(2).
[36]
Philip Quinn and Shumin Zhai. 2016. Modeling gesture-typing movements. Hum.-Comput. Interact. Advance online publication.
[37]
Yosra Rekik, Radu-Daniel Vatavu, and Laurent Grisoni. 2014. Understanding users' perceived difficulty of multi-touch gesture articulation. In Proc. ICMI '14.
[38]
Quentin Roy, Sylvain Malacria, Yves Guiard, Eric Lecolinet, and James Eagan. 2013. Augmented letters: Mnemonic gesture-based shortcuts. In Proc. CHI '13.
[39]
Eugene M. Taranta II, Mehran Maghoumi, Corey R. Pittman, and Joseph J. LaViola Jr. 2016. A rapid prototyping approach to synthetic data generation for improved 2D gesture recognition. In Proc. UIST '16.
[40]
Eugene M. Taranta II, Amirreza Samiei, Mehran Maghoumi, Pooya Khaloo, Corey R. Pittman, and Joseph J. LaViola Jr. 2017. Jackknife: A reliable recognizer with few samples and many modalities. In Proc. CHI '17.
[41]
Huawei Tu, Xiangshi Ren, and Shumin Zhai. 2012. A comparative evaluation of finger and pen stroke gestures. In Proc. CHI '12.
[42]
John W. Tukey. 1977. Exploratory Data Analysis. Addison-Wesley Publishing Co.
[43]
Radu-Daniel Vatavu, Lisa Anthony, and Jacob O. Wobbrock. 2012. Gestures as point clouds: A $P recognizer for user interface prototypes. In Proc. ICMI '12.
[44]
Radu-Daniel Vatavu, Lisa Anthony, and Jacob O. Wobbrock. 2013. Relative accuracy measures for stroke gestures. In Proc. ICMI '13.
[45]
Radu-Daniel Vatavu, Lisa Anthony, and Jacob O. Wobbrock. 2014. Gesture heatmaps: Understanding gesture performance with colorful visualizations. In Proc. ICMI '14.
[46]
Radu-Daniel Vatavu, Gabriel Cramariuc, and Doina Maria Schipor. 2015. Touch interaction for children aged 3 to 6 years: Experimental findings and relationship to motor skills. Int. J. Hum.-Comput. Stud. 74.
[47]
Radu-Daniel Vatavu, Daniel Vogel, Géry Casiez, and Laurent Grisoni. 2011. Estimating the perceived difficulty of pen gestures. In Proc. INTERACT '11.
[48]
Paolo Viviani and Tamar Flash. 1995. Minimum-jerk, two-thirds power law, and isochrony: converging approaches to movement planning. J. Exp. Psychol. 21(1).
[49]
Paolo Viviani and Carlo Terzuolo. 1982. Trajectory determines movement dynamics. Neuroscience 7(2).
[50]
Rand Wilcox. 2012. Modern Statistics for the Social and Behavioral Sciences. Taylor & Francis Group, LLC, Boca Raton, FL, USA.
[51]
Jacob O. Wobbrock, Edward Cutrell, Susumu Harada, and I. Scott MacKenzie. 2008. An error model for pointing based on Fitts' law. In Proc. CHI '08.
[52]
Jacob O. Wobbrock, Andrew D. Wilson, and Yang Li. 2007. Gestures without libraries, toolkits or training: A $1 recognizer for user interface prototypes. In Proc. UIST '07.
[53]
Shumin Zhai, Per Ola Kristensson, Caroline Appert, Tue H. Andersen, and Xiang Cao. 2012. Foundational issues in touch-surface stroke gesture design: An integrative review. Found. Trends Hum.-Comput. Interact. 5(2).



Published In

CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
April 2018, 8489 pages
ISBN: 9781450356206
DOI: 10.1145/3173574

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. gesture synthesis
      2. human performance
      3. kinematic theory
      4. rapid prototyping
      5. stroke gestures
      6. touch gestures


      Funding Sources

      • UEFISCDI
      • NSERC-Canada


Acceptance Rates

CHI '18 paper acceptance rate: 666 of 2,590 submissions (26%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)
