DOI: 10.1145/2516540.2516563
research-article

Opportunistic synergy: a classifier fusion engine for micro-gesture recognition

Published: 28 October 2013

Abstract

In this paper, we present a novel opportunistic paradigm for in-vehicle gesture recognition. This paradigm allows two or more subsystems to be used in a synergistic manner: they can work in parallel, but the absence of any of them does not compromise the functioning of the whole system. In order to segment and recognize micro-gestures performed by the user on the steering wheel, we combine a wearable approach based on the electromyography of the user's forearm muscles with an environmental approach based on pressure sensors integrated directly into the steering wheel. We present and analyze several fusion methods and gesture segmentation strategies. A prototype has been developed and evaluated with data from nine subjects. The results show that the proposed opportunistic system performs as well as or better than each stand-alone subsystem while increasing the interaction possibilities.
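
To make the opportunistic fusion idea concrete, the sketch below shows a minimal score-level fusion routine in Python. It is an illustrative assumption, not the paper's implementation: the gesture labels, subsystem names ("emg", "pressure"), weights, and the fuse_scores helper are hypothetical, and the weighted-sum rule is only one of several possible fusion methods. Each available subsystem contributes per-class scores; a missing subsystem is simply skipped, so the loss of one sensor degrades accuracy but does not break recognition.

    # Minimal sketch of opportunistic score-level fusion (illustrative only).
    # Each subsystem (e.g., EMG armband, steering-wheel pressure sensors) exposes
    # per-class scores; unavailable subsystems are skipped so the fused decision
    # degrades gracefully instead of failing.
    from typing import Dict, Optional

    GESTURES = ["tap", "swipe", "squeeze", "no_gesture"]  # hypothetical gesture set

    def fuse_scores(subsystem_scores: Dict[str, Optional[Dict[str, float]]],
                    weights: Dict[str, float]) -> str:
        """Weighted-sum fusion over whichever subsystems produced scores."""
        fused = {g: 0.0 for g in GESTURES}
        total_weight = 0.0
        for name, scores in subsystem_scores.items():
            if scores is None:  # subsystem unavailable (e.g., armband not worn)
                continue
            w = weights.get(name, 1.0)
            total_weight += w
            for g in GESTURES:
                fused[g] += w * scores.get(g, 0.0)
        if total_weight == 0.0:  # no subsystem available at all
            return "no_gesture"
        return max(fused, key=fused.get)

    # Example: the EMG subsystem is missing, so the decision rests on pressure alone.
    decision = fuse_scores(
        subsystem_scores={
            "emg": None,
            "pressure": {"tap": 0.7, "swipe": 0.1, "squeeze": 0.1, "no_gesture": 0.1},
        },
        weights={"emg": 0.5, "pressure": 0.5},
    )
    print(decision)  # -> tap

With both subsystems present, the same routine combines their scores; with neither available, it falls back to reporting no gesture.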

      Published In

      AutomotiveUI '13: Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
      October 2013
      281 pages
      ISBN: 9781450324786
      DOI: 10.1145/2516540
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

      Sponsors

      • Eindhoven University of Technology, Department of Industrial Design

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 28 October 2013

      Author Tags

      1. electromyography
      2. environmental and wearable paradigms
      3. in-vehicle interaction
      4. micro-gestures
      5. tangible gestures
      6. ubiquitous computing

      Qualifiers

      • Research-article

      Conference

      AutomotiveUI '13
      Sponsor:
      • Eindhoven University of Technology Department of Industrial Design

      Acceptance Rates

      AutomotiveUI '13 Paper Acceptance Rate: 41 of 67 submissions, 61%
      Overall Acceptance Rate: 248 of 566 submissions, 44%

      Article Metrics

      • Downloads (Last 12 months): 9
      • Downloads (Last 6 weeks): 0
      Reflects downloads up to 12 Dec 2024

      Cited By

      • (2024) From Smart Buildings to Smart Vehicles: Mobile User Interfaces for Multi-Environment Interactions. 2024 International Conference on Development and Application Systems (DAS), pp. 152-155. DOI: 10.1109/DAS61944.2024.10541208. Online publication date: 23-May-2024.
      • (2022) A Design Space for Human Sensor and Actuator Focused In-Vehicle Interaction Based on a Systematic Literature Review. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(2), pp. 1-51. DOI: 10.1145/3534617. Online publication date: 7-Jul-2022.
      • (2021) Smart Vehicle Proxemics: A Conceptual Framework Operationalizing Proxemics in the Context of Outside-the-Vehicle Interactions. Human-Computer Interaction – INTERACT 2021, pp. 150-171. DOI: 10.1007/978-3-030-85616-8_11. Online publication date: 26-Aug-2021.
      • (2020) A Synopsis of Input Modalities for In-Vehicle Infotainment and Consumption of Interactive Media. Proceedings of the 2020 ACM International Conference on Interactive Media Experiences, pp. 195-199. DOI: 10.1145/3391614.3399400. Online publication date: 17-Jun-2020.
      • (2020) User interface for in-vehicle systems with on-wheel finger spreading gestures and head-up displays. Journal of Computational Design and Engineering. DOI: 10.1093/jcde/qwaa052. Online publication date: 19-Jun-2020.
      • (2020) A multistudy investigation of drivers and passengers' gesture and voice input preferences for in-vehicle interactions. Journal of Intelligent Transportation Systems, pp. 1-24. DOI: 10.1080/15472450.2020.1846127. Online publication date: 18-Nov-2020.
      • (2019) A Fast Automatic Holoscopic 3D Micro-gesture Recognition System for Immersive Applications. Advances in Natural Computation, Fuzzy Systems and Knowledge Discovery, pp. 696-703. DOI: 10.1007/978-3-030-32591-6_74. Online publication date: 7-Nov-2019.
      • (2017) Eliminating Driving Distractions: Human-Computer Interaction with Built-In Applications. IEEE Vehicular Technology Magazine, 12(1), pp. 20-29. DOI: 10.1109/MVT.2016.2625331. Online publication date: Mar-2017.
      • (2016) A comparison of three interaction modalities in the car. Actes de la 28ième conference francophone sur l'Interaction Homme-Machine, pp. 188-196. DOI: 10.1145/3004107.3004118. Online publication date: 25-Oct-2016.
      • (2015) Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems. Machines, 3(3), pp. 173-207. DOI: 10.3390/machines3030173. Online publication date: 18-Aug-2015.
