Research article
DOI: 10.1145/2774225.2774846

Kinect Analysis: a system for recording, analysing and sharing multimodal interaction elicitation studies

Published: 23 June 2015

Abstract

Recently, guessability studies have become a popular means among researchers to elicit user-defined interaction sets involving gesture, speech and multimodal input. However, tool support for capturing and analysing interaction proposals is lacking, and the method itself is still evolving. This paper presents Kinect Analysis, a system designed for interaction elicitation studies with support for record-and-replay, visualisation and analysis based on Kinect's depth, audio and video streams. Kinect Analysis enables post-hoc analysis during playback and live analysis with real-time feedback while recording. In particular, new visualisations such as skeletal joint traces and heatmaps can be superimposed for analysis and comparison of multiple recordings. It also introduces KinectScript, a simple scripting language to query recordings and automate analysis tasks based on skeleton, distance, audio and gesture scripts. The paper discusses Kinect Analysis both as a tool and a method that could enable researchers to more easily collect, study and share interaction proposals. Using data from a previous guessability study with 25 users, we show that Kinect Analysis in combination with KinectScript is useful and effective for a range of analysis tasks.
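The abstract describes KinectScript only at a high level, and its concrete syntax is not shown on this page. As a rough sketch of the kind of skeleton and distance queries it enables, the following Python snippet flags moments in a recorded skeleton stream where a query condition holds. Everything here is an assumption for illustration: the JSON recording format, the file name session01.json, and the joint names (HandLeft, HandRight, Head) merely echo common Kinect SDK conventions and are not taken from the paper.

```python
import json
import math

# Assumed (not from the paper): a recording is a JSON list of frames,
# each with a timestamp and a map from joint names to [x, y, z] positions
# in metres, with the y-axis pointing up as in the Kinect SDK.
# Example frame: {"time": 1.23, "joints": {"HandRight": [0.1, 0.9, 2.0], ...}}

def load_recording(path):
    """Load a recorded skeleton stream (hypothetical JSON format)."""
    with open(path) as f:
        return json.load(f)

def joint_distance(frame, joint_a, joint_b):
    """Euclidean distance between two joints in a single frame."""
    return math.dist(frame["joints"][joint_a], frame["joints"][joint_b])

def query(frames, predicate):
    """Return timestamps of all frames matching a predicate,
    mimicking a KinectScript-style query over a recording."""
    return [f["time"] for f in frames if predicate(f)]

if __name__ == "__main__":
    frames = load_recording("session01.json")  # hypothetical file name
    # Distance-style query: both hands within 10 cm of each other,
    # e.g. to find two-handed "clap" or "pinch" proposals.
    together = query(frames, lambda f: joint_distance(f, "HandLeft", "HandRight") < 0.1)
    print("hands together at t =", together)
    # Skeleton-style query: right hand raised above the head,
    # a common condition when segmenting mid-air gesture proposals.
    raised = query(frames, lambda f: f["joints"]["HandRight"][1] > f["joints"]["Head"][1])
    print("right hand above head at t =", raised)
```

The same pattern would presumably extend to the audio and gesture scripts the abstract mentions, with predicates over audio levels or recognised gesture events instead of joint positions.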




Published In

EICS '15: Proceedings of the 7th ACM SIGCHI Symposium on Engineering Interactive Computing Systems
June 2015, 316 pages
ISBN: 9781450336468
DOI: 10.1145/2774225
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. kinect analysis
      2. multimodal interaction recording
      3. tools for guessability studies
      4. visualisation and analysis

      Qualifiers

      • Research-article

      Funding Sources

      • Swiss NSF Advanced Postdoc.Mobility grant

      Conference

EICS '15

      Acceptance Rates

EICS '15 Paper Acceptance Rate: 19 of 64 submissions, 30%
Overall Acceptance Rate: 73 of 299 submissions, 24%


      Article Metrics

• Downloads (last 12 months): 20
• Downloads (last 6 weeks): 3
Reflects downloads up to 31 Dec 2024.


      Cited By

• (2024) Investigating the Gap: Gaze and Movement Analysis in Immersive Environments. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3653522. Online: 4 Jun 2024.
• (2024) PoseCoach: A Customizable Analysis and Visualization System for Video-Based Running Coaching. IEEE Transactions on Visualization and Computer Graphics 30(7), 3180-3195. DOI: 10.1109/TVCG.2022.3230855. Online: Jul 2024.
• (2024) Exploring Methods to Optimize Gesture Elicitation Studies: A Systematic Literature Review. IEEE Access 12, 64958-64979. DOI: 10.1109/ACCESS.2024.3387269. Online: 2024.
• (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. DOI: 10.1145/3636458. Online: 7 Dec 2023.
• (2023) Gesture-Based Interaction. Handbook of Human Computer Interaction, 1-47. DOI: 10.1007/978-3-319-27648-9_20-1. Online: 9 Feb 2023.
• (2022) AnyGesture: Arbitrary One-Handed Gestures for Augmented, Virtual, and Mixed Reality Applications. Applied Sciences 12(4), 1888. DOI: 10.3390/app12041888. Online: 11 Feb 2022.
• (2022) The Gesture Authoring Space: Authoring Customised Hand Gestures for Grasping Virtual Objects in Immersive Virtual Environments. Proceedings of Mensch und Computer 2022, 85-95. DOI: 10.1145/3543758.3543766. Online: 4 Sep 2022.
• (2022) GearWheels: A Software Tool to Support User Experiments on Gesture Input with Wearable Devices. International Journal of Human–Computer Interaction 39(18), 3527-3545. DOI: 10.1080/10447318.2022.2098907. Online: 22 Jul 2022.
• (2021) GestureMap: Supporting Visual Analytics and Quantitative Analysis of Motion Elicitation Data by Learning 2D Embeddings. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3411764.3445765. Online: 6 May 2021.
• (2021) MIRIA: A Mixed Reality Toolkit for the In-Situ Visualization and Analysis of Spatio-Temporal Interaction Data. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3411764.3445651. Online: 6 May 2021.
