YouTouch! Low-Cost User Identification at an Interactive Display Wall

Published: 07 June 2016

Abstract

We present YouTouch!, a system that tracks users in front of an interactive display wall and associates touches with users. With their large size, display walls are inherently suitable for multi-user interaction. However, current touch recognition technology does not distinguish between users, making it hard to provide personalized user interfaces or access to private data. In our system, we place a commodity RGB + depth camera in front of the wall, allowing us to track users and correlate them with touch events. While the camera's driver is able to track people, it loses the user's ID whenever she is occluded or leaves the scene. In these cases, we re-identify the person by means of a descriptor composed of color histograms of body parts and skeleton-based biometric measurements. Additional processing reliably handles short-term occlusion as well as assignment of touches to occluded users. YouTouch! requires neither user instrumentation nor custom hardware, and there is no registration or learning phase. Our system was thoroughly tested with data sets comprising 81 people, demonstrating its ability to re-identify users and correlate them to touches even under adverse conditions.
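
As a concrete illustration of the re-identification step described above, the following Python sketch shows one plausible shape of such a descriptor (per-body-part color histograms plus a vector of skeleton-based limb lengths) and a nearest-neighbor matching step. The body-part partition, histogram binning, distance measure, weights, and threshold are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of descriptor-based re-identification: per-body-part
# color histograms combined with skeleton-based biometric measurements.
# Part names, binning, distance measure, weights and threshold are assumptions.
import numpy as np

BODY_PARTS = ["torso", "upper_arms", "lower_arms", "upper_legs", "lower_legs"]  # assumed partition


def color_histogram(pixels_hsv, bins=16):
    # Normalized hue/saturation histogram of the pixels belonging to one body part.
    hist, _ = np.histogramdd(pixels_hsv[:, :2], bins=(bins, bins),
                             range=((0, 180), (0, 256)))
    total = hist.sum()
    return (hist / total).ravel() if total > 0 else hist.ravel()


def build_descriptor(part_pixels, limb_lengths):
    # Descriptor = one histogram per body part + skeleton limb lengths (e.g. in metres).
    return {
        "histograms": {p: color_histogram(part_pixels[p]) for p in BODY_PARTS},
        "biometrics": np.asarray(limb_lengths, dtype=float),
    }


def descriptor_distance(a, b, w_color=0.7, w_bio=0.3):
    # Weighted sum of the mean per-part histogram-intersection distance and the
    # Euclidean distance between the biometric vectors (weights are assumptions).
    color_d = np.mean([1.0 - np.minimum(a["histograms"][p], b["histograms"][p]).sum()
                       for p in BODY_PARTS])
    bio_d = np.linalg.norm(a["biometrics"] - b["biometrics"])
    return w_color * color_d + w_bio * bio_d


def reidentify(query, known_users, threshold=0.4):
    # Return the ID of the closest stored descriptor, or None if nobody is close
    # enough (i.e. the person is treated as new).
    if not known_users:
        return None
    best_id, best_d = min(((uid, descriptor_distance(query, desc))
                           for uid, desc in known_users.items()),
                          key=lambda t: t[1])
    return best_id if best_d <= threshold else None

In the running system, such a match would only need to be attempted when the camera driver loses a person's ID, i.e. after occlusion or after the person leaves and re-enters the scene.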



    Published In

    AVI '16: Proceedings of the International Working Conference on Advanced Visual Interfaces
    June 2016
    400 pages
    ISBN:9781450341318
    DOI:10.1145/2909132
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 07 June 2016

    Author Tags

    1. Display wall
    2. RGBD sensor
    3. interactive surface
    4. multi-user interaction
    5. multitouch
    6. re-identification
    7. user identification

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    AVI '16

    Acceptance Rates

    AVI '16 paper acceptance rate: 20 of 96 submissions (21%)
    Overall acceptance rate: 128 of 490 submissions (26%)



    Article Metrics

    • Downloads (last 12 months): 26
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 17 Dec 2024

    Cited By
    • (2024) Supporting Mixed-Presence Awareness across Wall-Sized Displays Using a Tracking Pipeline based on Depth Cameras. Proceedings of the ACM on Human-Computer Interaction 8(EICS), 1-32. DOI: 10.1145/3664634. Online publication date: 17-Jun-2024
    • (2024) Interactive Visualization on Large High-Resolution Displays: A Survey. Computer Graphics Forum 43(6). DOI: 10.1111/cgf.15001. Online publication date: 30-Apr-2024
    • (2023) Side-by-Side vs Face-to-Face: Evaluating Colocated Collaboration via a Transparent Wall-sized Display. Proceedings of the ACM on Human-Computer Interaction 7(CSCW1), 1-29. DOI: 10.1145/3579623. Online publication date: 16-Apr-2023
    • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6(10), 94. DOI: 10.3390/mti6100094. Online publication date: 19-Oct-2022
    • (2020) Dynamic layout optimization for multi-user interaction with a large display. Proceedings of the 25th International Conference on Intelligent User Interfaces, 401-409. DOI: 10.1145/3377325.3377481. Online publication date: 17-Mar-2020
    • (2020) Short-Contact Touch-Manipulation of Scatterplot Matrices on Wall Displays. Computer Graphics Forum 39(3), 265-276. DOI: 10.1111/cgf.13979. Online publication date: 18-Jul-2020
    • (2019) Effect of age on use of interactive exhibits in a museum context. Proceedings of the 13th Biannual Conference of the Italian SIGCHI Chapter: Designing the next interaction, 1-11. DOI: 10.1145/3351995.3352047. Online publication date: 23-Sep-2019
    • (2018) When David Meets Goliath. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3173574.3173593. Online publication date: 21-Apr-2018
    • (2018) Interaction for Immersive Analytics. Immersive Analytics, 95-138. DOI: 10.1007/978-3-030-01388-2_4. Online publication date: 16-Oct-2018
    • (2017) GIAnT. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2639-2647. DOI: 10.1145/3025453.3026006. Online publication date: 2-May-2017