DOI: 10.1007/978-3-030-29387-1_15
Article

TouchGlass: Raycasting from a Glass Surface to Point at Physical Objects in Public Exhibits

Published: 02 September 2019

Abstract

Physical objects such as natural items or fine art pieces are often placed in glass cases to protect them from dust and damage. Interaction with such objects is generally indirect, relying for example on an adjacent touch interface that draws the user’s attention away from the object. In this paper, we explore whether the glass case itself can serve as an input surface for pointing at and selecting the distant physical objects it encloses. With this approach, the glass case also acts as a physical delimiter for interaction, helping to avoid unintended activations. We explore this idea in two steps. First, we carry out an informative study with 46 participants to identify the most appropriate “walk-up-and-use” technique. Our results show that casting a ray orthogonal to the glass surface is the most natural approach in a public setting. We then examine this orthogonal raycasting technique further and conduct a target acquisition experiment evaluating how target size, target distance, the presence of spatial references, and the user’s head position with respect to the glass case affect selection performance. The results reveal that using the glass as a touch surface allows users to easily select targets as small as 3 cm located up to 35 cm behind the glass. From these results, we provide a set of guidelines for designing interactive exhibits that use a touch-sensitive glass case.
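The abstract only outlines the orthogonal raycasting principle. The sketch below is a hypothetical illustration of how such a hit test could be implemented; it is not the authors’ code. The Target class, the centimetre coordinate frame, and the spherical selection volumes are assumptions introduced for the example.

```python
from dataclasses import dataclass
from math import hypot
from typing import Optional, Sequence

@dataclass
class Target:
    """A physical object behind the glass, approximated by a sphere.

    Coordinates are in centimetres in the glass-case frame (an assumption):
    x/y lie on the glass plane, z is the distance behind the glass (z = 0
    is the glass surface itself).
    """
    name: str
    x: float
    y: float
    z: float       # distance behind the glass, e.g. up to ~35 cm
    radius: float  # selection radius, e.g. 1.5 cm for a 3 cm target

def select_target(touch_x: float, touch_y: float,
                  targets: Sequence[Target]) -> Optional[Target]:
    """Orthogonal raycasting: cast a ray from the touch point, perpendicular
    to the glass, and return the hit target closest to the glass.

    Because the ray direction coincides with the glass normal, the 3D
    ray-sphere intersection collapses to a 2D distance check between the
    touch point and the target's projection onto the glass plane.
    """
    hits = [t for t in targets
            if hypot(t.x - touch_x, t.y - touch_y) <= t.radius]
    return min(hits, key=lambda t: t.z) if hits else None

if __name__ == "__main__":
    exhibit = [Target("fossil", 10.0, 20.0, 15.0, 1.5),
               Target("mineral", 40.0, 22.0, 35.0, 1.5)]
    print(select_target(10.5, 19.2, exhibit))  # selects "fossil"
    print(select_target(25.0, 25.0, exhibit))  # no target hit -> None
```

Reducing the selection to a 2D check on the glass plane is what makes this kind of technique a plausible “walk-up and use” interaction: only the touch position on the case is needed, not a tracked hand or head pose.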


Cited By

  • (2024) Interaction with Physical Referents In and Around the Field of View of Optical See-Through Augmented-Reality Headsets. In: Proceedings of the 35th Conference on l'Interaction Humain-Machine, pp. 1–15. https://doi.org/10.1145/3649792.3649796. Online publication date: 25 March 2024.
  • (2021) KeyTch: Combining the Keyboard with a Touchscreen for Rapid Command Selection on Toolbars. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–13. https://doi.org/10.1145/3411764.3445288. Online publication date: 6 May 2021.


    Published In

    Human-Computer Interaction – INTERACT 2019: 17th IFIP TC 13 International Conference, Paphos, Cyprus, September 2–6, 2019, Proceedings, Part III
    Sep 2019
    794 pages
    ISBN:978-3-030-29386-4
    DOI:10.1007/978-3-030-29387-1

    Publisher

    Springer-Verlag

    Berlin, Heidelberg

    Publication History

    Published: 02 September 2019

    Author Tags

    1. Touch input
    2. Distant pointing
    3. Transparent touch surface
    4. Absolute pointing
    5. Evaluation
    6. Physical objects

    Qualifiers

    • Article
