DOI: 10.1145/3430524.3442447
Work in Progress

inDepth: Force-based Interaction with Objects beyond A Physical Barrier

Published: 14 February 2021

Abstract

We propose inDepth, a novel system that enables force-based interaction with objects beyond a physical barrier by using scalable force sensor modules. inDepth transforms a physical barrier (e.g., a glass showcase or a 3D display) into a tangible input interface that lets users interact with out-of-reach objects by applying finger pressure to the barrier’s surface. To achieve this interaction, our system tracks the applied force as a directional vector using three force sensors installed underneath the barrier. Meanwhile, our force-to-depth conversion algorithm translates force intensity into a spatial position along that direction beyond the barrier. Finally, the system executes various operations on objects at that position, depending on the application. In this paper, we introduce the inDepth concept and its design space. We also demonstrate example applications, including selecting items in showcases and manipulating 3D rendered models.
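The pipeline described in the abstract — three load sensors under the barrier, a contact estimate, and a force-to-depth conversion — can be sketched as follows. Note this is an illustrative sketch only: the sensor layout, the load-weighted-centroid contact estimate, and the linear force-to-depth mapping are assumptions for demonstration, not the paper's actual implementation.

```python
import numpy as np

# Hypothetical geometry: three load cells under the barrier, at the
# corners of a triangle (positions in meters; values are illustrative).
SENSOR_POSITIONS = np.array([
    [0.0, 0.0],
    [0.4, 0.0],
    [0.2, 0.35],
])

def contact_point(readings):
    """Estimate the 2D contact point on the barrier as the
    load-weighted centroid of the sensor positions."""
    w = np.asarray(readings, dtype=float)
    return (w[:, None] * SENSOR_POSITIONS).sum(axis=0) / w.sum()

def force_to_depth(readings, k=0.05, max_depth=0.5):
    """Map total applied force to a depth beyond the barrier.
    A simple linear mapping with a depth cap; the paper's actual
    conversion function may differ."""
    total = float(np.sum(readings))
    return min(k * total, max_depth)

def target_position(readings):
    """Combine contact point and depth into a 3D target position,
    taking the barrier plane as z = 0 with +z pointing beyond it."""
    x, y = contact_point(readings)
    return np.array([x, y, force_to_depth(readings)])
```

Pressing harder at the same spot moves the target deeper along the pressing direction, which is the core of the interaction: `target_position([2.0, 2.0, 2.0])` yields a point 0.3 m beyond the barrier at the triangle's centroid under this toy mapping.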

Supplementary Material

MP4 File (3430524.3442447.mp4)
Supplementary video




Published In

    TEI '21: Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction
    February 2021
    908 pages
    ISBN:9781450382137
    DOI:10.1145/3430524
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 14 February 2021


    Author Tags

1. force sensing
    2. touch input
    3. user interface
    4. tangible interaction

    Qualifiers

    • Work in progress
    • Research
    • Refereed limited

    Conference

    TEI '21

    Acceptance Rates

TEI '21 Paper Acceptance Rate: 40 of 136 submissions, 29%
    Overall Acceptance Rate: 393 of 1,367 submissions, 29%


Cited By

    • (2024) Interaction with Physical Referents In and Around the Field of View of Optical See-Through Augmented-Reality Headsets. Proceedings of the 35th Conference on l'Interaction Humain-Machine, 10.1145/3649792.3649796, 1–15. Online publication date: 25-Mar-2024.
    • (2024) DeformIO: Dynamic Stiffness Control on a Deformable Force-sensing Display. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 10.1145/3613905.3650772, 1–8. Online publication date: 11-May-2024.
    • (2023) ShadowTouch: Enabling Free-Form Touch-Based Hand-to-Surface Interaction with Wrist-Mounted Illuminant by Shadow Projection. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3586183.3606785, 1–14. Online publication date: 29-Oct-2023.
    • (2022) Freehand Target Acquisition With a Force-Assisted Device. IEEE Sensors Journal, 10.1109/JSEN.2021.3137128, 22(3), 1972–1979. Online publication date: 1-Feb-2022.