DOI: 10.1145/3491102.3517666 — CHI Conference Proceedings
Research article · Open access

Enabling Tangible Interaction on Non-touch Displays with Optical Mouse Sensor and Visible Light Communication

Published: 28 April 2022

Abstract

This paper presents Centaur, an input system that enables tangible interaction on displays, e.g., non-touch computer monitors. Centaur’s tangibles are built from low-cost optical mouse sensors, or can alternatively be emulated by commodity optical mice already at hand. They are trackable when placed on the display, providing a real-time, high-precision tangible interface. Even on ordinary personal computers, enabling Centaur requires no new hardware and imposes no installation burden. Centaur’s cost-effectiveness and wide availability open up new opportunities for tangible user interface (TUI) users and practitioners. Centaur’s key innovation lies in its tracking method: it embeds high-frequency light signals into different portions of the display content as location beacons. When the tangibles are placed on the screen, they sense these light signals with their optical mouse sensors and thus determine their locations. We develop four applications to showcase the potential uses of Centaur.
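The beacon-based tracking idea can be illustrated with a small sketch (not the authors' code). Assumptions, all hypothetical: each screen region modulates its content at a distinct blink frequency, the tangible's mouse sensor yields an average-brightness trace at a few kHz, and the region is recovered by finding the dominant frequency in that trace. The region names, frequencies, and sample rate below are made up for illustration.

```python
# Hypothetical sketch of Centaur-style location-beacon decoding.
# Assumption: each screen region blinks at a distinct frequency; the
# tangible samples brightness and recovers the region via the spectrum.
import numpy as np

# Hypothetical beacon table: region id -> blink frequency in Hz.
BEACONS = {"top-left": 300.0, "top-right": 420.0,
           "bottom-left": 540.0, "bottom-right": 660.0}

SAMPLE_RATE = 4000.0  # optical mouse sensors report frames at kHz rates

def simulate_brightness(freq, n=1024, noise=0.05, seed=0):
    """Synthesize a noisy brightness trace over a region blinking at freq."""
    rng = np.random.default_rng(seed)
    t = np.arange(n) / SAMPLE_RATE
    square = 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * freq * t))
    return square + noise * rng.standard_normal(n)

def decode_region(samples):
    """Return the beacon region whose frequency dominates the spectrum."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))  # drop DC first
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    peak = freqs[np.argmax(spectrum)]
    # Map the measured peak to the nearest known beacon frequency.
    return min(BEACONS, key=lambda region: abs(BEACONS[region] - peak))

region = decode_region(simulate_brightness(BEACONS["top-right"]))
print(region)  # → top-right
```

The real system must additionally handle display refresh-rate constraints, sensor placement within a region, and imperceptibility of the embedded signals, none of which this sketch models.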

Supplementary Material

MP4 File (3491102.3517666-video-figure.mp4)
Video Figure
MP4 File (3491102.3517666-video-preview.mp4)
Video Preview


Cited By

  • (2024) ArgusEyes: Interactions by Combining Multiple Modules with Optical Flow Sensors. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–5. https://doi.org/10.1145/3613905.3648663. Online publication date: 11 May 2024.



Published In

CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
April 2022
10459 pages
ISBN:9781450391573
DOI:10.1145/3491102
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Optical Mouse
  2. Tabletop
  3. Tangible
  4. Visible Light Communication

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

CHI '22
Sponsor:
CHI '22: CHI Conference on Human Factors in Computing Systems
April 29 - May 5, 2022
New Orleans, LA, USA

Acceptance Rates

Overall Acceptance Rate 6,199 of 26,314 submissions, 24%

Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 267
  • Downloads (last 6 weeks): 27

Reflects downloads up to 06 Jan 2025

