DOI: 10.1145/3379503.3403531
MobileHCI Conference Proceedings · research-article

Active PinScreen: Exploring Spatio-Temporal Tactile Feedback for Multi-Finger Interaction

Published: 05 October 2020

Abstract

Multiple fingers are often used for efficient interaction with handheld computing devices, yet any tactile feedback currently provided is felt on the finger pad or the palm with only coarse granularity. In contrast, we present a new tactile feedback technique, Active PinScreen, that applies localised stimuli to multiple fingers with fine spatial and temporal resolution. The tactile screen uses an array of solenoid-actuated magnetic pins with a millimetre-scale form factor that could be deployed for back-of-device handheld use without instrumenting the user. Alongside a detailed description of the prototype, we present potential design configurations and applications of Active PinScreen and evaluate the human factors of tactile interaction with multiple fingers in a controlled user evaluation. The results of our study show a high recognition rate for directional and patterned stimulation across different grip orientations, both within and between fingers. We end the paper with a discussion of our main findings, limitations of the current design and directions for future work.


Cited By

  • (2024) STButton: Exploring Opportunities for Buttons with Spatio-Temporal Tactile Output. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–5. DOI: 10.1145/3613905.3648671. Online publication date: 11-May-2024.
  • (2022) Ultrasound Mid-Air Haptic Feedback at the Fingertip. In Ultrasound Mid-Air Haptics for Touchless Interfaces, 299–311. DOI: 10.1007/978-3-031-04043-6_13. Online publication date: 17-Sep-2022.
  • (2021) Intuitive Spatial Tactile Feedback for Better Awareness about Robot Trajectory during Human–Robot Collaboration. Sensors 21(17), 5748. DOI: 10.3390/s21175748. Online publication date: 26-Aug-2021.

Information

Published In

MobileHCI '20: 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services
October 2020
418 pages
ISBN: 9781450375160
DOI: 10.1145/3379503

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Haptic
      2. Mobile Interaction
      3. Tactile Interfaces

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

      • EPSRC

Conference

MobileHCI '20

Acceptance Rates

Overall Acceptance Rate: 202 of 906 submissions, 22%

