DOI: 10.1145/2807442.2807470

Tactile Animation by Direct Manipulation of Grid Displays

Published: 05 November 2015

Abstract

Chairs, wearables, and handhelds have become popular sites for spatial tactile display. Visual animators, already expert in using time and space to portray motion, could readily transfer their skills to produce rich haptic sensations if given the right tools. We introduce the tactile animation object, a directly manipulated phantom tactile sensation. This abstraction has two key benefits: 1) efficient, creative, iterative control of spatiotemporal sensations, and 2) the potential to support a variety of tactile grids, including sparse displays. We present Mango, an editing tool for animators, including its rendering pipeline and perceptually-optimized interpolation algorithm for sparse vibrotactile grids. In our evaluation, professional animators found it easy to create a variety of vibrotactile patterns, with both experts and novices preferring the tactile animation object over controlling actuators individually.
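The tactile animation object described above rests on the phantom (funneled) tactile sensation: driving two real actuators at coordinated amplitudes so a single vibration is felt between them. The abstract does not specify Mango's perceptually-optimized interpolation, so the sketch below uses the classic energy-summation model from prior tactile-grid work (e.g., Israr and Poupyrev's Tactile Brush) as an illustrative assumption; the function and parameter names are hypothetical, not from the paper.

```python
import math

def phantom_amplitudes(intensity: float, beta: float) -> tuple[float, float]:
    """Split a target intensity between two adjacent actuators so that a
    single "phantom" vibration is felt at normalized position beta in [0, 1].

    Assumes the energy-summation model: perceived intensity is
    sqrt(a1**2 + a2**2), so this split preserves total vibration energy.
    Names are illustrative, not taken from the paper.
    """
    if not 0.0 <= beta <= 1.0:
        raise ValueError("beta must lie in [0, 1]")
    a1 = intensity * math.sqrt(1.0 - beta)  # actuator nearer beta = 0
    a2 = intensity * math.sqrt(beta)        # actuator nearer beta = 1
    return a1, a2

# Sweeping beta from 0 to 1 over time moves the phantom smoothly between
# the two physical actuators -- the kind of continuous control a directly
# manipulated tactile animation object exposes on a sparse grid.
```

Under this model, dragging the animation object reduces to recomputing the per-actuator amplitudes each frame, which is why a sparse grid can still render continuous motion.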

Supplementary Material

  • Supplemental video: suppl.mov (uist2440.mp4)
  • MP4 File: p21.mp4




Published In

UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
November 2015
686 pages
ISBN: 9781450337793
DOI: 10.1145/2807442
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. animation
  2. design
  3. haptics
  4. vibrotactile

Qualifiers

  • Research-article

Funding Sources

  • NSERC
  • Disney Research

Conference

UIST '15

Acceptance Rates

UIST '15 paper acceptance rate: 70 of 297 submissions (24%)
Overall acceptance rate: 561 of 2,567 submissions (22%)


Article Metrics

  • Downloads (last 12 months): 140
  • Downloads (last 6 weeks): 17

Reflects downloads up to 17 Dec 2024.


Cited By

  • (2024) Visual and Haptic Guidance for Enhancing Target Search Performance in Dual-Task Settings. Applied Sciences 14(11), 4650. DOI: 10.3390/app14114650. Online publication date: 28-May-2024.
  • (2024) HapticPilot. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1-28. DOI: 10.1145/3631453. Online publication date: 12-Jan-2024.
  • (2024) AdapTics: A Toolkit for Creative Design and Integration of Real-Time Adaptive Mid-Air Ultrasound Tactons. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3613904.3642090. Online publication date: 11-May-2024.
  • (2024) Phantom Illusion Based Vibrotactile Rendering of Affective Touch Patterns. IEEE Transactions on Haptics 17(2), 202-215. DOI: 10.1109/TOH.2023.3315964. Online publication date: 1-Apr-2024.
  • (2024) RecHap: An Interactive Recommender System for Navigating a Large Number of Mid-Air Haptic Designs. IEEE Transactions on Haptics 17(2), 165-176. DOI: 10.1109/TOH.2023.3276812. Online publication date: 1-Apr-2024.
  • (2024) CoTacs: A Haptic Toolkit to Explore Effective On-Body Haptic Feedback by Ideating, Designing, Evaluating and Refining Haptic Designs Using Group Collaboration. International Journal of Human-Computer Interaction, 1-21. DOI: 10.1080/10447318.2024.2358460. Online publication date: 7-Jun-2024.
  • (2024) HapMotion: motion-to-tactile framework with wearable haptic devices for immersive VR performance experience. Virtual Reality 28(1). DOI: 10.1007/s10055-023-00910-z. Online publication date: 9-Jan-2024.
  • (2023) TactTongue: Prototyping ElectroTactile Stimulations on the Tongue. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-14. DOI: 10.1145/3586183.3606829. Online publication date: 29-Oct-2023.
  • (2023) Designing Interactive Shoes for Tactile Augmented Reality. Proceedings of the Augmented Humans International Conference 2023, 1-14. DOI: 10.1145/3582700.3582728. Online publication date: 12-Mar-2023.
  • (2023) TOUCHLESS: Demonstrations of Contactless Haptics for Affective Touch. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-5. DOI: 10.1145/3544549.3583913. Online publication date: 19-Apr-2023.
