
User-defined gestures for mediated social touch on touchscreens

  • Original Paper
  • Published:
Personal and Ubiquitous Computing

A Correction to this article was published on 21 March 2022

This article has been updated

Abstract

Mediated social touch is a new form of remote communication, and researchers have designed prototypes that deliver mediated social touch on mobile devices. However, a comprehensive analysis of user-defined gestures for mediated social touch on the touchscreens of mobile devices is still lacking. We conducted an elicitation study of 24 social touch gestures on smartphone touchscreens and recorded the physical parameters of the elicited gestures. We developed a user-defined gesture set that takes physical properties and context into account, and we classified the gestures by their movement forms. We found that social touch gestures with shorter durations were easier for participants to perform, and participants tended to use the social touches associated with easier gestures more often. Participants were also more likely to express happy or sad emotions than neutral ones. Based on these findings, we discuss implications for mediated social touch technology and its applications on touchscreens.
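
The elicitation methodology summarized above typically reports, for each referent (e.g., a social touch such as "hug"), how strongly participants converge on the same gesture. As an illustration only, and not the authors' code or data, the sketch below computes the agreement rate commonly used in gesture elicitation studies (Vatavu and Wobbrock's formulation); the referent and proposal labels are hypothetical.

    from collections import Counter

    def agreement_rate(proposals):
        # Agreement rate AR(r) for one referent: sum over groups of identical
        # proposals P_i of |P_i| * (|P_i| - 1), divided by |P| * (|P| - 1),
        # where P is the multiset of all proposals elicited for the referent.
        n = len(proposals)
        if n < 2:
            return 0.0
        groups = Counter(proposals)
        return sum(c * (c - 1) for c in groups.values()) / (n * (n - 1))

    # Hypothetical example: 10 participants propose gestures for the referent
    # "hug"; 6 choose a two-finger press, 3 a long stroke, 1 a single tap.
    print(agreement_rate(["press"] * 6 + ["stroke"] * 3 + ["tap"]))  # 0.4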





Author information


Corresponding author

Correspondence to Qianhui Wei.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original online version of this article was revised because the descriptions of Fig. 6 and Fig. 7 had been inadvertently exchanged.


About this article


Cite this article

Wei, Q., Hu, J. & Li, M. User-defined gestures for mediated social touch on touchscreens. Pers Ubiquit Comput 27, 271–286 (2023). https://doi.org/10.1007/s00779-021-01663-9


  • Received:

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s00779-021-01663-9
