Abstract
When it comes to experiencing and learning a craft, “learning by doing” has proven to be the most effective approach. Traditionally, apprentices spend years alongside a master, observing, imitating, and interacting with him or her, and receiving guidance. Inspired by this natural process, which actively involves both master and apprentice, a technological metaphor has been developed that allows the user to discover and learn the motor aspects of a craft. The user is invited to physically imitate and reproduce expert gestures and is then guided by an interactive mechanism on how to correct kinematic errors. This is achieved through a Human-Centered AI algorithm that models expert gestures and postures and compares them in real time with those of the user, in order to activate sensorimotor feedback and to provide guidance towards a better understanding of how a glass blower performs his or her expert gestures. This paper briefly presents the methodology followed to develop this interactive mechanism and focuses on the results of the experiments conducted. These results highlight that the feedback helps users acquire the motor skills of a glass blower, that different feedback modalities are effective for learning postures versus gestures, and that a retention effect is observed across three sessions of feedback use.
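To make the comparison step of such a mechanism concrete, the minimal sketch below illustrates one plausible way to compare a user's posture against an expert template and trigger corrective feedback. It is not the authors' algorithm: the joint triplets, the angle-based posture descriptor, the 15° tolerance, and the function names are illustrative assumptions. It assumes named 2D keypoints of the kind produced by a pose estimator such as OpenPose.

```python
import numpy as np

# Illustrative joint triplets (parent, joint, child) used to describe a posture
# by its joint angles; the descriptor actually used in the paper may differ.
JOINT_TRIPLETS = {
    "right_elbow": ("right_shoulder", "right_elbow", "right_wrist"),
    "left_elbow": ("left_shoulder", "left_elbow", "left_wrist"),
    "right_shoulder": ("neck", "right_shoulder", "right_elbow"),
    "left_shoulder": ("neck", "left_shoulder", "left_elbow"),
}

def joint_angle(a, b, c):
    """Angle (degrees) at keypoint b formed by the segments b->a and b->c."""
    u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def posture_descriptor(keypoints):
    """Map a dict of named 2D keypoints to a dict of joint angles."""
    return {name: joint_angle(keypoints[a], keypoints[b], keypoints[c])
            for name, (a, b, c) in JOINT_TRIPLETS.items()
            if a in keypoints and b in keypoints and c in keypoints}

def compare_to_expert(user_keypoints, expert_angles, tolerance_deg=15.0):
    """Return per-joint corrections for joints deviating beyond the tolerance."""
    user_angles = posture_descriptor(user_keypoints)
    corrections = {}
    for name, expert_angle in expert_angles.items():
        if name not in user_angles:
            continue
        error = user_angles[name] - expert_angle
        if abs(error) > tolerance_deg:
            corrections[name] = "extend" if error < 0 else "flex"
    return corrections  # an empty dict means the posture is within tolerance
```

In a real-time loop, the corrections returned for each frame would drive the visual, auditory, or haptic channel of the feedback; a gesture, as opposed to a static posture, would additionally require aligning the user's angle time series with the expert's, for instance with dynamic time warping or a sequence model.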
Acknowledgement
This work was supported by the EU Horizon 2020 Innovation Programme under Grant No. 822336 (Mingei project). The authors would like to express their gratitude to Jean-Pierre Mateus from CERFAV for collaborating on the expert motion capture.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Glushkova, A., Makrygiannis, D., Manitsaris, S. (2023). Interactive Sensorimotor Guidance for Learning Motor Skills of a Glass Blower. In: Rauterberg, M. (ed.) Culture and Computing. HCII 2023. Lecture Notes in Computer Science, vol. 14035. Springer, Cham. https://doi.org/10.1007/978-3-031-34732-0_3
Print ISBN: 978-3-031-34731-3
Online ISBN: 978-3-031-34732-0