Abstract
Deafblind people face severe communication challenges because of their dual sensory disability. Often their only way to interact with other people is tactile sign language, in which they follow the signer's hands with their own. This approach, however, works only when both people are in the same place. The aim of this project is to narrow the gap between deafblind people and others by enabling them to communicate remotely. Images collected by two cameras are used to track the signer's body with a deep neural network. The extracted coordinates of the body parts (chest, shoulders, elbows, wrists, palms and fingers) drive one or more robotic arms, and the deafblind person can place their hands on the robots to understand the message delivered by the person on the other side. The entire system is based on a cloud architecture.
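The abstract describes a pipeline in which per-frame body-part coordinates are extracted on the signer's side and delivered through the cloud to a robotic-arm controller. A minimal sketch of such a keypoint message, assuming a hypothetical JSON wire format and illustrative keypoint names (the paper does not specify its actual protocol), might look like:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Keypoint:
    """One tracked body part in a single camera frame."""
    name: str
    x: float  # coordinates in an assumed metric camera frame
    y: float
    z: float

def encode_pose(points):
    """Signer side: serialize one frame of keypoints into a JSON
    message that a cloud broker could forward to the robot controller."""
    return json.dumps({"frame": [asdict(p) for p in points]})

def decode_pose(msg):
    """Controller side: rebuild the keypoints from the JSON message."""
    return [Keypoint(**p) for p in json.loads(msg)["frame"]]

# Example: a single wrist position round-tripped through the channel.
frame = [Keypoint("left_wrist", 0.12, 0.45, 0.80)]
restored = decode_pose(encode_pose(frame))
```

A real deployment would stream many such frames per second and map each keypoint onto joint targets for the arm; this sketch only illustrates the coordinate hand-off between the two sides.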
Acknowledgements
This study was funded in part by the Italian Ministry of Education, Universities and Research within the “Smart Cities and Social Innovation Under 30” program through the PARLOMA Project (SIN_00132).
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Panicacci, S. et al. (2020). Empowering Deafblind Communication Capabilities by Means of AI-Based Body Parts Tracking and Remotely Controlled Robotic Arm for Sign Language Speakers. In: Saponara, S., De Gloria, A. (eds) Applications in Electronics Pervading Industry, Environment and Society. ApplePies 2019. Lecture Notes in Electrical Engineering, vol 627. Springer, Cham. https://doi.org/10.1007/978-3-030-37277-4_44
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-37276-7
Online ISBN: 978-3-030-37277-4
eBook Packages: Physics and Astronomy (R0)