
Gesture-Based Intelligent User Interface for Control of an Assistive Mobile Information Robot

  • Conference paper in Interactive Collaborative Robotics (ICR 2020)

Abstract

This article presents a gesture-based user interface for a robotic shopping trolley. The trolley is designed as a mobile robotic platform that assists customers in shops and supermarkets. Its main functions include navigating through the store, providing information on the availability and location of goods, and transporting purchased items. A key feature of the developed interface is the gestural modality, more precisely, a recognition system for elements of Russian sign language. The interface design and the interaction strategy are presented as flowcharts, which demonstrate the gestural modality as a natural part of an assistive information robot. The paper also gives a short overview of mobile robots and describes a CNN-based gesture recognition technique. Russian sign language recognition is of particular importance due to the relatively large number of native speakers (signers).
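To make the CNN-based recognition mentioned above more concrete, the sketch below shows what a minimal single-frame gesture classifier could look like in Keras (Python). It is an illustration only: the input shape, layer sizes, and number of gesture classes are assumptions made for the example, not the architecture actually used in the paper.

# Hypothetical sketch of a CNN gesture-frame classifier; NOT the authors' model.
# The input shape (128x128 RGB) and NUM_GESTURE_CLASSES are illustrative assumptions.
from tensorflow.keras import layers, models

NUM_GESTURE_CLASSES = 10  # assumed number of distinct gestures

def build_gesture_cnn(input_shape=(128, 128, 3)):
    """Build a small convolutional classifier for cropped gesture frames."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),    # low-level edge features
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),    # mid-level hand-shape features
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),            # collapse spatial dimensions
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),                        # regularize the small head
        layers.Dense(NUM_GESTURE_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_gesture_cnn()
model.summary()

In practice, a continuous sign language recognizer would also need a temporal component (e.g. features aggregated over a sequence of frames), since signs are dynamic; this sketch covers only the per-frame CNN part.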



Acknowledgements

This research is financially supported by the Ministry of Science and Higher Education of the Russian Federation, agreement No. 14.616.21.0095 (reference RFMEFI61618X0095) and by the Ministry of Education of the Czech Republic, project No. LTARF18017.

Author information


Corresponding author

Correspondence to Dmitry Ryumin.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Kagirov, I., Ryumin, D., Železný, M. (2020). Gesture-Based Intelligent User Interface for Control of an Assistive Mobile Information Robot. In: Ronzhin, A., Rigoll, G., Meshcheryakov, R. (eds.) Interactive Collaborative Robotics. ICR 2020. Lecture Notes in Computer Science, vol. 12336. Springer, Cham. https://doi.org/10.1007/978-3-030-60337-3_13


  • DOI: https://doi.org/10.1007/978-3-030-60337-3_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-60336-6

  • Online ISBN: 978-3-030-60337-3

  • eBook Packages: Computer Science, Computer Science (R0)
