
Feasibility and performance enhancement of collaborative control of unmanned ground vehicles via virtual reality

  • Original Paper
  • Published:
Personal and Ubiquitous Computing

A Correction to this article was published on 21 May 2024

Abstract

To support people working in dangerous industries, virtual reality (VR) can help operators carry out standardized tasks and work collaboratively to deal with potential risks. Surprisingly, limited research has paid attention to the cognitive load of operators during collaborative tasks, especially via VR interfaces. When task demands become complex, many researchers focus on optimizing the design of the interaction interface to reduce the operator's cognitive load. In this paper, we propose a new collaborative VR system with edge enhancement that supports two teleoperators working in a VR environment to remotely control an unmanned ground vehicle. We conducted a comparative experiment to evaluate the collaborative VR system, focusing on task completion time and the total number of operations. Our results show that both the number of operations and the cognitive load during operation were significantly lower in the two-person group than in the single-person group. Our study sheds light on designing VR systems that support collaborative work with respect to teleoperators' workflow rather than simply optimizing the design outcomes.
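The abstract's "edge enhancement" refers to boosting object contours in the teleoperation video feed; the paper's own pipeline is not reproduced here, but the general idea behind the edge operators it cites (refs. 71–76) can be sketched with a 3×3 Sobel gradient blended back into the frame. The function names and the blending weight below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sobel_edges(gray: np.ndarray) -> np.ndarray:
    """Gradient magnitude of a 2-D grayscale image using 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # transpose gives the vertical-gradient kernel
    padded = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Cross-correlate each kernel tap with the shifted image window.
    for i in range(3):
        for j in range(3):
            patch = padded[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)

def enhance(frame: np.ndarray, weight: float = 0.5) -> np.ndarray:
    """Blend the normalized edge map back into the frame to sharpen contours."""
    edges = sobel_edges(frame)
    if edges.max() > 0:
        edges = edges / edges.max() * 255.0
    return np.clip(frame + weight * edges, 0, 255).astype(np.uint8)
```

In a real teleoperation loop this would run per frame on the camera stream before it is textured into the VR view; production systems would typically use an optimized operator (e.g. a GPU shader or OpenCV) rather than this pure-NumPy sketch.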


Availability of data and materials

Data and related materials are available upon reasonable request to the corresponding author.

Notes

  1. https://www.dji.com/hk/robomaster-ep

  2. https://docs.unity3d.com/560/Documentation/Manual/OculusControllers.html

References

  1. Björling E, Kim A, Oleson K, Alves-Oliveira P (2022) I am the robot: teen collaboration in an asymmetric virtual reality game. Front Virtual Real 2. https://doi.org/10.3389/frvir

  2. Fang J, Chang V, Gao G, Wang H-C (2021) Social interactions in virtual reality: What cues do people use most and how. In: Companion publication of the 2021 conference on computer supported cooperative work and social computing, pp 49–52

  3. Cruz A, Paredes H, Morgado L, Martins P (2021) Non-verbal aspects of collaboration in virtual worlds: a CSCW taxonomy-development proposal integrating the presence dimension. JUCS-J Universal Comput Sci 27(9):913–954

  4. Chen L, Liang H-N, Lu F, Wang J, Chen W, Yue Y (2021) Effect of collaboration mode and position arrangement on immersive analytics tasks in virtual reality: a pilot study. Appl Sci 11(21):10473

  5. Liang H-N, Lu F, Shi Y, Nanjappan V, Papangelis K (2019) Evaluating the effects of collaboration and competition in navigation tasks and spatial knowledge acquisition within virtual reality environments. Future Generation Comput Syst 95:855–866

  6. Chen L, Liang H-N, Lu F, Papangelis K, Man KL, Yue Y (2020) Collaborative behavior, performance and engagement with visual analytics tasks using mobile devices. Hum-Centric Comput Inform Sci 10(1):1–24

  7. Holzinger A, Saranti A, Angerschmid A, Retzlaff CO, Gronauer A, Pejakovic V, Medel-Jimenez F, Krexner T, Gollob C, Stampfer K (2022) Digital transformation in smart farm and forest operations needs human-centered AI: challenges and future directions. Sensors 22(8):3043

  8. Kovarik ML, Ott LS, Robinson JK, Wenzel TJ (2022) Getting started on active learning. In: Active learning in the analytical chemistry curriculum, pp 13–35

  9. Han J, Smith B (1997) Cu-SeeMe VR immersive desktop teleconferencing. In: Proceedings of the fourth ACM international conference on multimedia, pp 199–207

  10. Nguyen C, DiVerdi S, Hertzmann A, Liu F (2017) CollaVR: collaborative in-headset review for VR video. In: Proceedings of the 30th annual ACM symposium on user interface software and technology, pp 267–277

  11. Cordeil M, Dwyer T, Klein K, Laha B, Marriott K, Thomas BH (2016) Immersive collaborative analysis of network connectivity: CAVE-style or head-mounted display? IEEE transactions on visualization and computer graphics 23(1):441–450

  12. Morris MR, Lombardo J, Wigdor D (2010) WeSearch: supporting collaborative search and sensemaking on a tabletop display. In: Proceedings of the 2010 ACM conference on computer supported cooperative work, pp 401–410

  13. Bjørn P, Wulff M, Petræus MS, Møller NH (2021) Immersive cooperative work environments (CWE): designing human-building interaction in virtual reality. Comput Supported Cooperative Work (CSCW) 30(3):351–391

  14. Chow K, Coyiuto C, Nguyen C, Yoon D (2019) Challenges and design considerations for multimodal asynchronous collaboration in VR. In: Proceedings of the ACM on human-computer interaction 3(CSCW), 1–24

  15. Pan Y (2021) Reflexivity of account, professional vision, and computer-supported cooperative work: working in the maritime domain. Proceedings of the ACM on Human-Computer Interaction 5(CSCW2):1–32

  16. Vasarainen M, Paavola S, Vetoshkina L et al (2021) A systematic literature review on extended reality: virtual, augmented and mixed reality in working life. Int J Virtual Reality

  17. Ludwig T, Stickel O, Tolmie P, Sellmer M (2021) shARe-IT: ad hoc remote troubleshooting through augmented reality. Computer Supported Cooperative Work (CSCW) 30(1):119–167

  18. Luo Y, Wang J, Shi R, Liang H-N, Luo S (2022) In-device feedback in immersive head-mounted displays for distance perception during teleoperation of unmanned ground vehicles. IEEE Trans Haptics 15(1):79–84

  19. Luo Y, Wang J, Liang H-N, Luo S, Lim EG (2021) Monoscopic vs. stereoscopic views and display types in the teleoperation of unmanned ground vehicles for object avoidance. In: 2021 30th IEEE International conference on robot human interactive communication (RO-MAN), pp 418–425

  20. Luo Y, Wang J, Pan Y, Luo S, Irani P, Liang H-N (2023) Teleoperation of a fast omnidirectional unmanned ground vehicle in the cyber-physical world via a VR interface. In: Proceedings of the 18th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry. VRCAI ’22. Association for Computing Machinery, New York, USA. https://doi.org/10.1145/3574131.3574432

  21. Bout M, Brenden AP, Klingegård M, Habibovic A, Böckle M-P (2017) A head-mounted display to support teleoperations of shared automated vehicles. In: Proceedings of the 9th international conference on automotive user interfaces and interactive vehicular applications adjunct, pp 62–66

  22. Islam S, Huang Q, Afghah F, Fule P, Razi A (2019) Fire frontline monitoring by enabling UAV-based virtual reality with adaptive imaging rate. In: 2019 53rd Asilomar conference on signals, systems, and computers, IEEE, pp 368–372

  23. Goedicke D, Li J, Evers V, Ju W (2018) VR-OOM: virtual reality on-road driving simulation. In: Proceedings of the 2018 CHI conference on human factors in computing systems, pp 1–11

  24. Nguyen VT, Jung K, Dang T (2019) DroneVR: a web virtual reality simulator for drone operator. In: 2019 IEEE International conference on artificial intelligence and virtual reality (AIVR), IEEE, pp 257–2575

  25. Helsel S (1992) Virtual reality and education. Educ Technol 32(5):38–42

  26. Fakhoury S, Ma Y, Arnaoudova V, Adesope O (2018) The effect of poor source code lexicon and readability on developers’ cognitive load. In: 2018 IEEE/ACM 26th International conference on program comprehension (ICPC), IEEE, pp 286–28610

  27. Sexton K, Johnson A, Gotsch A, Hussein AA, Cavuoto L, Guru KA (2018) Anticipation, teamwork and cognitive load: chasing efficiency during robot-assisted surgery. BMJ Quality & Safety 27(2):148–154

  28. Murphy G, Greene CM (2017) Load theory behind the wheel; perceptual and cognitive load effects. Canadian J Experiment Psychol/Revue Canadienne de Psycholo Expérimentale 71(3):191

  29. Nguyen H, Bednarz T (2020) User experience in collaborative extended reality: overview study. In: International conference on virtual reality and augmented reality, Springer, pp 41–70

  30. Lee Y, Yoo B (2021) XR collaboration beyond virtual reality: work in the real world. J Computational Design Eng 8(2):756–772

  31. Leung T, Zulkernine F, Isah H (2018) The use of virtual reality in enhancing interdisciplinary research and education. arXiv:1809.08585

  32. Lu F, Yu D, Liang H-N, Chen W, Papangelis K, Ali NM (2018) Evaluating engagement level and analytical support of interactive visualizations in virtual reality environments. In: 2018 IEEE International symposium on mixed and augmented reality (ISMAR), pp 143–152

  33. Pack A, Barrett A, Liang H-N, Monteiro D (2020) University EAP students’ perceptions of using a prototype virtual reality learning environment to learn writing structure. Int J Computer-Assisted Language Learn Teach 10(1):27–46

  34. Jimeno-Morenilla A, Sánchez-Romero JL, Mora-Mora H, Coll-Miralles R (2016) Using virtual reality for industrial design learning: a methodological proposal. Behaviour & Inform Technol 35(11):897–906

  35. Lee H, Kim H, Monteiro DV, Goh Y, Han D, Liang H-N, Yang HS, Jung J (2019) Annotation vs. virtual tutor: comparative analysis on the effectiveness of visual instructions in immersive virtual reality. In: 2019 IEEE International symposium on mixed and augmented reality (ISMAR), pp 318–327

  36. Norman D (2013) The design of everyday things: revised and expanded edition

  37. Heath C, Svensson MS, Hindmarsh J, Luff P, Vom Lehn D (2002) Configuring awareness. Computer Supported Cooperative Work (CSCW) 11(3):317–347

  38. Randall D, Rouncefield M, Tolmie P (2021) Ethnography, CSCW and ethnomethodology. Computer Supported Cooperative Work (CSCW) 30(2):189–214

  39. Clark HH, Brennan SE (1991) Grounding in communication

  40. Sweller J, van Merriënboer JJ, Paas F (2019) Cognitive architecture and instructional design: 20 years later. Educ Psychol Rev 31(2):261–292

  41. Benford S, Brown C, Reynard G, Greenhalgh C (1996) Shared spaces: transportation, artificiality, and spatiality. In: Proceedings of the 1996 ACM conference on computer supported cooperative work, pp 77–86

  42. Schmidt K (2008) Taking CSCW seriously: supporting articulation work (1992). In: Cooperative work and coordinative practices, pp 45–71

  43. Randall D (2016) What is common in accounts of common ground? Computer Supported Cooperative Work (CSCW) 25(4):409–423

  44. Chen L, Liu Y, Li Y, Yu L, Gao B, Caon M, Yue Y, Liang H-N (2021) Effect of visual cues on pointing tasks in co-located augmented reality collaboration. SUI ’21. Association for Computing Machinery, New York, NY, USA

  45. Brown B, Bell M (2004) CSCW at play: ’there’ as a collaborative virtual environment. In: Proceedings of the 2004 ACM conference on computer supported cooperative work, pp 350–359

  46. Pedersen G, Koumaditis K (2020) Virtual reality (VR) in the computer supported cooperative work (CSCW) domain: a mapping and a pre-study on functionality and immersion. In: International conference on human-computer interaction, Springer, pp 136–153

  47. Begum M, Serna RW, Kontak D, Allspaw J, Kuczynski J, Yanco HA, Suarez J (2015) Measuring the efficacy of robots in autism therapy: how informative are standard HRI metrics’. In: Proceedings of the tenth annual ACM/IEEE international conference on human-robot interaction, pp 335–342

  48. Sun Z, Li M (2018) Design of a training system for special types of mine workers based on CSCW. In: 2018 IEEE 22nd International Conference on Computer Supported Cooperative Work in Design ((CSCWD)), IEEE, pp 501–506

  49. Morrison-Smith S, Ruiz J (2020) Challenges and barriers in virtual teams: a literature review. SN Appl Sci 2(6):1–33

  50. Hollan J, Hutchins E, Kirsh D (2000) Distributed cognition: toward a new foundation for human-computer interaction research. ACM Trans Computer-Human Interaction (TOCHI) 7(2):174–196

  51. Shi R, Liang H-N, Wu Y, Yu D, Xu W (2021) Virtual reality sickness mitigation methods: a comparative study in a racing game. Proc ACM Comput Graph Interact Tech 4(1)

  52. Yu D, Liang H-N, Lu X, Fan K, Ens B (2019) Modeling endpoint distribution of pointing selection tasks in virtual reality environments. ACM Trans Graph 38(6)

  53. Yu D, Lu X, Shi R, Liang H-N, Dingler T, Velloso E, Goncalves J (2021) Gaze-supported 3D object manipulation in virtual reality. In: Proceedings of the 2021 CHI conference on human factors in computing systems. CHI ’21. Association for Computing Machinery, New York, USA

  54. Roetzel PG (2019) Information overload in the information age: a review of the literature from business administration, business psychology, and related disciplines with a bibliometric approach and framework development. Business Res 12(2):479–522

  55. Ramakrishnan P, Balasingam B, Biondi F (2021) Cognitive load estimation for adaptive human–machine system automation. In: Learning control, pp 35–58

  56. Stapel J, Mullakkal-Babu FA, Happee R (2019) Automated driving reduces perceived workload, but monitoring causes higher cognitive load than manual driving. Transportation Res Part F: Traffic Psychol Behaviour 60:590–605

  57. Costley J, Lange C (2018) The moderating effects of group work on the relationship between motivation and cognitive load. Int Rev Res Open Distributed Learn 19(1)

  58. Chu H-C, Chen J-M, Tsai C-L (2017) Effects of an online formative peer-tutoring approach on students’ learning behaviors, performance and cognitive load in mathematics. Interactive Learn Environ 25(2):203–219

  59. Baker M (2016) Statisticians issue warning over misuse of p values. Nature News 531(7593):151

  60. Gelman A (2016) The problems with p-values are not just with p-values. American Statistician 70(10)

  61. Hornbæk K (2013) Some whys and hows of experiments in human-computer interaction. Found Trends in Human-Computer Interaction 5(4):299–373

  62. Zhao Y, Szpiro S, Azenkot S (2015) ForeSee: a customizable head-mounted vision enhancement system for people with low vision. In: Proceedings of the 17th international ACM SIGACCESS conference on computers & accessibility, pp 239–249

  63. Zhao Y, Kupferstein E, Castro BV, Feiner S, Azenkot S (2019) Designing AR visualizations to facilitate stair navigation for people with low vision. In: Proceedings of the 32nd annual ACM symposium on user interface software and technology, pp 387–402

  64. Htike HM, Margrain TH, Lai Y-K, Eslambolchilar P (2020) Ability of head-mounted display technology to improve mobility in people with low vision: a systematic review. Transl Vis Sci Technol 9(10):26

  65. Peláez-Coca MD, Vargas-Martín F, Mota S, Díaz J, Ros-Vidal E (2009) A versatile optoelectronic aid for low vision patients. Ophthalmic Physiol Optics 29(5):565–572

  66. Luo G, Peli E (2011) Development and evaluation of vision rehabilitation devices. In: 2011 Annual international conference of the IEEE engineering in medicine and biology society, IEEE, pp 5228–5231

  67. Hicks SL, Wilson I, Muhammed L, Worsfold J, Downes SM, Kennard C (2013) A depth-based head-mounted visual display to aid navigation in partially sighted individuals. PLoS ONE 8(7):e67695

  68. Essock EA, McCarley JS, Sinai MJ, DeFord JK (2019) Human perception of sensor-fused imagery. In: Interpreting remote sensing imagery, pp 137–182

  69. Steinfeld A, Fong T, Kaber D, Lewis M, Scholtz J, Schultz A, Goodrich M (2006) Common metrics for human-robot interaction. In: Proceedings of the 1st ACM SIGCHI/SIGART Conference on human-robot interaction, pp. 33–40

  70. Guo J, Ma J, Ángel F, García-Fernández, Zhang Y, Liang H-N (2023) A survey on image enhancement for low-light images. Heliyon 9(4):14558. https://doi.org/10.1016/j.heliyon.2023.e14558

  71. Sobel I, Feldman G (1968) A 3×3 isotropic gradient operator for image processing. Presented at the Stanford Artificial Intelligence Project, pp 271–272

  72. Canny JF (1983) Finding edges and lines in images. Technical report, Massachusetts Inst of Tech Cambridge Artificial Intelligence Lab

  73. Canny J (1986) A computational approach to edge detection. IEEE Trans Pattern Anal Mach Intell 6:679–698

  74. Jose K, Dixon DM, Joseph N, George ES, Anjitha V (2014) Performance study of edge detection operators. In: 2014 International conference on embedded systems (ICES), Coimbatore, pp 7–11

  75. Prewitt JM et al (1970) Object enhancement and extraction. Picture Process Psychopictorics 10(1):15–19

  76. Hassan MAA, Khalid NEA, Ibrahim A, Noor NM (2008) Evaluation of Sobel, Canny, Shen & Castan using sample line histogram method. In: 2008 International symposium on information technology, IEEE, vol 3, pp 1–7

  77. Schubert EF (2006) Human eye sensitivity and photometric quantities. In: Light-emitting diodes, pp 275–291

  78. Li Z, Luo Y, Wang J, Pan Y, Yu L, Liang H-N (2022) Collaborative remote control of unmanned ground vehicles in virtual reality. In: 2022 international conference on interactive media, smart systems and emerging technologies (IMET). pp 1–8. https://doi.org/10.1109/IMET54801.2022.9929783

Acknowledgements

The authors thank the participants for their time and the reviewers for their insightful comments and useful suggestions that have helped improve our paper. A short version of this paper was published as [78], which presented some early and preliminary results of the work.

Funding

This research is supported in part by Xi’an Jiaotong-Liverpool University (XJTLU) Key Program Special Fund (#KSF-A-03), National Science Foundation of China (#62272396), and XJTLU Research Development Fund (#21-02-008).

Author information

Contributions

All authors contributed to this work.

Corresponding author

Correspondence to Hai-Ning Liang.

Ethics declarations

Ethics approval

The study was categorized as Low-Risk Research (LRR), conducted according to the guidelines regulating LRR projects, and approved by the University Ethics Committee at Xi’an Jiaotong-Liverpool University.

Consent to participate

Participants consented to participate in the study on a voluntary basis.

Consent for publication

All authors read the paper and agreed for it to be submitted for review and publication.

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original online version of this article was revised due to incorrect author affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Li, Z., Luo, Y., Wang, J. et al. Feasibility and performance enhancement of collaborative control of unmanned ground vehicles via virtual reality. Pers Ubiquit Comput 28, 579–595 (2024). https://doi.org/10.1007/s00779-024-01799-4


Keywords

Navigation