DOI: 10.1145/3293578.3293590
Research article

Augmented Reality User interaction to Computer Assisted Orthopedic Surgery system

Published: 29 October 2018

Abstract

In reconstruction surgery of the Anterior Cruciate Ligament (ACL), precise drilling of tunnels is required to pass the graft. A Computer-Assisted Orthopedic Surgery (CAOS) user interface with Augmented Reality was designed to assist the procedure. However, the surgeon's interaction has limitations, mainly because the controls are located out of reach and the hands cannot be used freely, since the surgical environment must be kept sterile. Two kinds of hand-gesture interaction were evaluated in this research to establish the most suitable device for managing a CAOS system: hand recognition as used by the META glasses, and hand gestures detected through the EMG muscle sensors of the MYO armband. Efficiency, efficacy, and satisfaction were evaluated through two experiments. The results showed the relevance and usefulness of both devices within the surgical environment. It was established that the MYO armband was efficient, possibly more effective at performing the tasks, and was perceived as more satisfactory than the META glasses.


Cited By

  • (2023) Hand interaction designs in mixed and augmented reality head mounted display: a scoping review and classification. Frontiers in Virtual Reality, 4. DOI: 10.3389/frvir.2023.1171230. Online publication date: 31-Jul-2023.
  • (2022) Strategies to reduce visual attention changes while learning and training in extended reality environments. International Journal on Interactive Design and Manufacturing (IJIDeM), 17(1), 17-43. DOI: 10.1007/s12008-022-01092-9. Online publication date: 11-Dec-2022.
  • (2021) Digital Hand Tracking. Touchless interaction for manipulating diagnostic images in the operating room. Proceedings of the X Latin American Conference on Human Computer Interaction, 1-4. DOI: 10.1145/3488392.3488400. Online publication date: 22-Nov-2021.

Index Terms

  1. Augmented Reality User interaction to Computer Assisted Orthopedic Surgery system

      Recommendations

      Comments

      Please enable JavaScript to view thecomments powered by Disqus.

Published In
      MexIHC '18: Proceedings of the 7th Mexican Conference on Human-Computer Interaction
      October 2018
      123 pages
      ISBN:9781450366533
      DOI:10.1145/3293578

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. Augmented Reality
      2. Efficacy
      3. Efficiency
      4. Satisfaction
      5. Touchless Interaction

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      MexIHC '18

      Acceptance Rates

MexIHC '18 Paper Acceptance Rate: 20 of 40 submissions, 50%
Overall Acceptance Rate: 20 of 40 submissions, 50%


