research-article
Open access

A comparative study for telerobotic surgery using free hand gestures

Published: 01 September 2016

Abstract

This research presents an exploratory study of touch-based and touchless interfaces selected to teleoperate a highly dexterous surgical robot. Incorporating touchless interfaces into the surgical arena may give surgeons the ability to engage in telerobotic surgery much as if they were operating with their bare hands; on the other hand, precision and sensibility may be lost. To explore the advantages and drawbacks of these modalities, five interfaces were selected to send navigational commands to the Taurus robot: Omega, Hydra, and a keyboard as the touch-based interfaces, and Leap Motion and Kinect as the touchless interfaces. Three experimental designs based on standardized, surgically related tasks were used to test the system, with clinically relevant performance metrics measured to evaluate the users' performance, learning rates, control stability, and interaction naturalness. The current work provides a benchmark and validation framework for comparing these two groups of interfaces and discusses their potential for current and future adoption in the surgical setting.
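
As a concrete illustration of the teleoperation loop described above, the sketch below shows one common way the palm position reported by a touchless tracker such as Leap Motion or Kinect can be turned into navigational commands for a robot like Taurus: smooth the tracked signal, gate it with a clutch, suppress jitter with a dead zone, scale it down, and emit incremental Cartesian commands. This is a minimal sketch, not the system evaluated in the paper; the tracker, robot, and clutch interfaces (read_palm_position, send_delta, engaged) are hypothetical placeholders.

    # Minimal sketch of touchless teleoperation command mapping (illustrative only).
    # tracker.read_palm_position(), robot.send_delta(), and clutch.engaged() are
    # hypothetical placeholders, not the paper's actual interfaces.
    import numpy as np

    SCALE = 0.25       # motion scaling: 4 mm of hand travel -> 1 mm of tool travel
    DEAD_ZONE = 0.002  # metres; hold position until motion exceeds sensor jitter
    ALPHA = 0.6        # exponential-smoothing weight for the raw hand signal

    def teleop_loop(tracker, robot, clutch):
        """Stream scaled, smoothed hand deltas to the robot while clutched in."""
        prev = None       # reference position for computing deltas
        filtered = None   # low-pass-filtered palm position
        while True:
            pos = tracker.read_palm_position()  # np.array([x, y, z]) in metres
            filtered = pos if filtered is None else ALPHA * pos + (1 - ALPHA) * filtered
            if not clutch.engaged():
                prev = None                     # clutch out: rebase on re-engage
                continue
            if prev is None:
                prev = filtered
                continue
            delta = filtered - prev
            if np.linalg.norm(delta) < DEAD_ZONE:
                continue                        # ignore tremor below the dead zone
            prev = filtered
            robot.send_delta(SCALE * delta)     # incremental Cartesian command

The scaling, smoothing, and dead-zone parameters are exactly the kind of knobs that trade the precision lost with touchless input against the bare-hand naturalness the study set out to measure.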

Information

Published In

Journal of Human-Robot Interaction, Volume 5, Issue 2
September 2016
89 pages

Publisher

Journal of Human-Robot Interaction Steering Committee

Author Tags

  1. human-robot interaction
  2. robot-assisted surgery
  3. sensory substitution
  4. teleoperation
  5. touchless gestures

Qualifiers

  • Research-article

Funding Sources

  • Qatar National Research Fund NPRP award

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 144
  • Downloads (Last 6 weeks): 19
Reflects downloads up to 21 Jan 2025

Cited By

  • (2024) Perception and Action Augmentation for Teleoperation Assistance in Freeform Telemanipulation. ACM Transactions on Human-Robot Interaction, 13(1), 1-40. https://doi.org/10.1145/3643804
  • (2020) Telepresence Robotics for Hands-on Distance Instruction. In Proceedings of the 11th Nordic Conference on Human-Computer Interaction: Shaping Experiences, Shaping Society, 1-11. https://doi.org/10.1145/3419249.3420116
  • (2020) The Potential of Gesture-Based Interaction. In Human-Computer Interaction. Multimodal and Natural Interaction, 125-136. https://doi.org/10.1007/978-3-030-49062-1_8
  • (2018) Joint Surgeon Attributes Estimation in Robot-Assisted Surgery. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, 285-286. https://doi.org/10.1145/3173386.3176981
