Speech and motion control for interventional radiology: requirements and feasibility

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Interventional radiology is performed in a sterile environment, where speech and motion control of image review is needed to simplify and expedite routine procedures. Requirements and limitations were defined by testing a speech and motion control system in an interventional radiology test bed.

Methods

Motion control software was implemented using the Microsoft® Kinect® (Microsoft Corp., USA) framework. The system was tested by 10 participants using a predefined set of six voice and six gesture commands under different lighting conditions to assess the influence of illumination on command recognition. The participants rated the convenience of the application and its possible use in everyday clinical routine. A basic set of voice and gesture commands required for interventional radiology was identified.
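
No source code accompanies the abstract. As a minimal sketch, assume the Kinect-based speech and gesture recognizers emit (command, confidence) pairs and a thin dispatcher maps them to image-review actions. All names below (Command, Recognition, CommandDispatcher) and the confidence-threshold scheme are illustrative assumptions, not the authors' implementation or the Kinect SDK API.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict

class Command(Enum):
    """Hypothetical image-review actions, one per voice/gesture command."""
    NEXT_IMAGE = auto()
    PREVIOUS_IMAGE = auto()
    ZOOM_IN = auto()
    ZOOM_OUT = auto()
    SCROLL_SERIES = auto()
    ADJUST_WINDOWING = auto()

@dataclass
class Recognition:
    """One recognizer result: which command was heard/seen, and how confidently."""
    command: Command
    confidence: float  # 0.0 .. 1.0

class CommandDispatcher:
    """Routes recognized commands to viewer callbacks; drops low-confidence hits."""

    def __init__(self, threshold: float = 0.7) -> None:
        self.threshold = threshold
        self.handlers: Dict[Command, Callable[[], None]] = {}

    def register(self, command: Command, handler: Callable[[], None]) -> None:
        self.handlers[command] = handler

    def dispatch(self, result: Recognition) -> bool:
        # Reject results below the confidence threshold: one simple lever
        # against the accidental triggering reported in the Results.
        if result.confidence < self.threshold:
            return False
        handler = self.handlers.get(result.command)
        if handler is None:
            return False
        handler()
        return True

# Usage: wire one command to a viewer action and feed it a recognizer result.
dispatcher = CommandDispatcher(threshold=0.7)
dispatcher.register(Command.NEXT_IMAGE, lambda: print("viewer: next image"))
dispatcher.dispatch(Recognition(Command.NEXT_IMAGE, confidence=0.92))
```

Keeping recognition and dispatch separate means the same viewer actions can be bound to either a speech phrase or a gesture, which matches the paper's use of parallel voice and gesture command sets.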

Results

The majority (93 %) of commands were recognized successfully. Speech commands were less prone to errors than gesture commands. Unwanted side effects, such as accidentally issuing a gesture command, occurred in about 30 % of cases. Dimmed lighting conditions had no measurable effect on the recognition rate. Six out of 10 participants would consider using the application in everyday routine. The necessary voice and gesture commands for interventional radiology were identified and integrated into the control system.
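
For illustration only: rates like those above can be tallied from per-trial logs. The sketch below assumes a hypothetical log format of (command, lighting, recognized, accidental) tuples; neither the format nor the helper names come from the paper.

```python
# Hypothetical per-trial log: (command, lighting, recognized, accidental).
trials = [
    ("zoom in",    "normal", True,  False),
    ("zoom in",    "dimmed", True,  False),
    ("next image", "normal", True,  True),   # recognized, but also fired by accident
    ("next image", "dimmed", False, False),  # missed
]

def recognition_rate(trials, lighting=None):
    """Fraction of issued commands that were recognized, optionally per lighting."""
    subset = [t for t in trials if lighting is None or t[1] == lighting]
    return sum(t[2] for t in subset) / len(subset)

def accidental_rate(trials):
    """Fraction of trials in which an unintended command was triggered."""
    return sum(t[3] for t in trials) / len(trials)

print(f"overall recognition: {recognition_rate(trials):.0%}")
print(f"dimmed recognition:  {recognition_rate(trials, 'dimmed'):.0%}")
print(f"accidental triggers: {accidental_rate(trials):.0%}")
```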

Conclusion

Speech and motion control of image review provides a new man–machine interface for radiological image handling that is especially useful in sterile environments, since navigation requires no touch. Command recognition rates were high and remained stable under different lighting conditions. However, the rate of accidental triggering by unintended commands needs to be reduced.
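
The paper does not specify how accidental triggering might be reduced. One common approach is a dwell-time gate that fires a gesture command only after the pose has been held continuously for a short interval, so one-frame recognizer blips are discarded. The DwellGate class below is a hypothetical sketch of that idea, not the authors' method.

```python
import time

class DwellGate:
    """Fires a gesture only after it has been held for `dwell_s` seconds.

    Hypothetical debouncing scheme: brief, unintended poses never reach
    the dwell threshold, so they cannot trigger a command.
    """

    def __init__(self, dwell_s: float = 0.5) -> None:
        self.dwell_s = dwell_s
        self._current = None   # gesture name currently being held
        self._since = 0.0      # timestamp when the current hold started
        self._fired = False    # whether the current hold already fired

    def update(self, gesture, now=None):
        """Feed one recognizer frame; return the gesture name when it fires."""
        now = time.monotonic() if now is None else now
        if gesture != self._current:
            # New gesture (or None): restart the dwell timer.
            self._current, self._since, self._fired = gesture, now, False
            return None
        if gesture is not None and not self._fired and now - self._since >= self.dwell_s:
            self._fired = True  # fire once per sustained hold
            return gesture
        return None

# Usage: only a sustained hold fires; a single-frame blip does not.
gate = DwellGate(dwell_s=0.5)
print(gate.update("next", now=0.0))   # None (hold just started)
print(gate.update("next", now=0.6))   # "next" (held past the dwell threshold)
print(gate.update("next", now=0.7))   # None (already fired for this hold)
```

The trade-off is latency: a longer dwell time suppresses more accidental commands but makes every intentional gesture feel slower, so the threshold would need tuning against the roughly 30 % side-effect rate reported above.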

Conflict of Interest

The authors declare that they have no conflict of interest.

Author information

Corresponding author

Correspondence to Andreas M. Hötker.

About this article

Cite this article

Hötker, A.M., Pitton, M.B., Mildenberger, P. et al. Speech and motion control for interventional radiology: requirements and feasibility. Int J CARS 8, 997–1002 (2013). https://doi.org/10.1007/s11548-013-0841-7

Keywords

Navigation