A Weak and Semi-supervised Segmentation Method for Prostate Cancer in TRUS Images

  • Original Paper
  • Published in: Journal of Digital Imaging

Abstract

The purpose of this research is to exploit a weakly and semi-supervised deep learning framework to segment prostate cancer in TRUS images, reducing the time-consuming work required of radiologists to draw lesion boundaries and enabling the network to be trained on data that lack complete annotations. A histologically proven benchmark dataset of 102 case images was built, and 22 images were randomly selected for evaluation. A portion of the training images was strongly supervised, i.e., annotated pixel by pixel, and a deep neural network was first trained on these images. The remaining training images, which carry only weak supervision in the form of the lesion location, were then fed to the trained network to produce intermediate pixelwise labels. The network was retrained on all training images using the original and intermediate labels, and the training images were fed to the retrained network to produce refined labels. For each weakly supervised image, the distances from the centers of mass of the refined label and the intermediate label to the weak-supervision location were compared, and the closer label replaced the previous one; this constitutes the label update. After the label updates, the test set images were fed to the retrained network for evaluation. The proposed method yields better results with weakly and semi-supervised data than a method that uses only the small portion of strongly supervised data, although the improvement is not as large as when a fully strongly supervised dataset is used. In terms of mean intersection over union (mIoU), the proposed method reached about 0.6 when the ratio of strongly supervised data was 40%, about a 2% decrease in performance compared with the 100% strongly supervised case. The proposed method can therefore help alleviate the time-consuming work of drawing lesion boundaries and allows the neural network to be trained on data without complete annotations.
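The label-update rule and the mIoU metric described above can be illustrated with a short sketch. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it assumes binary lesion masks stored as 2-D NumPy arrays and a weak-supervision location given as a (row, col) point, and the function names (center_of_mass_distance, update_label, mean_iou) are hypothetical.

```python
import numpy as np

def center_of_mass_distance(mask, weak_point):
    """Euclidean distance from the mask's center of mass to the weak-supervision point."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:                      # empty prediction: treat as infinitely far
        return np.inf
    com = np.array([ys.mean(), xs.mean()])
    return np.linalg.norm(com - np.asarray(weak_point, dtype=float))

def update_label(intermediate_mask, refined_mask, weak_point):
    """Keep whichever pixelwise label lies closer to the weak-supervision location."""
    d_int = center_of_mass_distance(intermediate_mask, weak_point)
    d_ref = center_of_mass_distance(refined_mask, weak_point)
    return refined_mask if d_ref < d_int else intermediate_mask

def mean_iou(pred_masks, gt_masks, eps=1e-7):
    """Mean intersection over union over a set of binary masks."""
    ious = []
    for p, g in zip(pred_masks, gt_masks):
        inter = np.logical_and(p, g).sum()
        union = np.logical_or(p, g).sum()
        ious.append(inter / (union + eps))
    return float(np.mean(ious))
```

In this sketch a tie keeps the intermediate label; the abstract does not specify tie-breaking, so that choice is an assumption.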



Acknowledgements

This research was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (NRF-2017R1C1B5077068 and NRF-2013R1A1A2011398), by Korea National University of Transportation in 2019, and by the Technology Innovation Program funded by the Ministry of Trade, Industry and Energy (MOTIE) of Korea (10049785, Development of ‘medical equipment using (ionizing or non-ionizing) radiation’-dedicated R&D platform and medical device technology).

Author information

Corresponding author

Correspondence to Sung Il Hwang.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Han, S., Hwang, S. & Lee, H.J. A Weak and Semi-supervised Segmentation Method for Prostate Cancer in TRUS Images. J Digit Imaging 33, 838–845 (2020). https://doi.org/10.1007/s10278-020-00323-3

