
Automatic classification of ultrasound breast lesions using a deep convolutional neural network mimicking human decision-making

  • Imaging Informatics and Artificial Intelligence
  • Published in European Radiology

Abstract

Objectives

To evaluate a deep convolutional neural network (dCNN) for detection, highlighting, and classification of ultrasound (US) breast lesions mimicking human decision-making according to the Breast Imaging Reporting and Data System (BI-RADS).

Methods and materials

One thousand nineteen breast ultrasound images from 582 patients (age 56.3 ± 11.5 years) were linked to the corresponding radiological report. Lesions were categorized into the following classes: no tissue, normal breast tissue, BI-RADS 2 (cysts, lymph nodes), BI-RADS 3 (non-cystic mass), and BI-RADS 4–5 (suspicious). To test the accuracy of the dCNN, an internal test dataset (101 images) and an external test dataset (43 images) were evaluated by the dCNN and two independent readers. Radiological reports, histopathological results, and follow-up examinations served as the reference standard. The performance of the dCNN and of the human readers was quantified in terms of classification accuracy and receiver operating characteristic (ROC) curves.
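The two binary groupings used for evaluation (BI-RADS 2 vs. BI-RADS 3–5, and BI-RADS 2–3 vs. BI-RADS 4–5) can be illustrated with a short sketch. The per-class probabilities below are invented toy values, not data from the study, and scikit-learn is used only for illustration of how accuracy and AUC could be computed from five-class softmax outputs:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, accuracy_score

# Toy per-image softmax probabilities over the five classes used in the study:
# [no tissue, normal tissue, BI-RADS 2, BI-RADS 3, BI-RADS 4-5]
probs = np.array([
    [0.01, 0.04, 0.85, 0.07, 0.03],   # likely BI-RADS 2
    [0.02, 0.03, 0.10, 0.70, 0.15],   # likely BI-RADS 3
    [0.01, 0.02, 0.05, 0.12, 0.80],   # likely BI-RADS 4-5
    [0.02, 0.05, 0.60, 0.25, 0.08],   # likely BI-RADS 2
])
true_labels = np.array([2, 3, 4, 2])  # ground-truth class indices

# Task: BI-RADS 2 vs. BI-RADS 3-5. Score each image by the summed
# probability of the BI-RADS 3 and BI-RADS 4-5 classes.
score_3to5 = probs[:, 3] + probs[:, 4]
y_true = (true_labels >= 3).astype(int)
# Predict "BI-RADS 3-5" when that score exceeds the BI-RADS 2 probability.
y_pred = (score_3to5 > probs[:, 2]).astype(int)

acc = accuracy_score(y_true, y_pred)
auc = roc_auc_score(y_true, score_3to5)
print(acc, auc)
```

The second grouping (BI-RADS 2–3 vs. 4–5) follows the same pattern with `probs[:, 4]` as the score.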

Results

On the internal test dataset, the classification accuracy of the dCNN in differentiating BI-RADS 2 from BI-RADS 3–5 lesions was 87.1% (external 93.0%), compared with 79.2 ± 1.9% (external 95.3 ± 2.3%) for the human readers. For the classification of BI-RADS 2–3 versus BI-RADS 4–5, the dCNN reached a classification accuracy of 93.1% (external 95.3%), whereas the human readers achieved 91.6 ± 5.4% (external 94.1 ± 1.2%). The AUC on the internal dataset was 83.8 (external 96.7) for the dCNN and 84.6 ± 2.3 (external 90.9 ± 2.9) for the human readers.

Conclusion

dCNNs may be used to mimic human decision-making in the evaluation of single US images of breast lesions according to the BI-RADS catalog. The technique reaches high accuracies and may help standardize the highly observer-dependent US assessment.

Key Points

• Deep convolutional neural networks could be used to classify US breast lesions.

• The implemented dCNN with its sliding window approach reaches high accuracies in the classification of US breast lesions.

• Deep convolutional neural networks may serve for standardization in US BI-RADS classification.
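The sliding-window approach named in the key points can be sketched as follows. The window size and stride are hypothetical placeholders, as the paper's exact parameters are not stated in this abstract; the frame is a synthetic array standing in for an ultrasound image:

```python
import numpy as np

def sliding_windows(image, win=64, stride=32):
    """Yield square patches from a 2-D grayscale image, scanning
    left-to-right and top-to-bottom, as a sliding-window dCNN
    detector would before classifying each patch."""
    h, w = image.shape
    patches = []
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            patches.append(image[y:y + win, x:x + win])
    return np.stack(patches)

# Synthetic 256x192 "ultrasound" frame; in the described pipeline each
# patch would be fed to the classifier and its class probabilities
# mapped back to (y, x) to highlight and classify lesions.
frame = np.zeros((256, 192), dtype=np.float32)
patches = sliding_windows(frame)
print(patches.shape)
```

Overlapping windows (stride smaller than the window size) let neighboring patches vote on the same region, which is what makes the highlighting spatially smooth.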



Abbreviations

ABUS: Automatic breast ultrasound
ACR BI-RADS: American College of Radiology Breast Imaging Reporting and Data System
AUC: Area under the curve
dCNN: Deep convolutional neural network
DM: Digital mammography
MD: Metadata
NT: Normal tissue
RIS: Radiological information system
ROC curves: Receiver operating characteristic curves
US: Ultrasound


Acknowledgements

The work was supported by the Clinical Research Priority Program of the Faculty of Medicine of the University of Zurich “Artificial Intelligence in oncological Imaging Network”.

Funding

The authors state that this work has not received any funding.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Alexander Ciritsis.

Ethics declarations

Guarantor

The scientific guarantor of this publication is Prof. Dr. med. Andreas Boss.

Conflict of interest

The authors of this manuscript declare no relationships with any companies whose products or services may be related to the subject matter of the article.

Statistics and biometry

No complex statistical methods were necessary for this paper.

Informed consent

Written informed consent was waived by the Institutional Review Board.

Ethical approval

Institutional Review Board approval was obtained.

Methodology

• retrospective

• experimental

• performed at one institution

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

ESM 1

(DOCX 21 kb)


About this article


Cite this article

Ciritsis, A., Rossi, C., Eberhard, M. et al. Automatic classification of ultrasound breast lesions using a deep convolutional neural network mimicking human decision-making. Eur Radiol 29, 5458–5468 (2019). https://doi.org/10.1007/s00330-019-06118-7

