Article
DOI: 10.1007/978-3-031-47665-5_18

Cross-Domain Few-Shot Sparse-Quantization Aware Learning for Lymphoblast Detection in Blood Smear Images

Published: 05 November 2023

Abstract

Deep learning for medical image classification has attracted increasing attention. However, a bottleneck that prevents its widespread adoption is its dependency on very large annotated datasets, a condition that cannot always be satisfied. Few-shot learning in the medical domain is still in its infancy but has the potential to overcome this challenge, while model compression allows deployment on resource-constrained machines. To tackle the challenges imposed by limited data and limited computational resources, we present a few-shot sparse-quantization aware meta-training framework (FS-SQAM). The proposed framework exploits sparsity and quantization for improved adaptability in a low-resource cross-domain setting for the classification of acute lymphoblastic leukemia (ALL) in blood cell images. Combining these strategies enables us to address two of the most common problems that deep learning for medical images encounters: the need for extremely large datasets and for high computational resources. Extensive experiments evaluate the proposed framework on the ALL-IDB2 dataset in a cross-domain few-shot setting. The demonstrated gains in accuracy and compression confirm the suitability of meta-learning for resource-constrained devices. Future advancements in efficient deep-learning computer-aided diagnosis systems will facilitate their adoption in clinical medicine.
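The abstract does not describe the framework's implementation, so the following is an illustrative sketch only: it combines the two compression ingredients named above (magnitude-based sparsity and uniform quantization, here simulated on a weight matrix) with prototypical-style few-shot classification, a common choice in few-shot learning. All function names are hypothetical and not taken from the paper.

```python
import numpy as np

def prune(weights, sparsity):
    """Magnitude pruning: zero out the fraction `sparsity` of smallest-|w| entries."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    thresh = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    return np.where(np.abs(weights) <= thresh, 0.0, weights)

def quantize(weights, bits=8):
    """Simulated uniform symmetric quantization to `bits` (round-trip to float)."""
    scale = np.max(np.abs(weights)) / (2 ** (bits - 1) - 1)
    if scale == 0:
        return weights.copy()
    return np.round(weights / scale) * scale

def prototypes(embeddings, labels):
    """Per-class mean embeddings, as in prototypical networks."""
    classes = np.unique(labels)
    protos = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def classify(queries, protos):
    """Assign each query embedding to its nearest prototype (Euclidean)."""
    d = np.linalg.norm(queries[:, None, :] - protos[None, :, :], axis=-1)
    return np.argmin(d, axis=1)
```

In a meta-training loop, `prune` and `quantize` would be applied to the embedding network's weights inside each episode, so that the few-shot learner adapts while already aware of the compression it will face at deployment.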


Published In

Pattern Recognition: 7th Asian Conference, ACPR 2023, Kitakyushu, Japan, November 5–8, 2023, Proceedings, Part III
Nov 2023
414 pages
ISBN:978-3-031-47664-8
DOI:10.1007/978-3-031-47665-5
Editors: Huimin Lu, Michael Blumenstein, Sung-Bae Cho, Cheng-Lin Liu, Yasushi Yagi, Tohru Kamiya

Publisher

Springer-Verlag

Berlin, Heidelberg

Author Tags

  1. Few-Shot Learning
  2. Medical Image Analysis
  3. Compression
