Rofo 2023; 195(09): 797-803
DOI: 10.1055/a-2076-6736
Review

Artificial intelligence in radiology – beyond the black box

Article in several languages: English | German
Author Affiliations
1   Division of Experimental Radiology, Department for Diagnostic and Interventional Radiology, University Ulm Medical Centre, Ulm, Germany
2   Visual Computing, University of Ulm, Germany
3   Medical Image Computing, DKFZ, Heidelberg, Germany
Supported by: University of Ulm Baustein (L.SBN.0214)

Abstract

Background Artificial intelligence is playing an increasingly important role in radiology. However, particularly with new and powerful methods from the field of deep learning, it is often no longer possible to reconstruct how decisions are made. The resulting models fulfill their function without users being able to understand the internal processes and are therefore used as so-called black boxes. Especially in sensitive areas such as medicine, the explainability of decisions is of paramount importance in order to verify their correctness and to be able to evaluate alternatives. For this reason, elucidating these black boxes is an active area of research.

Method This review paper presents different approaches to explainable artificial intelligence together with their advantages and disadvantages. Examples are used to illustrate the introduced methods. This study is intended to enable readers to better assess the limitations of the corresponding explanations when encountering them in practice and to strengthen the integration of such solutions in new research projects.

Results and Conclusion Besides methods for analyzing black-box models with regard to explainability, interpretable models offer an interesting alternative. Here, explainability is part of the design process, and the learned model knowledge can be verified against expert knowledge.
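One widely used family of methods for analyzing black-box models, surveyed in reviews of this kind, is occlusion sensitivity: parts of the input are masked and the drop in the model's output score is recorded as a saliency map. The sketch below is purely illustrative; the "model" is a hypothetical stand-in for a trained network, and the data and patch size are invented for the example.

```python
# Occlusion-based saliency sketch (pure Python, toy data).
# A black-box model is probed by zeroing out one patch of the input
# at a time and measuring how much the output score drops.

def toy_model(image):
    """Hypothetical black-box classifier: scores the brightness of the
    top-left quadrant (stands in for a trained network)."""
    h, w = len(image), len(image[0])
    return sum(image[y][x] for y in range(h // 2) for x in range(w // 2))

def occlusion_saliency(model, image, patch=2):
    """Mask each patch with zeros and record the resulting score drop."""
    h, w = len(image), len(image[0])
    base = model(image)
    saliency = [[0.0] * w for _ in range(h)]
    for y0 in range(0, h, patch):
        for x0 in range(0, w, patch):
            occluded = [row[:] for row in image]  # copy, then mask one patch
            for y in range(y0, min(y0 + patch, h)):
                for x in range(x0, min(x0 + patch, w)):
                    occluded[y][x] = 0.0
            drop = base - model(occluded)
            for y in range(y0, min(y0 + patch, h)):
                for x in range(x0, min(x0 + patch, w)):
                    saliency[y][x] = drop
    return saliency

image = [[1.0] * 4 for _ in range(4)]
sal = occlusion_saliency(toy_model, image)
# Patches in the top-left quadrant cause the largest score drops,
# so they receive the highest saliency values.
```

The resulting map only shows *where* the model looks, not *why* it decides as it does, which is one of the limitations of such post-hoc explanations discussed in the review.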

Key Points:

  • The use of artificial intelligence in radiology offers many possibilities for providing safer and more efficient medical care. This includes, but is not limited to, support during image acquisition and processing as well as for diagnosis.

  • Complex models can achieve high accuracy, but their internal data processing is difficult to understand.

  • If explainability is already taken into account during the planning of the model, methods can be developed that are both powerful and interpretable.
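The idea of a model that is interpretable by design can be illustrated with one of the simplest examples, a logistic regression whose learned weights can be read directly and checked against expert knowledge. The features, data, and training setup below are hypothetical and serve only to show the principle:

```python
# Minimal sketch of an interpretable-by-design model: logistic regression
# trained by stochastic gradient descent (pure Python, invented toy data).
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features per case: [lesion_size, patient_age_scaled]
# Label: 1 = malignant, 0 = benign (toy values, not real data).
X = [[2.0, 0.5], [1.8, 0.4], [0.2, 0.6], [0.1, 0.5]]
y = [1, 1, 0, 0]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(500):
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
        err = p - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

# The learned weights are the explanation: here the weight on
# lesion_size dominates, so the model's reasoning is directly visible.
```

In contrast to post-hoc explanations of a black box, the model knowledge here (the weights) can be inspected and verified against domain expertise, at the cost of a more restricted model class.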

Citation Format

  • Gallée L, Kniesel H, Ropinski T et al. Artificial intelligence in radiology – beyond the black box. Fortschr Röntgenstr 2023; 195: 797 – 803



Publication History

Received: 22 December 2022

Accepted: 22 March 2023

Article published online: 09 May 2023

© 2023. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

 