Open Horizon | Volume 48, P14-16, February 2023


Is Artificial Intelligence Replacing Our Radiology Stars? Not Yet!

  • Giovanni E. Cacciamani
    Correspondence: Corresponding author. Artificial Intelligence Center at USC Urology, Urology Institute, University of Southern California, Los Angeles, CA, USA. Tel. +1 626 4911531.
    Affiliations
    USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    USC/Norris Comprehensive Cancer Center, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer, USC Institute of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
  • Daniel I. Sanford
    Affiliations
    USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
  • Timothy N. Chu
    Affiliations
    USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA
  • Masatomo Kaneko
    Affiliations
    USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer, USC Institute of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Department of Urology, Kyoto Prefectural University of Medicine, Kyoto, Japan
  • Andre L. De Castro Abreu
    Affiliations
    USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    USC/Norris Comprehensive Cancer Center, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer, USC Institute of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
  • Vinay Duddalwar
    Affiliations
    USC/Norris Comprehensive Cancer Center, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    Department of Radiology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
  • Inderbir S. Gill
    Affiliations
    USC Institute of Urology and Catherine and Joseph Aresty Department of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA

    Artificial Intelligence Center at USC Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    USC/Norris Comprehensive Cancer Center, USC Institute of Urology, University of Southern California, Los Angeles, CA, USA

    Center for Image-Guided Surgery, Focal Therapy and Artificial Intelligence for Prostate Cancer, USC Institute of Urology, Keck School of Medicine, University of Southern California, Los Angeles, CA, USA
Open Access | Published: December 19, 2022 | DOI: https://doi.org/10.1016/j.euros.2022.09.024

      Abstract

      Artificial intelligence (AI) is here to stay and will change health care as we know it. The availability of big data and the increasing numbers of AI algorithms approved by the US Food and Drug Administration together will help in improving the quality of care for patients and in overcoming human fatigue barriers. In oncology practice, patients and providers rely on the interpretation of radiologists when making clinical decisions; however, there is considerable variability among readers, and in particular for prostate imaging. AI represents an emerging solution to this problem, for which it can provide a much-needed form of standardization. The diagnostic performance of AI alone in comparison to a combination of an AI framework and radiologist assessment for evaluation of prostate imaging has yet to be explored. Here, we compare the performance of radiologists alone versus a combination of radiologists aided by a modern computer-aided diagnosis (CAD) AI system. We show that the radiologist-CAD combination demonstrates superior sensitivity and specificity in comparison to both radiologists alone and AI alone. Our findings demonstrate that a radiologist + AI combination could perform best for detection of prostate cancer lesions. A hybrid technology-human system could leverage the benefits of AI in improving radiologist performance while also reducing physician workload, minimizing burnout, and enhancing the quality of patient care.

      Patient summary

      Our report demonstrates the potential of artificial intelligence (AI) for improving the interpretation of prostate scans. A combination of AI and evaluation by a radiologist has the best performance in determining the severity of prostate cancer. A hybrid system that uses both AI and radiologists could maximize the quality of care for patients while reducing physician workload and burnout.

      It is inevitable. Artificial intelligence (AI) will change health care as we know it. AI applications range from improved clinical decision-making to enhanced computer vision image analysis [1]. However, AI is not a “new kid on the block”. Alan Turing anticipated the idea in 1950 in his iconic paper Computing Machinery and Intelligence, in which he proposed the “imitation game” and wondered, “Can a machine think? … What will happen when a machine takes the part of a human in a process?” The availability of big data and the increasing number of AI algorithms approved by the US Food and Drug Administration together will help health care professionals overcome human fatigue barriers and receive support for recurring tasks in daily practice. With these enhancements, physician performance and the quality of patient care should improve.
      In oncology care, patients and providers rely on accurate interpretation of imaging studies during health care discussions and shared decision-making. In some cases, these critical imaging findings can be the difference between an invasive surgical intervention and watchful waiting. We rely on interpretation by radiologists; however, it is well documented that there is considerable variability among readers [2]. This discordance may be attributable to systematic differences in image acquisition methods or image quality, as well as radiologist-specific factors such as experience, internal bias, reader fatigue, and inconsistent reporting [2]. Despite improvements in imaging quality, discordance in interpretation has remained surprisingly consistent and exists across all domains of medicine [2,3].
      This discordance in interpretation has been consistently noted in the grading of Prostate Imaging-Reporting and Data System category 3 lesions and is often significant [4-6]. There are thus clear pitfalls in prostate magnetic resonance imaging (MRI), including inter-reader variability, difficulty in assessing benign “mimickers”, and imaging artifacts related to technical or physiological factors [7]. Awareness of these pitfalls is critical in prostate image assessment; however, there is a need to further optimize image analysis for accurate cancer detection.
      AI could represent an emerging solution to this problem. Taking advantage of computerized algorithms and high-throughput learning, AI can provide a much-needed form of standardization in image analysis. Proponents argue that AI can help in delineating subtle imaging findings via powerful image analyses that can be used to predict the odds of malignancy and anticipated tumor progression patterns with modern computer-aided diagnosis (CAD) systems. Historical CAD systems relied on data input by users without the ability to self-learn, whereas modern CAD systems incorporating AI can autonomously learn and adapt as new data are presented [8]. In collaboration with expert radiologists, AI algorithms are trained on large “ground truth” data sets and used to reduce the uncertainty in image interpretation [9]. In addition, AI has many potential applications for health systems that could reduce the staffing shortages and the burnout that health care workers currently face [10]. For example, implementation of automated image processing software that autonomously detects and flags suspicious lesions could expedite radiologist review and thus improve productivity, reduce workload, and enhance overall performance [10]. While these applications undoubtedly face ethical challenges and liability concerns, potential solutions are realizable and of practical importance. When applied to urology, successful implementation of AI systems can have an important benefit in reducing unnecessary prostate biopsies while maintaining high accuracy in ruling out disease [11]. Efforts to develop AI systems that serve as diagnostic support aids and can be seamlessly integrated into a radiologist’s workflow are necessary for successful AI application [11].
      A systematic review and meta-analysis spanning several medical specialties revealed that the diagnostic performance of deep learning models is equivalent to that of health care professionals, although it also noted that few studies presented externally validated results and that methodology was poorly reported, limiting interpretation of the diagnostic accuracy [12]. Despite abundant, well-cited literature on the use of AI in prostate imaging, only one study, which examined multiparametric MRI (mpMRI) scans of patients undergoing robot-assisted laparoscopic prostatectomy, was included because of the rigid inclusion criteria [12]. Given this limitation, further assessment of the diagnostic performance of AI frameworks in comparison to health care professionals for prostate imaging evaluation is warranted.
      In this context, we present preliminary findings from a project to address this deficiency in the available data for detecting prostate cancer with and without AI models. The pilot cumulative analysis includes five studies (six references, seven contingency tables; Supplementary material) that used mpMRI for prostate cancer detection and compares the performance of radiologists alone versus radiologists aided by an AI-based CAD system. Our analysis showed pooled sensitivity of 89.1% (95% confidence interval [CI] 80.6–94.2%) for the radiologist-CAD combination and 79.5% (95% CI 75.5–84.3%) for radiologists alone (Fig. 1). The pooled specificity was 78.1% (95% CI 64.8–87.4%) for the radiologist-CAD combination compared to 73.1% (95% CI 61.9–82.0%) for radiologists alone. A systematic review of AI system performance revealed average sensitivity of 84% and specificity of 61.5% [13]. Our pooled analysis for the radiologist-CAD combination therefore demonstrates superior performance in comparison to both radiologists alone and AI alone. The real game-changing move could thus be a “partnership” between AI systems and health care professionals rather than “replacement” of the latter.
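For readers less familiar with these metrics, the sketch below shows how per-study sensitivity and specificity are derived from a 2×2 contingency table. The tables here are hypothetical, purely for illustration, and are not the data from the studies pooled above; the pooled estimates and confidence intervals reported in this analysis come from a hierarchical bivariate model (the HSROC analysis in Fig. 1), not from this simple per-study calculation.

```python
# Hypothetical 2x2 contingency tables (TP, FP, FN, TN) -- illustrative only.
tables = [
    (85, 20, 15, 80),
    (90, 25, 10, 75),
    (78, 18, 22, 82),
]

def sensitivity(tp, fp, fn, tn):
    # Sensitivity (true-positive rate): proportion of cancers correctly detected.
    return tp / (tp + fn)

def specificity(tp, fp, fn, tn):
    # Specificity (true-negative rate): proportion of benign cases correctly ruled out.
    return tn / (tn + fp)

for tp, fp, fn, tn in tables:
    print(f"sens={sensitivity(tp, fp, fn, tn):.3f}  spec={specificity(tp, fp, fn, tn):.3f}")
```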
      Fig. 1. Hierarchical summary receiver operating characteristic (HSROC) curves for studies assessing the performance of (A) radiologists alone and (B) a combination of artificial intelligence algorithms and radiologists (5 studies, 6 contingency tables) in the detection of any prostate cancer using multiparametric magnetic resonance imaging. CI = confidence interval; PI-RADS = Prostate Imaging-Reporting and Data System.
      The current evidence demonstrates that a combination of radiologists and AI-based systems could perform best for the detection of prostate cancer lesions. Not surprisingly, we are still far from a totally automated diagnostic pathway: fully automated systems remain limited by the requirement for professional oversight, as well as by ethical considerations and liability concerns [14]. However, a hybrid technology-human system could leverage the benefits of AI in improving radiologist performance while also reducing physician workload, minimizing burnout, and enhancing the quality of patient care.
      Conflicts of interest: The authors have nothing to disclose.

      Appendix A. Supplementary data

      The following are the Supplementary data to this article:

      References

        1. Chen AB, Haque T, Roberts S, et al. Artificial intelligence applications in urology: reporting standards to achieve fluency for urologists. Urol Clin North Am. 2022;49:65-117.
        2. Schmid AM, Raunig DL, Miller CG, et al. Radiologists and clinical trials: part 1. The truth about reader disagreements. Ther Innov Regul Sci. 2021;55:1111-1121.
        3. Yoon SH, Kim YJ, Doh K, et al. Interobserver variability in Lung CT Screening Reporting and Data System categorisation in subsolid nodule-enriched lung cancer screening CTs. Eur Radiol. 2021;31:7184-7191.
        4. Weinreb JC, Barentsz JO, Choyke PL, et al. PI-RADS Prostate Imaging-Reporting and Data System: 2015, version 2. Eur Urol. 2016;69:16-40.
        5. Smith CP, Harmon SA, Barrett T, et al. Intra- and interreader reproducibility of PI-RADS v2: a multireader study. J Magn Reson Imaging. 2019;49:1694-1703.
        6. Rosenkrantz AB, Ginocchio LA, Cornfeld D, et al. Interobserver reproducibility of the PI-RADS version 2 lexicon: a multicenter study of six experienced prostate radiologists. Radiology. 2016;280:793-804.
        7. Panebianco V, Giganti F, Kitzing YX, et al. An update of pitfalls in prostate mpMRI: a practical approach through the lens of PI-RADS v2 guidelines. Insights Imaging. 2018;9:87-101.
        8. Oren O, Gersh BJ, Bhatt DL. Artificial intelligence in medical imaging: switching from radiographic pathological data to clinically meaningful endpoints. Lancet Digit Health. 2020;2:e486-e488.
        9. Chen PC, Mermel CH, Liu Y. Evaluation of artificial intelligence on a reference standard based on subjective interpretation. Lancet Digit Health. 2021;3:e693-e695.
        10. Hazarika I. Artificial intelligence: opportunities and implications for the health workforce. Int Health. 2020;12:241-245.
        11. Penzkofer T, Padhani AR, Turkbey B, et al. ESUR/ESUI position paper: developing artificial intelligence for precision diagnosis of prostate cancer using magnetic resonance imaging. Eur Radiol. 2021;31:9567-9578.
        12. Liu X, Faes L, Kale AU, et al. A comparison of deep learning performance against health-care professionals in detecting diseases from medical imaging: a systematic review and meta-analysis. Lancet Digit Health. 2019;1:e271-e297.
        13. Syer T, Mehta P, Antonelli M, et al. Artificial intelligence compared to radiologists for the initial diagnosis of prostate cancer on magnetic resonance imaging: a systematic review and recommendations for future studies. Cancers. 2021;13:3318.
        14. World Health Organization. Ethics and governance of artificial intelligence for health: WHO guidance. Geneva: WHO; 2021.
