CC BY-NC-ND 4.0 · Indian J Radiol Imaging 2024; 34(02): 371-372
DOI: 10.1055/s-0043-1777129
Letter to the Editor

ChatGPT and Radiology in the Future: Comment

1   Private Academic Consultant, Phonhong, Lao People's Democratic Republic
,
2   Chandigarh University, Chandigarh, India
Funding None.
 

We read the editorial by Botchu and Iyengar entitled “Will ChatGPT drive radiology in the future?”[1] with great interest. The clinical application of ChatGPT in radiology has the potential to transform the field. It could improve productivity, simplify the preparation of radiology reports, and allow speedier sharing of patient information, resulting in faster turnaround times, fewer reporting backlogs, and more timely decision-making for patient management. ChatGPT can also assist with image interpretation and pathology detection, augmenting radiologists' abilities with the support of artificial intelligence (AI).

ChatGPT can also be used to educate patients and to train future radiologists. It can generate interactive modules on various radiology techniques, enabling individualized instruction and remote access. Patient information tools can likewise be designed to be user friendly, with interactive modules that answer patient questions and promote patient satisfaction. The technology can also support audit evaluation and subsequent advancements in the field of radiology. However, as with any emerging technology, there are legitimate concerns about validity and about potential inaccuracies entering patient decision-making. To address these concerns, safeguards such as increased scrutiny, attention to cost, and seamless incorporation into current radiology workflows should be introduced. It is critical to recognize that technology and innovation are always advancing, and that with suitable safeguards in place, ChatGPT could deliver increased diagnostic certainty, faster turnaround of patient reports, and an improved work–life balance for radiologists.

Without sufficient human oversight and verification, a chatbot may provide fabricated references, which can cause further problems.[2] [3] Modern approaches and a broad, well-curated training set are needed to reduce chatbot bias and error, because relying on a single large data source carries its own hazards.[2] [3] Chatbot use also raises ethical questions, since some of its effects may be unpleasant or unexpected; as AI language models develop, ethical restrictions and limitations must be put in place to prevent the dissemination of harmful ideas and false information. ChatGPT will continue to improve, but users must remain wary of any phony information it generates, as evidenced by prior studies: a previous report clearly showed that inaccurate and fictitious references commonly result from ChatGPT use.[4] Hence, it is time to establish proper methods for the appropriate use of ChatGPT to avoid unwanted problems.



Conflict of Interest

None declared.

Author Contributions

H.D. contributed to the development of ideas and was responsible for writing and analysis of the manuscript. V.W. contributed to the development of ideas and supervised the study. Both authors approved the final version of the article.


  • References

  • 1 Botchu R, Iyengar KP. Will ChatGPT drive radiology in the future?. Indian J Radiol Imaging 2023; 33 (04) 436-437
  • 2 Kleebayoon A, Wiwanitkit V. Artificial intelligence, chatbots, plagiarism and basic honesty: comment. Cell Mol Bioeng 2023; 16 (02) 173-174
  • 3 Kleebayoon A, Wiwanitkit V. Comment on “How may ChatGPT impact medical teaching?”. Rev Assoc Med Bras (1992) 2023; 69 (08) e20230593
  • 4 Ariyaratne S, Iyengar KP, Nischal N, Chitti Babu N, Botchu R. A comparison of ChatGPT-generated articles with human-written articles. Skeletal Radiol 2023; 52 (09) 1755-1758

Address for correspondence

Hinpetch Daungsupawong, PhD
Private Academic Consultant
Phonhong
Lao People's Democratic Republic   
Viroj Wiwanitkit
Chandigarh University
Chandigarh
India   

Publication History

Article published online:
04 December 2023

© 2023. Indian Radiological Association. This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Thieme Medical and Scientific Publishers Pvt. Ltd.
A-12, 2nd Floor, Sector 2, Noida-201301 UP, India
