Understanding and Addressing AI Hallucinations in Healthcare and Life Sciences
Aditya Gadiko
International Journal of Health Sciences, 2024, vol. 7, issue 3, 1 - 11
Abstract:
Purpose: This paper investigates the phenomenon of "AI hallucinations" in healthcare and life sciences, where large language models (LLMs) produce outputs that, while coherent, are factually incorrect, irrelevant, or misleading. Understanding and mitigating such errors is critical given the high stakes of accurate and reliable information in these domains. We classify hallucinations into three types (input-conflicting, context-conflicting, and fact-conflicting) and examine their implications through real-world cases.
Methodology: Our methodology combines the Fact Score, Med-HALT, and adversarial testing to evaluate the fidelity of AI outputs. We propose several mitigation strategies, including Retrieval-Augmented Generation (RAG), Chain-of-Verification (CoVe), and Human-in-the-Loop (HITL) systems, to enhance model reliability.
Findings: As artificial intelligence continues to permeate many sectors of society, hallucinations in AI-generated text pose significant challenges, especially in contexts where precision and reliability are paramount. This paper delineates the types of hallucinations commonly observed in AI systems (input-conflicting, context-conflicting, and fact-conflicting) and highlights their potential to undermine trust and efficacy in critical domains such as healthcare and legal proceedings.
Unique contribution to theory, policy and practice: This study's unique contribution lies in its comprehensive analysis of the types and impacts of AI hallucinations, together with the development of robust controls that advance theoretical understanding, practical application, and policy formulation in AI deployment. These efforts aim to foster safer, more effective AI integration across the healthcare and life sciences sectors.
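To make the Retrieval-Augmented Generation (RAG) mitigation named in the abstract concrete, the sketch below shows one minimal way such a guardrail could be wired up: retrieve curated passages, constrain the model to answer only from them, and require citations so fact-conflicting outputs are easier to audit. This is an illustrative assumption, not the paper's implementation; all identifiers (SOURCE_DOCS, retrieve_passages, call_llm, answer_with_citations) and the sample guideline text are hypothetical placeholders.

```python
"""Minimal RAG-style grounding sketch (hypothetical; not the paper's code)."""
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Passage:
    doc_id: str
    text: str


# Hypothetical curated corpus (e.g., clinical guidelines) standing in for a real retrieval index.
SOURCE_DOCS = [
    Passage("guideline-001", "Metformin is a first-line therapy for type 2 diabetes."),
    Passage("guideline-002", "Adult acetaminophen dosing should not exceed 4 g per day."),
]


def retrieve_passages(query: str, k: int = 2) -> list[Passage]:
    """Toy keyword-overlap retrieval; a production system would use a vector index."""
    q_terms = set(query.lower().split())
    scored = sorted(
        SOURCE_DOCS,
        key=lambda p: len(q_terms & set(p.text.lower().split())),
        reverse=True,
    )
    return scored[:k]


def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g., a hosted model endpoint)."""
    return "[model answer constrained to the cited passages]"


def answer_with_citations(question: str) -> str:
    """Ground generation in retrieved passages and require passage IDs as citations."""
    passages = retrieve_passages(question)
    context = "\n".join(f"[{p.doc_id}] {p.text}" for p in passages)
    prompt = (
        "Answer ONLY from the passages below and cite their IDs. "
        "If the passages are insufficient, say so.\n\n"
        f"Passages:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return call_llm(prompt)


if __name__ == "__main__":
    print(answer_with_citations("What is the maximum daily dose of acetaminophen for adults?"))
```

In this sketch the grounding step is what targets fact-conflicting hallucinations: because the prompt restricts the model to cited passages, an answer without a supporting ID can be flagged for the Human-in-the-Loop review the abstract also proposes.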
Keywords: Hallucinations; Large Language Models; Artificial Intelligence; Healthcare; Life Sciences
Date: 2024
Downloads: https://carijournals.org/journals/index.php/IJHS/article/view/1862/2238 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:bhx:ojijhs:v:7:y:2024:i:3:p:1-11:id:1862