Affiliation:
1. Department of Biomedical Informatics, Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore
2. Singapore National Eye Center, Singapore Eye Research Institute, Singapore Health Service, Singapore, Singapore
3. StatNLP Research Group, Singapore University of Technology and Design, Singapore
4. University of Cambridge School of Clinical Medicine, Cambridge, UK
5. Centre for Quantitative Medicine, Duke‐NUS Medical School, Singapore, Singapore
6. Programme in Health Services and Systems Research, Duke‐NUS Medical School, Singapore, Singapore
Abstract
Recently, the emergence of ChatGPT, an artificial intelligence chatbot developed by OpenAI, has attracted significant attention due to its exceptional language comprehension and content generation capabilities, highlighting the immense potential of large language models (LLMs). LLMs have become a burgeoning research hotspot across many fields, including health care. Within health care, LLMs may be classified into LLMs for the biomedical domain and LLMs for the clinical domain, based on the corpora used for pre‐training. In the last 3 years, these domain‐specific LLMs have demonstrated exceptional performance on multiple natural language processing tasks, surpassing general LLMs as well. This not only emphasizes the significance of developing dedicated LLMs for specific domains, but also raises expectations for their applications in health care. We believe that LLMs may be used widely in preconsultation, diagnosis, and management, given appropriate development and supervision. Additionally, LLMs hold tremendous promise in assisting with medical education, medical writing, and other related applications. At the same time, health care systems must recognize and address the challenges posed by LLMs.