Comprehensive review of Transformer‐based models in neuroscience, neurology, and psychiatry

Authors:

Cong Shan 1,2, Wang Hang 1, Zhou Yang 3, Wang Zheng 4,5,6,7,8, Yao Xiaohui 1,2, Yang Chunsheng 9,10

Affiliations:

1. Qingdao Innovation and Development Center, Harbin Engineering University, Qingdao, Shandong, China

2. College of Intelligent Science Systems and Engineering, Harbin Engineering University, Harbin, Heilongjiang, China

3. Department of Radiology, Harbin Medical University Cancer Hospital, Harbin, Heilongjiang, China

4. School of Psychological and Cognitive Sciences, Peking University, Beijing, China

5. Beijing Key Laboratory of Behavior and Mental Health, Peking University, Beijing, China

6. IDG/McGovern Institute for Brain Research, Peking University, Beijing, China

7. Peking‐Tsinghua Center for Life Sciences, Peking University, Beijing, China

8. School of Biomedical Engineering, Hainan University, Haikou, Hainan, China

9. Institute of Artificial Intelligence, Guangzhou University, Guangzhou, Guangdong, China

10. Digital Technology Research Center, National Research Council of Canada, Ottawa, Ontario, Canada

Abstract

This comprehensive review aims to clarify the growing impact of Transformer‐based models in the fields of neuroscience, neurology, and psychiatry. Originally developed as a solution for analyzing sequential data, the Transformer architecture has evolved to effectively capture complex spatiotemporal relationships and long‐range dependencies that are common in biomedical data. Its adaptability and effectiveness in deciphering intricate patterns within medical studies have established it as a key tool in advancing our understanding of neural functions and disorders, representing a significant departure from traditional computational methods. The review begins by introducing the structure and principles of Transformer architectures. It then explores their applicability, ranging from disease diagnosis and prognosis to the evaluation of cognitive processes and neural decoding. The specific design modifications tailored for these applications and their subsequent impact on performance are also discussed. We conclude by providing a comprehensive assessment of recent advancements, prevailing challenges, and future directions, highlighting the shift in neuroscientific research and clinical practice towards an artificial intelligence‐centric paradigm, particularly given the prominence of Transformer architecture in the most successful large pre‐trained models. This review serves as an informative reference for researchers, clinicians, and professionals who are interested in understanding and harnessing the transformative potential of Transformer‐based models in neuroscience, neurology, and psychiatry.
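As background for the architecture discussed above, the following is a minimal, illustrative sketch of scaled dot-product attention, the core Transformer operation that lets every position in a sequence attend to every other position and thereby capture long-range dependencies. It is not taken from the review itself; the function name, array shapes, and NumPy implementation are assumptions made purely for illustration.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: arrays of shape (sequence_length, model_dim); illustrative only.
    d_k = K.shape[-1]
    # Similarity of every query with every key, scaled for numerical stability.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key axis converts scores into attention weights per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mixture of all value vectors, which is how
    # dependencies between distant positions are modeled in a single step.
    return weights @ V

# Toy self-attention over a sequence of 5 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (5, 8)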

Publisher

Wiley

Cited by 1 article.
