Comparison between R2′‐based and R2*‐based χ‐separation methods: A clinical evaluation in individuals with multiple sclerosis

Authors:

Sooyeon Ji1, Jinhee Jang2, Minjun Kim1, Hyebin Lee2, Woojun Kim3, Jongho Lee1, Hyeong-Geol Shin4,5

Affiliations:

1. Department of Electrical and Computer Engineering, Seoul National University, Seoul, South Korea

2. Department of Radiology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, South Korea

3. Department of Neurology, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, South Korea

4. Department of Radiology, School of Medicine, Johns Hopkins University, Baltimore, Maryland, USA

5. F.M. Kirby Research Center for Functional Brain Imaging, Kennedy Krieger Institute, Baltimore, Maryland, USA

Abstract

Susceptibility source separation, or χ‐separation, estimates diamagnetic (χdia) and paramagnetic (χpara) susceptibility signals in the brain using local field and R2′ (= R2* − R2) maps. Recently proposed R2*‐based χ‐separation methods require only multi‐echo gradient echo (ME‐GRE) data, eliminating the additional acquisition needed for R2 mapping. Although this approach reduces scan time and enhances clinical utility, the impact of the missing R2 information has not been fully characterized. In this study, we evaluate the viability of two previously proposed R2*‐based χ‐separation methods as alternatives to their R2′‐based counterparts: model‐based R2*‐χ‐separation versus χ‐separation, and deep learning‐based χ‐sepnet‐R2* versus χ‐sepnet‐R2′. Their performance is assessed in individuals with multiple sclerosis (MS) through qualitative visual assessments by experienced neuroradiologists and quantitative analyses, including region‐of‐interest and linear regression analyses. Qualitatively, R2*‐χ‐separation tends to report higher χpara and χdia values than χ‐separation, leading to less distinct lesion contrasts, whereas χ‐sepnet‐R2* closely aligns with χ‐sepnet‐R2′. Quantitative analysis reveals a robust correlation between both R2*‐based methods and their R2′‐based counterparts (r ≥ 0.88). Specifically, in whole‐brain voxels, χ‐sepnet‐R2* exhibits higher correlation and better linearity than R2*‐χ‐separation (χdia/χpara from R2*‐χ‐separation: r = 0.88/0.90, slope = 0.79/0.86; χdia/χpara from χ‐sepnet‐R2*: r = 0.90/0.92, slope = 0.99/0.97). In MS lesions, the two R2*‐based methods display comparable correlation and linearity (χdia/χpara from R2*‐χ‐separation: r = 0.90/0.91, slope = 0.98/0.91; χdia/χpara from χ‐sepnet‐R2*: r = 0.88/0.88, slope = 0.91/0.95). Notably, χ‐sepnet‐R2* demonstrates negligible offsets, whereas R2*‐χ‐separation exhibits relatively large offsets (0.02 ppm in the whole brain and 0.01 ppm in MS lesions), potentially indicating the false presence of myelin or iron in MS lesions. Overall, both R2*‐based χ‐separation methods are viable alternatives to their R2′‐based counterparts; χ‐sepnet‐R2* aligned more closely with its R2′‐based counterpart, showing minimal susceptibility offsets, whereas R2*‐χ‐separation reported higher χpara and χdia values than R2′‐based χ‐separation.
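To illustrate the data requirement the abstract contrasts: R2′‐based χ‐separation needs an R2′ map, computed voxelwise as R2′ = R2* − R2 (so R2 must be measured separately, e.g., with a spin‐echo sequence), whereas R2*‐based methods use the ME‐GRE‐derived R2* alone. A minimal sketch of the R2′ computation, assuming small synthetic NumPy arrays with hypothetical relaxation‐rate values in place of fitted maps:

```python
import numpy as np

# Hypothetical voxelwise relaxation-rate maps (s^-1); real maps would be
# fitted from multi-echo GRE data (R2*) and a separate R2 acquisition.
r2_star = np.array([[30.0, 45.0],
                    [25.0, 60.0]])  # R2* (irreversible + reversible decay)
r2 = np.array([[15.0, 20.0],
               [12.0, 22.0]])       # R2 (irreversible decay only)

# R2' = R2* - R2: the reversible, susceptibility-driven component used by
# R2'-based chi-separation. Negative values can arise from noise or
# imperfect fitting, so clip them to zero.
r2_prime = np.clip(r2_star - r2, 0.0, None)

print(r2_prime)
```

R2*‐based methods skip this subtraction entirely, which is what removes the extra scan but also discards the R2 information whose impact the study evaluates.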

Funders

Institute of Engineering Research, Seoul National University

National Research Foundation of Korea

Ministry of Science and ICT, South Korea


Publisher

Wiley

Cited by 1 article.
