Study of multistep Dense U‐Net‐based automatic segmentation for head MRI scans

Authors:

Gi Yongha1, Oh Geon1, Jo Yunhui2, Lim Hyeongjin1, Ko Yousun1, Hong Jinyoung1, Lee Eunjun1, Park Sangmin1,3, Kwak Taemin1,3, Kim Sangcheol1,3, Yoon Myonggeun1,3

Affiliation:

1. Department of Bio-medical Engineering, Korea University, Seoul, Republic of Korea

2. Institute of Global Health Technology (IGHT), Korea University, Seoul, Republic of Korea

3. Field Cure Ltd., Seoul, Republic of Korea

Abstract

Background: Despite extensive efforts to obtain accurate segmentation of head magnetic resonance imaging (MRI) scans, the task remains challenging, primarily because of variations in intensity distribution that depend on the equipment and acquisition parameters used.

Purpose: The goal of this study is to evaluate the effectiveness of an automatic segmentation method for head MRI scans based on a multistep Dense U-Net (MDU-Net) architecture.

Methods: The MDU-Net-based method comprises two steps. The first step segments the scalp, skull, and whole brain from head MRI scans using a convolutional neural network (CNN). In this step, a hybrid network combines 2.5D Dense U-Net and 3D Dense U-Net structures: 2.5D Dense U-Nets produce logits in three orthogonal planes (axial, coronal, and sagittal), which are fused by averaging. The fused probability map, together with the head MRI scan, then serves as the input to a 3D Dense U-Net. In this process, different ratios of active contour loss to focal loss are applied. The second step segments the cerebrospinal fluid (CSF), white matter, and gray matter from the extracted brain MRI scans using CNNs. In this step, the histogram of the extracted brain MRI scans is standardized, and a 2.5D Dense U-Net trained with the focal loss then segments the brain's specific tissues. A dataset of 100 head MRI scans from the OASIS-3 dataset was used for training, internal validation, and testing, split in ratios of 80%, 10%, and 10%, respectively. Using the proposed approach, we segmented the head MRI scans into five areas (scalp, skull, CSF, white matter, and gray matter) and evaluated the segmentation results using the Dice similarity coefficient (DSC), Hausdorff distance (HD), and average symmetric surface distance (ASSD) as evaluation metrics. We compared these results with those obtained using the Res-U-Net, Dense U-Net, U-Net++, Swin-Unet, and H-Dense U-Net models.

Results: The MDU-Net model showed DSC values of 0.933, 0.830, 0.833, 0.953, and 0.917 for the scalp, skull, CSF, white matter, and gray matter, respectively. The corresponding HD values were 2.37, 2.89, 2.13, 1.52, and 1.53 mm, and the ASSD values were 0.50, 1.63, 1.28, 0.26, and 0.27 mm, respectively. Compared with the other models, the MDU-Net model achieved the best DSC values for the scalp, CSF, white matter, and gray matter. Compared with the H-Dense U-Net model, which performed best among the other models, the MDU-Net model showed substantial improvements in HD, particularly in the gray matter region, where the difference was approximately 9%. In terms of ASSD, the MDU-Net model also outperformed the H-Dense U-Net model, with improvements of approximately 7% in the white matter and approximately 9% in the gray matter.

Conclusions: Compared with existing models in terms of DSC, HD, and ASSD, the proposed MDU-Net model demonstrated the best performance on average and showed its potential to enhance the accuracy of automatic segmentation of head MRI scans.
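For readers who want a concrete picture of the first-step fusion described in the Methods, the sketch below averages the softmax probabilities from three plane-specific 2.5D Dense U-Nets and feeds the fused map, concatenated with the raw scan, into a 3D network. This is a minimal PyTorch sketch under assumed interfaces: `net_axial`, `net_coronal`, `net_sagittal`, and `net_3d` are hypothetical placeholder modules, and a true 2.5D input would stack adjacent slices as channels rather than use single slices. It is not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def fuse_and_segment(volume, net_axial, net_coronal, net_sagittal, net_3d):
    """Hypothetical sketch of the 2.5D -> 3D fusion step.

    volume: tensor of shape (1, 1, D, H, W) holding one head MRI scan.
    Each 2.5D net is applied slice-wise along its plane (a real 2.5D input
    would stack adjacent slices as channels; omitted here for brevity).
    """
    plane_probs = []
    for net, axis in ((net_axial, 2), (net_coronal, 3), (net_sagittal, 4)):
        slices = volume.movedim(axis, 2)                     # (1, 1, S, A, B)
        logits = torch.stack(
            [net(slices[:, :, i]) for i in range(slices.shape[2])], dim=2
        )                                                    # (1, C, S, A, B)
        plane_probs.append(F.softmax(logits, dim=1).movedim(2, axis))
    fused = torch.stack(plane_probs).mean(dim=0)             # average the three planes
    # The fused probability map, concatenated with the raw scan, feeds the 3D net.
    return net_3d(torch.cat([volume, fused], dim=1))
```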
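The Methods also mention applying different ratios of active contour loss to focal loss in the first step. Below is a minimal, hedged sketch of one such weighted combination, assuming a Chan-Vese-style active contour term (contour length plus region terms) and a standard multi-class focal loss; the weight `ac_weight` and the exact form of the active contour term are illustrative assumptions, not the paper's reported settings.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, target, gamma=2.0):
    """Multi-class focal loss; `target` holds integer class labels."""
    log_p = F.log_softmax(logits, dim=1)
    log_pt = log_p.gather(1, target.unsqueeze(1)).squeeze(1)  # log prob of the true class
    pt = log_pt.exp()
    return (-(1.0 - pt) ** gamma * log_pt).mean()

def active_contour_loss(probs, target_onehot):
    """Chan-Vese-style loss on 3D probability maps: contour length + region terms."""
    # Length term: total variation of the predicted probabilities along each axis.
    dz = torch.abs(probs[:, :, 1:, :, :] - probs[:, :, :-1, :, :])
    dy = torch.abs(probs[:, :, :, 1:, :] - probs[:, :, :, :-1, :])
    dx = torch.abs(probs[:, :, :, :, 1:] - probs[:, :, :, :, :-1])
    length = dz.mean() + dy.mean() + dx.mean()
    # Region terms: probability mass placed outside / missing inside the target mask.
    region_in = (probs * (1.0 - target_onehot)).mean()
    region_out = ((1.0 - probs) * target_onehot).mean()
    return length + region_in + region_out

def combined_loss(logits, target, n_classes, ac_weight=0.5):
    """Illustrative weighted sum; the ratio actually used in the paper is not given here."""
    probs = F.softmax(logits, dim=1)
    onehot = F.one_hot(target, n_classes).movedim(-1, 1).float()
    return ac_weight * active_contour_loss(probs, onehot) \
        + (1.0 - ac_weight) * focal_loss(logits, target)
```

Since the two terms operate on different representations (softmax probabilities versus raw logits), the softmax is computed once in `combined_loss` and reused for the active contour term.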

Funder

Ministry of Science and ICT, South Korea

Korea Medical Device Development Fund

Publisher

Wiley

Subject

General Medicine
