Data-driven head motion correction for PET using time-of-flight and positron emission particle tracking techniques

Authors:

Tumpa Tasmia Rahman, Acuff Shelley N., Gregor Jens, Bradley Yong, Fu Yitong, Osborne Dustin R.

Abstract

Objectives: Positron emission tomography (PET) is susceptible to patient movement during a scan. Head motion is a continuing problem for brain PET imaging and diagnostic assessment. Physical head restraints and external motion tracking systems are most commonly used to address this issue. Data-driven methods offer substantial advantages, such as retroactive processing, but typically require manual interaction for robustness. In this work, we introduce a time-of-flight (TOF) weighted positron emission particle tracking (PEPT) algorithm that enables fully automated, data-driven head motion detection and subsequent automated correction of the raw listmode data.

Materials and methods: We used our previously published TOF-PEPT algorithm (Dustin Osborne et al., 2017; Tasmia Rahman Tumpa et al.; Tasmia Rahman Tumpa et al., 2021) to automatically identify frames during which the patient was near-motionless. The first such static frame was used as a reference to which subsequent static frames were registered. The underlying rigid transformations were estimated using weak radioactive point sources placed on radiolucent glasses worn by the patient. Correction of the raw event data was achieved by tracking the point sources in the listmode data, whose events were then repositioned to allow reconstruction of a single image. To create a “gold standard” for comparison, a frame-by-frame image-registration-based correction was implemented: the original listmode data were used to reconstruct an image for each static frame detected by our algorithm, and manual landmark registration with external software was then applied to merge these images into one.

Results: We report on five patient studies. The TOF-PEPT algorithm was configured to detect motion using a 500 ms window. Our event-based correction produced images that were visually free of motion artifacts, and comparison with the frame-based image registration approach yielded results that were nearly indistinguishable. Quantitatively, Jaccard similarity indices were in the range of 85-98% for the former and 84-98% for the latter when comparing the static-frame images with their reference-frame counterparts.

Discussion: We have presented a fully automated, data-driven method for motion detection and correction of raw listmode data. Easy to implement, the approach achieved high temporal resolution and reliable performance for head motion correction. Our methodology provides a mechanism by which patient motion incurred during imaging can be assessed and corrected post hoc.
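The rigid-alignment and overlap computations mentioned in the abstract are standard, so a short sketch may help make them concrete. The Python/NumPy code below estimates a rigid transformation from corresponding point-source (fiducial) positions using the Kabsch/SVD method and computes a Jaccard similarity index for two segmentation masks. This is a minimal sketch under stated assumptions: the function names and synthetic marker coordinates are illustrative, and it does not reproduce the authors' TOF-PEPT tracker or their listmode event repositioning.

```python
# Minimal sketch (not the authors' code): rigid-motion estimation from
# tracked point sources via the Kabsch/SVD method, plus a Jaccard index.
# Assumes 3-D marker positions per static frame are already available,
# e.g. from a TOF-PEPT-style tracker.

import numpy as np


def estimate_rigid_transform(ref_points, frame_points):
    """Least-squares rotation R and translation t mapping frame_points
    onto ref_points, i.e. ref ~= frame_points @ R.T + t (Kabsch method).

    Both inputs are (N, 3) arrays of corresponding marker positions, N >= 3.
    """
    ref_c = ref_points.mean(axis=0)
    frame_c = frame_points.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (frame_points - frame_c).T @ (ref_points - ref_c)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection (det = -1) in the optimal rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ref_c - R @ frame_c
    return R, t


def jaccard_index(mask_a, mask_b):
    """Jaccard similarity |A & B| / |A | B| of two boolean masks."""
    union = np.logical_or(mask_a, mask_b).sum()
    if union == 0:
        return 1.0  # Two empty masks are trivially identical.
    return np.logical_and(mask_a, mask_b).sum() / union


if __name__ == "__main__":
    # Synthetic check: three fiducial markers, known 5-degree rotation
    # about the scanner axis plus a small translation (units: mm).
    rng = np.random.default_rng(0)
    ref = rng.uniform(-100.0, 100.0, size=(3, 3))
    theta = np.deg2rad(5.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([2.0, -1.5, 0.5])
    moved = ref @ R_true.T + t_true

    # Recover the transform that maps the moved frame back onto the reference.
    R, t = estimate_rigid_transform(ref, moved)
    print(np.allclose(moved @ R.T + t, ref))  # expected: True
```

In practice such a transform would be estimated once per detected static frame against the reference frame, which is consistent with the frame-wise correction the abstract describes.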

Publisher

Public Library of Science (PLoS)

Subject

Multidisciplinary

References (37 articles; first 5 shown)

1. Andre Z. Kyme. Motion estimation and correction in SPECT, PET and CT. Physics in Medicine & Biology, 2021.

2. Michael V. Green. Head movement in normal subjects during simulated PET brain imaging with and without head restraint. Journal of Nuclear Medicine, 1994.

3. Thomas Beyer. On the use of positioning aids to reduce misregistration in the head and neck in whole-body PET/CT studies. Journal of Nuclear Medicine, 2005.

4. Perla Werner. Physical restraints and agitation in nursing home residents. Journal of the American Geriatrics Society, 1989.

5. https://www.who.int/news-room/fact-sheets/detail/dementia (web page).

Cited by 1 article.

1. An Introduction to Particle Tracking Techniques with Applications in Biomedical Research. In: Microscopy Techniques for Biomedical Education and Healthcare Practice, 2023.
