Deriving biomarkers from computed tomography using deep learning
Abstract
X-ray computed tomography (CT) and magnetic resonance imaging (MRI) are widely used structural neuroimaging modalities. For brain atrophy assessment and automated volumetric quantification, MRI is the preferred modality because of its superior soft-tissue contrast. In neurodegenerative disease diagnostics, CT is generally used in primary care centres to visually assess brain integrity and to exclude potential causes of cognitive impairment in the general assessment of neurodegeneration. Compared with MRI, CT is a fast, affordable, and widely available neuroimaging modality. Currently, only semi-quantitative visual rating methods are established for atrophy assessment on CT, and these are subjective, time-consuming, and dependent on a trained expert. Automated methods to quantify brain volumes from CT, however, remain underexplored. The purpose of this thesis was thus to develop automated methods to identify CT-based imaging markers and to assess their potential as a diagnostic aid for neurodegenerative diseases. Studies I and II evaluated the use of deep learning models trained on MR-derived labels to segment brain tissue classes from head CT images. Studies III and IV explored the clinical applications of the resulting deep-learning-derived CT-based measures.
In Study I, two-dimensional (2D) U-Net-based deep learning models were successfully trained on CT images with labels generated by segmenting paired MR images, in order to automatically segment grey matter, white matter, and cerebrospinal fluid and to estimate total brain and intracranial volumes. High spatial overlap scores and high volumetric correlations were observed between the deep-learning-derived CT-based segmentations and the MR-derived maps for all tissue classes, indicating that model-derived tissue volumes were highly comparable to MR-derived volumes.
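The spatial overlap scores mentioned above are conventionally computed as Dice similarity coefficients between the two binary segmentation masks. The following is a minimal illustrative sketch of that metric in plain Python; the function and variable names are hypothetical and not taken from the thesis code.

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity: 2*|A intersect B| / (|A| + |B|) for two binary masks.

    mask_a, mask_b: equal-length flat sequences of 0/1 voxel labels.
    """
    intersection = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    if total == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return 2.0 * intersection / total

# Toy example: CT-derived vs MR-derived grey-matter labels for 8 voxels
ct_mask = [1, 1, 1, 0, 0, 1, 0, 0]
mr_mask = [1, 1, 0, 0, 0, 1, 1, 0]
print(dice_coefficient(ct_mask, mr_mask))  # → 0.75
```

A Dice value of 1.0 indicates identical masks and 0.0 indicates no overlap; real segmentation pipelines compute the same quantity over full 3D voxel arrays, typically with NumPy rather than Python lists.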
In Study II, patch-based three-dimensional (3D) segmentation networks for CT brain tissue classification were developed, and the segmentation performance of the 2D and 3D networks was compared to establish which approach is better suited to anisotropic CT brain tissue classification. Our study demonstrated that slice-wise-processed 2D U-Nets perform better than patch-based 3D U-Nets in this setting.
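The anisotropy at issue here comes from clinical CT acquisition geometry: in-plane resolution is much finer than the spacing between slices, so a patch that is cubic in voxels is highly elongated in physical space. A small sketch with illustrative voxel spacings (the numbers are hypothetical, not from the thesis cohort) makes the mismatch concrete:

```python
# Hypothetical voxel spacing for a clinical head CT, in millimetres.
in_plane = 0.5   # x/y spacing within a slice
thickness = 5.0  # z spacing between slices

anisotropy_ratio = thickness / in_plane
print(anisotropy_ratio)  # → 10.0

# A 32-voxel "cubic" 3D patch covers very different physical extents
# in-plane versus across slices:
patch_voxels = 32
extent_xy = patch_voxels * in_plane   # 16 mm within the slice plane
extent_z = patch_voxels * thickness   # 160 mm across slices
print(extent_xy, extent_z)  # → 16.0 160.0
```

With a tenfold spacing mismatch, cubic 3D patches mix fine in-plane detail with very coarse through-plane sampling, which is one intuition for why slice-wise 2D processing can be the better fit for such data.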
In Study III, we showed that CT-based atrophy measures can be employed to differentiate between patients with Alzheimer’s disease, prodromal Alzheimer’s disease, vascular dementia, prodromal vascular dementia, and healthy controls with accuracy levels comparable to MR-derived atrophy measures. CT-based atrophy measures correlated strongly with relevant MR-based and biochemical markers of neurodegeneration as well as with cognitive impairment. This study indicated that CT-based atrophy measures have great potential to offer diagnostic support in the first-line assessment of neurodegenerative diseases.
In Study IV, ventricular cerebrospinal fluid (VCSF) volumes were derived from CT images using U-Net models trained, via a transfer-learning approach, on both MR-derived labels and labels manually derived from CT scans. Further, CT-based volumetry was evaluated for assessing ventricle volume change after shunt surgery in idiopathic normal pressure hydrocephalus (iNPH). We demonstrated that CT-based volumetric measures could distinguish iNPH patients from cognitively normal individuals with high accuracy, and that the presence of a shunt had little to no effect on model performance. Strong volumetric correlations were observed between automatically and manually derived CT-VCSF maps, indicating a strong potential for automated CT-derived VCSF volumetry in the clinical assessment of iNPH, with performance comparable to visual assessments. Together, our findings describe novel automated methods for CT brain image segmentation that make quantitative assessment of neurodegenerative change accessible to many more patients, as CT is far more accessible, cheaper, and faster than MRI.
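Volumetric agreement of the kind reported in Study IV is commonly summarised with Pearson's correlation coefficient between automated and manual volume estimates. Below is a minimal stdlib-Python sketch; the example volumes are fabricated purely for illustration and do not come from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Fabricated example volumes (mL): automated vs manually derived VCSF
# for five hypothetical scans.
auto_vcsf = [55.2, 48.1, 102.4, 63.0, 89.5]
manual_vcsf = [54.0, 50.3, 99.8, 65.1, 91.2]
print(round(pearson_r(auto_vcsf, manual_vcsf), 3))
```

An r close to 1.0 indicates that the automated method ranks and scales volumes consistently with the manual reference; in practice this is usually complemented by Bland-Altman analysis to detect systematic bias.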
Parts of work
I. Srikrishna, M., Pereira, J. B., Heckemann, R. A., Volpe, G., van Westen, D., Zettergren, A., Kern, S., Wahlund, L.-O., Westman, E., Skoog, I., & Schöll, M. Deep learning from MRI-derived labels enables automatic brain tissue classification on human brain CT. NeuroImage, 2021, 244, 118606. https://doi.org/10.1016/j.neuroimage.2021.118606
II. Srikrishna, M., Heckemann, R. A., Pereira, J. B., Volpe, G., Zettergren, A., Kern, S., Westman, E., Skoog, I., & Schöll, M. Comparison of Two-Dimensional- and Three-Dimensional-Based U-Net Architectures for Brain Tissue Classification in One-Dimensional Brain CT. Frontiers in Computational Neuroscience, 2022, 15. https://doi.org/10.3389/fncom.2021.785244
III. Srikrishna, M., Ashton, N. J., Moscoso, A., Pereira, J. B., Heckemann, R. A., van Westen, D., Volpe, G., Simrén, J., Zettergren, A., Kern, S., Wahlund, L.-O., Gyanwali, B., Hilal, S., Ruifen, J. C., Zetterberg, H., Blennow, K., Westman, E., Chen, C., Skoog, I., & Schöll, M. Assessing the clinical applications of deep-learning-derived CT volumetric measures in neurodegenerative disease diagnosis. Manuscript, submitted.
IV. Srikrishna, M.*, Seo, W.*, Pereira, J. B., Heckemann, R. A., Zettergren, A., Kern, S., Wahlund, L.-O., Westman, E., Skoog, I., Virhammar, J., Fällmar, D., & Schöll, M. Clinical validation of deep-learning-based CT image analyses in idiopathic normal pressure hydrocephalus. Manuscript, in preparation.
Degree
Doctor of Philosophy (Medicine)
University
University of Gothenburg. Sahlgrenska Academy
Institution
Institute of Neuroscience and Physiology. Department of Psychiatry and Neurochemistry
Disputation
Friday, 14 October 2022, at 2:00 p.m., Hörsal Arvid Carlsson, Academicum, Medicinaregatan 3, Göteborg.
https://gu-se.zoom.us/j/65379533635?pwd=MVhSNDhUeXRIM203Wm9JSzhyb0FaZz09
Date of defence
2022-10-14
meera.srikrishna@gu.se
Date
2022-09-23
Author
Srikrishna, Meera
Keywords
CT
MRI
convolutional neural networks
deep learning
Dementia
Alzheimer's disease
Normal pressure hydrocephalus
brain segmentation
Publication type
Doctoral thesis
ISBN
978-91-8009-923-3 (PRINT)
978-91-8009-924-0 (PDF)
Language
eng