Author(s): Zhang, Y (Zhang, Yu); Xie, R (Xie, Rui); Beheshti, I (Beheshti, Iman); Liu, X (Liu, Xia); Zheng, GW (Zheng, Guowei); Wang, Y (Wang, Yin); Zhang, ZW (Zhang, Zhenwen); Zheng, WH (Zheng, Weihao); Yao, ZJ (Yao, Zhijun); Hu, B (Hu, Bin)

Source: COMPUTERS IN BIOLOGY AND MEDICINE  Volume: 169  Article Number: 107873  DOI: 10.1016/j.compbiomed.2023.107873  Early Access Date: JAN 2024  Published: FEB 2024

Abstract: Significant progress has been made in predicting brain age from structural Magnetic Resonance Imaging (sMRI) data using deep learning techniques. However, traditional engineered features, known as anatomical features, have been largely overlooked in this context despite the valuable structural information they contain. To address this issue, we propose an attention-based network that integrates anatomical and deep convolutional features, leveraging an anatomical feature attention (AFA) module to capture salient anatomical features effectively. In addition, we introduce a fully convolutional network, which simplifies the extraction of deep convolutional features and overcomes the high computational memory requirements associated with deep learning. Our approach outperforms several widely used models on eight publicly available datasets (n = 2501), achieving a mean absolute error (MAE) of 2.20 years in predicting brain age. Comparisons with deep learning models lacking the AFA module show that our fusion model effectively improves overall performance. These findings offer a promising approach for combining anatomical and deep convolutional features from sMRI data to predict brain age, with potential applications in clinical diagnosis and treatment, particularly for populations with age-related cognitive decline or neurological disorders.
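
The abstract outlines a fusion architecture: a fully convolutional encoder extracts deep features from sMRI volumes, while an anatomical feature attention (AFA) module re-weights engineered anatomical features before the two streams are combined for age regression. The internals of the AFA module and the network dimensions are not given in this record, so the PyTorch sketch below is only an illustrative assumption of how such a gated fusion could look; the class names, layer sizes, and sigmoid-gated attention design are hypothetical and not taken from the paper.

# Minimal sketch of attention-based fusion of anatomical and deep
# convolutional features for brain-age regression. All module internals,
# layer widths, and names are illustrative assumptions, not the authors'
# published implementation.
import torch
import torch.nn as nn


class AnatomicalFeatureAttention(nn.Module):
    """Hypothetical AFA-style module: re-weights anatomical features
    (e.g., regional volumes or thicknesses) with learned attention scores."""

    def __init__(self, n_anat: int):
        super().__init__()
        self.attn = nn.Sequential(
            nn.Linear(n_anat, n_anat),
            nn.ReLU(),
            nn.Linear(n_anat, n_anat),
            nn.Sigmoid(),  # per-feature attention weights in (0, 1)
        )

    def forward(self, anat: torch.Tensor) -> torch.Tensor:
        return anat * self.attn(anat)  # salient features are emphasized


class FusionBrainAgeModel(nn.Module):
    """Fully convolutional sMRI encoder plus AFA branch, fused for age regression."""

    def __init__(self, n_anat: int, conv_dim: int = 64):
        super().__init__()
        # Lightweight 3D fully convolutional encoder (illustrative depth/widths).
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(32, conv_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling keeps memory use low
        )
        self.afa = AnatomicalFeatureAttention(n_anat)
        self.head = nn.Linear(conv_dim + n_anat, 1)  # predicted age (years)

    def forward(self, mri: torch.Tensor, anat: torch.Tensor) -> torch.Tensor:
        deep = self.encoder(mri).flatten(1)            # (B, conv_dim)
        fused = torch.cat([deep, self.afa(anat)], dim=1)
        return self.head(fused).squeeze(1)


if __name__ == "__main__":
    model = FusionBrainAgeModel(n_anat=100)
    mri = torch.randn(2, 1, 64, 64, 64)   # toy-sized sMRI volumes
    anat = torch.randn(2, 100)             # toy anatomical feature vectors
    print(model(mri, anat).shape)           # torch.Size([2])

Concatenating the attention-weighted anatomical vector with the pooled convolutional features is one plausible way to realize the fusion the abstract describes; the paper may use a different combination strategy.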

Accession Number: WOS:001151922700001

PubMed ID: 38181606

Author Identifiers:

Author              Web of Science ResearcherID    ORCID Number

Zhang, Zhenwen                                     0000-0001-8443-8779

Zheng, Weihao                                      0000-0003-2996-5909

Hu, Bin                                            0000-0003-3514-5413

ISSN: 0010-4825

eISSN: 1879-0534