Med Image Anal. 2023 Sep 1;90:102941. doi: 10.1016/j.media.2023.102941. Online ahead of print.
Although many deep learning-based medical applications are performance-driven, i.e., accuracy-oriented, their explainability is often more critical. This is especially true in neuroimaging, where the goal is frequently to identify biomarkers underlying brain development or disorders. Here we propose an explainable deep learning approach that elucidates the information transmission mechanism between two layers of a deep network through a joint feature selection strategy, combining several shallow-layer explainable machine learning models with sparse learning of the deep network. Finally, we apply and validate the proposed approach in the analysis of dynamic brain functional connectivity (FC) from fMRI in a brain development study. Our approach identifies differences within and between functional brain networks across ages during development. The results indicate that the brain network transitions from undifferentiated structures to more specialized and organized ones, and that its information processing becomes more efficient with age. In addition, we detect two developmental patterns in the brain network: FCs in regions related to visual and auditory processing and mental regulation weaken, while those between regions corresponding to emotional processing and cognitive activities are enhanced.
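The dynamic FC analysis mentioned above is commonly computed with a sliding-window correlation over regional fMRI time series. The sketch below illustrates that standard technique only, not the paper's specific pipeline; the window length, stride, and region count are hypothetical, and the input is synthetic data standing in for real ROI time series.

```python
import numpy as np

# Hypothetical setup: 5 brain regions (ROIs), 200 time points,
# window of 40 TRs sliding with a stride of 10 TRs.
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 5))  # fMRI time series, shape (time, regions)

def sliding_window_fc(ts, win=40, stride=10):
    """Dynamic FC: one Pearson correlation matrix per sliding window."""
    n_t, _ = ts.shape
    mats = []
    for start in range(0, n_t - win + 1, stride):
        window = ts[start:start + win]          # (win, regions) slice
        mats.append(np.corrcoef(window, rowvar=False))
    return np.stack(mats)  # shape: (n_windows, n_regions, n_regions)

dfc = sliding_window_fc(ts)
print(dfc.shape)  # (17, 5, 5)
```

Each of the resulting correlation matrices summarizes the between-region coupling within one time window; changes in these matrices across windows (and across subjects of different ages) are what a dynamic FC development analysis examines.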