Assessing the robustness of deep learning based brain age prediction models across multiple EEG datasets


Abstract

The increasing availability of large electroencephalography (EEG) datasets enhances the potential clinical utility of deep learning (DL) for cognitive and pathological decoding. However, dataset shifts due to variations in the population and acquisition hardware can considerably degrade model performance. We systematically investigated the generalisation of DL models to unseen datasets with different characteristics, using age as the target variable. Five datasets were used in two experimental setups: (1) leave-one-dataset-out (LODO) and (2) leave-one-dataset-in (LODI) cross-validation. A comprehensive set of 1805 different hyperparameter configurations was tested, including variations in the DL architectures and data pre-processing. Performance varied across source/target dataset pairs. Using LODO, we obtained Pearson's r values of {0.63, 0.84, 0.75, 0.23, 0.10} and R² values of {−0.01, 0.63, 0.41, −4.66, −70.98}. For LODI, the results varied in Pearson's r from −0.11 to 0.84 and in R² from −704.89 to 0.65, depending on the source and target dataset. Adjusting the model intercepts using the average age of the target dataset substantially improved some R² scores. Our results show that DL models can learn age-related EEG patterns which generalise with strong correlations to datasets with broad age spans. The most important hyperparameter choice was to use the full frequency range between 1 and 45 Hz rather than a single frequency band. The effect of the second most important hyperparameter depended on the experimental setup. Our findings highlight the challenges of dataset shifts in EEG-based DL models and establish a benchmark for future studies aiming to improve the robustness of DL models across diverse datasets.
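
The sketch below illustrates the two evaluation ideas described above, leave-one-dataset-out (LODO) splitting and intercept adjustment using the target dataset's mean age, under stated assumptions: the authors' DL model is replaced by a simple Ridge regressor, the five datasets are synthetic stand-ins, and the adjust_intercept helper is an illustrative interpretation of the adjustment, not the authors' implementation.

```python
"""Minimal sketch of leave-one-dataset-out (LODO) evaluation with intercept
adjustment. A Ridge regressor on synthetic features stands in for the DL model;
dataset names, feature shapes, and adjust_intercept are illustrative only."""
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for five EEG datasets: (features, ages in years).
datasets = {
    name: (rng.normal(size=(n, 64)), rng.uniform(*age_span, size=n))
    for name, n, age_span in [
        ("A", 200, (18, 80)), ("B", 150, (5, 95)), ("C", 120, (20, 60)),
        ("D", 100, (60, 90)), ("E", 180, (1, 18)),
    ]
}

def adjust_intercept(y_pred, target_mean_age):
    """Shift predictions so their mean matches the target dataset's mean age."""
    return y_pred - y_pred.mean() + target_mean_age

for held_out in datasets:
    # Train on all datasets except the held-out one (LODO split).
    X_train = np.vstack([X for name, (X, _) in datasets.items() if name != held_out])
    y_train = np.concatenate([y for name, (_, y) in datasets.items() if name != held_out])
    X_test, y_test = datasets[held_out]

    model = Ridge().fit(X_train, y_train)
    y_pred = model.predict(X_test)
    y_adj = adjust_intercept(y_pred, y_test.mean())

    r, _ = pearsonr(y_test, y_pred)
    print(f"{held_out}: r={r:.2f}, R2={r2_score(y_test, y_pred):.2f}, "
          f"R2 (intercept-adjusted)={r2_score(y_test, y_adj):.2f}")
```

Note that the intercept adjustment changes R² but not Pearson's r, since shifting all predictions by a constant does not affect correlation; this is consistent with the abstract's observation that adjusting intercepts improved some R² scores.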
