Brain Inform. 2025 Oct 2;12(1):25. doi: 10.1186/s40708-025-00274-x.
ABSTRACT
Understanding how demographic factors influence visual attention is crucial for the development of adaptive and user-centered web interfaces. This paper presents a gender-aware saliency prediction system based on fine-tuned deep learning models and demographic-specific gaze behavior. We introduce the WIC640 dataset, which includes 640 web page screenshots categorized by content type and country of origin, along with eye-tracking data from 85 participants across four age groups and both genders. To investigate gender-related differences in visual saliency, we fine-tuned TranSalNet, a Transformer-based saliency prediction model, on the WIC640 dataset. Our experiments reveal distinct gaze behavior patterns between male and female users. The female-trained model achieved a correlation coefficient (CC) of 0.7786, normalized scanpath saliency (NSS) of 2.4224, and Kullback-Leibler divergence (KLD) of 0.5447; the male-trained model showed slightly lower performance (CC = 0.7582, NSS = 2.3508, KLD = 0.5986). Interestingly, the general model trained on the complete dataset outperformed both gender-specific models, highlighting the importance of inclusive training data. Statistical analysis revealed significant gender-related differences in 9 out of 12 saliency features and a trend of reduced fixation dispersion with increasing age. While this study does not yet incorporate temporal gaze modeling, the results suggest practical benefits for intelligent systems aiming to personalize user experiences based on demographic features. The WIC640 dataset is publicly available and offers a valuable resource for future research on adaptive AI systems, visual attention modeling, and demographic-aware interface design.
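The three evaluation metrics reported above (CC, NSS, KLD) have standard definitions in the saliency-prediction literature. The sketch below shows those standard formulas in NumPy; the function names and numerical-stability constants are illustrative choices, not code from the paper, and the paper's exact evaluation pipeline may differ in details such as map normalization.

```python
import numpy as np

def cc(pred, gt):
    """Pearson correlation coefficient between two saliency maps."""
    p = (pred - pred.mean()) / (pred.std() + 1e-12)
    g = (gt - gt.mean()) / (gt.std() + 1e-12)
    return float((p * g).mean())

def nss(pred, fixations):
    """Normalized scanpath saliency: mean of the z-scored
    prediction at binary fixation locations."""
    p = (pred - pred.mean()) / (pred.std() + 1e-12)
    return float(p[fixations.astype(bool)].mean())

def kld(pred, gt, eps=1e-12):
    """Kullback-Leibler divergence of the ground-truth
    distribution from the predicted one; both maps are
    normalized to sum to 1 before comparison."""
    p = pred / (pred.sum() + eps)
    g = gt / (gt.sum() + eps)
    return float(np.sum(g * np.log(g / (p + eps) + eps)))
```

Under these definitions, higher CC and NSS indicate better agreement with human gaze, while lower KLD is better — consistent with the female-trained model's scores (CC 0.7786, NSS 2.4224, KLD 0.5447) edging out the male-trained model's (CC 0.7582, NSS 2.3508, KLD 0.5986).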
PMID:41037184 | DOI:10.1186/s40708-025-00274-x