Neural Netw. 2026 Mar 21;201:108885. doi: 10.1016/j.neunet.2026.108885. Online ahead of print.
ABSTRACT
Sentiment analysis remains challenging due to the complex, intertwined relationships among sentiment expressions, contextual cues, and emotional features distributed across heterogeneous data sources. Conventional deep learning and transformer-based models often treat sentiments as isolated units, failing to capture these rich, multi-perspective interactions. To address these limitations, this study introduces a Two-Fold Multi-Perspective Heterogeneous Graph Neural Network (TFMPHGNN) that jointly models sentiment, emotion, and contextual dependencies within a dual-stage heterogeneous graph framework. The first stage employs a meta-path-based encoder integrated with a capsule network to capture hierarchical semantic relationships among sentiment-emotion-context nodes, while the second stage utilizes a multi-channel graph convolutional network (MC-GCN) to learn complementary topological, semantic, and collaborative representations of sentiment-emotion pairs. A variational autoencoder (VAE) further denoises and refines latent embeddings. Experiments on the newly developed VaKSent-2025 corpus show that TFMPHGNN outperforms eight state-of-the-art graph-based baselines by 4.67% in accuracy, 2.7% in F1-micro, and 4.2% in F1-weighted, with statistical significance testing confirming the reliability of these gains. An extended ablation analysis further demonstrates that the collaborative fusion channel achieves 0.9387 accuracy, representing improvements of 7.5% and 7.9% over the topological-only and semantic-only channels, respectively, underscoring the synergistic value of integrating multiple graph perspectives. Collectively, these results indicate that TFMPHGNN effectively captures complex sentiment-emotion interdependencies and offers a robust, interpretable framework for fine-grained sentiment understanding.
PMID:41904902 | DOI:10.1016/j.neunet.2026.108885
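The abstract's second stage describes a multi-channel graph convolutional network that learns topological, semantic, and collaborative views and then fuses them. As an illustration only, the sketch below shows one generic way such a multi-channel GCN fusion can be wired up in plain numpy; the channel adjacencies, the ReLU activation, the concatenation-based fusion, and all dimensions here are assumptions for illustration, not the paper's actual TFMPHGNN specification.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric GCN normalization: D^-1/2 (A + I) D^-1/2
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def gcn_channel(A, X, W):
    # One graph-convolution layer with ReLU over a single channel's view
    return np.maximum(normalize_adj(A) @ X @ W, 0.0)

def multi_channel_fuse(adjacencies, X, weights):
    # Run each channel independently, then fuse by concatenation
    # (concatenation is one plausible fusion choice, assumed here)
    return np.concatenate(
        [gcn_channel(A, X, W) for A, W in zip(adjacencies, weights)], axis=1
    )

rng = np.random.default_rng(0)
n_nodes, d_in, d_hidden = 5, 8, 4

# Toy node features and three symmetric adjacency views standing in for
# the topological, semantic, and collaborative channels
X = rng.standard_normal((n_nodes, d_in))
def random_view():
    A = (rng.random((n_nodes, n_nodes)) < 0.3).astype(float)
    return np.maximum(A, A.T)  # make the view undirected
views = [random_view() for _ in range(3)]
weights = [rng.standard_normal((d_in, d_hidden)) for _ in range(3)]

Z = multi_channel_fuse(views, X, weights)
print(Z.shape)  # → (5, 12): three 4-dim channels concatenated
```

The fused representation `Z` stacks all three per-channel embeddings, which is the intuition behind the reported gain of the collaborative fusion channel over any single-view channel.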