Nevin Manimala Statistics

Enhanced generalized normal distribution optimizer with Gaussian distribution repair method and Cauchy reverse learning for feature selection

Sci Rep. 2026 Feb 2. doi: 10.1038/s41598-026-35804-y. Online ahead of print.

ABSTRACT

The presence of noisy, redundant, and irrelevant features in high-dimensional datasets significantly degrades the performance of classification models. Feature selection is a critical pre-processing step to mitigate this issue by identifying an optimal feature subset. While the Generalized Normal Distribution Optimization (GNDO) algorithm has shown promise in various domains, its efficacy for feature selection is hampered by premature convergence and an imbalance between exploration and exploitation. This paper proposes a Binary Adaptive GNDO (BAGNDO) framework to overcome these limitations. BAGNDO integrates three key strategies: an Adaptive Cauchy Reverse Learning (ACRL) mechanism to enhance population diversity, an Elite Pool Strategy to balance the search process, and a Gaussian Distribution-based Worst-solution Repair (GDWR) method to improve exploitation. The performance of BAGNDO was rigorously evaluated against nine state-of-the-art metaheuristic algorithms on 18 UCI benchmark datasets. The results demonstrate the superior efficacy of BAGNDO, which achieved the highest classification accuracy with the most compact feature subsets in 14 out of 18 datasets. Statistical analysis, including the Wilcoxon signed-rank and Friedman tests, confirmed that BAGNDO’s performance is significantly better, establishing it as a robust and efficient solution for wrapper-based feature selection.
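The abstract names three strategies but does not give their mechanics. As a rough illustration of two of them, the sketch below shows a generic Cauchy reverse (opposition-based) learning step and a sigmoid transfer function for binarizing continuous positions into feature masks, as is common in wrapper-based feature selection. This is a minimal sketch under generic assumptions, not the paper's ACRL or BAGNDO implementation; the scale factor, bounds, and threshold are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def cauchy_reverse(population, lb, ub):
    """Generic Cauchy reverse learning step (illustrative, not the
    paper's adaptive ACRL): reflect each candidate about the search
    bounds, then perturb with heavy-tailed Cauchy noise to inject
    diversity. Results are clipped back into [lb, ub]."""
    opposite = lb + ub - population               # classic opposition point
    noise = rng.standard_cauchy(population.shape) * 0.1  # assumed scale
    return np.clip(opposite + noise, lb, ub)

def binarize(population, threshold=0.5):
    """Map continuous positions to a binary feature mask via a
    sigmoid (S-shaped) transfer function, a standard device for
    binary variants of continuous metaheuristics."""
    probs = 1.0 / (1.0 + np.exp(-population))
    return (probs > threshold).astype(int)

# Toy population: 5 candidate solutions over 8 features.
pop = rng.uniform(-4, 4, size=(5, 8))
reversed_pop = cauchy_reverse(pop, -4.0, 4.0)
mask = binarize(reversed_pop)                     # 1 = feature selected
```

In a full wrapper loop, each mask would be scored by a classifier's accuracy (often combined with a penalty on subset size), and the repair and elite-pool steps would act on the worst and best candidates respectively.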

PMID:41629550 | DOI:10.1038/s41598-026-35804-y

By Nevin Manimala
