
Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification

IEEE Trans Neural Netw Learn Syst. 2022 Oct 24;PP. doi: 10.1109/TNNLS.2022.3213518. Online ahead of print.

ABSTRACT

The finite inverted beta mixture model (IBMM) has proven efficient for modeling positive vectors. Under the traditional variational inference framework, the critical challenge in Bayesian estimation of the IBMM is that the computational cost of inference on large datasets is prohibitive, which often limits Bayesian approaches to small datasets. The recently proposed stochastic variational inference (SVI) framework offers an efficient alternative for inference on large datasets. Nevertheless, when the SVI framework is applied to non-Gaussian statistical models, the evidence lower bound (ELBO) cannot be calculated explicitly because the required moment computations are intractable. Consequently, an algorithm under the SVI framework cannot directly optimize the ELBO by stochastic optimization, and no analytically tractable solution can be derived. To address this problem, we propose a more flexible extended version of the SVI framework, namely, the extended SVI (ESVI) framework, which can be applied to many non-Gaussian statistical models. First, approximation strategies are applied to further lower-bound the ELBO and thereby avoid intractable moment calculations. Then, stochastic optimization with noisy natural gradients is used to optimize this lower bound. The performance and effectiveness of the proposed method are verified through evaluations on real data.
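To make the ingredients named in the abstract concrete, here is a minimal, self-contained Python sketch of (a) the inverted beta (beta prime) density and a finite mixture of it for a positive scalar, and (b) the generic noisy natural-gradient update used in standard SVI, i.e., a convex combination of the current global variational parameter and a minibatch-based intermediate estimate. The mixture parameters and the step-size schedule constants are illustrative assumptions; this is not the paper's ESVI algorithm.

```python
import math

def inverted_beta_pdf(x, alpha, beta):
    """Inverted beta (beta prime) density for x > 0:
    p(x) = Gamma(a+b) / (Gamma(a) Gamma(b)) * x^(a-1) * (1+x)^(-(a+b))."""
    log_norm = math.lgamma(alpha + beta) - math.lgamma(alpha) - math.lgamma(beta)
    return math.exp(log_norm + (alpha - 1.0) * math.log(x)
                    - (alpha + beta) * math.log1p(x))

def ibmm_pdf(x, weights, alphas, betas):
    """Finite inverted beta mixture density: sum_k w_k * IB(x; a_k, b_k)."""
    return sum(w * inverted_beta_pdf(x, a, b)
               for w, a, b in zip(weights, alphas, betas))

def svi_step_size(t, tau=1.0, kappa=0.7):
    """Robbins-Monro schedule rho_t = (t + tau)^(-kappa), chosen so that
    sum_t rho_t diverges while sum_t rho_t^2 converges (tau, kappa assumed)."""
    return (t + tau) ** (-kappa)

def svi_natural_gradient_update(lam, lam_hat, t):
    """One noisy natural-gradient step on a global variational parameter:
    lam <- (1 - rho_t) * lam + rho_t * lam_hat, where lam_hat is the
    intermediate estimate computed from a sampled minibatch."""
    rho = svi_step_size(t)
    return [(1.0 - rho) * l + rho * lh for l, lh in zip(lam, lam_hat)]

# Hypothetical two-component mixture, evaluated at x = 1.5.
density = ibmm_pdf(1.5, weights=[0.4, 0.6], alphas=[2.0, 5.0], betas=[3.0, 2.0])
```

Note that the beta prime density is an exponential-family member with sufficient statistics (log x, log(1+x)), which is why a convex-combination update on its natural parameters is the natural SVI step; the abstract's point is that for such non-Gaussian components the exact ELBO moments are intractable, so a further lower bound on the ELBO is optimized instead.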

PMID:36279334 | DOI:10.1109/TNNLS.2022.3213518

By Nevin Manimala
