Efficient Reduction of Variances in Stochastic Spectral Conjugate Gradient Algorithm

Journal article


Publication Details

Author list: Yimer S.E.; Kumam P.; Chaipunya P.

Publication year: 2025

Journal: Carpathian Journal of Mathematics (1843-4401)

Volume number: 41

Issue number: 3

Start page: 849

End page: 862

Number of pages: 14

ISSN: 1843-4401

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-105007842120&doi=10.37193%2fCJM.2025.03.14&partnerID=40&md5=9b705f1aa5851398a8c489391aab80cd

Languages: English-Great Britain (EN-GB)



Abstract

Conjugate gradient methods are popular for solving nonlinear optimization problems. In this paper, we discuss the spectral conjugate gradient (SCG) method, an effective numerical method that generalizes the conjugate gradient (CG) method for solving large-scale unconstrained optimization problems. By integrating the methods of Fletcher and Reeves (FR) and of Polak and Ribière (PR), we introduce a new stochastic spectral conjugate gradient algorithm with variance reduction, and we show that it converges linearly with the Fletcher–Reeves update for smooth and strongly convex functions. We then illustrate experimentally that our algorithm converges faster than its counterparts on the four learning models considered. Moreover, like the CG method, it stores only the last gradient vector, so it is easy to apply to complex problems such as those arising in machine learning. The experiments also show that our algorithm surpasses the generalization performance (AUC) of its counterparts on the four models considered, which may be nonsmooth or nonconvex. © 2025, SINUS Association. All rights reserved.
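The abstract describes the method only at a high level. For orientation, below is a minimal sketch of a stochastic spectral CG step with SVRG-style variance reduction and the Fletcher–Reeves beta; it is not the authors' exact algorithm from the paper. The gradient estimator, the Barzilai–Borwein-like spectral parameter, the beta cap, and all names and hyperparameters (svrg_scg, grad_i, step) are illustrative assumptions.

```python
# Minimal sketch of a stochastic spectral CG scheme with variance
# reduction. NOT the paper's exact method: the SVRG-style estimator,
# the BB-like spectral parameter, the FR beta cap, and all names and
# hyperparameters are illustrative assumptions.
import numpy as np

def svrg_scg(grad_i, n, w0, step=0.02, epochs=15, seed=0):
    """grad_i(w, i): gradient of the i-th sample's loss at w (user-supplied)."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # Variance-reduction anchor: full gradient at the snapshot.
        mu = sum(grad_i(w_snap, i) for i in range(n)) / n
        g_prev = d = s = None
        for _ in range(n):
            i = int(rng.integers(n))
            # Variance-reduced estimator: unbiased, and its variance
            # shrinks as w and w_snap approach the minimizer.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            if g_prev is None:
                d = -g  # restart each epoch with a steepest-descent step
            else:
                # Fletcher-Reeves beta, capped at 1 as a stability heuristic.
                beta = min(float(g @ g) / float(g_prev @ g_prev), 1.0)
                y = g - g_prev
                sy = float(s @ y)
                # Spectral (Barzilai-Borwein-like) parameter, safeguarded.
                theta = float(s @ s) / sy if sy > 1e-12 else 1.0
                d = -theta * g + beta * d  # spectral CG direction
            w_new = w + step * d
            s, g_prev, w = w_new - w, g, w_new
    return w

# Tiny usage example: ridge-regularized least squares (smooth, strongly convex).
rng = np.random.default_rng(1)
A, b, lam = rng.standard_normal((200, 10)), rng.standard_normal(200), 0.1
gi = lambda w, i: (A[i] @ w - b[i]) * A[i] + lam * w  # per-sample gradient
w_hat = svrg_scg(gi, 200, np.zeros(10))
```

Note that only the last gradient and direction are stored, matching the low-memory property highlighted in the abstract; the fixed step size and the safeguards above are placeholders, whereas the paper's linear-convergence result applies to its own parameter choices for smooth, strongly convex objectives.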




Last updated on 2025-08-28 at 00:00