Efficient Reduction of Variances in Stochastic Spectral Conjugate Gradient Algorithm
Journal article
Publication Details
Author list: Yimer S.E.; Kumam P.; Chaipunya P.
Publication year: 2025
Journal: Carpathian Journal of Mathematics
Volume number: 41
Issue number: 3
Start page: 849
End page: 862
Number of pages: 14
ISSN: 1843-4401
Languages: English (EN-GB)
Abstract
Conjugate gradient methods are popular for solving nonlinear optimization problems. In this paper, we discuss the spectral conjugate gradient (SCG) method, an effective numerical method that generalizes the conjugate gradient (CG) method for solving large-scale unconstrained optimization problems. Integrating the methods of Fletcher and Reeves (FR) and Polak and Ribière (PR), we introduce a new stochastic spectral conjugate gradient algorithm with variance reduction, and we show that it converges linearly with the Fletcher-Reeves update for smooth and strongly convex functions. We then demonstrate experimentally that our algorithm converges faster than its competitors on the four learning models considered. Moreover, like the CG method, it stores only the most recent gradient vector, which makes it easy to apply to complex problems such as those arising in machine learning. Our experiments also show that the algorithm achieves better generalization performance (AUC) than its counterparts across the four models considered, which may be nonsmooth or nonconvex. © 2025, SINUS Association. All rights reserved.
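The abstract does not give the paper's exact update rules, but the general shape of such a method can be sketched. The following is a minimal illustration, not the authors' algorithm: it combines an SVRG-style variance-reduced gradient estimate with a spectral CG search direction using the standard Fletcher-Reeves coefficient. The function names, the fixed step size, and the placeholder spectral parameter theta are all hypothetical choices for this sketch.

```python
import numpy as np

def stochastic_spectral_cg(grad_full, grad_i, x0, n, step=0.01,
                           epochs=20, inner=100, seed=None):
    """Sketch: SVRG-style variance reduction with a spectral CG direction.

    grad_full(x)  -> full (mean) gradient over all n samples
    grad_i(x, i)  -> gradient of the i-th sample's loss
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(epochs):
        x_snap = x.copy()
        mu = grad_full(x_snap)                 # full gradient at the snapshot
        g_prev = d_prev = None
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced gradient estimate (SVRG correction term)
            g = grad_i(x, i) - grad_i(x_snap, i) + mu
            if g_prev is None:
                d = -g                               # first step: steepest descent
            else:
                beta = (g @ g) / (g_prev @ g_prev)   # Fletcher-Reeves coefficient
                theta = 1.0                          # spectral parameter (placeholder)
                d = -theta * g + beta * d_prev       # spectral CG direction
            x = x + step * d
            g_prev, d_prev = g, d
    return x

# Usage example: least-squares regression on synthetic data
rng = np.random.default_rng(0)
A, b = rng.normal(size=(200, 10)), rng.normal(size=200)
full_grad = lambda x: A.T @ (A @ x - b) / len(b)      # mean gradient
sample_grad = lambda x, i: A[i] * (A[i] @ x - b[i])   # per-sample gradient
x_hat = stochastic_spectral_cg(full_grad, sample_grad, np.zeros(10), n=200, seed=1)
```

Note that, as in the abstract's description, only the most recent gradient and direction vectors are retained between inner iterations, so the memory footprint matches that of the classical CG method.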