Efficient Reduction of Variances in Stochastic Spectral Conjugate Gradient Algorithm
Journal article
Authors/Editors
Strategic research area group
Publication details
Author list: Yimer S.E.; Kumam P.; Chaipunya P.
Publication year (A.D.): 2025
Journal: Carpathian Journal of Mathematics (1843-4401)
Volume number: 41
Issue number: 3
First page: 849
Last page: 862
Number of pages: 14
ISSN: 1843-4401
Language: English-Great Britain (EN-GB)
Abstract
Conjugate gradient methods are popular for solving nonlinear optimization problems. In this paper, we discuss the spectral conjugate gradient (SCG) method, an effective numerical method that generalizes the conjugate gradient (CG) method for solving large-scale unconstrained optimization problems. Integrating the methods of Fletcher and Reeves (FR) and Polak and Ribiere (PR), we introduce a new stochastic spectral conjugate gradient algorithm with variance reduction, and we show that, with the Fletcher-Reeves update, it converges linearly for smooth and strongly convex functions. We illustrate experimentally that our algorithm converges faster than its counterparts on the four learning models considered. Moreover, like the CG method, it stores only the most recent gradient vector, so it is easy to apply to complex problems such as those arising in machine learning. Our experiments also show that the algorithm achieves better generalization performance (AUC) than its counterparts across the four models considered, which may be nonsmooth or nonconvex. © 2025, SINUS Association. All rights reserved.
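The abstract does not reproduce the update formulas, but the ingredients it names are standard. A spectral CG direction takes the form d_k = -theta_k g_k + beta_k d_{k-1}, where the conjugacy parameter is beta_FR = ||g_k||^2 / ||g_{k-1}||^2 for Fletcher-Reeves or beta_PR = g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2 for Polak-Ribiere, and SVRG-style variance reduction replaces the stochastic gradient g_k with v_k = grad f_i(w_k) - grad f_i(w_snap) + grad f(w_snap) for a periodically refreshed snapshot w_snap. The Python sketch below only illustrates how these standard pieces fit together; it is a minimal reconstruction under those assumptions, not the authors' algorithm, and the function names, step size, and fixed spectral parameter theta are placeholders.

import numpy as np

def svrg_spectral_cg(grad_full, grad_i, w0, n_samples,
                     step=0.01, epochs=10, inner=None, theta=1.0):
    """Sketch of a variance-reduced stochastic spectral CG loop (FR rule).

    grad_full(w)  -> full gradient at w
    grad_i(w, i)  -> gradient of the i-th component function at w
    theta         -> fixed stand-in for the spectral scaling parameter
    """
    rng = np.random.default_rng(0)
    w = w0.copy()
    inner = inner or 2 * n_samples          # SVRG-style inner loop length
    for _ in range(epochs):
        w_snap = w.copy()                   # snapshot point
        mu = grad_full(w_snap)              # full gradient at the snapshot
        d_prev, v_prev = None, None         # restart direction each epoch
        for _ in range(inner):
            i = rng.integers(n_samples)
            # variance-reduced gradient estimate (SVRG correction)
            v = grad_i(w, i) - grad_i(w_snap, i) + mu
            if d_prev is None:
                d = -theta * v
            else:
                # Fletcher-Reeves conjugacy parameter on the VR gradients
                beta = (v @ v) / (v_prev @ v_prev)
                d = -theta * v + beta * d_prev
            w = w + step * d
            d_prev, v_prev = d, v
    return w

# Toy least-squares usage: f(w) = (1/2n) ||Aw - b||^2
A = np.random.default_rng(1).normal(size=(100, 5))
b = A @ np.ones(5)
gf = lambda w: A.T @ (A @ w - b) / len(b)
gi = lambda w, i: A[i] * (A[i] @ w - b[i])
w = svrg_spectral_cg(gf, gi, np.zeros(5), len(b))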
Keywords
No relevant data found.