Efficient hybrid conjugate gradient techniques for vector optimization
Journal article
Publication Details
Author list: Jamilu Yahaya, Poom Kumam
Publisher: Elsevier BV
Publication year: 2024
Volume number: 14
Start page: 100348
ISSN: 2666-7207
URL: https://www.sciencedirect.com/science/article/pii/S2666720723001509?via%3Dihub
Abstract
Scalarization approaches transform vector optimization problems (VOPs) into single-objective optimization problems but have trade-offs: information loss, subjective weight assignments, and limited representation of the Pareto front. To address these limitations, alternative strategies such as conjugate gradient (CG) techniques are valuable for their simplicity and low memory requirements. The paper introduces three CG techniques for VOPs, including two CG techniques that satisfy sufficient descent conditions (SDC) without a line search. These two CG techniques are combined with the third CG technique, a variant of the Polak–Ribière–Polyak (PRP) technique, resulting in two hybrid CG techniques. Global convergence of these hybrids is established, without convexity assumptions, under standard assumptions and the Wolfe line search. Numerical analysis and comparisons with the nonnegative PRP and Liu–Storey (LS) CG techniques demonstrate the implementation and effectiveness of our hybrid CG techniques. The results demonstrate the promise of our hybrid techniques.
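To illustrate the kind of iteration the abstract refers to, the sketch below implements a scalar (single-objective) nonnegative-PRP conjugate gradient method with a weak Wolfe line search. This is one of the baseline methods the paper compares against, shown here in its classical scalar form; it is not the paper's vector-valued hybrid scheme, and the function names, tolerances, and the restart safeguard are illustrative choices, not taken from the article.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style search for a step size satisfying the weak
    Wolfe conditions (sufficient decrease + curvature). Illustrative only."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, gd = f(x), grad(x) @ d  # gd < 0 for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:
            hi = alpha                      # Armijo violated: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:
            lo = alpha                      # curvature violated: grow
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def prp_plus_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Scalar nonnegative-PRP (PRP+) conjugate gradient method:
    beta is the Polak-Ribiere-Polyak coefficient truncated at zero."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:        # safeguard: restart with steepest descent
            d = -g
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ rule
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic f(x) = ½xᵀAx − bᵀx with A = diag(1, 10) and b = (1, 1), `prp_plus_cg` converges to the minimizer A⁻¹b = (1, 0.1). The paper's hybrid schemes replace the scalar gradient here with vector-optimization descent directions satisfying the SDC.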