Efficient hybrid conjugate gradient techniques for vector optimization

Journal article


Publication Details

Author list: Jamilu Yahaya, Poom Kumam

Publisher: Elsevier BV

Publication year: 2024

Volume number: 14

Start page: 100348

ISSN: 2666-7207

URL: https://www.sciencedirect.com/science/article/pii/S2666720723001509?via%3Dihub


Abstract

Scalarization approaches transform vector optimization problems (VOPs) into single-objective problems, but at a cost: information loss, subjective weight assignments, and limited representation of the Pareto front. To address these limitations, alternative strategies such as conjugate gradient (CG) techniques are valuable for their simplicity and low memory usage. The paper introduces three CG techniques for VOPs, two of which satisfy sufficient descent conditions (SDC) without a line search. These two techniques are each combined with the third, a variant of the Polak–Ribière–Polyak (PRP) technique, yielding two hybrid CG techniques. Global convergence of these hybrids is established, without convexity assumptions, under standard assumptions and the Wolfe line search. Numerical experiments and comparisons with the nonnegative PRP and Liu–Storey (LS) CG techniques illustrate the implementation and effectiveness of the hybrid CG techniques. The results demonstrate the promise of the proposed hybrids.
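To illustrate the family of methods the abstract describes, the following is a minimal single-objective sketch of a nonnegative PRP (PRP+) conjugate gradient iteration in Python. This is not the paper's method: the paper works with vector-valued objectives and proves convergence under a Wolfe line search, whereas this sketch uses a simple Armijo backtracking rule and a restart safeguard for illustration only.

```python
import numpy as np

def prp_plus_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Nonnegative Polak-Ribiere-Polyak (PRP+) conjugate gradient.

    Single-objective illustration only; the paper's hybrids operate on
    vector optimization problems and use a Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart on a non-descent direction
            d = -g
        # Armijo backtracking line search (a stand-in for the Wolfe conditions)
        t, fx, gTd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gTd and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP+ parameter: beta = max(0, g_new . (g_new - g) / ||g||^2)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x'Ax/2 - b'x,
# whose minimizer solves the linear system A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = prp_plus_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                     lambda x: A @ x - b,
                     np.zeros(2))
```

The truncation `max(0, ...)` in the beta update is what makes the method "nonnegative" PRP: it discards negative beta values, which is one standard way to recover global convergence guarantees without convexity.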




Last updated on 2024-04-17 at 23:05