A SCALED CONJUGATE GRADIENT METHOD USING THE DFP UPDATE FOR UNCONSTRAINED OPTIMIZATION PROBLEMS
Abstract
The Conjugate Gradient (CG) and quasi-Newton methods are well-known methods for solving large-scale optimization problems arising in applications such as optimal inflation rates, minimal cost, maximal profit, minimal error, and optimal design. In this work, we propose a modification of the hybrid Davidon-Fletcher-Powell Conjugate Gradient (DFP-CG) method developed by Wan Osman et al. (2017) by adopting a spectral-scaled memoryless DFP update. Numerical experiments on a set of selected unconstrained optimization test problems, evaluated using the performance profile of Dolan et al. (2002), indicate that the proposed method is competitive, robust, and in most instances more efficient than some existing CG methods in the literature.
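For context, the following is a general sketch of the standard memoryless DFP construction, not necessarily the exact formulation proposed in this work. A memoryless DFP update replaces the previous inverse Hessian approximation $H_k$ by a scaled identity $\theta_k I$, where $\theta_k > 0$ is a spectral scaling parameter, and the next search direction is obtained by applying the updated matrix to the current gradient:
\[
H_{k+1} = \theta_k I - \theta_k \frac{y_k y_k^{T}}{y_k^{T} y_k} + \frac{s_k s_k^{T}}{y_k^{T} s_k},
\qquad
d_{k+1} = -H_{k+1} g_{k+1},
\]
where $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and $g_k = \nabla f(x_k)$. A common (though here only illustrative) choice of spectral parameter is the Barzilai-Borwein-type scaling $\theta_k = s_k^{T} s_k / (s_k^{T} y_k)$, which yields a CG-like direction requiring no matrix storage.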