A New Conjugate Gradient Method with Descent Properties and its Application to Regression Analysis

Ibrahim Mohammed Sulaiman1, Mustafa Mamat1*

1Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin, 21300 Terengganu, Malaysia
corresponding author: *must@unisza.edu.my
sulaimanib@unisza.edu.my
Submitted 24/07/2019, Revised 13/01/2020, 2nd revised: 11/06/2020, accepted: 09/09/2020

Abstract: The area of unconstrained optimization has enjoyed rapid growth in recent years, with significant progress in innovative numerical techniques. A classical technique for solving such problems is the conjugate gradient (CG) scheme, owing to its fast convergence rate and low memory requirements. However, many recent CG variants are complicated and computationally expensive. This paper presents a new and efficient CG parameter that satisfies the descent condition for solving optimization problems. The convergence of the method is established under both exact and inexact line searches. The proposed method is applied to real-life problems in regression analysis. Numerical results are reported to illustrate the efficiency and robustness of the proposed method.
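The paper's specific CG parameter is not given in the abstract, so as context the generic nonlinear CG iteration it builds on can be sketched with the classical Fletcher-Reeves parameter and an Armijo (inexact) backtracking line search; the function names and constants below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG sketch: Fletcher-Reeves beta (an assumption,
    not the paper's new parameter) with Armijo backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d               # conjugate direction update
        x, g = x_new, g_new
    return x

# Illustrative use: minimize f(x) = ||x - 1||^2, minimizer at x = (1, 1)
f = lambda x: ((x - 1.0) ** 2).sum()
grad = lambda x: 2.0 * (x - 1.0)
sol = fletcher_reeves_cg(f, grad, x0=[0.0, 0.0])
```

Applying such a solver to regression amounts to minimizing the sum of squared residuals f(w) = ||Xw - y||^2 over the coefficient vector w, which is the kind of real-life application the abstract refers to.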

© European Society of Computational Methods in Sciences and Engineering
Keywords: Optimization; exact line search; global convergence; conjugate gradient method
Mathematics Subject Classification: 90C53; 65K05
