Comparison Between Steepest Descent Method and Conjugate Gradient Method by Using Matlab

Dana Taha Mohammed Salih
Bawar Mohammed Faraj

Abstract

The Steepest descent method and the Conjugate gradient method for minimizing nonlinear functions are studied in this work. Algorithms for both methods are presented and implemented in Matlab. A comparison is then made between the Steepest descent method and the Conjugate gradient method in terms of running time and efficiency. It is shown that the Conjugate gradient method needs fewer iterations and is more efficient than the Steepest descent method. On the other hand, the Steepest descent method converges in less time than the Conjugate gradient method.
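The comparison described above can be illustrated on the standard test case of a convex quadratic f(x) = ½xᵀAx − bᵀx with gradient Ax − b. The sketch below is not the authors' Matlab code; it is a minimal Python analogue (plain lists, no external libraries) of the two textbook algorithms, with exact line search for steepest descent and the Fletcher–Reeves update for conjugate gradient. The matrix `A`, vector `b`, and tolerance are illustrative choices.

```python
# Hypothetical sketch, not the paper's Matlab implementation:
# steepest descent vs. conjugate gradient on f(x) = 0.5*x'Ax - b'x,
# whose gradient is Ax - b, so the residual r = b - Ax is the descent direction.

def mat_vec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def steepest_descent(A, b, x, tol=1e-8, max_iter=1000):
    k = 0
    r = [bi - axi for bi, axi in zip(b, mat_vec(A, x))]   # r = -grad f(x)
    while dot(r, r) ** 0.5 > tol and k < max_iter:
        Ar = mat_vec(A, r)
        alpha = dot(r, r) / dot(r, Ar)                    # exact line search
        x = [xi + alpha * ri for xi, ri in zip(x, r)]
        r = [bi - axi for bi, axi in zip(b, mat_vec(A, x))]
        k += 1
    return x, k

def conjugate_gradient(A, b, x, tol=1e-8, max_iter=1000):
    k = 0
    r = [bi - axi for bi, axi in zip(b, mat_vec(A, x))]
    p = r[:]                                              # first search direction
    rr = dot(r, r)
    while rr ** 0.5 > tol and k < max_iter:
        Ap = mat_vec(A, p)
        alpha = rr / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rr_new = dot(r, r)
        beta = rr_new / rr                                # Fletcher-Reeves coefficient
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rr = rr_new
        k += 1
    return x, k

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite (illustrative)
b = [1.0, 2.0]
x_sd, k_sd = steepest_descent(A, b, [0.0, 0.0])
x_cg, k_cg = conjugate_gradient(A, b, [0.0, 0.0])
print(k_sd, k_cg)   # k_cg <= 2 (the problem dimension); k_sd is noticeably larger
```

On a quadratic in n variables, conjugate gradient terminates in at most n iterations in exact arithmetic, while steepest descent contracts the error by roughly (κ−1)/(κ+1) per step, which reproduces the abstract's iteration-count observation; each CG step is slightly more expensive, consistent with the runtime observation.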

Article Details

How to Cite
Mohammed Salih, D. T., & Faraj, B. M. (2021). Comparison Between Steepest Descent Method and Conjugate Gradient Method by Using Matlab. Journal of Studies in Science and Engineering, 1(1), 20–31. https://doi.org/10.53898/josse2021113
Section
Research Articles

References

Djordjevic, S.S., 2019. Unconstrained Optimization Methods: Conjugate Gradient Methods and Trust-Region Methods. In Applied Mathematics. IntechOpen.

Carrizosa, E., Conde, E., Muñoz-Márquez, M. and Puerto, J., 1995. Planar point-objective location problems with nonconvex constraints: a geometrical construction. Journal of Global Optimization, 6(1), pp.77-86.

Carrizosa, E. and Frenk, J.B.G., 1998. Dominating sets for convex functions with some applications. Journal of Optimization Theory and Applications, 96(2), pp.281-295.

Das, I. and Dennis, J.E., 1998. Normal-boundary intersection: A new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM journal on optimization, 8(3), pp.631-657.

Eschenauer, H., Koski, J. and Osyczka, A., 1990. Multicriteria Design Optimization. Springer, Berlin.

Evans, G.W., 1984. An overview of techniques for solving multiobjective mathematical programs. Management Science, 30(11), pp.1268-1282.

White, D.J., 1998. Epsilon-dominating solutions in mean-variance portfolio analysis. European Journal of Operational Research, 105, pp.457-466.

Curtis, F.E. and Guo, W., 2016. Handling nonpositive curvature in a limited memory Steepest descent method. IMA Journal of Numerical Analysis, 36(2), pp.717-742.

Drummond, L.G. and Svaiter, B.F., 2005. A Steepest descent method for vector optimization. Journal of computational and applied mathematics, 175(2), pp.395-414.

Shewchuk, J.R., 1994. An introduction to the Conjugate gradient method without the agonizing pain.

Fletcher, R., 2012. A limited memory Steepest descent method. Mathematical programming, 135(1), pp.413-436.

Gonzaga, C.C. and Schneider, R.M., 2016. On the Steepest descent algorithm for quadratic functions. Computational Optimization and Applications, 63(2), pp.523-542.

Abrudan, T.E., Eriksson, J. and Koivunen, V., 2008. Steepest descent algorithms for optimization under unitary matrix constraint. IEEE Transactions on Signal Processing, 56(3), pp.1134-1147.

Pradhan, R. and Subudhi, B., 2012, December. A Steepest-descent based maximum power point tracking technique for a photovoltaic power system. In 2012 2nd International Conference on Power, Control and Embedded Systems (pp.1-6). IEEE.

Fliege, J. and Svaiter, B.F., 2000. Steepest descent methods for multicriteria optimization. Mathematical methods of operations research, 51(3), pp.479-494.

Barzilai, J. and Borwein, J.M., 1988. Two-point step size gradient methods. IMA journal of numerical analysis, 8(1), pp.141-148.

Djordjevic, S.S., 2019. Some unconstrained optimization methods. In Applied Mathematics. IntechOpen.

Powell, M.J.D., 1977. Restart procedures for the Conjugate gradient method. Mathematical programming, 12(1), pp.241-254.

Fletcher, R. and Reeves, C.M., 1964. Function minimization by Conjugate gradients. The computer journal, 7(2), pp.149-154.

Hestenes, M.R. and Stiefel, E., 1952. Methods of Conjugate gradients for solving linear systems (Vol. 49, No. 1). Washington, DC: NBS.

Nazareth, L., 2009. Conjugate gradient method. Wiley Interdisciplinary Reviews: Computational Statistics, 1(3), pp.348-353.

Faber, V. and Manteuffel, T., 1984. Necessary and sufficient conditions for the existence of a Conjugate gradient method. SIAM Journal on numerical analysis, 21(2), pp.352-362.

Stanimirović, P.S., Ivanov, B., Ma, H. and Mosić, D., 2020. A survey of gradient methods for solving nonlinear optimization. Electronic Research Archive, 28(4), p.1573.

Lasdon, L., Mitter, S. and Waren, A., 1967. The Conjugate gradient method for optimal control problems. IEEE Transactions on Automatic Control, 12(2), pp.132-138.

Vagaská, A. and Gombár, M., 2021. Mathematical Optimization and Application of Nonlinear Programming. In Algorithms as a Basis of Modern Applied Mathematics (pp. 461-486). Springer, Cham.

Osadcha, O. and Marszałek, Z., 2017. Comparison of Steepest descent method and Conjugate gradient method. In CEUR Workshop Proceedings, SYSTEM.

Beck, A., 2014. Introduction to nonlinear optimization: Theory, algorithms, and applications with MATLAB. Society for Industrial and Applied Mathematics.

Baldick, R., 2006. Applied optimization: formulation and algorithms for engineering systems. Cambridge University Press.

Antoniou, A. and Lu, W.S., 2007. Practical optimization: algorithms and engineering applications (Vol. 19, p. 669). New York: Springer.

Venkataraman, P., 2009. Applied optimization with MATLAB programming. John Wiley & Sons.

Rahim, I. and Hussan, M.A., 2012. Teaching Optimization Techniques using Matlab and Mathematica: A Comparative Study. Journal of Information & Communication Technology (JICT), 6(1), p.9.

Zhang, Y., 1998. Solving large-scale linear programs by interior-point methods under the MATLAB environment. Optimization Methods and Software, 10(1), pp.1-31.

Arora, J.S., 2006. Jan A. Snyman, Practical Mathematical Optimization: An introduction to basic optimization theory and classical and new gradient-based algorithms. Structural and Multidisciplinary Optimization, 31(3), p.249.

Allwright, J.C., 1976. Conjugate gradient versus Steepest descent. Journal of Optimization Theory and Applications, 20(1), pp.129-134.

Meza, J.C., 2010. Steepest descent. Wiley Interdisciplinary Reviews: Computational Statistics, 2(6), pp.719-722.

Axelsson, O. and Lindskog, G., 1986. On the rate of convergence of the preconditioned Conjugate gradient method. Numerische Mathematik, 48(5), pp.499-523.

Dai, Y.H. and Yuan, Y., 2000. Nonlinear Conjugate gradient methods. Shanghai Science and Technology Publisher, Shanghai.
