2021 Seminar

18 February 2021
Numerical first-order methods are the most suitable choice for solving large-scale nonlinear optimization problems which model many real-life applications. Among these approaches, gradient methods have widely proved their effectiveness in solving challenging unconstrained and constrained problems arising in machine learning, compressive sensing, image processing and other areas. These methods have become extremely popular since the work of Barzilai and Borwein (BB, 1988), which showed how a suitable choice of the steplength can significantly accelerate the classical Steepest Descent method. It is well known that the performance of gradient methods based on the BB steplength does not depend on the decrease of the objective function at each iteration but relies on the relationship between the steplengths used and the eigenvalues of the average Hessian matrix; hence BB-based methods are also known as Spectral Gradient methods.

The first part of this seminar will be devoted to a review of spectral gradient methods for unconstrained optimization, while the second part will focus on recent advances in the extension of these methods to the solution of large nonlinear systems of equations, the so-called Spectral Residual methods. These methods are derivative-free, have a low cost per iteration and are particularly suitable when the Jacobian matrix of the residual function is not available analytically or is too costly to compute. In this framework, numerical experience will be presented on sequences of nonlinear systems arising from rolling contact models, which play a central role in many important applications such as rolling bearings and wheel-rail interaction.
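As a concrete illustration of the BB steplength mentioned above, the sketch below implements a plain gradient iteration whose steplength is updated from differences of iterates and gradients; the function name, parameters and the quadratic test problem are illustrative assumptions, not material from the seminar.

```python
import numpy as np

def bb_gradient(grad, x0, max_iter=500, tol=1e-8, alpha0=1.0):
    """Gradient method with the Barzilai-Borwein (BB1) steplength (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0                       # initial steplength
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:     # stop when the gradient is small
            break
        x_new = x - alpha * g            # plain gradient step, no monotone line search
        g_new = grad(x_new)
        s = x_new - x                    # iterate difference
        y = g_new - g                    # gradient difference
        sy = s @ y
        # BB1 steplength alpha = (s's)/(s'y): the reciprocal of a Rayleigh quotient of
        # the average Hessian, which is why these are called "spectral" gradient methods.
        alpha = (s @ s) / sy if sy > 0 else alpha0
        x, g = x_new, g_new
    return x

# Illustrative use on a convex quadratic f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_min = bb_gradient(lambda x: A @ x - b, x0=np.zeros(3))
```

In the spectral residual setting discussed in the second part of the abstract, the gradient is replaced by the residual F(x) of the nonlinear system and a spectral coefficient is computed from differences of iterates and residuals, typically safeguarded by a nonmonotone line search, so that no Jacobian information is required.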
