Title
Spectral Preconditioning for Gradient Methods on Graded Non-convex Functions
Abstract
The performance of optimization methods is often tied to the spectrum of the objective Hessian. Yet, conventional assumptions such as smoothness often do not enable us to make fine-grained convergence statements, particularly for non-convex problems. Striving for a more precise characterization of complexity, we introduce a new concept termed graded non-convexity. This allows us to partition the class of non-convex problems into a nested chain of subclasses. Interestingly, many traditional non-convex objectives, including partially convex problems, matrix factorizations, and neural networks, fall within these subclasses. We then propose gradient methods with spectral preconditioning, which employ inexact top eigenvectors of the Hessian, with the number of eigenvectors depending on the grade, to address the ill-conditioning of the problem. Our analysis reveals that these new methods provide provably superior convergence rates compared to basic gradient descent on the applicable problem classes, particularly when large gaps exist between the top eigenvalues of the Hessian.
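To make the idea of spectral preconditioning concrete, here is a minimal sketch in Python (NumPy), assuming exact access to the gradient and Hessian. The function name spectral_preconditioned_step, the rescaling of the top directions by their eigenvalue magnitudes, and the step size lr are illustrative assumptions for this sketch, not the exact method analyzed in the talk (which uses inexact top eigenvectors).

```python
import numpy as np

def spectral_preconditioned_step(grad_f, hess_f, x, k=5, lr=1.0):
    """One gradient step preconditioned along the top-k Hessian eigendirections.

    Hypothetical illustration: curvature along the k largest-eigenvalue
    directions is divided out, while the orthogonal complement is treated
    with a plain gradient step of size lr.
    """
    g = grad_f(x)
    H = hess_f(x)                          # full Hessian; in practice one would
                                           # rely on inexact/iterative eigensolvers
    eigvals, eigvecs = np.linalg.eigh(H)   # eigenvalues in ascending order
    top_vals = eigvals[-k:]                # k largest eigenvalues
    top_vecs = eigvecs[:, -k:]             # corresponding eigenvectors

    # Split the gradient into the top-k eigenspace and its complement.
    coeffs = top_vecs.T @ g
    g_top = top_vecs @ (coeffs / np.abs(top_vals))  # rescale by curvature
    g_rest = g - top_vecs @ coeffs                  # untouched complement

    return x - (g_top + lr * g_rest)


# Toy usage on an ill-conditioned quadratic f(x) = 0.5 * x^T A x.
A = np.diag([100.0, 10.0, 1.0])
grad_f = lambda x: A @ x
hess_f = lambda x: A
x = np.ones(3)
for _ in range(5):
    x = spectral_preconditioned_step(grad_f, hess_f, x, k=2, lr=1.0)
```

On this quadratic, the step behaves like a Newton step along the two top eigendirections and like plain gradient descent on the remaining one, which is the qualitative benefit the abstract attributes to spectral preconditioning when the top eigenvalues are well separated.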
Bio
Nikita Doikov is a postdoctoral researcher at EPFL, Switzerland, working at the Machine Learning and Optimization Laboratory led by Martin Jaggi. His research focuses on the foundations of optimization theory and the development of numerical methods for large-scale problems, in particular, second-order optimization algorithms with global convergence guarantees. He completed his PhD at UCLouvain, Belgium, under the supervision of Yurii Nesterov. Prior to that, Nikita obtained his BSc degree at Moscow State University and his MSc degree at the Higher School of Economics.