Abstract
We present an optimal gradient method for smooth (possibly strongly) convex optimization. The method is optimal in the sense that its worst-case bound exactly matches the lower bound on the oracle complexity for this class of problems, meaning that no black-box first-order method can have a better worst-case guarantee without further assumptions on the class of problems at hand.
The method is in some sense a generalization of the Optimized Gradient Method of~\citet{kim2015optimized}, and, in the presence of strong convexity, asymptotically corresponds to the Triple Momentum Method~\citep{van2017fastest}.
Furthermore, the method is numerically stable for arbitrarily large condition numbers and admits a conceptually very simple proof, relying on a Lyapunov argument and a sum of two inequalities.
Finally, we provide a numerical recipe for obtaining the algorithmic parameters of the new method, using semidefinite programming, and illustrate that it can be used for developing other methods as well.