Optimized first-order methods for convex minimization

First-order algorithms are favorable for solving large-scale optimization problems because their per-iteration computational cost depends only mildly on the problem dimension. In particular, Nesterov's fast gradient method (FGM) is widely used: it is computationally efficient and achieves the optimal rate O(1/N^2) for decreasing smooth convex cost functions, where N denotes the number of iterations. In the search for the best-performing first-order methods [Y. Drori and M. Teboulle, Math. Program., 2016], the two papers below propose and discuss a new first-order method, named the optimized gradient method (OGM), that is computationally as efficient as FGM and has a worst-case cost function bound that is twice smaller than that of FGM. Recently, OGM was shown to exactly achieve the optimal worst-case cost function rate of first-order methods for solving (large-dimensional) smooth convex problems [Y. Drori, J. Complexity, 2017].
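To make the setting concrete, here is a minimal sketch of Nesterov's FGM for an L-smooth convex objective, applied to a small least-squares problem. The function and variable names are illustrative, not taken from the papers.

```python
import numpy as np

# Minimal sketch of Nesterov's fast gradient method (FGM) for
# minimizing an L-smooth convex function f via its gradient.
def fgm(grad, L, x0, n_iter):
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / L                      # gradient step, step size 1/L
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2      # standard momentum sequence
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum (extrapolation) step
        x, t = x_new, t_new
    return x

# Toy example: f(x) = 0.5 * ||A x - b||^2, with L = largest eigenvalue of A^T A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A.T @ A).max()
x_star = np.linalg.solve(A, b)
x_hat = fgm(lambda x: A.T @ (A @ x - b), L, np.zeros(2), 1000)
```

The O(1/N^2) worst-case bound on f guarantees that 1000 iterations suffice for an accurate solution on this tiny, well-conditioned problem.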

The paper below generalizes the formulation of OGM and analyzes its worst-case rates for both the cost function and the gradient. It then proposes a new algorithm, named OGM-OG (OG for "optimized over gradient"), that has the best known analytical worst-case rate on the gradient-norm decrease among first-order methods with fixed (non-adaptive) step coefficients for smooth convex minimization.

  • D. Kim and J. A. Fessler,
    Generalizing the optimized gradient method for smooth convex minimization,
    SIAM J. Optim., to appear. [arXiv version]
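For reference, the following is a sketch of the standard OGM iteration, written from the published description: it uses the same 1/L gradient step as FGM plus an additional momentum term, and a modified coefficient at the final iteration. Names are illustrative; consult the paper for the exact formulation and its generalizations.

```python
import numpy as np

# Sketch of the optimized gradient method (OGM) for an L-smooth convex f.
def ogm(grad, L, x0, n_iter):
    x = y = np.asarray(x0, dtype=float)
    theta = 1.0
    for k in range(n_iter):
        y_new = x - grad(x) / L                     # gradient step, step size 1/L
        if k < n_iter - 1:
            theta_new = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:                                        # modified final coefficient
            theta_new = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        x = (y_new
             + ((theta - 1) / theta_new) * (y_new - y)  # FGM-like momentum
             + (theta / theta_new) * (y_new - x))       # extra OGM momentum term
        y, theta = y_new, theta_new
    return y

# Toy least-squares example: f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A.T @ A).max()
x_hat = ogm(lambda x: A.T @ (A @ x - b), L, np.zeros(2), 1000)
```

Note that the extra momentum term reuses the already-computed gradient (y_new - x = -grad(x)/L), so OGM costs essentially the same per iteration as FGM.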

The following paper extends the above approach to nonsmooth composite convex problems.

Adaptive restart of the optimized gradient method (OGM)

Adaptive restart of FGM [B. O'Donoghue and E. Candès, Found. Comput. Math., 2015] has been found useful for improving the convergence rate when the function's strong convexity parameter is unknown or when the iterates of the algorithm enter a locally strongly convex region. Building on this success in accelerating FGM, the paper below investigates a similar heuristic acceleration of OGM.

  • D. Kim and J. A. Fessler,
    Adaptive restart of the optimized gradient method for convex minimization,
    J. Optim. Theory Appl., to appear. [arXiv version]
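To illustrate the restart idea in its original FGM setting, here is a minimal sketch of the gradient-based restart heuristic of O'Donoghue and Candès: reset the momentum whenever it would carry the iterate in an ascent direction. The paper above develops an analogous restart for OGM; all names here are illustrative.

```python
import numpy as np

# FGM with gradient-based adaptive restart (O'Donoghue-Candes heuristic).
def fgm_restart(grad, L, x0, n_iter):
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        g = grad(y)
        x_new = y - g / L                            # gradient step, step size 1/L
        if g @ (x_new - x) > 0:                      # momentum points "uphill"
            t = 1.0                                  # discard accumulated momentum
        t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum step (zero if reset)
        x, t = x_new, t_new
    return x

# Toy strongly convex quadratic, where restart recovers fast linear convergence
# without knowing the strong convexity parameter.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = np.linalg.eigvalsh(A.T @ A).max()
x_hat = fgm_restart(lambda x: A.T @ (A @ x - b), L, np.zeros(2), 500)
```

The restart test costs one inner product per iteration and needs no function parameters, which is what makes the heuristic attractive in practice.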

Fast optimization methods for statistical X-ray CT reconstruction

Radiation exposure to patients from computed tomography (CT) imaging has drawn criticism for its increased risk of cancer. Unlike conventional filtered back-projection methods, statistical X-ray CT image reconstruction can provide diagnostic-quality images from safer low-dose CT scans (which yield very noisy measurement data). However, statistical reconstruction requires very long computation times, and the papers below are efforts to accelerate it and thus potentially reduce radiation dose for public health. The works build on distributed computing, Nesterov's FGM, and majorization-minimization, combined with the ordered-subsets technique (an instance of incremental/stochastic gradient methods for tomography problems).
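The ordered-subsets idea mentioned above can be sketched in a few lines: approximate the full data-fit gradient by a scaled gradient over one block ("subset") of the measurements, cycling through the blocks, so one pass over the data yields many update steps. Below is a toy least-squares stand-in for the CT data-fit term with illustrative names; it is not the reconstruction code from the papers.

```python
import numpy as np

# Ordered-subsets (OS) sketch: for f(x) = sum over M blocks f_m(x),
# replace the full gradient with M * (gradient of the current block),
# cycling through the blocks. One pass over all blocks costs about one
# full gradient but performs M update steps.
def os_least_squares(A, b, x0, n_epochs, n_subsets, step):
    x = np.asarray(x0, dtype=float)
    blocks = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_epochs):
        for idx in blocks:                               # one subset of rows (views)
            g = n_subsets * A[idx].T @ (A[idx] @ x - b[idx])  # scaled subset gradient
            x = x - step * g
    return x

# Toy consistent system: b = A @ x_true, so every subset agrees on x_true.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2                # Lipschitz constant of the full gradient
x_hat = os_least_squares(A, b, np.zeros(5), 500, 4, 1.0 / (4 * L))
```

With noisy (inconsistent) data, plain OS iterates only reach a limit cycle around the solution, which is why the papers combine it with momentum and relaxation schemes rather than using it alone.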

The above works are now patented jointly with General Electric (GE), as listed below.