Solving the Nearest Correlation Matrix Problem with an Inexact Accelerated Proximal Gradient Method
Let $F(x) = f(x) + g(x)$, where $f$ is convex and differentiable and $g$ is convex and nonsmooth, and let $F_*$ denote the minimum value of $F$. The $H$-weighted Nearest Correlation Matrix (NCM) problem has this form, with $f(X) = \frac{1}{2}\|H \circ (X - G)\|_F^2$, where $\circ$ denotes the entrywise (Hadamard) product, and with $g$ the indicator function of the set of correlation matrices (that is, $g(X)$ is zero if $X$ is a symmetric positive semidefinite matrix whose diagonal entries are all one, and $+\infty$ otherwise).
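To make the objective concrete, here is a minimal NumPy sketch evaluating $f$, its gradient $\nabla f(X) = H \circ H \circ (X - G)$, and a membership test for the set on which $g$ vanishes. The function names are illustrative choices, not identifiers from the work described here.

```python
import numpy as np

def f_val(X, G, H):
    """Smooth term f(X) = 0.5 * ||H o (X - G)||_F^2, with o the Hadamard product."""
    R = H * (X - G)
    return 0.5 * np.sum(R * R)

def grad_f(X, G, H):
    """Gradient of f: H o H o (X - G)."""
    return (H * H) * (X - G)

def in_correlation_set(X, tol=1e-8):
    """Membership test for the set where g is zero: symmetric positive
    semidefinite matrices with all diagonal entries equal to one."""
    sym = np.allclose(X, X.T, atol=tol)
    unit_diag = np.allclose(np.diag(X), 1.0, atol=tol)
    psd = np.linalg.eigvalsh((X + X.T) / 2).min() >= -tol
    return sym and unit_diag and psd
```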
The classical accelerated proximal gradient method (APGM) generates a sequence of iterates $x_k$ such that $F(x_k) - F_* = O(k^{-2})$, which is the optimal complexity for first-order methods on this class of composite convex problems. Each iteration of the APGM evaluates the gradient of $f$ and the proximal operator of $g$. In the NCM problem, $g$ is an indicator function, so its proximal operator is the projection onto the set of correlation matrices; this projection has no closed form and must instead be approximated via an iterative method.
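As a point of reference, a standard exact APG (FISTA-type) loop for $F = f + g$ might look as follows. The step size $1/L$ with $L = \max_{ij} H_{ij}^2$ (a Lipschitz constant of $\nabla f$, since $\nabla f(X) - \nabla f(Y) = H \circ H \circ (X - Y)$) and this particular momentum sequence are one common choice, not necessarily the variant analyzed in this work; `prox_g` stands in for the projection onto the correlation-matrix set, which in practice can only be computed approximately.

```python
def apg(grad_f, prox_g, x0, L, num_iters=500):
    """Accelerated proximal gradient (FISTA-type) iteration for F = f + g.
    prox_g(V) should return the proximal point of g at V; for the NCM problem
    this is the projection onto the set of correlation matrices."""
    x_prev, y, t = x0, x0, 1.0
    for _ in range(num_iters):
        x = prox_g(y - grad_f(y) / L)                  # proximal gradient step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)    # momentum extrapolation
        x_prev, t = x, t_next
    return x_prev
```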
We introduce a relative error criterion for deciding when the proximal operator of $g$ has been approximated sufficiently well, together with an inexact APG method built on this criterion. We establish the optimal $O(k^{-2})$ complexity for our inexact APG method, and present numerical results demonstrating improved performance over an inexact APG method based on an absolute error criterion.
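For illustration only, one way to realize an approximate proximal step is Dykstra's alternating projections between the positive semidefinite cone and the unit-diagonal affine set, stopped once an inner tolerance is met; the choice of that tolerance is exactly where an absolute or relative error criterion enters. The function name, the inner-iteration limit, and the stopping test below are assumptions made for the sketch and are not the criterion proposed in this work.

```python
def approx_prox_g(V, tol, max_inner=1000):
    """Approximate projection of a symmetric matrix V onto the correlation-matrix
    set via Dykstra's alternating projections (illustrative sketch only)."""
    X = V.copy()
    P = np.zeros_like(V)   # Dykstra correction for the PSD projection
    for _ in range(max_inner):
        # Project X + P onto the positive semidefinite cone (eigendecomposition).
        w, Q = np.linalg.eigh(X + P)
        Y = (Q * np.maximum(w, 0.0)) @ Q.T
        P = (X + P) - Y
        # Project onto the affine set {diag = 1}; being affine, it needs no
        # Dykstra correction of its own.
        X_new = Y.copy()
        np.fill_diagonal(X_new, 1.0)
        # Stop once successive iterates are within the prescribed tolerance;
        # tying `tol` to quantities of the outer APG iteration, rather than to
        # a fixed sequence, is the flavor of a relative error criterion.
        if np.linalg.norm(X_new - X, 'fro') <= tol:
            return X_new
        X = X_new
    return X
```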
This work was done in collaboration with Yunier Bello-Cruz (Northern Illinois University) and Max L. N. Gonçalves (Universidade Federal de Goiás).