Smooth overparameterized solvers for nonsmooth structured optimisation
Nonsmooth regularisers are widely used in machine learning to enforce solution structure, such as the ℓ1 norm for sparsity or the nuclear norm for low rank. State-of-the-art solvers are typically first-order or coordinate descent methods that handle the nonsmoothness through careful smooth approximations and support pruning. In this work, we revisit the approach of iteratively reweighted least squares (IRLS) and show how a simple reparameterisation, coupled with a bilevel resolution, leads to a smooth unconstrained problem. We can therefore exploit the machinery of smooth optimisation, such as BFGS, to obtain local superlinear convergence. The result is a highly versatile approach that handles both convex and nonconvex regularisers as well as different optimisation formulations (e.g. basis pursuit and lasso). We show that it significantly outperforms state-of-the-art methods on a wide range of problems.
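For concreteness, here is a minimal sketch of the idea on the lasso, assuming the Hadamard-product reparameterisation x = u ⊙ v together with the classical quadratic variational form ||x||_1 = min over u ⊙ v = x of (||u||² + ||v||²)/2. The function name and the particular closed-form inner solve are illustrative choices under these assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def lasso_overparam(A, y, lam, max_iter=500):
    """Sketch: solve min_x 0.5*||Ax - y||^2 + lam*||x||_1 via the smooth
    bilevel reformulation x = u * v. The inner minimisation over v is a
    ridge problem solved in closed form; the resulting outer objective
    in u is smooth and is minimised with BFGS. (Illustrative only.)"""
    m, n = A.shape

    def inner_v(u):
        # For fixed u, min_v 0.5*||A(u*v) - y||^2 + 0.5*lam*||v||^2 is
        # ridge regression with design matrix B = A diag(u).
        B = A * u  # broadcasting scales column j of A by u[j]
        return np.linalg.solve(B.T @ B + lam * np.eye(n), B.T @ y)

    def outer(u):
        # Smooth outer objective f(u) = min_v G(u, v) and its gradient.
        v = inner_v(u)
        r = A @ (u * v) - y
        f = 0.5 * (r @ r) + 0.5 * lam * (u @ u + v @ v)
        # Envelope theorem: v is optimal, so only the partial
        # derivative of G in u is needed.
        g = v * (A.T @ r) + lam * u
        return f, g

    u0 = np.ones(n)  # the problem is smooth, so any generic init works
    res = minimize(outer, u0, jac=True, method="BFGS",
                   options={"maxiter": max_iter})
    u = res.x
    return u * inner_v(u)  # recover x = u * v
```

The bilevel resolution shows up as inner_v (eliminated in closed form) nested inside the smooth outer objective, which BFGS then minimises without any nonsmooth machinery; the closed-form solve keeps the sketch short, at the cost of an O(n³) linear system per outer iteration.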