Robust Low-Rank Matrix Recovery
This paper focuses on robust estimation of a low-rank matrix from the trace regression model, which encompasses four popular problems as special cases: sparse linear models, compressive sensing, matrix completion, and multi-task regression. Instead of optimizing the nuclear-norm penalized least squares, our robust penalized least-squares approach replaces the quadratic loss by a robust version, obtained by appropriately truncating or shrinking the data and hence very easy to implement. Under only a bounded (2+δ)-th moment condition on the noise, we show that the proposed robust penalized trace regression yields an estimator that achieves the same rates as those presented in Negahban and Wainwright's work under the sub-Gaussian error assumption; the rates of convergence are derived explicitly. As a byproduct, we also propose a robust covariance matrix estimator and establish its concentration inequality.
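To make the truncation-and-shrinkage idea concrete, the following is a minimal sketch, not the paper's actual algorithm: it shrinks the responses at a threshold tau and then solves the nuclear-norm penalized least squares by proximal gradient descent with singular value soft-thresholding. The function names, step-size rule, and truncation level are illustrative assumptions rather than specifications from the paper.

```python
import numpy as np

def truncate(y, tau):
    # Robustification step: shrink responses toward zero at threshold tau
    # (tau is a hypothetical tuning parameter chosen by a simple heuristic below).
    return np.sign(y) * np.minimum(np.abs(y), tau)

def svt(M, lam):
    # Singular value soft-thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

def robust_trace_regression(X, y, lam, tau, step=None, n_iter=500):
    """Nuclear-norm penalized least squares on truncated responses.

    X : array of shape (n, d1, d2), design matrices X_i
    y : array of shape (n,), responses y_i = <X_i, Theta*> + eps_i
    """
    n, d1, d2 = X.shape
    y_t = truncate(y, tau)                          # shrink heavy-tailed responses
    Theta = np.zeros((d1, d2))
    if step is None:
        # Crude step size from the operator norm of the vectorized design.
        L = np.linalg.norm(X.reshape(n, -1), 2) ** 2 / n
        step = 1.0 / L
    for _ in range(n_iter):
        resid = np.einsum('nij,ij->n', X, Theta) - y_t
        grad = np.einsum('n,nij->ij', resid, X) / n  # gradient of the quadratic loss
        Theta = svt(Theta - step * grad, step * lam) # proximal (SVT) step
    return Theta

# Toy usage: rank-2 target with heavy-tailed (Student-t) noise.
rng = np.random.default_rng(0)
d1, d2, r, n = 20, 15, 2, 800
Theta_star = rng.standard_normal((d1, r)) @ rng.standard_normal((r, d2))
X = rng.standard_normal((n, d1, d2))
y = np.einsum('nij,ij->n', X, Theta_star) + rng.standard_t(df=3, size=n)
tau = np.sqrt(n / np.log(n)) * np.std(y)            # heuristic truncation level
Theta_hat = robust_trace_regression(X, y, lam=1.0, tau=tau)
print("relative error:", np.linalg.norm(Theta_hat - Theta_star) / np.linalg.norm(Theta_star))
```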