A Space Decomposition Framework for Nonlinear Optimization
In this talk, we consider continuous nonlinear optimization problems (unconstrained, for simplicity) where direct minimization of the objective function in the full space is out of reach. Our targets include large-scale optimization with derivatives and derivative-free optimization of relatively large size. With these problems in mind, we propose a space decomposition framework for nonlinear optimization. At each iteration, the space of variables is decomposed into (possibly overlapping) subspaces, and a step is defined in each subspace by approximately minimizing a subspace model of the objective function. A synchronization phase is then performed to combine the subspace steps into a full-space step. The main novelty lies in this synchronization phase, which is inspired by successful Overlapping Domain Decomposition techniques for linear PDE systems. In particular, our framework covers Restricted Additive Schwarz, Additive Schwarz with Harmonic Extension, and several other synchronization strategies as special cases. In doing so, the model gradients and the subspace steps may be modified according to the overlaps. Using regularization or globalization schemes, such as Levenberg-Marquardt or trust regions, the new framework is guaranteed to converge globally at appropriate rates. If time permits, we will discuss related issues such as gradient inexactness or random generation of subspaces. This is joint work with Serge Gratton (Toulouse) and Zaikun Zhang (Hong Kong).
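To fix ideas, the iteration described above can be sketched on a toy problem. The following Python snippet is our own minimal illustration, not the framework itself: it minimizes a small strongly convex quadratic by taking exact model-minimization steps in two overlapping coordinate subspaces and synchronizing them with uniform partition-of-unity weights on the overlap, in the spirit of (but not identical to) Restricted Additive Schwarz. The objective, block structure, and weights are all illustrative choices.

```python
import numpy as np

# Illustrative quadratic objective f(x) = 0.5 x'Qx - b'x (our own toy example)
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
Q = 10.0 * np.eye(n) + 0.1 * (M.T @ M)  # symmetric positive definite
b = rng.standard_normal(n)

def f(x):
    return 0.5 * x @ (Q @ x) - b @ x

# Two overlapping coordinate subspaces (overlap on indices 2 and 3)
blocks = [np.array([0, 1, 2, 3]), np.array([2, 3, 4, 5])]

def subspace_step(x, idx):
    """Minimize the quadratic model exactly within the coordinate subspace idx."""
    g = Q @ x - b                       # full-space gradient
    d = np.zeros_like(x)
    d[idx] = np.linalg.solve(Q[np.ix_(idx, idx)], -g[idx])
    return d

def synchronize(steps, blocks, n):
    """Combine subspace steps into a full-space step: each coordinate's
    contributions are averaged over the blocks containing it (a uniform
    partition of unity on the overlap)."""
    counts = np.zeros(n)
    for idx in blocks:
        counts[idx] += 1.0
    return sum(steps) / counts

x = np.zeros(n)
for _ in range(200):
    steps = [subspace_step(x, idx) for idx in blocks]
    x = x + synchronize(steps, blocks, n)
```

On this well-conditioned example the iteration drives the full-space gradient to zero; in the actual framework, the subspace models are only approximately minimized and a regularization or trust-region mechanism globalizes the iteration.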