Stable Dimension Reduction and Efficient Conditional Distribution Approximation for Volterra Processes
Predicting the conditional evolution of Volterra processes with stochastic volatility is a crucial challenge in mathematical finance. While deep neural network models offer promise in approximating the conditional law of such processes, their effectiveness is hampered by the curse of dimensionality arising from the infinite-dimensional and non-smooth nature of these problems. To address this, we propose a two-step solution. First, we develop a stable dimension-reduction technique, projecting the law of any Volterra process onto a low-dimensional statistical manifold of non-positive sectional curvature. We then introduce a sequential deep learning model tailored to the geometry of this manifold, which we show can approximate the projected conditional law of the Volterra process. Our approach thus overcomes the limitations of traditional neural network models and offers a promising advance toward efficient prediction in mathematical finance. It brings together elements of optimal transport theory, Riemannian geometry, and the approximation theory of deep neural networks on non-vectorial spaces.