Efficient Approximation of Infinite-Dimensional Dynamical Systems in Finance: From Adapted Optimal Transport to Stochastic Filtering
We introduce a universal class of geometric deep learning models, called metric hypertransformers (MHTs), capable of approximating any adapted map $F:\mathcal{X}^{\mathbb{Z}}\to\mathcal{Y}^\mathbb{Z}$ with approximable complexity, where $\mathcal{X}\subseteq\mathbb{R}^d$, $\mathcal{Y}$ is any suitable metric space, and $\mathcal{X}^\mathbb{Z}$ (resp. $\mathcal{Y}^\mathbb{Z}$) denotes the space of discrete-time paths in $\mathcal{X}$ (resp. $\mathcal{Y}$). Suitable spaces $\mathcal{Y}$ include various adapted Wasserstein spaces, all Fréchet spaces admitting a Schauder basis, and a variety of Riemannian manifolds arising from information geometry. Even in the "static case", where $f:\mathcal{X}\to\mathcal{Y}$ is a Hölder map, our results provide the first (quantitative) universal approximation theorem compatible with any such $\mathcal{X}$ and $\mathcal{Y}$. Our universal approximation theorems are quantitative: the guarantees depend on the regularity of $F$, the choice of activation function, the metric entropy and diameter of $\mathcal{X}$, and the regularity of the compact set of paths on which the approximation is performed. Our guiding examples originate from stochastic analysis. Notably, the MHT models introduced here can approximate a broad range of stochastic processes' kernels, including solutions to SDEs and many processes with arbitrarily long memory, such as certain Volterra processes.
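To fix ideas, adaptedness is the causality requirement that each output coordinate of $F$ depend only on the input path up to the present time; in illustrative notation (the precise formulation used in the paper may differ),
\[
F(x)_t \;=\; F(\tilde{x})_t \qquad \text{whenever } x_s=\tilde{x}_s \text{ for all } s\le t,
\]
for every $t\in\mathbb{Z}$ and all paths $x,\tilde{x}\in\mathcal{X}^{\mathbb{Z}}$.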
As our main application, we show that dynamic stochastic filtering operators can be efficiently approximated by our neural network models.
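To illustrate why filtering fits this framework, consider an illustrative signal-observation pair $(X_t,Y_t)_{t\in\mathbb{Z}}$, with the observations $Y_t$ taking values in $\mathcal{X}\subseteq\mathbb{R}^d$. The discrete-time filter
\[
\pi_t \;=\; \mathbb{P}\big(X_t\in\cdot\;\big|\;Y_s,\ s\le t\big)
\]
sends each observation path $(Y_s)_{s\in\mathbb{Z}}$ to the measure-valued path $(\pi_t)_{t\in\mathbb{Z}}$, and this map is adapted by construction: $\pi_t$ depends only on the observations up to time $t$. Taking $\mathcal{Y}$ to be a suitable (adapted) Wasserstein space, the filtering operator is therefore of the same adapted, metric-space-valued type as the maps $F$ covered by the approximation theorems above; the particular signal-observation setup here is only a sketch.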