The isotonic single index model under fixed and random designs
To quote L. Wasserman, probability theory asks "Given a data generating process, what are the properties of the outcomes?", while statistics asks "Given the outcomes, what can we say about the process that generated the data?"; the latter question can be viewed as solving an inverse problem.
My main research area is shape-constrained estimation. In the first part of the talk I will motivate the field in the context of this inverse problem, focusing on how shape constraints strike a balance between robustness (bias) and efficiency (variance), the two key sources of error in statistical estimation. Following the introduction, I will discuss a recent problem I have been working on: the monotone single index model.
In the monotone single index model, a real-valued response Y is linked to a multivariate covariate X through the relationship $E[Y|X]=f_0(\alpha_0^TX)$ almost surely, where both the ridge function $f_0$ and the index parameter $\alpha_0$ are unknown and $f_0$ is assumed to be monotone. Under a random design, we show that the least squares estimator of the bundled function $f_0(\alpha_0^Tx)$ converges at the rate $n^{1/3}$. Furthermore, we show that the least squares estimator is nearly parametrically rate-adaptive to piecewise constant ridge functions. Under a fixed design, we show that the rate of convergence is parametric, as expected.
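As a toy illustration of the model (and not the estimator or analysis presented in the talk), the sketch below simulates data from a monotone single index model and computes a profile least squares fit: for each candidate index direction, the inner monotone regression is solved by the pool adjacent violators algorithm (scikit-learn's IsotonicRegression), and the direction minimizing the residual sum of squares is chosen by grid search. The dimension $d=2$, the cubic ridge function, the noise level, and the angle grid are all illustrative assumptions.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

# --- Simulate from the monotone single index model E[Y|X] = f0(alpha0^T X) ---
n, d = 500, 2
alpha0 = np.array([np.cos(0.7), np.sin(0.7)])  # true unit-norm index (assumed angle 0.7)
f0 = lambda t: t ** 3                          # an illustrative monotone ridge function
X = rng.normal(size=(n, d))
Y = f0(X @ alpha0) + 0.3 * rng.normal(size=n)

def profile_rss(theta):
    """Residual sum of squares after an isotonic (PAVA) fit of Y on alpha^T X,
    where alpha = (cos theta, sin theta) parametrizes the unit circle (d = 2)."""
    alpha = np.array([np.cos(theta), np.sin(theta)])
    t = X @ alpha
    fit = IsotonicRegression(increasing=True).fit(t, Y)
    return np.sum((Y - fit.predict(t)) ** 2)

# Profile least squares: grid search over the index direction, PAVA inside.
grid = np.linspace(0.0, 2 * np.pi, 400, endpoint=False)
theta_hat = grid[np.argmin([profile_rss(th) for th in grid])]
alpha_hat = np.array([np.cos(theta_hat), np.sin(theta_hat)])

print("true alpha0:      ", alpha0)
print("estimated alpha:  ", alpha_hat)
```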
Throughout the talk I will illustrate the methodology on several real data sets.