Publication Detail
Optimal Rates for Regularized Conditional Mean Embedding Learning
Publication Type: Working discussion paper
Authors: Li Z, Meunier D, Mollenhauer M, Gretton A
Publisher: arXiv
Keywords: stat.ML, cs.LG
Publisher URL:
Abstract
We address the consistency of a kernel ridge regression estimate of the
conditional mean embedding (CME), which is an embedding of the conditional
distribution of $Y$ given $X$ into a target reproducing kernel Hilbert space
$\mathcal{H}_Y$. The CME allows us to take conditional expectations of target
RKHS functions, and has been employed in nonparametric causal and Bayesian
inference. We address the misspecified setting, where the target CME is in the
space of Hilbert-Schmidt operators acting from an input interpolation space
between $\mathcal{H}_X$ and $L_2$, to $\mathcal{H}_Y$. This space of operators
is shown to be isomorphic to a newly defined vector-valued interpolation space.
Using this isomorphism, we derive a novel and adaptive statistical learning
rate for the empirical CME estimator under the misspecified setting. Our
analysis reveals that our rates match the optimal $O(\log n / n)$ rates without
assuming $\mathcal{H}_Y$ to be finite dimensional. We further establish a lower
bound on the learning rate, which shows that the obtained upper bound is
optimal.
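
For readers unfamiliar with the estimator being analysed, below is a minimal NumPy sketch of the standard kernel ridge regression form of the empirical CME, in which a conditional expectation is approximated as $\widehat{\mathbb{E}}[f(Y)\mid X=x] = \mathbf{f}^\top (K_X + n\lambda I)^{-1} \mathbf{k}_x$, with $\mathbf{f} = (f(y_1),\dots,f(y_n))^\top$ and $\mathbf{k}_x = (k_X(x_1,x),\dots,k_X(x_n,x))^\top$. The Gaussian kernels, bandwidth, regularisation value and toy data are illustrative assumptions, not taken from the paper; the paper's contribution is the learning-rate analysis of this estimator, not the construction itself.

# Minimal sketch of the empirical conditional mean embedding (CME) estimator
# in its kernel ridge regression form. Kernel choices, the bandwidth and the
# regularisation parameter lam are illustrative assumptions, not values from the paper.
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    # Gram matrix with entries exp(-||a_i - b_j||^2 / (2 * bandwidth^2))
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def cme_conditional_expectation(X, Y, f, x_query, lam=1e-2):
    # Estimate E[f(Y) | X = x_query] for a target RKHS function f via
    # f(Y)^T (K_X + n*lam*I)^{-1} k_X(X, x_query).
    n = X.shape[0]
    K_X = gaussian_kernel(X, X)                                # input Gram matrix
    k_x = gaussian_kernel(X, x_query)                          # kernel evaluations at the query points
    alpha = np.linalg.solve(K_X + n * lam * np.eye(n), k_x)    # ridge weights
    return f(Y).T @ alpha                                      # weighted sum of f(y_i)

# Toy usage: Y = sin(X) + noise; estimate E[Y | X = x] with f(y) = y.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
Y = np.sin(X) + 0.1 * rng.standard_normal((200, 1))
x_query = np.array([[0.0], [1.5]])
print(cme_conditional_expectation(X, Y, lambda y: y, x_query))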
UCL Researchers