Institutional Research Information Service
Publication Detail
Learning Theory for Vector-Valued Distribution Regression
  • Publication Type:
    Conference presentation
  • Publication Sub Type:
  • Authors:
    Szabo Z, Sriperumbudur B, Poczos B, Gretton A
  • Date:
  • Name of Conference:
    CMStatistics 2015
  • Conference place:
    London, UK
  • Conference start date:
  • Conference finish date:
  • Keywords:
    distribution regression, two-stage sampling, mean embedding, convergence rate, set kernel, consistency
  • Notes:
    Preprint: "http://arxiv.org/abs/1411.2066", code: "https://bitbucket.org/szzoli/ite/". Abstract: "http://www.gatsby.ucl.ac.uk/~szabo/talks/invited_talk/Zoltan_Szabo_invited_talk_CMStatistics_12_12_2015_abstract.pdf"
We focus on the distribution regression problem (DRP): we regress from probability measures to Hilbert-space valued outputs, where the input distributions are only available through samples (this is the 'two-stage sampled' setting). Several important statistical and machine learning problems can be phrased within this framework, including point estimation tasks without analytical solution (such as entropy estimation) and multi-instance learning. However, due to the two-stage sampled nature of the problem, the theoretical analysis becomes quite challenging: to the best of our knowledge, the only existing method with performance guarantees for the DRP task requires density estimation (which often performs poorly in practice) and the distributions to be defined on a compact Euclidean domain. We present a simple, analytically tractable alternative to solve the DRP task: we embed the distributions into a reproducing kernel Hilbert space and perform ridge regression from the embedded distributions to the outputs. We prove that this scheme is consistent under mild conditions, and construct explicit finite-sample bounds on its excess risk as a function of the sample sizes and the problem difficulty, which hold with high probability. Specifically, we establish the consistency of set kernels in regression, which was a 15-year-old open question, and also present new kernels on embedded distributions. The practical efficiency of the studied technique is illustrated in supervised entropy learning and aerosol prediction using multispectral satellite images.
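The two-stage scheme the abstract describes can be sketched in a few lines: estimate each distribution's mean embedding from its sample bag via the set kernel (the average pairwise kernel between bags), then run kernel ridge regression on the resulting Gram matrix. The NumPy sketch below is illustrative only, not the authors' implementation (their ITE toolbox is linked in the notes); the Gaussian kernel, bandwidth, regularization constant, and toy labels are all assumptions made for the example.

```python
import numpy as np

def gauss(x, y, sigma=1.0):
    # Gaussian kernel matrix between two sample sets, shapes (n, d) and (m, d).
    d = x[:, None, :] - y[None, :, :]
    return np.exp(-np.sum(d ** 2, axis=2) / (2 * sigma ** 2))

def mean_embedding_kernel(bags_a, bags_b, sigma=1.0):
    # K(P_i, Q_j) = <mu_{P_i}, mu_{Q_j}>, estimated by averaging the base
    # kernel over all sample pairs -- exactly the classical "set kernel".
    K = np.zeros((len(bags_a), len(bags_b)))
    for i, A in enumerate(bags_a):
        for j, B in enumerate(bags_b):
            K[i, j] = gauss(A, B, sigma).mean()
    return K

# Toy two-stage sampled data (assumption for illustration): each input
# distribution is N(m_i, 1), observed only through a 30-sample bag, and
# the regression target is the mean m_i.
rng = np.random.default_rng(0)
means = rng.uniform(-2, 2, size=50)
bags = [rng.normal(m, 1.0, size=(30, 1)) for m in means]
y = means

# Ridge regression in the embedding space: alpha = (K + n*lam*I)^{-1} y.
lam = 1e-3
K = mean_embedding_kernel(bags, bags)
alpha = np.linalg.solve(K + len(bags) * lam * np.eye(len(bags)), y)

# Predict the label of a new bag drawn from N(1, 1).
test_bag = [rng.normal(1.0, 1.0, size=(30, 1))]
pred = mean_embedding_kernel(test_bag, bags) @ alpha
```

The consistency result in the paper concerns exactly this pipeline: ridge regression on mean-embedded distributions, with the set kernel arising as the linear kernel between empirical mean embeddings.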
Publication data is maintained in RPS. Visit https://rps.ucl.ac.uk
UCL Researchers
Gatsby Computational Neurosci Unit