Publication Detail
Differentially Private Mixture of Generative Neural Networks
  • Publication Type:
    Conference
  • Authors:
    Acs G, Melis L, Castelluccia C, De Cristofaro E
  • Publisher:
    IEEE
  • Publication date:
    18/12/2017
  • Published proceedings:
    Proceedings of the 17th IEEE International Conference on Data Mining
  • Name of conference:
    IEEE International Conference on Data Mining (ICDM 2017)
  • Conference place:
    New Orleans, LA
  • Conference start date:
    18/11/2017
  • Conference finish date:
    21/11/2017
  • Keywords:
    cs.LG, cs.CR
  • Notes:
    This is a preliminary full version of the paper with the same title to appear at the 17th IEEE International Conference on Data Mining (ICDM 2017)
Abstract
Over the past few years, an increasing number of applications of generative models have emerged that rely on large amounts of contextually rich information about individuals. Owing to possible privacy violations of individuals whose data is used to train these models, however, publishing or sharing generative models is not always viable. In this paper, we introduce a novel solution for privately releasing generative models as well as entire high-dimensional datasets produced by these models. We model the generator distribution of the training data by a mixture of $k$ generative neural networks. These networks are trained together and collectively learn the generator distribution of the given dataset. More specifically, the data is first divided into $k$ clusters using a novel differentially private kernel $k$-means, then each cluster is given to a separate generative neural network, such as a Restricted Boltzmann Machine or a Variational Autoencoder, which is trained only on its own cluster using differentially private gradient descent. Since the components of our model are neural networks, it can characterize complicated data distributions and applies to various types of data. We evaluate our approach using the MNIST dataset and a large Call Detail Records (CDR) dataset, and show that it produces realistic synthetic samples, which can also be used to accurately compute an arbitrary number of counting queries.
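To make the two-stage pipeline in the abstract concrete, the sketch below (Python/NumPy, not the authors' implementation) mirrors its control flow under loud simplifications: a Lloyd-style k-means with Laplace-noised cluster sums and counts stands in for the paper's differentially private kernel k-means, and a per-cluster spherical Gaussian fitted with DP-SGD (per-example gradient clipping plus Gaussian noise) stands in for the RBMs and VAEs trained in the paper. All function names (dp_kmeans, dp_sgd_fit_gaussian), noise scales, and hyperparameters are illustrative assumptions, not values from the paper.

# Minimal sketch only: crude stand-ins for the paper's DP kernel k-means and
# DP-SGD-trained generative networks. Names and noise scales are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def dp_kmeans(X, k, iters=10, eps=1.0):
    # Lloyd-style k-means; per-cluster sums and counts get Laplace noise before
    # centroids are recomputed (a simplistic stand-in for DP kernel k-means).
    n, d = X.shape
    centers = X[rng.choice(n, size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            members = X[labels == j]
            noisy_count = max(len(members) + rng.laplace(scale=1.0 / eps), 1.0)
            noisy_sum = members.sum(axis=0) + rng.laplace(scale=1.0 / eps, size=d)
            centers[j] = noisy_sum / noisy_count
    return labels, centers

def dp_sgd_fit_gaussian(Xc, steps=200, lr=0.1, clip=1.0, sigma=1.0, batch=32):
    # DP-SGD on the mean of a spherical Gaussian "generator": clip each
    # per-example gradient to L2 norm `clip`, then add Gaussian noise.
    # (The paper trains RBMs/VAEs this way; a Gaussian keeps the sketch short.)
    d = Xc.shape[1]
    mu = np.zeros(d)
    for _ in range(steps):
        idx = rng.choice(len(Xc), size=min(batch, len(Xc)), replace=False)
        grads = mu - Xc[idx]                      # grad of 0.5*||x - mu||^2 w.r.t. mu
        norms = np.linalg.norm(grads, axis=1, keepdims=True)
        grads = grads / np.maximum(1.0, norms / clip)       # per-example clipping
        noisy_grad = grads.sum(axis=0) + rng.normal(scale=sigma * clip, size=d)
        mu -= lr * noisy_grad / len(idx)
    return mu

# Toy data: two separated blobs standing in for a real high-dimensional dataset.
X = np.vstack([rng.normal(0.0, 1.0, (500, 2)), rng.normal(6.0, 1.0, (500, 2))])
labels, _ = dp_kmeans(X, k=2)
generators = [dp_sgd_fit_gaussian(X[labels == j]) for j in range(2)]
# Synthetic data: sample from each per-cluster generator and pool the draws.
synthetic = np.vstack([rng.normal(mu, 1.0, (500, 2)) for mu in generators])
print(synthetic.shape)   # (1000, 2)

In the paper itself each per-cluster generator is an RBM or VAE rather than a Gaussian; the sketch only reproduces the overall structure of clustering privately and then training one differentially private generator per cluster.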
UCL Researchers
  • Dept of Computer Science