Agenda

Generalized Energy-Based Models
April 7
ABSTRACT

I will introduce Generalized Energy Based Models (GEBM) for generative modelling. These models combine two trained components: a base distribution (generally an implicit model, as in a Generative Adversarial Network, or GAN), which can learn the support of data with low intrinsic dimension in a high-dimensional space; and an energy function, which refines the probability mass on the learned support. Both the energy function and the base jointly constitute the final model, unlike GANs, which retain only the base distribution (the "generator"). In particular, while the energy function is analogous to the GAN critic function, it is not discarded after training. GEBMs are trained by alternating between learning the energy and the base, much like a GAN. Both training stages are well-defined: the energy is learned by maximising a generalized likelihood, and the resulting energy-based loss provides informative gradients for learning the base. Samples from the posterior on the latent space of the trained model can be obtained via MCMC, thus finding regions in this space that produce better-quality samples. Empirically, GEBM samples on image-generation tasks are of better quality than those from the learned generator alone, indicating that, all else being equal, a GEBM will outperform a GAN of the same complexity. GEBMs also achieve state-of-the-art performance on density modelling tasks when using base measures with an explicit form.
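For a concrete picture of the alternating scheme and the latent-space sampling described above, the following is a minimal toy sketch in PyTorch. It is an illustration, not the authors' implementation: the tiny MLP architectures, the synthetic circle dataset, the optimiser settings, the Langevin step size, and the use of a plain unadjusted Langevin sampler are all assumptions made for the example, and the regularisation terms used in practice are omitted. The energy step maximises the generalized likelihood E_P[-E(x)] - log E_Q[exp(-E(x))] (a Donsker-Varadhan form), and the base step minimises the same energy-based loss.

    import math
    import torch
    import torch.nn as nn

    latent_dim, data_dim = 16, 2  # toy sizes, chosen only for illustration

    def sample_data(n):
        # Hypothetical stand-in for a real dataset: points on a unit circle,
        # i.e. low-intrinsic-dimension data embedded in a higher-dim space.
        theta = 2 * math.pi * torch.rand(n)
        return torch.stack([torch.cos(theta), torch.sin(theta)], dim=1)

    # Base distribution: an implicit (GAN-style) generator pushing a
    # Gaussian latent through a small MLP.
    generator = nn.Sequential(
        nn.Linear(latent_dim, 128), nn.ReLU(),
        nn.Linear(128, data_dim),
    )

    # Energy function: scalar-valued, analogous to a GAN critic, but kept
    # as part of the final model rather than discarded after training.
    energy = nn.Sequential(
        nn.Linear(data_dim, 128), nn.ReLU(),
        nn.Linear(128, 1),
    )

    opt_e = torch.optim.Adam(energy.parameters(), lr=1e-4)
    opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)

    def logmeanexp(v):
        # log of the empirical mean of exp(v), computed stably
        return torch.logsumexp(v, dim=0) - math.log(v.numel())

    for step in range(1000):
        x_real = sample_data(64)

        # Energy step: maximise E_P[-E(x)] - log E_Q[exp(-E(x))], with base
        # samples standing in for Q; equivalently minimise the negation.
        x_fake = generator(torch.randn(64, latent_dim)).detach()
        loss_e = energy(x_real).mean() + logmeanexp(-energy(x_fake).squeeze(-1))
        opt_e.zero_grad(); loss_e.backward(); opt_e.step()

        # Base step: the same energy-based loss gives informative gradients
        # for the generator; minimising it moves base samples toward low
        # energy (an importance-weighted gradient, weights softmax(-E)).
        x_fake = generator(torch.randn(64, latent_dim))
        loss_g = -logmeanexp(-energy(x_fake).squeeze(-1))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # Posterior sampling on the latent space via unadjusted Langevin
    # dynamics, with potential U(z) = E(G(z)) + ||z||^2 / 2 (energy plus
    # Gaussian prior); this finds latents that yield better samples.
    z = torch.randn(64, latent_dim)
    step_size = 1e-2
    for _ in range(200):
        z = z.detach().requires_grad_(True)
        U = (energy(generator(z)).squeeze(-1) + 0.5 * (z ** 2).sum(dim=1)).sum()
        grad, = torch.autograd.grad(U, z)
        z = z - step_size * grad + math.sqrt(2 * step_size) * torch.randn_like(z)
    x_gebm = generator(z).detach()  # GEBM samples: generator refined by energy

Note that sampling the base alone (generator(torch.randn(...))) recovers GAN-style generation; the final Langevin loop is what distinguishes GEBM sampling, since it reweights the latent space using the trained energy.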

PRESENTER

ARTHUR GRETTON, Gatsby Computational Neuroscience Unit, University College London.

Arthur Gretton is a Professor with the Gatsby Computational Neuroscience Unit, and director of the Centre for Computational Statistics and Machine Learning (CSML) at University College London. He received degrees in Physics and Systems Engineering from the Australian National University, and a PhD with Microsoft Research and the Signal Processing and Communications Laboratory at the University of Cambridge. He previously worked at the Max Planck Institute (MPI) for Biological Cybernetics, and at the Machine Learning Department, Carnegie Mellon University. Arthur's recent research interests in machine learning include the design and training of generative models, both implicit (e.g. GANs) and explicit (high/infinite-dimensional exponential family models and energy-based models), nonparametric hypothesis testing, survival analysis, causality, and kernel methods.

He was an associate editor of IEEE Transactions on Pattern Analysis and Machine Intelligence from 2009 to 2013, and has been an Action Editor for the Journal of Machine Learning Research (JMLR) since April 2013. He was an Area Chair for NeurIPS in 2008 and 2009, a Senior Area Chair for NeurIPS in 2018, an Area Chair for the International Conference on Machine Learning (ICML) in 2011 and 2012, and a member of the Conference on Learning Theory (COLT) Program Committee in 2013; he has been a member of the Royal Statistical Society Research Section Committee since January 2020. Arthur was program chair for Artificial Intelligence and Statistics (AISTATS) in 2016 (with Christian Robert), tutorials chair for ICML 2018 (with Ruslan Salakhutdinov), workshops chair for ICML 2019 (with Honglak Lee), program chair for the DALI workshop in 2019 (with Krikamol Muandet and Shakir Mohamed), and co-organiser of the Machine Learning Summer School 2019 in London (with Marc Deisenroth).

WHEN AND WHERE

Wednesday, April 7, 15:00, on Zoom.