Statistical theory of deep generative models

Topic: Learning Theory
Format: Hybrid
Location: SIMIS, Shanghai
Speaker: Lin Lizhen (University of Maryland)
Time (GMT+8)
Abstract

Deep generative models are probabilistic generative models in which the generator is parameterized by a deep neural network. They are popular models for high-dimensional data such as text, images, and speech, and have achieved impressive empirical success. Despite this empirical success, a theoretical understanding of such models is largely lacking. We investigate statistical properties of deep generative models from a nonparametric distribution estimation viewpoint. In the model considered, data are assumed to be observed in a high-dimensional ambient space but to concentrate around a low-dimensional structure, such as a lower-dimensional manifold. This talk will explain, through the lens of statistical theory, why deep generative models can perform well. In particular, we will provide insights into
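To make the setting concrete, here is a minimal sketch (not the speaker's code) of a deep generative model viewed as the pushforward of a low-dimensional latent distribution through a neural-network generator; the dimensions, architecture, and random weights below are illustrative assumptions only.

```python
# Hypothetical illustration: a generator network G maps a low-dimensional
# latent variable to a high-dimensional ambient space, so generated samples
# lie on a low-dimensional surface inside that space.
import numpy as np

rng = np.random.default_rng(0)

d_latent = 2      # intrinsic (latent) dimension, assumed small
d_ambient = 100   # ambient dimension of the observed data, assumed large

# Randomly initialized two-layer generator G: R^2 -> R^100.
W1 = rng.standard_normal((64, d_latent))
b1 = rng.standard_normal(64)
W2 = rng.standard_normal((d_ambient, 64))
b2 = rng.standard_normal(d_ambient)

def generator(z):
    """Map latent codes z of shape (n, d_latent) to samples of shape (n, d_ambient)."""
    h = np.tanh(z @ W1.T + b1)   # hidden layer
    return h @ W2.T + b2         # output layer in the ambient space

# Push standard Gaussian latent noise through the generator: this pushforward
# distribution is the deep generative model in the distribution-estimation sense.
z = rng.standard_normal((1000, d_latent))
x = generator(z)

print("latent dimension :", z.shape[1])   # 2
print("ambient dimension:", x.shape[1])   # 100
# Every row of x is the deterministic image of a 2-dimensional code, so the
# 1000 samples all lie on a 2-dimensional surface embedded in R^100.
```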