Constructive universal distribution generation through deep ReLU networks

Part of the Proceedings of the International Conference on Machine Learning (ICML 2020) pre-proceedings




Dmytro Perekrestenko, Stephan Müller, Helmut Bölcskei


<p>We present an explicit deep network construction that transforms uniformly distributed one-dimensional noise into an arbitrarily close approximation of any two-dimensional target distribution with finite differential entropy and Lipschitz-continuous pdf. The key ingredient of our design is a generalization of the "space-filling" property of sawtooth functions introduced in (Bailey &amp; Telgarsky, 2018). We highlight the role of depth in our construction in driving the Wasserstein distance between the target distribution and the approximation realized by the proposed neural network to zero. Finally, we outline how our construction can be extended to output distributions of arbitrary dimension.</p>
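The "space-filling" property underlying the construction can be illustrated numerically. The sketch below (an illustration of the sawtooth idea from (Bailey &amp; Telgarsky, 2018), not the paper's full construction) builds the tent map from three ReLU units, composes it `k` times to obtain a sawtooth with exponentially many teeth, and checks that the curve `(U, g_k(U))` for uniform `U` nearly fills the unit square with uniform mass:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def sawtooth(x):
    # Tent map on [0, 1] written with ReLUs:
    # g(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) + 2 ReLU(x - 1)
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def sawtooth_k(x, k):
    # k-fold composition: realizable by a depth-k ReLU network,
    # yielding a sawtooth with 2^(k-1) teeth on [0, 1]
    for _ in range(k):
        x = sawtooth(x)
    return x

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, size=200_000)

# The curve (U, g_k(U)) oscillates rapidly, so its empirical
# distribution approaches the uniform distribution on [0, 1]^2.
y = sawtooth_k(u, 8)
hist, _, _ = np.histogram2d(u, y, bins=8, range=[[0, 1], [0, 1]])

# Relative deviation of bin counts from uniformity: small (Monte Carlo noise only)
print(hist.std() / hist.mean())
```

With `k = 8` the sawtooth has 128 teeth, so each of the 8×8 histogram bins receives an essentially equal share of mass; depth drives this discrepancy (and hence the Wasserstein distance to the uniform distribution on the square) to zero exponentially fast.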