Max-Sliced Wasserstein Distance and its use for GANs

Authors :
Deshpande, Ishan
Hu, Yuan-Ting
Sun, Ruoyu
Pyrros, Ayis
Siddiqui, Nasir
Koyejo, Sanmi
Zhao, Zhizhen
Forsyth, David
Schwing, Alexander
Publication Year :
2019

Abstract

Generative adversarial nets (GANs) and variational auto-encoders have significantly improved our distribution modeling capabilities, showing promise for dataset augmentation, image-to-image translation and feature learning. However, to model high-dimensional distributions, sequential training and stacked architectures are common, increasing the number of tunable hyper-parameters as well as the training time. Nonetheless, the sample complexity of the distance metric remains one of the factors affecting GAN training. We first show that the recently proposed sliced Wasserstein distance has compelling sample complexity properties compared to the Wasserstein distance. To further improve the sliced Wasserstein distance, we then analyze its 'projection complexity' and develop the max-sliced Wasserstein distance, which enjoys compelling sample complexity while reducing projection complexity, albeit necessitating a max estimation. We finally illustrate that the proposed distance easily trains GANs on high-dimensional images up to a resolution of 256x256.

Comment: Accepted to CVPR 2019
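The abstract's central object, the max-sliced Wasserstein distance, maximizes the one-dimensional Wasserstein distance over projection directions instead of averaging it over random slices. The following is a minimal, hypothetical PyTorch sketch of that idea for two empirical point clouds; the function names (sliced_w2_sq, max_sliced_w2), the gradient-ascent surrogate for the max, and the toy Gaussian data are illustrative assumptions, not the paper's GAN training procedure.

```python
# Minimal sketch, not the authors' code: approximate the max-sliced
# Wasserstein-2 distance between two empirical distributions by gradient
# ascent over a single projection direction (equal sample counts assumed).
import torch

def sliced_w2_sq(x, y, w):
    """Squared 1-D Wasserstein-2 distance between x and y projected onto direction w."""
    w = w / w.norm()                     # restrict to a unit direction
    px, _ = torch.sort(x @ w)            # sorting yields the 1-D optimal coupling
    py, _ = torch.sort(y @ w)
    return ((px - py) ** 2).mean()

def max_sliced_w2(x, y, steps=200, lr=0.1):
    """Approximate the max over directions by gradient ascent on w."""
    w = torch.randn(x.shape[1], requires_grad=True)
    opt = torch.optim.Adam([w], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        (-sliced_w2_sq(x, y, w)).backward()   # ascend on the sliced distance
        opt.step()
    with torch.no_grad():
        return sliced_w2_sq(x, y, w).sqrt().item()

# Toy usage: two isotropic Gaussians separated by 3 along one coordinate.
x = torch.randn(512, 16)
y = torch.randn(512, 16)
y[:, 0] += 3.0
print(max_sliced_w2(x, y))               # roughly 3, the true separation
```

A single maximizing direction replaces the many random projections of the averaged sliced distance, which is the projection-complexity reduction the abstract refers to; the price is the inner max estimation, handled here by plain gradient ascent.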

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1904.05877
Document Type :
Working Paper