Data Instance Prior (DISP) in Generative Adversarial Networks
Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2022
Abstract
Recent advances in generative adversarial networks (GANs) have shown remarkable progress in generating high-quality images. However, this gain in performance depends on the availability of a large amount of training data. In limited data regimes, training typically diverges, and as a result the generated samples are of low quality and lack diversity. Previous works have addressed training in low-data settings by leveraging transfer learning and data augmentation techniques. We propose a novel transfer learning method for GANs in the limited data domain by leveraging an informative data prior derived from self-supervised or supervised networks pre-trained on a diverse source domain. We perform experiments on several standard vision datasets using various GAN architectures (BigGAN, SNGAN, StyleGAN2) to demonstrate that the proposed method effectively transfers knowledge to domains with few target images, outperforming existing state-of-the-art techniques in terms of image quality and diversity. We also show the utility of the data instance prior in large-scale unconditional image generation.
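To make the idea of a data instance prior more concrete, the sketch below shows one plausible way such a prior could be used during generator updates: a frozen encoder pre-trained on a diverse source domain (here an ImageNet ResNet-50; the abstract also mentions self-supervised encoders) provides instance-level features of the few real target images, and a feature-matching penalty pulls generated samples toward them. This is an illustrative, simplified interpretation under stated assumptions, not the paper's exact DISP formulation or loss; the variables in the training-loop comment (adversarial_loss, d, lambda_prior, real_batch) are hypothetical.

```python
# Minimal sketch of an instance-level prior from a frozen pre-trained encoder.
# Assumption: images are already resized/normalized as the encoder expects.
import torch
import torch.nn.functional as F
from torchvision import models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen feature extractor trained on a diverse source domain (ImageNet here).
encoder = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
encoder.fc = torch.nn.Identity()          # keep penultimate-layer features
encoder = encoder.eval().requires_grad_(False).to(device)

def instance_prior_loss(fake_images: torch.Tensor,
                        real_images: torch.Tensor) -> torch.Tensor:
    """Match encoder features of generated images to features of real
    target-domain instances (a simple stand-in for a data instance prior)."""
    f_fake = encoder(fake_images)
    with torch.no_grad():
        f_real = encoder(real_images)      # prior features from the few real images
    return F.mse_loss(f_fake, f_real)

# Hypothetical use inside a generator update:
# g_loss = adversarial_loss(d(fake_images)) \
#          + lambda_prior * instance_prior_loss(fake_images, real_batch)
```

In this reading, the pre-trained encoder supplies the knowledge transferred from the source domain, while the weight on the prior term (lambda_prior above) would trade off adversarial realism against fidelity to the limited target data.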