collection of resources to understand GANs (generative adversarial networks) and how to implement GAN models
- BEGAN summary: https://blog.heuritech.com/2017/04/11/began-state-of-the-art-generation-of-faces-with-generative-adversarial-networks/
- alternative loss
- wasserstein: https://arxiv.org/abs/1701.07875
- improved wasserstein: https://arxiv.org/abs/1704.00028
- least squares: https://arxiv.org/abs/1611.04076
- generalized loss sensitive GAN: https://arxiv.org/abs/1701.06264
- BEGAN (investigated in this article)
- EBGAN: https://arxiv.org/abs/1609.03126
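The loss variants listed above differ mainly in the objective the discriminator/critic minimizes. As a rough illustration (my own numpy sketch, not code from any of the linked papers), here are per-batch discriminator losses for the standard GAN, WGAN, and least-squares GAN, where `d_real` / `d_fake` are the discriminator's raw scores:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def standard_gan_d_loss(d_real, d_fake):
    # Original GAN: -log D(x) - log(1 - D(G(z))), with sigmoid outputs.
    return -np.mean(np.log(sigmoid(d_real)) + np.log(1.0 - sigmoid(d_fake)))

def wgan_critic_loss(d_real, d_fake):
    # Wasserstein GAN (arXiv:1701.07875): critic maximizes
    # E[D(x)] - E[D(G(z))], i.e. minimizes the negation. No sigmoid.
    return -(np.mean(d_real) - np.mean(d_fake))

def lsgan_d_loss(d_real, d_fake, a=1.0, b=0.0):
    # Least-squares GAN (arXiv:1611.04076): push real scores toward
    # target a and fake scores toward target b.
    return 0.5 * np.mean((d_real - a) ** 2) + 0.5 * np.mean((d_fake - b) ** 2)
```

Note the WGAN critic output is an unbounded score (no sigmoid), which is why it needs a Lipschitz constraint (weight clipping, or the gradient penalty of the improved version).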
- look at existing results for GAN
- improved WGAN: https://arxiv.org/abs/1704.00028
- image editing: https://arxiv.org/abs/1609.07093
- super resolution: https://arxiv.org/abs/1609.04802
- semi-supervised learning: https://arxiv.org/abs/1605.09782
- domain transfer (cycleGAN): https://arxiv.org/abs/1703.10593
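The improved WGAN linked above replaces weight clipping with a gradient penalty on interpolates between real and fake samples. A minimal sketch of that penalty (my own toy example: the critic is linear, D(x) = w·x + b, so its input gradient is just w and the penalty has a closed form — a real implementation would use autograd):

```python
import numpy as np

rng = np.random.default_rng(0)

def gradient_penalty(w, real, fake, lam=10.0):
    # Interpolate between real and fake samples, eps ~ U[0, 1] per sample
    # (arXiv:1704.00028). A linear critic's input gradient is w everywhere,
    # but x_hat is kept to mirror the algorithm's structure.
    eps = rng.uniform(size=(real.shape[0], 1))
    x_hat = eps * real + (1.0 - eps) * fake
    grad_norm = np.linalg.norm(w)  # dD/dx_hat = w for a linear critic
    return lam * np.mean((np.full(x_hat.shape[0], grad_norm) - 1.0) ** 2)
```

The penalty is zero exactly when the critic's gradient norm is 1, softly enforcing the 1-Lipschitz constraint that plain WGAN enforced by clipping.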
- tutorials / blog posts
- https://towardsdatascience.com/a-new-kind-of-deep-neural-networks-749bcde19108
- dcgan chainer: https://medium.com/@keisukeumezawa/dcgan-generate-the-images-with-deep-convolutinal-gan-55edf947c34b
- read
- write out math
- chainer tutorial
- implement in chainer
- cycleGAN
- applications:
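For the "write out math" item above, a natural starting point is the standard GAN minimax objective (from the original Goodfellow et al. formulation, which the losses listed earlier modify):

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

Here $D$ maps samples to a probability of being real, $G$ maps noise $z \sim p_z$ to samples, and at the optimum $D$ the inner value reduces to a Jensen–Shannon divergence between $p_{\text{data}}$ and the generator distribution.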
- easy starters
- https://blog.openai.com/generative-models/
- https://arxiv.org/abs/1701.00160
- https://github.com/soumith/ganhacks
- 1D Generative Adversarial Network Demo
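A 1D GAN demo of this kind can be written in a few dozen lines. The sketch below is my own toy version (numpy only, manual gradients; all hyperparameters are illustrative): the generator G(z) = a·z + b tries to map N(0, 1) onto "real" data from N(3, 0.5), against a logistic-regression discriminator D(x) = sigmoid(w·x + c), trained with the non-saturating generator loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_1d_gan(steps=3000, batch=64, lr=0.02):
    a, b = 1.0, 0.0  # generator parameters: G(z) = a*z + b
    w, c = 0.0, 0.0  # discriminator parameters: D(x) = sigmoid(w*x + c)
    for _ in range(steps):
        real = rng.normal(3.0, 0.5, size=batch)
        z = rng.normal(size=batch)
        fake = a * z + b

        # Discriminator step: minimize -log D(real) - log(1 - D(fake)).
        d_real = sigmoid(w * real + c)
        d_fake = sigmoid(w * fake + c)
        gw = np.mean(-(1.0 - d_real) * real) + np.mean(d_fake * fake)
        gc = np.mean(-(1.0 - d_real)) + np.mean(d_fake)
        w -= lr * gw
        c -= lr * gc

        # Generator step: minimize -log D(fake) (non-saturating loss).
        d_fake = sigmoid(w * fake + c)
        ga = np.mean(-(1.0 - d_fake) * w * z)
        gb = np.mean(-(1.0 - d_fake) * w)
        a -= lr * ga
        b -= lr * gb
    return a, b, w, c
```

After training, samples a·z + b should drift toward the real distribution's mean — a useful sanity check before moving to the image-scale chainer implementation planned above.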