JGAN
Code based on PyTorch-GAN.
Our GAN model zoo supports 27 kinds of GANs. The table below lists the latest citation counts we found on Google Scholar. Since GAN was proposed in 2014, a great deal of excellent GAN-based work has appeared. These 27 GANs have a total of 60953 citations, an average of 2176 citations per paper.
We compared the performance of these GANs in Jittor and PyTorch. The PyTorch version uses commit a163b8 (August 24, 2019) from the master branch of the PyTorch-GAN GitHub repository. The figure below shows the speedup of Jittor relative to PyTorch: the highest speedup among these GANs reaches 283%, and the average speedup is 185%.
Put another way, if PyTorch's training time is taken as 100 hours, we calculated the corresponding training time with Jittor. Among these GANs, the fastest one needs only 35 hours with Jittor, and the average is 57 hours.
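To make the relative-time figures concrete, here is a minimal sketch, assuming each reported speedup is measured as PyTorch time divided by Jittor time:

```python
# Convert the reported speedups into training time relative to a 100-hour
# PyTorch baseline, assuming speedup = PyTorch_time / Jittor_time.
pytorch_hours = 100.0
speedup_max = 2.83   # highest speedup among the 27 GANs (283%)
speedup_avg = 1.85   # average speedup (185%)

print(f"fastest GAN with Jittor: {pytorch_hours / speedup_max:.0f} hours")   # ~35 hours
# Inverting the average speedup gives ~54 hours; the 57-hour figure above presumably
# averages the per-model training times instead, so the two summaries can differ slightly.
print(f"implied by the average speedup: {pytorch_hours / speedup_avg:.0f} hours")
```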
News
- The 2nd Jittor AI Challenge officially opened on 2022/04/15.
- The Jittor AI Algorithm Challenge is an AI algorithm competition based on Tsinghua University's Jittor deep learning framework. It was founded in 2021 by the Beijing National Research Center for Information Science and Technology and the Tsinghua University-Tencent Joint Laboratory for Internet Innovation Technology, under the guidance of the Department of Information Sciences of the National Natural Science Foundation of China. Starting this year, it is held as one of the open-source task challenges of the China Software Open Source Innovation Competition.
- The competition is open to all students and practitioners in AI-related fields. It aims to improve participants' skills in algorithm research and technical application for data analysis and processing through competition, and to advance the ecosystem of China's home-grown AI platforms as well as AI research and applications. The competition is sponsored by Tencent.
- This year's challenge consists of one warm-up track (handwritten digit generation) and two formal tracks (landscape image generation, and novel view synthesis with differentiable rendering); contestants must pass the warm-up track before entering the two formal tracks. More information is available on the official website, and details of the warm-up track and Track 1 (landscape image generation) can be found here.
Table of Contents
Installation
$ git clone https://github.com/Jittor/JGAN.git
$ cd JGAN/
$ sudo python3.7 -m pip install -r requirements.txt
Models
Auxiliary Classifier GAN
Auxiliary Classifier Generative Adversarial Network
Authors
Augustus Odena, Christopher Olah, Jonathon Shlens
[Paper] [Code]
Run Example
$ cd models/acgan/
$ python3.7 acgan.py
Adversarial Autoencoder
Adversarial Autoencoder
Authors
Alireza Makhzani, Jonathon Shlens, Navdeep Jaitly, Ian Goodfellow, Brendan Frey
[Paper] [Code]
Run Example
$ cd models/aae/
$ python3.7 aae.py
BEGAN
BEGAN: Boundary Equilibrium Generative Adversarial Networks
Authors
David Berthelot, Thomas Schumm, Luke Metz
[Paper] [Code]
Run Example
$ cd models/began/
$ python3.7 began.py
BicycleGAN
Toward Multimodal Image-to-Image Translation
Authors
Jun-Yan Zhu, Richard Zhang, Deepak Pathak, Trevor Darrell, Alexei A. Efros, Oliver Wang, Eli Shechtman
[Paper] [Code]
Run Example
$ cd data/
$ bash download_pix2pix_dataset.sh edges2shoes
$ cd ../models/bicyclegan/
$ python3.7 bicyclegan.py
Various style translations by varying the latent code.
Boundary-Seeking GAN
Boundary-Seeking Generative Adversarial Networks
Authors
R Devon Hjelm, Athul Paul Jacob, Tong Che, Adam Trischler, Kyunghyun Cho, Yoshua Bengio
[Paper] [Code]
Run Example
$ cd models/bgan/
$ python3.7 bgan.py
Cluster GAN
ClusterGAN: Latent Space Clustering in Generative Adversarial Networks
Authors
Sudipto Mukherjee, Himanshu Asnani, Eugene Lin, Sreeram Kannan
[Paper] [Code]
Run Example
$ cd models/cluster_gan/
$ python3.7 clustergan.py
Conditional GAN
Conditional Generative Adversarial Nets
Authors
Mehdi Mirza, Simon Osindero
[Paper] [Code]
Run Example
$ cd models/cgan/
$ python3.7 cgan.py
Context Encoder
Context Encoders: Feature Learning by Inpainting
Authors
Deepak Pathak, Philipp Krahenbuhl, Jeff Donahue, Trevor Darrell, Alexei A. Efros
[Paper] [Code]
Run Example
$ cd models/context_encoder/
<follow steps at the top of context_encoder.py>
$ python3.7 context_encoder.py
Rows: Masked | Inpainted | Original | Masked | Inpainted | Original
Coupled GAN
Coupled Generative Adversarial Networks
Authors
Ming-Yu Liu, Oncel Tuzel
[Paper] [Code]
Run Example
<download mnistm.pkl from https://cloud.tsinghua.edu.cn/f/d9a411da271745fcbe1f/?dl=1 and put it into data/mnistm/mnistm.pkl>
$ cd models/cogan/
$ python3.7 cogan.py
Generated MNIST and MNIST-M images
CycleGAN
Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks
Authors
Jun-Yan Zhu, Taesung Park, Phillip Isola, Alexei A. Efros
[Paper] [Code]
Run Example
$ cd data/
$ bash download_cyclegan_dataset.sh monet2photo
$ cd ../models/cyclegan/
$ python3.7 cyclegan.py --dataset_name monet2photo
Monet to photo translations.
Deep Convolutional GAN
Deep Convolutional Generative Adversarial Network
Authors
Alec Radford, Luke Metz, Soumith Chintala
[Paper] [Code]
Run Example
$ cd models/dcgan/
$ python3.7 dcgan.py
DRAGAN
On Convergence and Stability of GANs
Authors
Naveen Kodali, Jacob Abernethy, James Hays, Zsolt Kira
[Paper] [Code]
Run Example
$ cd models/dragan/
$ python3.7 dragan.py
Energy-Based GAN
Energy-based Generative Adversarial Network
Authors
Junbo Zhao, Michael Mathieu, Yann LeCun
[Paper] [Code]
Run Example
$ cd models/ebgan/
$ python3.7 ebgan.py
Enhanced Super-Resolution GAN
ESRGAN: Enhanced Super-Resolution Generative Adversarial Networks
Authors
Xintao Wang, Ke Yu, Shixiang Wu, Jinjin Gu, Yihao Liu, Chao Dong, Chen Change Loy, Yu Qiao, Xiaoou Tang
[Paper] [Code]
Run Example
$ cd models/esrgan/
<follow steps at the top of esrgan.py>
$ python3.7 esrgan.py
GAN
Generative Adversarial Network
Authors
Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio
[Paper] [Code]
Run Example
$ cd models/gan/
$ python3.7 gan.py
InfoGAN
InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets
Authors
Xi Chen, Yan Duan, Rein Houthooft, John Schulman, Ilya Sutskever, Pieter Abbeel
[Paper] [Code]
Run Example
$ cd models/infogan/
$ python3.7 infogan.py
Result of varying continuous latent variable by row.
Least Squares GAN
Least Squares Generative Adversarial Networks
Authors
Xudong Mao, Qing Li, Haoran Xie, Raymond Y.K. Lau, Zhen Wang, Stephen Paul Smolley
[Paper] [Code]
Run Example
$ cd models/lsgan/
$ python3.7 lsgan.py
Pix2Pix
Image-to-Image Translation with Conditional Adversarial Networks
Authors
Phillip Isola, Jun-Yan Zhu, Tinghui Zhou, Alexei A. Efros
[Paper] [Code]
Run Example
$ cd data/
$ bash download_pix2pix_dataset.sh facades
$ cd ../models/pix2pix/
$ python3.7 pix2pix.py --dataset_name facades
Rows from top to bottom: (1) The condition for the generator (2) Generated image based on the condition (3) The ground-truth image corresponding to the condition
PixelDA
Unsupervised Pixel-Level Domain Adaptation with Generative Adversarial Networks
Authors
Konstantinos Bousmalis, Nathan Silberman, David Dohan, Dumitru Erhan, Dilip Krishnan
[Paper] [Code]
MNIST to MNIST-M Classification
Trains a classifier on images that have been translated from the source domain (MNIST) to the target domain (MNIST-M) using the annotations of the source domain images. The classification network is trained jointly with the generator network to optimize the generator for both providing a proper domain translation and also for preserving the semantics of the source domain image. The classification network trained on translated images is compared to the naive solution of training a classifier on MNIST and evaluating it on MNIST-M. The naive model manages a 55% classification accuracy on MNIST-M while the one trained during domain adaptation achieves a 95% classification accuracy.
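To make the joint objective concrete, here is a minimal, self-contained Jittor sketch of the idea described above. It is not the code in models/pixelda/pixelda.py; the tiny MLP networks, the least-squares adversarial term, and the lambda_task weight are illustrative assumptions.

```python
import jittor as jt
from jittor import nn

# Toy stand-ins for the PixelDA networks (the real model uses conv nets on RGB images).
class Net(nn.Module):
    def __init__(self, out_dim):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(28 * 28, 256), nn.ReLU(), nn.Linear(256, out_dim))
    def execute(self, x):
        return self.body(x)

G = Net(28 * 28)   # generator: source image -> translated ("target-like") image
D = Net(1)         # discriminator: real target image vs. translated image
C = Net(10)        # classifier: digit label of the translated image

opt_g = jt.optim.Adam(list(G.parameters()) + list(C.parameters()), lr=2e-4)
opt_d = jt.optim.Adam(D.parameters(), lr=2e-4)

def train_step(src_imgs, src_labels, tgt_imgs, lambda_task=0.1):
    # Generator + classifier: the translated image must fool D (domain translation)
    # and still be classified with the source label (semantic preservation).
    fake_tgt = G(src_imgs)
    adv_g = ((D(fake_tgt) - 1) ** 2).mean()               # least-squares adversarial term
    task = nn.cross_entropy_loss(C(fake_tgt), src_labels)
    opt_g.step(adv_g + lambda_task * task)

    # Discriminator: separate real target-domain images from translated ones.
    fake_tgt = G(src_imgs).detach()
    adv_d = ((D(tgt_imgs) - 1) ** 2).mean() + (D(fake_tgt) ** 2).mean()
    opt_d.step(adv_d)

# One step on random data, just to show the call pattern.
train_step(jt.randn(8, 28 * 28), jt.randint(0, 10, shape=(8,)), jt.randn(8, 28 * 28))
```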
<download mnistm.pkl from https://cloud.tsinghua.edu.cn/f/d9a411da271745fcbe1f/?dl=1 and put it into data/mnistm/mnistm.pkl>
$ cd models/pixelda/
$ python3.7 pixelda.py
Rows from top to bottom: (1) Real images from MNIST (2) Translated images from
MNIST to MNIST-M (3) Examples of images from MNIST-M
Relativistic GAN
The relativistic discriminator: a key element missing from standard GAN
Authors
Alexia Jolicoeur-Martineau
[Paper] [Code]
Run Example
$ cd models/relativistic_gan/
$ python3.7 relativistic_gan.py # Relativistic Standard GAN
$ python3.7 relativistic_gan.py --rel_avg_gan # Relativistic Average GAN
Semi-Supervised GAN
Semi-Supervised Generative Adversarial Network
Authors
Augustus Odena
[Paper] [Code]
Run Example
$ cd models/sgan/
$ python3.7 sgan.py
Softmax GAN
Softmax GAN
Authors
Min Lin
[Paper] [Code]
Run Example
$ cd models/softmax_gan/
$ python3.7 softmax_gan.py
StarGAN
StarGAN: Unified Generative Adversarial Networks for Multi-Domain Image-to-Image Translation
Authors
Yunjey Choi, Minje Choi, Munyoung Kim, Jung-Woo Ha, Sunghun Kim, Jaegul Choo
[Paper] [Code]
Run Example
$ cd models/stargan/
<follow steps at the top of stargan.py>
$ python3.7 stargan.py
Original | Black Hair | Blonde Hair | Brown Hair | Gender Flip | Aged
UNIT
Unsupervised Image-to-Image Translation Networks
Authors
Ming-Yu Liu, Thomas Breuel, Jan Kautz
[Paper] [Code]
Run Example
$ cd data/
$ bash download_cyclegan_dataset.sh apple2orange
$ cd ../models/unit/
$ python3.7 unit.py --dataset_name apple2orange
Wasserstein GAN
Wasserstein GAN
Authors
Martin Arjovsky, Soumith Chintala, Léon Bottou
[Paper] [Code]
Run Example
$ cd models/wgan/
$ python3.7 wgan.py
Wasserstein GAN GP
Improved Training of Wasserstein GANs
Authors
Ishaan Gulrajani, Faruk Ahmed, Martin Arjovsky, Vincent Dumoulin, Aaron Courville
[Paper] [Code]
Run Example
$ cd models/wgan_gp/
$ python3.7 wgan_gp.py
Wasserstein GAN DIV
Wasserstein Divergence for GANs
Authors
Jiqing Wu, Zhiwu Huang, Janine Thoma, Dinesh Acharya, Luc Van Gool
[Paper] [Code]
Run Example
$ cd models/wgan_div/
$ python3.7 wgan_div.py