GANs 101 and its Applications


This repository is for AI Singapore's AIAP Batch 12 Group Presentation Topic: "GANs 101 and its Applications".

In this repository, we generated fake celebrity images based on the CelebA dataset using three different Generative Adversarial Networks (GANs), namely:

  • DCGAN (Deep Convolutional GAN)
  • WGAN (Wasserstein GAN)
  • LSGAN (Least Squares GAN)
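
These variants share the same adversarial setup but differ in the objective used to train the generator and the discriminator (or critic, in WGAN). The snippet below is a minimal, illustrative sketch of the per-batch losses and is not the notebooks' exact implementation; d_real and d_fake stand for the discriminator's raw outputs (logits) on real and generated images.

import torch
import torch.nn.functional as F

# Illustrative per-batch losses only; the notebooks' implementation may differ.
# d_real / d_fake are the discriminator's raw (unactivated) outputs on real
# and generated images respectively.
def discriminator_loss(d_real, d_fake, mode):
    if mode == "gan":    # DCGAN: standard binary cross-entropy objective
        return (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real))
                + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    if mode == "lsgan":  # LSGAN: least-squares objective
        return 0.5 * (((d_real - 1) ** 2).mean() + (d_fake ** 2).mean())
    if mode == "wgan":   # WGAN: critic widens the score gap; weight clipping is applied elsewhere
        return d_fake.mean() - d_real.mean()

def generator_loss(d_fake, mode):
    if mode == "gan":    # non-saturating generator loss
        return F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
    if mode == "lsgan":
        return 0.5 * ((d_fake - 1) ** 2).mean()
    if mode == "wgan":
        return -d_fake.mean()

In the notebooks, the objective is selected via the MODE setting described in the Usage section.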

We also explored the possibility of applying transfer learning to GANs, in an attempt to reduce the time and resources required to train a model from scratch.

Our findings are briefly discussed below. Please refer to our article for the full details.

Installation

There are two options:

  1. Experiment with this repository online using Binder.
  2. Alternatively, clone this repository and run it locally.

To run it locally, install the relevant dependencies using one of the available environment files.

  • Use environment-cuda.yml to run PyTorch with CUDA.
  • Use environment-m1.yml to run this repository on Apple Silicon.
  • Additionally, environment.lock.yml is provided to replicate the exact dependency tree used while developing this repository on Windows.
# create the conda environment from the chosen environment file
conda env create -f environment-{arch}.yml

# activate conda environment
conda activate gan-101
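
Once the environment is activated, the notebook can be opened with Jupyter (assuming Jupyter is available in the chosen environment file):

# launch the notebook (assumes Jupyter is installed in the environment)
jupyter notebook gan-on-celeba-dataset-wgan.ipynb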

Usage

  1. Open the Jupyter Notebook: gan-on-celeba-dataset-wgan.ipynb
  2. Most of the configuration can be found in the second code cell. In particular, the different types of GANs can be selected by setting MODE to gan, lsgan, or wgan.
  3. Set TRANSFER_LEARNING to True to enable transfer learning.
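
For reference, the configuration cell might look roughly like the sketch below; only MODE and TRANSFER_LEARNING are described above, and the values shown are illustrative:

# illustrative configuration values; only MODE and TRANSFER_LEARNING are
# documented in this README, and the defaults shown here are assumptions
MODE = "wgan"              # one of: "gan", "lsgan", "wgan"
TRANSFER_LEARNING = False  # set to True to fine-tune from a pretrained model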

Results

DCGAN

DCGAN training results using 10,000 samples over 60 epochs:

results-dcgan

LSGAN

LSGAN training results using 10,000 samples over 60 epochs:

results-lsgan

WGAN

WGAN training results using 10,000 samples over 60 epochs:

results-wgan

Transfer Learning

We compared a model trained from scratch against a model initialised from a pretrained network, using the same hyperparameters such as learning rate and batch size.

Here is the result of LSGAN trained from scratch at different epochs:

Results From Scratch

Here is the result of LSGAN initialised from a pretrained vgg16_bn model at different epochs:

Transfer Learning Results using VGG16_bn

The pretrained model performs worse than the model trained from scratch upon visual inspection, possibly due to differences between the source and target datasets. Hyperparameter tuning may improve the pretrained model, and using lower layers of vgg16_bn could also be explored.
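
As a rough illustration of this idea, the sketch below reuses a pretrained vgg16_bn feature extractor from torchvision as the discriminator backbone with a small trainable head; the layers, input size, and head actually used in the notebook may differ.

import torch.nn as nn
from torchvision import models

# Sketch only: reuse pretrained vgg16_bn convolutional features as the
# discriminator backbone and attach a small trainable head that outputs a
# single realness score per image. The notebook's actual architecture and
# choice of frozen layers may differ.
class TransferDiscriminator(nn.Module):
    def __init__(self, freeze_backbone=True):
        super().__init__()
        backbone = models.vgg16_bn(weights="IMAGENET1K_V1")  # ImageNet-pretrained weights
        self.features = backbone.features                    # pretrained conv layers
        if freeze_backbone:
            for p in self.features.parameters():
                p.requires_grad = False
        self.pool = nn.AdaptiveAvgPool2d(1)                  # handles varying input sizes
        self.head = nn.Linear(512, 1)                        # trainable realness-score head

    def forward(self, x):
        x = self.features(x)             # (N, 512, H', W')
        x = self.pool(x).flatten(1)      # (N, 512)
        return self.head(x)              # (N, 1)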

Authors

This program was developed by apprentices from Batch 12 of AI Singapore's Apprenticeship Program, with contributions from the following people (in alphabetical order):

References

Further Exploration

Check out the work from other groups too:
