
---
layout: single
title: "Vanilla GAN"
categories: GAN
tag: [python, DL, GAN]
toc: true
---

Vanilla GAN

The basic model that implements the GAN framework: a Generator and a Discriminator trained against each other.

(Figure: Vanilla_GAN1)

Model

The model is built only from nn.Linear layers. The Discriminator ends with a Sigmoid so its output is limited to 0~1, while the Generator ends with Tanh so its output lies in -1~1; a denorm step maps it back to 0~1 when saving images (see the sketch after the code below).

    import torch.nn as nn

    # Discriminator: image -> probability that the image is real (Sigmoid limits the output to 0~1)
    Discriminator = nn.Sequential(
        nn.Linear(image_size, 256),
        nn.LeakyReLU(0.2),
        nn.Linear(256, 256),
        nn.LeakyReLU(0.2),
        nn.Linear(256, 1),
        nn.Sigmoid(),
    )

    # Generator: latent vector z -> image, with Tanh limiting pixel values to -1~1
    Generator = nn.Sequential(
        nn.Linear(latent_size, 256),
        nn.ReLU(),
        nn.Linear(256, 256),
        nn.BatchNorm1d(256),
        nn.ReLU(),
        nn.Linear(256, image_size),
        nn.Tanh(),
    )
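A minimal sketch of the denorm step mentioned above (the exact helper in the repository may differ): it maps the Generator's Tanh output from -1~1 back to 0~1 before the images are saved.

```python
def denorm(x):
    # Map Tanh output in [-1, 1] back to [0, 1] for visualization/saving.
    out = (x + 1) / 2
    return out.clamp(0, 1)

# Usage (illustrative): save_image(denorm(fake_images), 'fake_samples.png')
```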

Optimizer

Both networks are optimized with Adam, using separate optimizers and learning rates for the Generator and the Discriminator.

    self.G_optimizer = torch.optim.Adam(self.G.parameters(), self.g_lr)
    self.D_optimizer = torch.optim.Adam(self.D.parameters(), self.d_lr)

Loss Function

Both networks are trained with a binary cross-entropy criterion (self.criterion), which matches the Discriminator's Sigmoid output; a small setup sketch follows.
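A minimal setup sketch for the snippets below; the concrete values of batch_size and latent_size are assumptions for illustration, not taken from the repository.

```python
import torch
import torch.nn as nn

batch_size, latent_size = 100, 64          # assumed values for illustration

criterion = nn.BCELoss()                   # binary cross-entropy
real_labels = torch.ones(batch_size, 1)    # target 1 for real images
fake_labels = torch.zeros(batch_size, 1)   # target 0 for generated images
z = torch.randn(batch_size, latent_size)   # latent noise fed to the Generator
```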

The Discriminator learns to output D(x) → 1 for real images and D(G(z)) → 0 for generated images.

    outputs = self.D(images)                             # D's prediction on real images
    d_loss_real = self.criterion(outputs, real_labels)   # push D(x) toward 1

    outputs = self.D(self.G(z))                          # D's prediction on generated images
    d_loss_fake = self.criterion(outputs, fake_labels)   # push D(G(z)) toward 0

    d_loss = d_loss_real + d_loss_fake                   # total Discriminator loss
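A hedged sketch of how this Discriminator loss might be applied with the Adam optimizer defined above; the repository's actual training loop may differ.

```python
self.D_optimizer.zero_grad()   # clear gradients from the previous iteration
d_loss.backward()              # backpropagate the Discriminator loss
self.D_optimizer.step()        # update only the Discriminator's parameters
```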

The Generator learns to push D(G(z)) → 1, i.e., to make the Discriminator classify its samples as real.

    outputs = self.D(self.G(z))                       # D's prediction on freshly generated images
    g_loss = self.criterion(outputs, real_labels)     # G wants D to output 1 for its samples
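And a matching sketch of the Generator update; clearing both optimizers' gradients before the step is an assumption of this sketch (g_loss.backward() also leaves gradients on D), not a quote of the repository code.

```python
self.D_optimizer.zero_grad()   # g_loss.backward() also accumulates gradients on D
self.G_optimizer.zero_grad()
g_loss.backward()              # backpropagate the Generator loss through D into G
self.G_optimizer.step()        # update only the Generator's parameters
```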

Results

(Figure: Generated_img — samples produced by the trained Generator)

Training plays the minimax game from the original GAN paper:

$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{data}(x)}[\log D(x)] + \mathbb{E}_{z \sim p_{z}(z)}[\log(1 - D(G(z)))]$
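In terms of the code above, the binary cross-entropy losses correspond to this value function as follows (a standard reformulation, not quoted from the repository):

$\mathcal{L}_D = -\,\mathbb{E}_{x \sim p_{data}(x)}[\log D(x)] - \mathbb{E}_{z \sim p_{z}(z)}[\log(1 - D(G(z)))]$

$\mathcal{L}_G = -\,\mathbb{E}_{z \sim p_{z}(z)}[\log D(G(z))]$

Minimizing $\mathcal{L}_D$ is the same as maximizing $V(D, G)$ with respect to $D$, and $\mathcal{L}_G$ is the non-saturating Generator loss (pushing $D(G(z))$ toward 1, as in `g_loss` above), used in place of minimizing $\log(1 - D(G(z)))$ directly because it gives stronger gradients early in training.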

- All of the code is committed in this GitHub repository.
