Table of Contents
- CycleGAN principles
  - The usual approach to unsupervised conditional generation
  - How CycleGAN handles unsupervised conditional generation:
    - The straightforward approach:
    - CycleGAN's approach:
- CycleGAN implementation:
  - Discriminator structure:
  - Discriminator loss:
  - Generator structure:
    - Generator structure diagram:
    - Residual Block:
    - Autoencoder implementation:
  - Generator loss:
  - Overall structure:
  - Optimizer:
  - Training process:
CycleGAN principles
The usual approach to unsupervised conditional generation
See the reference material; it is very informative.
Discriminator loss:
This is just the ordinary GAN discriminator: it should make the loss on real images low and the loss on fake images high.
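The two loss terms described above can be sketched as follows, using the least-squares (LSGAN) form of the discriminator objective that CycleGAN adopts; `D_out` is assumed to be the discriminator's raw score for a batch of images.

```python
import torch

def real_mse_loss(D_out):
    # low when the discriminator scores real images close to 1
    return torch.mean((D_out - 1) ** 2)

def fake_mse_loss(D_out):
    # low when the discriminator scores fake images close to 0;
    # for the generator this same term is what it tries to push high
    return torch.mean(D_out ** 2)
```

During discriminator training, both terms are minimized together: real images are pushed toward a score of 1 and fake images toward 0.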
Generator structure:
Generator structure diagram:
This is a typical autoencoder structure.
Autoencoder implementation:
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# helper conv function (standard definition, as referenced by the generator)
def conv(in_channels, out_channels, kernel_size, stride=2, padding=1, batch_norm=True):
    """Creates a convolutional layer, with optional batch normalization."""
    layers = [nn.Conv2d(in_channels, out_channels, kernel_size,
                        stride, padding, bias=False)]
    if batch_norm:
        layers.append(nn.BatchNorm2d(out_channels))
    return nn.Sequential(*layers)

# helper deconv function
def deconv(in_channels, out_channels, kernel_size, stride=2, padding=1, batch_norm=True):
    """Creates a transpose convolutional layer, with optional batch normalization."""
    layers = [nn.ConvTranspose2d(in_channels, out_channels, kernel_size,
                                 stride, padding, bias=False)]
    if batch_norm:
        layers.append(nn.BatchNorm2d(out_channels))
    return nn.Sequential(*layers)

class ResidualBlock(nn.Module):
    """Residual block (see the "Residual Block" section above):
    two 3x3 conv layers with a skip connection."""
    def __init__(self, conv_dim):
        super(ResidualBlock, self).__init__()
        self.conv1 = conv(conv_dim, conv_dim, 3, stride=1, padding=1)
        self.conv2 = conv(conv_dim, conv_dim, 3, stride=1, padding=1)

    def forward(self, x):
        out = F.relu(self.conv1(x))
        # skip connection: add the input back to the transformed features
        return x + self.conv2(out)

class CycleGenerator(nn.Module):
    def __init__(self, conv_dim=64, n_res_blocks=6):
        super(CycleGenerator, self).__init__()
        # 1. Encoder part of the generator: downsampling convolutions
        self.conv1 = conv(3, conv_dim, 4)
        self.conv2 = conv(conv_dim, conv_dim*2, 4)
        self.conv3 = conv(conv_dim*2, conv_dim*4, 4)
        # 2. Resnet part of the generator: residual blocks
        res_layers = []
        for layer in range(n_res_blocks):
            res_layers.append(ResidualBlock(conv_dim*4))
        # use sequential to create these layers
        self.res_blocks = nn.Sequential(*res_layers)
        # 3. Decoder part of the generator: two transpose convolutional layers
        # and a third that mirrors the initial conv layer
        self.deconv1 = deconv(conv_dim*4, conv_dim*2, 4)
        self.deconv2 = deconv(conv_dim*2, conv_dim, 4)
        # no batch norm on the last layer
        self.deconv3 = deconv(conv_dim, 3, 4, batch_norm=False)

    def forward(self, x):
        """Given an image x, returns a transformed image."""
        # apply activations between layers as necessary
        out = F.relu(self.conv1(x))
        out = F.relu(self.conv2(out))
        out = F.relu(self.conv3(out))
        out = self.res_blocks(out)
        out = F.relu(self.deconv1(out))
        out = F.relu(self.deconv2(out))
        # tanh applied to the last layer (F.tanh is deprecated)
        out = torch.tanh(self.deconv3(out))
        return out
```
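The table of contents above lists a "Generator loss" section: besides fooling the discriminator, a CycleGAN generator pair is trained with a cycle-consistency term, so that mapping an image from one domain to the other and back reproduces the original. A minimal sketch, assuming the common L1 form and a weighting hyperparameter here named `lambda_weight`:

```python
import torch

def cycle_consistency_loss(real_im, reconstructed_im, lambda_weight=10.0):
    # L1 distance between the original image and its round-trip
    # reconstruction, scaled by the cycle-consistency weight
    return lambda_weight * torch.mean(torch.abs(real_im - reconstructed_im))
```

In training, `reconstructed_im` would be `G_YtoX(G_XtoY(real_im))`, so a perfect round trip drives this term to zero.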