DiscriminatorLSGAN

Inheritance Diagram

Inheritance diagram of ashpy.losses.gan.DiscriminatorLSGAN

class ashpy.losses.gan.DiscriminatorLSGAN[source]

Bases: ashpy.losses.gan.AdversarialLossD

Least squares loss for the discriminator.

Reference: Least Squares Generative Adversarial Networks [1].

The loss is the mean squared error between the discriminator output on fake samples and 0, and between the discriminator output on real samples and 1. For the unconditioned case this is:

\[L_{D} = \frac{1}{2} E[(D(x) - 1)^2 + (0 - D(G(z)))^2]\]

where x are real samples and z is the latent vector.

For the conditioned case this is:

\[L_{D} = \frac{1}{2} E[(D(x, c) - 1)^2 + (0 - D(G(c), c))^2]\]

where c is the condition and x are real samples.

[1]

https://arxiv.org/abs/1611.04076
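The unconditioned formula above can be sketched in plain Python (lists stand in for tensors; `lsgan_discriminator_loss` is an illustrative name, not part of the AshPy API):

```python
def lsgan_discriminator_loss(d_real, d_fake):
    """L_D = 1/2 * E[(D(x) - 1)^2 + (0 - D(G(z)))^2].

    d_real: discriminator outputs on real samples x.
    d_fake: discriminator outputs on generated samples G(z).
    """
    n = len(d_real)
    # MSE between D(x) and the real label 1.
    real_term = sum((dr - 1.0) ** 2 for dr in d_real) / n
    # MSE between D(G(z)) and the fake label 0.
    fake_term = sum((0.0 - df) ** 2 for df in d_fake) / n
    return 0.5 * (real_term + fake_term)

# A perfect discriminator (1 on real, 0 on fake) incurs zero loss.
print(lsgan_discriminator_loss([1.0, 1.0], [0.0, 0.0]))  # → 0.0
```

A fully fooled discriminator (0 on real, 1 on fake) gives a loss of 1.0, the maximum for outputs in [0, 1].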

Methods

__init__()

Initialize loss.

Attributes

fn

Return the Keras loss function to execute.

global_batch_size

Global batch size is the sum of the batch sizes used on each replica (e.g., each GPU/CPU).

weight

Return the loss weight.

class LeastSquareLoss[source]

Bases: tensorflow.python.keras.losses.Loss

Least squares loss implemented as a tf.keras.losses.Loss.

__init__()[source]

Initialize the Loss.

Return type

None

call(d_real, d_fake)[source]

Compute the Least Square Loss.

Parameters
  • d_real (tf.Tensor) – Discriminator output evaluated on real samples.

  • d_fake (tf.Tensor) – Discriminator output evaluated on fake samples.

Return type

Tensor

Returns

tf.Tensor – Loss.
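As a sketch of the per-element computation that call performs before the Keras reduction is applied (plain Python lists stand in for tf.Tensor; `least_square_call` is an illustrative helper, not the actual implementation):

```python
def least_square_call(d_real, d_fake):
    """Per-element least squares loss: 0.5 * ((d_real - 1)^2 + (d_fake - 0)^2).

    The reduction (e.g., averaging over the global batch) is applied
    separately, as exposed by the `reduction` property.
    """
    return [0.5 * ((dr - 1.0) ** 2 + (df - 0.0) ** 2)
            for dr, df in zip(d_real, d_fake)]

print(least_square_call([1.0, 0.5], [0.0, 0.5]))  # → [0.0, 0.25]
```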

property reduction

Return the reduction type for this loss.

Return type

ReductionV2

Returns

tf.keras.losses.Reduction – Reduction.

__init__()[source]

Initialize loss.

Return type

None