Generator

Bases: Module

A Generator model for generating images from a latent space representation.

This model acts as the generator in a Generative Adversarial Network (GAN): it takes a random noise vector drawn from a latent space and outputs a 2D image. The network is a stack of linear layers, each followed by a LeakyReLU activation except for the final output layer, which has no activation. The final output is reshaped to a 1 × 28 × 28 image.

Attributes:
  • latent_space (int) –

    The size of the input latent space.

  • layers_config (list of tuple) –

    Configuration of the layers where each tuple contains (in_features, out_features, negative_slope) for LeakyReLU activated layers, and (in_features, out_features) for the final output layer.

  • model (Sequential) –

    The sequential model comprising the linear and activation layers.

Parameters:
  • latent_space (int, default: 100 ) –

    The size of the latent space from which random inputs are drawn.

Raises:
  • Exception

    If the layers configuration is not provided during the model instantiation, or if the input to the forward pass is None.

Methods:

  • connected_layer –

    Constructs a series of connected layers based on the provided configuration.

  • forward –

    Passes the input through the model to generate an image.

Source code in generator.py
class Generator(nn.Module):
    """
    A Generator model for generating images from a latent space representation.

    This model acts as a generator in a Generative Adversarial Network (GAN),
    taking a random noise vector from a latent space and outputting a 2D image.
    It is a stack of linear layers, each followed by a LeakyReLU activation
    except for the final output layer, which has no activation. The final
    output is reshaped to a 2D image.

    Attributes:
        latent_space (int): The size of the input latent space.
        layers_config (list of tuple): Configuration of the layers where each tuple contains
                                       (in_features, out_features, negative_slope) for LeakyReLU
                                       activated layers, and (in_features, out_features) for the
                                       final output layer.
        model (nn.Sequential): The sequential model comprising the linear and activation layers.

    Args:
        latent_space (int): The size of the latent space from which random inputs are drawn.

    Raises:
        Exception: If the layers configuration is not provided during the model instantiation,
                   or if the input to the forward pass is None.

    Methods:
        connected_layer(layers_config): Constructs a series of connected layers based on the provided configuration.
        forward(x): Passes the input through the model to generate an image.
    """

    def __init__(self, latent_space=100):
        """
        Initializes the Generator model with the given latent space size and constructs
        the model layers based on a predefined configuration.

        Args:
            latent_space (int): The size of the latent space (default: 100).
        """
        super(Generator, self).__init__()
        self.latent_space = latent_space

        self.layers_config = [
            (self.latent_space, 128, 0.2),
            (128, 256, 0.2),
            (256, 512, 0.2),
            (512, 28 * 28),
        ]
        self.model = self.connected_layer(self.layers_config)

    def connected_layer(self, layers_config=None):
        """
        Constructs a series of connected layers based on the provided configuration.

        Args:
            layers_config (list of tuple): Layer configurations where each tuple contains
                                           (in_features, out_features, negative_slope) for LeakyReLU
                                           activated layers, and (in_features, out_features) for the
                                           final output layer.

        Returns:
            nn.Sequential: A sequential model comprising the linear and activation layers.

        Raises:
            Exception: If the layers configuration is not provided.
        """
        layers = OrderedDict()

        if layers_config is not None:
            for index, (in_features, out_features, negative_slope) in enumerate(
                layers_config[:-1]
            ):
                layers["{}_layer".format(index)] = nn.Linear(
                    in_features=in_features, out_features=out_features
                )
                layers["{}_activation".format(index)] = nn.LeakyReLU(
                    negative_slope=negative_slope, inplace=True
                )

            (in_features, out_features) = layers_config[-1]
            layers["out_layer"] = nn.Linear(
                in_features=in_features, out_features=out_features
            )

            return nn.Sequential(layers)

        else:
            raise Exception("Layers config is not defined in the Generator")

    def forward(self, x):
        """
        Forward pass of the generator model.

        Args:
            x (Tensor): A batch of random noise vectors from the latent space.

        Returns:
            Tensor: A batch of 2D images generated from the input noise vectors.

        Raises:
            Exception: If the input x is None.
        """
        if x is not None:
            x = self.model(x)
        else:
            raise Exception("Input is not defined in the Generator")
        return x.reshape(-1, 1, 28, 28)
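
For orientation, the default stack that Generator builds can be reproduced inline. A minimal sketch assuming only PyTorch is installed; the hand-built Sequential below mirrors the default layers_config rather than importing generator.py, so the snippet stands alone:

```python
import torch
import torch.nn as nn

# Hand-built equivalent of the default layers_config:
# a LeakyReLU(0.2) after every linear layer except the last.
model = nn.Sequential(
    nn.Linear(100, 128), nn.LeakyReLU(0.2, inplace=True),
    nn.Linear(128, 256), nn.LeakyReLU(0.2, inplace=True),
    nn.Linear(256, 512), nn.LeakyReLU(0.2, inplace=True),
    nn.Linear(512, 28 * 28),
)

z = torch.randn(16, 100)                  # batch of 16 latent vectors
images = model(z).reshape(-1, 1, 28, 28)  # same reshape as forward()
print(images.shape)                       # torch.Size([16, 1, 28, 28])
```

The reshape at the end is what turns the flat 784-dimensional output into a single-channel 28 × 28 image batch.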

__init__(latent_space=100)

Initializes the Generator model with the given latent space size and constructs the model layers based on a predefined configuration.

Parameters:
  • latent_space (int, default: 100 ) –

    The size of the latent space (default: 100).

Source code in generator.py
def __init__(self, latent_space=100):
    """
    Initializes the Generator model with the given latent space size and constructs
    the model layers based on a predefined configuration.

    Args:
        latent_space (int): The size of the latent space (default: 100).
    """
    super(Generator, self).__init__()
    self.latent_space = latent_space

    self.layers_config = [
        (self.latent_space, 128, 0.2),
        (128, 256, 0.2),
        (256, 512, 0.2),
        (512, 28 * 28),
    ]
    self.model = self.connected_layer(self.layers_config)

connected_layer(layers_config=None)

Constructs a series of connected layers based on the provided configuration.

Parameters:
  • layers_config (list of tuple, default: None ) –

    Layer configurations where each tuple contains (in_features, out_features, negative_slope) for LeakyReLU activated layers, and (in_features, out_features) for the final output layer.

Returns:
  • Sequential –

    A sequential model comprising the linear and activation layers.

Raises:
  • Exception

    If the layers configuration is not provided.

Source code in generator.py
def connected_layer(self, layers_config=None):
    """
    Constructs a series of connected layers based on the provided configuration.

    Args:
        layers_config (list of tuple): Layer configurations where each tuple contains
                                       (in_features, out_features, negative_slope) for LeakyReLU
                                       activated layers, and (in_features, out_features) for the
                                       final output layer.

    Returns:
        nn.Sequential: A sequential model comprising the linear and activation layers.

    Raises:
        Exception: If the layers configuration is not provided.
    """
    layers = OrderedDict()

    if layers_config is not None:
        for index, (in_features, out_features, negative_slope) in enumerate(
            layers_config[:-1]
        ):
            layers["{}_layer".format(index)] = nn.Linear(
                in_features=in_features, out_features=out_features
            )
            layers["{}_activation".format(index)] = nn.LeakyReLU(
                negative_slope=negative_slope, inplace=True
            )

        (in_features, out_features) = layers_config[-1]
        layers["out_layer"] = nn.Linear(
            in_features=in_features, out_features=out_features
        )

        return nn.Sequential(layers)

    else:
        raise Exception("Layers config is not defined in the Generator")
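
The construction logic can be restated as a standalone function (an illustrative sketch, not part of generator.py): every config entry but the last contributes an indexed Linear plus LeakyReLU pair, and the last entry becomes the unactivated out_layer.

```python
import torch.nn as nn
from collections import OrderedDict

def build_layers(layers_config):
    # Mirrors connected_layer: activation after every entry but the last.
    layers = OrderedDict()
    for index, (in_features, out_features, negative_slope) in enumerate(
        layers_config[:-1]
    ):
        layers["{}_layer".format(index)] = nn.Linear(in_features, out_features)
        layers["{}_activation".format(index)] = nn.LeakyReLU(
            negative_slope, inplace=True
        )
    in_features, out_features = layers_config[-1]
    layers["out_layer"] = nn.Linear(in_features, out_features)
    return nn.Sequential(layers)

# Hypothetical smaller config: one hidden layer, then the output layer.
model = build_layers([(100, 64, 0.2), (64, 28 * 28)])
print(len(model))  # 3 modules: 0_layer, 0_activation, out_layer
```

Because the layers are registered through an OrderedDict, each child is also addressable by name, e.g. model.out_layer.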

forward(x)

Forward pass of the generator model.

Parameters:
  • x (Tensor) –

    A batch of random noise vectors from the latent space.

Returns:
  • Tensor

    A batch of 2D images generated from the input noise vectors.

Raises:
  • Exception

    If the input x is None.

Source code in generator.py
def forward(self, x):
    """
    Forward pass of the generator model.

    Args:
        x (Tensor): A batch of random noise vectors from the latent space.

    Returns:
        Tensor: A batch of 2D images generated from the input noise vectors.

    Raises:
        Exception: If the input x is None.
    """
    if x is not None:
        x = self.model(x)
    else:
        raise Exception("Input is not defined in the Generator")
    return x.reshape(-1, 1, 28, 28)

total_params(model=None)

Calculates the total number of parameters in a given PyTorch model.

The function iterates over all parameters in the model and sums their number of elements to get the total parameter count.

Parameters:
  • model (Module, default: None ) –

    The model for which the total number of parameters is to be calculated.

Returns:
  • total_params ( int ) –

    The total number of parameters in the model.

Raises:
  • Exception

    If the model is not defined properly (i.e., model is None).
Source code in generator.py
def total_params(model=None):
    """
    Calculates the total number of parameters in a given PyTorch model.

    The function iterates over all parameters in the model and sums their number of elements to get the total parameter count.

    ### Parameters:
    - `model` (torch.nn.Module, optional): The model for which the total number of parameters is to be calculated.

    ### Returns:
    - `total_params` (int): The total number of parameters in the model.

    ### Raises:
    - Exception: If the model is not defined properly (i.e., `model` is None).
    """
    total_params = 0
    if model is not None:
        for _, params in model.named_parameters():
            total_params += params.numel()
    else:
        raise Exception("Model is not defined properly")

    return total_params
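
For the default layers_config, the count total_params returns can be checked by hand: each nn.Linear contributes in_features × out_features weights plus out_features biases, and the activations contribute nothing. A sanity-check sketch assuming only PyTorch:

```python
import torch.nn as nn

# Linear layer shapes from the default layers_config (activations are
# parameter-free, so they can be omitted from the count).
sizes = [(100, 128), (128, 256), (256, 512), (512, 28 * 28)]
model = nn.Sequential(*(nn.Linear(i, o) for i, o in sizes))

by_hand = sum(i * o + o for i, o in sizes)            # weights + biases
counted = sum(p.numel() for p in model.parameters())  # what total_params sums
print(by_hand, counted)  # 579728 579728
```

Both routes agree, which is a quick way to confirm that no parameters are being missed or double-counted.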