Create an MLP with Dropout in PyTorch – PyTorch Tutorial

May 17, 2022

The MLP (multilayer perceptron) is a basic building block of neural networks, and it is often used together with dropout. In this tutorial, we will show you how to create an MLP block with dropout in PyTorch.

Here is an example:

import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, n_in, n_out, dropout=0.5):
        super().__init__()

        self.linear = nn.Linear(n_in, n_out)
        self.activation = nn.GELU()
        self.dropout = nn.Dropout(dropout)  # randomly zeroes elements with probability `dropout`

    def forward(self, x):
        x = self.linear(x)
        x = self.activation(x)
        x = self.dropout(x)
        return x

In this example, we use the GELU activation function and apply dropout after it.
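Since this MLP is a building block, we can stack several of them to form a deeper network, for example with nn.Sequential. Here is a minimal sketch (the layer sizes are made up for illustration):

net = nn.Sequential(
    MLP(128, 64, dropout=0.5),  # hidden block: linear -> GELU -> dropout
    nn.Linear(64, 10),          # output layer, usually without activation or dropout
)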

An Explain to GELU Activation Function – Deep Learning Tutorial

Understand Dropout – Place it Before or After Activation Function in Dense Layer?

In order to understand torch.nn.Dropout(), we can read:

Understand torch.nn.Dropout() with Examples – PyTorch Tutorial
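To get a quick feel for what torch.nn.Dropout() does, here is a minimal sketch: in training mode it zeroes each element with probability p and scales the surviving elements by 1/(1 - p), so the expected value of the output matches the input.

drop = nn.Dropout(p=0.5)
t = torch.ones(8)
print(drop(t))  # e.g. tensor([0., 2., 2., 0., 0., 2., 0., 2.]) - survivors are scaled by 1/(1-p) = 2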

Then, we can use this MLP as follows:

x = torch.randn(5, 5)  # a batch of 5 samples with 5 features each
mlp = MLP(5, 2)        # map 5 input features to 2 outputs
y = mlp(x)
print(x)
print(y)

Running this code, we will see:

tensor([[ 0.7958,  0.6588,  1.8197,  0.7679, -0.0483],
        [-0.0041, -0.1534, -0.0375, -2.1821, -1.2527],
        [-0.5517, -1.3701,  1.5059,  0.2818, -0.1294],
        [ 0.5329,  0.5917, -1.4886,  1.2256,  0.9414],
        [ 0.3836, -0.7467, -0.2634, -0.4923,  0.9796]])
tensor([[-0.0000, -0.3348],
        [-0.2718, -0.0000],
        [-0.0000, -0.1885],
        [ 0.5827,  0.0000],
        [-0.0000,  0.5262]], grad_fn=<MulBackward0>)
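Notice the zeros in y: a freshly created module is in training mode, so dropout is active and randomly zeroes elements of the output. At inference time we should switch the model to evaluation mode, which disables dropout. For example:

mlp.eval()  # evaluation mode: dropout becomes a no-op
with torch.no_grad():
    y_eval = mlp(x)  # no elements are zeroed now
print(y_eval)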
