Understand torch.nn.functional.pad() with replicate Mode – PyTorch Tutorial

May 22, 2023

There are several padding modes in torch.nn.functional.pad(), such as 'constant', 'reflect', 'replicate' and 'circular'. In this tutorial, we will show you how to use this function with mode = 'replicate'.

How to use torch.nn.functional.pad()?

For mode = 'constant', you can read this tutorial:

Understand torch.nn.functional.pad() with Examples – PyTorch Tutorial

What about mode='replicate'?

Here we will use an example to explain it:

import torch
import torch.nn.functional as F

# Create a dummy tensor for demonstration
tensor = torch.rand((2, 3, 4))
# Pad the last dimension: 2 values on the left, 1 value on the right
padding = (2, 1)

# Pad the tensor
padded_tensor = F.pad(tensor, padding, mode='constant', value=0)

print(padded_tensor)

In this code, with mode='constant', we will get:

tensor([[[0.0000, 0.0000, 0.0235, 0.8200, 0.8174, 0.8588, 0.0000],
         [0.0000, 0.0000, 0.9864, 0.8185, 0.3279, 0.4748, 0.0000],
         [0.0000, 0.0000, 0.6444, 0.1466, 0.1752, 0.9925, 0.0000]],

        [[0.0000, 0.0000, 0.6968, 0.9210, 0.3821, 0.9656, 0.0000],
         [0.0000, 0.0000, 0.6591, 0.9207, 0.4244, 0.6129, 0.0000],
         [0.0000, 0.0000, 0.0444, 0.9455, 0.5917, 0.7691, 0.0000]]])

This result is easy to understand if you have read the tutorial above.

If we set mode='replicate':

padded_tensor = F.pad(tensor, padding, mode='replicate')
print(padded_tensor)

We will get:

tensor([[[0.0235, 0.0235, 0.0235, 0.8200, 0.8174, 0.8588, 0.8588],
         [0.9864, 0.9864, 0.9864, 0.8185, 0.3279, 0.4748, 0.4748],
         [0.6444, 0.6444, 0.6444, 0.1466, 0.1752, 0.9925, 0.9925]],

        [[0.6968, 0.6968, 0.6968, 0.9210, 0.3821, 0.9656, 0.9656],
         [0.6591, 0.6591, 0.6591, 0.9207, 0.4244, 0.6129, 0.6129],
         [0.0444, 0.0444, 0.0444, 0.9455, 0.5917, 0.7691, 0.7691]]])


In this code, padding = (2, 1), so the padding is applied along the last dimension.

We can see that the first and last values of each row are copied to pad the tensor.

The first value is copied 2 times on the left and the last value is copied 1 time on the right.
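To make the replication easier to see than with random values, here is a minimal sketch using a tiny tensor with known values (the tensor and its shape are chosen for illustration; mode='replicate' requires a 3D, 4D or 5D input):

```python
import torch
import torch.nn.functional as F

# A tiny 3D tensor whose values make the replication easy to follow
x = torch.tensor([[[1., 2., 3., 4.]]])  # shape (1, 1, 4)

# Pad the last dimension: 2 values on the left, 1 value on the right
padded = F.pad(x, (2, 1), mode='replicate')
print(padded)
# tensor([[[1., 1., 1., 2., 3., 4., 4.]]])
```

The leftmost value 1 is repeated twice and the rightmost value 4 once, which is exactly the pattern in the random-tensor output above.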