Change Learning Rate by Step When Training a PyTorch Model Manually – PyTorch Tutorial

April 28, 2022

When training a PyTorch model, we may want to change the learning rate at certain training steps. In this tutorial, we will show you how to do it.

Create an optimizer

In order to change the learning rate, we should first create an optimizer. For example:

import torch

class CustomNN(torch.nn.Module):
    def __init__(self):
        super().__init__()

        # Two scalar parameters, just so the optimizer has something to manage.
        self.a = torch.nn.Parameter(torch.randn(()))
        self.b = torch.nn.Parameter(torch.randn(()))

    def forward(self, x):
        # The forward pass is not needed for this tutorial.
        pass

cn = CustomNN()
all_params = cn.parameters()

optimizer = torch.optim.Adam(all_params)

In this example, we have created an Adam optimizer for the model parameters. Because we did not pass lr, Adam will use its default learning rate of 0.001.
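By the way, we can also set the initial learning rate when creating the optimizer. Here is a minimal sketch that starts from 0.01 instead of the default:

# Pass lr explicitly to override Adam's default learning rate (0.001).
optimizer = torch.optim.Adam(cn.parameters(), lr=0.01)
print(optimizer.param_groups[0]["lr"])  # 0.01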

List all parameters in an optimizer

We can use optimizer.param_groups to show all parameter groups in an optimizer. Here is the example code:

print(optimizer.param_groups)

Running this code, we will see:

[{'params': [Parameter containing:
tensor(0.2792, requires_grad=True), Parameter containing:
tensor(0.8839, requires_grad=True)], 'lr': 0.001, 'betas': (0.9, 0.999), 'eps': 1e-08, 'weight_decay': 0, 'amsgrad': False}]

It means the default learning rate (lr) is 0.001.

See also: Understand PyTorch optimizer.param_groups with Examples – PyTorch Tutorial

Change learning rate by training step

Then, we can start to change the learning rate of an optimizer.

lr = optimizer.param_groups[0]["lr"]
print(lr)

# Way (1): traverse all parameter groups and set a new learning rate for each one.
for param_group in optimizer.param_groups:
    param_group['lr'] = 0.01

lr = optimizer.param_groups[0]["lr"]
print(lr)

# Way (2): set the learning rate of one parameter group directly.
optimizer.param_groups[0]["lr"] = 0.05
lr = optimizer.param_groups[0]["lr"]
print(lr)

Running this code, we will see:

0.001
0.01
0.05

In this code, we use two ways to change the value of the learning rate.

(1) We can traverse optimizer.param_groups and change the learning rate of each parameter group.

for param_group in optimizer.param_groups:
    param_group['lr'] = 0.01

Then we can find the current learning rate is updated to 0.01.

(2) We can also set a new value for optimizer.param_groups[0]["lr"]. For example:

optimizer.param_groups[0]["lr"] = 0.05

Then, we can find the current learning rate is set to 0.05.
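If we need to change the learning rate many times during training, it can be convenient to wrap way (1) in a small helper function. Here is a minimal sketch (the helper name set_lr is our own, not part of PyTorch):

def set_lr(optimizer, lr):
    # Update the learning rate of every parameter group in the optimizer.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

set_lr(optimizer, 0.02)
print(optimizer.param_groups[0]["lr"])  # 0.02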

How to change learning rate by step?

From the above, we can see it is easy to change the learning rate by training step. Here is an example code snippet, taken from inside a training class (so it uses self):

# After the warmup phase, decay the learning rate exponentially with the step number,
# but never let it fall below self.min_lr. Here np is numpy.
if self.step_num > self.warmup_steps:
    self.lr = self.max_lr * np.exp(-1.0 * self.k * (self.step_num - self.warmup_steps))
    self.lr = max(self.lr, self.min_lr)
    # Apply the new learning rate to every parameter group.
    for param_group in self.optimizer.param_groups:
        param_group['lr'] = self.lr
self.optimizer.step()
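The snippet above assumes it lives inside a class that tracks the step number and the decay hyperparameters. Here is a rough, self-contained sketch of what such a wrapper might look like (the class name StepLRWrapper and its default hyperparameter values are our own assumptions, not from the original code):

import numpy as np

class StepLRWrapper:
    # A hypothetical wrapper: keep max_lr during warmup, then decay exponentially down to min_lr.
    def __init__(self, optimizer, warmup_steps=1000, max_lr=0.001, min_lr=1e-5, k=1e-4):
        self.optimizer = optimizer
        self.warmup_steps = warmup_steps
        self.max_lr = max_lr
        self.min_lr = min_lr
        self.k = k
        self.step_num = 0
        self.lr = max_lr

    def step(self):
        self.step_num += 1
        if self.step_num > self.warmup_steps:
            # Exponential decay after warmup, clipped at min_lr.
            self.lr = self.max_lr * np.exp(-1.0 * self.k * (self.step_num - self.warmup_steps))
            self.lr = max(self.lr, self.min_lr)
            for param_group in self.optimizer.param_groups:
                param_group['lr'] = self.lr
        self.optimizer.step()

Then, in the training loop, we can call wrapper.step() instead of optimizer.step(), and the learning rate will be updated by step automatically.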
