Understand torch.optim.lr_scheduler.StepLR() with Examples – PyTorch Tutorial

August 9, 2022

torch.optim.lr_scheduler.StepLR() allows us to decay the learning rate on a fixed schedule while training a model. In this tutorial, we will use some examples to show you how to use it.

torch.optim.lr_scheduler.StepLR()

It is defined as:

torch.optim.lr_scheduler.StepLR(optimizer, step_size, gamma=0.1, last_epoch=-1, verbose=False)

It decays the learning rate of each parameter group by gamma every step_size epochs.

Parameters

  • optimizer (Optimizer) – Wrapped optimizer.
  • step_size (int) – Period of learning rate decay: the learning rate is multiplied by gamma every step_size epochs.
  • gamma (float) – Multiplicative factor of learning rate decay. Default: 0.1.
  • last_epoch (int) – The index of the last epoch. Default: -1.
  • verbose (bool) – If True, prints a message to stdout for each update. Default: False.
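
Put together, these parameters define a simple rule: the learning rate used at a given epoch is the initial learning rate multiplied by gamma raised to the number of completed step_size periods. Here is a minimal sketch of that rule (the helper steplr_lr is ours for illustration, not part of PyTorch):

def steplr_lr(initial_lr, gamma, step_size, epoch):
    # Number of completed decay periods so far: epoch // step_size
    return initial_lr * gamma ** (epoch // step_size)

print(steplr_lr(0.01, 0.1, 30, 0))   # 0.01 (epochs 0-29)
print(steplr_lr(0.01, 0.1, 30, 30))  # 0.001 (epochs 30-59)
print(steplr_lr(0.01, 0.1, 30, 60))  # about 0.0001 (epochs 60-89)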

How to use torch.optim.lr_scheduler.StepLR()?

We will use some examples to show you how.

Look at this example:

import torch
from matplotlib import pyplot as plt

lr_list = []
# A single parameter stands in for a model's parameters.
model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
LR = 0.01
optimizer = torch.optim.Adam(model, lr=LR)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.97)
for epoch in range(200):
    data_size = 400
    for i in range(data_size):
        optimizer.zero_grad()
        # compute the loss and call loss.backward() here
        optimizer.step()

    # Call scheduler.step() once per epoch, after optimizer.step().
    # Passing the epoch to scheduler.step() is deprecated.
    scheduler.step()
    lr_list.append(optimizer.state_dict()['param_groups'][0]['lr'])
plt.plot(range(len(lr_list)), lr_list, color='r')
plt.show()

In this example, step_size = 1, which means the learning rate is decayed every epoch.
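
To check the schedule without plotting, we can also read the learning rate back with scheduler.get_last_lr(). A quick sketch, reusing the same kind of setup as above:

import torch

param = torch.nn.Parameter(torch.randn(2, 2))
optimizer = torch.optim.Adam([param], lr=0.01)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.97)

for epoch in range(3):
    optimizer.step()  # in a real loop this follows loss.backward()
    scheduler.step()
    print(scheduler.get_last_lr())  # about [0.0097], [0.009409], [0.00912673]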

Running this code, we will see:

(Figure: the learning rate curve when step_size = 1, decaying every epoch)

If step_size = 10

scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.97)

It means we will decay the learning rate every 10 epochs.
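
With gamma = 0.97 and step_size = 10, the learning rate follows 0.01 * 0.97 ** (epoch // 10): it stays at 0.01 for epochs 0-9, drops to 0.0097 at epoch 10, and after 200 epochs it has decayed 20 times, down to 0.01 * 0.97 ** 20 ≈ 0.0054.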

Running this code, we will see:

(Figure: the learning rate curve when step_size = 10)

If step_size = 25

scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=25, gamma=0.97)

We will decay the learning rate every 25 epochs.
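
Similarly, with step_size = 25 the decay happens only at epochs 25, 50, 75, and so on: over 200 epochs the learning rate is multiplied by 0.97 just 8 times, ending near 0.01 * 0.97 ** 8 ≈ 0.0078, so the curve has fewer and wider steps.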

We will see:

(Figure: the learning rate curve when step_size = 25)
