Understand torch.optim.lr_scheduler.ExponentialLR() with Examples – PyTorch Tutorial

November 24, 2022

torch.optim.lr_scheduler.ExponentialLR() is often used to decay the learning rate during training in PyTorch. In this tutorial, we will use some examples to show you how to use it correctly.

Syntax

torch.optim.lr_scheduler.ExponentialLR() is defined as:

torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma, last_epoch=-1, verbose=False)

It decays the learning rate of each parameter group by gamma every epoch. When last_epoch = -1, the initial learning rate is set to lr.
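In other words, after n epochs the learning rate becomes lr * gamma ** n. A minimal sketch in plain Python (no optimizer needed, assuming lr = 0.01 and gamma = 0.97) previews the schedule:

# closed form of the ExponentialLR schedule: lr_n = lr_0 * gamma ** n
lr0, gamma = 0.01, 0.97
print([lr0 * gamma ** n for n in range(4)])
# approximately [0.01, 0.0097, 0.009409, 0.00912673]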

It is different from torch.optim.lr_scheduler.StepLR(), which multiplies the learning rate by gamma only once every step_size epochs, whereas ExponentialLR() multiplies it by gamma after every single epoch.
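For instance, with step_size = 10 (an illustrative value), StepLR() keeps the learning rate flat for 10 epochs before each decay, while ExponentialLR() decays it every epoch. A minimal sketch of the two schedules in plain Python:

gamma, step_size = 0.97, 10
exp_lr  = [0.01 * gamma ** n for n in range(30)]                 # ExponentialLR: decay every epoch
step_lr = [0.01 * gamma ** (n // step_size) for n in range(30)]  # StepLR: decay every step_size epochs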

gamma: the multiplicative factor of learning rate decay, and the most important parameter. We can set it to 0.97, 0.99, 0.999875, etc.
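The closer gamma is to 1, the slower the decay. A quick sanity check (just a sketch) shows how much of an initial lr of 0.01 each of these values leaves after 200 epochs:

for gamma in (0.97, 0.99, 0.999875):
    print(gamma, 0.01 * gamma ** 200)
# roughly 2.26e-05, 1.34e-03 and 9.75e-03, respectively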

How to use torch.optim.lr_scheduler.ExponentialLR()?

Here we will use an example to show you how to use it. Note that the loss below is a simple dummy loss so that the example runs end to end; replace it with your real loss in practice.

For example:

import torch
from matplotlib import pyplot as plt

lr_list = []
model = [torch.nn.Parameter(torch.randn(2, 2, requires_grad=True))]
LR = 0.01

optimizer = torch.optim.Adam(model, lr=LR)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.97)

for epoch in range(200):
    data_size = 400
    # simulate iterating over the training data for this epoch
    for i in range(data_size):
        optimizer.zero_grad()
        loss = model[0].pow(2).sum()  # dummy loss so the example runs; use your real loss here
        loss.backward()
        optimizer.step()

    lr_list.append(optimizer.state_dict()['param_groups'][0]['lr'])
    scheduler.step()  # decay the learning rate by gamma once per epoch

print(lr_list)
plt.plot(range(len(lr_list)), lr_list, color='r')
plt.show()
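By the way, reading the learning rate out of optimizer.state_dict() works, but scheduler.get_last_lr(), available in recent PyTorch versions, is the more idiomatic way. It returns one learning rate per parameter group, so inside the loop above we could write:

lr_list.append(scheduler.get_last_lr()[0])  # same value as param_groups[0]['lr']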

Running this code, we will see:

[Plot: the learning rate curve, decaying exponentially from 0.01 toward 0 over 200 epochs]

The learning rates are:

[0.01, 0.0097, 0.009409, 0.00912673, ..., 2.3311762989647067e-05]

In this code, gamma = 0.97 and the initial lr = 0.01, so we can verify the values above.

When:

  • epoch = 0, lr = 0.01
  • epoch = 1, lr = 0.01 * 0.97 = 0.0097
  • epoch = 2, lr = 0.0097 * 0.97 = 0.009409
  • in general, at epoch = n, lr = 0.01 * 0.97 ** n
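We can also verify the last value in the printed list with the closed form (up to floating-point rounding):

print(0.01 * 0.97 ** 199)
# about 2.33e-05, which matches the final learning rate above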
