An Introduction to PyTorch Lightning Gradient Clipping – PyTorch Lightning Tutorial

January 18, 2023

When building a PyTorch model, we may use torch.nn.utils.clip_grad_norm_() to clip gradients. Here is the tutorial:

Understand torch.nn.utils.clip_grad_norm_() with Examples: Clip Gradient – PyTorch Tutorial
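
For reference, here is a minimal sketch of how gradient clipping is typically done in plain PyTorch: call torch.nn.utils.clip_grad_norm_() (or torch.nn.utils.clip_grad_value_()) between loss.backward() and optimizer.step(). The toy model and random data below are placeholders, not part of the original tutorial.

import torch
import torch.nn as nn

# A toy model, optimizer, and loss, used only as placeholders for this sketch
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.MSELoss()

x = torch.randn(32, 10)
y = torch.randn(32, 1)

optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()

# Clip the global norm of all gradients to <= 0.5 before the optimizer step
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=0.5)

# Alternatively, clip each gradient element to the range [-0.5, 0.5]:
# torch.nn.utils.clip_grad_value_(model.parameters(), clip_value=0.5)

optimizer.step()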

However, we do not need to call torch.nn.utils.clip_grad_norm_() ourselves in PyTorch Lightning. In this tutorial, we will explain why.

Gradient Clipping in PyTorch Lightning

The PyTorch Lightning Trainer supports gradient clipping by value and by norm through two arguments: gradient_clip_val and gradient_clip_algorithm.

This means we do not need to call torch.nn.utils.clip_grad_norm_() ourselves.

For example:

from pytorch_lightning import Trainer

# DEFAULT (i.e. don't clip)
trainer = Trainer(gradient_clip_val=0)

# clip gradients' global norm to <= 0.5 (gradient_clip_algorithm='norm' is the default)
trainer = Trainer(gradient_clip_val=0.5)

# clip each gradient's maximum magnitude to <= 0.5
trainer = Trainer(gradient_clip_val=0.5, gradient_clip_algorithm="value")
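
If you need custom clipping behavior, Lightning also provides a configure_gradient_clipping() hook on LightningModule. Below is a minimal sketch assuming the Lightning 2.x signature (older releases also pass an optimizer_idx argument); it simply delegates to the built-in self.clip_gradients() helper, which applies the same clipping the Trainer arguments would.

import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    # Sketch of a custom override; signature shown is for Lightning 2.x
    # (older versions also receive an optimizer_idx argument).
    def configure_gradient_clipping(self, optimizer, gradient_clip_val=None, gradient_clip_algorithm=None):
        # Delegate to Lightning's built-in helper; you could add custom
        # logic here, e.g. skipping clipping for the first few epochs.
        self.clip_gradients(
            optimizer,
            gradient_clip_val=gradient_clip_val,
            gradient_clip_algorithm=gradient_clip_algorithm,
        )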