When building a PyTorch model, we may use torch.nn.utils.clip_grad_norm_() to clip gradients. Here is the tutorial:
Understand torch.nn.utils.clip_grad_norm_() with Examples: Clip Gradient – PyTorch Tutorial
However, we do not need to call torch.nn.utils.clip_grad_norm_() to clip gradients in PyTorch Lightning. In this tutorial, we will explain why.
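For reference, here is a minimal sketch of manual gradient clipping in a plain PyTorch training step. The tiny model and random data are hypothetical placeholders, only there to produce some gradients:

import torch
import torch.nn as nn

# A tiny hypothetical model and one training step
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(4, 10)
y = torch.randn(4, 1)

optimizer.zero_grad()
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# Clip the global norm of all gradients to 0.5 before the optimizer step
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=0.5)
optimizer.step()

In PyTorch Lightning, this clipping call disappears from our code, because the Trainer performs it for us.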
Gradient Clipping in PyTorch Lightning
The PyTorch Lightning Trainer supports clipping gradients by value and by norm, through its gradient_clip_val and gradient_clip_algorithm arguments.
This means we do not need to call torch.nn.utils.clip_grad_norm_() ourselves; Lightning applies the clipping during training.
For example:
from pytorch_lightning import Trainer

# DEFAULT (i.e. don't clip)
trainer = Trainer(gradient_clip_val=0)

# clip gradients' global norm to <= 0.5, using gradient_clip_algorithm='norm' by default
trainer = Trainer(gradient_clip_val=0.5)

# clip gradients' maximum magnitude to <= 0.5
trainer = Trainer(gradient_clip_val=0.5, gradient_clip_algorithm="value")
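If you need finer control, for example changing the clip value after a certain epoch, recent Lightning versions also let you override the configure_gradient_clipping() hook in your LightningModule. A minimal sketch, assuming Lightning 2.x (the MyModel class is an illustrative placeholder):

import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    # ... training_step(), configure_optimizers(), etc. ...

    def configure_gradient_clipping(self, optimizer, gradient_clip_val=None, gradient_clip_algorithm=None):
        # Lightning calls this hook before each optimizer step.
        # clip_gradients() applies the Trainer's clipping settings, so
        # forwarding them unchanged preserves the default behavior.
        self.clip_gradients(
            optimizer,
            gradient_clip_val=gradient_clip_val,
            gradient_clip_algorithm=gradient_clip_algorithm,
        )

In short, setting gradient_clip_val on the Trainer is all that is needed; torch.nn.utils.clip_grad_norm_() is only required when writing the training loop by hand.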