Understand the Leaky ReLU Activation Function: A Beginner's Guide – Deep Learning Tutorial

By | August 23, 2020

Leaky ReLU is an activation function used in deep learning; for example, it is often used in graph attention networks. In this tutorial, we will introduce it for deep learning beginners.

What is Leaky ReLU?

Mathematically, Leaky ReLU is defined as follows (Maas et al., 2013):

f(x) = x          if x > 0
f(x) = α · x      if x ≤ 0

where α is a small positive constant (Maas et al. used α = 0.01). Unlike plain ReLU, negative inputs are not clipped to zero; they are scaled by α, so they keep a small, non-zero gradient.
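To make the definition concrete, here is a minimal NumPy sketch of the formula above; the function name leaky_relu and the choice alpha = 0.01 are just for illustration.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Return x unchanged where x > 0, and alpha * x where x <= 0,
    # so negative inputs produce a small, non-zero output.
    return np.where(x > 0, x, alpha * x)

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(leaky_relu(x))  # approximately [-0.03, -0.005, 0., 0.5, 3.]
```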

Its graph looks like this:

[Figure: the graph of Leaky ReLU, a line with slope 1 for x > 0 and a much smaller slope α for x ≤ 0]
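If you want to reproduce a graph like this yourself, here is a small sketch assuming NumPy and Matplotlib are installed:

```python
import numpy as np
import matplotlib.pyplot as plt

alpha = 0.01
x = np.linspace(-10, 10, 200)
y = np.where(x > 0, x, alpha * x)  # Leaky ReLU, same formula as above

plt.plot(x, y)
plt.title("Leaky ReLU (alpha = 0.01)")
plt.xlabel("x")
plt.ylabel("f(x)")
plt.grid(True)
plt.show()
```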

How to compute Leaky ReLU?

In TensorFlow, we can use the tf.nn.leaky_relu() function to compute it. Its signature is tf.nn.leaky_relu(features, alpha=0.2, name=None), where alpha controls the slope for negative inputs and defaults to 0.2.
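For example, a minimal sketch (the sample tensor below is made-up data):

```python
import tensorflow as tf

x = tf.constant([-3.0, -0.5, 0.0, 0.5, 3.0])

# Default alpha is 0.2 in tf.nn.leaky_relu
y_default = tf.nn.leaky_relu(x)
# Pass alpha explicitly to match the 0.01 used in the formula above
y_small = tf.nn.leaky_relu(x, alpha=0.01)

print(y_default.numpy())  # approximately [-0.6, -0.1, 0., 0.5, 3.]
print(y_small.numpy())    # approximately [-0.03, -0.005, 0., 0.5, 3.]
```

If you are building a Keras model, the layer version tf.keras.layers.LeakyReLU serves the same purpose and can be added directly to a Sequential or functional model.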
