TensorFlow Implements Cross Entropy Loss with Customized Function – TensorFlow Tutorial

December 4, 2019

TensorFlow provides several functions to compute cross entropy loss; however, these functions compute the sigmoid or softmax value of the logits internally.

For example:

tf.nn.sigmoid_cross_entropy_with_logits(labels=None, logits=None) computes the sigmoid value of logits inside the function.

tf.nn.softmax_cross_entropy_with_logits(labels=None, logits=None) computes the softmax value of logits inside the function.
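
For instance, here is a minimal sketch of the softmax variant (the values are chosen only for illustration); note that it expects raw, unnormalized scores:

import tensorflow as tf

# The op applies softmax to the raw scores internally,
# so we pass unnormalized logits directly.
raw_logits = tf.constant([[1.0, 2.0, 3.0]])
onehot_labels = tf.constant([[0.0, 0.0, 1.0]])
builtin_loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=onehot_labels, logits=raw_logits)

with tf.Session() as sess:
    print(sess.run(builtin_loss))  # [0.40760595] = -log(softmax([1,2,3])[2])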

How can we compute cross entropy loss without computing the softmax or sigmoid value of the logits? In this tutorial, we will show you how to do it.

Cross entropy loss

Cross entropy loss is defined as:

H(p, q) = -Σ_i p(x_i) · log(q(x_i))

where p is the true distribution (labels) and q is the predicted distribution (logits after softmax). For a batch of N examples, the loss is the mean of the per-example cross entropies.
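
To make the formula concrete, here is a hand computation in plain NumPy (the distributions p and q are made up for this sketch):

import numpy as np

# Example distributions, chosen only for illustration.
p = np.array([0.1, 0.7, 0.2])   # true labels
q = np.array([0.2, 0.5, 0.3])   # predictions
print(-np.sum(p * np.log(q)))   # ≈ 0.887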

We can create a function to compute this value with TensorFlow.

Create a customized function to calculate cross entropy loss

Here we create a function to compute the cross entropy loss between logits and labels.

def compute_cross_entropy(logits, labels):
    # Clip predictions to [1e-10, 1.0] to avoid log(0), then sum
    # over classes and average over the batch.
    cross_entropy = -tf.reduce_mean(
        tf.reduce_sum(labels * tf.log(tf.clip_by_value(logits, 1e-10, 1.0)), axis=1))
    return cross_entropy

We should keep the values of logits and labels in [1e-10, 1.0]; the tf.clip_by_value() call above enforces this for logits and avoids log(0).

Notice: each row of logits and labels must be a probability distribution, i.e. sum(logits) = 1 and sum(labels) = 1.
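
As a quick illustration of why the clip matters (plain NumPy, with a made-up zero probability):

import numpy as np

# log(0) is -inf, which would poison the loss; clipping keeps it finite.
print(np.log(0.0))                        # -inf (with a RuntimeWarning)
print(np.log(np.clip(0.0, 1e-10, 1.0)))  # ≈ -23.03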

How to use this function?

Here is an example that shows the usage of compute_cross_entropy().

Create two distributions, logits and labels

import tensorflow as tf
import numpy as np

# Two 2x3 score matrices, converted to probability
# distributions row by row with softmax.
logits = tf.Variable(np.array([[1, 2, 3], [4, 5, 6]]), dtype=tf.float32)
labels = tf.Variable(np.array([[-1, 2, 0], [3, 1, -4]]), dtype=tf.float32)

logits = tf.nn.softmax(logits, axis=1)
labels = tf.nn.softmax(labels, axis=1)

Calculate the cross entropy loss between logits and labels

loss = compute_cross_entropy(logits=logits, labels=labels)

init = tf.global_variables_initializer()
init_local = tf.local_variables_initializer()

with tf.Session() as sess:
    sess.run([init, init_local])
    print(sess.run([loss]))

The loss value is:

[1.8111573]
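
As a sanity check, the same number can be reproduced with plain NumPy (a re-computation of the formula above, not part of the original graph). Note that this example uses the TensorFlow 1.x session API; under TensorFlow 2 it would need to run through tf.compat.v1 with eager execution disabled.

import numpy as np

def np_softmax(x):
    # Row-wise softmax with the usual max-subtraction for stability.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

q = np_softmax(np.array([[1., 2., 3.], [4., 5., 6.]]))    # predictions
p = np_softmax(np.array([[-1., 2., 0.], [3., 1., -4.]]))  # labels
print(-np.mean(np.sum(p * np.log(np.clip(q, 1e-10, 1.0)), axis=1)))
# prints ≈ 1.8111573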

Why should we use cross entropy as the loss function?

You can read this tutorial:

Understand Why Use Cross Entropy as Loss Function in Classification Problem – Deep Learning Tutorial
