
Implement Softmax Cross-entropy Loss with Masking in TensorFlow – TensorFlow Tutorial

We often need to process variable-length sequences in deep learning. In that situation, we need to use a mask in our model. In this tutorial, we will introduce how to calculate softmax cross-entropy loss with masking in TensorFlow.

Softmax cross-entropy loss

In TensorFlow, we can use tf.nn.softmax_cross_entropy_with_logits() to compute cross-entropy loss. For example:

loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
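Here is a minimal runnable sketch of this call; the logits and one-hot labels below are made-up values for illustration, assuming TensorFlow 2.x eager execution:

    import tensorflow as tf

    # Made-up logits and one-hot labels: batch size 2, 3 classes.
    logits = tf.constant([[2.0, 1.0, 0.1],
                          [0.5, 2.5, 0.3]])
    labels = tf.constant([[1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0]])

    # Returns one cross-entropy value per example, shape (2,).
    loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
    print(loss)

Notice that this returns a per-example loss; you usually reduce it with tf.reduce_mean() to get a scalar.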

However, how do we calculate softmax cross-entropy loss with masking?

We will implement a small TensorFlow function to do it.

Calculate softmax cross-entropy loss with masking

This function is:

    import tensorflow as tf

    def masked_softmax_cross_entropy(logits, labels, mask):
        """Softmax cross-entropy loss with masking."""
        # Per-position cross-entropy, one value for each position.
        loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels)
        mask = tf.cast(mask, dtype=tf.float32)
        # Normalize the mask so its mean is 1; this keeps the loss on the
        # same scale as an average over only the valid positions.
        mask /= tf.reduce_mean(mask)
        # Zero out the loss at masked (padded) positions.
        loss *= mask
        return tf.reduce_mean(loss)
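Here is a quick usage sketch of this function with made-up values: four positions, three classes, and only the first three positions valid.

    # Hypothetical example data: 4 positions, 3 classes; the last position is padding.
    logits = tf.random.normal([4, 3])
    labels = tf.one_hot([0, 2, 1, 0], depth=3)
    mask = tf.constant([1.0, 1.0, 1.0, 0.0])

    loss = masked_softmax_cross_entropy(logits, labels, mask)
    print(loss)  # a scalar; the padded position does not contribute to the loss

Because the mask is divided by its mean before multiplying the loss, the final tf.reduce_mean() effectively averages the loss over only the valid (unmasked) positions.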

To get a mask in TensorFlow, you can use the tf.sequence_mask() function; here is the tutorial:

Understand TensorFlow tf.sequence_mask(): Create a Mask Tensor to Shield Elements – TensorFlow Tutorial
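As a small sketch of how the pieces fit together, assuming two sequences of lengths 2 and 3 padded to a maximum length of 4:

    # Made-up sequence lengths; maxlen is the padded length.
    lengths = tf.constant([2, 3])
    mask = tf.sequence_mask(lengths, maxlen=4)
    # mask is a boolean tensor:
    # [[ True,  True, False, False],
    #  [ True,  True,  True, False]]
    # Cast it to float before passing it to masked_softmax_cross_entropy().
    mask = tf.cast(mask, tf.float32)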