Understand TensorFlow tf.sequence_mask(): Create a Mask Tensor to Shield Elements – TensorFlow Tutorial

May 30, 2020

To shield (mask out) some elements in a tensor, we can use a mask tensor. In this tutorial, we will introduce tf.sequence_mask(), which creates a mask tensor.

Syntax

tf.sequence_mask(
    lengths,
    maxlen=None,
    dtype=tf.bool,
    name=None
)

Create a mask tensor.

lengths: an integer tensor; all of its values must be <= maxlen.

maxlen: a scalar integer tensor, the size of the last dimension of the returned tensor. Defaults to the maximum value in lengths.

dtype: the output type of the resulting tensor. Defaults to tf.bool.

name: an optional name for the operation.
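The rule behind the op is simple: mask[i][j] is True exactly when j < lengths[i]. A minimal pure-Python sketch of that rule (not the real TensorFlow implementation, just an illustration of the semantics):

```python
def sequence_mask(lengths, maxlen=None):
    # maxlen defaults to the maximum value in lengths
    if maxlen is None:
        maxlen = max(lengths)
    # mask[i][j] is True when position j is inside sequence i
    return [[j < n for j in range(maxlen)] for n in lengths]

print(sequence_mask([2, 3, 1], maxlen=4))
# [[True, True, False, False], [True, True, True, False], [True, False, False, False]]
```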

How to use tf.sequence_mask()?

We will write some examples to illustrate how to use it. Here is a basic one:

import tensorflow as tf

# lengths = [2, 3, 1], maxlen = 4 -> a (3, 4) boolean mask
mask = tf.sequence_mask([2, 3, 1], maxlen=4)

# TensorFlow 1.x style; in TF 2.x use tf.compat.v1.Session()
with tf.Session() as sess:
    print(sess.run(mask))

Run this code and you will get this result:

[[ True  True False False]
 [ True  True  True False]
 [ True False False False]]

From the result, we can see that lengths has shape (3,) and the returned mask has shape (len(lengths), maxlen) = (3, 4).

Create a float32 mask tensor

We can set the dtype argument of tf.sequence_mask() to create a float32 mask tensor. Here is an example:

mask = tf.sequence_mask([2, 3, 1], maxlen = 4, dtype = tf.float32)

The mask is:

[[1. 1. 0. 0.]
 [1. 1. 1. 0.]
 [1. 0. 0. 0.]]

True will be converted to 1.0 and False to 0.0.
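A float mask like this is handy because it can be multiplied element-wise with data to zero out padded positions. A NumPy sketch of this idea (NumPy stands in for TensorFlow here, and `values` is a hypothetical batch of scores):

```python
import numpy as np

# The float32 mask from above
mask = np.array([[1., 1., 0., 0.],
                 [1., 1., 1., 0.],
                 [1., 0., 0., 0.]])

# Hypothetical batch of per-position scores, shape (3, 4)
values = np.arange(12, dtype=np.float32).reshape(3, 4)

# Element-wise multiplication keeps only the first lengths[i] entries of row i
masked = values * mask
print(masked)
```

Each row of `masked` keeps only its first lengths[i] values; everything in the padded positions becomes 0.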

How to use tf.sequence_mask() in NLP?

In NLP, sentence lengths are not fixed, so in order to calculate attention over padded batches we need to use tf.sequence_mask().
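The usual trick is to set the attention scores of padded positions to a large negative number before the softmax, so they receive near-zero weight. A NumPy sketch of this masked-softmax idea (`scores` is a hypothetical batch of attention logits; this is an illustration, not the code from the linked tutorial):

```python
import numpy as np

# Hypothetical attention logits for 2 sequences of max length 4
scores = np.array([[0.5, 1.0, 0.3, 0.2],
                   [0.1, 0.4, 0.9, 0.7]])
lengths = np.array([2, 3])

# Same rule as tf.sequence_mask: position j is valid when j < lengths[i]
mask = np.arange(scores.shape[1]) < lengths[:, None]

# Push masked positions to a large negative value so softmax ignores them
masked_scores = np.where(mask, scores, -1e9)

# Numerically stable softmax over the last axis
weights = np.exp(masked_scores - masked_scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)
print(weights)  # padded positions get near-zero attention weight
```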

Here is an example:

Calculate Attention(Softmax) of Variable Length Sequence in TensorFlow – TensorFlow Tutorial
