TensorFlow allows us to select elements from a tensor by their ids. To learn how to do this, you can read this tutorial:
Understand tf.nn.embedding_lookup(): Pick Up Elements by Ids
However, does this function support gradient computation in TensorFlow? We will discuss this question in this tutorial.
Here is an example:
First, we create a tensor. Then, we pick up some elements from it by their ids.
import tensorflow as tf
import numpy as np

c = tf.Variable(np.array([[2, 1], [5, 5], [2, 2]]), dtype=tf.float32)
m = tf.nn.embedding_lookup(c, ids=[1, 0])
Here, we select two rows from c (row 1 and row 0) and save them to m.
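To see what the lookup returns, we can evaluate m directly. Here is a minimal sketch (the session code below is our own addition for illustration):

import tensorflow as tf
import numpy as np

c = tf.Variable(np.array([[2, 1], [5, 5], [2, 2]]), dtype=tf.float32)
m = tf.nn.embedding_lookup(c, ids=[1, 0])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(m))
    # [[5. 5.]   row 1 of c
    #  [2. 1.]]  row 0 of c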
Then, we apply some TensorFlow operations to m.
n = tf.Variable(np.array([[1, 2], [2, 2]]), dtype=tf.float32)
u = tf.matmul(m, n)
Finally, we will compute the gradient of u with respect to c.
r = tf.gradients(u, c)
init = tf.global_variables_initializer()
init_local = tf.local_variables_initializer()

with tf.Session() as sess:
    sess.run([init, init_local])
    print(sess.run([r]))
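Before running this code, we can work out what the gradient should be. tf.gradients(u, c) computes the gradient of the sum of all elements of u, so each looked-up row of c should receive the gradient [1 + 2, 2 + 2] = [3, 4], the row sums of n. Here is a quick NumPy check of this arithmetic (our own sketch, not part of the example above):

import numpy as np

n = np.array([[1, 2], [2, 2]], dtype=np.float32)
# For u = m @ n, the gradient of sum(u) with respect to m is ones(2, 2) @ n.T
grad_m = np.ones((2, 2), dtype=np.float32) @ n.T
print(grad_m)
# [[3. 4.]
#  [3. 4.]]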
Run this Python code, and we will get this result:
[[IndexedSlicesValue(values=array([[3., 4.], [3., 4.]], dtype=float32), indices=array([1, 0]), dense_shape=array([3, 2]))]]
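Notice that the gradient is an IndexedSlicesValue rather than a dense tensor: values contains the gradient rows, indices tells us which rows of c they belong to, and row 2, which was never looked up, has no entry at all. If we need a dense gradient, tf.convert_to_tensor() can scatter it into the full [3, 2] shape. Here is a minimal sketch (the densifying step is our own addition):

import tensorflow as tf
import numpy as np

c = tf.Variable(np.array([[2, 1], [5, 5], [2, 2]]), dtype=tf.float32)
m = tf.nn.embedding_lookup(c, ids=[1, 0])
n = tf.Variable(np.array([[1, 2], [2, 2]]), dtype=tf.float32)
u = tf.matmul(m, n)

r = tf.gradients(u, c)
# tf.convert_to_tensor() scatters the sparse IndexedSlices into a dense tensor.
dense_grad = tf.convert_to_tensor(r[0])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(dense_grad))
    # [[3. 4.]   gradient for row 0 (looked up)
    #  [3. 4.]   gradient for row 1 (looked up)
    #  [0. 0.]]  row 2 was not selected, so its gradient is zero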
This result means tf.nn.embedding_lookup() supports backpropagation and gradient computation. We can use it in our models safely.
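By the way, if you are using TensorFlow 2.x, we can run the same check with tf.GradientTape instead of tf.gradients. Here is a sketch under that assumption:

import tensorflow as tf
import numpy as np

c = tf.Variable(np.array([[2, 1], [5, 5], [2, 2]]), dtype=tf.float32)
n = tf.Variable(np.array([[1, 2], [2, 2]]), dtype=tf.float32)

with tf.GradientTape() as tape:
    m = tf.nn.embedding_lookup(c, ids=[1, 0])
    u = tf.matmul(m, n)

# The gradient of c is again sparse: only the looked-up rows get values.
grad = tape.gradient(u, c)
print(grad.indices.numpy())  # [1 0]
print(grad.values.numpy())
# [[3. 4.]
#  [3. 4.]]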