When you are developing an AI application with TensorFlow, you may encounter this error: Tensor must be from the same graph as Tensor. In this tutorial, we will show you how to fix it.
Look at the example code below:
import tensorflow as tf
import numpy as np

g1 = tf.Graph()
g2 = tf.Graph()

# create a, b in graph g1
with g1.as_default():
    a = tf.Variable(np.array([1, 2], dtype=np.float32))
    b = tf.Variable(np.array([2, 2], dtype=np.float32))
    result1 = a + b

# create c, d in graph g2
with g2.as_default():
    c = tf.Variable(np.array([1, 2], dtype=np.float32))
    d = tf.Variable(np.array([2, 2], dtype=np.float32))
    result2 = a + b + c + d  # a and b belong to g1, not g2

# create sessions
with tf.Session(graph=g1) as sess:
    out = sess.run(result1)
    print(out)

with tf.Session(graph=g2) as sess:
    out = sess.run(result2)
    print(out)
In this code, we create two TensorFlow graphs, g1 and g2. Variables a and b are created in g1, and variables c and d are created in g2.
Notice result2 in g2: this operation uses variables from both g1 and g2.
Run this code and you will get an error like: ValueError: Tensor(...) must be from the same graph as Tensor(...).
Why does this error occur?
Variables that belong to different TensorFlow graphs cannot be shared. a and b live in g1, so they cannot be used in g2.
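If you are not sure which graph a tensor or variable belongs to, you can check its graph attribute. A quick check, reusing the variables from the snippet above:

# every tensor and variable records the graph it was created in
print(a.graph is g1)  # True: a was created inside g1
print(a.graph is g2)  # False: a does not belong to g2
print(c.graph is g2)  # True: c was created inside g2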
How can we fix this error?
Make sure that every operation in a graph only uses the variables created in that same graph. In other words, do not use a and b in g2; build result2 from c and d only. A corrected version is sketched below.
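Here is a minimal corrected sketch, assuming the TensorFlow 1.x API (use tf.compat.v1 in TensorFlow 2). Each result is now built only from tensors that live in its own graph, and variable initializer ops (not in the original snippet) are added so the sessions can actually evaluate the variables:

import numpy as np
import tensorflow as tf

g1 = tf.Graph()
g2 = tf.Graph()

# create a, b in graph g1 and combine only tensors that live in g1
with g1.as_default():
    a = tf.Variable(np.array([1, 2], dtype=np.float32))
    b = tf.Variable(np.array([2, 2], dtype=np.float32))
    result1 = a + b
    init1 = tf.global_variables_initializer()

# create c, d in graph g2 and combine only tensors that live in g2
with g2.as_default():
    c = tf.Variable(np.array([1, 2], dtype=np.float32))
    d = tf.Variable(np.array([2, 2], dtype=np.float32))
    result2 = c + d  # no a or b here, so both operands belong to g2
    init2 = tf.global_variables_initializer()

with tf.Session(graph=g1) as sess:
    sess.run(init1)
    print(sess.run(result1))  # [3. 4.]

with tf.Session(graph=g2) as sess:
    sess.run(init2)
    print(sess.run(result2))  # [3. 4.]

If you really need the values of a and b inside g2, evaluate them in a session bound to g1 first and feed the resulting NumPy arrays into g2 (for example through tf.constant or a placeholder), because tensors themselves can never cross graph boundaries.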