Suppose that I have two tensors:
x = tf.constant([["a", "b"], ["c", "d"]]),
y = tf.constant([["b", "c"], ["d", "c"]])
Then, I want to get the following tensors:
x_ = [[0, 1], [1, 1]]
y_ = [[1, 0], [1, 1]]
How is x_ constructed?
The (0,1) entry of x, "b", appears in the first row of y, so we set the (0,1) entry of x_ to 1. Likewise, both the (1,0) and (1,1) entries of x appear in the second row of y, so we set the (1,0) and (1,1) entries of x_ to 1. Every entry with no match in the corresponding row is set to 0.
How is y_ constructed?
The (0,0) entry of y, "b", appears in the first row of x, so we set the (0,0) entry of y_ to 1. Likewise, both the (1,0) and (1,1) entries of y appear in the second row of x, so we set the (1,0) and (1,1) entries of y_ to 1.
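In plain Python terms (just to illustrate the rule with ordinary lists, not the tensors above):

x = [["a", "b"], ["c", "d"]]
y = [["b", "c"], ["d", "c"]]

# x_[i][j] is 1 when x[i][j] appears anywhere in row i of y, and 0 otherwise
x_ = [[1 if x[i][j] in y[i] else 0 for j in range(len(x[i]))]
      for i in range(len(x))]
y_ = [[1 if y[i][j] in x[i] else 0 for j in range(len(y[i]))]
      for i in range(len(y))]

print(x_)  # [[0, 1], [1, 1]]
print(y_)  # [[1, 0], [1, 1]]

How can I do this with TensorFlow ops?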
Ugliest solution possible:
import tensorflow as tf

x = tf.constant([["a", "b"], ["c", "d"]])
y = tf.constant([["b", "c"], ["d", "c"]])

def func(x, y):
    def compare(v, w):
        # 1 where v equals an element of w, 0 elsewhere
        return tf.cast(tf.equal(v, w), tf.int32)
    y_rows = tf.range(tf.shape(y)[0])
    x_cols = tf.range(tf.shape(x)[1])
    # For every row t and column z, compare x[t, z] against the whole row y[t]
    res = tf.map_fn(lambda t: tf.map_fn(lambda z: compare(x[t, z], y[t]),
                                        x_cols, dtype=tf.int32),
                    y_rows, dtype=tf.int32)
    # x[t, z] is "in" y[t] if it matched at least one element of that row
    res = tf.reduce_max(res, axis=-1)
    return res

res1 = func(x, y)
res2 = func(y, x)

sess = tf.Session()
print(sess.run(res1))
print(sess.run(res2))
[[0 1]
[1 1]]
[[1 0]
[1 1]]
I don't know whether it has a gradient. Perhaps someone will propose something better.
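For what it's worth, here is a broadcasting-based sketch that avoids the nested map_fn. It assumes x and y have the same shape, as in the example; I believe tf.equal broadcasts over string tensors here, and it gives the same output on the example above:

def func_broadcast(x, y):
    # x[:, :, None] has shape (rows, cols, 1), y[:, None, :] has shape (rows, 1, cols),
    # so tf.equal broadcasts to (rows, cols, cols), comparing every x[i, j] with every y[i, k]
    eq = tf.equal(tf.expand_dims(x, -1), tf.expand_dims(y, 1))
    # x[i, j] is "in" row i of y if it equals at least one element of that row
    return tf.cast(tf.reduce_any(eq, axis=-1), tf.int32)

print(sess.run(func_broadcast(x, y)))  # [[0 1] [1 1]]
print(sess.run(func_broadcast(y, x)))  # [[1 0] [1 1]]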