In general, when we multiply a vector v of dimension 1×n with a tensor T of dimension m×n×k, we expect to get a matrix/tensor of dimension m×k (or m×1×k). In other words, the tensor consists of m slices, each a matrix of dimension n×k; v is multiplied by each of these matrices, and the resulting vectors are stacked together. To do this multiplication in TensorFlow, I came up with the formulation below. I am just wondering whether there is a built-in function that does this standard multiplication directly?
import tensorflow as tf

m, n, k = 2, 3, 4  # example sizes
T = tf.Variable(tf.random_normal((m, n, k)), name="tensor")
v = tf.Variable(tf.random_normal((1, n)), name="vector")
c = tf.stack([v, v])      # repeat v m times; here m = 2
output = tf.matmul(c, T)  # batched matmul: (m, 1, n) x (m, n, k) -> (m, 1, k)
You can do it with:

tf.reduce_sum(tf.expand_dims(v, 2) * T, 1)

Here tf.expand_dims(v, 2) turns v of shape (1, n) into shape (1, n, 1); the element-wise product broadcasts it against T of shape (m, n, k), and reduce_sum over axis 1 contracts the n dimension, leaving a result of shape (m, k).
Code:

import numpy as np
import tensorflow as tf

m, n, k = 2, 3, 4
T = tf.Variable(tf.random_normal((m, n, k)), name="tensor")
v = tf.Variable(tf.random_normal((1, n)), name="vector")
c = tf.stack([v, v])  # repeat v m times; here m = 2
out1 = tf.matmul(c, T)                             # shape (m, 1, k)
out2 = tf.reduce_sum(tf.expand_dims(v, 2) * T, 1)  # shape (m, k)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    n_out1 = sess.run(out1)
    n_out2 = sess.run(out2)
    # n_out1 and n_out2 hold the same values; n_out1 just has an extra singleton dimension
    print(np.allclose(np.squeeze(n_out1, 1), n_out2))  # True
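
As for a single built-in op: the same contraction can also be written with tf.einsum. The sketch below is my own addition (the equation string and the final squeeze are not from the original code) and assumes the T and v variables defined above are still in scope.

# Contract over the shared n axis (index j); result has shape (m, 1, k), same as out1.
out3 = tf.einsum('ij,mjk->mik', v, T)
# Drop the singleton axis to get shape (m, k), matching out2.
out3 = tf.squeeze(out3, axis=1)

This expresses the whole operation in one call instead of materializing the stacked copy c of v.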