Tags: tensorflow, machine-learning, neural-network, backpropagation

Unsure whether function breaks backpropagation


I have been tinkering with TensorFlow a lot over the past few days, but I am unsure whether a function I wrote would break backpropagation in a neural network. I thought I'd ask here before I try to integrate this function into a NN. The basic setup is that I want to add two matrices with

op = tf.add(tfObject, tfImageBackground)

where tfImageBackground is some constant image (i.e. an 800×800 RGBA image with R = G = B = A = 0), and tfObject is a matrix of the same dimensions. The latter comes from the function I am unsure about:

def getObject(vector):
    objectId = vector[0]
    x = vector[1]
    y = vector[2]
    xEnd = baseImageSize - (x + objectSize)
    yStart = baseImageSize - (y + objectSize)

    padding = tf.convert_to_tensor([[x, xEnd], [yStart, y], [0, 0]])

    RTensor = tfObjectMatrix[objectId, :, :, 0:1]
    GTensor = tfObjectMatrix[objectId, :, :, 1:2]
    BTensor = tfObjectMatrix[objectId, :, :, 2:3]
    ATensor = tfObjectMatrix[objectId, :, :, 3:4]

    paddedR = tf.pad(tensor=RTensor,
        paddings=padding,
        mode='CONSTANT',
        name='padAverageRed',
        constant_values=255)

    # ... the same padding is applied to every channel ...

    finalTensor = tf.concat([paddedR, paddedG, paddedB, paddedA], 2)
    return finalTensor

The tfObjectMatrix is a list of images which never change. I did check whether I was able to compute a gradient from op with tf.gradients, which worked. I am unsure whether that is sufficient for backpropagation to work, though.
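That gradient check can be reproduced with a small standalone sketch. This uses the eager `tf.GradientTape` API (available in newer TensorFlow versions) instead of `tf.gradients`, and the shapes are made up for illustration; the point is that gradients flow through the same `tf.pad` and `tf.concat` ops used in getObject:

```python
import tensorflow as tf

# Toy stand-in for one channel of tfObjectMatrix: a 2x2 single-channel image.
x = tf.Variable(tf.ones([2, 2, 1]))

with tf.GradientTape() as tape:
    # Same ops as in getObject: pad the channel, then concat channels.
    padded = tf.pad(x, [[1, 1], [1, 1], [0, 0]],
                    mode='CONSTANT', constant_values=255)
    out = tf.reduce_sum(tf.concat([padded, padded], 2))

# Non-None gradient: pad and concat are differentiable with respect to x.
grad = tape.gradient(out, x)
```

Since each element of x appears in both copies inside the concat, the gradient is 2 everywhere; a None result here would have meant the graph was disconnected.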

Thanks for your time and effort. Any input at all would be greatly appreciated.


Solution

  • TensorFlow backpropagates through everything by default. As your code is written, every tensor will receive gradients when a training operation from an optimizer runs. So to answer your question: backpropagation will work.

    The only thing to consider is that you say tfObjectMatrix is a list of images that will not change, so you might not want it to receive any gradients. Have a look at tf.stop_gradient(): you could use it as OM = tf.stop_gradient(tfObjectMatrix) and work with that OM inside your function.
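    A minimal sketch of that tf.stop_gradient pattern (eager `tf.GradientTape` style; the variable names are illustrative stand-ins, not the asker's actual tensors): the frozen matrix still takes part in the forward pass, but no gradient flows back into it.

```python
import tensorflow as tf

# Illustrative stand-ins for tfObjectMatrix and a trainable tensor.
object_matrix = tf.Variable(tf.ones([4, 4]))   # constant images: freeze these
offset = tf.Variable(tf.zeros([4, 4]))         # something we do want to train

with tf.GradientTape() as tape:
    OM = tf.stop_gradient(object_matrix)       # blocks backprop into object_matrix
    loss = tf.reduce_sum(tf.add(OM, offset))

grads = tape.gradient(loss, [object_matrix, offset])
# grads[0] is None (frozen); grads[1] is all ones (still trainable)
```

    With tf.gradients in graph mode the effect is the same: requesting the gradient with respect to the stopped tensor yields nothing, while everything downstream of OM trains normally.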