Tags: python, tensorflow, keras, generative-adversarial-network

Convert a tensor from 128x128x3 to 129x128x3, where the extra 1x128x3 values are concatenated onto the tensor later


This is the part of my GAN code where the model is initialized. Everything works; only the code relevant to the problem is shown here:

    z = Input(shape=(100+384,))
    img = self.generator(z)
    print("before: ", img)    # shape 128x128x3, dtype=tf.float32
    temp = tf.get_variable("temp", [1, 128, 3], dtype=tf.float32)
    img = tf.concat(img, temp)
    print("after: ", img)    # error: ValueError: Incompatible type conversion requested to type 'int32' for variable of type 'float32_ref'
    valid = self.discriminator(img)
    self.combined = Model(z, valid)

I have 128x128x3 images to generate. What I want is to feed 129x128x3 images to the discriminator, where a 1x128x3 text-embedding matrix is concatenated with the image during training. However, I have to specify up front the tensor shapes and input values that each model (GEN and DISC) will receive. GEN takes 100 noise values plus the 384-value embedding and generates a 128x128x3 image, which is then augmented with another embedding (1x128x3) and fed to DISC. So my question is: is this approach correct? And if it is, how can I specify the required shapes up front so I don't get incompatible-shape errors? At the start I have to add these lines:

    z = Input(shape=(100+384,))
    img = self.generator(z)    #128x128x3
    valid = self.discriminator(img)   #should be 129x128x3
    self.combined = Model(z, valid)

But img is 128x128x3 and is only changed to 129x128x3 later, during training, by concatenating the embedding matrix. So how can I change img from 128x128x3 to 129x128x3 in the code above, either by padding or by appending another tensor? (Simply reshaping is of course not possible.) Any help will be much appreciated. Thanks.


Solution

  • The first argument of tf.concat should be the list of tensors to concatenate, while the second is the axis along which to concatenate them. You can concatenate the img and temp tensors as follows:

    import tensorflow as tf
    
    img = tf.ones(shape=(128, 128, 3))
    temp = tf.get_variable("temp", [1, 128, 3], dtype=tf.float32)
    img = tf.concat([img, temp], axis=0)
    
    with tf.Session() as sess:
        print(sess.run(tf.shape(img)))
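    The shape arithmetic here is the same as in NumPy, so you can sanity-check the concatenation without a TensorFlow session. A minimal sketch (using NumPy stand-ins for the tensors, not part of the original code):

    ```python
    import numpy as np

    # Stand-ins for the generator output and the 1x128x3 embedding row
    img = np.ones((128, 128, 3), dtype=np.float32)
    temp = np.zeros((1, 128, 3), dtype=np.float32)

    # Concatenating along axis 0 stacks temp below img: 128 + 1 = 129 rows
    stacked = np.concatenate([img, temp], axis=0)
    print(stacked.shape)  # (129, 128, 3)
    ```

    Concatenation only requires the tensors to match on every axis other than the concatenation axis, which is why the (1, 128, 3) embedding fits below the (128, 128, 3) image.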
    

    UPDATE: Here is a minimal example showing why you get the error "AttributeError: 'Tensor' object has no attribute '_keras_history'". The error pops up in the following snippet:

    from keras.layers import Input, Lambda, Dense
    from keras.models import Model
    import tensorflow as tf
    
    img = Input(shape=(128, 128, 3))  # Shape=(batch_size, 128, 128, 3)
    temp = Input(shape=(1, 128, 3))  # Shape=(batch_size, 1, 128, 3)
    concat = tf.concat([img, temp], axis=1)
    print(concat.get_shape())
    dense = Dense(1)(concat)
    model = Model(inputs=[img, temp], outputs=dense)
    

    This happens because the tensor concat is not a Keras tensor, and therefore some of the attributes typical of Keras tensors (such as _keras_history) are missing. To overcome this problem, you need to wrap all TensorFlow operations in a Keras Lambda layer:

    from keras.layers import Input, Lambda, Dense
    from keras.models import Model
    import tensorflow as tf
    
    img = Input(shape=(128, 128, 3))  # Shape=(batch_size, 128, 128, 3)
    temp = Input(shape=(1, 128, 3))  # Shape=(batch_size, 1, 128, 3)
    concat = Lambda(lambda x: tf.concat([x[0], x[1]], axis=1))([img, temp])
    print(concat.get_shape())
    dense = Dense(1)(concat)
    model = Model(inputs=[img, temp], outputs=dense)
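    Note that the concatenation axis is 1 here rather than 0: Keras Input shapes exclude the batch dimension, so at runtime the tensors have shapes (batch_size, 128, 128, 3) and (batch_size, 1, 128, 3), and the image rows live on axis 1. The same bookkeeping in NumPy (a sketch with an arbitrary batch size, not from the original answer):

    ```python
    import numpy as np

    batch_size = 4  # arbitrary value, for illustration only
    imgs = np.ones((batch_size, 128, 128, 3), dtype=np.float32)
    temps = np.zeros((batch_size, 1, 128, 3), dtype=np.float32)

    # Axis 0 is the batch dimension, so the image rows are on axis 1:
    # 128 + 1 = 129 rows per sample
    combined = np.concatenate([imgs, temps], axis=1)
    print(combined.shape)  # (4, 129, 128, 3)
    ```

    With the Lambda layer in place, the discriminator can then be declared with an Input of shape (129, 128, 3), matching what the combined model produces per sample.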