Tags: keras, deep-learning, pre-trained-model

Problem with feed_dict after loading a Keras pre-trained model (Input variable "not defined")


I have a pretrained model defined below:

import keras
from keras import regularizers
from keras.layers import Concatenate, Dense, Dropout, Lambda

row = keras.Input(shape=(2, 60), dtype='float32')
mu = keras.Input(shape=(2,), dtype='float32')
f = keras.Input(shape=(2,), dtype='float32')

rows = dict()
fc_rows_1 = dict()

# Dense layer shared by both rows of the (2, 60) input
shared_row = Dense(60, activation='relu', kernel_regularizer=regularizers.l2(l=0.001))

mu_fc = Dense(10, activation='relu', kernel_regularizer=regularizers.l2(l=0.001))(mu)
f_fc = Dense(10, activation='relu', kernel_regularizer=regularizers.l2(l=0.001))(f)

# Slice out each row of the (2, 60) input and pass it through the shared layer
rows['row_1'] = Lambda(lambda x: x[:, 0, :])(row)
rows['row_2'] = Lambda(lambda x: x[:, 1, :])(row)
fc_rows_1['row_1'] = shared_row(rows['row_1'])
fc_rows_1['row_2'] = shared_row(rows['row_2'])

concat_rows = Concatenate()([fc_rows_1['row_1'], fc_rows_1['row_2']])
fc_rows_2 = Dense(30, activation='relu', kernel_regularizer=regularizers.l2(l=0.001))(concat_rows)
concat_rows_2 = Dropout(0.2)(fc_rows_2)

# Merge the row branch with the two auxiliary inputs
concat = Concatenate()([concat_rows_2, mu_fc, f_fc])
output = Dense(1, activation='linear')(concat)

model = keras.Model(inputs=[row, mu, f], outputs=output)

What's important to note is that the model takes three inputs. I then trained this model and saved the whole thing to an h5 file. After loading it in a new file, I tried the following code:

# optimize f by gradient ascent on the model output
import numpy as np
import keras
import keras.backend as K

session = K.get_session()
model = keras.models.load_model('model.h5')
step = 0.01

row_0 = row_array[0]
mu_0 = mu_new[0]
f_0 = f_new[0]

f_adapt = f_0.copy()

row_0 = row_0.reshape(1,2,60)
mu_0 = mu_0.reshape(1,2,)
f_adapt = f_adapt.reshape(1,2,)

for i in range(50):
    grads = session.run(K.gradients(model.output, f), feed_dict={row: row_0, mu: mu_0, f: f_adapt})
    f_adapt += grads[0] * step
    print("iter:",i,f_adapt)
    if np.linalg.norm(grads) < 10**(-50):
        break

and I get the following error:

NameError                                 Traceback (most recent call last)
<ipython-input-18-89ec662f423c> in <module>()
     18 
     19 for i in range(50):
---> 20     grads = session.run(K.gradients(model.output, f), feed_dict={row: row_0, mu: mu_0, f: f_adapt})
     21     f_adapt += grads[0] * step
     22     print("iter:",i,f_adapt)

NameError: name 'f' is not defined

I was wondering why the Input variable 'f' is not found after I load the model. Thank you!


Solution

  • I actually figured it out. It seems the local variables defined when the model was first built are not stored when the model is saved, so after loading I have to access the input tensors through model.inputs instead. The new code is:

    for i in range(50):
        grads = session.run(K.gradients(model.output, model.inputs[2]),
                            feed_dict={model.inputs[0]: row_0,
                                       model.inputs[1]: mu_0,
                                       model.inputs[2]: f_adapt})
        f_adapt += grads[0] * step
        print("iter:", i, f_adapt)
        if np.linalg.norm(grads) < 10**(-50):
            break
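As a side note, calling K.gradients inside the loop adds new ops to the graph on every iteration; building the gradient op once and wrapping it in K.function keeps the graph fixed. Below is a minimal, self-contained sketch of that pattern, assuming a TF1-style graph-mode backend as in the question. The tiny three-input model, the names a, b, c, and the data arrays are all stand-ins (in practice model would come from keras.models.load_model('model.h5')):

```python
import numpy as np
import tensorflow as tf

# Assumption: run the Keras backend in graph mode, as in the question's TF1 setup.
if hasattr(tf.compat.v1, 'disable_eager_execution'):
    tf.compat.v1.disable_eager_execution()

import keras
import keras.backend as K
from keras.layers import Concatenate, Dense

# Tiny stand-in for the loaded model: three inputs, one scalar output.
a = keras.Input(shape=(2,))
b = keras.Input(shape=(2,))
c = keras.Input(shape=(2,))
out = Dense(1, activation='linear')(Concatenate()([a, b, c]))
model = keras.Model(inputs=[a, b, c], outputs=out)

# Build the gradient op once, outside the loop, so the graph does not grow.
grad_tensor = K.gradients(model.output, model.inputs[2])
grad_fn = K.function(model.inputs, grad_tensor)

step = 0.01
a_0 = np.zeros((1, 2), dtype='float32')
b_0 = np.zeros((1, 2), dtype='float32')
c_adapt = np.ones((1, 2), dtype='float32')

for i in range(50):
    grads = grad_fn([a_0, b_0, c_adapt])   # list with one (1, 2) array
    c_adapt += grads[0] * step
    if np.linalg.norm(grads) < 1e-50:
        break
```

grad_fn takes the input arrays in the same order as model.inputs and returns the list produced by K.gradients, so no session or feed_dict handling is needed in the loop.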