Tags: python, tensorflow, neural-network, sequential

Problem with the set_weights function in TensorFlow


I've built a Sequential model like this:

    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense

    model = Sequential()
    model.add(Dense(40, activation='relu', input_dim=12))
    model.add(Dense(60, activation='relu'))
    model.add(Dense(units=3, activation='softmax'))
    opt = tf.keras.optimizers.Adam(learning_rate=0.001)
    model.compile(loss="mse", optimizer=opt)
    model.summary()

I would like to create a second model and then change its weights according to a rule of my own, so I've written this code:

    model2=model
    w1=model.get_weights()
    w2=model2.get_weights()
    for i in range(len(w1)):
        j=np.random.random(1)
        w1[i]=w2[i]*j
    model.set_weights(w1)
    model2.set_weights(w2)

After the for loop, w1 is different from w2, but after I set the weights of both models and call get_weights() again, the two models' weights are still identical. Why does this happen?


Solution

  • Create a copy of your model with tf.keras.models.clone_model. The assignment model2 = model does not build a second model; it only makes model2 another reference to the same object, so both names share the same weight variables and any set_weights call affects "both" models. clone_model builds a fresh model with the same architecture (a quick check of the difference is sketched after the code below).

    import numpy as np
    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense

    model = Sequential()
    model.add(Dense(40, activation='relu', input_dim=12))
    model.add(Dense(60, activation='relu'))
    model.add(Dense(units=3, activation='softmax'))
    opt = tf.keras.optimizers.Adam(learning_rate=0.001)
    model.compile(loss="mse", optimizer=opt)
    model.summary()

    model2 = tf.keras.models.clone_model(model)  # new model with the same architecture
    model2.set_weights(model.get_weights())      # clone_model re-initializes the weights, so copy the values explicitly
    # note: clone_model returns an uncompiled model; compile model2 if you want to train it

    w1 = model.get_weights()
    w2 = model2.get_weights()

    for i in range(len(w1)):
        j = np.random.random(1)  # random scaling factor in [0, 1)
        w1[i] = w2[i] * j        # scale each weight array

    model.set_weights(w1)   # only model receives the scaled weights
    model2.set_weights(w2)  # model2 keeps the original weights
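
As a quick sanity check (a minimal sketch, assuming the snippet above has just been run; the variable `aliased` is introduced here only for illustration), you can confirm that a plain assignment aliases the model while clone_model produces an independent copy:

    # With the original approach, both names point at the same object,
    # so there is really only one set of weight variables:
    aliased = model
    print(aliased is model)  # True -> set_weights through either name changes "both"

    # After clone_model, the two models are distinct objects with their own variables:
    print(model2 is model)   # False

    # Their weights now genuinely differ, since only `model` received the scaled values:
    same = all(np.allclose(a, b) for a, b in zip(model.get_weights(), model2.get_weights()))
    print(same)  # False (with overwhelming probability, since the scaling factors are random)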