I'm implementing a classifier chain that uses an LSTM model as the base estimator in a chain of binary classifiers for a multiclass problem. Because the output of each binary classifier is fed into the next classifier as an extra feature, I can't fix the input shape in the model's Input layer. My code is here:
from tensorflow.keras.layers import Input, Embedding, Bidirectional, LSTM, Dense
from tensorflow.keras.models import Model

def create_model():
    input_size = length_long_sentence  # 107 in my case
    embedding_size = 128
    lstm_size = 64
    output_size = len(unique_tag_set)
    # ---------------------------- Model --------------------------------
    current_input = Input(shape=(input_size,))
    emb_current = Embedding(vocab_size, embedding_size,
                            input_length=input_size)(current_input)
    out_current = Bidirectional(LSTM(units=lstm_size))(emb_current)
    output = Dense(units=1, activation='sigmoid')(out_current)
    model = Model(inputs=current_input, outputs=output)
    # ---------------------------- compile -------------------------------
    model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])
    return model
from tensorflow.keras.wrappers.scikit_learn import KerasClassifier
from sklearn.multioutput import ClassifierChain

model = KerasClassifier(build_fn=create_model, epochs=1, batch_size=256,
                        shuffle=True, verbose=1, validation_split=0.2)
chain = ClassifierChain(model, order='random', random_state=42)
history = chain.fit(X_train, y_train)
While training, I get a shape warning for each classifier in the chain, because the input width keeps changing: every time another binary classifier is trained, it is called on inputs of shape (None, 108), (None, 109), and so on, as sketched below.
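That growth matches how a classifier chain augments its features: each estimator is fit on the original X plus the previous estimators' predictions stacked on as extra columns. Here is a rough sketch of the idea (not scikit-learn's actual code; chain_estimators and Y_train are placeholder names):

import numpy as np

# Classifier i in the chain sees the original 107 features
# plus i earlier predictions, so the input width grows by one each time.
X_aug = X_train                          # shape (n_samples, 107)
for i, clf in enumerate(chain_estimators):
    clf.fit(X_aug, Y_train[:, i])        # input width here is 107 + i
    preds = clf.predict(X_aug).reshape(-1, 1)
    X_aug = np.hstack([X_aug, preds])    # append this classifier's output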
Is there any way to make this input size, currently (None, 107), variable in the Input layer of the Keras model?
Use None to denote a variable sequence length in the Input layer:

current_input = Input(shape=(None,))
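Applied to the question's code, a minimal sketch of the revised build function might look like this (assuming the same vocab_size and hyperparameters as above). Note that it also drops the input_length=input_size argument from Embedding, since that argument pins the expected sequence length to 107:

from tensorflow.keras.layers import Input, Embedding, Bidirectional, LSTM, Dense
from tensorflow.keras.models import Model

def create_model():
    embedding_size = 128
    lstm_size = 64
    # Leave the sequence length as None so the same model definition
    # accepts widths of 107, 108, 109, ... as the chain grows.
    current_input = Input(shape=(None,))
    # No input_length here: the Embedding no longer pins the width to 107.
    emb_current = Embedding(vocab_size, embedding_size)(current_input)
    out_current = Bidirectional(LSTM(units=lstm_size))(emb_current)
    output = Dense(units=1, activation='sigmoid')(out_current)
    model = Model(inputs=current_input, outputs=output)
    model.compile(optimizer='Adam', loss='binary_crossentropy', metrics=['accuracy'])
    return model

One caveat worth noting: the variable shape only removes the dimension check. The appended 0/1 predictions are still embedded as if they were token indices 0 and 1, colliding with real vocabulary entries, so you may want to handle those extra columns differently.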