Tags: python, keras, lstm, recurrent-neural-network

Getting an error stating I need to specify steps_per_epoch


I'm trying to build a many-to-one RNN with LSTM units for Twitter sentiment classification. When I try to fit my model I get the value error below. My guess is that it's due to the way I'm tokenizing my input, but I'm not sure what it means by symbolic tensors:

"If your data is in the form of symbolic tensors, you should specify the `steps_per_epoch` argument (instead of the `batch_size` argument, because symbolic tensors are expected to produce batches of input data)."

What does this mean and what can I do to remedy this?

# Tokenize the input
#creates tokenizer
tokenizer = Tokenizer()
#builds the word index from the input text, i.e. more common words get lower indices and rarer words get higher ones
tokenizer.fit_on_texts(X_training) 
#converts the input to token indices
X_training_tokens = tokenizer.texts_to_sequences(X_training)
#length of the longest tweet, in words
maxLen = max([len(s.split()) for s in X_data])
#padding so all inputs are the same size
X_train_pad = pad_sequences(X_training_tokens, maxlen = maxLen)
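For reference, here is a minimal, self-contained sketch of what the tokenize-and-pad step above produces (the `texts` list is a made-up stand-in for `X_training`, assuming `tensorflow.keras` is available):

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

texts = ["great movie", "not a great movie at all"]  # hypothetical tweets

tok = Tokenizer()
tok.fit_on_texts(texts)               # builds word_index; frequent words get low indices
seqs = tok.texts_to_sequences(texts)  # each text becomes a list of integer ids

max_len = max(len(t.split()) for t in texts)          # 6
padded = pad_sequences(seqs, maxlen=max_len)          # zero-pads on the left by default

print(padded.shape)  # (2, 6)
```

The result is a plain NumPy array of shape `(num_texts, max_len)`, which is exactly the kind of concrete (non-symbolic) input `Model.fit` expects for `x`.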

#time to make the embedding matrix
#instantiate embedding matrix of zeroes
embedding_matrix = np.zeros((len(tokenizer.word_index)+1, dims))
#go through each word in the token list
for word, i in tokenizer.word_index.items():
    #get the corresponding embedding vector (if it exists)
    embedding_vector = embeddings.get(word)
    #check if its not none
    if embedding_vector is not None:
        #add that to the embedding matrix
        embedding_matrix[i] = embedding_vector
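As a sanity check on the loop above, here is a tiny self-contained version with stand-in values for `dims`, `embeddings`, and `tokenizer.word_index` (all hypothetical): words found in the pretrained lookup get their vector copied in, and out-of-vocabulary words keep an all-zero row.

```python
import numpy as np

dims = 4                               # hypothetical embedding size
embeddings = {"good": np.ones(dims)}   # stand-in for a pretrained embedding lookup
word_index = {"good": 1, "zzxqy": 2}   # stand-in for tokenizer.word_index

#row 0 is reserved for padding, hence the +1
embedding_matrix = np.zeros((len(word_index) + 1, dims))
for word, i in word_index.items():
    vec = embeddings.get(word)         # None for out-of-vocabulary words
    if vec is not None:
        embedding_matrix[i] = vec

print(embedding_matrix[1])  # pretrained vector for "good"
print(embedding_matrix[2])  # stays all zeros (OOV)
```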

#Make the model
Model = Sequential()
Model.add(
    Embedding(
        input_dim = len(tokenizer.word_index) + 1,
        output_dim = dims,
        weights = [embedding_matrix],
        input_length = maxLen,
        trainable = False
    )
)
Model.add(
    LSTM(
        units = maxLen,
        return_sequences = False
        #possibly add dropout
    )
)
Model.add(
    Dense(
        maxLen,
        activation = 'relu'
    )
)
Model.add(
    Dense(
        3,
        activation = 'softmax'
    )
)

Model.compile(
    optimizer = 'Adam',
    loss = 'categorical_crossentropy',
    metrics = ['accuracy']
)

costs = Model.fit(
    x = X_train_pad,
    y = Y_training,
    batch_size = 2048,
    epochs = 10
)

Solution

  • Turns out my Y was a symbolic tensor because I had one-hot encoded it with TensorFlow's one_hot function. I switched to Keras's to_categorical function, which returns a NumPy array, and the model trained without the error.
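To illustrate the fix: `keras.utils.to_categorical` takes integer class labels and returns a plain NumPy array, so `Model.fit` can batch it with `batch_size` rather than demanding `steps_per_epoch` (labels here are made up; assumes `tensorflow.keras` is available):

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

labels = np.array([0, 2, 1, 2])            # hypothetical integer sentiment classes
y = to_categorical(labels, num_classes=3)  # concrete NumPy array, not a symbolic tensor

print(type(y).__name__)  # ndarray
print(y.shape)           # (4, 3)
```

By contrast, `tf.one_hot(labels, 3)` returns a TensorFlow tensor, which (in the Keras version the question uses) triggers the symbolic-tensor code path and the `steps_per_epoch` error.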