I have transactional data from 15 branches of a bank, and I am trying to fit a neural network to it. Here is some background on the data:
```python
df.interpolate(method='time', axis=0, inplace=True)
df.fillna(0, inplace=True)
```
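Note that `interpolate(method='time')` requires a `DatetimeIndex` and only fills interior gaps; leading NaNs survive it, which is what the follow-up `fillna(0)` catches. A minimal sketch with a hypothetical single-branch frame (the column name `branch_a` and the values are made up for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical daily series with gaps, mimicking one branch's totals
idx = pd.date_range("2023-01-02", periods=6, freq="D")
df = pd.DataFrame({"branch_a": [np.nan, 10.0, np.nan, 30.0, np.nan, 50.0]},
                  index=idx)

# Time-weighted interpolation fills the interior NaNs;
# the leading NaN is untouched, so fillna(0) handles it
df.interpolate(method='time', axis=0, inplace=True)
df.fillna(0, inplace=True)
print(df["branch_a"].tolist())  # [0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
```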
```python
from keras.models import Sequential
from keras.layers import Dense, LSTM, Flatten, BatchNormalization, Dropout
from keras.optimizers import Adam

model = Sequential()
model.add(BatchNormalization())
model.add(Dense(32, activation='relu'))
model.add(Dense(64, activation='relu'))
model.add(Dense(128, activation='relu'))
model.add(Flatten())
model.add(Dense(1, activation='relu'))
model.compile(loss='mean_squared_error', optimizer=Adam())
model.fit(X, y, epochs=epochs, batch_size=31, verbose=0)
```
Below is the code that I am using:

```python
def train_neural_network(tbats_predictions, actual_data, model, batch_size=31, epochs=100):
    # The TBATS forecasts are the input features; the actuals are the target
    X = tbats_predictions
    y = actual_data
    model.add(BatchNormalization())
    model.add(Dense(32, activation='relu'))
    model.add(Dense(64, activation='relu'))
    model.add(Dense(128, activation='relu'))
    model.add(Flatten())
    model.add(Dense(1, activation='relu'))
    model.compile(loss='mean_squared_error', optimizer=Adam())
    # Train the model in batches
    model.fit(X, y, epochs=epochs, batch_size=batch_size, verbose=0)
    # Return the trained model
    return model
```
```python
hybrid_forecast = {}
for col in df_dep.columns:
    act = np.array(train_data_deposit[col]).reshape(-1, 1)
    preds = np.array(predictions[col]).reshape(-1, 1)
    model = Sequential()
    model = train_neural_network(preds, act, model)
    forecast = model.predict(validation_data_deposit[col])
    hybrid_forecast[col] = forecast.squeeze()
```
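One thing worth noting: the training arrays are reshaped into 2-D column vectors before being passed to Keras, and the validation series handed to `model.predict` would presumably need the same shape. A quick illustration of what that reshape does (the values are made up):

```python
import numpy as np

series = np.array([12.0, 15.0, 9.0])  # a hypothetical 1-D deposit series
column = series.reshape(-1, 1)        # shape (3, 1): one feature per sample
print(column.shape)  # (3, 1)
```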
I am not sure what the reason for this is. I have inspected train_data_deposit and validation_data_deposit for missing values or NaNs, but none are present. Could it be caused by the zeros at weekends? If so, why does it work for some of the branches while failing for others?

[Model output for two branches](https://i.sstatic.net/0nIai.png)
You have to change the activation of your last layer from

```python
model.add(Dense(1, activation='relu'))
```

to

```python
model.add(Dense(1, activation='linear'))
```

ReLU clamps every negative pre-activation to exactly zero, and its gradient is zero on that side, so an output unit whose pre-activation starts (or drifts) negative gets stuck predicting 0 and receives no gradient to recover. This would explain why training succeeds for some branches but produces flat forecasts for others: it depends on where each model's output unit happens to land. For a regression target, a linear output layer is the standard choice.
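The dead-output effect is easy to see without Keras. The sketch below implements ReLU and its derivative directly in NumPy: wherever the pre-activation is at or below zero, both the output and the gradient are exactly zero, so backpropagation cannot move that unit.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x) elementwise
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, else 0
    return (x > 0).astype(float)

z = np.array([-2.0, -0.1, 0.5, 3.0])  # pre-activations of an output unit
print(relu(z))       # [0.  0.  0.5 3. ] -- negatives clamp to 0
print(relu_grad(z))  # [0. 0. 1. 1.]    -- no gradient flows where z <= 0
```

With a linear activation the output is `z` itself and the gradient is 1 everywhere, so the unit can always be corrected by training.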