I was reading this post https://machinelearningmastery.com/time-series-prediction-lstm-recurrent-neural-networks-python-keras/ and I want to picture the structure of the LSTM network. Analyzing this part of the code:
from keras.models import Sequential
from keras.layers import LSTM

model = Sequential()
model.add(LSTM(4, input_shape=(1, look_back)))
where look_back = 1, could the diagram of the model look like this?
(The pink box is the input, the green boxes are the hidden layer, and the blue box is the output.)
No. You have a single LSTM layer containing four LSTM neurons (units), not four stacked layers.
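One way to convince yourself of this without drawing anything is to compute the layer's trainable-parameter count by hand and compare it with `model.summary()`. This is a sketch I'm adding (not from the original post): a standard LSTM cell has four gates, and each gate has an input kernel, a recurrent kernel, and a bias, so a single layer of 4 units over 1 input feature should report 96 parameters.

```python
# Hand-compute the parameter count of LSTM(4, input_shape=(1, 1)),
# i.e. units=4 and one input feature per timestep (look_back = 1).
units = 4      # LSTM(4): four neurons in the one LSTM layer
input_dim = 1  # look_back = 1 feature per timestep
gates = 4      # input, forget, cell, and output gates

params = gates * (units * input_dim   # input kernels  W
                  + units * units     # recurrent kernels U
                  + units)            # biases b
print(params)  # -> 96, which model.summary() reports for this layer
```

If the diagram were four stacked layers instead of one layer of four units, the parameter count would be much larger, so the summary is a quick sanity check on the architecture.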
BTW: if you're looking for a fast way to visualize an ANN, try Netron.