I'm using Keras with TensorFlow 2. I have a trained model and I'm inspecting the weights of each layer, but the shape of some of the Conv1D kernels confuses me. I set the convolutional layers to have 64 filters with a length of 16, yet the shape of my weight variable ends up as (16, 64, 64). Can someone explain this? I suppose 16 is the length of each filter and the last 64 is my number of filters, but what is the middle one? In other words, why is the kernel 3-dimensional? I expected something like (16, 64). Besides, isn't it odd to put the filter length on the first axis (assuming the usual computer-science convention of listing the depth axis first, i.e. (z, x, y) instead of (x, y, z))? What I get is something like this:
name:conv1d/kernel:0 shape:(16,64,64) dtype:<dtype:'float32'> numpy=...
Thank you in advance.
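For reference, here is a minimal sketch that reproduces a kernel of this shape. It assumes the layer receives a 64-channel input (for example, the output of a previous Conv1D with 64 filters); the 128-timestep input length is an arbitrary choice:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(128, 64)),              # 128 timesteps, 64 channels (assumed)
        tf.keras.layers.Conv1D(filters=64, kernel_size=16),  # the layer in question
    ])

    for w in model.weights:
        print(w.name, w.shape)
    # conv1d/kernel:0 (16, 64, 64)
    # conv1d/bias:0   (64,)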
To answer my own question: the middle 64 corresponds to the depth of the data the layer receives, i.e. the number of input channels. For instance, if you want 5 filters of length 32 applied to data with 10 features (in other words, the input depth of the conv layer is 10), your kernel variable will have shape (32, 10, 5), that is, (kernel_size, input_channels, filters).
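A short sketch of that example (the 100-timestep input length is just a placeholder):

    import tensorflow as tf

    # 5 filters of length 32 over data with 10 features per timestep
    layer = tf.keras.layers.Conv1D(filters=5, kernel_size=32)
    layer.build(input_shape=(None, 100, 10))

    print(layer.kernel.shape)  # (32, 10, 5) -> (kernel_size, input_channels, filters)
    print(layer.bias.shape)    # (5,)        -> one bias per filter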