python | tensorflow | keras | sequential | batch-size

Maximum batch size in Keras Sequential model


I'm training a Keras Sequential model without specifying a batch_size. When I inspected a list that records all batch sizes during training, I noticed that 1200 appeared to be the maximum batch size.

My input data has 1500 samples. When I set batch_size=1500, Keras still splits it into 1200 and an additional 300. When I don't specify anything, I get 37 batches of 32 plus one of size 16 (= 1200), and then 9 batches of 32 plus one of size 12 (= 300). The saved batch sizes:

[screenshot of the recorded batch sizes]
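
For comparison, if Keras were batching all 1500 samples in one pass with the default batch_size of 32, I would expect 46 full batches plus a final one of 28 rather than the 1200/300 pattern above. A quick check with tf.data (toy data standing in for my real input) confirms that expectation:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in for the real input: 1500 samples, 10 features each
x = np.random.rand(1500, 10).astype("float32")

# Batch the full dataset the way fit() would with its default batch_size=32
ds = tf.data.Dataset.from_tensor_slices(x).batch(32)
sizes = [int(batch.shape[0]) for batch in ds]

print(len(sizes))           # 47 batches in total
print(sizes[0], sizes[-1])  # 46 batches of 32, then a final batch of 28
```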

I looked into the Keras Sequential model documentation here, but haven't found an explanation for why this is happening.

I wondered whether my memory might be too small for 1500 samples, but during training only about 60% of it is used. And that wouldn't explain the second observation anyway.


Solution

  • That is because the data is being split 80/20: 80% of your 1500 samples (1200) are used for training and the remaining 20% (300) for validation. Keras does not do this by default; it happens when validation_split=0.2 is passed to fit(). The split is applied before batching, so the 1200 training samples and the 300 validation samples are each divided into batches of 32 separately, which produces exactly the 37 × 32 + 16 and 9 × 32 + 12 pattern you observed.
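
A minimal, self-contained reproduction, assuming the split indeed comes from validation_split=0.2 (the toy data and model are illustrative only, not the asker's code):

```python
import numpy as np
import tensorflow as tf

# 1500 samples, as in the question; random toy data for illustration
x = np.random.rand(1500, 8).astype("float32")
y = np.random.randint(0, 2, size=(1500, 1)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# validation_split=0.2 holds out the last 20% (300 samples) for validation,
# so only 1200 samples are batched for training: with the default
# batch_size of 32 that is 37 full batches plus one of 16 (38 steps),
# and the 300 validation samples give 9 full batches plus one of 12.
model.fit(x, y, epochs=1, validation_split=0.2)
```

The progress bar shows 38/38 training steps per epoch, matching the recorded batch sizes; passing batch_size=1500 with the same split yields one training batch of 1200 and one validation batch of 300, as in the question.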