In my understanding, when using TensorFlow Keras, we can flow data from directories using ImageDataGenerator. When the data is flowed in, the images are resized to target_size. But how is this transformation done?
An example is this:
from tensorflow.keras.preprocessing.image import ImageDataGenerator
trainDir = 'train'
dataGen = ImageDataGenerator(rescale=1/255)
gen = dataGen.flow_from_directory(
trainDir,
batch_size=64,
target_size=(150, 150),
class_mode='binary'
)
Here, all the images from trainDir would be resized to 150x150. But how are they resized?
I've checked the TensorFlow documentation for ImageDataGenerator. It only says that target_size is "The dimensions to which all images found will be resized", but it doesn't explain how the resizing is actually done.
The generator uses the interpolation parameter of flow_from_directory whenever target_size differs from the size of the input image. The default interpolation method is nearest, but others ("bilinear", "bicubic", and, depending on the installed PIL version, "lanczos", "box", "hamming") are available as well.
Source: https://www.tensorflow.org/api_docs/python/tf/keras/preprocessing/image/ImageDataGenerator
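To see what these modes do, you can reproduce the resize step directly with Pillow, which Keras uses internally for loading and resizing images. This is just an illustrative sketch on a made-up 2x2 array, not the Keras code path itself:

```python
import numpy as np
from PIL import Image

# A tiny 2x2 grayscale "image": a diagonal checker pattern.
src = Image.fromarray(np.array([[0, 255],
                                [255, 0]], dtype=np.uint8))

# "nearest" (the Keras default) copies the closest source pixel,
# so upscaling simply repeats pixel blocks -- no new values appear.
nearest = np.array(src.resize((4, 4), Image.NEAREST))
print(nearest)

# "bilinear" averages neighbouring pixels, so intermediate grey
# values appear at the boundaries instead of hard edges.
bilinear = np.array(src.resize((4, 4), Image.BILINEAR))
print(bilinear)
```

With nearest, the output contains only the original pixel values (0 and 255); with bilinear, new intermediate greys are interpolated, which is why the choice matters if your images have hard edges or label-like pixel values.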