Tags: python, keras, generator, autoencoder

Keras fit_generator using input and output image generators 'ndim' error


I decided to try my hand at training an auto-encoder for re-coloring greyscale images. This approach might be a tad naive, but I want to play with it, see how well (or badly) it works, and examine how I can improve it.

However, it throws the following error:

  File "colorise0.py", line 63, in <module>
    validation_data=(val_g_generator, val_c_generator)
  File "/usr/local/lib/python2.7/dist-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 2183, in fit_generator
    val_x, val_y, val_sample_weight)
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 1483, in _standardize_user_data
    exception_prefix='input')
  File "/usr/local/lib/python2.7/dist-packages/keras/engine/training.py", line 76, in _standardize_input_data
    data = [np.expand_dims(x, 1) if x is not None and x.ndim == 1 else x for x in data]
AttributeError: 'DirectoryIterator' object has no attribute 'ndim'

My code is:

from keras.layers import Input, Dense, Conv2D, MaxPooling2D, UpSampling2D
from keras.layers import Conv2DTranspose as DeConv2D
from keras.models import Model
from keras import backend as K

from keras.preprocessing.image import ImageDataGenerator

img_width, img_height = 150, 150
batch_size=32

train_data_dir = './train/'
validation_data_dir = './validation/'
input_shape = (img_width, img_height,3)

train_datagen = ImageDataGenerator(rescale = 1./255)
test_datagen =  ImageDataGenerator(rescale = 1./255)


train_c_generator=  train_datagen.flow_from_directory( 
    train_data_dir+'colored',
    target_size=(img_width, img_height),
    batch_size=batch_size
)

train_g_generator=  train_datagen.flow_from_directory(
    train_data_dir+'grey',
    target_size=(img_width, img_height),
    batch_size=batch_size
)

val_c_generator=  test_datagen.flow_from_directory(
    validation_data_dir+'colored',
    target_size=(img_width, img_height),
    batch_size=batch_size
)

val_g_generator=  test_datagen.flow_from_directory(
    validation_data_dir+'grey',
    target_size=(img_width, img_height),
    batch_size=batch_size
)

input_img=Input(shape=(img_width,img_height,3))
x=Conv2D(32,(3,3), activation='relu', padding='same')(input_img)
x=Conv2D(32,(3,3), activation='relu', padding='same')(x)
x=Conv2D(32,(3,3), activation='relu', padding='same')(x)
x=Conv2D(32,(3,3), activation='relu', padding='same')(x)

y=DeConv2D(32,(3,3), activation='relu',padding='same')(x)
y=DeConv2D(32,(3,3), activation='relu',padding='same')(y)
y=DeConv2D(32,(3,3), activation='relu',padding='same')(y)
decoded=DeConv2D(3,(3,3), padding='same')(y)

autoencoder = Model(input_img, decoded)
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')


autoencoder.fit_generator(
                train_g_generator, train_c_generator,
                epochs=50,
                validation_data=(val_g_generator, val_c_generator)
                )

Given the error message, I think the problem stems from passing two generators (one supplying the grey input images, the other supplying the original color images as targets).

How can I solve this?

Many thanks!


Solution

  • fit_generator takes a single generator, not two: the generator itself must yield tuples (X, y), so both the inputs and the targets come from one generator.

    For your specific use case I think you would need to make a custom generator.
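    A minimal sketch of such a custom generator: it draws one batch from each of your two directory iterators and yields them together as an (X, y) pair. For this to pair the right grey image with the right color image, both `flow_from_directory` calls would need `class_mode=None` (so they yield bare image batches instead of (image, label) tuples) and either `shuffle=False` or the same `seed`, so the batches stay aligned; the names below are illustrative, not from your code.

    ```python
    def pair_generator(input_gen, target_gen):
        """Combine two single-output generators into one that yields
        (X, y) batch tuples, which is the form fit_generator expects."""
        while True:
            # Draw one batch from each iterator in lockstep.
            yield next(input_gen), next(target_gen)

    # Hypothetical usage (generators created with class_mode=None, shuffle=False):
    # train_gen = pair_generator(train_g_generator, train_c_generator)
    # val_gen   = pair_generator(val_g_generator, val_c_generator)
    # autoencoder.fit_generator(train_gen,
    #                           steps_per_epoch=train_g_generator.samples // batch_size,
    #                           epochs=50,
    #                           validation_data=val_gen,
    #                           validation_steps=val_g_generator.samples // batch_size)
    ```

    Note that with a plain Python generator like this, Keras can no longer infer the number of batches per epoch, so `steps_per_epoch` and `validation_steps` must be given explicitly.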