I've loaded the dataset and preprocessed the image data using this code:
data = tf.keras.utils.image_dataset_from_directory('/content/drive/MyDrive/PengantarSainsData/Capstone2/dataset_revisi')
data_iterator = data.as_numpy_iterator()
batch = data_iterator.next()
def preprocess(x, y):
    x_normalized = x / 255
    y_one_hot = tf.keras.utils.to_categorical(y, num_classes=5)
    return x_normalized, y_one_hot
data = data.map(lambda x, y: tf.py_function(func=preprocess, inp=[x, y], Tout=[tf.float32, tf.float32]))
scaled_iterator = data.as_numpy_iterator()
batch = scaled_iterator.next()
Then I partitioned it into training, validation, and testing sets using this code:
train_size = int(len(data) * .7)
val_size = int(len(data) * .2)
test_size = int(len(data) * .1)
train = data.take(train_size)
val = data.skip(train_size).take(val_size)
test = data.skip(train_size + val_size).take(test_size)
I also created the model architecture this way:
model = tf.keras.models.Sequential()
model.add(Conv2D(16, (3, 3), 1, activation = 'relu', input_shape = (256, 256, 3)))
model.add(MaxPooling2D())
model.add(Conv2D(32, (3, 3), 1, activation = 'relu'))
model.add(MaxPooling2D())
model.add(Conv2D(16, (3, 3), 1, activation = 'relu'))
model.add(MaxPooling2D())
model.add(Flatten())
model.add(Dense(128, activation = 'relu')) # 128 units in the fully connected layer
model.add(Dense(5, activation = 'softmax')) # softmax gives a probability for each of the 5 classes
model.compile('adam', loss = 'categorical_crossentropy', metrics = ['accuracy'])
model.summary()
Then, when I start the training process:
logdir = 'logs'
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir = logdir)
hist = model.fit(train, epochs = 16, validation_data = val, callbacks = [tensorboard_callback])
it keeps throwing errors, as shown in the attached picture. What should I do? I've been struggling with this for two days.
I've also tried a different model, and used the code below to define its architecture more explicitly:
model.build(input_shape=(None, 256, 256, 3))
but it still didn't work.
I think you did some unnecessary steps with as_numpy_iterator. Your image_dataset_from_directory is a tf.data.Dataset, which you can manipulate directly without turning it into a NumPy iterator. You will simply need to use tf.one_hot instead of tf.keras.utils.to_categorical, since to_categorical is a NumPy utility while tf.one_hot is a native TensorFlow op that can run inside a Dataset.map pipeline.
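For instance, here is a minimal sketch (with made-up integer labels and the 5 classes from your dataset) showing tf.one_hot working directly inside Dataset.map:
import tensorflow as tf
labels = tf.data.Dataset.from_tensor_slices([0, 3, 1, 4])  # hypothetical integer labels
one_hot = labels.map(lambda y: tf.one_hot(tf.cast(y, tf.int32), depth=5))
for y_enc in one_hot.take(2):
    print(y_enc.numpy())  # e.g. [1. 0. 0. 0. 0.]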
Here's a full example that works, using a local version of MNIST (you'll have to change the path and the number of categories):
import tensorflow as tf
from tensorflow.keras.layers import *
data = tf.keras.utils.image_dataset_from_directory(r'path\to\mnist\test')
def preprocess(x, y):
    x_normalized = x / 255
    y_one_hot = tf.one_hot(tf.cast(y, tf.int32), depth=10)
    return x_normalized, y_one_hot
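# the map is applied lazily to each batch as it flows through the pipeline; no NumPy iterator is needed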
data = data.map(preprocess)
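# len(data) counts batches (default batch_size=32), so these splits are batch-wise, not image-wise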
train_size = int(len(data) * .7)
val_size = int(len(data) * .2)
test_size = int(len(data) * .1)
train = data.take(train_size)
val = data.skip(train_size).take(val_size)
test = data.skip(train_size + val_size).take(test_size)
model = tf.keras.models.Sequential()
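# input_shape matches image_dataset_from_directory's default 256x256 RGB resize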
model.add(Conv2D(16, (3, 3), 1, activation='relu', input_shape=(256, 256, 3)))
model.add(MaxPooling2D())
model.add(Conv2D(32, (3, 3), 1, activation='relu'))
model.add(MaxPooling2D())
model.add(Conv2D(16, (3, 3), 1, activation='relu'))
model.add(MaxPooling2D())
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dense(10, activation='softmax'))
model.compile('adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.summary()
logdir = 'logs'
tensorboard_callback = tf.keras.callbacks.TensorBoard(log_dir=logdir)
hist = model.fit(train, epochs=1, validation_data=val, callbacks=[tensorboard_callback])
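Once the fit finishes, you can sanity-check the held-out test split the usual way; a minimal sketch using Keras's evaluate:
loss, acc = model.evaluate(test)
print(f'test loss: {loss:.3f} - test accuracy: {acc:.3f}')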