I created a TensorFlow model and then converted it to a TensorFlow.js model using the code below:
import tensorflowjs as tfjs
from tensorflow.keras.models import load_model

# Load the trained Keras model and write it out as a TensorFlow.js Layers model.
classifier = load_model("model")
tfjs.converters.save_keras_model(classifier, "js")
It works fine, and now I would like to reduce the size of the model using quantization when converting it to a TensorFlow.js Layers model.
Yes, it is possible. The tensorflowjs_converter CLI offers several quantization options, for example --quantize_float16, --quantize_uint8, and --quantize_uint16.
Example: re-converting an existing tfjs_layers_model into a quantized tfjs_layers_model (to convert directly from Keras instead, use --input_format keras with an HDF5 file):
tensorflowjs_converter \
    --input_format tfjs_layers_model \
    --output_format tfjs_layers_model \
    --quantize_uint16 \
    original_model/model.json \
    quantized_model/
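If you would rather stay in Python, newer releases of the tensorflowjs package also accept a quantization argument on save_keras_model. The exact parameter has changed between versions, so the sketch below assumes a release that supports quantization_dtype_map; check help(tfjs.converters.save_keras_model) for your installed version.

import tensorflowjs as tfjs
from tensorflow.keras.models import load_model

classifier = load_model("model")

# Assumption: this tensorflowjs version supports quantization_dtype_map,
# which maps a dtype ("float16", "uint8" or "uint16") to the weight names
# it applies to ("*" meaning all weights). Older releases used a different
# argument, so verify against your installed version before relying on it.
tfjs.converters.save_keras_model(
    classifier,
    "js_quantized",
    quantization_dtype_map={"uint16": "*"},
)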
Or, if you need more help with the entire process, run the following in your terminal and follow the interactive prompts:
tensorflowjs_wizard
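Either way, you can sanity-check that quantization actually shrank the artifacts by comparing the on-disk size of the output directories (directory names taken from the commands above):

from pathlib import Path

def dir_size_mb(path):
    """Total size of all files under `path`, in megabytes."""
    return sum(f.stat().st_size for f in Path(path).rglob("*") if f.is_file()) / 1e6

print(f"original : {dir_size_mb('original_model'):.2f} MB")
print(f"quantized: {dir_size_mb('quantized_model'):.2f} MB")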