I trained an image segmentation model with tf.keras
in Python, saved it, and reloaded it with tensorflow.js (to use it in a web app).
Python (transferring the model):
import tensorflow as tf
import tensorflowjs as tfjs
model = tf.keras.models.load_model('model')
tfjs.converters.save_keras_model(model, 'tfjs_model/')
In JavaScript, I load my model (a U-Net with a MobileNet backbone) and a MobileNet-based segmentation model from body-pix (to compare the two models):
<head>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-converter"></script>
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/body-pix"></script>
    ...
</head>
<body>
    ...
    <script type="module">
        const model_lib = await bodyPix.load();
        // NumBytesInGPU: 5349984 (every log)
        const model_own = await tf.loadLayersModel('mobilenet_test/model.json');
        // NumBytesInGPU: 36930448 (1st log), 53707664 (5th log)
    </script>
</body>
That works fine and everything loads without errors. However, when trying to predict from a video, the memory reported by tf.memory()
keeps increasing until the app crashes, while the body-pix model runs smoothly.
async function body_segment() {
    const frame = document.getElementById("camera");
    const canvas = document.getElementById("body_pix");
    const draw = canvas.getContext("2d");

    // const model = await bodyPix.load();
    // NumBytesInGPU: 5349984
    const model = await tf.loadLayersModel('mobile_net.json');
    // NumBytesInGPU: 36930448 (1st log), 53707664 (5th log)

    const runPrediction = function(input) {
        return tf.tidy(() => {
            const asFloat = tf.cast(input, 'float32');
            const asBatch = tf.expandDims(asFloat, 0);
            const results = model.predict(asBatch);
            // Normally do something additional here, but removed for debugging
            return results;
        });
    }

    const resized = function(input) {
        return tf.tidy(() => {
            let imageTensor = tf.browser.fromPixels(input);
            return tf.image.resizeBilinear(imageTensor, [512, 512]);
        });
    }

    let ctr = 0;
    while (ctr < 10) {
        console.log("memory", tf.memory());
        // !!!!! THIS FUNCTION CAUSES THE MEMORY LEAK, BUT WHY ?????
        const result = await runPrediction(resized(frame));
        // const result = await model.segmentPersonParts(frame);
        // do something with the prediction here ...
        result.dispose(); // dispose the prediction result to free its memory
        ctr += 1;
        await tf.nextFrame();
    }
}
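To narrow down where the leak sits, it helps to compare tf.memory().numTensors before and after each iteration rather than watching raw byte counts. The helper below is a hypothetical sketch (trackLeak is a name introduced here, not a tfjs API); the tensor counter is passed in as a parameter so the logic is easy to test in isolation — in the app you would pass () => tf.memory().numTensors:

```javascript
// Hypothetical helper: runs one iteration and reports how many tensors
// it left behind. A count that stays above zero on every frame confirms
// that something inside the iteration is never disposed.
async function trackLeak(step, getNumTensors) {
    const before = getNumTensors();
    await step();
    return getNumTensors() - before;
}

// In the app, something like:
// const leaked = await trackLeak(
//     async () => { const r = await runPrediction(resized(frame)); r.dispose(); },
//     () => tf.memory().numTensors);
// console.log("tensors leaked this frame:", leaked);
```

A per-frame leak of exactly one tensor would already point at a single undisposed intermediate.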
I tried to use exactly the same code as in the body-pix files. In addition, I wrapped everything in tf.tidy(), so everything should actually be garbage collected.
Does it have something to do with the Keras import? Or what else could be causing the memory leak?
Instead of:
const result = await runPrediction(resized(frame));
// do smt
result.dispose();
use
const res = resized(frame);
const result = await runPrediction(res);
res.dispose(); // dispose the intermediate resized tensor
// do smt
result.dispose();
Otherwise the intermediate tensor returned by resized() is never disposed: the tidy() inside runPrediction only cleans up tensors created within its own callback, not tensors that were passed in from outside.
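An alternative to disposing the intermediate by hand is to keep it out of reach entirely: do the resize and the prediction inside a single tidy(), so the resized tensor is an intermediate that tidy cleans up automatically. A minimal sketch, assuming the same tf global and model as in the question (predictFrame is a name introduced here):

```javascript
// One tidy() around both steps: `resized` never escapes the tidy, so it
// is disposed automatically; only the returned prediction survives and
// must still be disposed by the caller after use.
const predictFrame = (model, frame) => tf.tidy(() => {
    const pixels = tf.browser.fromPixels(frame);
    const resized = tf.image.resizeBilinear(pixels, [512, 512]);
    const batch = tf.expandDims(tf.cast(resized, 'float32'), 0);
    return model.predict(batch);
});

// Usage in the loop:
// const result = predictFrame(model, frame);
// ... use the prediction ...
// result.dispose();
```

This keeps the ownership rule simple: everything created inside the tidy is temporary, and the one tensor the caller receives is the one the caller disposes.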