I wrote a script to classify a single input image using a model I trained with MXNet. To classify the incoming image, I feed it forward through the network.
In short, here is what I am doing:
symbol, arg_params, aux_params = mx.model.load_checkpoint('model-prefix', 42)
model = mx.mod.Module(symbol=symbol, context=mx.cpu())
model.bind(data_shapes=[('data', (1, 3, 224, 224))], for_training=False)
model.set_params(arg_params, aux_params)
# ... loading the image & resizing ...
# img is the image to classify, as a numpy array of shape (3, 224, 224)
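# (A hypothetical version of that step, assuming OpenCV and a file 'input.jpg';
#  purely illustrative - the resize size, channel order and any normalization
#  must match whatever was used during training.)
import cv2
import numpy as np
img = cv2.imread('input.jpg')                           # HWC, BGR, uint8
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)              # BGR -> RGB
img = cv2.resize(img, (224, 224))                       # resize to the network input size
img = np.transpose(img, (2, 0, 1)).astype(np.float32)   # HWC -> CHW, shape (3, 224, 224)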
Batch = namedtuple('Batch', ['data'])
model.forward(Batch(data=[mx.nd.array(img)]))
probabilities = model.get_outputs()[0].asnumpy()
print(str(probabilities))
This works fine, except that I am getting the following warning:
UserWarning: Data provided by label_shapes don't match names specified by label_names ([] vs. ['softmax_label'])
What should I change to avoid getting this warning? It is not clear to me what the label_shapes and label_names parameters are meant for, and what I am expected to fill them with.
Note: I found some threads about them, but none enabled me to solve the problem. Similarly, the MXNet documentation doesn't provide much detail on what those parameters are and how they are supposed to be filled.
Set label_names=None and allow_missing=True. That should get rid of the warning.
model = mx.mod.Module(symbol=symbol, context=mx.cpu(), label_names=None)
...
model.set_params(arg_params, aux_params, allow_missing=True)
If you are curious why the warning is printed in the first place: every module has an associated label. When this model was trained, softmax_label was used as the label name (most likely because the output layer was a softmax layer named 'softmax'). When the model was loaded from file, the module that was created had softmax_label as its label name:
>>> print(model.label_names)
['softmax_label']
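You can check where that label name comes from by inspecting the loaded symbol itself; assuming the network ends in a SoftmaxOutput layer named 'softmax', its automatically created label variable shows up among the symbol's arguments:
>>> 'softmax_label' in symbol.list_arguments()
True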
model.bind is then called without providing label_shapes:
model.bind(data_shapes=[('data', (1, 3, 224, 224))], for_training=False)
MXNet sees that the module has a label in it which was not provided during bind and complains about it - which is the warning message you see.
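For context on what label_names and label_shapes are meant for: they describe the label inputs a module expects during training, and label_shapes is normally taken straight from the training iterator. A small sketch (the arrays and the iterator here are illustrative, not from the question):
import mxnet as mx
import numpy as np

# Hypothetical training set: 8 images, each with one integer class label.
X = np.zeros((8, 3, 224, 224), dtype=np.float32)
y = np.zeros((8,), dtype=np.float32)
train_iter = mx.io.NDArrayIter(data=X, label=y, batch_size=1)

# NDArrayIter names its label 'softmax_label' by default, which is why it
# lines up with the module's default label_names.
print(train_iter.provide_data)    # describes 'data' with shape (1, 3, 224, 224)
print(train_iter.provide_label)   # describes 'softmax_label' with shape (1,)

# During training, bind would receive both descriptions:
# model.bind(data_shapes=train_iter.provide_data,
#            label_shapes=train_iter.provide_label,
#            for_training=True)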
I think that if bind is called with for_training=False, MXNet shouldn't complain about the missing label. I've created this issue: https://github.com/dmlc/mxnet/issues/6958
However, for this particular case, where we load a model from disk, we can load it with None as the label so that MXNet doesn't complain later when bind doesn't provide a label - which is what the suggested fix does.
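Putting it all together, a minimal end-to-end sketch of the inference path with the fix applied (the checkpoint prefix, epoch number and the zero-filled dummy image are illustrative):
from collections import namedtuple

import mxnet as mx
import numpy as np

# Load the trained network and its parameters from disk.
symbol, arg_params, aux_params = mx.model.load_checkpoint('model-prefix', 42)

# No label is needed for inference, so create the module with label_names=None.
model = mx.mod.Module(symbol=symbol, context=mx.cpu(), label_names=None)
model.bind(data_shapes=[('data', (1, 3, 224, 224))], for_training=False)
model.set_params(arg_params, aux_params, allow_missing=True)

# A preprocessed image would go here; a zero array stands in for illustration.
img = np.zeros((3, 224, 224), dtype=np.float32)

Batch = namedtuple('Batch', ['data'])
model.forward(Batch(data=[mx.nd.array(img[np.newaxis, :])]))  # add the batch dimension
probabilities = model.get_outputs()[0].asnumpy()
print(probabilities)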