mxnet

MXNetError: Shape inconsistent, Provided=(1,2), inferred shape=(1,1)


I am trying to train an LSTM like this:

from __future__ import print_function
import mxnet as mx
import numpy as np
from mxnet import nd, autograd, sym
from mxnet import gluon

ctx = mx.cpu()

LIMIT = 20
# spanish_sentences and english_sentences are lists of strings, defined elsewhere
data = np.array([(s, 1) for s in spanish_sentences[:LIMIT]] + [(s, 0) for s in english_sentences[:LIMIT]])

layer = mx.gluon.rnn.LSTM(100, 3)  # 100 hidden units, 3 stacked LSTM layers
net = mx.gluon.nn.Dense(2)  # 2 output classes: Spanish vs. English
softmax_cross_entropy = gluon.loss.SoftmaxCrossEntropyLoss()

layer.initialize(ctx=ctx)
net.collect_params().initialize(ctx=ctx)

trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': .1})
for epoch in range(10):
    np.random.shuffle(data)
    losses = []
    for s, l in data:
        if len(s) == 0:
            continue

        # One character code per time step: shape (seq_len, batch_size=1, input_size=1)
        x = nd.array([ord(c) for c in s]).reshape(shape=(-1, 1, 1))
        # One-hot encode the label
        y = nd.array([np.eye(2)[int(l)]])
        with autograd.record():
            # Slice a single time step of the LSTM output: shape (1, 100)
            output = layer(x)[output.shape[0]-1, :, :]
            pred = net(output)
            loss = softmax_cross_entropy(pred, y)
        loss.backward()  # compute gradients before the optimizer step
        losses.append(loss.asscalar())
        trainer.step(1, ignore_stale_grad=True)
    print("Loss:", np.mean(losses), "+-", np.std(losses))

But I am getting this error:

---------------------------------------------------------------------------
MXNetError                                Traceback (most recent call last)
<ipython-input-31-12ab8d4ad733> in <module>()
     30             output = layer(x)[output.shape[0]-1, :, :]
     31             pred = net(output)
---> 32             loss = softmax_cross_entropy(pred, y)
     33         losses.append(loss.asscalar())
     34         trainer.step(1, ignore_stale_grad=True)

 ... Stack trace ...

MXNetError: Shape inconsistent, Provided=(1,2), inferred shape=(1,1)

What am I doing wrong? When I check the shapes of pred and y, both are (1, 2), so I don't understand why a (1, 1) shape is being inferred.
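The check was along these lines, inside the training loop just after pred is computed:

print(pred.shape)  # prints (1, 2)
print(y.shape)     # prints (1, 2)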


Solution

  • It was rather simple: SoftmaxCrossEntropyLoss()(pred, label) expects pred.shape = (BATCH_SIZE, N_LABELS) and label.shape = (BATCH_SIZE,). With the default sparse_label=True, each label is a class index rather than a one-hot vector, which is why a label shape of (1, 1) was inferred instead of the provided (1, 2).

    So y = nd.array([l]) fixed it; see the sketch below.
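
    For reference, a minimal runnable sketch showing both ways to pass labels; the arrays here are stand-ins with the same shapes as in the question, not the original data:

    import numpy as np
    from mxnet import nd, gluon

    pred = nd.random.uniform(shape=(1, 2))  # (BATCH_SIZE, N_LABELS), as produced by net(output)

    # Default: sparse_label=True, so each label is a class index, shape (BATCH_SIZE,)
    loss_sparse = gluon.loss.SoftmaxCrossEntropyLoss()
    y = nd.array([0])                       # the fix: one class index per sample
    print(loss_sparse(pred, y))

    # Alternative: keep one-hot labels by declaring them dense
    loss_dense = gluon.loss.SoftmaxCrossEntropyLoss(sparse_label=False)
    y_onehot = nd.array([np.eye(2)[0]])     # shape (1, 2)
    print(loss_dense(pred, y_onehot))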