Tags: r, mxnet

Retrain mxnet model in R


I have created a neural network with mxnet. Now I want to train this model iteratively on new data points: after simulating each new data point, I want to perform one more gradient descent update on the existing model, without saving the model to an external file and loading it back in.

I have written the following code, but the weights do not change after the second training step, and the training error is reported as NaN.

library(mxnet)
data <- mx.symbol.Variable("data")
fc1 <- mx.symbol.FullyConnected(data, num_hidden = 2, no.bias = TRUE)
lro <- mx.symbol.LinearRegressionOutput(fc1)

# first data observation
train.x = matrix(0, ncol = 3)
train.y = matrix(0, nrow = 2)

# first training step
model = mx.model.FeedForward.create(lro,
  X = train.x, y = train.y, initializer = mx.init.uniform(0.001),
  num.round = 1, array.batch.size = 1, array.layout = "rowmajor",
  learning.rate = 0.1, eval.metric = mx.metric.mae)
print(model$arg.params)

# second data observation
train.x = matrix(0, ncol = 3)
train.x[1] = 1
train.y = matrix(0, nrow = 2)
train.y[1] = -33

# retrain model on new data
# pass on params of old model
model = mx.model.FeedForward.create(symbol = model$symbol,
  arg.params = model$arg.params, aux.params = model$aux.params,
  X = train.x, y = train.y, num.round = 1,
  array.batch.size = 1, array.layout = "rowmajor",
  learning.rate = 0.1, eval.metric = mx.metric.mae)
# weights do not change
print(model$arg.params)

Solution

  • I found a solution: begin.round in the second training step must be greater than num.round of the first training step, so that the model continues to train instead of stopping immediately. The code below is identical to the question, except that the second call now passes begin.round = 2 and num.round = 3; a sketch of extending this to a loop over many new observations follows the example.

    library(mxnet)
    data <- mx.symbol.Variable("data")
    fc1 <- mx.symbol.FullyConnected(data, num_hidden = 2, no.bias = TRUE)
    lro <- mx.symbol.LinearRegressionOutput(fc1)
    
    # first data observation
    train.x = matrix(0, ncol = 3)
    train.y = matrix(0, nrow = 2)
    
    # first training step
    model = mx.model.FeedForward.create(lro,
      X = train.x, y = train.y, initializer = mx.init.uniform(0.001),
      num.round = 1, array.batch.size = 1, array.layout = "rowmajor",
      learning.rate = 0.1, eval.metric = mx.metric.mae)
    print(model$arg.params)
    
    # second data observation
    train.x = matrix(0, ncol = 3)
    train.x[1] = 1
    train.y = matrix(0, nrow = 2)
    train.y[1] = -33
    
    # retrain model on new data
    # pass on params of old model
    model = mx.model.FeedForward.create(symbol = model$symbol,
      arg.params = model$arg.params, aux.params = model$aux.params,
      X = train.x, y = train.y, begin.round = 2, num.round = 3,
      array.batch.size = 1, array.layout = "rowmajor",
      learning.rate = 0.1, eval.metric = mx.metric.mae)
    
    print(model$arg.params)
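
To handle many simulated data points, the same idea can be wrapped in a loop: keep a round counter, bump begin.round and num.round on every new observation, and pass the previous model's parameters back in. The sketch below is not from the original answer; it reuses the network and arguments from above, assumes the training loop runs from begin.round to num.round inclusive (so begin.round equal to num.round performs a single update), and uses a made-up simulate_point() helper to stand in for whatever generates the new data.

library(mxnet)

data <- mx.symbol.Variable("data")
fc1 <- mx.symbol.FullyConnected(data, num_hidden = 2, no.bias = TRUE)
lro <- mx.symbol.LinearRegressionOutput(fc1)

# hypothetical simulator: one new 1x3 input and a length-2 target per step
simulate_point <- function() {
  list(x = matrix(rnorm(3), ncol = 3),
       y = matrix(rnorm(2), nrow = 2))
}

# initial fit on the first observation
first <- simulate_point()
model <- mx.model.FeedForward.create(lro,
  X = first$x, y = first$y, initializer = mx.init.uniform(0.001),
  num.round = 1, array.batch.size = 1, array.layout = "rowmajor",
  learning.rate = 0.1, eval.metric = mx.metric.mae)

round <- 1
for (i in 1:10) {
  new_point <- simulate_point()
  round <- round + 1
  # continue counting rounds so training actually runs on the new point
  model <- mx.model.FeedForward.create(symbol = model$symbol,
    arg.params = model$arg.params, aux.params = model$aux.params,
    X = new_point$x, y = new_point$y,
    begin.round = round, num.round = round,
    array.batch.size = 1, array.layout = "rowmajor",
    learning.rate = 0.1, eval.metric = mx.metric.mae)
}

print(model$arg.params)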