
CNTK complains about Feature Not Implemented


I have the following network defined in BrainScript.

BrainScriptNetworkBuilder = {
    inputDim = 4
    labelDim = 1
    embDim = 20
    hiddenDim = 40

    model = Sequential (
        EmbeddingLayer {embDim} :                            # embedding
        RecurrentLSTMLayer {hiddenDim, goBackwards=false} :  # LSTM
        DenseLayer {labelDim}                                # output layer
    )

    # features
    t = DynamicAxis{}
    features = SparseInput {inputDim, tag="feature", dynamicAxis=t}
    anomaly  = Input {labelDim, tag="label"}

    # model application
    z = model (features)

    zp = ReconcileDynamicAxis(z, anomaly)

    # loss and metric
    ce   = CrossEntropyWithSoftmax (anomaly, zp)
    errs = ClassificationError     (anomaly, zp)

    featureNodes    = (features)
    labelNodes      = (anomaly)
    criterionNodes  = (ce)
    evaluationNodes = (errs)
    outputNodes     = (z)
}

and my data looks like this:

2 |Features -0.08169 -0.07840 -0.09580 -0.08748 
2 |Features 0.00354 -0.00089 0.02832 0.00364 
2 |Features -0.18999 -0.12783 -0.02612 0.00474 
2 |Features 0.16097 0.11350 -0.01656 -0.05995 
2 |Features 0.09638 0.07632 -0.04359 0.02183 
2 |Features -0.12585 -0.08926 0.02879 -0.00414 
2 |Features -0.10224 -0.18541 -0.16963 -0.05655 
2 |Features 0.08327 0.15853 0.02869 -0.17020 
2 |Features -0.25388 -0.25438 -0.08348 0.13638 
2 |Features 0.20168 0.19566 -0.11165 -0.40739 |IsAnomaly 0

When I run the cntk command to train the model, I get the following exception.

EXCEPTION occurred: Inside File: Matrix.cpp Line: 1323 Function: Microsoft::MSR::CNTK::Matrix::SetValue -> Feature Not Implemented.

What am I missing?


Solution

  • Here are some suggestions:

    • Firstly, the inputs must match the type of the data as described by the reader. Since the Features stream in the data file is dense, the features variable should be declared with Input rather than SparseInput.

    • Secondly, the LSTM outputs a whole sequence, one output per sample of the input sequence. Since there is only one label per sequence, you need to ignore all outputs but the last one.

        model = Sequential (
            DenseLayer {embDim} :                                # embedding
            RecurrentLSTMLayer {hiddenDim, goBackwards=false} :  # LSTM
            BS.Sequences.Last :                                  # use only the last output of the LSTM sequence
            DenseLayer {labelDim, activation=Sigmoid}            # output layer
        )