I've been looking into CNTK and decided to create a model for an XOR function to make sure I understood the basics. I created the files below, but since the model performs horribly I'm guessing I'm missing something fundamental.
command = Train:Output:DumpNodeInfo
modelPath = "Models\xor.dnn"
deviceId = -1
makeMode = false

featureDimension = 2
labelDimension = 1

Train = [
    action = "train"

    BrainScriptNetworkBuilder = {
        FDim = $featureDimension$
        LDim = $labelDimension$

        features = Input {FDim}
        labels = Input {LDim}

        W0 = ParameterTensor {(FDim:FDim)} ; b0 = ParameterTensor {FDim}
        W1 = ParameterTensor {(LDim:FDim)} ; b1 = ParameterTensor {LDim}

        o1 = W0*features + b0
        z = Sigmoid (W1*o1 + b1)

        ce = SquareError (labels, z)
        errs = ClassificationError (labels, z)

        # root nodes
        featureNodes = (features)
        labelNodes = (labels)
        criterionNodes = (ce)
        evaluationNodes = (errs)
        outputNodes = (z)
    }

    SGD = [
        epochSize = 0
        minibatchSize = 1
        learningRatesPerSample = 0.4
        maxEpochs = 50
    ]

    reader = [
        readerType = "CNTKTextFormatReader"
        file = "Train_xor.txt"
        input = [
            features = [
                dim = $featureDimension$
                alias = X
                format = "dense"
            ]
            labels = [
                dim = $labelDimension$
                alias = y
                format = "dense"
            ]
        ]
    ]
]

Output = [
    action = "write"

    reader = [
        readerType = "CNTKTextFormatReader"
        file = "Train_xor.txt"
        input = [
            features = [
                dim = $featureDimension$
                alias = X
                format = "dense"
            ]
            labels = [
                dim = $labelDimension$
                alias = y
                format = "dense"
            ]
        ]
    ]

    outputNodeNames = z
    outputPath = "Output\xor.txt"
]

DumpNodeInfo = [
    action = "dumpNode"
    printValues = true
]
The input file looks like this:
|y 0 |X 0 0
|y 1 |X 1 0
|y 1 |X 0 1
|y 0 |X 1 1
And I get this output:
0.490156
0.490092
0.489984
0.489920
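Those outputs hovering around 0.49 are, incidentally, what you'd expect from this network: with no nonlinearity between the layers, the whole thing collapses to a single affine map fed into a sigmoid, and the best squared-error affine fit to XOR is the constant 0.5. A quick least-squares check (a NumPy sketch, not part of the config above) shows this:

```python
import numpy as np

# The four XOR input patterns with a bias column appended,
# and their target labels.
A = np.array([[0.0, 0.0, 1.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 0.0])

# Best affine fit in the least-squares sense.
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w

print(w)     # weights on x1 and x2 come out 0, the bias 0.5
print(pred)  # every prediction is 0.5
```

So before any training issues, the architecture itself caps how well this model can do on XOR.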
If it helps, the node dump looks like the following:
b0=LearnableParameter [2,1] learningRateMultiplier=1.000000 NeedsGradient=true
-0.00745151564
0.0358283482
####################################################################
b1=LearnableParameter [1,1] learningRateMultiplier=1.000000 NeedsGradient=true
-0.0403601788
####################################################################
ce=SquareError ( labels , z )
errs=ClassificationError ( labels , z )
features=InputValue [ 2 ]
labels=InputValue [ 1 ]
o1=Plus ( o1.PlusArgs[0] , b0 )
o1.PlusArgs[0]=Times ( W0 , features )
W0=LearnableParameter [2,2] learningRateMultiplier=1.000000 NeedsGradient=true
-0.0214280766 0.0442263819
-0.0401388146 0.0261882655
####################################################################
W1=LearnableParameter [1,2] learningRateMultiplier=1.000000 NeedsGradient=true
-0.0281925034 0.0214234442
####################################################################
z=Sigmoid ( z._ )
z._=Plus ( z._.PlusArgs[0] , b1 )
z._.PlusArgs[0]=Times ( W1 , o1 )
You definitely need some nonlinearity in your hidden units, such as
o1 = Tanh (W0*features + b0)
In general, learning XOR with two hidden units via SGD is tricky: there are many random initializations that can lead to divergence. It becomes much easier to learn if you have 3 or more hidden units.
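To illustrate, here is a minimal NumPy sketch (hypothetical, not CNTK) of the same 2-hidden-1 architecture with a Tanh hidden layer of 3 units and per-sample SGD on the four XOR patterns; the learning rate matches the `learningRatesPerSample = 0.4` from the config:

```python
import numpy as np

rng = np.random.default_rng(0)

# The four XOR patterns and their labels.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 0.0])

H = 3  # hidden units; 2 sometimes works, 3+ is far more reliable
W0 = rng.standard_normal((2, H)); b0 = np.zeros(H)
W1 = rng.standard_normal((H, 1)); b1 = np.zeros(1)

lr = 0.4
for epoch in range(10000):
    for x, t in zip(X, y):
        # forward: tanh hidden layer, sigmoid output, squared error
        h = np.tanh(x @ W0 + b0)
        z = 1.0 / (1.0 + np.exp(-(h @ W1 + b1)))
        # backward: chain rule through sigmoid and tanh
        dz = 2.0 * (z - t) * z * (1.0 - z)   # d(loss)/d(pre-sigmoid)
        dh = (W1 @ dz) * (1.0 - h ** 2)      # d(loss)/d(pre-tanh)
        W1 -= lr * np.outer(h, dz); b1 -= lr * dz
        W0 -= lr * np.outer(x, dh); b0 -= lr * dh

h = np.tanh(X @ W0 + b0)
pred = 1.0 / (1.0 + np.exp(-(h @ W1 + b1)))  # shape (4, 1)
print(np.round(pred.ravel()))  # should recover 0, 1, 1, 0
```

Without the `np.tanh` on the hidden layer, the same loop gets stuck near 0.5 for every input, just like the outputs you posted.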