python · keras · neural-network · keras-layer

Connecting input and output in Keras in this simple XOR problem


I'm trying to recreate this architecture in Keras for solving an XOR problem, where there are weights connecting the input (a two-dimensional array) directly to the output (a scalar). I know that the XOR problem can be solved with a fully connected 2-2-1 architecture, but I don't know how to implement this architecture in Keras.

I have read the docs and searched SO, but I can't seem to find a solution. The following code shows what I have so far; my main issue is how to connect the hidden layer to the output layer.

from tensorflow import keras

input1 = keras.layers.Input(shape=(2,))                         # two-dimensional input
hidden_layer = keras.layers.Dense(1, activation='tanh')(input1) # connects the input to the hidden layer
output1 = keras.layers.Dense(1, activation='tanh')(input1)      # connects the input to the output layer
# The code for connecting the hidden and output layers should probably go here #
model = keras.models.Model(inputs=input1, outputs=output1)
model.compile(...)

Solution

  • Hi Evelyn, welcome to Stack Overflow.

    I think that it makes more sense to do it with two inputs.

    You can implement it as follows:

    import tensorflow as tf
    from tensorflow import keras


    # One scalar input per XOR operand
    inp1 = keras.layers.Input(shape=(1,))
    inp2 = keras.layers.Input(shape=(1,))

    # Hidden layer: sees both inputs
    x = keras.layers.Concatenate()([inp1, inp2])
    x = keras.layers.Dense(1, activation='tanh')(x)

    # Output layer: sees both inputs *and* the hidden activation,
    # i.e. the direct input-to-output weights you are after
    x = keras.layers.Concatenate()([inp1, inp2, x])
    output = keras.layers.Dense(1, activation='tanh')(x)

    model = keras.models.Model(inputs=[inp1, inp2], outputs=output)
    model.summary()

    # Sanity check: forward pass on a batch of eight (1, 0) input pairs
    model([tf.ones([8, 1]), tf.zeros([8, 1])])
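
    If you then want to actually train it on XOR, a minimal sketch could look like the one below. It reuses the model built above; the mean-squared-error loss, Adam optimizer, learning rate, and epoch count are just illustrative choices, not the only way to do it.

    import numpy as np

    # Full XOR truth table, one column per model input
    x1 = np.array([[0.], [0.], [1.], [1.]])
    x2 = np.array([[0.], [1.], [0.], [1.]])
    y  = np.array([[0.], [1.], [1.], [0.]])

    model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.1), loss='mse')
    model.fit([x1, x2], y, epochs=500, verbose=0)

    print(model.predict([x1, x2]))  # predictions should approach [0, 1, 1, 0]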