Tags: tensorflow, machine-learning, keras, generative-adversarial-network

Generative Adversarial Network in Keras doesn't work as expected


I'm a beginner in machine learning with Keras, and I'm trying to understand Generative Adversarial Networks (GANs). For this purpose I'm trying to program a simple example. I generate data with the following function:

import numpy as np

def genReal(l):
    realX = []
    for i in range(l):
        x = []
        y = []
        # 50 noisy points on the triangle curve y = -|t - 0.5| + 0.5
        for t in np.arange(0.0, 1.0, 0.02):
            x.append(t + np.random.normal(0, 0.01))
            y.append(-abs(t - 0.5) + 0.5 + np.random.normal(0, 0.01))

        data = np.array(list(zip(x, y)))
        data = np.reshape(data, (100))
        data = data.clip(0, 1)  # clip returns a copy, so it must be reassigned
        realX.append(data)

    realX = np.array(realX)
    return realX
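As a quick sanity check on what this produces, the following self-contained snippet (a lightly cleaned copy of the function above) verifies the output shape and value range:

```python
import numpy as np

def gen_real(l):
    """Draw l noisy samples of y = -|t - 0.5| + 0.5 for t in [0, 1),
    flattened to 100 values (50 interleaved x/y pairs)."""
    real_x = []
    for _ in range(l):
        xs, ys = [], []
        for t in np.arange(0.0, 1.0, 0.02):
            xs.append(t + np.random.normal(0, 0.01))
            ys.append(-abs(t - 0.5) + 0.5 + np.random.normal(0, 0.01))
        data = np.array(list(zip(xs, ys))).reshape(100)
        data = data.clip(0, 1)  # clip returns a copy; reassigning keeps the bound
        real_x.append(data)
    return np.array(real_x)

batch = gen_real(4)
print(batch.shape)  # (4, 100)
```

Each row of the returned array is one training sample for the discriminator, matching its `input_shape=(100,)`.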

Data generated with this function looks similar to these examples:

[image: samples of the generated training data]

Now the aim is to train a neural network to generate similar data. For the GAN we need a generator network, which I modeled like this:

generator = Sequential()
generator.add(Dense(128, input_shape=(100,), activation='relu'))
generator.add(Dropout(rate=0.2))
generator.add(Dense(128, activation='relu'))
generator.add(Dropout(rate=0.2))
generator.add(Dense(100, activation='sigmoid'))
generator.compile(loss='mean_squared_error', optimizer='adam')

and a discriminator which looks like this:

discriminator = Sequential()
discriminator.add(Dense(128, input_shape=(100,), activation='relu'))
discriminator.add(Dropout(rate=0.2))
discriminator.add(Dense(128, activation='relu'))
discriminator.add(Dropout(rate=0.2))
discriminator.add(Dense(1, activation='sigmoid'))
discriminator.compile(loss='mean_squared_error', optimizer='adam')

the combined model:

ganInput = Input(shape=(100,))
x = generator(ganInput)
ganOutput = discriminator(x)

GAN = Model(inputs=ganInput, outputs=ganOutput)
GAN.compile(loss='binary_crossentropy', optimizer='adam')

I have a function that generates noise (a random array)

def noise(l):
    noise = np.array([np.random.uniform(0, 1, size=[l, ])])
    return noise
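Note that wrapping the result of `np.random.uniform` in another `np.array([...])` adds a batch dimension, so `noise(100)` has shape `(1, 100)`, i.e. a batch of one sample; that is why it can be fed straight to `generator.predict`. A quick check:

```python
import numpy as np

def noise(l):
    # np.random.uniform already returns a 1-D array; the extra np.array([...])
    # wraps it into a batch of one, giving shape (1, l)
    return np.array([np.random.uniform(0, 1, size=[l, ])])

z = noise(100)
print(z.shape)  # (1, 100)
```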

And then I train the model:

for i in range(1000000):
    fake = generator.predict(noise(100))
    print(i, "==>", discriminator.predict(fake))
    discriminator.train_on_batch(genReal(1), np.array([1]))
    discriminator.train_on_batch(fake, np.array([0]))

    discriminator.trainable = False
    GAN.train_on_batch(noise(100), np.array([1]))
    discriminator.trainable = True

As you can see, I've already tried training the model for 1 million iterations. But afterwards the generator outputs data that looks like this (despite different inputs):

[image: generator output after training]

Definitely not what I wanted. So my question is: are 1 million iterations not enough, or is there something wrong with the concept of my program?

Edit:

This is the function I use to plot my data:

import matplotlib.pyplot as plt

def plotData(data):
    # unflatten the 100 values back into 50 (x, y) pairs
    x = np.reshape(data, (50, 2))
    plt.scatter(x[:, 0], x[:, 1], c=col)  # `col` is a colour defined elsewhere

Solution

  • The problem with your implementation is that discriminator.trainable = False doesn't have any effect after compiling discriminator. Therefore, all the weights (both from the discriminator and the generator networks) are trainable when you execute GAN.train_on_batch.

    The solution to this problem is to set discriminator.trainable = False right after compiling discriminator and before compiling GAN:

    discriminator.compile(loss='mean_squared_error', optimizer='adam')    
    discriminator.trainable = False
    
    ganInput = Input(shape=(100,))
    x = generator(ganInput)
    ganOutput = discriminator(x)
    
    GAN = Model(inputs=ganInput, outputs=ganOutput)
    GAN.compile(loss='binary_crossentropy', optimizer='adam')
    

    NOTE: I have plotted your data and it looks more like this: [image: plot of the generated data]
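The effect of the freezing order can be checked directly: a minimal sketch with toy layer sizes (4-dimensional data instead of 100, assuming the TensorFlow-bundled Keras) that counts which weight tensors the combined model would actually update:

```python
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model, Sequential

# Toy-sized stand-ins for the networks in the question
generator = Sequential([Input(shape=(4,)),
                        Dense(8, activation='relu'),
                        Dense(4, activation='sigmoid')])
discriminator = Sequential([Input(shape=(4,)),
                            Dense(8, activation='relu'),
                            Dense(1, activation='sigmoid')])

# Compile the discriminator first, then freeze it BEFORE building the GAN
discriminator.compile(loss='mean_squared_error', optimizer='adam')
discriminator.trainable = False

ganInput = Input(shape=(4,))
gan = Model(inputs=ganInput, outputs=discriminator(generator(ganInput)))
gan.compile(loss='binary_crossentropy', optimizer='adam')

# Only the generator's weights remain trainable in the combined model:
# 2 Dense layers x (kernel + bias) = 4 weight tensors
print(len(gan.trainable_weights))
print(len(discriminator.trainable_weights))  # 0 while frozen
```

Because `discriminator` was compiled before the flag was flipped, its own `train_on_batch` calls still update its weights, while `GAN.train_on_batch` only updates the generator.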