Update: Found it. The class is tf.keras.layers.Activation; it needs to be instantiated with the argument activation='relu'.
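In case it helps anyone landing here, a minimal sketch of that call (assuming the tf.keras namespace shipped with TF 1.8):

import tensorflow as tf

relu_layer = tf.keras.layers.Activation(activation='relu')  # a layer that applies ReLU element-wise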
Trying to access tf.keras.layers.ReLU gives the error:
AttributeError: module 'tensorflow.tools.api.generator.api.keras.layers' has no attribute 'ReLU'.
In the docs, the master version has such a layer, but version 1.8 (and 1.9) only seems to have LeakyReLU, PReLU, and other derivatives.
Right now I'm using ThresholdedReLU with a theta of 0.0, which I hope behaves like a standard ReLU. But surely there must be a plain 'ReLU' layer as well?
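For what it's worth, ThresholdedReLU computes f(x) = x for x > theta and 0 otherwise, so with theta = 0.0 it should match max(x, 0). A quick sanity-check sketch (assuming TF 1.x graph mode):

import numpy as np
import tensorflow as tf

x = np.array([[-2.0, -0.5, 0.0, 0.5, 2.0]], dtype=np.float32)
y = tf.keras.layers.ThresholdedReLU(theta=0.0)(tf.constant(x))
with tf.Session() as sess:
    print(sess.run(y))       # [[0.  0.  0.  0.5 2. ]]
print(np.maximum(x, 0.0))    # identical output, i.e. a standard ReLU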
Where can I find Keras's ReLU layer in TensorFlow 1.8? I want a Keras layer class, i.e., not tf.keras.backend.relu.
It feels as if I'm overlooking something completely obvious. I haven't used Keras before, so sorry if this is a super stupid question.
For simple activations you can just use the generic Activation layer, passing the activation name as a string:
from tensorflow.keras.layers import Activation

relu = Activation('relu')
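And in context, a hypothetical Sequential model (the layer sizes here are made up for illustration):

import tensorflow as tf

model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(64, input_shape=(32,)),
    tf.keras.layers.Activation('relu'),  # the ReLU applied as its own layer
    tf.keras.layers.Dense(10),
])
model.compile(optimizer='sgd', loss='mse')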