In Keras, I would do the following to dynamically create a model's layers:
for i in range(number_dense_layers):
    model.add(layers.Dense(units=units, input_dim=input_dim,
                           kernel_initializer='normal', activation='relu'))
However, in TensorFlow, where I subclass tf.keras.Model, I have the following:
class generic_vns_function(tf.keras.Model):
    def __init__(self, num_layers, num_class=10):
        super().__init__()
        # Convolutional layers and MaxPools
        self.conv1 = tf.keras.layers.Conv2D(64, 3, activation="relu")
        self.conv2 = tf.keras.layers.Conv2D(64, 3, activation="relu")
where I would want to do something like:
for i in range(num_layers):
    self.add(tf.keras.layers.Conv2D(64, 3, activation="relu"))
but I am unsure how to create these layers dynamically, since a subclassed
Model has no add method the way a Keras Sequential model does.
You can append the layers to a list first and stack them later in call.
Here is a rough example:
import tensorflow as tf

class generic_vns_function(tf.keras.Model):
    def __init__(self, num_layers, num_class=10):
        super().__init__()
        # Keep the dynamically created layers in a list; Keras tracks
        # layers held in list attributes of a subclassed Model.
        self.convolutions = []
        for i in range(num_layers):
            self.convolutions.append(
                tf.keras.layers.Conv2D(64, 3, activation="relu"))
        # ... any other layers (pooling, dense, etc.)

    def call(self, inputs):
        x = inputs
        # Apply the stacked convolutions in order
        for layer in self.convolutions:
            x = layer(x)
        # ... rest of the forward pass
        return x
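As a quick sanity check, you can instantiate the model and call it on a dummy batch; the layers appended to the list are tracked and show up in model.layers. The input shape and layer count below are just illustrative assumptions, not part of the question:

model = generic_vns_function(num_layers=3)
dummy = tf.random.normal((1, 28, 28, 1))   # hypothetical 28x28 single-channel input
out = model(dummy)
print(out.shape)           # e.g. (1, 22, 22, 64) after three unpadded 3x3 convs
print(len(model.layers))   # the appended Conv2D layers are tracked by Keras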