
Keras: Syntax clarification


Newbie in Keras:

I am trying to understand the syntax used in Keras. The syntax I have difficulty understanding is the one used while building a network; I have seen it in a number of places, including the code below.

Statements like: current_layer = SOME_CODE(current_layer)
What is the meaning of such a statement? Does it mean that the computation described in SOME_CODE is performed first, followed by the computation described in the current layer?

What is the use of this syntax, and when should one use it? Are there any advantages or alternatives?

import tensorflow as tf
import keras

# IMAGE_BORDER_LENGTH and NB_CHANNELS are constants defined elsewhere in the repo
input_layer = keras.layers.Input(
    (IMAGE_BORDER_LENGTH, IMAGE_BORDER_LENGTH, NB_CHANNELS))

current_layer = random_image_mirror_left_right(input_layer)

current_layer = keras.layers.convolutional.Conv2D(
    filters=16,  # "some values" - remaining arguments omitted here
)(current_layer)

def random_image_mirror_left_right(input_layer):
    # Wrap a TensorFlow op in a Lambda layer and call it on the input tensor
    return keras.layers.core.Lambda(function=lambda batch_imgs: tf.map_fn(
        lambda img: tf.image.random_flip_left_right(img), batch_imgs
    ))(input_layer)

Solution

  • If you are indeed a newbie in Keras, as you say, I would strongly suggest not bothering with such advanced stuff at this stage.

    The repo you are referring to is a rather advanced and highly non-trivial case of using a specialized library (HyperOpt) for automatically meta-optimizing a Keras model. It involves 'automatic' model building according to some configuration parameters already stored in a Python dictionary...
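
    To give a rough idea of what 'building a model from a dictionary of configuration parameters' means, here is a deliberately simplified, hypothetical sketch (the dictionary keys, values, and shapes below are made up for illustration and are not taken from that repo):

        import keras

        # Hypothetical hyperparameter dictionary; the real repo stores many more entries
        hyperparams = {"nb_conv_blocks": 2, "nb_filters": 16, "kernel_size": 3}

        def build_model(hp, input_shape=(32, 32, 3), nb_classes=100):
            # Every layer call below returns a tensor, which is fed to the next layer
            inputs = keras.layers.Input(input_shape)
            current_layer = inputs
            for _ in range(hp["nb_conv_blocks"]):
                current_layer = keras.layers.Conv2D(
                    filters=hp["nb_filters"], kernel_size=hp["kernel_size"],
                    padding="same", activation="relu")(current_layer)
                current_layer = keras.layers.MaxPooling2D()(current_layer)
            current_layer = keras.layers.Flatten()(current_layer)
            outputs = keras.layers.Dense(nb_classes, activation="softmax")(current_layer)
            return keras.models.Model(inputs=inputs, outputs=outputs)

        model = build_model(hyperparams)

    A meta-optimizer such as HyperOpt then repeatedly calls a builder like this with different dictionaries and keeps the configuration that performs best.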

    Additionally, the function you quote goes beyond Keras to involve TensorFlow methods and lambda functions...
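
    For reference, the mechanism itself is simple: a Lambda layer wraps an arbitrary function (here a TensorFlow op) so that it behaves like a Keras layer and can be called on a tensor. A minimal sketch, assuming the standalone Keras API with a TensorFlow backend:

        import tensorflow as tf
        import keras

        inputs = keras.layers.Input((32, 32, 3))
        # Wrap a plain TensorFlow op so it can be used like any other Keras layer;
        # note the same layer(tensor) calling pattern you are asking about
        flipped = keras.layers.Lambda(
            lambda batch_imgs: tf.map_fn(tf.image.random_flip_left_right, batch_imgs)
        )(inputs)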

    The current_layer = SOME_CODE(current_layer) pattern is a typical example of the Keras functional API; in my experience, it is less widely used than the more straightforward Sequential API, but it can come in handy in some more advanced cases, e.g.:

    The Keras functional API is the way to go for defining complex models, such as multi-output models, directed acyclic graphs, or models with shared layers. [...] With the functional API, it is easy to re-use trained models: you can treat any model as if it were a layer, by calling it on a tensor. Note that by calling a model you aren't just re-using the architecture of the model, you are also re-using its weights.
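
    To make the pattern concrete, here is a minimal sketch (layer sizes are arbitrary): each keras.layers.SomeLayer(...) call builds a layer object, and the trailing (x) immediately calls that layer on the tensor x, returning a new tensor that you typically re-assign to the same variable. A trained Model can then be called on a tensor in exactly the same way, as the quote above describes:

        import keras

        # Functional API: each line builds a layer and immediately calls it on a tensor
        inputs = keras.layers.Input((784,))
        x = keras.layers.Dense(64, activation="relu")(inputs)
        x = keras.layers.Dense(64, activation="relu")(x)
        outputs = keras.layers.Dense(10, activation="softmax")(x)
        model = keras.models.Model(inputs=inputs, outputs=outputs)

        # Re-using the whole model as if it were a layer:
        new_inputs = keras.layers.Input((784,))
        new_outputs = model(new_inputs)  # re-uses the architecture *and* the weights

    The equivalent Sequential model (keras.models.Sequential([...])) hides this tensor passing entirely, which is why the functional style mostly appears in the more advanced cases the quote mentions.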