Tags: r, tensorflow, keras, loss-function, activation-function

How to use different activations in output layer in Keras in R


I want to combine multiple types of activations in the output layer in the Keras interface for R. I also want to use different loss functions for different outputs. Let's say I want the first two neurons to be linear with an MSE loss, the next two neurons sigmoid with a BCE loss, and the last output relu with an MAE loss. So far I have the following, and it is not working:

model <- keras_model_sequential()

model %>% layer_dense(units=120, activation="selu", 
           input_shape=dim(X)[2]) # this is hidden layer, this works fine

model %>% layer_dense(units=5, activation=as.list(c(rep("linear",2), 
            rep("sigmoid",2), "relu"))) # output layer which is not working

model %>% compile(loss=as.list(c(rep("mean_squared_error",2), 
             rep("binary_crossentropy",2), "mean_absolute_error")), # problem here ?
             optimizer=optimizer_adam(lr=0.001) ,metrics = "mae")

After this I fit the model with model %>% fit(...).

Error is the following:

Error in py_call_impl(callable, dots$args, dots$keywords) : 
  ValueError: When passing a list as loss, it should have one entry per model outputs. 
  The model has 1 outputs, but you passed loss=['mean_squared_error', 'mean_squared_error', ...

Any help is appreciated.

EDIT: only rewrote the code so that it is more readable.


Solution

  • I think that if you want multiple outputs you need to use the functional (that is, not the sequential) API - see some examples here: https://keras.rstudio.com/articles/functional_api.html. A rough sketch of that approach is below.
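
For illustration, here is a minimal sketch under the assumptions from the question (a 120-unit selu hidden layer and five outputs: two linear with MSE, two sigmoid with BCE, one relu with MAE). The head names out_linear, out_sigmoid and out_relu are made up for this example, and X / y are assumed to exist as in the question:

library(keras)

inputs <- layer_input(shape = dim(X)[2])

# shared hidden layer, same as in the question
hidden <- inputs %>% layer_dense(units = 120, activation = "selu")

# one output head per activation/loss combination
out_linear  <- hidden %>% layer_dense(units = 2, activation = "linear",  name = "out_linear")
out_sigmoid <- hidden %>% layer_dense(units = 2, activation = "sigmoid", name = "out_sigmoid")
out_relu    <- hidden %>% layer_dense(units = 1, activation = "relu",    name = "out_relu")

model <- keras_model(inputs = inputs,
                     outputs = list(out_linear, out_sigmoid, out_relu))

# one loss per output, in the same order as the outputs list
model %>% compile(
  loss = list("mean_squared_error", "binary_crossentropy", "mean_absolute_error"),
  optimizer = optimizer_adam(lr = 0.001),
  metrics = "mae"
)

# fit() then expects the targets split per output, e.g.
# model %>% fit(X, list(y[, 1:2], y[, 3:4], y[, 5, drop = FALSE]), ...)

With separate output heads, Keras sees three outputs instead of one, so the list of losses lines up and the original ValueError goes away.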