
usage of tf.keras.layers.DenseFeatures


Here is the official doc.

A layer that produces a dense Tensor based on given feature_columns.

Inherits From: DenseFeatures

tf.keras.layers.DenseFeatures(
    feature_columns, trainable=True, name=None, **kwargs
)

This is used in the TF examples and is usually placed inside a keras.Sequential(...) model construction, like below:

model = tf.keras.Sequential([
  feature_layer,
  layers.Dense(128, activation='relu'),
  layers.Dense(128, activation='relu'),
  layers.Dropout(.1),
  layers.Dense(1)
])
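
In that example the model is then trained on a tf.data.Dataset built from a dict of DataFrame columns, roughly like this (a minimal sketch; the column names, label name and batch size are made up for illustration):

import pandas as pd
import tensorflow as tf

# hypothetical toy DataFrame: one numeric feature and a binary label column
dataframe = pd.DataFrame({'temp': [-80.0, 10.0, 60.0, 100.0],
                          'target': [0, 1, 1, 0]})

def df_to_dataset(dataframe, shuffle=True, batch_size=2):
    # DenseFeatures expects a dict of {column name: values}, so the DataFrame
    # is turned into a dict before building the dataset
    dataframe = dataframe.copy()
    labels = dataframe.pop('target')
    ds = tf.data.Dataset.from_tensor_slices((dict(dataframe), labels))
    if shuffle:
        ds = ds.shuffle(buffer_size=len(dataframe))
    return ds.batch(batch_size)

train_ds = df_to_dataset(dataframe)
# model.fit(train_ds, epochs=5)  # the Sequential model above consumes this directly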

In my case, I want to use it to convert my dictionary-style data into Tensor format and pass it into the model. So I used code like below:

from tensorflow import feature_column
from tensorflow.keras import layers

feature_columns = []
bins = [-125, -75, -50, -25, 0, 25, 50, 75, 125]
# bucketize the raw 'temp' value into the ranges defined by bins
temp_num = feature_column.numeric_column('temp')
temp_buckets = feature_column.bucketized_column(temp_num, boundaries=bins)
feature_columns.append(temp_buckets)
feature_layer = layers.DenseFeatures(feature_columns)
input = feature_layer(dict(dataframe))

And input is the training data I would feed into the model. The question is whether my usage of this DenseFeatures() layer is reasonable, or does this feature_layer have to be part of a keras.Model class?
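
For reference, on made-up sample values the layer call produces a dense one-hot tensor like this (TF 2.x assumed, toy numbers only):

import numpy as np
import tensorflow as tf
from tensorflow import feature_column
from tensorflow.keras import layers

bins = [-125, -75, -50, -25, 0, 25, 50, 75, 125]
temp_buckets = feature_column.bucketized_column(
    feature_column.numeric_column('temp'), boundaries=bins)
feature_layer = layers.DenseFeatures([temp_buckets])

# toy values for the 'temp' column; each row comes back as a one-hot vector
# over len(bins) + 1 = 10 buckets
print(feature_layer({'temp': np.array([-80.0, 10.0, 60.0])}).numpy())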


Solution

  • Yes, your idea is reasonable. You are free to choose either the Keras functional API or the Keras Sequential API when specifying your deep learning architecture.

    To complete your work, I would remove the last line and make a few additional tweaks. What follows is a code snippet that completes your work using the Keras functional API (note that the dictionary key must match the feature column's name, 'temp'):

    import tensorflow as tf
    from tensorflow import feature_column

    feature_columns = []
    bins = [-125, -75, -50, -25, 0, 25, 50, 75, 125]
    temp_num = feature_column.numeric_column('temp')
    temp_buckets = feature_column.bucketized_column(temp_num, boundaries=bins)
    feature_columns.append(temp_buckets)
    feature_layer = tf.keras.layers.DenseFeatures(feature_columns)

    # create a dictionary that maps each feature column's key to a symbolic Keras input;
    # the key has to be 'temp' (the name given to numeric_column), otherwise
    # DenseFeatures will not find the feature
    inputs = {}
    inputs["temp"] = tf.keras.Input(shape=(1,), name="temp")
    
    # convert FeatureColumns into a single tensor layer
    x = feature_layer(inputs)
    
    x = tf.keras.layers.Dense(128, activation='relu')(x)
    x = tf.keras.layers.Dense(128, activation='relu')(x)
    x = tf.keras.layers.Dropout(.1)(x)
    out = tf.keras.layers.Dense(1)(x)
    
    model = tf.keras.Model(inputs=inputs, outputs=out)
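
    Once the model is defined this way, you can train it by feeding the features as a dict keyed by the column name 'temp'. A short sketch (the labels, optimizer, loss and epoch count below are placeholders, not part of your original setup):

    model.compile(optimizer='adam',
                  loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
                  metrics=['accuracy'])

    # 'dataframe' is the pandas DataFrame from the question; 'labels' is a
    # hypothetical array/Series of targets aligned with its rows
    model.fit({'temp': dataframe['temp'].values.reshape(-1, 1)}, labels, epochs=10)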