Tags: python, nlp, bert-language-model, nlu

Why is input_mask all the same number in the BERT language model?


For a text classification task I applied BERT (fine-tuned) and the output I got is below. Why is input_mask all 1s?

#to_feature_map is a function.
to_feature_map("hi how are you doing",0)
({'input_mask': <tf.Tensor: shape=(64,), dtype=int32, numpy=
  array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
         0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
         0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
        dtype=int32)>,
  'input_type_ids': <tf.Tensor: shape=(64,), dtype=int32, numpy=
  array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
         0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
         0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
        dtype=int32)>,
  'input_word_ids': <tf.Tensor: shape=(64,), dtype=int32, numpy=
  array([ 101, 7632, 2129, 2024, 2017, 2725,  102,    0,    0,    0,    0,
            0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
            0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
            0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
            0,    0,    0,    0,    0,    0,    0,    0,    0,    0,    0,
            0,    0,    0,    0,    0,    0,    0,    0,    0], dtype=int32)>},
 <tf.Tensor: shape=(), dtype=int32, numpy=0>)

Solution

  • The input mask allows the model to cleanly differentiate between content and padding. It has the same shape as the input ids, and contains 1 wherever the input ids are not padding and 0 wherever they are. So it is not all the same number: in your output, the first seven positions ([CLS], the five word-piece tokens of "hi how are you doing", and [SEP]) are 1, and the remaining 57 padding positions are 0.
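The logic can be sketched in plain Python. This is a minimal illustration, not the actual `to_feature_map` implementation; the names `MAX_SEQ_LEN`, `CLS_ID`, `SEP_ID`, and `make_features` are made up for this example, and the token ids are taken from the output shown in the question.

```python
# Illustrative constants (assumed, not from any specific library):
# 101 and 102 are BERT's [CLS] and [SEP] token ids; sequences are
# padded to a fixed length of 64, matching the tensors above.
MAX_SEQ_LEN = 64
CLS_ID, SEP_ID = 101, 102

def make_features(token_ids):
    """Pad token ids to MAX_SEQ_LEN and build the matching mask."""
    ids = [CLS_ID] + token_ids + [SEP_ID]
    pad_len = MAX_SEQ_LEN - len(ids)
    input_word_ids = ids + [0] * pad_len
    # 1 for every real token (including [CLS]/[SEP]), 0 for padding
    input_mask = [1] * len(ids) + [0] * pad_len
    # single-sentence input: every position belongs to segment 0
    input_type_ids = [0] * MAX_SEQ_LEN
    return input_word_ids, input_mask, input_type_ids

# Token ids for "hi how are you doing", as shown in the question
word_ids, mask, type_ids = make_features([7632, 2129, 2024, 2017, 2725])
print(mask[:10])  # -> [1, 1, 1, 1, 1, 1, 1, 0, 0, 0]
```

Longer inputs would produce more leading 1s; the mask only looks "all the same" because the example sentence is short relative to the 64-token sequence length.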