
I am using Keras with TensorFlow 2.0. I have an embedding layer that I initialize as follows:

embed = Embedding(len(embedding_weights), params['embedding_dim'],
                  input_length=sequence_length, mask_zero=True,
                  weights=[embedding_weights], name="embedding")(model_input)

embedding_weights is a matrix of word embeddings. embedding_weights[0] is a row of zeros.

However, for an input like [1, 2, 5, 1, 5, 0, 0, 0, ...], the embedding layer outputs non-zero vectors for the padding zeros. Aren't these supposed to be zero? Why is the model updating the zero row of the embedding matrix?
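As far as the Keras documentation describes it, mask_zero=True only makes the Embedding layer generate a boolean mask for downstream mask-aware layers; it does not force the output at padded positions to zero. The padded positions are still an ordinary lookup of row 0. Below is a minimal sketch illustrating this, using made-up toy sizes (vocab_size, embedding_dim, sequence_length and the input x are invented for the example, not taken from the question):

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Embedding

# Made-up toy sizes standing in for the question's real data.
vocab_size, embedding_dim, sequence_length = 6, 4, 8

# Row 0 is all zeros, like embedding_weights[0] in the question.
embedding_weights = np.random.rand(vocab_size, embedding_dim).astype("float32")
embedding_weights[0] = 0.0

embed = Embedding(vocab_size, embedding_dim,
                  input_length=sequence_length, mask_zero=True,
                  weights=[embedding_weights], name="embedding")

x = tf.constant([[1, 2, 5, 1, 5, 0, 0, 0]])
out = embed(x)

# The padded positions are a plain lookup of row 0 of the embedding matrix;
# they stay zero only for as long as row 0 itself stays zero.
print(np.allclose(out[0, 5:], embed.get_weights()[0][0]))  # True

# mask_zero only attaches a boolean mask that mask-aware downstream layers use.
print(embed.compute_mask(x))  # [[ True  True  True  True  True False False False]]

Because row 0 is still an ordinary row of the trainable embedding matrix, any gradient that reaches it (for example through layers that ignore the mask) will update it, which is presumably why it drifts away from zero during training.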

  • For the masked positions, the output vectors repeat the vector from the last non-zero step of the calculation (see the sketch after these comments). See this answer for details: https://stackoverflow.com/questions/47485216/how-does-mask-zero-in-keras-embedding-layer-work – Björn Lindqvist Jun 17 '20 at 04:36
  • Does this answer your question? [How does mask_zero in Keras Embedding layer work?](https://stackoverflow.com/questions/47485216/how-does-mask-zero-in-keras-embedding-layer-work) – Björn Lindqvist Jun 17 '20 at 04:36
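To see the behaviour the linked answer describes, the masked embeddings can be fed into a mask-aware recurrent layer. The sketch below continues the toy setup from the snippet above (same embed and x, not the question's actual model):

lstm = tf.keras.layers.LSTM(3, return_sequences=True)
y = lstm(embed(x))  # the mask from the Embedding layer is propagated automatically

# At the masked (padded) timesteps the LSTM does not update its state; it
# carries the last unmasked output forward, so y[0, 4], y[0, 5], y[0, 6]
# and y[0, 7] are identical.
print(y[0, 4:])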

0 Answers