I am using Keras with TensorFlow 2.0. I have an embedding layer that I initialize as follows:
embed = Embedding(len(embedding_weights), params['embedding_dim'],
                  input_length=sequence_length, mask_zero=True,
                  weights=[embedding_weights], name="embedding")(model_input)
embedding_weights is a matrix of word embeddings, and embedding_weights[0] is a row of zeros.
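For context, here is roughly how a matrix like this can be built (a minimal sketch; the vocabulary size, dimension, and random values are made up for illustration):

import numpy as np

vocab_size = 10       # hypothetical vocabulary size
embedding_dim = 4     # stands in for params['embedding_dim']

# Row 0 is reserved for padding and stays all zeros;
# the remaining rows hold the word vectors (random here for the sketch).
embedding_weights = np.zeros((vocab_size, embedding_dim))
embedding_weights[1:] = np.random.rand(vocab_size - 1, embedding_dim)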
However, for an input like [1, 2, 5, 1, 5, 0, 0, 0, ...],
the embedding layer outputs non-zero vectors for the padding zeros. Aren't these supposed to be zeros? Why is the model updating the zero vector in the embedding matrix?
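Here is a minimal sketch of how I build the model and inspect the padded positions (it reuses the toy embedding_weights and embedding_dim from above; in this untrained sketch the padded rows come out as zeros, but in my trained model the same positions come out non-zero):

import numpy as np
from tensorflow.keras.layers import Embedding, Input
from tensorflow.keras.models import Model

sequence_length = 8
model_input = Input(shape=(sequence_length,), dtype='int32')
embed = Embedding(len(embedding_weights), embedding_dim,
                  input_length=sequence_length, mask_zero=True,
                  weights=[embedding_weights], name="embedding")(model_input)
model = Model(model_input, embed)

# Positions 5-7 are padding (index 0).
x = np.array([[1, 2, 5, 1, 5, 0, 0, 0]])
out = model.predict(x)
print(out[0, 5:])  # embeddings at the padded positions; non-zero after training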