I have created a custom loss function that looks roughly like this:

def customLoss(true, pred):
    # do stuff
    # print(variables)
    return loss
Now I'm calling compile as model.compile(optimizer='Adamax', loss = customLoss)
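For context, a minimal runnable version of this setup looks roughly like the following (the mean-squared-error body and the tiny Dense model are just stand-ins for my real loss and model, using standalone Keras on a TF 1.x backend):

import tensorflow as tf
from keras import backend as K
from keras.models import Model
from keras.layers import Input, Dense

def customLoss(true, pred):
    # stand-in computation; the real loss does more work here
    loss = K.mean(K.square(true - pred))
    return loss

inputs = Input(shape=(16,))
outputs = Dense(1)(inputs)
model = Model(inputs=[inputs], outputs=[outputs])
model.compile(optimizer='Adamax', loss=customLoss)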
EDIT: I tried tf.Print and this is my result.
def customLoss(params):
    def lossFunc(true, pred):
        true = tf.Print(true, [true.shape], 'loss-func')  # obviously this won't work because the tensors aren't the same shape; however, this is what I want to do
        # stuff
        return loss
    return lossFunc
model = Model(inputs=[inputs], outputs=[outputs])
parallel_model = multi_gpu_model(model, gpus=8)
parallel_model.compile(optimizer='Adam', loss=customLoss(params), metrics=[mean_iou])
history = parallel_model.fit(X_train, Y_train, validation_split=0.25, batch_size = 32, verbose=1)
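In case it helps to reproduce, here is a stripped-down, self-contained version of the same pattern on a single GPU with dummy data (the model, params value, and loss body are placeholders for my real ones):

import numpy as np
import tensorflow as tf
from keras.models import Model
from keras.layers import Input, Dense

def customLoss(params):
    def lossFunc(true, pred):
        # tf.Print returns a tensor identical to its first argument and prints
        # the tensors in the data list as a side effect each time that returned
        # tensor is evaluated, so the result has to feed into the loss
        pred = tf.Print(pred, [tf.shape(true), tf.shape(pred)], message='loss-func ')
        loss = tf.reduce_mean(tf.square(true - pred)) * params
        return loss
    return lossFunc

inputs = Input(shape=(16,))
outputs = Dense(1)(inputs)
model = Model(inputs=[inputs], outputs=[outputs])
model.compile(optimizer='Adam', loss=customLoss(1.0))

X_train = np.random.rand(256, 16).astype('float32')
Y_train = np.random.rand(256, 1).astype('float32')
# tf.Print writes to the process's standard error, not standard output
model.fit(X_train, Y_train, validation_split=0.25, batch_size=32, verbose=1)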
and the output is
Epoch 1/10
1159/1159 [==============================] - 75s 65ms/step - loss: 0.1051 - mean_iou: 0.4942 - val_loss: 0.0924 - val_mean_iou: 0.6933
Epoch 2/10
1152/1159 [============================>.] - ETA: 0s - loss: 0.0408 - mean_iou: 0.7608
The print statements still aren't showing up. Am I missing something, or are my arguments to tf.Print incorrect?