I trained a neural network to do a regression on the sine function and would like to compute the first and second derivative with respect to the input. I tried using the tf.gradients() function like this (neural_net is an instance of tf.keras.Sequential):
prediction = neural_net(x_value)
dx_f = tf.gradients(prediction, x_value)
dx_dx_f = tf.gradients(dx_f, x_value)
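For reference, here is a minimal self-contained sketch of my setup (the architecture, training details and test range here are placeholders, not my exact configuration), assuming TF1-style graph execution via tf.compat.v1:

import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # tf.gradients() only works in graph mode

# placeholder architecture -- the real network is trained to fit sin(x)
neural_net = tf.keras.Sequential([
    tf.keras.layers.Dense(50, activation='tanh', input_shape=(1,)),
    tf.keras.layers.Dense(50, activation='tanh'),
    tf.keras.layers.Dense(1),
])

x_value = tf.compat.v1.placeholder(tf.float32, shape=(None, 1))
prediction = neural_net(x_value)
dx_f = tf.gradients(prediction, x_value)    # first derivative w.r.t. the input
dx_dx_f = tf.gradients(dx_f, x_value)       # second derivative w.r.t. the input

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    # (training on sin(x) omitted here)
    x_test = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
    pred, d1, d2 = sess.run([prediction, dx_f, dx_dx_f],
                            feed_dict={x_value: x_test})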
x_value is an array with the length of the test set. However, the predictions and derivatives this produces look off. The prediction of the network (blue curve) matches the sine function almost exactly, but I had to divide the first derivative (orange) by a factor of 10 and the second derivative (green) by a factor of 100 to get them into the same order of magnitude. After that rescaling the first derivative looks OK, but the second derivative is completely erratic. Since the prediction of the sine function works really well, there is clearly something funny going on here.
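For context, this is roughly how I rescale and plot everything, continuing from the sketch above (the factors of 10 and 100 are just whatever happened to bring the curves into a comparable range):

import matplotlib.pyplot as plt

plt.plot(x_test, pred, label='prediction (blue)')
plt.plot(x_test, d1[0] / 10.0, label='1st derivative / 10 (orange)')
plt.plot(x_test, d2[0] / 100.0, label='2nd derivative / 100 (green)')
plt.plot(x_test, np.sin(x_test), '--', label='sin(x)')
plt.legend()
plt.show()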