RMSE/ RMSLE loss function in Keras
Solution 1
When you use a custom loss, you need to pass it without quotes, i.e. pass the function object itself, not a string:
from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))

model.compile(optimizer="rmsprop", loss=root_mean_squared_error, metrics=["accuracy"])
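To see why the quoted string fails with "Unknown loss function", here is a minimal sketch of the idea (illustrative only; `BUILTIN_LOSSES` and `get_loss` are made-up names, not the actual Keras internals): string identifiers are looked up in a registry of built-in losses, so a custom function's name is unknown unless you pass the function object itself.

```python
# Hypothetical sketch of string-vs-callable loss resolution.
# BUILTIN_LOSSES and get_loss are illustrative, not the real Keras API.

def mse(y_true, y_pred):
    # plain-Python mean squared error over two equal-length sequences
    return sum((p - t) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

BUILTIN_LOSSES = {"mean_squared_error": mse, "mse": mse}

def get_loss(identifier):
    # A callable is returned unchanged: this is why passing the
    # function object works for a custom loss.
    if callable(identifier):
        return identifier
    # A string is only valid if it names a built-in loss.
    try:
        return BUILTIN_LOSSES[identifier]
    except KeyError:
        raise ValueError(("Unknown loss function", identifier))

def root_mean_squared_error(y_true, y_pred):
    return mse(y_true, y_pred) ** 0.5

# Passing the function object succeeds:
assert get_loss(root_mean_squared_error) is root_mean_squared_error
# Passing the string "root_mean_squared_error" would raise ValueError,
# mirroring the error in the question.
```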
Solution 2
The accepted answer contains an error which causes the computed RMSE to actually be the MAE, as per the following issue:
https://github.com/keras-team/keras/issues/10706
The correct definition should be
from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true)))
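The effect of the extra `axis=-1` is easy to reproduce outside Keras. With a single output, the mean over the last axis is a no-op, so the square root turns each squared error back into an absolute error, and averaging those per-sample values yields the MAE rather than the RMSE. A small NumPy sketch (illustrative only):

```python
import numpy as np

y_true = np.array([[1.0], [2.0], [3.0]])
y_pred = np.array([[2.0], [4.0], [3.0]])
err = y_pred - y_true  # per-sample errors: 1, 2, 0

# Buggy variant: mean over the last axis gives one value per sample,
# and Keras then averages those per-sample "RMSEs".
buggy = np.mean(np.sqrt(np.mean(np.square(err), axis=-1)))

mae = np.mean(np.abs(err))               # 1.0
rmse = np.sqrt(np.mean(np.square(err)))  # sqrt(5/3), about 1.29

print(buggy, mae, rmse)  # buggy equals the MAE, not the RMSE
```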
Solution 3
If you are using the latest TensorFlow nightly, there is a tf.keras.metrics.RootMeanSquaredError() in the source code, although it does not yet appear in the documentation.
Sample usage:

model.compile(tf.compat.v1.train.GradientDescentOptimizer(learning_rate),
              loss=tf.keras.metrics.mean_squared_error,
              metrics=[tf.keras.metrics.RootMeanSquaredError(name='rmse')])
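Note that RootMeanSquaredError is a stateful (streaming) metric: it accumulates squared error and an element count across batches and only takes the square root when the result is read. A rough sketch of that idea in plain NumPy (the class and method names below loosely mimic a stateful Keras metric but are illustrative, not the tf.keras implementation):

```python
import numpy as np

class RunningRMSE:
    """Illustrative streaming RMSE, loosely modeled on a stateful Keras metric."""

    def __init__(self):
        self.squared_sum = 0.0  # running sum of squared errors
        self.count = 0          # number of elements seen so far

    def update_state(self, y_true, y_pred):
        err = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
        self.squared_sum += float(np.sum(np.square(err)))
        self.count += err.size

    def result(self):
        # square root is applied only here, over the global mean
        return float(np.sqrt(self.squared_sum / self.count))

m = RunningRMSE()
m.update_state([1.0, 2.0], [2.0, 4.0])  # batch 1: errors 1 and 2
m.update_state([3.0], [3.0])            # batch 2: error 0
print(m.result())                       # sqrt((1 + 4 + 0) / 3)
```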
Solution 4
I prefer to reuse part of the existing Keras work:
from keras import backend as K
from keras.losses import mean_squared_error

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(mean_squared_error(y_true, y_pred))

model.compile(optimizer="rmsprop", loss=root_mean_squared_error, metrics=["accuracy"])
Solution 5
You can do RMSLE the same way RMSE is shown in the other answers; you just also need to incorporate the log function:
from tensorflow.keras import backend as K
def root_mean_squared_log_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(K.log(1 + y_pred) - K.log(1 + y_true))))
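As a sanity check outside Keras, the same formula can be written in NumPy using log1p, which computes log(1 + x) more accurately for small x (a sketch of the formula, not the answer's Keras code):

```python
import numpy as np

def rmsle(y_true, y_pred):
    # np.log1p(x) == np.log(1 + x), but numerically stabler near zero
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(np.square(np.log1p(y_pred) - np.log1p(y_true)))))

print(rmsle([1.0, 2.0], [1.0, 2.0]))  # perfect predictions give 0.0
print(rmsle([0.0], [np.e - 1.0]))     # log1p gap of exactly 1 gives 1.0
```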
dennis
Updated on July 09, 2022

Comments
-
dennis almost 2 years
I am trying to participate in my first Kaggle competition, where RMSLE is given as the required loss function. Since I have found nothing on how to implement this loss function, I tried to settle for RMSE. I know this was part of Keras in the past; is there any way to use it in the latest version, maybe with a customized function via backend? This is the NN I designed:
from keras.models import Sequential
from keras.layers.core import Dense, Dropout
from keras import regularizers

model = Sequential()
model.add(Dense(units=128, kernel_initializer="uniform", activation="relu",
                input_dim=28, activity_regularizer=regularizers.l2(0.01)))
model.add(Dropout(rate=0.2))
model.add(Dense(units=128, kernel_initializer="uniform", activation="relu"))
model.add(Dropout(rate=0.2))
model.add(Dense(units=1, kernel_initializer="uniform", activation="relu"))
model.compile(optimizer="rmsprop", loss="root_mean_squared_error")  # , metrics=["accuracy"])
model.fit(train_set, label_log, batch_size=32, epochs=50, validation_split=0.15)
I tried a customized root_mean_squared_error function I found on GitHub, but for all I know the syntax is not what is required. I think the y_true and the y_pred would have to be defined before being passed to the return, but I have no idea how exactly; I just started programming in Python and I am really not that good at math...

from keras import backend as K

def root_mean_squared_error(y_true, y_pred):
    return K.sqrt(K.mean(K.square(y_pred - y_true), axis=-1))
I receive the following error with this function:
ValueError: ('Unknown loss function', ':root_mean_squared_error')
Thanks for your ideas, I appreciate every help!
-
dennis about 7 years: Works perfectly fine, thank you very much for pointing out that mistake. I really did not think about it that way, as I am kind of new to programming. You would not know by any chance how to edit this custom function so that it computes the root mean squared LOGARITHMIC error, would you?
-
Jitesh almost 7 years: It gives me Unknown loss function:root_mean_squared_error
-
Dr. Snoopy almost 7 years: @Jitesh Please do not make such comments, make your own question with source code.
-
carllacan about 6 years: @Jitesh You're probably putting quotes around the function's name. You need to pass the function object to the compile function, not its name.
-
muon over 5 years: you mean metrics=['mse']?
-
George C about 4 years: One thing to note is that the manifold of this loss function may go to infinity (because of the square root) and the training can fail.
-
Jo.Hen about 4 years: Thank you very much for this comment! I spent so much time trying to figure out why my RMSE results (using the code above) are the same as the MAE.
-
Jo.Hen about 4 years: This code gives the same value as MAE, not RMSE (see answer below).
-
Dr. Snoopy about 4 years: I just updated the answer; by setting axis=None (the default), it will take the mean over all dimensions.
-
zipline86 about 4 years: @muon mse stands for Mean Squared Error. The difference is taken and then squared, followed by taking the mean. This is different than RMSE (Root Mean Squared Error), because the square root is taken of the whole Mean Squared Error.
-
Tom N Tech over 3 years: I get an error when I try to use it as a loss function: AttributeError: 'RootMeanSquaredError' object has no attribute '__name__', even though I used the name parameter.
-
atline over 3 years: You may want to add more explanation.
-
Hong Cheng over 3 years: I just tried this function and got an infinite loss ^_^
-
Bersan over 3 years: You should always add the import import tensorflow.keras.backend as K (I added it to the answer)
-
George C over 3 years: lol, yes, if at some point in the training the square root returns infinity, all your training fails
-
fogx over 2 years: note that y_pred and y_true need to be float values -> K.sqrt(K.mean(K.square(K.log(float(y_pred+1)) - K.log(float(y_true+1)))))