Advanced Keras — Constructing Complex Custom Losses and Metrics

Eyal Zakkay · Jan 10

TL;DR — In this tutorial I cover a simple trick that will allow you to construct custom loss functions in Keras which can receive arguments other than y_true and y_pred.

Background — Keras Losses and Metrics

When compiling a model in Keras, we supply the compile function with the desired losses and metrics.

For example:

```python
model.compile(loss='mean_squared_error', optimizer='sgd', metrics=['acc'])
```

For readability purposes, I will focus on loss functions from now on.

However, most of what is written applies to metrics as well.

From Keras’ documentation on losses:

You can either pass the name of an existing loss function, or pass a TensorFlow/Theano symbolic function that returns a scalar for each data-point and takes the following two arguments:

- y_true: True labels. TensorFlow/Theano tensor.
- y_pred: Predictions. TensorFlow/Theano tensor of the same shape as y_true.

So if we want to use a common loss function such as MSE or Categorical Cross-entropy, we can easily do so by passing the appropriate name.

The available losses and metrics are listed in Keras’ documentation.

Custom Loss Functions

When we need a loss function (or metric) other than the ones available out of the box, we can construct our own custom function and pass it to model.compile.

For example, constructing a custom metric (from Keras’ documentation):

Loss/Metric Function with Multiple Arguments

You might have noticed that a loss function must accept only two arguments: y_true and y_pred, which are the target tensor and the model output tensor, respectively. But what if we want our loss/metric to depend on tensors other than these two?

To accomplish this, we will use a function closure: we create a loss function (with whatever arguments we like) which returns an inner function of y_true and y_pred.

For example, if we want (for some reason) to create a loss function that adds the mean squared value of all activations in the first layer to the MSE:
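A minimal sketch of such a closure; the wrapper name custom_loss and the indexing model.layers[0].output are illustrative assumptions:

```python
import keras.backend as K

def custom_loss(layer_output):
    # layer_output: symbolic tensor of the first layer's activations,
    # captured by the inner function via closure.
    def loss(y_true, y_pred):
        mse = K.mean(K.square(y_pred - y_true), axis=-1)
        return mse + K.mean(K.square(layer_output))
    # Return a function of (y_true, y_pred), as Keras expects.
    return loss

model.compile(optimizer='sgd',
              loss=custom_loss(model.layers[0].output))
```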

Note that we have created a function (without limiting the number of arguments) that returns a legitimate loss function, which has access to the arguments of its enclosing function.

A More Concrete Example

The previous example was rather a toy example for a not-so-useful use case.

So when would we want to use such loss functions? Let’s say you are designing a Variational Autoencoder.

You want your model to be able to reconstruct its inputs from the encoded latent space.

However, you also want your encoding in the latent space to be (approximately) normally distributed.

The former goal can be achieved by designing a reconstruction loss that depends only on your inputs and desired outputs, y_true and y_pred. For the latter, you will need to design a loss term (for instance, a Kullback-Leibler divergence loss) that operates on the latent tensor.

To give your loss function access to this intermediate tensor, the trick we have just learned can come in handy.

Example use:
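Below is a sketch of how the KL term can be attached through a closure; the tensor names mu and log_var (the latent mean and log-variance produced by the encoder) are illustrative assumptions:

```python
import keras.backend as K

def kl_loss(mu, log_var):
    # mu, log_var: symbolic tensors from the encoder, captured by closure.
    def loss(y_true, y_pred):
        # Per-sample KL divergence between N(mu, exp(log_var)) and a
        # standard normal; y_true and y_pred are ignored on purpose.
        return -0.5 * K.sum(1 + log_var - K.square(mu) - K.exp(log_var),
                            axis=-1)
    return loss

# For a model with a reconstruction output and a latent branch, e.g.:
# model.compile(optimizer='adam',
#               loss=[reconstruction_loss, kl_loss(mu, log_var)])
```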

This example is part of a Sequence-to-Sequence Variational Autoencoder model; for more context and the full code, visit this repo — a Keras implementation of the Sketch-RNN algorithm.

As mentioned before, though the examples here are for loss functions, creating custom metric functions works in the same way.

Keras version at time of writing: 2.2.4

References:
[1] Keras — Losses
[2] Keras — Metrics
[3] GitHub Issue — Passing additional arguments to objective function
