An Intuitive Explanation of Dropout

Well, let us consider the following network…

A Simple Neural Network Example

We could think of the input layer (in green) as the question the host is asking, each neuron in the hidden layer (in blue) as one person from the audience, and the output layer (in red) as the chosen answer from one selected audience member.

If the output layer finds out that a specific neuron is always giving the best answer, it may neglect the others and give all the weight to this neuron.

Based on our previous analysis, we choose to forbid some neurons from answering and give others a chance.

This way we will achieve balance and force all neurons to learn.

This is the concept of dropout, and technically it works as follows:

1. We assign a dropout rate, which represents the percentage of neurons to drop (e.g., 20% of neurons).
2. At each training stage, we remove random neurons according to the predefined percentage.
3. We calculate the final output from the combination of results of the remaining neurons.
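To make these steps concrete, here is a minimal NumPy sketch of the masking step. This shows the "inverted" dropout variant, which rescales the kept neurons so the expected output stays the same; the function and variable names are purely illustrative:

import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate=0.2, training=True):
    # At inference time, dropout is disabled and all neurons participate.
    if not training:
        return activations
    # Each neuron is kept with probability (1 - rate)...
    keep_mask = rng.random(activations.shape) >= rate
    # ...and the survivors are rescaled by 1 / (1 - rate) so that the
    # expected sum of activations is unchanged.
    return activations * keep_mask / (1.0 - rate)

hidden = np.array([0.5, 1.2, -0.3, 0.8, 2.0])
print(dropout(hidden, rate=0.2))  # roughly 20% of values zeroed, the rest scaled up

Because dropout is disabled at inference time, every neuron gets to vote on the final answer.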

With this technique, all neurons get the chance to vote and are obliged to answer correctly to decrease the model loss.

This is an example of a neural network before and after dropout.

Image source: Deep Learning for Computer Vision

Dropout in Keras

Applying dropout in Keras is simpler than you think.

All you have to do is import and create a new Dropout layer object, then add it in the relevant place within your network structure.

from keras import models, layers

model = models.Sequential()
model.add(layers.Dense(32))     # fully connected hidden layer
model.add(layers.Dropout(0.5))  # drop 50% of this layer's outputs during training
model.add(layers.Dense(1))      # output layer

Normally, a dropout layer is added after a Dense layer in a fully connected network, just before the output, with a dropout rate of 0.5 (50%). Some recent approaches apply dropout after the activation function of a convolutional or recurrent layer, with a dropout rate of 0.2 (20%).
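As an illustrative sketch of that placement (the architecture below is made up for demonstration, not taken from any specific paper), a Dropout layer with rate 0.2 can follow a convolutional layer's activation:

from keras import models, layers

model = models.Sequential()
model.add(layers.Input(shape=(28, 28, 1)))               # e.g. 28x28 grayscale images
model.add(layers.Conv2D(32, (3, 3), activation='relu'))  # convolution + activation
model.add(layers.Dropout(0.2))                           # drop 20% of the feature activations
model.add(layers.Flatten())
model.add(layers.Dense(1))

For recurrent layers, Keras also exposes a built-in dropout argument, e.g. layers.LSTM(32, dropout=0.2).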

Final Thoughts

In this article I introduced dropout, an interesting way to mitigate overfitting while training.

Even though the concept behind dropout is very simple, it leads to great improvements when training your model.

I tried to keep the explanation as simple as possible.

So, if you have any questions, please feel free to leave a comment below.

