Transfer Learning for Image Classification using Keras

This is exactly what we want, since we're going to train our own brand-new FC layers for transfer learning.

Flowing data

Now we'll need to create a data generator to get our data from our folders into Keras in an automated way. Keras provides convenient Python generator functions for this purpose. We define our training directory, called "food_dataset", which contains the folders for each class of images we set up before. We also define the image dimensions and batch size; the Keras generator will automatically resize all loaded images to target_size using bilinear interpolation. We'll add some data augmentation to our generator, flips and rotations, to try to boost our model's accuracy. We create our final generator using the flow_from_directory function, which uses a queue to maintain a continuous flow of loading and preparing our images!

Popping layers

It's time to set up our final model for transfer learning. You may find it convenient to write a function for this if you're going to use it again in the future; there's one you can start with down below.

We start by freezing all of the base model's layers. We don't want to train those layers, since we're trying to leverage the knowledge the network learned from its previous dataset (in this case ImageNet). By setting layer.trainable = False, we tell Keras not to update those weights during training, which is exactly what we want!

Now we can add on our FC layers. We do this in a loop, since many networks have multiple FC layers and looping through a list keeps things clean and easy. We'll also add some dropout to each FC layer to reduce the chances of overfitting (this part is optional). At the end, we tack on the final Softmax layer and build the Keras model.

Once our function is all set up, it's a simple one-liner to create our final model. Just pass the number of classes for the softmax layer, the sizes of the FC layers in a list (since we loop over it), and the dropout probability.

Train it

The final step is to set up our training and hit the big red button to make it all run! We'll use the Adam optimiser with a small value for the learning rate.
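The data-generator setup described above might look like the sketch below. The class names ("pizza", "sushi"), image counts, and augmentation values are illustrative assumptions, and the snippet synthesizes a few random images so it runs end to end; in practice you would point TRAIN_DIR at your real food_dataset folder.

```python
import os

import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator, save_img

# Stand-in dataset so the snippet runs end to end: two hypothetical classes
# with a few random images each. In practice, TRAIN_DIR is your real
# "food_dataset" folder, with one subfolder per class.
TRAIN_DIR = "food_dataset"
for cls in ("pizza", "sushi"):
    os.makedirs(os.path.join(TRAIN_DIR, cls), exist_ok=True)
    for i in range(4):
        save_img(os.path.join(TRAIN_DIR, cls, f"img_{i}.png"),
                 np.random.randint(0, 256, (64, 64, 3)).astype("uint8"))

IMG_SIZE = (224, 224)  # loaded images are resized to this target_size
BATCH_SIZE = 8

# Augmentation: horizontal flips and small rotations, as described above
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    horizontal_flip=True,
    rotation_range=20,
)

# flow_from_directory infers the class labels from the subfolder names
train_generator = train_datagen.flow_from_directory(
    TRAIN_DIR,
    target_size=IMG_SIZE,
    batch_size=BATCH_SIZE,
    class_mode="categorical",
)
```

Each batch drawn from train_generator is a pair of arrays: resized, rescaled images of shape (batch, 224, 224, 3) and one-hot labels with one column per class folder.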
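The model-building function described above could be sketched as follows. The function name, the MobileNetV2 base, and the FC sizes and dropout probability in the final one-liner are all illustrative assumptions, not the article's exact choices.

```python
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Dense, Dropout, GlobalAveragePooling2D
from tensorflow.keras.models import Model

def build_transfer_model(base_model, num_classes, fc_sizes, dropout_prob):
    """Freeze the base network, then stack new FC layers and a softmax head."""
    # Freeze all base layers so the ImageNet weights are not updated
    for layer in base_model.layers:
        layer.trainable = False

    # Collapse the spatial feature maps before the fully connected head
    x = GlobalAveragePooling2D()(base_model.output)

    # Loop over the list of FC sizes, adding dropout after each layer
    for size in fc_sizes:
        x = Dense(size, activation="relu")(x)
        x = Dropout(dropout_prob)(x)

    # Tack on the final softmax classification layer
    outputs = Dense(num_classes, activation="softmax")(x)
    return Model(inputs=base_model.input, outputs=outputs)

# The one-liner: pass the class count, FC sizes (as a list), and dropout
# probability. Five classes here is purely an example value.
base = MobileNetV2(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
model = build_transfer_model(base, num_classes=5, fc_sizes=[1024, 1024], dropout_prob=0.5)
```

Note that include_top=False loads the base network without its original FC layers, which is what lets us bolt our own head on top.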
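The training setup might look like the sketch below, assuming a frozen ImageNet base with a new softmax head. The learning rate, epoch count, and class count are illustrative, and random tensors stand in for the real data generator so the snippet is self-contained.

```python
import numpy as np
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

# A minimal frozen-base model standing in for the one built above
base = MobileNetV2(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False
x = GlobalAveragePooling2D()(base.output)
outputs = Dense(5, activation="softmax")(x)  # 5 classes, purely illustrative
model = Model(base.input, outputs)

# Adam with a small learning rate, as described above
model.compile(
    optimizer=Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)

# Random tensors stand in for the flow_from_directory generator here;
# with real data you would call model.fit(train_generator, epochs=...)
x_train = np.random.rand(8, 224, 224, 3).astype("float32")
y_train = np.eye(5)[np.random.randint(0, 5, size=8)]
history = model.fit(x_train, y_train, epochs=1, batch_size=4, verbose=0)
```

A small learning rate matters here because the new FC layers start from random weights; large updates early on can swamp whatever signal the frozen features provide.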
