Generating Novel Cocktail Recipes with a Specific Style through Recurrent Neural Networks

Daniel Bojar, Jan 2

Deep learning is all the rage now.

It’s used in everything from diagnostics via pathology image classification to speech recognition used in Google Assistant or Amazon Echo.

So it seems useful to at least pick it up as a skill to implement into your workflow.

And as with every new skill you learn, it can be quite challenging to get started and to stay motivated along the learning curve.

So what better way to motivate yourself than to choose cocktails as a subject of inquiry, where you may reward yourself with a delicious beverage upon the successful completion of your project?

To make it brief: I planned to build a program that can generate new cocktail recipes from scratch using deep learning in Python.

For this, I chose recurrent neural networks (RNNs), a form of deep learning often used in natural language processing (NLP).

More specifically, I used long short-term memory units (LSTMs) as units of the RNN to create an LSTM network.

As their name suggests, LSTMs (and RNNs in general) have the advantage of memory, which is particularly useful if you want to generate coherent text one character or word at a time and don’t want to end up with a garbled mess.

Here you can find an excellent introduction to RNNs for NLP, but briefly, here’s what happens in LSTMs: a single LSTM consists of a memory-containing cell, an input gate, an output gate and a forget gate.

Simplistically speaking, the input and forget gates determine how much of the incoming signal passes through to the output gate, and the activation function of the gates is usually a logistic function.
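To make the gate mechanics concrete, here is a minimal NumPy sketch of a single LSTM step; the weight matrices and their grouping into dictionaries are hypothetical stand-ins for illustration, not the parameters the network will actually learn later in this article.

```python
# A minimal sketch of one LSTM step; W, U, b are hypothetical
# per-gate parameters, not the weights learned by the real network.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))  # the logistic gate activation

def lstm_step(x, h_prev, c_prev, W, U, b):
    """x = current input, h_prev/c_prev = previous hidden and cell state."""
    i = sigmoid(W['i'] @ x + U['i'] @ h_prev + b['i'])   # input gate
    f = sigmoid(W['f'] @ x + U['f'] @ h_prev + b['f'])   # forget gate
    o = sigmoid(W['o'] @ x + U['o'] @ h_prev + b['o'])   # output gate
    g = np.tanh(W['g'] @ x + U['g'] @ h_prev + b['g'])   # candidate memory
    c = f * c_prev + i * g   # forget part of the old memory, admit new input
    h = o * np.tanh(c)       # the output gate decides what leaves the cell
    return h, c
```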

What we effectively learn from our input data are the weights of the connections that influence the activity of these gates, and hopefully we generate awesome cocktails in the process!

As with all such projects, data comes first.

To generate cocktail recipes, you first need cocktail recipes, so that your RNN can learn the correct embeddings (the ‘company a word keeps’, according to Firth) of ingredients, quantities and so on.

Of course you could, in theory, choose any cocktail recipes you come across for this but I decided to restrict myself to recipes from one specific bar to emulate their style with this AI algorithm.

More specifically, I chose the famous bar Death & Co in New York City.

Conveniently, I own a physical copy of their highly recommended book ‘Death & Co: Modern Classic Cocktails, with More than 500 Recipes’, which I used to (manually) enter the ~500 recipes into a text file, one cocktail recipe per line (because what else do you do during the holidays?).

Importantly, I standardized the recipes while transcribing them.

For us, the differences and commonalities of a Smith & Cross Rum or a Santa Teresa 1796 Rum may be obvious, especially in taste.

Yet too many variants of something I simplistically labeled as ‘brown rum’ may rob the RNN of the opportunity to effectively learn the link between brown rum and, say, lime juice.

I stripped brand names, clustered ingredients into classes and converted ounces, dashes and drops uniformly into milliliters.

This of course also means that we may have to subsequently optimize generated recipes by trying out different versions of the category ‘brown rum’.
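For illustration, here is a sketch of the kind of normalization this involves; the conversion factors for dashes and drops and the brand-to-category map are rough assumptions, not the exact rules I used for the dataset.

```python
# Hypothetical normalization rules; the dash/drop volumes and the
# brand map are rough assumptions, not the exact dataset rules.
TO_ML = {'oz': 30.0, 'dash': 0.9, 'drop': 0.05, 'ml': 1.0}

BRAND_TO_CATEGORY = {
    'Smith & Cross Rum': 'brown rum',
    'Santa Teresa 1796 Rum': 'brown rum',
}

def normalize_ingredient(amount, unit, name):
    """Convert a quantity to milliliters and collapse brands into categories."""
    ml = amount * TO_ML[unit]
    category = BRAND_TO_CATEGORY.get(name, name)
    return f'{ml:g} ml {category}'

print(normalize_ingredient(1.5, 'oz', 'Smith & Cross Rum'))  # 45 ml brown rum
```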

Now that we have the data, we can get started with deep learning! One great resource for this is Colaboratory, a free Jupyter notebook environment offered by Google.

In addition to being able to link your notebook to your Google Drive files, you can use a free Tesla K80 GPU for your computations (much faster for neural networks than a laptop CPU).
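If you want to verify that the GPU is actually available before training, a quick check in the notebook looks like this:

```python
# Prints something like '/device:GPU:0' if the Colab runtime has a GPU
# attached (Runtime -> Change runtime type -> GPU), an empty string otherwise.
import tensorflow as tf
print(tf.test.gpu_device_name())
```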

For text generation with RNNs, textgenrnn is a highly effective tool based on Keras/TensorFlow.

Before we start training, we first need to set the hyperparameters that define the architecture of our model.

To build the model, I stacked five LSTM layers, each with 128 memory cells containing the gates described above.

Additionally, I decided to use bidirectional LSTMs which take characters before as well as after the character of interest into account to strengthen the context-awareness of the RNN.

After a semi-successful try with words as features, I switched to an RNN version operating on characters (more suitable for textgenrnn, as it’s based on char-rnn by Andrej Karpathy).

Another important consideration for the eventual text generation is how many context characters the RNN should consider before generating the next character; I set this to 40.
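Putting these hyperparameters together, the training call looks roughly like this; the file name is a placeholder and the keyword arguments follow textgenrnn’s config options as I understand them, so treat this as a sketch rather than my exact notebook.

```python
# A sketch of the training setup with textgenrnn; the file name is a
# placeholder, and the keywords follow textgenrnn's documented config.
from textgenrnn import textgenrnn

textgen = textgenrnn()
textgen.train_from_file(
    'death_and_co_recipes.txt',  # one recipe per line (placeholder name)
    new_model=True,              # train a fresh model instead of fine-tuning
    rnn_layers=5,                # five stacked LSTM layers
    rnn_size=128,                # 128 memory cells per layer
    rnn_bidirectional=True,      # read context before and after each character
    dim_embeddings=100,          # 100-dimensional character embeddings
    max_length=40,               # 40 context characters per prediction
    word_level=False,            # character-level, as discussed above
    num_epochs=40,               # 40 full passes through the data
)
```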

Watching the RNN grow & learn how to construct cocktail recipes

Now, for each epoch (a full pass through the training data), the 40 context characters of every character are converted into their 100-dimensional embeddings and fed through the five LSTM layers.

All five layers are also connected to an attention layer at the end.

Loosely based on human attention, attention layers focus the network on certain regions of the surrounding characters while downplaying others and then shift their attention repeatedly.
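Conceptually, an attention-weighted average over the per-character hidden states looks like the NumPy sketch below; the scoring vector stands in for a learned parameter and this is not textgenrnn’s exact implementation.

```python
# A conceptual sketch of attention as a weighted average over hidden
# states; score_vector is a hypothetical learned parameter.
import numpy as np

def attention_average(hidden_states, score_vector):
    """hidden_states: (timesteps, features); returns one context vector."""
    scores = hidden_states @ score_vector            # one relevance score per character
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over timesteps
    return weights @ hidden_states                   # weighted average of states
```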

The final output of the RNN is then a set of predicted probabilities for the next character.

Here, I trained my RNN for 40 epochs, meaning 40 full passes through the training data.

In the beginning, the RNN is not able to construct coherent words as seen in the examples.

Yet at every step of the training process, clear improvements in the cocktail recipe generation can be seen, and at the end most words and usages seem correct.

The largest problem is with cocktail names, as the input names are mostly fictional and occur only once or twice (with the exception of suffixes such as ‘… Daiquiri’ or ‘… Julep’).

Now the fun starts! After training for 40 epochs (which took about 20 minutes on a GPU), the model is ready for deployment.

One detail that still has to be mentioned is temperature.

Here, temperature is used as a stand-in for ‘creativity’ (the hotter, the more creative) and determines how closely the model’s creations will stick to the input text.
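Under the hood, temperature simply rescales the predicted probabilities before sampling; here is a minimal sketch, where `probs` stands in for the softmax output of the trained network.

```python
# Minimal temperature sampling sketch; `probs` stands in for the
# model's predicted probability distribution over the next character.
import numpy as np

def sample_with_temperature(probs, temperature=1.0):
    """Low temperature sharpens the distribution (safe, repetitive);
    high temperature flattens it (creative, riskier)."""
    logits = np.log(probs) / temperature
    scaled = np.exp(logits) / np.exp(logits).sum()
    return np.random.choice(len(scaled), p=scaled)
```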

We can now take the model and, with a given temperature and a maximum length of the created recipes, generate an unlimited number of new cocktail recipes inspired by the Death & Co style.
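With the trained model from the sketch above, generation could look roughly like this; the argument names follow textgenrnn’s generate() as I understand it.

```python
# Generate a few recipes at each of several temperatures; argument
# names follow textgenrnn's generate() as I understand it.
for temperature in [0.2, 0.5, 1.0]:
    textgen.generate(
        n=3,                 # three recipes per temperature
        temperature=temperature,
        max_gen_length=300,  # cap the length of each recipe
    )
```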

I personally like to have a mix of temperatures in my creations because it gives you a wider spectrum to choose from. And you do have to choose, as a few recipes won’t make sense.

Examples include the demand for ‘176 ml of cinnamon syrup’ or ingredients from a different universe such as ’15 ml of laro werry’.

In general though, this truly is a great way to receive inspiration for new cocktail combinations and, by combining human and AI, the created recipes can be further tweaked to perfection.

If you want to generate cocktail recipes yourself you can download the model weights, vocabulary and configuration for the described RNN here.
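Loading them back into textgenrnn should look roughly like this; the three file names are placeholders for whatever the download contains.

```python
# Load the shared weights, vocabulary and config into textgenrnn;
# the file names here are placeholders for the downloaded files.
from textgenrnn import textgenrnn

textgen = textgenrnn(
    weights_path='cocktail_weights.hdf5',
    vocab_path='cocktail_vocab.json',
    config_path='cocktail_config.json',
)
textgen.generate(5, temperature=0.8)  # five recipes at a moderate temperature
```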

Also, if you are interested in what this cocktailAI generated and how the drinks tasted, head over to my other article, where I try some of these cocktails and delve into flavor combinations thought up by a machine.

Here’s just a brief teaser of one of the more intriguing creations, the abstrusely named Pon Cong, for which I filled in the brand of the gin and the detailed instructions.

Cheers!

A cocktail dreamt up by a machine

Pon Cong
1 Strawberry
45 ml (1.5 oz) Wild Burrow Irish Gin
7.5 ml (0.25 oz) Vanilla Syrup
Garnish: 1 strawberry

Instructions: Muddle the strawberry with the gin and the syrup. Carefully strain the mixture into a mixing tin filled with ice cubes and stir. Carefully strain the mixture into a prechilled glass and garnish with a cut strawberry.

Vanilla Syrup: Bring 1 cup of water and 1 cup of sugar to a boil with half a vanilla bean. Leave overnight; strain.
