OpenAI GPT-2 writes alternate endings for Game of Thrones

Chintan Trivedi · May 22

I trained the GPT-2 language model on GRRM’s book series “A Song of Ice and Fire” and let it complete the HBO show’s storyline.

Can it do better than HBO’s season 8 train-wreck?

Game of Thrones season 8’s storyline has left the fandom divided, with millions of fans (including myself) disappointed by its rushed and unsatisfactory ending.

Many people are calling for a rewrite of the last season for the TV show while others are coming up with their own version of the ending to get a satisfactory closure.

The rest are waiting for GRRM to finish his books, hoping he will provide a more compelling build-up to the characters’ unconvincing actions, but this might take many years.

Thus, I wanted to let an AI learn GRRM’s writing style and make it finish the ending of the show.

In this article, I’ll explain how we can achieve this using the text generation language model called GPT-2, introduced by researchers at OpenAI in early 2019.

Then, I’ll share the model’s predicted outcome for the storylines of three major villains of this season: the Night King, Cersei Lannister and Daenerys Targaryen.

Generative Pre-Trained Transformer 2 (GPT-2)

GPT-2 is the current state-of-the-art text generation model in AI language modeling.

It is capable of producing human-like coherent sentences and is also able to maintain focus on a topic for longer periods.

Its impressive improvement over other language models comes down to two main contributions of this research:

Lots of data: 8 million webpages were scraped and curated to form a 40GB corpus of text to perform unsupervised training on.

It covered a good variety of topics, which is why the pre-trained model is really good for transfer learning to specific domains, like GoT books in our case.

Lots of compute: It uses 1.5 billion(!!) parameters in its Transformer-based network architecture.

However, they have only released a smaller version of this model containing “only” 345 million parameters, citing safety reasons.

We shall use this version of the pre-trained model for performing transfer learning for GoT scripts.
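If you want to follow along, fetching that released checkpoint takes only a couple of lines. Here’s a minimal sketch assuming the open-source gpt-2-simple library, one convenient wrapper around OpenAI’s released checkpoints (not necessarily the exact tooling behind this project):

```python
# Minimal sketch, assuming the gpt-2-simple wrapper (pip install gpt-2-simple)
# around OpenAI's released checkpoints.
import gpt_2_simple as gpt2

# Downloads the released 345M-parameter checkpoint into ./models/345M
gpt2.download_gpt2(model_name="345M")
```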

The Transformer Model

GPT-2 uses a Transformer network architecture instead of the traditional RNN/LSTM/GRU networks that have usually been used for sequence modeling.

Instead of recurrent units with “memory” gates and timesteps, the Transformer uses an attention-based mechanism with an encoder/decoder pair.

The input is processed by the encoder with the help of byte-pair and positional embeddings, which describe the “what” and the “where” of the words in the input sentence.

The encoder converts these embeddings into a vector encoding, which then goes to the decoder, tasked with converting the encoding into the target text sequence.

This output and the encoding from the previous step are fed into the next encoding-decoding step, and this is repeated multiple times to produce further text.

Here’s an excellent article explaining this model architecture in detail.
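To make the attention idea concrete, here is a minimal NumPy sketch of the scaled dot-product attention from the paper. It is illustrative only and leaves out the multiple heads, masking, and learned projection matrices of the full model:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # relevance of each key to each query
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted average of the values

# Toy example: 4 tokens with 8-dimensional embeddings
Q = K = V = np.random.randn(4, 8)
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```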

Overview of a general attention-based Transformer model. [source: “Attention Is All You Need” by Vaswani et al.]

Note here that GPT-2 is trained to predict the next word in the given input sentence, with the assumption that in order to predict the next word fairly accurately, the model is forced to learn the contextual meaning of all the words it has encountered so far.

Without an understanding of the text, a language model may only be able to predict grammatically correct words but the generated text would not make complete sense.

Thus, through this formulation of making the network predict a sufficiently good next word, GPT-2 is able to gain a deep understanding of the input text.

Here’s a very good explanation of this concept by Ilya Sutskever.
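Put differently, training just minimizes the negative log-likelihood of the true next word. A toy calculation with made-up probabilities shows the objective:

```python
import numpy as np

# Made-up probabilities the model assigns to the TRUE next word at
# three positions of some sentence; these numbers are purely illustrative.
p_true_next = np.array([0.40, 0.65, 0.90])

# The language-modeling loss is the mean negative log-likelihood.
# Pushing it down forces the model to exploit context, not just grammar.
loss = -np.log(p_true_next).mean()
print(round(loss, 2))  # 0.48
```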

GPT-2 does not need the encoder during text generation as it is not a seq2seq conversion task like language translation.

So once the training is over, the encoder is no longer necessary.

GPT-2 only uses the decoder from the pre-trained Transformer to generate subsequent text.
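Generation is then a plain autoregressive loop over that decoder: predict a distribution for the next token, sample one, append it, and repeat. Here is a toy sketch; the `next_token_logits` stub is a hypothetical stand-in for the real GPT-2 forward pass:

```python
import numpy as np

vocab = ["winter", "is", "coming", "."]

def next_token_logits(tokens):
    # Hypothetical stand-in: a real GPT-2 would run its Transformer
    # decoder over all tokens so far and return one logit per vocab word.
    rng = np.random.default_rng(len(tokens))
    return rng.normal(size=len(vocab))

def generate(prompt_tokens, steps=5, temperature=0.8):
    tokens = list(prompt_tokens)
    for _ in range(steps):
        logits = next_token_logits(tokens) / temperature
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        # Sample the next token, append it, and feed the longer sequence back in
        tokens.append(vocab[np.random.choice(len(vocab), p=probs)])
    return " ".join(tokens)

print(generate(["winter"]))
```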

Training the model

I took the text from all 5 books of George R. R. Martin’s book series “A Song of Ice and Fire” and concatenated them all to prepare the training text.

The pre-trained GPT-2 model with 345M parameters is used to perform transfer learning on this text.

George R. R. Martin’s Book Series “A Song of Ice and Fire”.

Surprisingly, I noticed the pre-trained model is also pretty good at writing about Game of Thrones characters, which means it was exposed to this content during its initial training on 8M webpages.

Anyways, I decided to narrow the scope of this “generalized” model to only write in the style of GRRM, so I continued with the training on our GoT dataset.

The model was trained for ~12 hours on Google Colab’s K80 GPU.
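For reference, the fine-tuning step itself fits in a few lines with gpt-2-simple; the file name and step count below are illustrative placeholders, not my exact settings:

```python
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()

# Fine-tune the released 345M checkpoint on the concatenated books.
# "asoiaf.txt" and steps=1000 are illustrative, not the exact settings used.
gpt2.finetune(sess,
              dataset="asoiaf.txt",
              model_name="345M",
              steps=1000,
              run_name="got")
```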

Results

Let’s try to address some of the disappointing endings or twists in the storyline and predict alternate endings to these scenes using our trained GPT-2 model.

Below are the results in video format, followed by the same in text format.
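Each scene description below was given to the model as a prefix to continue, roughly like this sketch (again assuming gpt-2-simple, with illustrative sampling settings):

```python
import gpt_2_simple as gpt2

sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, run_name="got")  # load the fine-tuned weights

# Condition on a scene description and let the model write the ending;
# the prefix, length, and temperature here are illustrative choices.
text = gpt2.generate(sess,
                     run_name="got",
                     prefix="The Night King slowly walked up to the three-eyed Raven.",
                     length=200,
                     temperature=0.8,
                     return_as_list=True)[0]
print(text)
```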

Choose your own adventure.

Disappointment 1: Night King’s anticlimactic ending

Input: The Night King slowly walked up to the three-eyed Raven.

Jon was held up by the Ice Dragon and Daenerys was surrounded by Wights.

The Night King reaches for his ice sword to kill Bran, but suddenly

Output: Hodor stops him, and he’s holding the direwolf still with his big hands.

“HODOR!” the raven cried.

“HODOR!.HODOR!” The black lord wore a lost look on his face.

He thrust his stump through Bran’s back and began fighting, rolling, spinning.

Wights, giants, halflings, straw men.

Bran fought with Summer and the Eagle and Blockhead Dog and the Stranger, but the Lord of Light would never shield a wight against a sword.

The raven was saying that just climbing up Hodor’s thick snow-white legs.

According to GPT-2, Hodor is still alive and comes to rescue Bran, what a twist! Or it is probably the undead body of Hodor, warged into by Bran.

But after that I have no idea what the model is trying to say, haha!

Input: The Night King slowly walks up to the three-eyed Raven.

Jon is held up by the Ice dragon and Daenerys is surrounded by Wights.

The Night King reaches for his ice sword to kill Bran, but suddenly something unexpected happens.

Output: The sword glows red hot.

When the blade strikes the Night’s Eye, the green glass crackling all around it splits, and the light dissolves, and the sword glows like dead ice.

Then the ravens begin to scream, the distant owls to the left and right of the dragon’s, and Something is returning fire, crashing through the doors and slaughtering all the babe-faced fools.

Osha is showing Rickon how to reload, I imagine.

Maybe they’ll all be good long men to talk to after the battle.

Even though the showrunners forgot, GPT-2 remembers the Azor Ahai prophecy and dances around the idea of Lightbringer, the sword destined to kill the Night King.

How cool would it have been if the Night King’s ice sword had somehow turned out to be Lightbringer and ultimately killed him?!

Input: Bran says, “Theon, you’re a good man. Thank you.” Theon, with his redemption arc complete, runs towards the Night King to die.

Just as he is about to pierce the Night King with his spear,

Output: Dragon — or was it the Other? — loosed a blast of fire at his face, whilst Night King looked on calmly.

The world shrank to three men, standing shoulder to shoulder; the red priest and his spear, and the king clad in blue, his crown upon his head.

A breath of wind blew through the hard cold air into Theon’s head, and he could hear his brothers shouting as well.

If I had not been so strong . . . Kill him, he wanted to shout, but the courage was gone from him then, like a part of him that had been forced to silence.

Instead he flexed his crippled arm, throwing loose the shaft of his spear.

Save yourself some damage, he told himself, and run as best you can.

This one blew my mind, not because it is brilliant writing (it’s not) but because the model knows that the dragon was actually an “Other” (the books’ term for the White Walkers).

How in the world would it know that Viserion is an undead dragon controlled by the Night King? Viserion breathes fire at Theon and breaks his spear.

Theon, in this moment, decides to run away in fear like he has done before on this show.

It’s too much of a coincidence to get all these details right in a single paragraph.

Truly astonishing that the model has learned these details about the characters!

Disappointment 2: Cersei Lannister going down easily without any plan or proper fight

Input: The dragon has burned down the whole of the Lannister army, the Iron Fleet and the Golden Company, but Cersei remains confident that the Red Keep will not fall.

As the dragon is about to burn down the towers of the Red Keep, Jaime comes to rescue Cersei.

At this moment,

Output: his heart is in his throat, his hand about his throat … he does not know how to breathe …” Arya shivered. “Tyrion said the Lannisters had knives.” “He said they had swords,” Jaime admitted. “Jaime did not kill Tommen to save Tommen’s virginity,” Arya said, giving his side of the story to the blushing boy beside her, Tyrion Lannister. His little brother just grinned. “No. I didn’t.” Jaime gave her a hard look. “Very well. You lie. True, you lied about that kiss. It was neither kiss nor promise you murdered Tommen with your mouth. Only a sword or two, that. You did cut her pretty good, little man. Anyhow your sister will keep that dirty boy for her bed, so long as he cherishes the gift you gave him.”

Umm, okay, this one is complete garbage.

But I kept it anyway because it’s so stupid it’s actually funny.

Anyways, let’s try again!

Input: The dragon has burned down the whole of the Lannister army, the Iron Fleet and the Golden Company, but Cersei remains confident that the Red Keep will not fall.

As the dragon is about to burn down the towers of the Red Keep, Jaime comes to rescue Cersei.

At this moment,

Output: according to her prophecies, the Mad King will attack the city … “Cersei,” Jaime said again, trying to sound stern and distant, “silently, oh, sweet, oh, sweet.” To Trotter he wrote, “Cersei, remember, you are the king in Casterly Rock. Remember who you are, what you are. Remember how you slew the Mad King. Forgive me if I make you balk.” She made no reply. “Forgive me,” he repeated fervidly, “for the sake of the mother I never had, the mother you never had, the mother that all of them knew and loved but did not know.” The eagle shimmered on ripples of oil as the blade was sheathed, and the words glistened on the ceiling, shiny and dark …

Slightly better this time! Although the model gets certain details wrong, specifically about the Mad King, I do like the idea of Jaime trying to give strength to Cersei in that moment.

Perhaps using some sort of wildfire at this moment would have given some sense of justice to her storyline.

Two mad queens, fighting fire with fire, burning down the rest of the city.

That would have been a better ending, signifying why the war for the Iron Throne is bad.

Disappointment 3: Daenerys makes an unearned heel turn

Input: Daenerys had love in Essos and Meereen alongside fear.

But in Westeros, she felt she only had fear.

The moment after she has won the war for the Iron Throne, she could only think about instilling fear in the minds of the people she was going to rule.

Daenerys, who was sitting atop her dragon in King’s Landing, decides she wants to

Output: eat.

“Food with teeth,” Ser Jorah said.

“I’m like to taste that,” Dany complained.

“It won’t be very Dothraki to eat browned meat and drink ale.” She was remembering the old times, when her khalasar had been King’s Quarters during the Long Lances, and the great fat khalasars of the Dothraki would fill their cups with flowing wine and fish course and spit-roasted turnips before they rode south to wipe out the other Dothraki.

This is only food for thought.

I will eat and drink and feign rest, until the dragons come again, to kill us all.

Always a good time to eat and drink, right? I heard there’s a good coffee shop in King’s Landing that Dany could try (if it hasn’t been burned down already).

This is bad, let’s give it another shot.

Input: Daenerys had love in Essos and Meereen alongside fear.

But in Westeros, she felt she only had fear.

The moment after she has won the war for the Iron Throne, she could only think about instilling fear in the minds of the people she was going to rule.

Daenerys, who was sitting atop her dragon in King’s Landing, decides she wants to

Output: protect the city and has Targaryen scourges and horrors ready for anyone who might try to flee, yes that includes humans.

Someone tells her that some giant might be able to slay her, but Daenerys shakes her head and says, “No giant is like to grow giant-sized. Later, later, no one wants to die alone in the woods.” She reads from the letter again. “Prince Quentyn, my loyal servants, I have seen the foolish fool you were, but I have never seen the foolish fool you were. Your life has been taken, in full view of all we. And Daenerys is the only one who can save you.”

So this doesn’t make much sense either, but it seems to stick to the narrative that Dany wants to liberate the people of King’s Landing from the tyrants who are ready to “bend the knee” and are not trying to flee.

Conclusion

I tried to generate more samples for these storylines, but none make a lot of sense in the context of the entire show.

Even though its understanding of the storyline is not very good, the model clearly demonstrates the ability to learn about the show’s characters and certain of their traits, and also to create associations between these characters in its output.

For example, I see Ser Jorah come up in the storyline when talking about Dany, and Hodor come up when talking about Bran.

And it often gets the relationship between these characters right, which is really impressive!

I think this shows how far NLP research has advanced with OpenAI’s GPT-2 research work.

I would be curious to know how much better the biggest GPT-2 model performs here at this task.

Perhaps if the larger model is ever released, I might come back and update this article.

As for wanting a satisfactory and sensible Game of Thrones ending, I guess the current generation of AI cannot help us and we will have to wait for George R. R. Martin to finish his books.

Long may he live!

Want to give the Game of Thrones model of GPT-2 a try in your browser? The Google Colab notebook for testing the pre-trained model is linked below (credits for this notebook).

Google Colaboratory: Game of Thrones GPT-2 text generator (colab.research.google.com)

References

How to Build OpenAI’s GPT-2: “The AI That’s Too Dangerous to Release”
How to code The Transformer in PyTorch
Transformer: Attention is all you need
Play with GPT-2 on your browser

Thank you for reading.

If you liked this article, you may follow more of my work on Medium, GitHub, or subscribe to my YouTube channel.

Do it.

Bend the knee.

Or else face my dragons…