What Happens When You Let an Artificial Intelligence Make Cooking Recipes?

And what happens when you eat them?

Rick Vink
4 min read · Sep 13, 2020
Edited by me; original image by Daria-Yakovleva on Pixabay

Food and Artificial Intelligence: two topics that are dear to my heart. I love watching Master Chef and wondering how those dishes would taste. On the other hand, I love working on Artificial Intelligence. So why not combine the two?

An artificially generated recipe website

I have set myself the goal of generating a diverse set of new, edible recipes, all created by Artificial Intelligence. But why not go one step further and build a whole website of recipes that you can show off to others?

I think that ten years ago, most people would have thought this idea was doomed to fail. Nowadays, however, it is actually quite doable when you have the know-how, thanks to recent advancements in text generation.

Every recipe page consists of multiple components: a title, categories, ingredients, instructions, and don't forget the comment section. All these components have to relate to each other. A cookie must be classified as a cookie, must contain the ingredients for a cookie, and those ingredients have to be mixed and matched the right way to get a cookie!
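Those components can be sketched as a simple data structure. The field names below are my own illustration of the idea, not the actual schema used on the site:

```python
from dataclasses import dataclass, field

@dataclass
class RecipePage:
    """One recipe page; all fields must stay consistent with each
    other (a cookie recipe needs cookie ingredients and steps)."""
    title: str
    categories: list[str] = field(default_factory=list)
    ingredients: list[str] = field(default_factory=list)
    instructions: list[str] = field(default_factory=list)
    comments: list[str] = field(default_factory=list)

page = RecipePage(
    title="Chocolate Chip Cookies",
    categories=["Dessert", "Cookies"],
    ingredients=["2 cups flour", "1 cup chocolate chips"],
    instructions=["Mix the dry ingredients.", "Bake at 180 C for 12 minutes."],
)
```

The interesting (and hard) part is that the model has to keep all of these fields mutually consistent from the title alone.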

So did it work out? Well check the results for yourself! (And then come back to this article, please)

How to write content

Content all the way

Content is very important in this day and age. Dedicated teams work hard to compose the best articles to push company pages higher in Google's SEO rankings. This requires two different skill sets: content creation and domain knowledge. Without good writing skills, not many people will want to plow through your text. And without substance, people aren't interested in reading it.

Writing content with an Artificial Intelligence

I suspect that most readers of this article have come across the GPT-2 and/or GPT-3 models by OpenAI; these models are famous for writing text. They are altered versions of a so-called transformer network, tailor-made to extract knowledge from the preceding text and use it to predict the next word. They are trained by predicting the next word over large amounts of example text, including Wikipedia articles.
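The training objective, predicting the next word from the previous text, can be illustrated with a toy bigram counter. This is only meant to show the idea; a real transformer conditions on the entire preceding context, not just the last word:

```python
from collections import Counter, defaultdict

def train_bigrams(corpus: str) -> dict:
    """Count, for each word, which words follow it in the training text."""
    words = corpus.lower().split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows

def predict_next(follows: dict, word: str) -> str:
    """Predict the continuation seen most often during training."""
    return follows[word.lower()].most_common(1)[0][0]

model = train_bigrams("mix the flour mix the sugar bake the cookie")
print(predict_next(model, "mix"))  # -> "the"
```

GPT-2 does conceptually the same thing, except the "counting" is replaced by billions of learned parameters over full contexts.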

You can download the trained GPT-2 model and use it yourself. Just think about it: you can download, for free, a model that was trained on a huge amount of text for multiple weeks on multiple cloud servers (probably costing a couple of million Euros/Dollars to train)! This is one aspect of the field of Artificial Intelligence that I find amazing!

Retraining a model

So of course, I wanted to generate food recipes. The fastest way to get a good result is to take an existing pretrained model that already understands language and retrain it on your own dataset. So I downloaded about 20,000 recipes from allrecipes.com and preprocessed them into the following structure:

<title> Title <categories> Category0… CategoryN <Ingredients>Ingredient0… IngredientN <Instructions>Instruction0… InstructionN <Comment> Comment <Rating>
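Assembling that flat, tagged training string from a scraped recipe might look like the sketch below. The dictionary keys are hypothetical; the actual preprocessing may differ:

```python
def recipe_to_training_text(recipe: dict) -> str:
    """Serialize one scraped recipe into the tagged flat-text
    format that the pretrained model is then fine-tuned on."""
    parts = [
        "<title> " + recipe["title"],
        "<categories> " + " ".join(recipe["categories"]),
        "<Ingredients> " + " ".join(recipe["ingredients"]),
        "<Instructions> " + " ".join(recipe["instructions"]),
        "<Comment> " + recipe["comment"],
        "<Rating>",
    ]
    return " ".join(parts)

example = {
    "title": "Sugar Cookies",
    "categories": ["Dessert"],
    "ingredients": ["1 cup sugar", "2 cups flour"],
    "instructions": ["Mix everything.", "Bake until golden."],
    "comment": "Came out great!",
}
text = recipe_to_training_text(example)
```

Once every recipe is a single string like this, the dataset is just ordinary text, and the tags let you split a generated sample back into its components.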

When you have this structure it’s actually just like any other text. I made a YouTube video about retraining the model:

Video about how I retrained the GPT-2 model to generate food recipes

Are the recipes any good? A lot of them are pretty good, actually! I randomly chose one recipe and followed all the steps. This is the result:

What is next? The Good, the Bad and the Fun

This project is most definitely just for fun and for learning. But of course you could also do great things with it! I like to focus on the good things, so let's start with those.

There is so much data out there that it's impossible to know or read it all. Applications of this technique that address this issue include summarization and question answering. Just imagine how great it will be to feed in a couple of research papers and ask the model questions to which it can give proper answers!

But of course, as with most technologies, this one also has its dark side. The GPT-2 model was not released to the public on the day its paper was published, because its creators were afraid it would be used to write fake news. It adds another weapon to the arsenal of people with lower moral standards.

I don’t like to end on a low note, so here is a picture of a puppy.

Image on akc.org

More of me

My website

My own projects on youtube:

Our Machine Learning and Data Science courses:

Meetups about Machine Learning in The Netherlands:
