A new AI can retrieve recipes from a single still image of a meal.
Have you ever wished you had the recipe for a delectable meal you’re eating, or spotting in a photo, so you could recreate the pleasure later?
Shazam for food will be here soon.
Well, that may soon be possible thanks to an experimental deep learning algorithm.
Pic2Recipe: We Eat With Our Eyes!
That’s the name of MIT’s new algorithm (also called Im2Recipe), which, from a single photo of food, can detect the ingredients present and find the corresponding recipe.
To give their deep learning algorithm something to chew on, CSAIL researchers compiled a large-scale dataset, Recipe1M, containing over 1 million cooking recipes and 800k food photos.
When Pic2Recipe is presented with a photo of a dish, it analyzes the dish’s visual elements and then searches Recipe1M for an existing recipe whose photo most closely resembles it.
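Under the hood, this kind of system typically maps both photos and recipes into a shared vector space and retrieves the recipe whose embedding sits closest to the query image. The snippet below is a minimal sketch of that retrieval step, not MIT’s actual implementation: the learned encoders are replaced with random placeholder vectors, and the `retrieve_recipe` function, toy titles, and dimensions are all made up for illustration.

```python
# Minimal sketch of cross-modal retrieval: embed the query photo and every
# recipe into a shared vector space, then return the closest recipe.
# In a real system like Im2Recipe these embeddings come from neural
# encoders trained on Recipe1M's paired photos and recipes; here they
# are random stand-ins.
import numpy as np


def retrieve_recipe(image_embedding: np.ndarray,
                    recipe_embeddings: np.ndarray,
                    recipe_titles: list[str]) -> str:
    """Return the title of the recipe whose embedding is most similar
    (by cosine similarity) to the query image embedding."""
    # Normalize so the dot product equals cosine similarity.
    img = image_embedding / np.linalg.norm(image_embedding)
    recipes = recipe_embeddings / np.linalg.norm(
        recipe_embeddings, axis=1, keepdims=True)
    scores = recipes @ img           # one similarity score per recipe
    best = int(np.argmax(scores))    # index of the closest recipe
    return recipe_titles[best]


# Toy usage with random vectors standing in for learned embeddings.
rng = np.random.default_rng(0)
image_vec = rng.normal(size=128)
recipe_vecs = rng.normal(size=(1000, 128))
titles = [f"recipe_{i}" for i in range(1000)]
print(retrieve_recipe(image_vec, recipe_vecs, titles))
```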
Im2Recipe is still at the experimental stage, but according to MIT’s CSAIL, which trained the neural networks, it retrieved the correct recipe about 65% of the time.
The algorithm was more successful at identifying pastries and baked goods simply because they make up the largest share of the dataset. As more food photos and recipes are added to the collection, the algorithm should become more accurate over time.
You can upload your own food pictures and give Pic2Recipe a try here!
Foodography: Food as a Social Currency
Food pictures are among the most posted and shared content types on social media, and the web is replete with millions of photogenic dishes.
On Instagram alone, there are hundreds of millions of food-related photos.
And there’s a reason for that.
Food pictures combine two universal art forms: photography and cooking. While most people don’t master these arts, they sure enjoy them. Together, they create an experience that pleases the eyes and titillates the taste buds.
The food Instagram trend gained steam as food tourism grew, perhaps itself a beneficiary of the Instagram catalyst. This mutually beneficial relationship has a “chicken or the egg” quality to it. The average traveler these days immortalizes food-centric trips with photos of exotic dishes from around the world.
However, computers are mostly blind to these food photos because they aren’t gathered into reference datasets that models can learn from. Without labeled data at their disposal, deep learning systems can’t make useful predictions.
It’s a lot like blind tasting for human sommeliers. A master sommelier can tell you everything about a wine, down to its age, quality level, and producer, but only if it comes from the classic set of wines they have studied. For instance, most sommeliers can pick out a Burgundy Pinot Noir in a blind tasting.
Yet present a wine that isn’t studied as a classic style, such as a Merlot from Michigan, and it would stump the greatest taster in the world.
That’s why food libraries like MIT’s Recipe1M database come in handy for deep learning systems. Not only do AI algorithms gain access to data for making predictions, they can also gather data that offers valuable insights into a person’s life.
Getting data about people’s eating preferences and health habits could make food photos and recipe sharing more useful (and monetizable?) than just something to scroll past on a screen.