From a picture, the MIT algorithm that reveals what and how we eat

We still don’t know whether the MIT researchers came up with the idea after watching the third episode of season four of “Silicon Valley”, but there are many obvious similarities between the TV series and their Pic2Recipe.

For those who may not be familiar with “Silicon Valley”, it is a TV series made in the US and broadcast in Italy by Sky Atlantic. It tells the story of a group of programmers (some may call them nerds) working in an incubator run by Erlich Bachman, who provides them with funds and a workplace in exchange for a 10% stake in their projects. It is a fun portrait of start-ups operating in the Valley, of their funding mechanisms, and so on.

In the above-mentioned episode, one of Bachman’s protégés develops an app called SeaFood, offering eight octopus-based recipes, and then manages to sell it to investors as SeeFood, i.e. the “Shazam for food”, which identifies dishes from users’ pictures. Ironically, after the episode was broadcast in the United States, a parody of the app was actually created and put on sale in the Apple Store, but that’s another story (Silicon Valley, the SeeFood app from the TV series is real!).

Going back to the researchers of the Massachusetts Institute of Technology: Pic2Recipe is an AI-based algorithm that can analyze the picture of a dish, guess its ingredients, and suggest similar recipes.
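Systems of this kind are typically retrieval-based: a neural network maps the photo to a set of predicted ingredients, which is then matched against a database of recipes. Here is a minimal sketch of that matching step in Python; the recipe database and the predicted ingredients are invented for illustration, and the real Pic2Recipe works on a far larger scale with learned embeddings rather than raw ingredient sets:

```python
from math import sqrt

# Hypothetical recipe database: recipe name -> set of ingredients.
RECIPES = {
    "margherita pizza": {"flour", "tomato", "mozzarella", "basil", "olive oil"},
    "caprese salad": {"tomato", "mozzarella", "basil", "olive oil"},
    "bruschetta": {"bread", "tomato", "garlic", "basil", "olive oil"},
}

def cosine(a: set, b: set) -> float:
    """Cosine similarity between two ingredient sets, treated as binary vectors."""
    if not a or not b:
        return 0.0
    return len(a & b) / sqrt(len(a) * len(b))

def suggest_recipes(predicted: set, top_k: int = 2) -> list:
    """Rank recipes by similarity to the ingredients guessed from the photo."""
    ranked = sorted(RECIPES, key=lambda r: cosine(RECIPES[r], predicted), reverse=True)
    return ranked[:top_k]

# Ingredients a vision model might have guessed from a photo of a caprese salad:
print(suggest_recipes({"tomato", "mozzarella", "basil"}))
# → ['caprese salad', 'margherita pizza']
```

The similarity measure rewards recipes that share many ingredients with the prediction while penalizing recipes with long ingredient lists, so the closest match rises to the top.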

“Pictures posted on social networks, apparently useless, can actually provide valuable insight into people’s dietary habits and food preferences,” explained Yusuf Aytar from MIT (Artificial intelligence suggests recipes based on food photos).

Some time ago, a survey by The Guardian showed that food, whether plated or being prepared, is the most common subject of pictures taken and shared by users (about 67% of pictures) and accounts for 25% of the pictures stored on our smartphones. Yet despite the fact that this huge mass of pictures is silently changing our dietary habits – think of sushi and how many of our friends regularly post it on their social pages – Big Data has not yet exploited it to understand (and profile) users’ habits.

So while Big Data currently plays a fundamental role in the food industry (you may read “Big Data provide the solution for future famines”), at this basic, commercial level it is still weak.

The MIT team, working with the Qatar Computing Research Institute, believes that analyzing the pictures posted by users may help people learn new ways to prepare food and become more aware of what they eat, while allowing marketers to understand their clients’ habits and governments to focus on the risks posed by some of those habits.

A demo version of Pic2Recipe is already available online.
In the future, the team expects to make it guess how the food was cooked, and eventually to have the app suggest what users could cook based on the ingredients in their fridges.
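That fridge-based suggestion could, in its simplest form, amount to a coverage ranking: score each recipe by how many of its required ingredients you already have, and list what is still missing. The sketch below is purely illustrative; the recipes and fridge contents are invented, and a real system would draw on a recipe database like the one behind Pic2Recipe:

```python
# Hypothetical recipes: name -> required ingredients.
RECIPES = {
    "omelette": {"eggs", "butter", "cheese"},
    "pancakes": {"flour", "eggs", "milk", "butter"},
    "fruit salad": {"apple", "banana", "orange"},
}

def what_can_i_cook(fridge: set) -> list:
    """Rank recipes by the share of required ingredients already in the fridge,
    and report which ingredients are still missing for each one."""
    results = []
    for name, needed in RECIPES.items():
        coverage = len(needed & fridge) / len(needed)
        results.append((name, coverage, sorted(needed - fridge)))
    results.sort(key=lambda r: r[1], reverse=True)
    return results

for name, cov, missing in what_can_i_cook({"eggs", "butter", "milk", "flour"}):
    print(f"{name}: {cov:.0%} ready, missing {missing or 'nothing'}")
# → pancakes: 100% ready, missing nothing
#   omelette: 67% ready, missing ['cheese']
#   fruit salad: 0% ready, missing ['apple', 'banana', 'orange']
```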

A real treasure for food marketers, and not only for them!

Do you usually post pictures of your food? Tweet @agostinellialdo.

If you liked this post, you may also like “Why more businesses are turning to artificial intelligence”.


Read this article in Italian