The Food Recognition Benchmark: Using Deep Learning to Recognize Food in Images

The automatic recognition of food in images has numerous interesting applications, including nutritional tracking in medical cohorts. The problem has received significant research attention, but an ongoing public benchmark on unbiased data (i.e., data not scraped from the web) for developing open and reproducible algorithms has been missing. Here, we report on the setup of such a benchmark using publicly available food images sourced through the mobile MyFoodRepo app used in research cohorts. Through four rounds, the benchmark released the MyFoodRepo-273 dataset, comprising 24,119 images and a total of 39,325 segmented polygons categorized into 273 different classes. Models were evaluated on private test sets from the same platform, with 5,000 images and 7,865 annotations in the final round. Top-performing models on the 273 food categories reached a mean average precision of 0.568 (round 4) and a mean average recall of 0.885 (round 3), and were deployed in production use of the MyFoodRepo app. We present experimental validation of the round 4 results and discuss implications of the benchmark setup, which is designed to increase the size and diversity of the dataset in future rounds.
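
As context for the metrics reported above: mean average precision and mean average recall for instance segmentation are typically computed with COCO-style tooling. The following is a minimal sketch of such an evaluation using pycocotools, assuming ground-truth annotations and model predictions in COCO JSON format; the file names are hypothetical placeholders, and this is not confirmed to be the benchmark's exact evaluation code.

    # Minimal sketch: COCO-style instance-segmentation evaluation, a common
    # way to compute mAP/mAR over segmented polygons (hypothetical file names).
    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    coco_gt = COCO("ground_truth_annotations.json")      # annotated polygons
    coco_dt = coco_gt.loadRes("model_predictions.json")  # model outputs

    evaluator = COCOeval(coco_gt, coco_dt, iouType="segm")
    evaluator.evaluate()    # per-image, per-category matching at IoU thresholds
    evaluator.accumulate()  # aggregate precision/recall curves
    evaluator.summarize()   # prints AP/AR averaged over IoU 0.50:0.95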

Media type:

E-article

Year of publication:

2022

Published:

2022

Contained in:

Frontiers in Nutrition - 9(2022)

Language:

English

Contributors:

Sharada Prasanna Mohanty [author]
Gaurav Singhal [author]
Eric Antoine Scuccimarra [author]
Djilani Kebaili [author]
Harris Héritier [author]
Victor Boulanger [author]
Marcel Salathé [author]

Links:

doi.org [freely available]
doaj.org [freely available]
www.frontiersin.org [freely available]
Journal toc [freely available]

Subjects:

Artificial intelligence (AI)
Benchmark
Deep learning
Food recognition
Images
Nutrition. Foods and food supply

DOI:

10.3389/fnut.2022.875143

Funding:

Funding institution / project title:

PPN (catalog ID):

DOAJ032294476