
Food/non-food classification of real-life egocentric images in low- and middle-income countries based on image tagging features

File: frai-04-644712.pdf (Published version, 2.51 MB, Adobe PDF)
Title: Food/non-food classification of real-life egocentric images in low- and middle-income countries based on image tagging features
Authors: Chen, G
Jia, W
Zhao, Y
Mao, Z-H
Lo, B
Anderson, AK
Frost, G
Jobarteh, ML
McCrory, MA
Sazonov, E
Steiner-Asiedu, M
Ansong, RS
Baranowski, T
Burke, L
Sun, M
Item Type: Journal Article
Abstract: Malnutrition, including both undernutrition and obesity, is a significant problem in low- and middle-income countries (LMICs). To study malnutrition and develop effective intervention strategies, it is crucial to evaluate nutritional status in LMICs at the individual, household, and community levels. In a multinational research project supported by the Bill & Melinda Gates Foundation, we have been using wearable technology to conduct objective dietary assessment in sub-Saharan Africa. Our assessment covers multiple diet-related activities in urban and rural families, including food sourcing (e.g., shopping, harvesting, and gathering), preservation/storage, preparation, cooking, and consumption (e.g., portion size and nutrition analysis). Our wearable device ("eButton", worn on the chest) acquires real-life images automatically during waking hours at preset time intervals. The recorded images, numbering in the tens of thousands per day, are post-processed to extract the information of interest. Although we expect future Artificial Intelligence (AI) technology to extract this information automatically, at present we use AI to separate the acquired images into two classes: images with (Class 1) and without (Class 0) edible items. As a result, researchers need only study Class-1 images, which reduces their workload significantly. In this paper, we present a composite machine learning method that performs this classification while addressing the high complexity and diversity of real-world LMIC data. Our method consists of a deep neural network (DNN) and a shallow learning network (SLN) connected by a novel probabilistic network interface layer. After presenting the details of the method, we use an image dataset acquired in Ghana to train and evaluate the machine learning system. A comparative experiment indicates that the new composite method outperforms a conventional deep learning method when assessed by integrated measures of sensitivity, specificity, and burden index, as reflected in the Receiver Operating Characteristic (ROC) curve.
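
A minimal sketch (Python) of the composite pipeline the abstract describes: DNN-derived image-tag probabilities pass through a probabilistic interface and are classified by a shallow learner, then evaluated with an ROC curve. The simulated tag features, the normalization used for the interface layer, and the logistic-regression stand-in for the SLN are all illustrative assumptions, not the authors' implementation.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, auc
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for the DNN stage: in the paper this is an image-tagging
# network that emits a probability per semantic tag (e.g., "food",
# "bowl", "person"). Here we simulate those tag-probability vectors.
n_images, n_tags = 1000, 50
X = rng.random((n_images, n_tags))   # simulated tag probabilities
y = rng.integers(0, 2, n_images)     # 1 = image contains edible items

# Assumed form of the probabilistic interface layer: normalize each
# tag vector so it behaves like a distribution before the SLN.
X_iface = X / X.sum(axis=1, keepdims=True)

X_tr, X_te, y_tr, y_te = train_test_split(
    X_iface, y, test_size=0.3, random_state=0)

# Shallow-learner stage: logistic regression stands in for the SLN.
sln = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# ROC analysis, as in the paper's comparative evaluation. On random
# simulated data the AUC will sit near 0.5 (chance level).
scores = sln.predict_proba(X_te)[:, 1]
fpr, tpr, _ = roc_curve(y_te, scores)
print(f"AUC on simulated data: {auc(fpr, tpr):.3f}")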
Issue Date: 1-Apr-2021
Date of Acceptance: 26-Feb-2021
URI: http://hdl.handle.net/10044/1/88413
DOI: 10.3389/frai.2021.644712
ISSN: 2624-8212
Publisher: Frontiers Media
Journal / Book Title: Frontiers in Artificial Intelligence
Volume: 4
Copyright Statement: © 2021 Chen, Jia, Zhao, Mao, Lo, Anderson, Frost, Jobarteh, McCrory, Sazonov, Steiner-Asiedu, Ansong, Baranowski, Burke and Sun. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Sponsor/Funder: Bill & Melinda Gates Foundation
Funder's Grant Number: OPP1171395
Keywords: artificial intelligence
egocentric image
low- and middle-income country
technology-based dietary assessment
wearable device
Publication Status: Published
Publication Place: Switzerland
Open Access location: https://www.frontiersin.org/articles/10.3389/frai.2021.644712/full
Article Number: 644712
Appears in Collections: Department of Surgery and Cancer
Institute of Global Health Innovation



This item is licensed under a Creative Commons License.