
Research Project: Preventing the Development of Childhood Obesity

Location: Children's Nutrition Research Center

Title: AI-enabled wearable cameras for assisting dietary assessment in African populations

Authors:
LO, FRANK - Imperial College
QIU, JIANING - Imperial College
JOBARTEH, MODOU - London School of Hygiene
SUN, YINGNAN - Imperial College
WANG, ZEYU - Imperial College
JIANG, SHUO - Tongji University
BARANOWSKI, TOM - Children's Nutrition Research Center (CNRC)
ANDERSON, ALEX - University of Georgia
MCCRORY, MEGAN - Boston University
SAZONOV, EDWARD - University of Alabama
JIA, WENYAN - University of Pittsburgh
SUN, MINGUI - University of Pittsburgh
STEINER-ASIEDU, MATILDA - University of Ghana
FROST, GARY - Imperial College
LO, BENNY - Imperial College

Submitted to: npj Digital Medicine
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 11/16/2024
Publication Date: 12/5/2024
Citation: Lo, F.P., Qiu, J., Jobarteh, M.L., Sun, Y., Wang, Z., Jiang, S., Baranowski, T., Anderson, A.K., McCrory, M.A., Sazonov, E., Jia, W., Sun, M., Steiner-Asiedu, M., Frost, G., Lo, B. 2024. AI-enabled wearable cameras for assisting dietary assessment in African populations. npj Digital Medicine. 7. Article 356. https://doi.org/10.1038/s41746-024-01346-8.
DOI: https://doi.org/10.1038/s41746-024-01346-8

Interpretive Summary: Self-reported dietary intake assessment is highly error prone and can be expensive to conduct because of the nutrient intake software required. We have developed an objective method for dietary assessment using low-cost wearable cameras. EgoDiet employs an egocentric vision-based pipeline to identify foods and learn portion sizes. To evaluate the functionality of EgoDiet, field studies were conducted in London (Study A) and Ghana (Study B) among samples of Ghanaian and Kenyan origin. In the more controlled circumstances of Study A, EgoDiet's estimations were contrasted with dietitians' assessments, revealing a Mean Absolute Percentage Error (MAPE) of 31.9% for portion size estimation, compared to 40.1% for estimates made by dietitians. In the more naturalistic circumstances of Study B, comparing EgoDiet to the traditional 24-Hour Dietary Recall (24HR) demonstrated a MAPE of 28.0% for EgoDiet and a MAPE of 32.5% for the 24HR. This improvement highlights the potential of passive camera technology to serve as an alternative to traditional dietary assessment methods.

Technical Abstract: We have developed a population-level method for dietary assessment using low-cost wearable cameras. Our approach, EgoDiet, employs an egocentric vision-based pipeline to learn portion sizes, addressing the shortcomings of traditional self-reported dietary methods. To evaluate the functionality of this method, field studies were conducted in London (Study A) and Ghana (Study B) among populations of Ghanaian and Kenyan origin. In Study A, EgoDiet's estimations were contrasted with dietitians' assessments, revealing a Mean Absolute Percentage Error (MAPE) of 31.9% for portion size estimation, compared to 40.1% for estimates made by dietitians. We further evaluated our approach in Study B, comparing its performance to the traditional 24-Hour Dietary Recall (24HR). Our approach demonstrated a MAPE of 28.0%, a reduction in error relative to the 24HR, which exhibited a MAPE of 32.5%. This improvement highlights the potential of passive camera technology to serve as an alternative to traditional dietary assessment methods.
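Both studies report accuracy as Mean Absolute Percentage Error (MAPE) between estimated and reference portion sizes. The publication does not include code; the snippet below is a minimal sketch of how MAPE is conventionally computed, using hypothetical gram values for illustration, not the authors' implementation or data.

# Minimal sketch of Mean Absolute Percentage Error (MAPE) as conventionally
# defined; values below are hypothetical, not taken from the study.

def mape(estimated, reference):
    """MAPE (%) between paired portion-size estimates and reference weights."""
    errors = [abs(e - r) / r for e, r in zip(estimated, reference)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical portion sizes in grams, for illustration only.
reference_g = [150.0, 220.0, 90.0]   # e.g. weighed or dietitian-assessed portions
estimated_g = [170.0, 200.0, 110.0]  # e.g. camera-based estimates

print(f"MAPE: {mape(estimated_g, reference_g):.1f}%")

A lower MAPE indicates estimates that, on average, deviate less from the reference portions, which is the sense in which EgoDiet's 31.9% and 28.0% compare favorably with the 40.1% and 32.5% figures reported for dietitians and the 24HR, respectively.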