Location: Jean Mayer Human Nutrition Research Center On Aging
Title: Use of natural spoken language with automated mapping of self-reported food intake to food composition data for low-burden real-time dietary assessment: method comparison study
Authors:
TAYLOR, SALIMA - Jean Mayer Human Nutrition Research Center On Aging At Tufts University
KORPUSIK, MANDY - Massachusetts Institute Of Technology
DAS, SAI KRUPA - Jean Mayer Human Nutrition Research Center On Aging At Tufts University
GILHOOLY, CHERYL - Jean Mayer Human Nutrition Research Center On Aging At Tufts University
SIMPSON, RYAN - Tufts University
GLASS, JAMES - Massachusetts Institute Of Technology
ROBERTS, SUSAN - Jean Mayer Human Nutrition Research Center On Aging At Tufts University
Submitted to: Journal of Medical Internet Research
Publication Type: Peer Reviewed Journal
Publication Acceptance Date: 11/10/2021
Publication Date: 6/12/2021
Citation: Taylor, S., Korpusik, M., Das, S., Gilhooly, C., Simpson, R., Glass, J., Roberts, S. 2021. Use of natural spoken language with automated mapping of self-reported food intake to food composition data for low-burden real-time dietary assessment: method comparison study. Journal of Medical Internet Research. 23(12):e26988. DOI: https://doi.org/10.2196/26988
Interpretive Summary: Self-monitoring of food intake is a cornerstone of national recommendations for health, but existing applications for this purpose are burdensome, which limits their use. We developed and pilot-tested a new app that combines speech understanding technology with technologies for mapping foods to appropriate food composition codes, enabling lower-burden, automated nutritional analysis of dietary intake. In a pilot study, there was no significant difference in energy intake between values obtained by the new method and the gold-standard 24-hour recall. This first demonstration of using natural spoken language to map reported foods to food composition codes represents a promising new approach to automating assessments of dietary intake.
Technical Abstract: Background: Self-monitoring of food intake is a cornerstone of national recommendations for health, but existing apps for this purpose are burdensome for users and researchers, which limits their use. Objective: We developed and pilot-tested a new app (COCO Nutritionist) that combines speech understanding technology with technologies for mapping foods to appropriate food composition codes in national databases, for lower-burden and automated nutritional analysis of self-reported dietary intake. Methods: COCO was compared with the multiple-pass, interviewer-administered 24-hour recall method for assessment of energy intake. COCO was used for 5 consecutive days, and 24-hour dietary recalls were obtained for two of the days. Participants were 35 women and men with a mean age of 28 (range 20-58) years and a mean BMI of 24 (range 17-48) kg/m2. Results: There was no significant difference in energy intake between values obtained by COCO and the 24-hour recall for days when both methods were used (mean 2092, SD 1044 kcal versus mean 2030, SD 687 kcal; P=.70). There were also no significant differences between the methods for percent of energy from protein, carbohydrate, and fat (P=.27-.89), and no trend in energy intake obtained with COCO over the entire 5-day study period (P=.19). Conclusions: This first demonstration of a dietary assessment method using natural spoken language to map reported foods to food composition codes represents a promising new approach to automating assessments of dietary intake.
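Illustration: the Results describe a within-participant comparison of energy intake estimated by the two methods on the same days. The sketch below (Python) is not the authors' analysis code; it assumes per-participant mean daily energy intake values (the numbers shown are invented for illustration) and uses a paired t-test, which is an assumed analysis choice since the abstract does not name the specific test.

# Illustrative sketch only: paired comparison of energy intake (kcal)
# from two dietary assessment methods (app vs. 24-hour recall).
# Data values are hypothetical; the abstract reports means of
# 2092 (SD 1044) kcal vs. 2030 (SD 687) kcal with P=.70.
import numpy as np
from scipy import stats

# Hypothetical per-participant mean daily energy intake (kcal) on days
# when both methods were used.
coco_kcal = np.array([1850.0, 2400.0, 2100.0, 1750.0, 2300.0])
recall_kcal = np.array([1900.0, 2250.0, 2050.0, 1800.0, 2200.0])

# Paired t-test on within-participant differences (assumed analysis choice).
t_stat, p_value = stats.ttest_rel(coco_kcal, recall_kcal)

print(f"COCO mean: {coco_kcal.mean():.0f} kcal (SD {coco_kcal.std(ddof=1):.0f})")
print(f"Recall mean: {recall_kcal.mean():.0f} kcal (SD {recall_kcal.std(ddof=1):.0f})")
print(f"Paired t-test: t = {t_stat:.2f}, P = {p_value:.2f}")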