Open access

FitByte: Automatic Diet Monitoring in Unconstrained Situations Using Multimodal Sensing on Eyeglasses

Published: 23 April 2020

Abstract

In an attempt to help users reach their health goals and practitioners understand the relationship between diet and disease, researchers have proposed many wearable systems to automatically monitor food consumption. When a person consumes food, they bring the food close to their mouth, take a sip or bite, chew, and then swallow. Most diet monitoring approaches focus on only one of these aspects of food intake, but this narrow reliance demands high precision and often fails in the noisy, unconstrained situations common in a person's daily life. In this paper, we introduce FitByte, a multimodal sensing approach on a pair of eyeglasses that tracks all phases of food intake. FitByte contains a set of inertial and optical sensors that allow it to reliably detect food intake events in noisy environments. It also has an on-board camera that opportunistically captures visuals of the food as the user consumes it. We evaluated the system in two studies with 23 participants under progressively fewer environmental constraints. On average, FitByte achieved an 89% F1-score in detecting eating and drinking episodes.
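
To make the headline metric concrete, the following is a minimal sketch of how an episode-level F1-score can be computed from predicted and ground-truth eating episodes. It is an illustration of the metric only, not the paper's evaluation code; the interval representation and the overlap-based matching rule are assumptions.

```python
# Illustrative sketch (not the paper's evaluation code): episode-level
# precision, recall, and F1 for eating/drinking detection, assuming
# episodes are (start, end) intervals in seconds and that any temporal
# overlap counts as a match.

def overlaps(a, b):
    """Return True if intervals a and b overlap."""
    return a[0] < b[1] and b[0] < a[1]

def episode_f1(predicted, ground_truth):
    """Precision, recall, and F1 over intake episodes.

    A prediction that overlaps any ground-truth episode is a true positive;
    unmatched predictions are false positives, and unmatched ground-truth
    episodes are false negatives.
    """
    tp = sum(any(overlaps(p, g) for g in ground_truth) for p in predicted)
    fp = len(predicted) - tp
    fn = sum(not any(overlaps(g, p) for p in predicted) for g in ground_truth)

    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Example: one of two detections matches a true episode -> P = R = F1 = 0.5.
print(episode_f1([(100, 400), (900, 950)], [(120, 380), (600, 700)]))
```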

Supplementary Material

ZIP File (pn9728aux.zip)
Dataset available at smashlab.io/publications/fitbyte
MP4 File (paper740pv.mp4)
Preview video
MP4 File (pn9728vf.mp4)
Supplemental video

Information & Contributors

Information

Published In

CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
April 2020
10688 pages
ISBN:9781450367080
DOI:10.1145/3313831
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 April 2020

Author Tags

  1. activity recognition
  2. diet monitoring
  3. drinking detection
  4. earables
  5. eating detection
  6. health sensing
  7. ubiquitous computing
  8. wearable computing

Qualifiers

  • Research-article

Conference

CHI '20

Acceptance Rates

Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


Article Metrics

  • Downloads (Last 12 months): 518
  • Downloads (Last 6 weeks): 69
Reflects downloads up to 11 Dec 2024

Cited By

  • (2024) Controlled and Real-Life Investigation of Optical Tracking Sensors in Smart Glasses for Monitoring Eating Behavior Using Deep Learning: Cross-Sectional Study. JMIR mHealth and uHealth, 12, e59469. DOI: 10.2196/59469. Online publication date: 26-Sep-2024
  • (2024) Densor: An Intraoral Battery-Free Sensing Platform. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), 1-30. DOI: 10.1145/3699746. Online publication date: 21-Nov-2024
  • (2024) HabitSense: A Privacy-Aware, AI-Enhanced Multimodal Wearable Platform for mHealth Applications. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(3), 1-48. DOI: 10.1145/3678591. Online publication date: 9-Sep-2024
  • (2024) MunchSonic: Tracking Fine-grained Dietary Actions through Active Acoustic Sensing on Eyeglasses. Proceedings of the 2024 ACM International Symposium on Wearable Computers, 96-103. DOI: 10.1145/3675095.3676619. Online publication date: 5-Oct-2024
  • (2024) EchoGuide: Active Acoustic Guidance for LLM-Based Eating Event Analysis from Egocentric Videos. Proceedings of the 2024 ACM International Symposium on Wearable Computers, 40-47. DOI: 10.1145/3675095.3676611. Online publication date: 5-Oct-2024
  • (2024) NIR-sighted: A Programmable Streaming Architecture for Low-Energy Human-Centric Vision Applications. ACM Transactions on Embedded Computing Systems, 23(6), 1-26. DOI: 10.1145/3672076. Online publication date: 11-Sep-2024
  • (2024) Hicclip: Sonification of Augmented Eating Sounds to Intervene Snacking Behaviors. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1372-1384. DOI: 10.1145/3643834.3661532. Online publication date: 1-Jul-2024
  • (2024) exHAR. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(1), 1-30. DOI: 10.1145/3643500. Online publication date: 6-Mar-2024
  • (2024) Go-Go Biome: Evaluation of a Casual Game for Gut Health Engagement and Reflection. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642742. Online publication date: 11-May-2024
  • (2024) Eating Speed Measurement Using Wrist-Worn IMU Sensors Towards Free-Living Environments. IEEE Journal of Biomedical and Health Informatics, 28(10), 5816-5828. DOI: 10.1109/JBHI.2024.3422875. Online publication date: Oct-2024
  • Show More Cited By
