DOI: 10.4108/icst.pervasivehealth.2014.255423

Pervasive eating habits monitoring and recognition through a wearable acoustic sensor

Published: 20 May 2014

Abstract

Eating habits provide clinical diagnostic evidence of lifestyle-related diseases such as dysphagia and indigestion. However, collecting eating-habit information from ordinary people is costly in both time and expense. This paper presents a pervasive approach to eating-habit monitoring and recognition using a necklace-like device and a smartphone that communicate via Bluetooth. The necklace-like device acquires acoustic signals from the throat, and the smartphone processes the data to extract salient features. From the complex acoustic signals collected at the throat, our method comprehensively analyzes and recognizes distinct events, including chewing, swallowing, and breathing, on the smartphone. Experiments show that the proposed approach recognizes these acoustic events effectively, with recognition accuracies of 86.82% for K-Nearest Neighbor (KNN) and 98.35% for Support Vector Machine (SVM). Finally, a real eating case study is conducted to validate the proposed approach.
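The KNN classification step mentioned in the abstract can be illustrated with a minimal sketch. Everything below is illustrative, not the paper's implementation: the two-dimensional feature vectors, the training examples, and the choice of k are hypothetical stand-ins for the acoustic features actually extracted from the throat microphone signal.

```python
import math

# Hypothetical 2-D feature vectors (e.g. normalized short-time energy and
# zero-crossing rate) for three throat-sound event classes. Real features
# would be computed from the necklace microphone's acoustic signal.
TRAIN = [
    ((0.9, 0.1), "chewing"),
    ((0.8, 0.2), "chewing"),
    ((0.2, 0.9), "swallowing"),
    ((0.3, 0.8), "swallowing"),
    ((0.5, 0.5), "breathing"),
    ((0.6, 0.4), "breathing"),
]

def knn_predict(x, train=TRAIN, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training examples under Euclidean distance."""
    nearest = sorted(train, key=lambda t: math.dist(x, t[0]))[:k]
    votes = {}
    for _, label in nearest:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)

print(knn_predict((0.85, 0.15)))  # -> chewing
```

An SVM would replace the distance-based vote with a learned decision boundary, which is where the paper reports the higher accuracy (98.35% vs. 86.82%).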




Published In

PervasiveHealth '14: Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare
May 2014, 459 pages
ISBN: 9781631900112


Publisher

ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering)

Brussels, Belgium


Author Tags

  1. KNN
  2. SVM
  3. eating habit
  4. feature extraction

Qualifiers

  • Research-article

Conference

PervasiveHealth '14

Acceptance Rates

Overall Acceptance Rate 55 of 116 submissions, 47%

Article Metrics

  • Downloads (Last 12 months)21
  • Downloads (Last 6 weeks)4
Reflects downloads up to 11 Jan 2025


Cited By

  • (2024) Automatic Recognition of Commensal Activities in Co-located and Online Settings. Companion Proceedings of the 26th International Conference on Multimodal Interaction, 10.1145/3686215.3686219, pp. 117-121. Online publication date: 4-Nov-2024.
  • (2024) A Virtual Agent as a Commensal Companion. Proceedings of the 24th ACM International Conference on Intelligent Virtual Agents, 10.1145/3652988.3673963, pp. 1-4. Online publication date: 16-Sep-2024.
  • (2023) "Enjoy, but Moderately!": Designing a Social Companion Robot for Social Engagement and Behavior Moderation in Solitary Drinking Context. Proceedings of the ACM on Human-Computer Interaction, 10.1145/3610028, 7:CSCW2, pp. 1-24. Online publication date: 4-Oct-2023.
  • (2022) Towards Commensal Activities Recognition. Proceedings of the 2022 International Conference on Multimodal Interaction, 10.1145/3536221.3556566, pp. 549-557. Online publication date: 7-Nov-2022.
  • (2021) Look Ma, No Hands. Proceedings of the Conference on Information Technology for Social Good, 10.1145/3462203.3475895, pp. 13-18. Online publication date: 9-Sep-2021.
  • (2020) MAMAS: Supporting Parent-Child Mealtime Interactions Using Automated Tracking and Speech Recognition. Proceedings of the ACM on Human-Computer Interaction, 10.1145/3392876, 4:CSCW1, pp. 1-32. Online publication date: 29-May-2020.
  • (2015) Unintrusive Eating Recognition Using Google Glass. Proceedings of the 9th International Conference on Pervasive Computing Technologies for Healthcare, 10.5555/2826165.2826181, pp. 108-111. Online publication date: 20-May-2015.
