DOI: 10.1145/3489517.3530490
Research article (Public Access)

Human emotion based real-time memory and computation management on resource-limited edge devices

Published: 23 August 2022

Abstract

Emotional AI, or affective computing, is projected to grow rapidly in the coming years. Despite many developments in the application space, the user's emotions have seen little exploitation at the hardware level. In this paper, we propose a deep collaboration between the user's affect and hardware system management on resource-limited edge devices. Based on classification results from efficient affect classifiers running on smartphones, we propose novel real-time management schemes for memory and video processing that improve the energy efficiency of mobile devices. Case studies on H.264/AVC video playback and Android smartphone usage show power savings of up to 23% and memory-loading reductions of up to 17% with the proposed affect-adaptive architecture and system management schemes.
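
To make the affect-adaptive idea in the abstract concrete, the following is a minimal Python sketch of how a classified affect could drive memory and video-processing knobs on an edge device. The affect classes, configuration fields, thresholds, and mappings below are illustrative assumptions, not the classifiers or management schemes reported in the paper.

# Hypothetical sketch of an affect-adaptive system-management policy.
# All names, thresholds, and mappings are illustrative assumptions,
# not the controller described in the paper.

from dataclasses import dataclass
from enum import Enum


class Affect(Enum):
    """Coarse affect classes as an on-device classifier might report them."""
    CALM = "calm"
    FOCUSED = "focused"
    STRESSED = "stressed"


@dataclass
class SystemConfig:
    """Resource knobs an edge device could expose to a management layer."""
    video_frame_skip: int       # decode every Nth frame of a video stream
    memory_prefetch_pages: int  # pages prefetched per access burst
    cpu_freq_mhz: int           # DVFS operating point


def affect_adaptive_config(affect: Affect, confidence: float) -> SystemConfig:
    """Map a classified affect (plus classifier confidence) to resource settings.

    High-level idea: when the classified affect suggests the user is less
    sensitive to playback quality, trade fidelity and memory aggressiveness
    for energy; otherwise keep full quality.
    """
    if confidence < 0.6:
        # Low-confidence classification: fall back to a neutral configuration.
        return SystemConfig(video_frame_skip=1, memory_prefetch_pages=8, cpu_freq_mhz=1400)
    if affect is Affect.STRESSED:
        # Illustrative mapping: save power aggressively.
        return SystemConfig(video_frame_skip=2, memory_prefetch_pages=4, cpu_freq_mhz=1000)
    if affect is Affect.FOCUSED:
        # Illustrative mapping: prioritize smooth playback and responsiveness.
        return SystemConfig(video_frame_skip=1, memory_prefetch_pages=16, cpu_freq_mhz=1800)
    return SystemConfig(video_frame_skip=1, memory_prefetch_pages=8, cpu_freq_mhz=1400)


if __name__ == "__main__":
    # Example: the classifier reports "stressed" with 80% confidence.
    print(affect_adaptive_config(Affect.STRESSED, 0.8))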


Cited By

  • (2024) Sentient libraries: empowering user expeditions with emotional artificial intelligence. Library Hi Tech News. DOI: 10.1108/LHTN-01-2024-0001. Online publication date: 16 July 2024.



Published In

DAC '22: Proceedings of the 59th ACM/IEEE Design Automation Conference
July 2022
1462 pages
ISBN:9781450391429
DOI:10.1145/3489517
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 August 2022


Author Tags

  1. LSTM
  2. affective computing
  3. edge devices
  4. memory management
  5. system management
  6. wearable devices

Qualifiers

  • Research-article


Conference

DAC '22
Sponsor:
DAC '22: 59th ACM/IEEE Design Automation Conference
July 10 - 14, 2022
San Francisco, CA, USA

Acceptance Rates

Overall Acceptance Rate 1,770 of 5,499 submissions, 32%



Article Metrics

  • Downloads (last 12 months): 104
  • Downloads (last 6 weeks): 18
Reflects downloads up to 01 Jan 2025

