A Movement in Multiple Time Neural Network for Automatic Detection of Pain Behaviour

Published: 27 December 2020

Abstract

The use of multiple clocks has been a favoured approach to modelling the multiple timescales of sequential data. Previous work based on clocks and multi-timescale studies in general have not clearly accounted for multidimensionality of data such that each dimension has its own timescale(s). Focusing on body movement data which has independent yet coordinating degrees of freedom, we propose a Movement in Multiple Time (MiMT) neural network. Our MiMT models multiple timescales by learning different levels of movement interpretation (i.e. labels) and further allows for separate timescales across movements dimensions. We obtain 0.75 and 0.58 average F1 scores respectively for binary frame-level and three-class window-level classification of pain behaviour based on the MiMT. Findings in ablation studies suggest that these two elements of the MiMT are valuable to modelling multiple timescales of multidimensional sequential data.
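
The abstract describes MiMT only at a high level, so the following PyTorch sketch is merely one way to illustrate the general idea of per-dimension recurrence at multiple timescales combined with two levels of interpretation (frame-level binary and window-level three-class outputs). Every concrete choice here (the strides, hidden size, number of movement dimensions, subsampling scheme, and mean-pooling for the window head) is an illustrative assumption, not the authors' MiMT specification.

```python
# Minimal multi-timescale sketch in PyTorch, loosely inspired by the MiMT idea.
# All sizes, strides, and pooling choices below are assumptions for illustration only.
import torch
import torch.nn as nn


class MultiTimescaleSketch(nn.Module):
    def __init__(self, n_dims=13, hidden=32, strides=(1, 4, 16)):
        super().__init__()
        self.strides = strides
        # One GRU per movement dimension and per stride, so each degree of
        # freedom is modelled at its own set of timescales.
        self.grus = nn.ModuleList([
            nn.ModuleList([nn.GRU(1, hidden, batch_first=True) for _ in strides])
            for _ in range(n_dims)
        ])
        fused = n_dims * len(strides) * hidden
        # Two interpretation levels: per-frame binary and per-window three-class.
        self.frame_head = nn.Linear(fused, 2)
        self.window_head = nn.Linear(fused, 3)

    def forward(self, x):
        # x: (batch, time, n_dims), e.g. joint angles over a movement window.
        b, t, d = x.shape
        feats = []
        for i in range(d):
            for s, gru in zip(self.strides, self.grus[i]):
                sub = x[:, ::s, i:i + 1]          # dimension i subsampled at stride s
                out, _ = gru(sub)                 # (b, ceil(t/s), hidden)
                # Repeat back to the full frame rate so all timescales align per frame.
                feats.append(out.repeat_interleave(s, dim=1)[:, :t, :])
        h = torch.cat(feats, dim=-1)              # (b, t, fused)
        return self.frame_head(h), self.window_head(h.mean(dim=1))


# Example: a batch of 2 windows, 180 frames, 13 movement dimensions (sizes arbitrary).
model = MultiTimescaleSketch()
frame_logits, window_logits = model(torch.randn(2, 180, 13))
print(frame_logits.shape, window_logits.shape)  # (2, 180, 2) and (2, 3)
```

Giving each movement dimension its own recurrent units at several strides mirrors the paper's premise that each degree of freedom can have its own timescale(s), while the two output heads stand in for the two levels of movement interpretation used as labels.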


Cited By

  • (2024) Movement Representation Learning for Pain Level Classification. IEEE Transactions on Affective Computing, 15(3), 1303-1314. DOI: 10.1109/TAFFC.2023.3334522. Online publication date: July 2024.
  • (2023) Pain Level and Pain-Related Behaviour Classification Using GRU-Based Sparsely-Connected RNNs. IEEE Journal of Selected Topics in Signal Processing, 17(3), 677-688. DOI: 10.1109/JSTSP.2023.3262358. Online publication date: May 2023.

          Published In

          ICMI '20 Companion: Companion Publication of the 2020 International Conference on Multimodal Interaction
          October 2020
          548 pages
          ISBN: 9781450380027
          DOI: 10.1145/3395035

          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Author Tags

          1. body movement
          2. multiple timescales
          3. neural networks
          4. pain
          5. time

          Qualifiers

          • Short-paper

          Conference

          ICMI '20: International Conference on Multimodal Interaction
          October 25-29, 2020
          Virtual Event, Netherlands
