Impact of Sliding Window Length in Indoor Human Motion Modes and Pose Pattern Recognition Based on Smartphone Sensors
Figure 1. Overview of human activity recognition (HAR) workflow.
Figure 2. (a) Human motion modes and (b) pose patterns covered in this study.
Figure 3. Typical data collection scenes. Upper row: elevators 1 and 2 and walking. Lower row: escalators 1 and 2 and going upstairs (from left to right).
Figure 4. The 100-time bootstrapping results of motion mode recognition using a support vector machine (SVM) with different window lengths (0.5–3 s).
Figure 5. Distribution of compressed features of human motion modes with various window lengths: (a) 1 s; (b) 2 s; (c) 3 s.
Figure 6. Average F1 score of motion mode classification using different machine-learning methods and window lengths.
Figure 7. Relationship between F1 score and window length (horizontal axis) for different motion modes.
Figure 8. Distribution of compressed features of human poses with different window lengths: (a) 1 s; (b) 2 s; (c) 3 s.
Figure 9. Average F1 score of pose classification using different machine-learning methods and window lengths.
Figure 10. Relationship between F1 score and window length for pose classification.
Abstract
1. Introduction
2. HAR Workflow
3. Experiment Setup
3.1. Data Acquisition
3.2. Adopted Sensors and Features
3.3. Performance Metric
3.4. Validation and Testing Strategy
4. Experimental Results and Analysis
4.1. Motion Mode Classification Results
4.1.1. Global Evaluation
4.1.2. Motion Mode-Specific Analysis
4.2. Pose Classification Results
4.2.1. Global Evaluation
4.2.2. Pose-Specific Analysis
5. Discussion
- Firstly, motion mode recognition used neither a gyroscope nor a magnetometer. Although our experiments showed that the barometer and accelerometer are effective and sufficient, recent work suggests that additional sensors could further improve recognition performance and system robustness. An analysis using other smartphone sensors is therefore of interest and will be explored in future work.
- Secondly, the dataset is relatively small, because data collection was taxing for both researchers and subjects, and a sufficient amount of data could not be acquired within a short period. In the future, we will recruit additional subjects so that the data cover a wider range of ages, heights, weights, and so on. We also aim to establish a comprehensive human motion mode and pose pattern dataset for public use.
- Finally, this study focused on revealing the impact of segmentation on HAR and tuned the window size manually. However, testing different window sizes before designing a system is time-consuming and inefficient. Advanced methods that automatically tune the segmentation parameters based on the characteristics of the activities to be distinguished would be considerably useful; our future work will also address this aspect.
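The manual window-size sweep discussed above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `sliding_windows` is a hypothetical helper, and the 50 Hz sampling rate and 50% overlap are illustrative assumptions.

```python
def sliding_windows(samples, rate_hz, win_s, overlap=0.5):
    """Segment a sample stream into fixed-length, overlapping windows.

    samples: flat list of sensor readings
    rate_hz: sampling rate of the stream (assumed 50 Hz below)
    win_s:   window length in seconds (the parameter under study)
    overlap: fraction of each window shared with the next one
    """
    win = int(round(win_s * rate_hz))
    step = max(1, int(round(win * (1.0 - overlap))))
    return [samples[i:i + win] for i in range(0, len(samples) - win + 1, step)]

# Sweep the window lengths examined in the paper (0.5-3 s) over a
# 6-second synthetic stream; each candidate segmentation would then be
# featurized and scored (e.g., with an SVM and the F1 metric).
stream = list(range(300))
for win_s in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
    windows = sliding_windows(stream, rate_hz=50, win_s=win_s)
```

An automatic tuner would wrap this loop in a search that stops once the validation F1 score meets a target, instead of requiring a designer to inspect each setting by hand.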
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
| Weight (kg) \ Height (cm) | [163, 170) | [170, 175) | [175, 180) |
|---|---|---|---|
| [50, 60) | Subjects 1, 3 | | |
| [60, 70) | Subject 8 | Subject 2 | Subject 4 |
| [70, 80) | Subjects 9, 10 | Subjects 5, 6, 7 | |
| Sensor | Purpose | Data Stream | Manufacturer | Measuring Range | Measuring Accuracy |
|---|---|---|---|---|---|
| Gravity sensor | Pose pattern classification | Gravity force along the x, y, and z axes | Qualcomm | 39.226593 m/s² | 0.00119 m/s² |
| Accelerometer | Motion mode classification | Specific force along the x, y, and z axes | | | |
| Barometer | Air pressure measurement | Air pressure | BOSCH | 1100 hPa | 0.00999 hPa |
| No. | Feature | Definition |
|---|---|---|
| 1 | Mean | Arithmetic average of the samples in the window |
| 2 | Absolute mean | Average of the absolute sample values |
| 3 | Variance | Average squared deviation from the window mean |
| 4 | Standard deviation | Square root of the variance |
| 5 | Mode | Value that appears most frequently in the window |
| 6 | Median | Middle value of the sorted window |
| 7 | Average absolute difference | Average absolute deviation from the window mean |
| 8 | 75th percentile | Value separating the highest 25% of the data from the lowest 75% |
| 9 | Interquartile range | Difference between the 75th and 25th percentiles |
| 10 | Gradient (air pressure data only) | Coefficient of a first-order linear fit |
| 11 | FFT (fast Fourier transform) coefficients | Energy of each frequency component |
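The time-domain features in the table can be computed per window with the standard library alone. This sketch covers the statistical entries (the mode and the FFT coefficients are omitted for brevity), and the function name is our own:

```python
import statistics as st

def time_domain_features(x):
    """Compute the time-domain features listed in the table for one window."""
    mean = st.mean(x)
    q1, _, q3 = st.quantiles(x, n=4)  # 25th, 50th, and 75th percentiles
    # Gradient: slope of a first-order least-squares fit against the
    # sample index (used only for the air-pressure stream in the paper).
    t = range(len(x))
    t_mean = st.mean(t)
    slope = (sum((ti - t_mean) * (v - mean) for ti, v in zip(t, x))
             / sum((ti - t_mean) ** 2 for ti in t))
    return {
        "mean": mean,
        "abs_mean": st.mean(abs(v) for v in x),
        "variance": st.pvariance(x),
        "std": st.pstdev(x),
        "median": st.median(x),
        "avg_abs_diff": st.mean(abs(v - mean) for v in x),
        "p75": q3,
        "iqr": q3 - q1,
        "gradient": slope,
    }
```

Each window produced by the segmentation step would be mapped through a function like this (per sensor axis) to build the classifier's feature vector.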
| | Predicted A | Predicted B |
|---|---|---|
| Actual A | TP | FN |
| Actual B | FP | TN |
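From the confusion matrix above, precision, recall, and the F1 score used as the performance metric follow directly; a minimal sketch:

```python
def f1_score(tp, fp, fn):
    """F1 score from confusion-matrix counts (TN is not needed)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    # Harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)
```

For multi-class problems such as the motion modes studied here, the score is computed per class (one class as A, the rest as B) and the per-class scores are then averaged.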
Recommended window size for each motion mode to reach a given F1 score:

| Motion Mode | F1 ≥ 85% | F1 ≥ 90% | F1 ≥ 95% | F1 ≥ 99% |
|---|---|---|---|---|
| Stationary | 1.5 s | 2 s | 3 s | 4.5 s |
| Walking | 1 s | 1.5 s | 3 s | 4 s |
| Up elevator | 0.5 s | 0.5 s | 0.5 s | 1.5 s |
| Down elevator | 0.5 s | 0.5 s | 0.5 s | 1.5 s |
| Up stairs | 2 s | 3 s | 3.5 s | 5 s |
| Down stairs | 1.5 s | 2 s | 2.5 s | 4 s |
| Up escalator | 2 s | 2.5 s | 3.5 s | 4.5 s |
| Down escalator | 2 s | 2.5 s | 3 s | 4.5 s |
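As a usage sketch, the table can be turned into a lookup that returns the smallest recommended window for a target F1 score. The dictionary transcribes the table values; the names are our own:

```python
# Recommended window length (s) to reach each F1 level, per the table above.
RECOMMENDED = {
    "stationary":     {0.85: 1.5, 0.90: 2.0, 0.95: 3.0, 0.99: 4.5},
    "walking":        {0.85: 1.0, 0.90: 1.5, 0.95: 3.0, 0.99: 4.0},
    "up elevator":    {0.85: 0.5, 0.90: 0.5, 0.95: 0.5, 0.99: 1.5},
    "down elevator":  {0.85: 0.5, 0.90: 0.5, 0.95: 0.5, 0.99: 1.5},
    "up stairs":      {0.85: 2.0, 0.90: 3.0, 0.95: 3.5, 0.99: 5.0},
    "down stairs":    {0.85: 1.5, 0.90: 2.0, 0.95: 2.5, 0.99: 4.0},
    "up escalator":   {0.85: 2.0, 0.90: 2.5, 0.95: 3.5, 0.99: 4.5},
    "down escalator": {0.85: 2.0, 0.90: 2.5, 0.95: 3.0, 0.99: 4.5},
}

def window_for(mode, target_f1):
    """Smallest recommended window whose tabulated F1 level meets the target."""
    for level in sorted(RECOMMENDED[mode]):
        if level >= target_f1:
            return RECOMMENDED[mode][level]
    raise ValueError(f"no listed F1 level reaches {target_f1} for {mode!r}")
```

A system designer targeting, say, 92% F1 for walking would round up to the 95% column and use a 3 s window.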
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, G.; Li, Q.; Wang, L.; Wang, W.; Wu, M.; Liu, T. Impact of Sliding Window Length in Indoor Human Motion Modes and Pose Pattern Recognition Based on Smartphone Sensors. Sensors 2018, 18, 1965. https://doi.org/10.3390/s18061965