Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers

Published: 01 October 2006

Abstract

In order to provide relevant information to mobile users, such as workers engaging in the manual tasks of maintenance and assembly, a wearable computer requires information about the user's specific activities. This work focuses on the recognition of activities that are characterized by a hand motion and an accompanying sound. Suitable activities can be found in assembly and maintenance work. Here, we provide an initial exploration into the problem domain of continuous activity recognition using on-body sensing. We use a mock "wood workshop" assembly task to ground our investigation. We describe a method for the continuous recognition of activities (sawing, hammering, filing, drilling, grinding, sanding, opening a drawer, tightening a vise, and turning a screwdriver) using microphones and three-axis accelerometers mounted at two positions on the user's arms. Potentially "interesting" activities are segmented from continuous streams of data using an analysis of the sound intensity detected at the two different locations. Activity classification is then performed on these detected segments using linear discriminant analysis (LDA) on the sound channel and hidden Markov models (HMMs) on the acceleration data. Four different methods of classifier fusion are compared for improving these classifications. Using user-dependent training, we obtain continuous average recall and precision rates (for positive activities) of 78 percent and 74 percent, respectively. Using user-independent training (leave-one-out across five users), we obtain recall rates of 66 percent and precision rates of 63 percent. In isolation, these activities were recognized with accuracies of 98 percent, 87 percent, and 95 percent for the user-dependent, user-independent, and user-adapted cases, respectively.
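The intensity-based segmentation stage described above lends itself to a brief illustration. The Python sketch below is not the authors' implementation: it assumes two synchronized mono signals from a wrist-worn and an upper-arm microphone (as NumPy arrays) and flags frames where the wrist channel is markedly louder, a cue that a sound-producing hand activity is underway near the hand. All function names and thresholds (frame_rms, segment_by_intensity_ratio, ratio_thresh) are hypothetical.

```python
import numpy as np

def frame_rms(signal, frame_len=512, hop=256):
    """Short-time RMS intensity of a mono signal."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.array([np.sqrt(np.mean(f.astype(float) ** 2)) for f in frames])

def segment_by_intensity_ratio(wrist_mic, arm_mic,
                               ratio_thresh=2.0, min_frames=5):
    """Flag frames where the wrist mic is markedly louder than the
    upper-arm mic -- a cue that the sound source is near the hand --
    and merge runs of flagged frames into candidate segments."""
    ratio = frame_rms(wrist_mic) / (frame_rms(arm_mic) + 1e-12)
    active = ratio > ratio_thresh
    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i                      # a candidate segment opens
        elif not flag and start is not None:
            if i - start >= min_frames:    # ignore very short blips
                segments.append((start, i))
            start = None
    if start is not None and len(active) - start >= min_frames:
        segments.append((start, len(active)))
    return segments                        # list of (start_frame, end_frame)
```

In a pipeline like the one described, such candidate segments would then be handed to the sound and acceleration classifiers for labeling; the thresholds here are placeholders that would need tuning for a particular sensor setup.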

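The abstract compares four methods of classifier fusion but does not name them, so the sketch below shows two standard score-level rules, product and sum, purely as an illustration of how per-class scores from the LDA sound classifier and the acceleration HMMs might be combined. The probability arrays, function name, and rule names are assumptions for illustration, not the paper's method.

```python
import numpy as np

def fuse_posteriors(p_sound, p_accel, rule="product"):
    """Combine per-class scores from the two sensing channels.

    p_sound : per-class posterior estimates from the sound classifier
    p_accel : per-class (normalized) likelihoods from the motion HMMs
    rule    : 'product' multiplies the two score vectors (an
              independence assumption); 'sum' averages them.
    Returns the index of the winning class.
    """
    p_sound, p_accel = np.asarray(p_sound), np.asarray(p_accel)
    if rule == "product":
        fused = p_sound * p_accel
    elif rule == "sum":
        fused = 0.5 * (p_sound + p_accel)
    else:
        raise ValueError(f"unknown fusion rule: {rule!r}")
    return int(np.argmax(fused))

# Example: sound strongly favors class 0, motion is less certain.
print(fuse_posteriors([0.7, 0.2, 0.1], [0.4, 0.35, 0.25]))  # -> 0
```

The product rule rests on an assumption that the two channels err independently; the sum rule is generally more robust when one channel's posterior estimates are noisy.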

Published In

IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 28, Issue 10
October 2006
176 pages

Publisher

IEEE Computer Society

United States

Author Tags

  1. Pervasive computing
  2. classifier evaluation
  3. industry
  4. wearable computers and body area networks

Qualifiers

  • Research-article

Cited By

  • (2024) "WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 8, no. 4, pp. 1-21, doi:10.1145/3699776, online 21 Nov. 2024.
  • (2024) "ActSonic: Recognizing Everyday Activities from Inaudible Acoustic Wave Around the Body," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 8, no. 4, pp. 1-32, doi:10.1145/3699752, online 21 Nov. 2024.
  • (2022) "SAMoSA," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 6, no. 3, pp. 1-19, doi:10.1145/3550284, online 7 Sep. 2022.
  • (2022) "Leveraging Sound and Wrist Motion to Detect Activities of Daily Living with Commodity Smartwatches," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 6, no. 2, pp. 1-28, doi:10.1145/3534582, online 7 Jul. 2022.
  • (2022) "Continuous physical activity recognition for intelligent labour monitoring," Multimedia Tools and Applications, vol. 81, no. 4, pp. 4877-4895, doi:10.1007/s11042-021-11288-y, online 1 Feb. 2022.
  • (2022) "Smartphone-based gait recognition using convolutional neural networks and dual-tree complex wavelet transform," Multimedia Systems, vol. 28, no. 6, pp. 2307-2317, doi:10.1007/s00530-022-00954-2, online 1 Dec. 2022.
  • (2021) "Daily Routine Recognition for Hearing Aid Personalization," SN Computer Science, vol. 2, no. 3, doi:10.1007/s42979-021-00538-3, online 11 Mar. 2021.
  • (2021) "An individualized system of skeletal data-based CNN classifiers for action recognition in manufacturing assembly," Journal of Intelligent Manufacturing, vol. 34, no. 2, pp. 633-649, doi:10.1007/s10845-021-01815-x, online 26 Jul. 2021.
  • (2020) "ARM cortex M4-based extensible multimodal wearable platform for sensor research and context sensing from motion & sound," Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, pp. 284-289, doi:10.1145/3410530.3414368, online 10 Sep. 2020.
  • (2020) "Nurse care activity recognition," Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, pp. 419-424, doi:10.1145/3410530.3414334, online 10 Sep. 2020.
