Computationally Efficient 3D Orientation Tracking Using Gyroscope Measurements
Figure 1. 3D orientations (3DO) resulting from treating three simultaneous orthogonal rotations of 45° each as sequential rotations about, in turn, the coordinate axes (a) z, y, and x and (b) x, y, and z. Because rotations do not commute, the two sequences yield different 3DOs, and neither equals the actual 3DO of the sensor, which is illustrated in (c).
Figure 2. Required number of measurement steps for computing the 3D orientation (3DO) using the Simultaneous Orthogonal Rotation Angle (SORA) and Euler angles for noiseless measurements. The sensor rotates by an angle of φ = 90° around the axis v = [1 1 1]/√3. In example (a), the rotation axis is constant (θ = 0°), and the SORA delivers the accurate 3DO from a single measurement, whereas Euler angles introduce a systematic error ε in the estimated 3DO that decreases with the number of steps M: achieving ε ≤ 0.5° requires M ≥ 75, and ε ≤ 0.1° requires M ≥ 373. In example (b), the rotation axis itself rotates around the axis [1 0 0] by an angle of θ = 90°. Achieving ε ≤ 0.5° requires M ≥ 4 with the SORA and M ≥ 139 with Euler angles (34.75 times more). For ε ≤ 0.1°, the SORA requires M ≥ 9 and Euler angles require M ≥ 694 (77.11 times more).
Figure 3. Required number of measurement steps for computing the 3D orientation (3DO) using the Simultaneous Orthogonal Rotation Angle (SORA) and Euler angles for noisy measurements. The sensor rotates by an angle of φ = 90° around the axis v = [1 1 1]/√3, and 1000 repetitions of the experiment are considered for each number of measurement steps M. Due to noise, both the SORA and Euler angles yield an error ε in the estimated 3DO that decreases with the number of steps M; however, for comparable accuracy, the SORA still significantly outperforms Euler angles. For a constant rotation axis (example (a), θ = 0°), the SORA achieves ε ≤ 0.5° from a single measurement and ε ≤ 0.1° in M ≥ 6 steps, whereas Euler angles require M ≥ 78 and M ≥ 407, respectively. In example (b), the rotation axis itself rotates around the axis [1 0 0] by an angle of θ = 90°. Achieving ε ≤ 0.5° requires M ≥ 4 with the SORA and M ≥ 143 with Euler angles (35.75 times more). For ε ≤ 0.1°, the SORA requires M ≥ 12 and Euler angles require M ≥ 745 (62.08 times more).
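To make the distinction in Figure 1 concrete, the following minimal C sketch (illustrative only, not taken from the paper or its repository) composes the three 45° rotations sequentially in the two axis orders and compares both results with the single rotation given by the simultaneous (SORA) interpretation, i.e., a rotation by 45°·√3 about the axis [1 1 1]/√3. The matrix helpers are generic; only the 45° values come from the figure caption.

```c
/* Sequential z-y-x and x-y-z compositions of three 45 degree rotations differ
 * from each other and from the single equivalent rotation of the simultaneous
 * interpretation (angle 45*sqrt(3) degrees about the axis [1 1 1]/sqrt(3)). */
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

typedef double Mat3[3][3];

/* Rodrigues formula: rotation matrix for a rotation by 'angle' radians
 * about the unit axis v. */
static void rot(const double v[3], double angle, Mat3 R)
{
    double c = cos(angle), s = sin(angle), t = 1.0 - c;
    R[0][0] = t*v[0]*v[0] + c;      R[0][1] = t*v[0]*v[1] - s*v[2]; R[0][2] = t*v[0]*v[2] + s*v[1];
    R[1][0] = t*v[0]*v[1] + s*v[2]; R[1][1] = t*v[1]*v[1] + c;      R[1][2] = t*v[1]*v[2] - s*v[0];
    R[2][0] = t*v[0]*v[2] - s*v[1]; R[2][1] = t*v[1]*v[2] + s*v[0]; R[2][2] = t*v[2]*v[2] + c;
}

static void mul(Mat3 A, Mat3 B, Mat3 C)   /* C = A * B */
{
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            C[i][j] = 0.0;
            for (int k = 0; k < 3; k++)
                C[i][j] += A[i][k] * B[k][j];
        }
}

static double max_diff(Mat3 A, Mat3 B)    /* largest element-wise difference */
{
    double d = 0.0;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            if (fabs(A[i][j] - B[i][j]) > d)
                d = fabs(A[i][j] - B[i][j]);
    return d;
}

int main(void)
{
    const double a = 45.0 * PI / 180.0;
    const double ex[3] = {1, 0, 0}, ey[3] = {0, 1, 0}, ez[3] = {0, 0, 1};
    Mat3 Rx, Ry, Rz, T, R_zyx, R_xyz, R_sim;

    rot(ex, a, Rx);
    rot(ey, a, Ry);
    rot(ez, a, Rz);

    /* Sequential interpretation: (a) first z, then y, then x;
     *                            (b) first x, then y, then z. */
    mul(Rx, Ry, T); mul(T, Rz, R_zyx);
    mul(Rz, Ry, T); mul(T, Rx, R_xyz);

    /* Simultaneous interpretation: one rotation by |[45 45 45]| = 45*sqrt(3)
     * degrees about the unit axis [1 1 1]/sqrt(3). */
    const double v[3] = {1.0 / sqrt(3.0), 1.0 / sqrt(3.0), 1.0 / sqrt(3.0)};
    rot(v, a * sqrt(3.0), R_sim);

    printf("z-y-x vs x-y-z        : %f\n", max_diff(R_zyx, R_xyz));
    printf("z-y-x vs simultaneous : %f\n", max_diff(R_zyx, R_sim));
    printf("x-y-z vs simultaneous : %f\n", max_diff(R_xyz, R_sim));
    return 0;
}
```

All three printed differences are non-zero for the 45° example, which is the non-commutativity effect shown in the figure.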
Abstract
1. Introduction
2. Effect of 3D Gyroscope Measurement Interpretation on the Computational Efficiency
2.1. Noiseless Measurements
2.2. Noisy Measurements
3. Effect of the Computational Method on the Computational Efficiency
3.1. 3DO Computation
3.1.1. Rotation Matrix
3.1.2. Rotation Quaternion
3.2. Computational Efficiency
3.2.1. Common Operations
3.2.2. Rotation Matrix
3.2.3. Rotation Quaternion
3.3. Discussion
4. Effect of 3DO Tracking Implementation on the Computational Efficiency
5. Experimental Validation
5.1. Experiment
- stepwise 3DO computation, in which we computed the 3DO at each consecutive measurement step, thus setting M = 1 and performing 10,000 3DO computations, and
- only the final 3DO computation, in which we combined the consecutive measurements into a single rotation composite and computed the final 3DO only once, thus setting M = 10,000 and performing a single 3DO computation (both scenarios are sketched below).
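The following C sketch (illustrative only, not the authors' benchmark code) contrasts the two scenarios for the rotation quaternion method. It assumes the simultaneous (SORA) interpretation, Hamilton quaternion conventions with body-frame increments applied by right multiplication, and that "computing the 3DO" means converting the orientation quaternion to a rotation matrix; the sampling period and gyroscope samples are placeholders.

```c
/* Stepwise vs. only-final 3DO computation with rotation quaternions. */
#include <math.h>
#include <stdio.h>

typedef struct { double w, x, y, z; } Quat;
typedef double Mat3[3][3];

/* Rotation quaternion for one gyro sample under the simultaneous (SORA)
 * interpretation: a rotation by |omega|*dt about the axis omega/|omega|. */
static Quat quat_from_gyro(const double omega[3], double dt)
{
    double n = sqrt(omega[0]*omega[0] + omega[1]*omega[1] + omega[2]*omega[2]);
    double phi = n * dt;                 /* rotation angle in this step */
    Quat q = {1.0, 0.0, 0.0, 0.0};       /* identity for a zero-rate sample */
    if (n > 0.0) {
        double k = sin(phi / 2.0) / n;
        q.w = cos(phi / 2.0);
        q.x = k * omega[0];
        q.y = k * omega[1];
        q.z = k * omega[2];
    }
    return q;
}

static Quat quat_mul(Quat a, Quat b)     /* Hamilton product a * b */
{
    Quat r;
    r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
    r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
    r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
    r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
    return r;
}

/* Convert a unit rotation quaternion to the corresponding rotation matrix. */
static void quat_to_matrix(Quat q, Mat3 R)
{
    R[0][0] = 1 - 2*(q.y*q.y + q.z*q.z); R[0][1] = 2*(q.x*q.y - q.w*q.z); R[0][2] = 2*(q.x*q.z + q.w*q.y);
    R[1][0] = 2*(q.x*q.y + q.w*q.z); R[1][1] = 1 - 2*(q.x*q.x + q.z*q.z); R[1][2] = 2*(q.y*q.z - q.w*q.x);
    R[2][0] = 2*(q.x*q.z - q.w*q.y); R[2][1] = 2*(q.y*q.z + q.w*q.x); R[2][2] = 1 - 2*(q.x*q.x + q.y*q.y);
}

/* Stepwise scenario: the 3DO is computed after every measurement step. */
static void track_stepwise(const double omega[][3], int N, double dt, Mat3 R)
{
    Quat q = {1.0, 0.0, 0.0, 0.0};
    for (int i = 0; i < N; i++) {
        q = quat_mul(q, quat_from_gyro(omega[i], dt));
        quat_to_matrix(q, R);            /* 3DO available after step i */
    }
}

/* Only-final scenario: the per-step rotations are first combined into a
 * single composite, and the 3DO is computed only once at the end. */
static void track_final_only(const double omega[][3], int N, double dt, Mat3 R)
{
    Quat q = {1.0, 0.0, 0.0, 0.0};
    for (int i = 0; i < N; i++)
        q = quat_mul(q, quat_from_gyro(omega[i], dt));
    quat_to_matrix(q, R);                /* single final 3DO computation */
}

int main(void)
{
    const double dt = 0.001;             /* illustrative 1 kHz sampling */
    const double omega[4][3] = {         /* illustrative rad/s samples  */
        {0.1, 0.0, 0.0}, {0.1, 0.2, 0.0}, {0.0, 0.2, 0.3}, {0.1, 0.2, 0.3}
    };
    Mat3 Ra, Rb;
    track_stepwise(omega, 4, dt, Ra);
    track_final_only(omega, 4, dt, Rb);
    printf("R[0][0] stepwise = %f, only final = %f\n", Ra[0][0], Rb[0][0]);
    return 0;
}
```

Both functions return the same final orientation; they differ only in how often the full 3DO is produced along the way.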
5.2. Results and Discussion
6. Conclusions
Supplementary Materials
Author Contributions
Funding
Conflicts of Interest
References
Interpretation | Computational Method | Computational Efficiency 1: (A) | (M) | (VN) | (F)
---|---|---|---|---|---
simultaneous 2 | rotation matrix | 30 M | 43 M | 1 M | 3 M
simultaneous 2 | rotation quaternion | 14 M + 39 | 23 M + 56 | 1 M | 3 M
sequential 3 | rotation matrix | 22 M | 42 M | 0 | 6 M
sequential 3 | rotation quaternion | 16 M + 39 | 31 M + 56 | 0 | 6 M
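As a worked reading of the table, tracking over M = 1000 measurement steps under the simultaneous interpretation requires 30,000 (A) and 43,000 (M) operations with the rotation matrix method, versus 14·1000 + 39 = 14,039 (A) and 23·1000 + 56 = 23,056 (M) operations with the rotation quaternion method; both require 1000 (VN) and 3000 (F) operations.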
Computational Scenario | Computational Method | Execution Time T [ms] | Gain Factor 3
---|---|---|---
Stepwise 1 | Rotation matrix | 304.2 | 1.85
Stepwise 1 | Rotation quaternion | 561.4 | 1.00
Only final 2 | Rotation matrix | 304.2 | 1.00
Only final 2 | Rotation quaternion | 173.7 | 1.75
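The gain factor values are consistent with the ratio of the slower to the faster execution time within each scenario: 561.4/304.2 ≈ 1.85 for the stepwise scenario and 304.2/173.7 ≈ 1.75 for the only-final scenario.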
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Stančin, S.; Tomažič, S. Computationally Efficient 3D Orientation Tracking Using Gyroscope Measurements. Sensors 2020, 20, 2240. https://doi.org/10.3390/s20082240