A UWB-Based Lighter-Than-Air Indoor Robot for User-Centered Interactive Applications
Figure 1. Illustration of the target application, where the UWB-based LAIDR acts as an indoor movie-making agent during a live show and captures specific actor shots by automatically following the actor's UWB position.
Figure 2. Detailed view of the LAIDR body and accompanying 3D-printed parts.
Figure 3. Steering controller: (a) block diagram; (b) performance.
Figure 4. Altitude controller: (a) block diagram; (b) performance.
Figure 5. Deployment of the UWB sensor network to track the LAIDR's pose and the user's position.
Figure 6. The LAIDR's pose computation using the UWB sensor network: (a) placement of the autopilot board, UWB tags, and propulsion units on the LAIDR in the local PX4 frame; (b) the LAIDR's heading in the UWB frame.
Figure 7. UWB-based LAIDR pose fusion with the PX4 Flight Stack: (a) physical connection between the companion computer and the autopilot board; (b) parameter path; (c) description of the "LPE_FUSION" value in QGC.
Figure 8. Data processing scheme using the ROS framework.
Figure 9. Photo of the experimental environment showing the UWB sensor network in the ENU frame, the LAIDR platform in the NED frame, the 3D laser sensor in the ENU frame, and the ground station.
Figure 10. Pre-assigned waypoint-following setup: (a) scenario layout; (b) the LAIDR during the experiment at the 1 s instant.
Figure 11. Ground-truth position computation using the 3D laser sensor's point cloud: (a) implementation steps; (b) visual output of each step.
Figure 12. Coordinate transformation of the ground-truth position.
Figure 13. User-centered waypoint-following setup: (a) scenario layout; (b) the LAIDR during the experiment with the user at the H2 location.
Figure 14. Concept illustration of user-intention recognition based on the user's UWB-based on-air drawn spatial pattern.
Figure 15. Sample dataset images created by Algorithm 2.
Figure 16. Fixed waypoint-following results: (a) 3D plot; (b) 2D plot; (c) 1D plot with respect to time.
Figure 17. User-centered waypoint-following results: (a) 3D plot; (b) 2D plot; (c) 1D plot with respect to time.
Figure 18. Confusion matrix of the CNN model trained on 8 classes.
Figure 19. Demonstration of the UWB-based LAIDR as an indoor movie-making agent during a live event (left: theater's main camera view; right: LAIDR's GoPro shot).
Abstract
1. Introduction
- We developed a data processing scheme over the ROS framework to accommodate the integration needs of the custom-built LAIDR [34] platform for the user-centered application. In addition, we incorporated a UWB sensor network to overcome the influence of indoor environmental uncertainties while tracking the LAIDR's pose and the user's position simultaneously;
- We proposed two interactions that enable user-centered autonomous following of the LAIDR anywhere inside the operational space, relying only on the user's 3D positional data acquired from a hand-held UWB sensor. The first interaction responsively updated the LAIDR's path to autonomously follow the user's footprint (2D position), while the second detected the user's intention by recognizing an on-air gesture (a spatial pattern) with a deep learning model.
2. System Architecture and Methods
2.1. Vehicle Platform and Control System
2.1.1. Specifications
2.1.2. Motions
2.1.3. Control System
2.2. Indoor Tracking System for Vehicle Pose and User Position
2.2.1. UWB Sensor Network and Data Acquisition
2.2.2. Coordinate Frames
- The UWB frame serves as the world frame;
- The LAIDR platform's pose has two representations, one in the UWB frame and one in the local PX4 frame. Section 2.2.3 details the LAIDR's UWB-based pose calculation, and Section 2.2.4 specifies the conversion of the UWB-based pose from the UWB frame to the local PX4 frame (a minimal axis-remapping sketch follows this list);
- The user's position is expressed in the UWB frame.
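The UWB sensor network reports positions in an ENU world frame, while the local PX4 frame is NED (see the experimental-environment figure). As an illustration of the kind of axis remapping that the frame conversion in Section 2.2.4 performs, the sketch below shows the standard ENU-to-NED swap; the function names and the heading convention here are assumptions, not the paper's exact implementation.

```python
import numpy as np

def enu_to_ned(p_enu):
    """Remap a position from an ENU world frame to a NED local frame.

    Standard ENU->NED axis swap: x_ned = y_enu, y_ned = x_enu, z_ned = -z_enu.
    Any additional translation/rotation between the UWB frame and the local
    PX4 frame is defined in Section 2.2.4 and is not modeled here.
    """
    x_e, y_e, z_e = p_enu
    return np.array([y_e, x_e, -z_e])

def heading_enu_to_ned(yaw_enu_rad):
    """Convert a heading measured CCW from ENU East to CW from NED North."""
    return np.pi / 2.0 - yaw_enu_rad

# Example: a UWB position expressed in the ENU world frame
print(enu_to_ned(np.array([6.9, -2.2, 3.75])))  # -> [-2.2, 6.9, -3.75]
```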
2.2.3. UWB-Based LAIDR Pose Calculation
2.2.4. Pose Fusion with PX4 Flight Stack
2.3. Data Processing Scheme
2.3.1. Off-Board
2.3.2. On-Board
3. Experimental Setup
3.1. UWB-Based LAIDR Pose Tracking and Fixed Waypoints Following
3.1.1. Scenario
3.1.2. Ground Truth Position Computation and Transformation
3.2. User-Centered Autonomous Following
3.2.1. Based on User Footprint
Algorithm 1 Follower node execution flow for user-centered waypoints following

Input: ROS topics "/mavros/local_position/pose" and "/uwb/user_position", and operation space area
Output: ROS topic "/mavros/setpoint_position/local"
Procedure:
  Start with the current user position as the target waypoint
  Subscribe to ROS topic "/uwb/user_position"
  Update "current user position" from ROS topic "/uwb/user_position"
  i = 2
  waypoint(i) = (current user position X-axis, current user position Y-axis)
  threshold = 4.0 m
  while (true)
    Subscribe to ROS topic "/mavros/local_position/pose"
    Update "current LAIDR position" from ROS topic "/mavros/local_position/pose"
    Subscribe to ROS topic "/uwb/user_position"
    Update "current user position" from ROS topic "/uwb/user_position"
    if (distance(current LAIDR position, waypoint(i)) ≤ threshold)
      Switch the target waypoint:
      if (i = 2)
        i = 1
        waypoint(i) = (current user position X-axis, operation space Y-axis upper limit)
        threshold = 0.6 m
      else
        i = 2
        waypoint(i) = (current user position X-axis, current user position Y-axis)
        threshold = 4.0 m
    Write waypoint(i) to ROS topic "/mavros/setpoint_position/local"
    Publish ROS topic "/mavros/setpoint_position/local"
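The follower node of Algorithm 1 can be prototyped as a small ROS node. The sketch below is a minimal rospy rendering of that flow; it assumes both "/uwb/user_position" and "/mavros/local_position/pose" carry geometry_msgs/PoseStamped messages, and the loop rate, the operation-space Y-axis upper limit constant, and the omission of the altitude setpoint are illustrative simplifications rather than the paper's exact implementation.

```python
#!/usr/bin/env python
# Minimal sketch of the Algorithm 1 follower node (ROS 1 / rospy assumed).
import math
import rospy
from geometry_msgs.msg import PoseStamped

Y_UPPER_LIMIT = 10.0          # operation-space Y-axis upper limit (placeholder value)
laidr_pose, user_pose = None, None

def laidr_cb(msg):            # /mavros/local_position/pose callback
    global laidr_pose
    laidr_pose = msg.pose.position

def user_cb(msg):             # /uwb/user_position callback
    global user_pose
    user_pose = msg.pose.position

def dist_xy(p, wp):
    return math.hypot(p.x - wp[0], p.y - wp[1])

if __name__ == "__main__":
    rospy.init_node("follower_node")
    rospy.Subscriber("/mavros/local_position/pose", PoseStamped, laidr_cb)
    rospy.Subscriber("/uwb/user_position", PoseStamped, user_cb)
    sp_pub = rospy.Publisher("/mavros/setpoint_position/local",
                             PoseStamped, queue_size=10)
    rate = rospy.Rate(20)

    i, waypoint, threshold = 2, None, 4.0
    while not rospy.is_shutdown():
        if laidr_pose is None or user_pose is None:
            rate.sleep()
            continue
        if waypoint is None:                       # first user fix becomes the target
            waypoint = (user_pose.x, user_pose.y)
        if dist_xy(laidr_pose, waypoint) <= threshold:
            if i == 2:                             # switch to the far-side waypoint
                i, threshold = 1, 0.6
                waypoint = (user_pose.x, Y_UPPER_LIMIT)
            else:                                  # switch back toward the user
                i, threshold = 2, 4.0
                waypoint = (user_pose.x, user_pose.y)
        sp = PoseStamped()                         # altitude setpoint omitted here
        sp.header.stamp = rospy.Time.now()
        sp.pose.position.x, sp.pose.position.y = waypoint
        sp_pub.publish(sp)
        rate.sleep()
```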
3.2.2. Based on User Intention
Algorithm 2 Pre-processing execution flow

Input: ROS topic "/uwb/user_position"
Output: ROS topic "/user_intention/cnn_input_image"
Procedure:
  Initialize arrays "gesture[]" and "fgesture[]"
  while (true)
    1. UWB pattern registration:
      Subscribe to ROS topic "/uwb/user_position"
      Update "3DgestureXYZ" from ROS topic "/uwb/user_position"
      if (button press == 1)
        Append "gesture[]" by writing "3DgestureXYZ"
      Apply a moving-average filter to smooth the stored array "gesture[]"
      Update "fgesture[]" after filtering
    2. Rotation offset:
      Pick the first and last points in the XY-plane from "fgesture[]"
      Get the "rotation angle" by applying the inverse tangent to the two points
    3. Input image:
      Calibrate the XZ-axes points from "fgesture[]" against the "rotation angle" offset around the Z-axis
      Normalize the calibrated points as the "input image" between the min. and max. limits of the XZ-axes
      Write the normalized image to ROS topic "/user_intention/cnn_input_image"
      Publish ROS topic "/user_intention/cnn_input_image"
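A compact NumPy sketch of the filtering, rotation-offset, and image-normalization steps of Algorithm 2 is given below. It assumes the UWB pattern registration step has already accumulated the gesture samples into an array; the moving-average window and the output image size are illustrative assumptions, not the paper's exact settings.

```python
# Illustrative pre-processing of a recorded UWB gesture (Algorithm 2, steps 1-3).
import numpy as np

def preprocess_gesture(gesture_xyz, window=5, img_h=288, img_w=432):
    """gesture_xyz: (N, 3) array of UWB positions recorded while the user's
    button is pressed. Returns a binary image of the drawn pattern."""
    # 1. Moving-average filter to smooth the raw UWB trajectory.
    kernel = np.ones(window) / window
    fgesture = np.column_stack(
        [np.convolve(gesture_xyz[:, k], kernel, mode="same") for k in range(3)])

    # 2. Rotation offset: angle of the first-to-last point vector in the XY-plane.
    dx, dy = fgesture[-1, :2] - fgesture[0, :2]
    yaw = np.arctan2(dy, dx)

    # 3. Rotate about the Z-axis to cancel the offset, keep the XZ drawing plane.
    c, s = np.cos(-yaw), np.sin(-yaw)
    rot_z = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    xz = (fgesture @ rot_z.T)[:, [0, 2]]

    # Normalize to pixel coordinates and rasterize into the CNN input image.
    xz = (xz - xz.min(axis=0)) / (np.ptp(xz, axis=0) + 1e-9)
    img = np.zeros((img_h, img_w), dtype=np.uint8)
    cols = (xz[:, 0] * (img_w - 1)).astype(int)
    rows = ((1.0 - xz[:, 1]) * (img_h - 1)).astype(int)  # flip so +Z points up
    img[rows, cols] = 255
    return img
```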
4. Experimental Results
4.1. UWB-Based LAIDR Pose Tracking and Fixed Waypoints Following
4.2. User-Centered Autonomous Following
4.2.1. Based on User Footprint
- From the start to 30 s, when the user stays at the H1 position (static mode);
- From 30 s to 45 s, when the user switches their position to H2 (switch mode);
- From 45 s to 87 s, when the user holds the H2 position (static mode);
- From 87 s to 102 s, when the user switches their position to H3 (switch mode);
- From 102 s to the end, when the user maintains the H3 position (static mode).
4.2.2. Based on User Intention
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Debeunne, C.; Vivet, D. A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping. Sensors 2020, 20, 2068.
- Bachmann, E.R.; Yun, X.; Peterson, C.W. An investigation of the effects of magnetic variations on inertial/magnetic orientation sensors. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), New Orleans, LA, USA, 26 April–1 May 2004; pp. 1115–1122.
- Gozick, B.; Subbu, K.P.; Dantu, R.; Maeshiro, T. Magnetic maps for indoor navigation. IEEE Trans. Instrum. Meas. 2011, 60, 3883–3891.
- Niloy, A.; Shama, A.; Chakrabortty, R.K.; Ryan, M.J.; Badal, F.R.; Tasneem, Z.; Ahamed, M.H.; Moyeen, S.I.; Das, S.K.; Ali, M.F. Critical design and control issues of indoor autonomous mobile robots: A review. IEEE Access 2021, 9, 35338–35370.
- Szafir, D.; Mutlu, B.; Fong, T. Designing planning and control interfaces to support user collaboration with flying robots. Int. J. Robot. Res. 2017, 36, 514–542.
- Galvane, Q.; Lino, C.; Christie, M.; Fleureau, J.; Servant, F.; Tariolle, F.O.-L.; Guillotel, P. Directing cinematographic drones. ACM Trans. Graph. 2018, 37, 1–18.
- Lioulemes, A.; Galatas, G.; Metsis, V.; Mariottini, G.L.; Makedon, F. Safety challenges in using AR.Drone to collaborate with humans in indoor environments. In Proceedings of the 7th International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 27–30 May 2014; pp. 1–4.
- Hwang, M.H.; Cha, H.R.; Jung, S.Y. Practical Endurance Estimation for Minimizing Energy Consumption of Multirotor Unmanned Aerial Vehicles. Energies 2018, 11, 2221.
- Liew, C.F.; Yairi, T. Quadrotor or blimp? Noise and appearance considerations in designing social aerial robot. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 183–184.
- Gorjup, G.; Liarokapis, M. A Low-Cost, Open-Source, Robotic Airship for Education and Research. IEEE Access 2020, 8, 70713–70721.
- Cho, S.; Mishra, V.; Tao, Q.; Varnell, P.; King-Smith, M.; Muni, A.; Smallwood, W.; Zhang, F. Autopilot design for a class of miniature autonomous blimps. In Proceedings of the 2017 IEEE Conference on Control Technology and Applications (CCTA), Maui, HI, USA, 27–30 August 2017; pp. 841–846.
- Burri, M.; Gasser, L.; Käch, M.; Krebs, M.; Laube, S.; Ledergerber, A.; Meier, D.; Michaud, R.; Mosimann, L.; Müri, L. Design and control of a spherical omnidirectional blimp. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1873–1879.
- St-Onge, D.; Breches, P.Y.; Sharf, I.; Reeves, N.; Rekleitis, I.; Abouzakhm, P.; Girdhar, Y.; Harmat, A.; Dudek, G.; Giguere, P. Control, localization and human interaction with an autonomous lighter-than-air performer. Robot. Auton. Syst. 2017, 88, 165–186.
- Yao, N.; Anaya, E.; Tao, Q.; Cho, S.; Zheng, H.; Zhang, F. Monocular vision-based human following on miniature robotic blimp. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3244–3249.
- Yao, N.S.; Tao, Q.Y.; Liu, W.Y.; Liu, Z.; Tian, Y.; Wang, P.Y.; Li, T.; Zhang, F.M. Autonomous flying blimp interaction with human in an indoor space. Front. Inf. Technol. Electron. Eng. 2019, 20, 45–59.
- Probine, C.; Gorjup, G.; Buzzatto, J.; Liarokapis, M. A Shared Control Teleoperation Framework for Robotic Airships: Combining Intuitive Interfaces and an Autonomous Landing System. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Melbourne, Australia, 17–20 October 2021.
- St-Onge, D.; Gosselin, C.; Reeves, N. Dynamic modelling and control of a cubic flying blimp using external motion capture. Proc. Inst. Mech. Eng. Part I J. Syst. Control Eng. 2015, 229, 970–982.
- Galvane, Q.; Fleureau, J.; Tariolle, F.-L.; Guillotel, P. Automated cinematography with unmanned aerial vehicles. arXiv 2017, arXiv:1712.04353.
- Leong, X.W.; Hesse, H. Vision-Based Navigation for Control of Micro Aerial Vehicles. In IRC-SET 2018; Springer: Berlin/Heidelberg, Germany, 2019; pp. 413–427.
- Huh, S.; Shim, D.H.; Kim, J. Integrated navigation system using camera and gimbaled laser scanner for indoor and outdoor autonomous flight of UAVs. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 3158–3163.
- Engel, J.; Sturm, J.; Cremers, D. Camera-based navigation of a low-cost quadrocopter. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 2815–2821.
- Shen, S.; Michael, N.; Kumar, V. Autonomous multi-floor indoor navigation with a computationally constrained MAV. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 20–25.
- Seguin, L.; Zheng, J.; Li, A.; Tao, Q.; Zhang, F. A deep learning approach to localization for navigation on a miniature autonomous blimp. In Proceedings of the 2020 IEEE 16th International Conference on Control & Automation (ICCA), Singapore, 9–11 October 2020; pp. 1130–1136.
- Kumar, G.A.; Patil, A.K.; Patil, R.; Park, S.S.; Chai, Y.H. A LiDAR and IMU Integrated Indoor Navigation System for UAVs and Its Application in Real-Time Pipeline Classification. Sensors 2017, 17, 1268.
- Li, Y.; Zahran, S.; Zhuang, Y.; Gao, Z.; Luo, Y.; He, Z.; Pei, L.; Chen, R.; El-Sheimy, N. IMU/Magnetometer/Barometer/Mass-Flow Sensor Integrated Indoor Quadrotor UAV Localization with Robust Velocity Updates. Remote Sens. 2019, 11, 838.
- Macoir, N.; Bauwens, J.; Jooris, B.; Van Herbruggen, B.; Rossey, J.; Hoebeke, J.; De Poorter, E. UWB Localization with Battery-Powered Wireless Backbone for Drone-Based Inventory Management. Sensors 2019, 19, 467.
- Alarifi, A.; Al-Salman, A.; Alsaleh, M.; Alnafessah, A.; Al-Hadhrami, S.; Al-Ammar, M.A.; Al-Khalifa, H.S. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances. Sensors 2016, 16, 707.
- Liu, X.; Zhang, S.; Tian, J.; Liu, L. An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad. Sensors 2019, 19, 4703.
- PX4 Basic Concepts. Available online: https://docs.px4.io/master/en/getting_started/px4_basic_concepts.html (accessed on 17 September 2021).
- Standard PX4 Flight Stack. Available online: https://docs.px4.io/master/en/concept/architecture.html (accessed on 17 September 2021).
- Meier, L.; Honegger, D.; Pollefeys, M. PX4: A node-based multithreaded open source robotics framework for deeply embedded platforms. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 25–30 May 2015; pp. 6235–6240.
- MATLAB/Simulink PX4 Supported Toolbox. Available online: https://kr.mathworks.com/hardware-support/px4-autopilots.html (accessed on 17 September 2021).
- Elsharkawy, A.; Naheem, K.; Koo, D.; Kim, M.S. A UWB-Driven Self-Actuated Projector Platform for Interactive Augmented Reality Applications. Appl. Sci. 2021, 11, 2871.
- Naheem, K.; Elsharkawy, A.; Kim, M.S. Tracking Feasibility of UWB Positioning System for Lighter-than-Air Indoor Robot Navigation. In Proceedings of the 2021 IEEE 21st International Conference on Control, Automation and Systems (ICCAS), Jeju, Korea, 12–15 October 2021; pp. 2109–2111.
- Elsharkawy, A.; Naheem, K.; Lee, Y.; Koo, D.; Kim, M.S. LAIDR: A Robotics Research Platform for Entertainment Applications. In Proceedings of the 2020 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 1–4 September 2020; pp. 534–539.
- Wan, C.; Kingry, N.; Dai, R. Design and autonomous control of a solar-power blimp. In Proceedings of the 2018 AIAA Guidance, Navigation, and Control Conference, Kissimmee, FL, USA, 8–12 January 2018; p. 1588.
- Ciholas-DWUSB-KIT. Available online: https://cuwb.io/products/dwusb/ (accessed on 16 February 2022).
- CUWB 2.0 (Archimedes). Available online: https://cuwb.io/products/archimedes-product/ (accessed on 17 September 2021).
- Using Vision or Motion Capture Systems for Position Estimation. Available online: https://dev.px4.io/v1.9.0_noredirect/en/ros/external_position_estimation.html (accessed on 17 September 2021).
- Switching State Estimators. Available online: https://docs.px4.io/master/en/advanced/switching_state_estimators.html (accessed on 17 September 2021).
- QGroundControl. Available online: http://qgroundcontrol.com/ (accessed on 17 September 2021).
- Geometry_msgs/PoseStamped Message. Available online: http://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/PoseStamped.html (accessed on 17 September 2021).
- 3D Laser Sensor. Available online: http://soslab.co/SL (accessed on 17 September 2021).
- Anaconda. Available online: https://www.anaconda.com/ (accessed on 17 September 2021).
- MAVROS. Available online: http://wiki.ros.org/mavros (accessed on 17 September 2021).
- Penna, M.A.; Dines, K.A. A simple method for fitting sphere-like surfaces. IEEE Trans. Pattern Anal. Mach. Intell. 2007, 29, 1673–1678.
- Demonstration Video of User-Centered Autonomous Following Based on User Footprint. Available online: https://youtu.be/dXe_IMDaFYI (accessed on 23 December 2021).
- Demonstration Video of LAIDR as Indoor Movie Making Agent during Live Event. Available online: https://youtu.be/JguZFh_rcs0 (accessed on 23 December 2021).
# | Item | Component Detail | Function and Quantity | Payload
---|---|---|---|---
1 | Hull | PVC with 1.12 mm thickness | Spherical shape of 1.95 m diameter ×1 | 2.167 kg
2 | Propulsion units | – | Motor driver ×4; BLDC motor ×4; Servo motor ×4 | 0.460 kg
3 | Control box | – | Autopilot board ×1; Companion PC ×1; LED music sync. over WiFi ×1; Servo motors controller ×1; Power source ×1; Power regulator ×1 | 0.612 kg
4 | Positioning sensor | Ciholas DWUSB | UWB tag ×2 | 0.030 kg
5 | Video streaming system | – | Camera ×1; HDMI wireless video transmitter ×1; Gimbal ×1 | 0.388 kg
6 | Cables/wiring | 0.88 mm wires, MOLEX con. | Transfer power and control signals | 0.176 kg
Value | M | U1 | U2 | U3 | U4 | U5 | U6 |
---|---|---|---|---|---|---|---|
X [m] | 0.0 | 6.9 | 20.9 | 20.9 | 6.9 | 20.9 | 20.9 |
Y [m] | 0.0 | −2.2 | −2.2 | 8.8 | 8.8 | −3.7 | 10.3 |
Z [m] | 3.75 | 3.75 | 6.28 | 6.28 | 3.75 | 3.75 | 3.75 |
Structure | Input | Filter | Depth | Stride | Output |
---|---|---|---|---|---|
Convolution + ReLU | 288 × 432 × 3 | 3 × 3 | 32 | 1 × 1 | 286 × 430 × 32 |
Max Pooling | 286 × 430 × 32 | 2 × 2 | - | - | 143 × 215 × 32 |
Convolution + ReLU | 143 × 215 × 32 | 3 × 3 | 32 | 1 × 1 | 141 × 213 × 32 |
Max Pooling | 141 × 213 × 32 | 2 × 2 | - | - | 70 × 106 × 32 |
Convolution + ReLU | 70 × 106 × 32 | 3 × 3 | 32 | 1 × 1 | 68 × 104 × 32 |
Max Pooling | 68 × 104 × 32 | 2 × 2 | - | - | 34 × 52 × 32 |
Fully Connected + BN | 34 × 52 × 32 | - | - | - | 256 |
Softmax | 256 | - | - | - | 10 |
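The layer dimensions above map directly onto a small sequential CNN. A Keras-style sketch is shown below for reference; the optimizer, loss, and training hyperparameters are assumptions, as they are not specified in the table.

```python
# Sketch of the CNN in the table above: three Conv+ReLU / MaxPool stages,
# a 256-unit fully connected layer with batch normalization, and a softmax head.
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(288, 432, 3)),
    layers.Conv2D(32, (3, 3), activation="relu"),   # -> 286 x 430 x 32
    layers.MaxPooling2D((2, 2)),                    # -> 143 x 215 x 32
    layers.Conv2D(32, (3, 3), activation="relu"),   # -> 141 x 213 x 32
    layers.MaxPooling2D((2, 2)),                    # ->  70 x 106 x 32
    layers.Conv2D(32, (3, 3), activation="relu"),   # ->  68 x 104 x 32
    layers.MaxPooling2D((2, 2)),                    # ->  34 x  52 x 32
    layers.Flatten(),
    layers.Dense(256),                              # fully connected
    layers.BatchNormalization(),                    # + BN
    layers.Dense(10, activation="softmax"),         # 10 output units as tabulated
])
# Optimizer and loss are illustrative choices, not taken from the paper.
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```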
Value | RMSE |
---|---|
X [m] | 0.08 |
Y [m] | 0.06 |
Z [m] | 0.05 |
2D [m] | 0.10 |
3D [m] | 0.11 |
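For reference, the per-axis, 2D, and 3D RMSE values above can be computed from time-aligned UWB estimates and laser ground-truth positions as in the short NumPy sketch below; the array names are hypothetical.

```python
# Illustrative RMSE computation between UWB estimates and ground truth.
import numpy as np

def rmse_report(est_xyz, gt_xyz):
    """est_xyz, gt_xyz: (N, 3) arrays of matched positions in metres."""
    err = est_xyz - gt_xyz
    rmse_axis = np.sqrt(np.mean(err ** 2, axis=0))               # X, Y, Z
    rmse_2d = np.sqrt(np.mean(np.sum(err[:, :2] ** 2, axis=1)))  # horizontal
    rmse_3d = np.sqrt(np.mean(np.sum(err ** 2, axis=1)))         # full 3D
    return rmse_axis, rmse_2d, rmse_3d

# Example with synthetic data (real inputs would be the logged trajectories)
rng = np.random.default_rng(0)
gt = rng.uniform(0.0, 10.0, size=(500, 3))
est = gt + rng.normal(0.0, 0.07, size=gt.shape)
print(rmse_report(est, gt))
```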
Class | Precision (%) | Recall (%) | F1-Score (%) | Support |
---|---|---|---|---|
L | 100 | 96 | 98 | 23 |
I | 96 | 100 | 98 | 23 |
N | 100 | 100 | 100 | 24 |
C | 100 | 100 | 100 | 23 |
S | 100 | 100 | 100 | 23 |
Circle | 100 | 100 | 100 | 24 |
Square | 100 | 100 | 100 | 24 |
Triangle | 100 | 100 | 100 | 24 |
Mean | 99.5 | 99.5 | 99.5 | Σ = 188 |
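The per-class precision, recall, and F1-scores above, together with the confusion matrix, correspond to a standard classification report over the 188 held-out test gestures. A scikit-learn sketch is given below; the placeholder labels stand in for the model's actual predictions and are not the paper's data.

```python
# Sketch of producing the per-class metrics table and confusion matrix.
import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

classes = ["L", "I", "N", "C", "S", "Circle", "Square", "Triangle"]
rng = np.random.default_rng(0)
y_true = rng.integers(0, len(classes), size=188)  # placeholder test labels
y_pred = y_true.copy()                            # stand-in for CNN predictions
print(classification_report(y_true, y_pred, target_names=classes, digits=3))
print(confusion_matrix(y_true, y_pred))
```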
Publisher's Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).