WO2021189507A1 - Rotor unmanned aerial vehicle system for vehicle detection and tracking, and detection and tracking method - Google Patents
Rotor unmanned aerial vehicle system for vehicle detection and tracking, and detection and tracking method
- Publication number: WO2021189507A1 (application PCT/CN2020/082257)
- Authority: WIPO (PCT)
- Prior art keywords: target, tracking, detection, vehicle, algorithm
- Prior art date: 2020-03-24
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/48—Matching video sequences
Definitions
- the invention relates to the field of rotary wing unmanned aerial vehicles, in particular to a rotary wing unmanned aerial vehicle system used for vehicle detection and tracking and a detection and tracking method.
- rotary-wing drones have many advantages, such as freedom from site restrictions, fixed-point hovering, slow flight, flight in confined spaces, and vertical take-off and landing, making them widely applicable in aerial photography, agricultural plant protection, micro aerial photography, express transportation, disaster rescue, surveillance of infectious diseases, surveying and mapping, news reporting, power inspection, disaster relief, film and television shooting, and other fields.
- artificial intelligence technology has developed rapidly, and the combination of drones and artificial intelligence technology has become a new research focus.
- target detection and target tracking technology based on deep learning endows drones with "intelligence", giving them a wider information search area, further improving their ability to interpret and analyze local micro-information, and further improving the accuracy with which they perceive and measure the surrounding environment.
- the empowerment of UAVs by artificial intelligence technology has added a pair of keen "eyes" to UAVs, enabling them to fly autonomously and perform higher-level tasks.
- mainstream target detection algorithms for rotary-wing drones use the YOLOv3 neural network architecture model.
- the ground station has strong computing power and can use large networks for object detection.
- the network model structure of YOLOv3 is mainly composed of 75 convolutional layers. Without a fully connected layer, the network can handle input images of any size.
- no pooling layer is used; instead, the stride of the convolutional layer is set to 2 to achieve down-sampling, while passing scale-invariant features to the next layer.
- YOLOv3 also uses structures similar to the ResNet and FPN networks, both of which are of great benefit to detection accuracy. This network performs well for motor vehicle detection from a drone's perspective. However, for such a large network model, computing the above network structure is time-consuming and real-time performance is poor; it requires strong computing power and cannot be deployed on the end side.
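- As a minimal sketch of the down-sampling idea just described (the channel sizes are illustrative assumptions, not YOLOv3's actual configuration), a stride-2 convolution can replace a pooling layer:

```python
import torch.nn as nn

# Sketch of the down-sampling scheme described above: a stride-2 convolution
# replaces a pooling layer, halving spatial resolution while still learning
# features. Channel sizes (64 -> 128) are illustrative.
downsample = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),  # 2x spatial down-sampling
    nn.BatchNorm2d(128),
    nn.LeakyReLU(0.1),
)
```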
- the mainstream tracking algorithm is KCF (Kernelized Correlation Filter), which builds on CSK's detection-based tracking loop structure.
- KCF converts each channel of the HOG feature descriptor into a circulant matrix through cyclic shifts.
- the circulant matrix can be diagonalized by the discrete Fourier transform (DFT), so matrix calculations, especially matrix inversion, can be processed efficiently in the Fourier domain.
- a kernel function is applied in the KCF tracker to improve tracking performance, mapping the regression function f(z) into a nonlinear space.
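- As a rough illustration (not taken from the patent), the sketch below shows the linear, single-channel case of this idea in NumPy: with a circulant data matrix built from cyclic shifts of a base sample, the ridge-regression filter is obtained with element-wise Fourier-domain operations instead of an explicit matrix inversion; the variable names are illustrative.

```python
import numpy as np

def ridge_circulant(x, y, lam=1e-2):
    """Linear ridge regression with a circulant data matrix built from
    cyclic shifts of x, solved element-wise in the Fourier domain
    (the trick that lets KCF avoid explicit matrix inversion)."""
    x_hat = np.fft.fft(x)          # DFT of the base sample
    y_hat = np.fft.fft(y)          # DFT of the desired response
    w_hat = np.conj(x_hat) * y_hat / (np.abs(x_hat) ** 2 + lam)
    return np.real(np.fft.ifft(w_hat))

x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])   # toy 1-D base sample
y = np.array([0.1, 0.5, 1.0, 0.5, 0.1])   # toy target response
w = ridge_circulant(x, y)                 # filter coefficients
```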
- although using the ground station for calculation and inference ensures sufficient computing resources, the time consumed by large-scale data transmission is unacceptable, the uncertainty of the wireless transmission network introduces additional delay, and fast inference at the ground station comes at a high cost.
- the real-time nature of flight control is a necessary factor for UAV safety. If the delay is high, it will inevitably affect the detection and tracking effect and flight safety.
- the KCF algorithm is affected by uncertainties in the external environment, such as illumination changes and occlusion, which can cause the tracked target to be lost; because the target cannot be re-localized after it is lost, tracking ultimately fails.
- the present invention proposes a rotary wing UAV system for vehicle detection and tracking.
- the UAV end-side computation scheme is adopted, and the ground station is used only for tracking-video monitoring and for issuing manual flight control commands, which effectively improves the timeliness of the system.
- the YOLO Nano target detection algorithm solves the problem of heavy computing load on the onboard computer; the Staple-based tracking algorithm together with a target re-detection module overcomes the KCF algorithm's susceptibility to the external environment and its unstable tracking.
- a rotary-wing UAV system for vehicle detection and tracking includes a UAV platform and a ground station platform; adopting the UAV end-side computing solution greatly reduces system delay and ensures real-time detection and tracking of motor vehicle targets.
- the UAV platform is used for real-time calculation and detection of tracking targets;
- the ground station platform is used for tracking and video monitoring of the targets, and issuing manual flight control instructions to the UAV platform.
- the unmanned aerial vehicle platform includes a visible light camera, an onboard computer, a first wireless image transmission terminal, and a flight control module, wherein the onboard computer is connected to the visible light camera, the first wireless image transmission terminal, and the flight control module respectively; the ground station platform includes a PC and a second wireless image transmission terminal, which exchange information; the first wireless image transmission terminal and the second wireless image transmission terminal exchange information;
- the visible light camera is used to collect image data
- the onboard computer is used to run a target detection algorithm and a target tracking algorithm
- the first wireless image transmission terminal is used to transmit a real-time video stream of target tracking and receive manual flight control instructions from a ground station;
- the PC is used for real-time video stream monitoring of target tracking and manual flight control command issuance;
- the second wireless image transmission terminal is used to receive a real-time video stream of target tracking and send manual flight control instructions.
- the target detection algorithm run by the onboard computer adopts the YOLO Nano algorithm.
- the target tracking algorithm run by the onboard computer adopts the Staple tracking algorithm.
- the on-board computer also includes a target re-detection module, which determines whether the target is occluded according to the correlation value between the test sample and the training sample of the Staple tracking algorithm; a threshold is set for the correlation value, and occlusion is judged to be present if the value falls below it. If there is occlusion, the predicted value of the target is copied to the measured value, and the measured value is corrected to obtain the estimated value of the target position.
- the onboard computer is used to deploy the Ubuntu ROS operating system, which includes a camera node, a target detection node, a target tracking node, and a flight control node; the camera node is used to collect image data, the target detection node is used to locate all vehicles, the target tracking node is used to track the target vehicle, and the flight control node is used to control the flight of the rotary-wing UAV.
- the UAV platform calculates and detects and tracks the target in real time; the ground station platform sends flight control instructions to the UAV through wireless communication to control the flight of the aircraft.
- the method specifically includes the following steps:
- the visible light camera collects image data and publishes image topics through the camera node of the onboard computer;
- the target detection node subscribes to the image topic and uses it as the input of the target detection node.
- the onboard computer calculates the vehicle coordinate information according to the target detection algorithm and publishes the topic of vehicle coordinate information;
- the target tracking node subscribes to the topic of vehicle coordinate information, and the onboard computer predicts the location of the target vehicle according to the target tracking algorithm, and publishes the topic of the target location;
- the flight control node subscribes to the target location topic, performs coordinate conversion, calculates the distance between the target and the UAV, and sends flight control instructions to the flight control module accordingly;
- the flight control module executes instructions to control the movement of the drone.
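- A minimal sketch of how one such node might look under ROS Kinetic with rospy is given below; the topic names, message types, and the run_detector helper are hypothetical illustrations, not taken from the patent:

```python
#!/usr/bin/env python
# Hypothetical sketch of the target detection node's topic interface under
# ROS Kinetic (rospy). Topic names, message types, and run_detector are
# illustrative stand-ins, not taken from the patent.
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import PointStamped

def run_detector(img_msg):
    """Stub standing in for YOLO Nano inference; returns pixel coordinates."""
    return 0.0, 0.0

class DetectionNode(object):
    def __init__(self):
        rospy.init_node('target_detection_node')
        # Publishes the detected vehicle's pixel coordinates.
        self.coord_pub = rospy.Publisher('/vehicle_coords', PointStamped, queue_size=1)
        # Subscribes to the image topic published by the camera node.
        rospy.Subscriber('/camera/image_raw', Image, self.on_image, queue_size=1)

    def on_image(self, msg):
        x, y = run_detector(msg)
        out = PointStamped()
        out.header = msg.header
        out.point.x, out.point.y = x, y
        self.coord_pub.publish(out)

if __name__ == '__main__':
    DetectionNode()
    rospy.spin()
```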
- the target detection algorithm adopts the YOLO Nano algorithm.
- the use of this compact network architecture greatly reduces the size of the model while preserving detection accuracy, so that end-side computation time meets the requirements and matches the computing power of the onboard computer.
- the target tracking algorithm specifically includes the following steps:
- the re-detection algorithm based on the Kalman filter can quickly search for the target again after it is lost, ensuring that the UAV tracks the vehicle target stably over the long term in complex environments.
- FIG. 1 is a block diagram of the node design of the airborne computer ROS system of the present invention
- FIG. 2 is a block diagram of the system hardware structure of the present invention.
- FIG. 3 is a block diagram of the system software structure of the present invention.
- Figure 4 is a flow chart of the improved tracking algorithm of the present invention.
- the present invention proposes a rotary wing unmanned aerial vehicle system for vehicle detection and tracking, aiming at practical application scenarios, that is, automatic detection and tracking of motor vehicles.
- the rotary-wing drone obtains a real-time video stream of the ground through its mounted visible light camera, transmits the video stream to the ground workstation in real time, and detects vehicle targets in the image through the onboard target detection algorithm.
- the ground station manually selects the vehicle targets to be tracked in the video.
- the airborne target tracking algorithm is activated, and the rotary-wing UAV will automatically fly to track the selected target on the ground.
- the airborne computing device, an NVIDIA Jetson TX2, runs the Ubuntu ROS system, and the camera node, target detection node, target tracking node, and flight control node are deployed and integrated on the ROS system.
- FIG. 1 is a block diagram of the node design of the airborne computer ROS system of the present invention.
- the workflow of the UAV platform includes the following steps:
- the visible light camera collects image data and publishes image topics through the camera node of the onboard computer;
- the target detection node subscribes to the image topic and uses it as the input of the target detection node.
- the onboard computer calculates the target coordinate information according to the target detection algorithm and publishes the target coordinate information topic;
- the target tracking node subscribes to the topic of coordinate information, and the onboard computer predicts the target location according to the target tracking algorithm, and publishes the target location topic;
- the flight control node subscribes to the target location topic, performs coordinate conversion, calculates the distance between the target and the aircraft, and sends flight control instructions to the flight control module accordingly;
- the flight control module executes instructions to control the movement of the drone.
- FIG. 2 is a block diagram of the hardware structure of the system of the present invention. The present invention is applied to the detection and tracking of ground motor vehicles by a rotary-wing drone.
- the drone payload includes a visible light camera, a TX2 onboard computer, and a wireless image transmission module; Ubuntu is deployed on the TX2 onboard computer. The camera is a gimbal camera with a self-stabilization function, which shoots 1080P video at an acquisition rate of 30 FPS.
- the gimbal camera is fixed under the drone and shoots the ground at a fixed angle.
- the algorithm processing unit is the TX2 onboard computer, with the Ubuntu 16.04 operating system installed and ROS Kinetic.
- Figure 3 is a block diagram of the system software structure of the present invention. Compared with performing inference on the ground station, the computing power of the TX2 onboard computer is greatly reduced; therefore, the deep-learning-based target detection algorithm needs to be adjusted to the available computing power.
- the traditional YOLOv3 target detection model is 240 MB in size; its computational complexity is too great, and it is no longer suitable for edge devices, so the original network needs to be pruned.
- the size of YOLO Nano is only about 4.0 MB, 15.1 times and 8.3 times smaller than Tiny YOLOv2 and Tiny YOLOv3, respectively. It requires 4.57B inference operations, 34% and 17% fewer than those two networks.
- the network structure mainly includes the residual PEP macro-architecture and the fully connected attention macro-architecture (FCA).
- PEP(num) consists of four layers, where num is the lower projection dimension: a 1×1 convolutional projection layer that maps the input feature map to a lower-dimensional tensor; a 1×1 convolutional expansion layer that expands the channels of the feature map to a higher dimension; a depth-wise convolution layer that performs spatial convolution on the expansion layer's output channels with separate filters per channel; and a 1×1 convolutional projection layer that maps the output channels of the previous layer back to a lower dimension.
- the first step fuses features across channels; the second step increases the feature dimension so that the third step has more channel features available for spatial feature fusion (improving the abstraction and characterization capability of the features); the third step is the depth-wise convolution (spatial convolution); the fourth step is the point-wise convolution (channel convolution), which reduces the channel count after the convolution and thus avoids a huge amount of computation. The last two steps together form a depth-wise separable convolution, which reduces computational complexity while guaranteeing the characterization ability of the model.
- the use of the residual PEP macro-architecture can significantly reduce the complexity of architecture and calculations, while ensuring the characterization ability of the model.
- the FCA macro architecture consists of two fully connected layers, which can learn the dynamic and non-linear internal dependencies between channels, and re-weight the importance of the channels through channel-level multiplication.
- the use of FCA helps the network focus on more informative features based on global information, because it dynamically recalibrates the features. This makes more effective use of the capacity of the neural network, that is, expressing as much important information as possible with limited parameters. Therefore, this module achieves a better trade-off among pruning the model architecture, reducing model complexity, and increasing model expressiveness.
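- The sketch below gives a rough PyTorch rendering of the PEP and FCA ideas described above; the layer sizes, the reduction ratio, and the omission of normalization and activation layers inside PEP are simplifying assumptions, not the exact YOLO Nano configuration:

```python
import torch
import torch.nn as nn

class PEP(nn.Module):
    """Sketch of a PEP(num) block: project -> expand -> depth-wise -> project,
    with a residual connection when shapes allow. Normalization and
    activation layers are omitted for brevity."""
    def __init__(self, in_ch, out_ch, proj_ch, expand_ch):
        super(PEP, self).__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, proj_ch, 1),            # 1x1 projection to lower dim (num)
            nn.Conv2d(proj_ch, expand_ch, 1),        # 1x1 expansion to higher dim
            nn.Conv2d(expand_ch, expand_ch, 3,
                      padding=1, groups=expand_ch),  # depth-wise spatial convolution
            nn.Conv2d(expand_ch, out_ch, 1),         # 1x1 point-wise projection
        )

    def forward(self, x):
        y = self.block(x)
        return x + y if x.shape == y.shape else y

class FCA(nn.Module):
    """Sketch of the FCA block: two fully connected layers learn channel
    dependencies and re-weight channels by channel-wise multiplication."""
    def __init__(self, channels, reduction=8):
        super(FCA, self).__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))              # global average over H, W
        return x * w.unsqueeze(-1).unsqueeze(-1)     # channel re-weighting

feat = torch.randn(1, 64, 52, 52)                    # toy feature map
out = FCA(64)(PEP(64, 64, 24, 96)(feat))
```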
- with this design, YOLO Nano achieved 69.1% mAP on the VOC2007 data set, an improvement of 12 and 10.7 points over Tiny YOLOv2 and Tiny YOLOv3, respectively. Therefore, deploying the YOLO Nano algorithm on the TX2 significantly reduces the computational pressure while ensuring target detection accuracy.
- in the Staple algorithm, correlation filtering on HOG features is robust to motion blur and illumination changes, but not to deformation. When the target deforms, the color distribution of the whole target remains essentially unchanged, so the color histogram is very robust to deformation; on the other hand, the color histogram is not robust to illumination changes, which HOG features can compensate for. The algorithm therefore splits into two channels and uses both features at the same time: HOG features are used to learn a correlation filter template, which is updated with the given update formula, and color features are used to learn a histogram filter template, which is updated with its given update formula.
- the two templates each predict the target position, and the predictions are weighted and averaged to obtain a composite response map; the location of the maximum value in the response map is the target position.
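- A minimal sketch of this two-channel fusion is given below; the merge weight gamma is an assumed illustrative value, not a parameter specified in the patent:

```python
import numpy as np

def fuse_responses(resp_cf, resp_hist, gamma=0.3):
    """Weighted average of the correlation-filter (HOG) response and the
    color-histogram response; the peak of the merged map is the target."""
    resp = (1.0 - gamma) * resp_cf + gamma * resp_hist
    return np.unravel_index(np.argmax(resp), resp.shape)

resp_cf = np.random.rand(64, 64)     # toy HOG correlation-filter response
resp_hist = np.random.rand(64, 64)   # toy color-histogram response
row, col = fuse_responses(resp_cf, resp_hist)
```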
- although the Staple tracking algorithm overcomes some of the shortcomings of the KCF algorithm, there is still no reliable solution for target occlusion and loss. Therefore, a re-detection module is added to the Staple algorithm framework to effectively solve this problem. Specifically, the position of the target in the next frame is estimated, and the estimated position is then sampled to further lock onto the target's position.
- the Kalman filter can establish a linear motion model of the target, and the target state can be optimally estimated from the model's input and output values. Therefore, a Kalman filter is used to build the target motion model and predict the target's position at the next moment; camera shake can be regarded as Gaussian noise.
- since the sampling interval between two frames of images is very short, the motion of the target between the two frames is simplified to uniform motion, and the motion model and observation equation of the target take the standard constant-velocity form. With the state vector $\mathbf{x}_k = [x_k, y_k, \dot{x}_k, \dot{y}_k]^T$, the motion model is

  $$\mathbf{x}_k = \begin{bmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \mathbf{x}_{k-1} + \mathbf{w}_{k-1}$$

  where x and y represent the components, on the u-axis and v-axis, of the pixel offset between the target position and the image center, and $\dot{x}$ and $\dot{y}$ represent the components of the target's velocity on the u-axis and v-axis. Because the acceleration of the target's movement is random, the noise term $\mathbf{w}_{k-1}$ can be regarded as Gaussian; Δt is the time interval between adjacent moments.
- the observation value at time k is

  $$\mathbf{z}_k = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix} \mathbf{x}_k + \mathbf{v}_k$$

  where $\mathbf{v}_k$ is the measurement noise.
- Fig. 4 is a flow chart of the improved tracking algorithm of the present invention. First, the Kalman filter and the Staple tracking algorithm are initialized, and target tracking then proceeds over the image sequence. In each tracking step, the position of the target in frame k is first predicted from the target state in frame k-1; an image block is then sampled at the predicted position and input to the Staple tracking algorithm to obtain a measured value of the target's position in the image. Next, it is judged whether the target is occluded according to the correlation value between the Staple tracking algorithm's test sample and training sample: a threshold is set for the correlation value, and if the value falls below the threshold, occlusion is judged to be present, in which case the predicted value of the target is copied to the measured value.
- the next step is to correct the target measurement value and finally obtain the estimated value of the target position. In this process, after the target position in each frame is corrected, the target state from the previous frame is updated.
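- The following sketch puts the pieces of this flow together under the stated constant-velocity assumption; the noise covariances, the occlusion threshold tau, and the stand-in Staple measurement function are illustrative assumptions, not values from the patent:

```python
import numpy as np

dt, tau = 1.0 / 30.0, 0.25                     # frame interval; occlusion threshold
F = np.array([[1, 0, dt, 0],                   # constant-velocity state transition
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
H = np.array([[1, 0, 0, 0],                    # observe pixel offsets (x, y) only
              [0, 1, 0, 0]])
Q, R = np.eye(4) * 1e-2, np.eye(2) * 1e-1      # process / measurement noise
x, P = np.zeros(4), np.eye(4)                  # initial state and covariance

def track_step(staple_measure):
    """One frame: predict, measure with Staple, gate on occlusion, correct."""
    global x, P
    x, P = F @ x, F @ P @ F.T + Q              # predict position in frame k
    z, score = staple_measure(x[:2])           # sample image block at prediction
    if score < tau:                            # correlation below threshold:
        z = H @ x                              # occlusion -> copy prediction to measurement
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ (z - H @ x)                    # corrected position estimate
    P = (np.eye(4) - K @ H) @ P
    return x[:2]

# Toy stand-in for the Staple tracker: (measured position, correlation score).
demo = lambda pred: (pred + np.random.randn(2) * 0.5, 0.8)
estimate = track_step(demo)
```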
Claims (10)
- 1. A rotary-wing unmanned aerial vehicle system for vehicle detection and tracking, the system comprising a UAV platform and a ground station platform, characterized in that the UAV platform is used to compute, detect, and track targets in real time, and the ground station platform is used for tracking video monitoring of the target and for issuing manual flight control instructions to the UAV platform.
- 2. The rotary-wing UAV system for vehicle detection and tracking according to claim 1, characterized in that the UAV platform comprises a visible light camera, an onboard computer, a first wireless image transmission terminal, and a flight control module, wherein the onboard computer is connected to the visible light camera, the first wireless image transmission terminal, and the flight control module respectively; the ground station platform comprises a PC and a second wireless image transmission terminal, which exchange information; the first wireless image transmission terminal and the second wireless image transmission terminal exchange information; the visible light camera is used to collect image data; the onboard computer is used to run the target detection algorithm and the target tracking algorithm; the first wireless image transmission terminal is used to transmit the real-time target tracking video stream and to receive manual flight control instructions from the ground station; the PC is used for real-time monitoring of the target tracking video stream and for issuing manual flight control instructions; and the second wireless image transmission terminal is used to receive the real-time target tracking video stream and to send manual flight control instructions.
- 3. The rotary-wing UAV system for vehicle detection and tracking according to claim 1, characterized in that the target detection algorithm run by the onboard computer adopts the YOLO Nano algorithm.
- 4. The rotary-wing UAV system for vehicle detection and tracking according to claim 1, characterized in that the target tracking algorithm run by the onboard computer adopts the Staple tracking algorithm.
- 5. The rotary-wing UAV system for vehicle detection and tracking according to claim 1, characterized in that the onboard computer further comprises a target re-detection module, which judges whether the target is occluded according to the correlation value between the test sample and the training sample of the Staple tracking algorithm; a threshold is set for the correlation value, and occlusion is judged to be present if the value falls below the threshold; if occlusion is present, the predicted value of the target is copied to the measured value, and the measured value is corrected to obtain an estimate of the target position.
- 6. The rotary-wing UAV system for vehicle detection and tracking according to claim 1, characterized in that the onboard computer is used to deploy the Ubuntu ROS operating system, which comprises a camera node, a target detection node, a target tracking node, and a flight control node; the camera node is used to collect image data, the target detection node is used to locate all vehicles, the target tracking node is used to track the target vehicle, and the flight control node is used to control the flight of the rotary-wing UAV.
- 7. A detection and tracking method based on the rotary-wing UAV system for vehicle detection and tracking of claim 1, characterized in that the method comprises: the UAV platform computes, detects, and tracks the target in real time; the ground station platform sends flight control instructions to the UAV through wireless communication to control the flight of the aircraft.
- 8. The detection and tracking method of a rotary-wing UAV system for vehicle detection and tracking according to claim 7, characterized in that the UAV platform comprises a visible light camera, an onboard computer, a first wireless image transmission terminal, and a flight control module, wherein the onboard computer is connected to the visible light camera, the first wireless image transmission terminal, and the flight control module respectively; the Ubuntu ROS operating system is deployed on the onboard computer and comprises a camera node for collecting image data, a target detection node for locating all vehicles, a target tracking node for tracking the target vehicle, and a flight control node for controlling the flight of the rotary-wing UAV; the ground station platform comprises a PC and a second wireless image transmission terminal, which exchange information; the first wireless image transmission terminal and the second wireless image transmission terminal exchange information; and the method comprises the following steps: (1) the visible light camera collects image data and publishes an image topic through the camera node of the onboard computer; (2) the target detection node subscribes to the image topic as its input, and the onboard computer calculates vehicle coordinate information according to the target detection algorithm and publishes a vehicle coordinate information topic; (3) the target tracking node subscribes to the vehicle coordinate information topic, and the onboard computer predicts the position of the target vehicle according to the target tracking algorithm and publishes a target position topic; (4) the flight control node subscribes to the target position topic, performs coordinate conversion, calculates the distance between the target and the UAV, and accordingly sends flight control instructions to the flight control module; (5) the flight control module executes the instructions to control the movement of the drone.
- 9. The detection and tracking method of a rotary-wing UAV system for vehicle detection and tracking according to claim 8, characterized in that the target detection algorithm adopts the YOLO Nano algorithm.
- 10. The detection and tracking method of a rotary-wing UAV system for vehicle detection and tracking according to claim 8, characterized in that the target tracking algorithm specifically comprises the following steps: (1) initialize the Kalman filter and the Staple tracking algorithm; (2) perform target tracking in the image sequence; (3) during tracking, first predict the position of the target vehicle in frame k from the target state in frame k-1, then sample an image block at the predicted position and input it to the Staple tracking algorithm to obtain the measured position of the target vehicle in the image; then judge whether the target is occluded according to the correlation value between the test sample and the training sample of the Staple tracking algorithm, setting a threshold for the correlation value; if the value falls below the threshold, occlusion is judged to be present, and the predicted value of the target vehicle is copied to the measured value; (4) correct the target measurement value to obtain an estimate of the target vehicle's position; (5) update the target state of the previous frame.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010212643.8A CN111476116A (en) | 2020-03-24 | 2020-03-24 | Rotor unmanned aerial vehicle system for vehicle detection and tracking and detection and tracking method |
CN202010212643.8 | 2020-03-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021189507A1 (en) | 2021-09-30 |
Family
ID=71748379
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/082257 WO2021189507A1 (en) | 2020-03-24 | 2020-03-31 | Rotor unmanned aerial vehicle system for vehicle detection and tracking, and detection and tracking method |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111476116A (en) |
WO (1) | WO2021189507A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112950671B (en) * | 2020-08-06 | 2024-02-13 | 中国人民解放军32146部队 | Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle |
CN111932588B (en) * | 2020-08-07 | 2024-01-30 | 浙江大学 | A tracking method for airborne UAV multi-target tracking system based on deep learning |
CN112163628A (en) * | 2020-10-10 | 2021-01-01 | 北京航空航天大学 | Method for improving target real-time identification network structure suitable for embedded equipment |
CN112734800A (en) * | 2020-12-18 | 2021-04-30 | 上海交通大学 | Multi-target tracking system and method based on joint detection and characterization extraction |
CN112770272B (en) * | 2021-01-11 | 2022-02-25 | 四川泓宝润业工程技术有限公司 | Unmanned aerial vehicle and multi-platform data transmission device |
CN112907634B (en) * | 2021-03-18 | 2023-06-20 | 沈阳理工大学 | UAV-based vehicle tracking method |
CN113808161B (en) * | 2021-08-06 | 2024-03-15 | 航天时代飞鹏有限公司 | Vehicle-mounted multi-rotor unmanned aerial vehicle tracking method based on machine vision |
CN113949826B (en) * | 2021-09-28 | 2024-11-05 | 航天时代飞鸿技术有限公司 | A method and system for cooperative reconnaissance of drone clusters under limited communication bandwidth conditions |
CN114815866A (en) * | 2022-04-14 | 2022-07-29 | 哈尔滨工业大学人工智能研究院有限公司 | Switching control method for temporary loss target unmanned aerial vehicle with stability guarantee |
CN115514787B (en) * | 2022-09-16 | 2023-06-27 | 北京邮电大学 | Intelligent unmanned aerial vehicle assisted decision-making planning method and device for Internet of Vehicles environment |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102355574B (en) * | 2011-10-17 | 2013-12-25 | 上海大学 | Image stabilizing method of airborne tripod head moving target autonomous tracking system |
CN106289186B (en) * | 2016-09-21 | 2019-04-19 | 南京航空航天大学 | Rotor UAV airborne visual detection and multi-target positioning system and implementation method |
CN106981073B (en) * | 2017-03-31 | 2019-08-06 | 中南大学 | A method and system for real-time tracking of ground moving targets based on UAV |
CN107128492B (en) * | 2017-05-05 | 2019-09-20 | 成都通甲优博科技有限责任公司 | A kind of unmanned plane tracking, device and unmanned plane based on number of people detection |
CN109002059A (en) * | 2017-06-06 | 2018-12-14 | 武汉小狮科技有限公司 | A kind of multi-rotor unmanned aerial vehicle object real-time tracking camera system and method |
CN109445453A (en) * | 2018-09-12 | 2019-03-08 | 湖南农业大学 | A kind of unmanned plane Real Time Compression tracking based on OpenCV |
CN109785363A (en) * | 2018-12-29 | 2019-05-21 | 中国电子科技集团公司第五十二研究所 | A kind of unmanned plane video motion Small object real-time detection and tracking |
CN109816698B (en) * | 2019-02-25 | 2023-03-24 | 南京航空航天大学 | Unmanned aerial vehicle visual target tracking method based on scale self-adaptive kernel correlation filtering |
CN110058610A (en) * | 2019-05-07 | 2019-07-26 | 南京信息工程大学 | A kind of auxiliary of real-time inspection flock of sheep number is put sheep out to pasture method and system |
- 2020-03-24: CN application CN202010212643.8A filed; published as CN111476116A (status: pending)
- 2020-03-31: PCT application PCT/CN2020/082257 filed; published as WO2021189507A1 (status: application filing)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190212316A1 (en) * | 2015-01-23 | 2019-07-11 | Airscout Inc. | Methods and systems for analyzing a field |
CN110222581A (en) * | 2019-05-13 | 2019-09-10 | 电子科技大学 | A kind of quadrotor drone visual target tracking method based on binocular camera |
CN110610512A (en) * | 2019-09-09 | 2019-12-24 | 西安交通大学 | UAV target tracking method based on BP neural network fusion Kalman filter algorithm |
Non-Patent Citations (2)
Title |
---|
Alexander Wong; Mahmoud Famuori; Mohammad Javad Shafiee; Francis Li; Brendan Chwyl; Jonathan Chung: "YOLO Nano: a Highly Compact You Only Look Once Convolutional Neural Network for Object Detection", arXiv.org, Cornell University Library, 3 October 2019, XP081501523 |
Zhao Chang: "Research on Target Tracking Technology Based on Multi-Rotor UAV", Chinese Master's Theses Full-Text Database, Tianjin Polytechnic University, CN, 15 February 2019, ISSN 1674-0246, XP055852868 |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114049573A (en) * | 2021-11-09 | 2022-02-15 | 上海建工四建集团有限公司 | Method for supervising safety construction of village under construction residence |
CN114155511A (en) * | 2021-12-13 | 2022-03-08 | 吉林大学 | Environmental information acquisition method for automatically driving automobile on public road |
CN114545965A (en) * | 2021-12-31 | 2022-05-27 | 中国人民解放军国防科技大学 | A UAV polder piping inspection system and method based on deep learning |
CN114545965B (en) * | 2021-12-31 | 2024-09-06 | 中国人民解放军国防科技大学 | Unmanned plane levee piping inspection system and method based on deep learning |
CN115268506A (en) * | 2022-01-18 | 2022-11-01 | 中国人民解放军海军工程大学 | Unmanned aircraft photoelectric cooperative tracking control method, system, terminal and medium |
CN114612825A (en) * | 2022-03-09 | 2022-06-10 | 云南大学 | Target detection method based on edge equipment |
CN114612825B (en) * | 2022-03-09 | 2024-03-19 | 云南大学 | Target detection method based on edge equipment |
CN114900654A (en) * | 2022-04-02 | 2022-08-12 | 北京斯年智驾科技有限公司 | Real-time monitoring video transmission system for autonomous vehicles |
CN114900654B (en) * | 2022-04-02 | 2024-01-30 | 北京斯年智驾科技有限公司 | Real-time monitoring video transmission system for automatic driving vehicle |
CN114882450A (en) * | 2022-04-13 | 2022-08-09 | 南京大学 | Method for detecting reversing behavior of high-speed ramp junction under unilateral cruising of unmanned aerial vehicle |
CN114859967A (en) * | 2022-04-24 | 2022-08-05 | 北京同创信通科技有限公司 | Intelligent scrap steel quality testing system and method for automatically controlling unmanned aerial vehicle |
CN114897935A (en) * | 2022-05-13 | 2022-08-12 | 中国科学技术大学 | Unmanned aerial vehicle tracking method and system for air target object based on virtual camera |
CN114973033A (en) * | 2022-05-30 | 2022-08-30 | 青岛科技大学 | Unmanned aerial vehicle automatic target detection and tracking method |
CN114973033B (en) * | 2022-05-30 | 2024-03-01 | 青岛科技大学 | Unmanned aerial vehicle automatic detection target and tracking method |
CN115077549B (en) * | 2022-06-16 | 2024-04-26 | 南昌智能新能源汽车研究院 | Vehicle state tracking method, system, computer and readable storage medium |
CN115077549A (en) * | 2022-06-16 | 2022-09-20 | 南昌智能新能源汽车研究院 | Vehicle state tracking method, system, computer and readable storage medium |
CN114879744B (en) * | 2022-07-01 | 2022-10-04 | 浙江大学湖州研究院 | Night work unmanned aerial vehicle system based on machine vision |
CN114879744A (en) * | 2022-07-01 | 2022-08-09 | 浙江大学湖州研究院 | Night work unmanned aerial vehicle system based on machine vision |
CN115712354B (en) * | 2022-07-06 | 2023-05-30 | 成都戎盛科技有限公司 | Man-machine interaction system based on vision and algorithm |
CN115712354A (en) * | 2022-07-06 | 2023-02-24 | 陈伟 | Man-machine interaction system based on vision and algorithm |
CN115865939A (en) * | 2022-11-08 | 2023-03-28 | 燕山大学 | Edge cloud collaborative decision-making-based target detection and tracking system and method |
CN115865939B (en) * | 2022-11-08 | 2024-05-10 | 燕山大学 | Target detection and tracking system and method based on edge cloud collaborative decision |
CN116068928A (en) * | 2022-11-23 | 2023-05-05 | 北京航天自动控制研究所 | Distributed heterogeneous unmanned aerial vehicle cluster air-ground integrated control system and method |
CN115908475A (en) * | 2023-03-09 | 2023-04-04 | 四川腾盾科技有限公司 | Method and system for realizing image pre-tracking function of airborne photoelectric reconnaissance pod |
CN115908475B (en) * | 2023-03-09 | 2023-05-19 | 四川腾盾科技有限公司 | Implementation method and system for airborne photoelectric reconnaissance pod image pre-tracking function |
CN116805195A (en) * | 2023-05-25 | 2023-09-26 | 南京航空航天大学 | A collaborative reasoning method and system for UAV swarms based on model segmentation |
CN116778360A (en) * | 2023-06-09 | 2023-09-19 | 北京科技大学 | Ground target positioning method and device for flapping-wing flying robot |
CN116778360B (en) * | 2023-06-09 | 2024-03-19 | 北京科技大学 | Ground target positioning method and device for flapping-wing flying robot |
CN116703975B (en) * | 2023-06-13 | 2023-12-15 | 武汉天进科技有限公司 | Intelligent target image tracking method for unmanned aerial vehicle |
CN116703975A (en) * | 2023-06-13 | 2023-09-05 | 武汉天进科技有限公司 | Intelligent target image tracking method for unmanned aerial vehicle |
CN116493735A (en) * | 2023-06-29 | 2023-07-28 | 武汉纺织大学 | A real-time tracking method for moving spatter during 10,000-watt ultra-high power laser welding |
CN116493735B (en) * | 2023-06-29 | 2023-09-12 | 武汉纺织大学 | Real-time tracking method for motion splash in Wanwave-level ultra-high power laser welding process |
CN117132914A (en) * | 2023-10-27 | 2023-11-28 | 武汉大学 | Method and system for identifying large model of universal power equipment |
CN117132914B (en) * | 2023-10-27 | 2024-01-30 | 武汉大学 | General power equipment identification large model method and system |
CN118584953A (en) * | 2024-05-20 | 2024-09-03 | 南京农业大学 | Harvester-grain transport vehicle dual-mode switching collaborative grain unloading system and method based on harvester unloading port identification and tracking |
CN119225426A (en) * | 2024-11-28 | 2024-12-31 | 北京航空航天大学 | Anti-unmanned aerial vehicle system and method based on air-to-air visual recognition |
Also Published As
Publication number | Publication date |
---|---|
CN111476116A (en) | 2020-07-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021189507A1 (en) | Rotor unmanned aerial vehicle system for vehicle detection and tracking, and detection and tracking method | |
Rohan et al. | Convolutional neural network-based real-time object detection and tracking for parrot AR drone 2 | |
Lee et al. | Real-time, cloud-based object detection for unmanned aerial vehicles | |
CN102538782B (en) | Helicopter landing guide device and method based on computer vision | |
CN109242003B (en) | Vehicle-mounted vision system self-motion determination method based on deep convolutional neural network | |
CN103778645B (en) | Circular target real-time tracking method based on images | |
CN110580713A (en) | Satellite Video Target Tracking Method Based on Fully Convolutional Siamese Network and Trajectory Prediction | |
CN107943064A (en) | A kind of unmanned plane spot hover system and method | |
CN102456226B (en) | Tracking methods for regions of interest | |
CN111307291B (en) | Method, device and system for detecting and locating abnormal surface temperature based on UAV | |
CN104200494A (en) | Real-time visual target tracking method based on light streams | |
CN108829136A (en) | The a wide range of synergic monitoring method and apparatus of unmanned aerial vehicle group | |
CN108803655A (en) | A kind of UAV Flight Control platform and method for tracking target | |
CN105334347A (en) | Particle image velocimetry system and method based on unmanned plane | |
Valenti et al. | An autonomous flyer photographer | |
CN104820435A (en) | Quadrotor moving target tracking system based on smart phone and method thereof | |
CN114529585A (en) | Mobile equipment autonomous positioning method based on depth vision and inertial measurement | |
Zhu et al. | PairCon-SLAM: Distributed, online, and real-time RGBD-SLAM in large scenarios | |
CN107145167A (en) | A kind of video target tracking method based on digital image processing techniques | |
CN116824080A (en) | Method for realizing SLAM point cloud mapping of power transmission corridor based on multi-sensor fusion | |
CN113392723A (en) | Unmanned aerial vehicle forced landing area screening method, device and equipment based on artificial intelligence | |
Qin et al. | Visual-based tracking and control algorithm design for quadcopter UAV | |
WO2022198508A1 (en) | Lens abnormality prompt method and apparatus, movable platform, and readable storage medium | |
CN113111721B (en) | Human behavior intelligent identification method based on multi-unmanned aerial vehicle visual angle image data driving | |
CN116243725A (en) | Substation drone inspection method and system based on visual navigation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20927289; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 20927289; Country of ref document: EP; Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.05.2023) |