
CN107300377B - A three-dimensional target localization method for rotary-wing UAV under the orbital trajectory - Google Patents


Info

Publication number
CN107300377B
CN107300377B (application CN201610943473.4A; also published as CN107300377A)
Authority
CN
China
Prior art keywords
marker
matrix
image
aerial vehicle
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610943473.4A
Other languages
Chinese (zh)
Other versions
CN107300377A (en)
Inventor
邓方
张乐乐
陈杰
邱煌斌
陈文颉
彭志红
白永强
李佳洪
桂鹏
樊欣宇
顾晓丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201610943473.4A priority Critical patent/CN107300377B/en
Publication of CN107300377A publication Critical patent/CN107300377A/en
Application granted granted Critical
Publication of CN107300377B publication Critical patent/CN107300377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a three-dimensional target localization method for a rotary-wing UAV on an orbiting (fly-around) trajectory. A single camera mounted on the UAV captures target images and transmits them back to the ground station. A marker with distinctive features is selected and visually identified; the UAV then orbits around the marker and performs multi-point image measurements, computing the UAV's heading deviation and its height relative to the terrain where the target lies by iterating between a binocular vision model and a linear regression model. The operator may then select any static or moving target in the camera's field of view and localize it accurately in three dimensions. The whole procedure runs within a single flight mission: the first segment of the flight computes the heading deviation and relative height, and the later segment performs precise three-dimensional localization. The invention solves the problem that traditional triangulation cannot compute the relative height on an orbiting trajectory, thereby achieving three-dimensional localization of the target.

Description

A Three-Dimensional Target Localization Method for a Rotary-Wing UAV on an Orbiting Trajectory

Technical Field

The invention belongs to the field of visual measurement, and in particular relates to a three-dimensional target localization method for a rotary-wing unmanned aerial vehicle (UAV) on an orbiting trajectory.

Background

Rotary-wing UAVs, with their low cost, vertical take-off and landing, and hovering capability, are widely used in reconnaissance, agricultural insurance, environmental protection, post-disaster rescue, and related fields.

Vision-based target localization for rotary-wing UAVs is currently an active research topic. To localize a target in three dimensions visually, the relative height between the UAV and the target must first be determined by triangulation before the target itself can be positioned. However, the low-precision AHRS (attitude and heading reference system) carried by a rotary-wing UAV introduces a large heading deviation, so the rays projected from the images it captures are all offset to some degree. With traditional triangulation, the two sets of rays projected from the left and right views are therefore misaligned, and the solved relative height carries a large error; the relative height between the UAV and the object cannot be computed accurately, and effective three-dimensional localization of the target becomes impossible.

Summary of the Invention

In view of this, the present invention provides a three-dimensional target localization method for a rotary-wing UAV on an orbiting trajectory. The method computes the heading deviation and reduces the error in the relative-height calculation, thereby improving the UAV's ability to localize targets in three dimensions.

Beneficial Effects:

(1) For a rotary-wing UAV equipped with a low-precision AHRS, the method provided by the present invention accurately computes the heading deviation of the AHRS and then computes the height between the UAV on its orbiting trajectory and the terrain where the target lies, thereby achieving three-dimensional visual localization of the target.

Description of the Drawings

Fig. 1 is a structural diagram of the three-dimensional target localization system for the rotary-wing UAV of the present invention;

Fig. 2 is a flowchart of the method provided by the present invention;

Fig. 3 is a schematic diagram of the rotated-view binocular vision model used in the present invention;

Fig. 4 is a schematic diagram of the monocular camera ranging model used in the present invention;

Fig. 5 is a flowchart of the iterative process in the method provided by the present invention;

Fig. 6 is a data-fitting curve (relative height as a function of heading deviation) in the method provided by the present invention;

Fig. 7 is a data-fitting curve (heading deviation as a function of relative height) in the method provided by the present invention;

Fig. 8 shows the localization results of the method provided by the present invention.

Detailed Description

The present invention is described in detail below with reference to the accompanying drawings and an embodiment.

The following experimental platform was built to verify the effectiveness of the invention: a T650 quadrotor UAV and a laptop serving as the ground station, with real-time communication between the two. The system structure is shown in Fig. 1.

The UAV carries a GPS positioning system, an AHRS, an altimeter, a wireless image-transmission module, and a wireless data transceiver module. The APM flight controller from 3D Robotics operates in self-stabilizing mode to keep the flight stable. A camera is mounted at the nose of the UAV with a downward viewing angle β of 45°, and images are sent back to the ground station through the wireless image-transmission module; the position, attitude, and elevation of the UAV, obtained from the GPS, the AHRS, and the altimeter respectively, are transmitted to the ground station through the wireless data transceiver module.

The ground station is built around the computer, which runs the UAV visual-localization algorithms and connects to the wireless data transceiver module over USB, enabling two-way communication between the UAV and the ground station.

Based on this experimental platform, as shown in Fig. 2, the three-dimensional target localization method for a rotary-wing UAV on an orbiting trajectory comprises the following steps:

Step 1. After the system starts, capture images with the camera mounted on the UAV and send them back to the ground station.

Step 2. From the returned images, select a static object with a clear outline as the marker, and visually identify the marker.

The specific process of visually identifying the marker in Step 2 is as follows:

The SIFT algorithm is applied to the marker to obtain m feature points P_1, P_2, ..., P_{m-1}, P_m, which are stored as a template; m is an integer.

Step 3. The UAV orbits around the marker and, using the visual-recognition result, performs multi-point image measurements of the marker. The height of the UAV relative to the terrain where the target lies and the heading deviation are computed by iterating between the binocular vision model and the linear regression model.

The flowchart of Step 3 is shown in Fig. 5; the specific process is as follows:

Step 3.1. On the orbiting trajectory, the UAV measures N images in chronological order using visual recognition. SIFT features are extracted from the current i-th image (1 ≤ i ≤ N) and matched against the template feature points, giving w matched point pairs P_1, P_2, ..., P_{w-1}, P_w (w ≤ m). The geometric center P_f (f ≤ w) of these matched points is taken as the pixel position of the marker in the image. The measurement values for the i-th image are also recorded, including the position of the UAV shooting point O_i in the inertial reference frame {I} and its attitude (ψ_i, θ_i, φ_i), where ψ_i, θ_i, φ_i are the azimuth (heading), pitch, and roll angles, respectively.
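The matching-and-center step of Step 3.1 can be sketched in a few lines. SIFT extraction itself would come from a library such as OpenCV; the sketch below assumes descriptors and keypoint positions are already available as arrays, and uses a plain nearest-neighbor match with Lowe's ratio test (the ratio threshold 0.75 is an assumption, not stated in the patent) before taking the geometric center P_f of the matched keypoints:

```python
import numpy as np

def match_and_center(tmpl_desc, frame_desc, frame_kp, ratio=0.75):
    """Match template descriptors against frame descriptors (L2 distance,
    Lowe ratio test) and return the geometric center P_f of the matched
    keypoints -- the pixel position used for the marker."""
    matched = []
    for d in tmpl_desc:
        dists = np.linalg.norm(frame_desc - d, axis=1)
        i1, i2 = np.argsort(dists)[:2]
        if dists[i1] < ratio * dists[i2]:      # keep unambiguous matches only
            matched.append(frame_kp[i1])
    if not matched:
        return None
    return np.mean(matched, axis=0)            # geometric center P_f
```

In the real pipeline `tmpl_desc` would hold the stored template descriptors of P_1, ..., P_m and `frame_desc`/`frame_kp` the descriptors and pixel coordinates extracted from the current i-th image.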

Step 3.2. Any two of the N images are selected, giving n = N(N−1)/2 groups in total. In each group the earlier image is taken as the left view L and the later image as the right view R, forming the rotated-view binocular vision model shown in Fig. 3.

The relative height h_j of the UAV with respect to the marker is computed for each group, 1 ≤ j ≤ n,

where the pixel positions of the marker in the left and right views are P_l and P_r respectively, and R_l, T_l are the rotation and translation matrices of the UAV shooting point O_l of the left view with respect to the inertial reference frame,

where ψ_l, θ_l, φ_l are the heading, pitch, and roll angles of the left-view shooting point O_l, and ψ_r, θ_r, φ_r are those of the right-view shooting point O_r; δψ is the heading deviation, with ψ_l = ψ_i − δψ(k), where k is the iteration index, θ_l = θ_i, φ_l = φ_i (1 ≤ i < N), and initial value δψ(0) = 0;

R_r, T_r are the rotation and translation matrices of the UAV shooting point O_r of the right view with respect to the inertial reference frame,

where ψ_r = ψ_m − δψ(k), θ_r = θ_m, φ_r = φ_m (i < m ≤ N).

The coordinates of the shooting points of the left and right views in the inertial reference frame are O_l and O_r. R and T are the rotation and translation matrices of the right-view camera frame with respect to the left-view camera frame: R = R_r R_l^T, T = T_l − R^T T_r = R_l(O_r − O_l); and M = [P_l  −R^T P_r  P_l × R^T P_r]^{-1} T.
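The matrices above amount to a two-ray (midpoint) triangulation between the rotated views. The sketch below implements that construction under stated assumptions: the Euler convention for building R_l, R_r is taken as Z-Y-X (yaw-pitch-roll) and the rotation matrices are taken to map inertial-frame vectors into the camera frame, neither of which is specified in the extracted text; the final height extraction from the triangulated point is likewise not recoverable here, so the function returns the marker point in the left camera frame.

```python
import numpy as np

def euler_to_R(yaw, pitch, roll):
    """Rotation matrix for an assumed Z-Y-X (yaw-pitch-roll) Euler sequence;
    the patent text does not state its convention."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.], [sy, cy, 0.], [0., 0., 1.]])
    Ry = np.array([[cp, 0., sp], [0., 1., 0.], [-sp, 0., cp]])
    Rx = np.array([[1., 0., 0.], [0., cr, -sr], [0., sr, cr]])
    return Rz @ Ry @ Rx

def triangulate_midpoint(P_l, P_r, R_l, R_r, O_l, O_r):
    """Midpoint triangulation for the rotated-view stereo pair, following the
    matrices in the text: R = R_r R_l^T, T = R_l (O_r - O_l), and the
    coefficient vector M = [P_l, -R^T P_r, P_l x R^T P_r]^{-1} T.
    Returns the marker point expressed in the left camera frame."""
    R = R_r @ R_l.T
    T = R_l @ (O_r - O_l)
    Pr_in_l = R.T @ P_r                      # right-view ray in the left frame
    A = np.column_stack([P_l, -Pr_in_l, np.cross(P_l, Pr_in_l)])
    a, b, c = np.linalg.solve(A, T)          # the vector M of the text
    p1 = a * P_l                             # closest point on the left ray
    p2 = b * Pr_in_l + T                     # closest point on the right ray
    return 0.5 * (p1 + p2)                   # midpoint estimate
```

With identity attitudes and cameras at (−5, 0, 0) and (5, 0, 0) observing a point 10 units below the baseline, both rays intersect exactly and the midpoint recovers the point.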

Step 3.3. Gross errors among the n computed relative heights h_j are rejected with the 3σ criterion, and the mean h̄ of the remaining groups is taken.
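Step 3.3 is a small robust-averaging step; a minimal sketch (one rejection pass, which is the simplest reading of the 3σ criterion):

```python
import numpy as np

def robust_mean_height(h, n_sigma=3.0):
    """Average the pairwise height estimates h_j after rejecting gross
    errors with the 3-sigma criterion, as in Step 3.3."""
    h = np.asarray(h, dtype=float)
    mu, sigma = h.mean(), h.std()
    if sigma == 0.0:
        return mu                              # all estimates identical
    kept = h[np.abs(h - mu) <= n_sigma * sigma]
    return kept.mean()
```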

Step 3.4. With the relative height h̄ obtained, the heading deviation δψ(k) is computed from the N-point measurements of Step 3.2 using the linear regression model.

In general, let [x y z]^T and [x_p y_p z_p]^T be the coordinates of the UAV and the object in the inertial reference frame {I}, let (x_f′, y_f′) be the pixel position of the object in the image, and let f be the focal length of the camera. The ranging model of the camera is

with attitude matrix

where h′ is the relative height between the UAV and the object, and (ψ′, θ′, φ′) are the heading, pitch, and roll angles of the UAV at a given measurement point. The pitch angle θ′ and roll angle φ′ are measured with high precision and their errors are negligible, whereas the measured heading angle ψ′ carries a large heading deviation.

In this embodiment, the heading deviation is computed from the N-point measurements of the marker taken by the UAV at different positions, solved by linear regression. The specific computation is as follows. Let [x_G y_G z_G]^T be the coordinates of the marker in the inertial reference frame {I}, set [x_p y_p z_p]^T = [x_G y_G z_G]^T, and let h̄ be the mean relative height between the UAV and the marker. Substituting into formula (4) gives

Let the parameters be θ = [θ_a, θ_b]^T, θ_a = [x_G, y_G]^T, θ_b = δψ(k). The measurement equations for position and attitude are equations (6) and (7), respectively:

z_1(i) = y_1(i) + v_1,  v_1 ~ N(0, R_1)   (6)

where v_1, v_2 are measurement noises and R_1, R_2 are real symmetric positive-definite matrices. Equation (5) then becomes

where the term above denotes the attitude deviation. Applying a Taylor expansion, equation (8) becomes

From equations (8) and (9), we obtain

Define the matrix A_i, whose entries a_{1,3} to a_{2,5} denote the corresponding elements of A_i, and the matrix B_i, whose entries b_{1,1} to b_{2,3} denote the corresponding elements of B_i. In this embodiment, N-point visual measurements are taken of the same marker, so the corresponding matrices are A_1, ..., A_N, B_1, ..., B_N; from these measurements the following linear regression model is obtained,

where I_2 is the 2×2 identity matrix and the noise is

V ~ N(0, R)

The covariance matrix is

The estimate of the parameter θ is

The heading deviation δψ(k) is then obtained from equation (12).
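The estimator in equation (12) is a standard generalized least-squares (GLS) solve for the stacked model z = Hθ + V, V ~ N(0, R). The exact construction of the regressor matrix H from the per-image matrices A_i, B_i was lost in extraction, so the sketch below takes H, z, R as given and shows only the estimation step, θ̂ = (H^T R^{-1} H)^{-1} H^T R^{-1} z:

```python
import numpy as np

def gls_estimate(H, z, R):
    """Generalized least-squares estimate for z = H theta + V, V ~ N(0, R):
    theta_hat = (H^T R^-1 H)^-1 H^T R^-1 z, solved without forming the
    explicit inverse of the normal matrix."""
    Ri = np.linalg.inv(R)
    return np.linalg.solve(H.T @ Ri @ H, H.T @ Ri @ z)
```

Here θ stacks [x_G, y_G] and δψ(k); with noiseless synthetic data the estimator recovers the parameters exactly.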

Step 3.5. Let e be a constant threshold. If |δψ(k) − δψ(k−1)| < e, take the final estimates of the relative height and the heading deviation and proceed to Step 4; otherwise return to Step 3.2, substitute the current δψ(k) into the formulas for the left- and right-view heading angles, recompute, and iterate.

Step 4. Once the relative height and the heading deviation are both reliably estimated, select any target in the camera's field of view and obtain its measurement values. The true heading of the UAV is computed from the estimated heading deviation, and the target is then precisely localized in three dimensions from the true heading and the height estimate.

Specifically, the selected target is assumed to lie in the same plane as the marker, so the estimated relative height can be taken as the relative height between the UAV and the target. Let [x_t y_t z_t]^T be the coordinates of the target in the inertial reference frame {I}, and set [x_p y_p z_p]^T = [x_t y_t z_t]^T. Substituting the target's measurement values and the UAV's true heading into formula (4) yields the coordinates of the target, achieving its three-dimensional localization.
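Formula (4) itself did not survive extraction, but the step it performs is a standard monocular ground-ranging construction: back-project the target pixel along a ray and scale it so the ray drops exactly the estimated height to the target's plane. The sketch below stands in for formula (4) under assumed conventions (inertial z up, R_att rotating camera-frame vectors into the inertial frame, camera ray [x_f′, y_f′, f]); these conventions are ours, not the patent's.

```python
import numpy as np

def locate_on_ground(uav_pos, R_att, pixel, f, h):
    """Back-project a pixel to the ground plane a known height h below the
    UAV. R_att rotates camera-frame vectors into the inertial frame;
    the ray direction [x_f', y_f', f] and axis conventions are assumptions."""
    ray = R_att @ np.array([pixel[0], pixel[1], f], dtype=float)
    if ray[2] >= 0:
        raise ValueError("ray does not intersect the ground below the UAV")
    s = -h / ray[2]                   # scale so the ray descends exactly h
    return uav_pos + s * ray          # target coordinates in frame {I}
```

The heading-deviation estimate enters through R_att: the corrected heading ψ − δψ̂ is used when building the attitude rotation, which is what removes the ray offset described in the Background section.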

The effectiveness of the iterative process is illustrated below for a circular orbiting trajectory with radius R = 73 m and arc length 1.5π rad, with δψ = 0, 1, ..., 59, 60 deg. Formula (1) yields the 61 corresponding groups of relative heights. Fitting these data with the relative height as the dependent variable and δψ as the independent variable, as shown in Fig. 6, gives the mathematical relation:

Similarly, 36 groups of δψ are obtained by solving formula (10). Fitting these data with δψ as the dependent variable and the relative height as the independent variable, as shown in Fig. 7, gives the mathematical relation:

Let e_δψ = δψ − δψ_t, where h_t, δψ_t are the true values of the relative height and the heading deviation. From formula (9),

and from formula (10),

where k_1, k_2 are the associated parameters.

Substituting the relative height computed by the binocular vision model into the linear regression equation yields an effective estimate of the heading deviation; substituting that estimate back into the binocular vision model in turn yields an accurate relative height. In general, the heading deviation of an AHRS does not exceed 30 deg, so |k_2| > k_1 > 0, and since k_2 < 0, equations (15) and (16) show that after finitely many iterations the estimates of the relative height and of the heading deviation converge to the true values.
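The convergence argument can be illustrated with a toy fixed-point iteration. Reading equations (15) and (16) as saying that each round maps the height error to k_1 times the heading error and the heading error to k_2 times the height error, the composed map contracts whenever |k_1 k_2| < 1; the constants below are invented for illustration and are not the patent's fitted values.

```python
def iterate_errors(e_h0, k1, k2, rounds=20):
    """Toy contraction illustrating the height/heading error iteration:
    each round e_dpsi = k2 * e_h, then e_h = k1 * e_dpsi, so the height
    error is multiplied by k1*k2 per round and shrinks when |k1*k2| < 1."""
    e_h = e_h0
    history = [e_h]
    for _ in range(rounds):
        e_dpsi = k2 * e_h          # heading error driven by the height error
        e_h = k1 * e_dpsi          # height error driven by the heading error
        history.append(e_h)
    return history
```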

A flight test was carried out with the camera-equipped UAV measuring images of the marker on an orbiting trajectory with radius R = 73 m and arc length 1.5π rad. The true height of the UAV relative to the marker was h_t = 45 m, the flight speed V = 3.44 m/s, f_GPS = 4 Hz, the true heading deviation δψ_t = 30 deg, and the threshold e = 0.02 deg. The results of the method provided by the present invention are given in Table 1, Table 2, and Fig. 8, where the listed errors e_h, e_δψ, e_xy, e_z are all root-mean-square errors.

Table 1. Iterative process

Table 2. Target localization results

Indicator | Three-dimensional localization of the present invention
Relative height estimation error e_h / m | 0.93
Heading estimation error e_δψ / deg | 1.89
Localization error e_xy / m | 10.89
Localization error e_z / m | 0.43

In summary, the above is only a preferred embodiment of the present invention and is not intended to limit its scope of protection. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (4)

1. A three-dimensional target localization method for a rotary-wing unmanned aerial vehicle (UAV) on an orbiting (fly-around) trajectory, characterized by comprising the following steps:
step one, extracting a static object from an image captured by the rotary-wing UAV to serve as a marker;
step two, the rotary-wing UAV orbits around the marker, photographing the marker from N angles during the orbit and obtaining a measurement value for each captured image, N being a positive integer;
step three, grouping the N captured images in pairs; for each group, calculating the relative height of the UAV with respect to the marker using the rotated-view binocular vision model, and then taking the average over the groups as the relative height for the current iteration round k;
when calculating the relative height, the heading deviation δψ(k−1) obtained in the previous iteration round k−1 is adopted as the heading deviation δψ(k) required by the rotated-view binocular vision model, and the heading angle required by the binocular vision model is updated as ψ = ψ_i − δψ(k), where ψ_i is the heading angle of the UAV when capturing the i-th image; the initial value δψ(0) is taken as 0;
step four, calculating the heading deviation δψ(k) of the current iteration round k using the relative height and the measurement values of the N captured images;
step five, judging whether the deviation between δψ(k) and δψ(k−1) is smaller than a set threshold; if so, taking the result of the last iteration as the relative height estimate and the heading deviation estimate, and executing step six; otherwise, incrementing the iteration round k by 1 and returning to step three;
step six, for any target in the field of view of the camera of the rotary-wing UAV, calculating the true heading of the UAV using the heading deviation estimate, and then achieving three-dimensional localization of the target from the true heading and the height estimate;
in step four, the heading deviation δψ(k) is calculated specifically as follows:
[x_G y_G z_G]^T denotes the coordinates of the marker in the inertial reference frame {I};
The ranging model of the camera is as follows:
where f is the focal length of the camera, and the attitude matrix is that of the UAV at the time the i-th image was captured, 1 ≤ i ≤ N;
where O_i is the UAV shooting point when capturing the i-th image, with position and attitude (ψ_i, θ_i, φ_i) in the inertial reference frame {I}; ψ_i, θ_i, φ_i are the azimuth (heading), pitch, and roll angles, respectively, and the pixel position of the marker in the i-th image is used;
let the parameters be θ = [θ_a, θ_b]^T, θ_a = [x_G, y_G]^T, θ_b = δψ(k); the measurement equation set is
where v_1, v_2 are measurement noises and R_1, R_2 are real symmetric positive-definite matrices; formula (3) is transformed into
where the term above denotes the attitude deviation; formula (4) then becomes
from formula (4) and formula (5), we obtain
define the matrix A_i, whose entries a_{1,3} to a_{2,5} denote the corresponding elements of A_i, and the matrix B_i, whose entries b_{1,1} to b_{2,3} denote the corresponding elements of B_i; the following linear regression model is obtained,
where I_2 is the 2×2 identity matrix and the noise is V ~ N(0, R);
the covariance matrix is
the estimate of the parameter θ is
the heading deviation δψ(k) is solved by formula (8).
2. The method according to claim 1, wherein the relative height h_j of the UAV with respect to the marker, 1 ≤ j ≤ n, is calculated as follows:
where T = T_l − R^T T_r = R_l(O_r − O_l), M = [P_l  −R^T P_r  P_l × R^T P_r]^{-1} T, and R = R_r R_l^T; P_l, P_r are the pixel positions of the marker in the left view and the right view respectively; R and T are the rotation and translation matrices of the camera coordinate system of the right view with respect to that of the left view; R_l, T_l are the rotation and translation matrices of the UAV shooting point O_l of the left view with respect to the inertial reference frame, and R_r, T_r are those of the UAV shooting point O_r of the right view.
3. The three-dimensional target localization method for a rotary-wing UAV according to claim 2, wherein the pixel position of the marker is measured as follows:
the marker image obtained in step one is identified to obtain a number of feature points; each image captured during the orbit is likewise identified to obtain feature points, the feature points of each image are matched against those of the marker image, and the geometric center of the matched points is taken as the pixel position of the marker in that image.
4. The method according to claim 1, wherein in step three gross errors in the relative heights are rejected using the 3σ criterion before the groups are averaged.
CN201610943473.4A 2016-11-01 2016-11-01 A three-dimensional target localization method for rotary-wing UAV under the orbital trajectory Active CN107300377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610943473.4A CN107300377B (en) 2016-11-01 2016-11-01 A three-dimensional target localization method for rotary-wing UAV under the orbital trajectory

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610943473.4A CN107300377B (en) 2016-11-01 2016-11-01 A three-dimensional target localization method for rotary-wing UAV under the orbital trajectory

Publications (2)

Publication Number Publication Date
CN107300377A CN107300377A (en) 2017-10-27
CN107300377B true CN107300377B (en) 2019-06-14

Family

ID=60138055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610943473.4A Active CN107300377B (en) 2016-11-01 2016-11-01 A three-dimensional target localization method for rotary-wing UAV under the orbital trajectory

Country Status (1)

Country Link
CN (1) CN107300377B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109708622A (en) * 2017-12-15 2019-05-03 福建工程学院 A method for 3D modeling of buildings by UAV based on Pixhawk
CN110621962A (en) * 2018-02-28 2019-12-27 深圳市大疆创新科技有限公司 Positioning method of movable platform and related device and system
WO2020014909A1 (en) * 2018-07-18 2020-01-23 深圳市大疆创新科技有限公司 Photographing method and device and unmanned aerial vehicle
WO2020037492A1 (en) * 2018-08-21 2020-02-27 SZ DJI Technology Co., Ltd. Distance measuring method and device
CN110632941B (en) * 2019-09-25 2020-12-15 北京理工大学 A Trajectory Generation Method for UAV Target Tracking in Complex Environments
CN110675453B (en) * 2019-10-16 2021-04-13 北京天睿空间科技股份有限公司 Self-positioning method for moving target in known scene
CN110824295B (en) * 2019-10-22 2021-08-31 广东电网有限责任公司 Infrared thermal image fault positioning method based on three-dimensional graph
CN113469139B (en) * 2021-07-30 2022-04-05 广州中科智云科技有限公司 Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip
CN115272892B (en) * 2022-07-29 2023-07-11 同济大学 Unmanned aerial vehicle positioning deviation monitoring management and control system based on data analysis
CN117452831B (en) * 2023-12-26 2024-03-19 南京信息工程大学 A quadrotor drone control method, device, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221072A (en) * 1997-02-03 1998-08-21 Asahi Optical Co Ltd System and method for photogrammetry
JP2003083744A (en) * 2001-09-12 2003-03-19 Starlabo Corp Imaging apparatus mounted to aircraft, and aircraft imaging data processing apparatus
CN102519434A (en) * 2011-12-08 2012-06-27 北京控制工程研究所 Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN105424006A (en) * 2015-11-02 2016-03-23 国网山东省电力公司电力科学研究院 Unmanned aerial vehicle hovering precision measurement method based on binocular vision


Also Published As

Publication number Publication date
CN107300377A (en) 2017-10-27

Similar Documents

Publication Publication Date Title
CN107300377B (en) A three-dimensional target localization method for rotary-wing UAV under the orbital trajectory
CN106153008B (en) A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model
Jung et al. A direct visual servoing‐based framework for the 2016 IROS Autonomous Drone Racing Challenge
EP3158417B1 (en) Sensor fusion using inertial and image sensors
EP3158412B1 (en) Sensor fusion using inertial and image sensors
EP3158293B1 (en) Sensor fusion using inertial and image sensors
CN105698762B (en) Target method for rapidly positioning based on observation station at different moments on a kind of unit flight path
Kong et al. Autonomous landing of an UAV with a ground-based actuated infrared stereo vision system
Küng et al. The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery
EP3158411B1 (en) Sensor fusion using inertial and image sensors
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN110068335A (en) A method and system for real-time positioning of UAV swarms in GPS-denied environment
CN108665499B (en) Near distance airplane pose measuring method based on parallax method
CN106155081B (en) A kind of a wide range of target monitoring of rotor wing unmanned aerial vehicle and accurate positioning method
EP2987001A1 (en) Landing system for an aircraft
EP2986940A1 (en) Landing site tracker
CN108955685A (en) A kind of tanker aircraft tapered sleeve pose measuring method based on stereoscopic vision
CN115144879B (en) Multi-machine multi-target dynamic positioning system and method
CN111288989A (en) A small unmanned aerial vehicle visual positioning method
CN106500699B (en) A kind of position and orientation estimation method suitable for Autonomous landing in unmanned plane room
US9816786B2 (en) Method for automatically generating a three-dimensional reference model as terrain information for an imaging device
Jung et al. Robust marker tracking algorithm for precise UAV vision-based autonomous landing
CN105606073A (en) Unmanned aerial vehicle processing system and flight state data processing method thereof
CN108873031B (en) An optimization method for external parameter calibration of a 2-DOF pod
CN110160503A (en) A kind of unmanned plane landscape matching locating method for taking elevation into account

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant