
CN107300377B - Three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory - Google Patents


Info

Publication number
CN107300377B
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
marker
matrix
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610943473.4A
Other languages
Chinese (zh)
Other versions
CN107300377A (en)
Inventor
邓方
张乐乐
陈杰
邱煌斌
陈文颉
彭志红
白永强
李佳洪
桂鹏
樊欣宇
顾晓丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201610943473.4A priority Critical patent/CN107300377B/en
Publication of CN107300377A publication Critical patent/CN107300377A/en
Application granted granted Critical
Publication of CN107300377B publication Critical patent/CN107300377B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Medicines Containing Antibodies Or Antigens For Use As Internal Diagnostic Agents (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory. A single camera mounted on the UAV captures target images and transmits them back to the ground station. A marker with distinctive features is selected and visually identified. The rotor UAV then flies around with the marker as the center, performs multi-point image measurement, and calculates the height and heading deviation of the UAV relative to the terrain where the target is located, using a method in which a binocular vision model and a linear regression model iterate with each other. Next, the operator may select any static or moving target in the camera's field of view, achieving accurate three-dimensional positioning of the target. The method is carried out within a single flight mission: the first flight segment calculates the heading deviation and relative height, and the later segment performs accurate three-dimensional positioning. The invention solves the problem that the traditional triangulation method cannot calculate the relative height under a fly-around trajectory, thereby realizing three-dimensional positioning of the target.

Description

Three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory
Technical Field
The invention belongs to the field of vision measurement, and particularly relates to a three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory.
Background
The rotor unmanned aerial vehicle has the characteristics of low cost, vertical take-off and landing, and aerial hovering, and is widely applied in reconnaissance, agricultural insurance, environmental protection, post-disaster rescue, and other fields.
Vision-based target positioning for rotor unmanned aerial vehicles is one of the current research focuses. When a vision method is used for three-dimensional positioning, the relative height between the UAV and the target is first determined by triangulation, after which the target can be positioned. However, the low-precision AHRS (attitude and heading reference system) typically carried by a rotor UAV introduces a large heading deviation, so the light rays projected from the images captured by the UAV deviate accordingly. With the traditional triangulation method, the relative height solved from the two groups of rays projected from the left and right views therefore carries a large error: the relative height between the UAV and the object cannot be accurately calculated, and the target cannot be effectively positioned in three dimensions.
Disclosure of Invention
In view of the above, the invention provides a three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory, which can calculate the heading deviation and reduce the calculation error of the relative height, thereby improving the UAV's capability of positioning targets in three dimensions.
Advantages:
(1) The method is aimed at rotor UAVs equipped with a low-precision AHRS and can accurately calculate the heading deviation of the AHRS, and hence the height between the UAV and the terrain where the target is located under the fly-around trajectory, thereby achieving three-dimensional visual positioning of the target by the rotor UAV.
Drawings
Fig. 1 is a block diagram of a target three-dimensional positioning system for a rotary-wing drone in accordance with the present invention;
FIG. 2 is a flow chart of a method provided by the present invention;
FIG. 3 is a schematic view of a rotated view binocular vision model used in the present invention;
FIG. 4 is a schematic view of a monocular camera ranging model used in the present invention;
FIG. 5 is a flow chart of an iterative process in the method provided by the present invention;
FIG. 6 is the curve fitted to the relative-height data, with the mean relative height as a function of the heading deviation δψ, in the method of the present invention;
FIG. 7 is the curve fitted to the heading-deviation data, with δψ as a function of the mean relative height, in the method of the present invention;
FIG. 8 is a diagram illustrating the positioning effect of the method of the present invention.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
The following experimental platform was set up to verify the effectiveness of the invention: a T650 four-rotor unmanned aerial vehicle and a notebook computer serving as the ground station, with real-time communication between the UAV and the ground station; the system structure is shown in FIG. 1.
The unmanned aerial vehicle carries a GPS, an AHRS, an altimeter, a wireless image transmission module, and a wireless data transceiver module. An APM flight control system from 3D Robotics works in self-stabilization mode to ensure stable flight. A camera is mounted at the nose of the UAV with a depression angle β of 45 degrees, and its images are transmitted back to the ground station through the wireless image transmission module. The position, attitude, and altitude information of the UAV, obtained respectively from the GPS, the AHRS, and the altimeter, are transmitted to the ground station through the wireless data transceiver module.
The ground station is built around the computer, which runs the UAV visual positioning and related algorithms and connects to the wireless data transceiver module through a USB interface, enabling communication between the UAV and the ground station.
Based on this experimental platform, as shown in FIG. 2, the three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory comprises the following steps:
Step one: after the system is started, an image is shot with the camera carried on the unmanned aerial vehicle and transmitted back to the ground station;
Step two: a static object with a clear outline is selected from the returned image as the marker, and the marker is visually identified;
The specific process of visually identifying the marker in step two is as follows:
the marker is identified using the SIFT algorithm to obtain m feature points P_1, P_2, ..., P_(m-1), P_m, and these feature points are stored as a template, where m is an integer;
Step three: the rotor unmanned aerial vehicle flies around with the marker as the center, performs multi-point image measurement of the marker using the visual identification result, and calculates the height and heading deviation of the UAV relative to the terrain where the target is located, based on the method of mutual iteration between the binocular vision model and the linear regression model;
the flow chart of step three is shown in FIG. 5, and the specific process is as follows:
step 3.1, the rotor unmanned aerial vehicle measures the N images according to time sequence by using visual identification under the flying track, performs feature extraction on the ith current image by using SIFT algorithm (i is more than or equal to 1 and less than or equal to N), and then matches the feature points of the ith current image by using the feature points in the template to obtain w groups of matching points P1,P2...Pw-1,Pw(w ≦ m), and finally taking the geometric center P of these matching pointsf(f ≦ w) represents the pixel position of the marker in the image, notedAnd recording the measured value at the time of measurement on the ith image, including: unmanned shooting point OiPosition in inertial reference system { I }And attitude (psi)iii),ψiiiAzimuth, elevation and roll, respectively.
Step 3.2: any two of the N images are selected, giving n = C(N, 2) = N(N-1)/2 groups in total. The earlier-measured image is taken as the left view L and the later-measured image as the right view R, so as to construct the binocular vision model of the rotated view, as shown in FIG. 3.
The relative height h_j of the UAV relative to the marker is calculated for each group, 1 ≤ j ≤ n.
Here the pixel positions of the marker in the left and right views are P_l and P_r respectively. R_l and T_l are respectively the rotation matrix and translation matrix of the left-view UAV shooting point O_l relative to the inertial reference frame,
where ψ_l, θ_l, φ_l are respectively the heading angle, pitch angle, and roll angle at the left-view shooting point O_l, and ψ_r, θ_r, φ_r are respectively the heading angle, pitch angle, and roll angle at the right-view shooting point O_r; δψ is the heading deviation, with ψ_l = ψ_i - δψ(k), where k is the iteration number, θ_l = θ_i, φ_l = φ_i (1 ≤ i < N), and the initial value δψ(0) is set to 0;
R_r and T_r are respectively the rotation matrix and translation matrix of the right-view UAV shooting point O_r relative to the inertial reference frame,
where ψ_r = ψ_m - δψ(k), θ_r = θ_m, φ_r = φ_m (i < m ≤ N).
The coordinates of the UAV shooting points corresponding to the left and right views in the inertial reference frame are O_l and O_r respectively. R and T are the rotation matrix and translation matrix of the right-view camera coordinate system relative to the left-view camera coordinate system, with R = R_r R_l^T, T = T_l - R^T T_r = R_l(O_r - O_l), and M = [P_l  -R^T P_r  P_l × R^T P_r]^(-1) T.
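The matrix M = [P_l  -R^T P_r  P_l × R^T P_r]^(-1) T above is exactly the linear system of midpoint triangulation: its components (a, b, c) place a point on each viewing ray and measure the gap along the common perpendicular. Since formula (1) for h_j itself was an image lost in extraction, the sketch below reconstructs only this standard triangulation step; the Z-Y-X Euler convention in rot_zyx is an assumption (the patent does not state its own), and the function names are hypothetical.

```python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix from Euler angles (heading, pitch, roll).
    The Z-Y-X convention is an assumption, not the patent's stated one."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.], [sy, cy, 0.], [0., 0., 1.]])
    Ry = np.array([[cp, 0., sp], [0., 1., 0.], [-sp, 0., cp]])
    Rx = np.array([[1., 0., 0.], [0., cr, -sr], [0., sr, cr]])
    return Rz @ Ry @ Rx

def triangulate_midpoint(P_l, P_r, R, T):
    """Solve a*P_l - b*(R^T P_r) + c*(P_l x R^T P_r) = T, i.e.
    (a, b, c) = M = [P_l  -R^T P_r  P_l x R^T P_r]^(-1) T as in the
    patent, and return the midpoint of the common perpendicular."""
    q = R.T @ P_r
    cross = np.cross(P_l, q)
    A = np.column_stack([P_l, -q, cross])
    a, b, c = np.linalg.solve(A, T)
    return a * P_l + 0.5 * c * cross  # 3-D point in the left-view frame
```

The relative height h_j would then be the vertical component of this point once expressed in the inertial frame.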
Step 3.3: for the n groups of calculated relative heights h_j, gross errors are removed using the 3σ criterion, and the mean over the n groups is then calculated.
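Step 3.3 can be sketched directly in numpy. A single-pass 3σ screen is assumed here; the patent does not say whether the criterion is applied once or iterated.

```python
import numpy as np

def mean_after_3sigma(heights):
    """Remove gross errors by the 3-sigma criterion, then average."""
    h = np.asarray(heights, dtype=float)
    mu, sigma = h.mean(), h.std()
    kept = h[np.abs(h - mu) <= 3.0 * sigma]  # discard |h - mean| > 3*std
    return kept.mean()
```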
Step 3.4: after the mean relative height is obtained, the heading deviation δψ(k) is calculated using the N point measurement values from step 3.2, based on the linear regression model.
In general, let [x y z]^T and [x_p y_p z_p]^T denote respectively the coordinates of the UAV and of the object in the inertial reference frame {I}, let (x_f′, y_f′) denote the pixel position of the object in the image, and let f be the focal length of the camera. The ranging model of the camera is then given by formula (4).
The attitude matrix is constructed from the heading angle, pitch angle, and roll angle of the UAV.
Here h′ is the relative height between the UAV and the object, and (ψ′, θ′, φ′) denote the heading angle, pitch angle, and roll angle of the UAV at a given measuring point. The pitch angle θ′ and roll angle φ′ are measured with high accuracy, so their errors are neglected, while the measurement of the heading angle ψ′ carries a large heading deviation.
In this embodiment, to calculate the heading deviation of the heading angle, N point measurements of the marker photographed by the UAV at different positions are used and solved by a linear regression method. The specific calculation is as follows: let [x_G y_G z_G]^T be the coordinates of the marker in the inertial reference frame {I}, set [x_p y_p z_p]^T = [x_G y_G z_G]^T, and let the mean relative height between the UAV and the marker be used for h′. Substituting into formula (4) gives formula (5).
Let the parameter θ = [θ_a θ_b]^T, with θ_a = [x_G, y_G]^T and θ_b = δψ(k). The measurement equations of the position and the attitude are respectively formula (6) and formula (7):
z_1(i) = y_1(i) + v_1, v_1 ~ N(0, R_1) (6)
z_2(i) = y_2(i) + v_2, v_2 ~ N(0, R_2) (7)
where v_1 and v_2 are measurement noises, and R_1 and R_2 are real symmetric positive definite matrices. Formula (5) is then transformed into
where the attitude deviation is introduced; equation (8) is then changed to
Combining formula (8) and formula (9) gives
The matrix A_i is set, where a_{1,3} ~ a_{2,5} denote the corresponding elements of A_i, and the matrix B_i, where b_{1,1} ~ b_{2,3} denote the corresponding elements of B_i. In this embodiment, the same marker is subjected to N-point vision measurement, so the corresponding matrices are A_1, ..., A_N and B_1, ..., B_N. From these measurements, a linear regression model is obtained,
where I_2 is the 2 × 2 identity matrix and the noise satisfies V ~ N(0, R). The covariance matrix is then formed, and the estimated value of the parameter θ is given by formula (12).
The heading deviation δψ(k) can be solved by equation (12).
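The estimator itself in formulas (10)-(12) did not survive extraction, but for a stacked linear regression model Z = Φθ + V with V ~ N(0, R), the standard generalized least-squares solution is θ̂ = (Φᵀ R⁻¹ Φ)⁻¹ Φᵀ R⁻¹ Z, which is presumably what equation (12) expresses. A numpy sketch under that assumption (all names hypothetical):

```python
import numpy as np

def gls_estimate(Phi, Z, R):
    """Generalized least squares:
    theta_hat = (Phi^T R^-1 Phi)^-1 Phi^T R^-1 Z."""
    Rinv = np.linalg.inv(R)
    normal = Phi.T @ Rinv @ Phi          # information matrix
    return np.linalg.solve(normal, Phi.T @ Rinv @ Z)
```

With R = I this reduces to ordinary least squares; the component of θ̂ corresponding to θ_b would give δψ(k).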
Step 3.5: let e be a constant. If |δψ(k) - δψ(k-1)| < e, the final estimated value of the relative height and the estimated value of the heading deviation are obtained, and step four is executed; otherwise, return to step 3.2, substitute the current δψ(k) into the calculation formulas of the left- and right-view heading angles to find ψ_l and ψ_r, and thereby perform an iterative calculation.
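The alternation of steps 3.2-3.5 is a fixed-point iteration: the binocular model maps a heading-deviation guess to a height, and the regression maps the height back to a heading deviation, until successive δψ values differ by less than e. A sketch of the loop, with the two stages passed in as callables (a hypothetical interface standing in for equations (1) and (12)):

```python
def iterate_height_heading(step_height, step_heading, e=0.02, max_iter=100):
    """Alternate height estimation (steps 3.2/3.3) and heading-deviation
    regression (step 3.4) until |dpsi(k) - dpsi(k-1)| < e (step 3.5)."""
    dpsi_prev = 0.0                      # initial value: delta psi(0) = 0
    h_bar = None
    for _ in range(max_iter):
        h_bar = step_height(dpsi_prev)   # binocular model with current dpsi
        dpsi = step_heading(h_bar)       # linear-regression update
        if abs(dpsi - dpsi_prev) < e:
            return h_bar, dpsi           # converged estimates
        dpsi_prev = dpsi
    return h_bar, dpsi_prev
```

The convergence argument later in the text (|k_2| > k_1 > 0) is precisely the contraction condition that makes such a loop converge.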
Step four: with the relative height and heading deviation effectively estimated, any target in the camera's field of view is selected and its measured value obtained. The true heading of the UAV is calculated using the obtained heading deviation, and the target is then accurately positioned in three dimensions from the true heading and the height estimate.
Specifically, the selected target is assumed to lie in the same plane as the marker, i.e., the estimated relative height can be taken as the relative height between the UAV and the target. Let [x_t y_t z_t]^T denote the coordinates of the target in the inertial reference frame {I}. Setting [x_p y_p z_p]^T = [x_t y_t z_t]^T and substituting the target's measured value and the UAV's true heading into formula (4), the coordinates of the target are calculated, thereby realizing three-dimensional positioning of the target.
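Formula (4), the ranging model, was an image and is not recoverable from the text, so the sketch below uses a generic pinhole back-projection for this final step: rotate the pixel ray into the inertial frame with the heading-corrected attitude matrix and scale it so that it descends exactly the estimated relative height. The z-down frame, the camera model, and all names are assumptions, not the patent's stated formula.

```python
import numpy as np

def locate_target(drone_pos, C_att, pixel, f, h_rel):
    """Intersect the pixel's viewing ray with the ground plane lying
    h_rel below the drone (z-down inertial frame assumed).
    C_att: camera-to-inertial rotation built from the corrected heading."""
    xf, yf = pixel
    ray = C_att @ np.array([xf, yf, f], dtype=float)  # ray in inertial frame
    s = h_rel / ray[2]          # scale so the ray drops exactly h_rel
    return drone_pos + s * ray  # target coordinates [x_t, y_t, z_t]
```

With the heading deviation estimated, the corrected heading ψ′ minus the deviation estimate would be used when building C_att.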
The effectiveness of the iterative process is described in detail below, taking a circular orbit as an example with radius R = 73 m and arc rad = 1.5π. For δψ = 0, 1, ..., 59, 60 deg, the corresponding 61 sets of mean relative heights are obtained from equation (1). A data-fitting method is then applied with the mean relative height as the dependent variable and δψ as the independent variable, as shown in FIG. 6, giving the mathematical relation between them.
in the same way, let36 groups of delta psi are obtained by solving the formula (10), then a data fitting method is used, the delta psi is taken as a dependent variable,as independent variable, as shown in FIG. 7, obtainIs expressed as:
Let e_h and e_δψ be the errors of the mean relative height and of δψ with respect to their true values h_t and δψ_t. The relation for e_h is obtained from equation (9), and that for e_δψ from equation (10), where k_1 and k_2 are the relevant parameters.
Substituting the relative height calculated by the binocular vision model into the linear regression equation allows the heading deviation to be calculated effectively. Substituting the heading-deviation estimate back into the binocular vision model then allows the relative height to be calculated accurately. In general, the heading deviation of the AHRS does not exceed 30 deg, so |k_2| > k_1 > 0, and since k_2 < 0, it can be seen from equations (15) and (16) that after a finite number of iterations the relative height estimate and the heading deviation estimate converge to the true values.
A flight test was carried out with the camera mounted on the UAV, which performed image measurement of the marker under the fly-around trajectory, with radius R = 73 m and arc rad = 1.5π. The true height of the UAV relative to the marker was h_t = 45 m, the flight speed V = 3.44 m/s, f_GPS = 4 Hz, the true heading deviation δψ_t = 30 deg, and e = 0.02 deg. The effect of the method provided by the invention is shown in Table 1, Table 2, and FIG. 8. The errors e_h, e_δψ, e_xy, e_z listed in the tables are root mean square errors.
TABLE 1 iterative procedure
TABLE 2 Target positioning results

Index                                        Three-dimensional positioning of the invention
Relative altitude estimation error e_h/m     0.93
Heading estimation error e_δψ/deg            1.89
Positioning error e_xy/m                     10.89
Positioning error e_z/m                      0.43
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (4)

1. A three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory, characterized by comprising the following steps:
step one, extracting a static object from an image shot by a rotor unmanned aerial vehicle as a marker;
step two, the rotor unmanned aerial vehicle flies around with the marker as the center, shoots the marker from N points during the fly-around, and obtains a measured value for each shot image; N is a positive integer;
step three, grouping the N shot images in pairs; for each group, the relative height of the rotor UAV relative to the marker is calculated using the binocular vision model of the rotated view, and the average over the groups is then taken as the relative height estimate in the current iteration round k;
when the relative height is calculated, the heading deviation δψ(k-1) obtained in the previous iteration round k-1 is used as the heading deviation δψ(k) required by the binocular vision model of the rotated view, and the heading angle ψ(k) required by the binocular vision model is updated as ψ(k) = ψ_i - δψ(k), where ψ_i is the heading angle of the UAV when shooting the ith image; the initial value δψ(0) is taken as 0;
step four, the heading deviation δψ(k) of the current iteration round k is calculated using the relative height estimate and the measured values of the N shot images;
step five, judging whether the deviation between δψ(k) and δψ(k-1) is smaller than a set threshold; if so, the last iteration result is taken as the relative height estimate and the heading deviation estimate, and step six is executed; otherwise, the iteration round k is increased by 1 and the process returns to step three;
step six, for any target in the field of view of the camera of the rotor UAV, the true heading of the rotor UAV is calculated using the heading deviation estimate, and three-dimensional positioning of the target is then achieved from the true heading and the height estimate;
in the fourth step, the specific way of calculating the heading deviation δ ψ (k) is as follows:
[x_G y_G z_G]^T denotes the coordinates of the marker in the inertial reference frame {I};
the ranging model of the camera is formula (3), in which f is the focal length of the camera and the attitude matrix is that of the drone at the time the ith image was taken, 1 ≤ i ≤ N;
here O_i denotes the UAV shooting point when the ith image is taken, with position in the inertial reference frame {I} and attitude (ψ_i, θ_i, φ_i), where ψ_i, θ_i, φ_i are respectively the azimuth angle, pitch angle, and roll angle; the pixel position of the marker in the ith image is the remaining measured quantity.
Let the parameter θ = [θ_a θ_b]^T, with θ_a = [x_G, y_G]^T and θ_b = δψ(k); the measurement equation set is
where v_1 and v_2 are measurement noises and R_1 and R_2 are real symmetric positive definite matrices; formula (3) is transformed into
where the attitude deviation is introduced; equation (4) is then changed to
combining formula (4) and formula (5) gives
the matrix A_i is set, where a_{1,3} ~ a_{2,5} denote the corresponding elements of A_i;
the matrix B_i is set, where b_{1,1} ~ b_{2,3} denote the corresponding elements of B_i; the following linear regression model is obtained,
where I_2 is the 2 × 2 identity matrix and the noise satisfies V ~ N(0, R);
the covariance matrix is then formed;
the estimated value of the parameter θ follows as formula (8).
The heading deviation δψ(k) can be solved by equation (8).
2. The method of claim 1, wherein the relative height h_j of the drone relative to the marker, with 1 ≤ j ≤ n, is calculated as follows:
where T = T_l - R^T T_r = R_l(O_r - O_l), M = [P_l  -R^T P_r  P_l × R^T P_r]^(-1) T, and R = R_r R_l^T; P_l and P_r are the pixel positions of the marker in the left and right views respectively; R and T are the rotation matrix and translation matrix of the right-view camera coordinate system relative to the left-view camera coordinate system; R_l and T_l are respectively the rotation matrix and translation matrix of the left-view UAV shooting point O_l relative to the inertial reference frame, and R_r and T_r are respectively the rotation matrix and translation matrix of the right-view UAV shooting point O_r relative to the inertial reference frame.
3. A method of three-dimensional targeting of a rotary-wing drone according to claim 2, wherein the measurement of the marker pixel locations is by:
identifying the marker image obtained in the step one to obtain a plurality of characteristic points; and identifying each image in the process of flying around to obtain a plurality of characteristic points, matching the characteristic points of each image with the characteristic points of the marker image, and taking the geometric center of the matched points as the pixel position of the marker in the image.
4. The method of claim 1, wherein step three involves eliminating gross errors of relative height using a 3 σ criterion prior to averaging the N sets.
CN201610943473.4A 2016-11-01 2016-11-01 Three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory Active CN107300377B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610943473.4A CN107300377B (en) 2016-11-01 2016-11-01 Three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory


Publications (2)

Publication Number Publication Date
CN107300377A CN107300377A (en) 2017-10-27
CN107300377B true CN107300377B (en) 2019-06-14

Family

ID=60138055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610943473.4A Active CN107300377B (en) 2016-11-01 2016-11-01 Three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory

Country Status (1)

Country Link
CN (1) CN107300377B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109708622A (en) * 2017-12-15 2019-05-03 福建工程学院 The method that three-dimensional modeling is carried out to building using unmanned plane based on Pixhawk
WO2019165612A1 (en) * 2018-02-28 2019-09-06 深圳市大疆创新科技有限公司 Method for positioning a movable platform, and related device and system
WO2020014909A1 (en) * 2018-07-18 2020-01-23 深圳市大疆创新科技有限公司 Photographing method and device and unmanned aerial vehicle
CN112567201B (en) * 2018-08-21 2024-04-16 深圳市大疆创新科技有限公司 Distance measuring method and device
CN110632941B (en) * 2019-09-25 2020-12-15 北京理工大学 Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment
CN110675453B (en) * 2019-10-16 2021-04-13 北京天睿空间科技股份有限公司 Self-positioning method for moving target in known scene
CN110824295B (en) * 2019-10-22 2021-08-31 广东电网有限责任公司 Infrared thermal image fault positioning method based on three-dimensional graph
CN113469139B (en) * 2021-07-30 2022-04-05 广州中科智云科技有限公司 Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip
CN115272892B (en) * 2022-07-29 2023-07-11 同济大学 Unmanned aerial vehicle positioning deviation monitoring management and control system based on data analysis
CN117452831B (en) * 2023-12-26 2024-03-19 南京信息工程大学 Four-rotor unmanned aerial vehicle control method, device, system and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221072A (en) * 1997-02-03 1998-08-21 Asahi Optical Co Ltd System and method for photogrammetry
JP2003083744A (en) * 2001-09-12 2003-03-19 Starlabo Corp Imaging apparatus mounted to aircraft, and aircraft imaging data processing apparatus
CN102519434A (en) * 2011-12-08 2012-06-27 北京控制工程研究所 Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN105424006A (en) * 2015-11-02 2016-03-23 国网山东省电力公司电力科学研究院 Unmanned aerial vehicle hovering precision measurement method based on binocular vision



Similar Documents

Publication Publication Date Title
CN107300377B (en) Three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory
CN106153008B (en) A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model
CN107314771B (en) Unmanned aerial vehicle positioning and attitude angle measuring method based on coding mark points
EP3158417B1 (en) Sensor fusion using inertial and image sensors
EP3158293B1 (en) Sensor fusion using inertial and image sensors
EP3158412B1 (en) Sensor fusion using inertial and image sensors
CN112037260B (en) Position estimation method and device for tracking target and unmanned aerial vehicle
CN105335733B (en) Unmanned aerial vehicle autonomous landing visual positioning method and system
CN105549614B (en) Unmanned plane target tracking
CN108665499B (en) Near distance airplane pose measuring method based on parallax method
CN109683629B (en) Unmanned aerial vehicle electric power overhead line system based on combination navigation and computer vision
CN108955685B (en) Refueling aircraft taper sleeve pose measuring method based on stereoscopic vision
EP3734394A1 (en) Sensor fusion using inertial and image sensors
CN107067437B (en) Unmanned aerial vehicle positioning system and method based on multi-view geometry and bundle adjustment
RU2550811C1 (en) Method and device for object coordinates determination
US9816786B2 (en) Method for automatically generating a three-dimensional reference model as terrain information for an imaging device
CN106155081A (en) A kind of rotor wing unmanned aerial vehicle target monitoring on a large scale and accurate positioning method
CN108007437B (en) Method for measuring farmland boundary and internal obstacles based on multi-rotor aircraft
CN115144879B (en) Multi-machine multi-target dynamic positioning system and method
CN108445900A (en) A kind of unmanned plane vision positioning replacement differential technique
CN105389819B (en) A kind of lower visible image method for correcting polar line of half calibration and system of robust
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system
CN115388890A (en) Visual sense-based multi-unmanned aerial vehicle cooperative ground target positioning method
CN115272458A (en) Visual positioning method for fixed wing unmanned aerial vehicle in landing stage
CN114777768A (en) High-precision positioning method and system for satellite rejection environment and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant