CN107300377A - Target localization method for a rotor unmanned aerial vehicle under a fly-around trajectory - Google Patents
Target localization method for a rotor unmanned aerial vehicle under a fly-around trajectory
- Publication number
- CN107300377A CN107300377A CN201610943473.4A CN201610943473A CN107300377A CN 107300377 A CN107300377 A CN 107300377A CN 201610943473 A CN201610943473 A CN 201610943473A CN 107300377 A CN107300377 A CN 107300377A
- Authority
- CN
- China
- Prior art keywords
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 32
- 230000004807 localization Effects 0.000 title abstract 3
- 238000005259 measurement Methods 0.000 claims abstract description 15
- 238000012417 linear regression Methods 0.000 claims abstract description 8
- 230000000007 visual effect Effects 0.000 claims abstract description 6
- 230000003068 static effect Effects 0.000 claims abstract description 3
- 239000003550 marker Substances 0.000 claims description 29
- 239000011159 matrix material Substances 0.000 claims description 26
- 238000004364 calculation method Methods 0.000 claims description 6
- 238000013519 translation Methods 0.000 claims description 5
- 238000010586 diagram Methods 0.000 claims description 3
- 238000012935 Averaging Methods 0.000 claims 1
- 230000008685 targeting Effects 0.000 claims 1
- 230000005540 biological transmission Effects 0.000 description 2
- 230000001419 dependent effect Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000012804 iterative process Methods 0.000 description 2
- 238000004891 communication Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 238000011835 investigation Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000011105 stabilization Methods 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Image Analysis (AREA)
Abstract
The present invention discloses a target localization method for a rotor unmanned aerial vehicle under a fly-around trajectory. A single camera mounted on the unmanned aerial vehicle shoots images of the target, and the images are transmitted back to the ground station; a marker with obvious features is selected and visually identified; the rotor unmanned aerial vehicle then flies around the marker, performing multi-point image measurement, and a method based on mutual iteration between a binocular vision model and a linear regression model calculates the height and heading deviation of the unmanned aerial vehicle relative to the terrain where the target is located; the operator may then select any static or moving target within the field of view of the camera and realize accurate three-dimensional positioning of the target. The invention is carried out within a single flight mission: the front segment of the flight calculates the heading deviation and the relative height, and the rear segment performs accurate three-dimensional positioning. The invention solves the problem that the traditional triangulation method cannot calculate the relative height under a fly-around trajectory, thereby realizing three-dimensional localization of the target.
Description
Technical Field
The invention belongs to the field of vision measurement, and particularly relates to a three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory.
Background
The rotor unmanned aerial vehicle has the characteristics of low cost, vertical take-off and landing, aerial hovering and the like, and is widely applied in fields such as reconnaissance, agricultural insurance, environmental protection and post-disaster rescue.
Vision-based target positioning for rotor unmanned aerial vehicles is one of the current research hotspots. When a vision method is used for three-dimensional target positioning, the relative height between the unmanned aerial vehicle and the target must first be determined, typically by triangulation, before the target can be positioned. However, the low-precision AHRS (attitude and heading reference system) carried by a rotor unmanned aerial vehicle introduces a large heading deviation, so that when the images shot by the unmanned aerial vehicle are used for vision measurement, the rays projected from the images deviate accordingly. With the traditional triangulation method, the relative height solved from the two groups of rays projected from the left and right views therefore contains a large error; the relative height between the unmanned aerial vehicle and the object cannot be accurately calculated, and effective three-dimensional positioning of the target is impossible.
Disclosure of Invention
In view of the above, the invention provides a three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory, which can calculate the heading deviation and reduce the calculation error of the relative height, thereby improving the three-dimensional positioning capability of the rotor unmanned aerial vehicle with respect to the target.
Advantageous effects:
(1) The method provided by the invention is aimed at rotor unmanned aerial vehicles equipped with a low-precision AHRS attitude and heading reference system. It can accurately calculate the heading deviation of the AHRS and, from it, the height between the rotor unmanned aerial vehicle and the terrain where the target is located under the fly-around trajectory, thereby realizing three-dimensional visual positioning of the target by the rotor unmanned aerial vehicle.
Drawings
Fig. 1 is a block diagram of a target three-dimensional positioning system for a rotary-wing drone in accordance with the present invention;
FIG. 2 is a flow chart of a method provided by the present invention;
FIG. 3 is a schematic view of a rotated view binocular vision model used in the present invention;
FIG. 4 is a schematic view of a monocular camera ranging model used in the present invention;
FIG. 5 is a flow chart of an iterative process in the method provided by the present invention;
FIG. 6 is the fitted data curve of the relative height h̄ versus the heading deviation ψ in the method provided by the present invention;
FIG. 7 is the fitted data curve of the heading deviation ψ versus the relative height h̄ in the method provided by the present invention;
FIG. 8 is a diagram illustrating the positioning effect of the method of the present invention.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
The following experimental platform is set up to verify the effectiveness of the invention: one T650 quadrotor unmanned aerial vehicle and one notebook computer serving as the ground station, with real-time communication between the unmanned aerial vehicle and the ground station; the system structure is shown in FIG. 1.
The unmanned aerial vehicle carries a GPS (global positioning system), an AHRS (attitude and heading reference system), an altimeter, a wireless image transmission module and a wireless data transceiver module, and its APM (ArduPilot Mega, by 3D Robotics) flight control system works in self-stabilization mode to ensure stable flight. A camera is installed at the head of the unmanned aerial vehicle with a depression angle β of 45 degrees; the images are transmitted back to the ground station through the wireless image transmission module, and the position, attitude and elevation information obtained respectively by the GPS, the AHRS and the altimeter are transmitted to the ground station through the wireless data transceiver module.
The ground station is built around the computer, which runs the visual positioning and related algorithms and connects to a wireless data transceiver module through a USB interface, realizing two-way communication between the unmanned aerial vehicle and the ground station.
Based on the experimental platform, as shown in fig. 2, a three-dimensional target positioning method for a rotor unmanned aerial vehicle under a flying-around track comprises the following steps:
Step one, after the system is started, an image is shot with the camera carried on the unmanned aerial vehicle and transmitted back to the ground station;
Step two, a static object with a clear outline is selected from the returned image as the marker, and the marker is visually identified;
The specific process of visually identifying the marker in step two is as follows:
the marker is identified by using SIFT algorithm to obtain m feature points P1,P2...Pm-1,PmStoring the characteristic points as a template, wherein m is an integer;
Step three, the rotor unmanned aerial vehicle flies around with the marker as the center, multi-point image measurement of the marker is carried out using the visual identification result, and the height and heading deviation of the unmanned aerial vehicle relative to the terrain where the target is located are calculated by a method of mutual iteration between a binocular vision model and a linear regression model;
the flow chart of the third step is shown in fig. 5, and the specific process is as follows:
step 3.1, the rotor unmanned aerial vehicle measures the N images according to time sequence by using visual identification under the flying track, performs feature extraction on the ith current image by using SIFT algorithm (i is more than or equal to 1 and less than or equal to N), and then matches the feature points of the ith current image by using the feature points in the template to obtain w groups of matching points P1,P2...Pw-1,Pw(w ≦ m), and finally taking the geometric center P of these matching pointsf(f ≦ w) represents the pixel position of the marker in the image, notedAnd recording the measured value at the time of measurement on the ith image, including: unmanned shooting point OiPosition in inertial reference system { I }And attitude (psi)i,θi,φi),ψi,θi,φiAzimuth, elevation and roll, respectively.
Step 3.2, select any two of the N images, giving n groups in total (all pairwise combinations); the earlier-shot image is taken as the left view L and the later-shot image as the right view R, constructing the binocular vision model of the rotated view, as shown in FIG. 3.
Calculate the relative height h_j, 1 ≤ j ≤ n, of the unmanned aerial vehicle relative to the marker:

$$h_j = \frac{1}{2}\begin{bmatrix} 0 & 0 & 1 \end{bmatrix} R_l^T \left( \left( \begin{bmatrix} 1 & 0 & 0 \end{bmatrix} M \right) P_l + \left( \begin{bmatrix} 0 & 1 & 0 \end{bmatrix} M \right) R^T P_r + T \right) \qquad (1)$$

wherein P_l, P_r are the pixel positions of the marker in the left and right views respectively, and R_l, T_l are the rotation matrix and translation matrix of the left-view shooting point O_l relative to the inertial reference frame,
wherein ψ_l, θ_l, φ_l are the heading angle, pitch angle and roll angle of the left-view shooting point O_l, and ψ_r, θ_r, φ_r those of the right-view shooting point O_r. With ψ(k) as the heading deviation, ψ_l = ψ_i − ψ(k), k being the number of iterations, θ_l = θ_i, φ_l = φ_i (1 ≤ i < N), and the initial value ψ(0) is taken as 0;
R_r, T_r are the rotation matrix and translation matrix of the right-view shooting point O_r relative to the inertial reference frame,
wherein ψ_r = ψ_m − ψ(k), θ_r = θ_m, φ_r = φ_m (i < m ≤ N).
The coordinates of the shooting points corresponding to the left and right views in the inertial reference frame are O_l and O_r respectively. R and T are the rotation matrix and translation matrix of the camera coordinate system corresponding to the right view relative to that corresponding to the left view: R = R_r R_l^T, T = T_l − R^T T_r = R_l(O_r − O_l); M = [P_l, −R^T P_r, P_l × R^T P_r]^{−1} T.
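A NumPy sketch of the formula (1) computation for one left/right pair; it assumes the marker "pixel positions" P_l, P_r are expressed as (x_f, y_f, f) rays in the respective camera frames, as the model above implies:

```python
import numpy as np

def relative_height(P_l, P_r, R_l, R_r, O_l, O_r):
    """Relative height h_j from one view pair, following formula (1).
    R_l, R_r: rotation matrices of the two shooting points relative to
    the inertial frame; O_l, O_r: their inertial positions."""
    R = R_r @ R_l.T                              # right camera w.r.t. left camera
    T = R_l @ (O_r - O_l)                        # T = T_l - R^T T_r
    M = np.linalg.inv(
        np.column_stack([P_l, -R.T @ P_r, np.cross(P_l, R.T @ P_r)])) @ T
    # midpoint of the two projection rays, then its vertical component
    mid = 0.5 * (M[0] * P_l + M[1] * (R.T @ P_r) + T)
    return (R_l.T @ mid)[2]                      # h_j = [0 0 1] R_l^T (...)
```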
Step 3.3, for the n groups of calculated relative heights h_j, remove gross errors using the 3σ criterion, and then take the average h̄(k) of the remaining values.
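Step 3.3 as a short sketch (the function name is illustrative):

```python
import numpy as np

def robust_mean_height(h_values, n_sigma=3.0):
    """Average the n pairwise heights h_j after removing gross errors
    with the 3-sigma criterion."""
    h = np.asarray(h_values)
    mu, sigma = h.mean(), h.std()
    return h[np.abs(h - mu) <= n_sigma * sigma].mean()
```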
Step 3.4, after the relative height h̄(k) is obtained, calculate the heading deviation ψ(k) from the N point measurements recorded in step 3.1, based on the linear regression model;
In general, let [x y z]^T and [x_p y_p z_p]^T denote the coordinates of the unmanned aerial vehicle and of the object in the inertial reference frame {I} respectively, (x_f', y_f') the pixel position of the object in the image, and f the focal length of the camera; the ranging model of the camera is

$$\begin{pmatrix} x_p \\ y_p \end{pmatrix} = \begin{pmatrix} x \\ y \end{pmatrix} + h'\,\frac{1}{(0,0,1)\,C_b^n\!\begin{pmatrix} x_f' \\ y_f' \\ f \end{pmatrix}}\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} C_b^n\!\begin{pmatrix} x_f' \\ y_f' \\ f \end{pmatrix} \qquad (4)$$
The attitude matrix $C_b^n$ is

$$C_b^n = \begin{bmatrix} \cos\psi'\cos\theta' & \cos\psi'\sin\theta'\sin\phi' - \sin\psi'\cos\phi' & \sin\psi'\sin\phi' + \cos\psi'\sin\theta'\cos\phi' \\ \sin\psi'\cos\theta' & \cos\psi'\cos\phi' + \sin\psi'\sin\theta'\sin\phi' & \sin\psi'\sin\theta'\cos\phi' - \cos\psi'\sin\phi' \\ -\sin\theta' & \sin\phi'\cos\theta' & \cos\phi'\cos\theta' \end{bmatrix}$$

wherein h' is the relative height between the unmanned aerial vehicle and the object, and (ψ', θ', φ') are the heading angle, pitch angle and roll angle of the unmanned aerial vehicle at the measuring point. The measurement accuracy of the pitch angle θ' and the roll angle φ' is high, so their errors are neglected, whereas the measurement of the heading angle ψ' contains a large heading deviation.
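The attitude matrix can be sketched as code, assuming the Z-Y-X Euler convention implied by the matrix above:

```python
import numpy as np

def attitude_matrix(psi, theta, phi):
    """C_b^n for heading psi, pitch theta and roll phi (radians),
    matching the matrix written out above."""
    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cph, sph = np.cos(phi), np.sin(phi)
    return np.array([
        [cps * cth, cps * sth * sph - sps * cph, sps * sph + cps * sth * cph],
        [sps * cth, cps * cph + sps * sth * sph, sps * sth * cph - cps * sph],
        [-sth,      sph * cth,                   cph * cth],
    ])
```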
In this embodiment, in order to calculate the heading deviation of the heading angle, N point measurements of the marker shot by the unmanned aerial vehicle at different positions are used and solved by a linear regression method; the specific calculation process is as follows. [x_G y_G z_G]^T are the coordinates of the marker in the inertial reference frame {I}; let [x_p y_p z_p]^T = [x_G y_G z_G]^T. h̄(k) is the average value of the relative heights of the unmanned aerial vehicle and the marker; let h' = h̄(k). Substituting into formula (4) gives

$$\begin{pmatrix} x_G \\ y_G \end{pmatrix} = \begin{pmatrix} x_o^i \\ y_o^i \end{pmatrix} + \bar{h}(k)\,\frac{1}{(0,0,1)\,C_b^n\!\begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix}}\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} C_b^n\!\begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix} \qquad (5)$$
Let the parameter θ = [θ_a, θ_b]^T, with θ_a = [x_G, y_G]^T and θ_b = ψ(k). The measurement equations of the position and of the attitude are formula (6) and formula (7) respectively:

$$z_1(i) = y_1(i) + v_1, \quad v_1 \sim N(0, R_1) \qquad (6)$$

$$C_{bc}^n(i) \approx C_b^n(i) + \delta C_b^n(i)\,(\theta_b + v_2), \quad v_2 \sim N(0, R_2) \qquad (7)$$

wherein v_1, v_2 are measurement noises and R_1, R_2 are real symmetric positive definite matrices. Formula (5) is transformed into

$$\theta_a = f\!\left(z_1(i) - v_1,\; C_{bc}^n(i) - \delta C_b^n(i)(\theta_b + v_2)\right) \qquad (8)$$
wherein $\delta C_b^n(i)$ is the attitude deviation; expanding formula (8) to first order, it becomes

$$f\!\left(z_1(i)-v_1,\; C_{bc}^n(i)-\delta C_b^n(i)(\theta_b+v_2)\right) \approx f\!\left(z_1(i), C_{bc}^n(i)\right) - \frac{\partial f}{\partial y_1}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_1 - \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_2 - \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} \theta_b \qquad (9)$$
From formula (8) and formula (9) it follows that

$$f\!\left(z_1(i), C_{bc}^n(i)\right) \approx \theta_a + \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} \theta_b + \frac{\partial f}{\partial y_1}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_1 + \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_2 \qquad (10)$$
Define the matrix $A_i = \frac{\partial f}{\partial y_1}\big|_{z_1(i),\,C_{bc}^n(i)}$, wherein a_{1,3}~a_{2,5} denote the corresponding elements of A_i, and the matrix $B_i = \frac{\partial f}{\partial \theta_b}\big|_{z_1(i),\,C_{bc}^n(i)}$, wherein b_{1,1}~b_{2,3} denote the corresponding elements of B_i. In this embodiment, N-point vision measurement is performed on the same marker, so the corresponding matrices are A_1, …, A_N and B_1, …, B_N. From these measurements the linear regression model is obtained:

$$f\!\left(z_1(i), C_{bc}^n(i)\right) = [I_2, B_i]\,\theta + V \qquad (11)$$
wherein I_2 is the 2 × 2 identity matrix and the noise V ~ N(0, R), with covariance matrix

$$R = \mathrm{diag}\left( \left\{ A_i R_1 A_i^T + B_i R_2 B_i^T \right\}_{i=1}^{N} \right)$$
The estimated value of the parameter θ is

$$\hat\theta = \begin{bmatrix} x_G & y_G & \psi(k) \end{bmatrix}^T = \left[ \sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \left( A_i R_1 A_i^T + B_i R_2 B_i^T \right)^{-1} \begin{bmatrix} I_2, B_i \end{bmatrix} \right]^{-1} \sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \left( A_i R_1 A_i^T + B_i R_2 B_i^T \right)^{-1} f\!\left(z_1(i), C_{bc}^n(i)\right) \qquad (12)$$
The heading deviation ψ(k) is solved from formula (12).
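A sketch of the formula (12) estimator; A_i, B_i are the Jacobian blocks defined above, here assumed shaped (2, k) and (2, 1) with R_1 of size k x k and R_2 of size 1 x 1 (shape assumptions for illustration; the patent's own block sizes apply):

```python
import numpy as np

def gls_heading(f_vals, A_list, B_list, R1, R2):
    """Weighted least-squares solution of the linear regression model:
    f(z1(i), C_bc^n(i)) = [I2, B_i] theta + V_i,
    cov(V_i) = A_i R1 A_i^T + B_i R2 B_i^T (formula (12))."""
    I2 = np.eye(2)
    normal = np.zeros((3, 3))                       # normal-equation matrix
    rhs = np.zeros(3)
    for f_i, A_i, B_i in zip(f_vals, A_list, B_list):
        H_i = np.hstack([I2, B_i])                  # [I2, B_i], 2 x 3
        S_i = A_i @ R1 @ A_i.T + B_i @ R2 @ B_i.T   # per-point noise covariance
        W_i = np.linalg.inv(S_i)
        normal += H_i.T @ W_i @ H_i
        rhs += H_i.T @ W_i @ f_i
    x_G, y_G, psi_k = np.linalg.solve(normal, rhs)  # theta = [x_G, y_G, psi(k)]
    return psi_k
```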
Step 3.5, set e as a constant threshold. If |ψ(k) − ψ(k−1)| < e, take the final estimate of the relative height as ĥ = h̄(k) and the estimate of the heading deviation as ψ̂ = ψ(k), and execute step four; otherwise, return to step 3.2 and substitute the current ψ(k) into the calculation of the left- and right-view heading angles, thereby performing an iterative calculation.
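The mutual iteration of steps 3.2 to 3.5 then reduces to the following loop; pair_heights and regress_heading are hypothetical wrappers around the formula (1) and formula (12) computations sketched above:

```python
def iterate_height_and_heading(measurements, e=0.02):
    """Alternate binocular height estimation and regression heading
    estimation until |psi(k) - psi(k-1)| < e (step 3.5)."""
    psi_prev = 0.0                                    # psi(0) = 0
    while True:
        h_j = pair_heights(measurements, psi_prev)    # formula (1), all pairs
        h_bar = robust_mean_height(h_j)               # step 3.3
        psi_k = regress_heading(measurements, h_bar)  # formula (12)
        if abs(psi_k - psi_prev) < e:
            return h_bar, psi_k                       # final estimates
        psi_prev = psi_k
```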
Step four, with the relative height and the heading deviation effectively estimated, select any target in the field of view of the camera and obtain its measured value; calculate the true heading of the unmanned aerial vehicle using the obtained heading deviation, and then, from the true heading and the height estimate ĥ, realize accurate three-dimensional positioning of the target.
Specifically, it is assumed that the selected target lies in the same plane as the marker, i.e. the estimated relative height ĥ can be regarded as the relative height between the unmanned aerial vehicle and the target. Let [x_t y_t z_t]^T denote the coordinates of the target in the inertial reference frame {I}; set [x_p y_p z_p]^T = [x_t y_t z_t]^T and h' = ĥ, substitute the measured value of the target and the true heading of the unmanned aerial vehicle into formula (4), and calculate the coordinates of the target, thereby realizing its three-dimensional positioning.
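A sketch of this final step, directly applying formula (4) under the same-plane assumption; all names are illustrative:

```python
import numpy as np

def locate_target(drone_xy, C_bn, pixel, f, h_hat):
    """Three-dimensional target fix from the monocular ranging model
    (formula (4)). C_bn is the attitude matrix rebuilt with the
    corrected (true) heading; pixel = (x_f', y_f'); h_hat is the
    estimated relative height."""
    ray = C_bn @ np.array([pixel[0], pixel[1], f])
    scale = h_hat / ray[2]             # stretch the ray to the ground plane
    x_t = drone_xy[0] + scale * ray[0]
    y_t = drone_xy[1] + scale * ray[1]
    return np.array([x_t, y_t, 0.0])   # z: same terrain plane as the marker
```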
The effectiveness of the iterative process is analyzed below, taking the circular orbit as an example, with radius R of 73 m and arc rad of 1.5π. Sweeping the heading deviation ψ and solving the binocular vision model for the corresponding relative height, a data-fitting method is applied with h̄ as the dependent variable and ψ as the independent variable, as shown in FIG. 6, to obtain the mathematical relation between h̄ and ψ, formula (13).
In the same way, 36 groups of ψ are obtained by solving formula (10); a data-fitting method is then applied with ψ as the dependent variable and h̄ as the independent variable, as shown in FIG. 7, to obtain the relation between ψ and h̄, formula (14).
Let e_h = h̄ − h_t and e_ψ = ψ − ψ_t, where h_t, ψ_t are the true values of the relative height and of the heading deviation. The corresponding error relations follow from formula (9) as formula (15), and from formula (10) as formula (16),
wherein k_1, k_2 are the relevant fitted parameters.
The relative height calculated by the binocular vision model is substituted into the linear regression equation, so the heading deviation can be effectively calculated; the estimated heading deviation is then substituted back into the binocular vision model, and the relative height can be accurately calculated. Generally, the heading deviation of the AHRS does not exceed 30 deg, so |k_2| > k_1 > 0; and since k_2 < 0, it can be seen from formulas (15) and (16) that, after a finite number of iterations, the relative height estimate ĥ and the heading deviation estimate ψ̂ converge to the true values.
With the camera carried on the unmanned aerial vehicle, a flight test was carried out in which the unmanned aerial vehicle performed image measurements of the marker on the fly-around orbit, with orbit radius R of 73 m, arc rad of 1.5π, true height of the unmanned aerial vehicle relative to the marker h_t of 45 m, flight speed V of 3.44 m/s, GPS update rate f_GPS of 4 Hz, true heading deviation ψ_t of 30 deg, and threshold e of 0.02 deg. The effects of the method provided by the invention are shown in Table 1, Table 2 and FIG. 8. The errors e_h, e_ψ, e_xy, e_z listed in the tables are root mean square errors.
TABLE 1 iterative procedure
TABLE 2 target location results
Index | Three-dimensional positioning of the invention
---|---
Relative altitude estimation error e_h/m | 0.93
Heading estimation error e_δψ/deg | 1.89
Positioning error e_xy/m | 10.89
Positioning error e_z/m | 0.43
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (5)
1. A three-dimensional target positioning method for a rotor unmanned aerial vehicle under a fly-around trajectory, characterized by comprising the following steps:
step one, extracting a static object from an image shot by a rotor unmanned aerial vehicle as a marker;
step two, the rotor unmanned aerial vehicle flies around with the marker as the center, shoots the marker from N angles during the fly-around, and obtains a measured value for each shot image, N being a positive integer;
step three, grouping the N shot images in pairs, calculating for each group the relative height of the rotor unmanned aerial vehicle relative to the marker using the binocular vision model of the rotated view, and then taking the average of the groups as the relative height h̄(k) of the current iteration round k;
when the relative height is calculated, the heading deviation required by the binocular vision model of the rotated view adopts the heading deviation ψ(k−1) obtained in the previous iteration round k−1, and the heading angle required by the binocular vision model is updated to ψ_i − ψ(k−1), wherein ψ_i is the heading angle of the unmanned aerial vehicle when the i-th image is shot; the initial value ψ(0) is taken as 0;
step four, calculating the heading deviation ψ(k) of the current iteration round k using the relative height h̄(k) and the measured values of the N shot images;
step five, judging whether the deviation between ψ(k) and ψ(k−1) is smaller than a set threshold; if so, taking the result of the last iteration as the relative height estimate ĥ and the heading deviation estimate ψ̂, and executing step six; otherwise, adding 1 to the iteration round k and returning to step three;
step six, for any target in the field of view of the camera of the rotor unmanned aerial vehicle, calculating the true heading of the rotor unmanned aerial vehicle using the heading deviation estimate ψ̂, and then realizing the three-dimensional positioning of the target from the true heading and the height estimate ĥ.
2. The method of claim 1, wherein the relative height h_j of the unmanned aerial vehicle relative to the marker, 1 ≤ j ≤ n, is calculated as:
$$h_j = \frac{1}{2}\begin{bmatrix} 0 & 0 & 1 \end{bmatrix} R_l^T \left( \left( \begin{bmatrix} 1 & 0 & 0 \end{bmatrix} M \right) P_l + \left( \begin{bmatrix} 0 & 1 & 0 \end{bmatrix} M \right) R^T P_r + T \right) \qquad (1)$$
wherein T = T_l − R^T T_r = R_l(O_r − O_l), M = [P_l, −R^T P_r, P_l × R^T P_r]^{−1} T, R = R_r R_l^T; P_l, P_r are the pixel positions of the marker in the left view and the right view respectively; R and T are the rotation matrix and translation matrix of the camera coordinate system corresponding to the right view relative to the camera coordinate system corresponding to the left view; R_l, T_l are the rotation matrix and translation matrix of the left-view shooting point O_l relative to the inertial reference frame, and R_r, T_r are the rotation matrix and translation matrix of the right-view shooting point O_r relative to the inertial reference frame.
3. The method of three-dimensional target positioning of a rotor unmanned aerial vehicle according to claim 2, wherein the pixel position of the marker is measured by:
identifying the marker image obtained in step one to obtain a plurality of feature points; identifying each image taken during the fly-around to obtain a plurality of feature points; matching the feature points of each image with those of the marker image; and taking the geometric center of the matched points as the pixel position of the marker in the image.
4. The method of three-dimensional target positioning of a rotor unmanned aerial vehicle according to claim 1, wherein in step four the heading deviation ψ(k) is calculated by:
[x_G y_G z_G]^T represents the coordinates of the marker in the inertial reference frame {I}, and h̄(k) is the average value of the relative heights of the unmanned aerial vehicle and the marker; it can be obtained that
The ranging model of the camera is as follows:
$$\begin{pmatrix} x_G \\ y_G \end{pmatrix} = \begin{pmatrix} x_o^i \\ y_o^i \end{pmatrix} + \bar{h}(k)\,\frac{1}{(0,0,1)\,C_b^n\!\begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix}}\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix} C_b^n\!\begin{pmatrix} x_f^i \\ y_f^i \\ f \end{pmatrix} \qquad (2)$$
wherein $C_b^n(i)$ is the attitude matrix of the unmanned aerial vehicle when the i-th image is shot, 1 ≤ i ≤ N:

$$C_b^n(i) = \begin{bmatrix} \cos\psi_i\cos\theta_i & \cos\psi_i\sin\theta_i\sin\phi_i - \sin\psi_i\cos\phi_i & \sin\psi_i\sin\phi_i + \cos\psi_i\sin\theta_i\cos\phi_i \\ \sin\psi_i\cos\theta_i & \cos\psi_i\cos\phi_i + \sin\psi_i\sin\theta_i\sin\phi_i & \sin\psi_i\sin\theta_i\cos\phi_i - \cos\psi_i\sin\phi_i \\ -\sin\theta_i & \sin\phi_i\cos\theta_i & \cos\phi_i\cos\theta_i \end{bmatrix}$$

wherein (x_o^i, y_o^i, z_o^i)^T and (ψ_i, θ_i, φ_i) are the position and attitude of the shooting point O_i in the inertial reference frame {I} when the i-th image is shot, ψ_i, θ_i, φ_i being the azimuth (heading) angle, the pitch angle and the roll angle respectively, and (x_f^i, y_f^i) is the pixel position of the marker in the i-th image.
Let the parameter θ = [θ_a, θ_b]^T, with θ_a = [x_G, y_G]^T and θ_b = ψ(k); the measurement equation set is
$$\begin{cases} z_1(i) = y_1(i) + v_1, & v_1 \sim N(0, R_1) \\ C_{bc}^n(i) \approx C_b^n(i) + \delta C_b^n(i)\,(\theta_b + v_2), & v_2 \sim N(0, R_2) \end{cases} \qquad (3)$$
wherein v_1, v_2 are measurement noises and R_1, R_2 are real symmetric positive definite matrices; formula (3) is transformed into
$$\theta_a = f\!\left(z_1(i) - v_1,\; C_{bc}^n(i) - \delta C_b^n(i)(\theta_b + v_2)\right) \qquad (4)$$
wherein $\delta C_b^n(i)$ is the attitude deviation; formula (4) is changed to
$$f\!\left(z_1(i)-v_1,\; C_{bc}^n(i)-\delta C_b^n(i)(\theta_b+v_2)\right) \approx f\!\left(z_1(i), C_{bc}^n(i)\right) - \frac{\partial f}{\partial y_1}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_1 - \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_2 - \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} \theta_b \qquad (5)$$
From formula (4) and formula (5) it is obtained that
$$f\!\left(z_1(i), C_{bc}^n(i)\right) \approx \theta_a + \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} \theta_b + \frac{\partial f}{\partial y_1}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_1 + \frac{\partial f}{\partial \theta_b}\bigg|_{z_1(i),\,C_{bc}^n(i)} v_2 \qquad (6)$$
Define the matrix $A_i = \frac{\partial f}{\partial y_1}\big|_{z_1(i),\,C_{bc}^n(i)}$, wherein a_{1,3}~a_{2,5} denote the corresponding elements of A_i, and the matrix $B_i = \frac{\partial f}{\partial \theta_b}\big|_{z_1(i),\,C_{bc}^n(i)}$, wherein b_{1,1}~b_{2,3} denote the corresponding elements of B_i; the linear regression model is obtained as follows:
$$f\!\left(z_1(i), C_{bc}^n(i)\right) = [I_2, B_i]\,\theta + V \qquad (7)$$
wherein I_2 is the 2 × 2 identity matrix and the noise V ~ N(0, R),
with covariance matrix

$$R = \mathrm{diag}\left( \left\{ A_i R_1 A_i^T + B_i R_2 B_i^T \right\}_{i=1}^{N} \right)$$
The estimated value of the parameter theta is
$$\hat\theta = \begin{bmatrix} x_G & y_G & \delta\psi(k) \end{bmatrix}^T = \left[ \sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \left( A_i R_1 A_i^T + B_i R_2 B_i^T \right)^{-1} \begin{bmatrix} I_2, B_i \end{bmatrix} \right]^{-1} \sum_{i=1}^{N} \begin{bmatrix} I_2 \\ B_i^T \end{bmatrix} \left( A_i R_1 A_i^T + B_i R_2 B_i^T \right)^{-1} f\!\left(z_1(i), C_{bc}^n(i)\right) \qquad (8)$$
The heading deviation ψ (k) can be solved by equation (8).
5. The method according to claim 1, wherein in step three gross errors in the relative heights are removed using the 3σ criterion before the groups are averaged.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610943473.4A CN107300377B (en) | 2016-11-01 | 2016-11-01 | A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610943473.4A CN107300377B (en) | 2016-11-01 | 2016-11-01 | A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107300377A true CN107300377A (en) | 2017-10-27 |
CN107300377B CN107300377B (en) | 2019-06-14 |
Family
ID=60138055
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610943473.4A Active CN107300377B (en) | 2016-11-01 | 2016-11-01 | A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107300377B (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109708622A (en) * | 2017-12-15 | 2019-05-03 | 福建工程学院 | The method that three-dimensional modeling is carried out to building using unmanned plane based on Pixhawk |
WO2019165612A1 (en) * | 2018-02-28 | 2019-09-06 | 深圳市大疆创新科技有限公司 | Method for positioning a movable platform, and related device and system |
CN110632941A (en) * | 2019-09-25 | 2019-12-31 | 北京理工大学 | Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment |
CN110675453A (en) * | 2019-10-16 | 2020-01-10 | 北京天睿空间科技股份有限公司 | Self-positioning method for moving target in known scene |
CN110799921A (en) * | 2018-07-18 | 2020-02-14 | 深圳市大疆创新科技有限公司 | Shooting method and device and unmanned aerial vehicle |
CN110824295A (en) * | 2019-10-22 | 2020-02-21 | 广东电网有限责任公司 | Infrared thermal image fault positioning method based on three-dimensional graph |
CN112567201A (en) * | 2018-08-21 | 2021-03-26 | 深圳市大疆创新科技有限公司 | Distance measuring method and apparatus |
CN113469139A (en) * | 2021-07-30 | 2021-10-01 | 广州中科智云科技有限公司 | Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip |
CN115272892A (en) * | 2022-07-29 | 2022-11-01 | 同济大学 | Unmanned aerial vehicle positioning deviation monitoring management and control system based on data analysis |
CN117452831A (en) * | 2023-12-26 | 2024-01-26 | 南京信息工程大学 | Four-rotor unmanned aerial vehicle control method, device, system and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10221072A (en) * | 1997-02-03 | 1998-08-21 | Asahi Optical Co Ltd | System and method for photogrammetry |
JP2003083744A (en) * | 2001-09-12 | 2003-03-19 | Starlabo Corp | Imaging apparatus mounted to aircraft, and aircraft imaging data processing apparatus |
CN102519434A (en) * | 2011-12-08 | 2012-06-27 | 北京控制工程研究所 | Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data |
CN105424006A (en) * | 2015-11-02 | 2016-03-23 | 国网山东省电力公司电力科学研究院 | Unmanned aerial vehicle hovering precision measurement method based on binocular vision |
- 2016-11-01 CN CN201610943473.4A patent/CN107300377B/en active Active
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109708622A (en) * | 2017-12-15 | 2019-05-03 | 福建工程学院 | The method that three-dimensional modeling is carried out to building using unmanned plane based on Pixhawk |
WO2019165612A1 (en) * | 2018-02-28 | 2019-09-06 | 深圳市大疆创新科技有限公司 | Method for positioning a movable platform, and related device and system |
CN110799921A (en) * | 2018-07-18 | 2020-02-14 | 深圳市大疆创新科技有限公司 | Shooting method and device and unmanned aerial vehicle |
CN112567201A (en) * | 2018-08-21 | 2021-03-26 | 深圳市大疆创新科技有限公司 | Distance measuring method and apparatus |
CN112567201B (en) * | 2018-08-21 | 2024-04-16 | 深圳市大疆创新科技有限公司 | Distance measuring method and device |
CN110632941A (en) * | 2019-09-25 | 2019-12-31 | 北京理工大学 | Trajectory generation method for target tracking of unmanned aerial vehicle in complex environment |
CN110675453A (en) * | 2019-10-16 | 2020-01-10 | 北京天睿空间科技股份有限公司 | Self-positioning method for moving target in known scene |
CN110675453B (en) * | 2019-10-16 | 2021-04-13 | 北京天睿空间科技股份有限公司 | Self-positioning method for moving target in known scene |
CN110824295B (en) * | 2019-10-22 | 2021-08-31 | 广东电网有限责任公司 | Infrared thermal image fault positioning method based on three-dimensional graph |
CN110824295A (en) * | 2019-10-22 | 2020-02-21 | 广东电网有限责任公司 | Infrared thermal image fault positioning method based on three-dimensional graph |
CN113469139A (en) * | 2021-07-30 | 2021-10-01 | 广州中科智云科技有限公司 | Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip |
CN113469139B (en) * | 2021-07-30 | 2022-04-05 | 广州中科智云科技有限公司 | Data security transmission method and system for unmanned aerial vehicle edge side embedded AI chip |
CN115272892A (en) * | 2022-07-29 | 2022-11-01 | 同济大学 | Unmanned aerial vehicle positioning deviation monitoring management and control system based on data analysis |
CN117452831A (en) * | 2023-12-26 | 2024-01-26 | 南京信息工程大学 | Four-rotor unmanned aerial vehicle control method, device, system and storage medium |
CN117452831B (en) * | 2023-12-26 | 2024-03-19 | 南京信息工程大学 | Four-rotor unmanned aerial vehicle control method, device, system and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN107300377B (en) | 2019-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107300377B (en) | A kind of rotor wing unmanned aerial vehicle objective localization method under track of being diversion | |
CN106153008B (en) | A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model | |
EP3158417B1 (en) | Sensor fusion using inertial and image sensors | |
EP3158412B1 (en) | Sensor fusion using inertial and image sensors | |
EP3158293B1 (en) | Sensor fusion using inertial and image sensors | |
CN107314771B (en) | Unmanned aerial vehicle positioning and attitude angle measuring method based on coding mark points | |
CN109683629B (en) | Unmanned aerial vehicle electric power overhead line system based on combination navigation and computer vision | |
Küng et al. | The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery | |
CN105335733B (en) | Unmanned aerial vehicle autonomous landing visual positioning method and system | |
CN112037260B (en) | Position estimation method and device for tracking target and unmanned aerial vehicle | |
CN105644785B (en) | A kind of UAV Landing method detected based on optical flow method and horizon | |
EP3734394A1 (en) | Sensor fusion using inertial and image sensors | |
CN108955685B (en) | Refueling aircraft taper sleeve pose measuring method based on stereoscopic vision | |
CN106155081B (en) | A kind of a wide range of target monitoring of rotor wing unmanned aerial vehicle and accurate positioning method | |
CN105549614A (en) | Target tracking method of unmanned plane | |
CN105698762A (en) | Rapid target positioning method based on observation points at different time on single airplane flight path | |
CN107490364A (en) | A kind of wide-angle tilt is imaged aerial camera object positioning method | |
CN108665499A (en) | A kind of low coverage aircraft pose measuring method based on parallax method | |
US9816786B2 (en) | Method for automatically generating a three-dimensional reference model as terrain information for an imaging device | |
Yeromina et al. | Method of reference image selection to provide high-speed aircraft navigation under conditions of rapid change of flight trajectory | |
CN108007437B (en) | Method for measuring farmland boundary and internal obstacles based on multi-rotor aircraft | |
CN108445900A (en) | A kind of unmanned plane vision positioning replacement differential technique | |
CN115144879B (en) | Multi-machine multi-target dynamic positioning system and method | |
CN105389819B (en) | A kind of lower visible image method for correcting polar line of half calibration and system of robust | |
CN115272458A (en) | Visual positioning method for fixed wing unmanned aerial vehicle in landing stage |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |