CN115357049B - Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system - Google Patents
Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system
- Publication number
- CN115357049B (application CN202211115584.8A)
- Authority
- CN
- China
- Prior art keywords
- cooperative target
- equation
- relative
- tracker
- unmanned aerial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention belongs to the technical field of tracking control, and provides a visual-based unmanned aerial vehicle non-cooperative target finite time tracking method and system. Firstly, the equations of motion and the relative dynamics equation of the tracker and the non-cooperative target are established according to image information; then, a proportional relative dynamics equation of the tracker and the non-cooperative target is obtained from the motion equations and the relative dynamics equation; finally, a finite time controller for controlling the unmanned aerial vehicle is established from the proportional relative dynamics equation by means of an integral sliding mode algorithm. The relative position between the target and the follower does not need to be measured directly; compared with traditional tracking control algorithms, the method adopts a low-cost, low-power monocular sensor as the measuring tool, and can be applied indoors or in suburban areas without communication or with poor signal.
Description
Technical Field
The invention belongs to the technical field of tracking control, and particularly relates to a visual-based unmanned aerial vehicle non-cooperative target finite time tracking method and system.
Background
Unmanned aerial vehicles today have very high maneuverability, low purchase and maintenance costs and excellent vertical take-off and landing capabilities, and are considered the most popular autonomous aerial platforms even in harsh environments. In addition, rapid advances in navigation, perception sensors, and high-performance battery technology have significantly increased the endurance and payload capabilities of unmanned aerial vehicles, making them ideal platforms for performing various tasks, such as search and rescue, area coverage, surveillance, object transportation, and smart agriculture. In the process of executing tasks by the unmanned aerial vehicle, target tracking is the foundation of the work: it helps a user set a target object as the focus of the unmanned aerial vehicle, so that the target object can be observed continuously and tracking control can be performed. Furthermore, targets can be divided into cooperative and non-cooperative targets, where a non-cooperative target has no communication with the tracker, avoids being tracked by the unmanned aerial vehicle, and can produce strong maneuvering changes. However, most tracking control methods of existing unmanned aerial vehicle target tracking systems only consider the target state at the current moment and lack an active mechanism to cope with the avoidance behavior of the target.
Many navigation sensors, such as GPS and INS, are used during autonomous multi-rotor operation, and such methods are primarily directed at cooperative targets, i.e., target information is obtained by communicating with the target. For non-cooperative targets, visual sensors are employed instead to obtain target information. Owing to their light weight, small volume, passivity and low power consumption, cameras are of great importance in unmanned aerial vehicle motion control, enabling accurate monitoring and tracking of regions and targets of interest. In the case of unmanned flight systems, such as quad-rotor aircraft, this can be easily achieved by mounting the camera sensor directly on the robot, thus forming a so-called "eye-in-hand" system.
The inventors found that, with the continued advance of computer technology and vision research, image processing has gradually matured and various control methods have been developed, and that visual servo control has shown great value in the development and application of robot systems and related fields. Regarding visual servoing, four main categories may be defined: position-based visual servoing (PBVS), in which the selected control error is defined in Cartesian space; image-based visual servoing (IBVS), in which the control error function is defined in image space; 2-1/2-D or hybrid visual servoing, in which the control error function is defined partly in Cartesian space and partly in image space; and direct visual servoing, which does not require specific features to be extracted but utilizes the complete image in the control design. These methods each have advantages and disadvantages, and their efficacy depends largely on the application requirements. However, in all of the above, the visual sensor cannot directly measure the relative position between the target and the follower; other methods are needed for estimation, which require a radar, an estimator or the like to be deployed.
Disclosure of Invention
The invention provides a vision-based unmanned aerial vehicle non-cooperative target finite time tracking method and system based on a monocular sensor. In the absence of communication, the monocular sensor is fixed on a gimbal of the unmanned aerial vehicle; the unmanned aerial vehicle obtains image information of the target through the monocular sensor, and a position model of the unmanned aerial vehicle and the target is established according to the change of the image. On this basis, a finite time controller is designed using an integral sliding mode algorithm, so that the unmanned aerial vehicle reaches or intercepts the target in finite time.
In order to achieve the above object, the present invention is realized by the following technical scheme:
in a first aspect, the present invention provides a vision-based unmanned aerial vehicle non-cooperative target finite time tracking method, including:
acquiring image information of a non-cooperative target;
establishing the equations of motion of the tracker and the non-cooperative target according to the image information;
establishing the relative dynamics equation of the tracker and the non-cooperative target according to the image information;
obtaining a proportional relative dynamic equation of the tracker and the non-cooperative target according to the motion equation and the relative dynamic equation;
according to the proportional relative dynamics equation, an integral sliding mode algorithm is adopted to establish a finite time controller;
and controlling the tracker by using the established limited time controller.
Further, the geometric center of the non-cooperative target is defined as a characteristic point; according to the feature points, establishing a motion equation of a non-cooperative target and a motion equation of a tracker under an earth coordinate system; the difference between the motion equation of the non-cooperative target and the motion equation of the tracker is the relative dynamics equation of the tracker and the non-cooperative target.
Further, when the image information of the non-cooperative target is acquired, the azimuth angle and the elevation angle of the non-cooperative target relative to the follower are obtained, according to the triangulation theorem, from the size of the non-cooperative target on the image plane, the coordinates of the center of gravity of the non-cooperative target on the image plane, and the focal length of the unmanned aerial vehicle on-board sensor.
Further, according to the azimuth angle and elevation angle of the non-cooperative target relative to the follower, a relative position expression of the non-cooperative target relative to the follower in the body coordinate system is obtained; the relative position expression in the body coordinate system is converted to obtain the relative position expression D of the non-cooperative target relative to the follower in the earth coordinate system;
according to the principle of pinhole imaging, the following is obtained:
f is the focal length of the unmanned aerial vehicle on-board sensor; d is the actual size of the non-cooperative target; d_1 is the size of the non-cooperative target in the image plane; (y_1, z_1) are the coordinates of the center of gravity of the non-cooperative target in the image plane.
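As an illustration of how these quantities relate, a sketch of the standard pinhole relations is given below; this is an assumption about the convention used, with x, y, z denoting the (unmeasured) relative coordinates of the target along and across the optical axis:

```latex
% Standard pinhole-camera relations (illustrative assumption); x denotes
% the unknown depth of the target along the optical axis.
\frac{d_1}{d} = \frac{f}{x}, \qquad y = \frac{y_1}{f}\,x, \qquad z = \frac{z_1}{f}\,x
```

Only d_1, y_1, z_1 and f are directly available from the image; the depth x and the true size d are not.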
Further, the proportional relative position equation is obtained as a ratio of the difference of the acceleration of the non-cooperative target minus the acceleration of the tracker to the actual size of the non-cooperative target.
Further, the finite time controller is:
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity along the corresponding i-axis; a ∈ (0, 1); b ∈ (1, +∞); r_i, δ_i, k_{1i}, k_{2i} and α_i are positive numbers; c ∈ (0, 1).
Further, a monocular sensor is used to acquire image information of a non-cooperative target.
In a second aspect, the present invention also provides a vision-based unmanned aerial vehicle non-cooperative target finite time tracking system, including:
a data acquisition module configured to: acquiring image information of a non-cooperative target;
the equation of motion establishment module is configured to: establishing the equations of motion of the tracker and the non-cooperative target according to the image information;
a relative dynamics equation establishment module configured to: establishing the relative dynamics equation of the tracker and the non-cooperative target according to the image information;
a proportional relative kinetic equation establishment module configured to: obtaining a proportional relative dynamic equation of the tracker and the non-cooperative target according to the motion equation and the relative dynamic equation;
a controller establishment module configured to: according to a proportional relative dynamics equation, an integral sliding mode algorithm is adopted to establish a finite time controller;
a control module configured to: and controlling the tracker by using the established limited time controller.
In a third aspect, the present invention also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor implements the steps of the vision-based unmanned aerial vehicle non-cooperative target finite time tracking method of the first aspect.
In a fourth aspect, the present invention also provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the vision-based unmanned aerial vehicle non-cooperative target finite time tracking method of the first aspect when the program is executed.
Compared with the prior art, the invention has the beneficial effects that:
1. according to the image information, firstly, the equations of motion and the relative dynamics equation of the tracker and the non-cooperative target are established; then, according to the motion equations and the relative dynamics equation, a proportional relative dynamics equation of the tracker and the non-cooperative target is obtained; finally, according to the proportional relative dynamics equation, an integral sliding mode algorithm is adopted to establish a finite time controller for controlling the unmanned aerial vehicle; by this method, the finite time controller can be built using only the image information from the visual sensor, so that radars, estimators and the like are not required and cost is saved;
2. compared with traditional tracking control algorithms, the invention adopts a low-cost, low-power monocular sensor as the measuring tool, and can be applied indoors or in suburban areas without communication or with poor signal;
3. the target of the invention can be non-cooperative, namely its maneuvering is random but its acceleration is bounded; in the absence of communication, the monocular sensor is adopted to obtain the target image information, and a proportional relative position model is established according to the image change, which overcomes the defect that a monocular sensor cannot directly measure the relative position and realizes tracking of the non-cooperative target;
4. the invention considers the transient process of unmanned aerial vehicle non-cooperative target tracking, designs an integral sliding mode surface, and uses the sliding mode control method to design the system input, which ensures the continuity of the system controller and alleviates chattering;
5. the invention adopts a method based on finite time stabilization; unlike the traditional asymptotic stability theory, in which the settling time tends to infinity, the finite time stabilization approach controls the actual system more effectively, because in real life the available time cannot tend to infinity.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments and are incorporated in and constitute a part of this specification, illustrate and explain the embodiments and together with the description serve to explain the embodiments.
FIG. 1 is a flow chart of embodiment 1 of the present invention;
FIG. 2 is a model of the relative position between a follower and a target in the earth coordinate system according to embodiment 1 of the present invention;
FIG. 3 is an imaging model of a non-cooperative target under a monocular sensor according to embodiment 1 of the present invention;
fig. 4 is a motion trace of the unmanned plane and the non-cooperative target in a two-dimensional plane according to embodiment 1 of the present invention;
fig. 5 is a motion trace of the unmanned plane and the non-cooperative target in three-dimensional space according to embodiment 1 of the present invention;
fig. 6 is a diagram showing the relative positions of the unmanned aerial vehicle and the non-cooperative targets according to embodiment 1 of the present invention;
fig. 7 is a graph showing the relative speeds of the unmanned aerial vehicle and the non-cooperative targets according to embodiment 1 of the present invention.
Detailed Description
The invention will be further described with reference to the drawings and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
Example 1:
as shown in fig. 1, the present embodiment provides a visual-based unmanned aerial vehicle non-cooperative target finite time tracking method. Firstly, image information of a non-cooperative target is acquired; the equations of motion of the tracker and the non-cooperative target are established according to the image information; the relative dynamics equation of the tracker and the non-cooperative target is established according to the image information; then, a proportional relative dynamics equation of the tracker and the non-cooperative target is obtained from the motion equations and the relative dynamics equation; finally, a finite time controller is established from the proportional relative dynamics equation by adopting an integral sliding mode algorithm; the tracker is controlled by the established finite time controller. It can be understood that the tracker in this embodiment is the unmanned aerial vehicle to be controlled. The method comprises the following specific steps:
s1, selecting a non-cooperative target;
s2, after the target is determined, writing a motion equation of the follower and the target according to the information of the follower and the target which are obtained preliminarily, and establishing a relative position model;
s3, acquiring the size of the non-cooperative target in an image plane and the barycentric coordinates of the non-cooperative target through a monocular sensor of the unmanned aerial vehicle, and continuously acquiring the image information of the non-cooperative target in real time;
s4, according to the image information change of the non-cooperative target, a tracker and a relative position model of the proportion of the non-cooperative target are obtained;
s5, designing a limited time controller by adopting an integral sliding mode algorithm according to the dynamics model established in the step S4, and enabling the unmanned aerial vehicle to reach or intercept the non-cooperative target in limited time.
The step S1 specifically includes: selecting a target, receiving and displaying an image returned by the unmanned aerial vehicle on line by a ground station computer, and running a target detection algorithm or manually selecting a tracking target to obtain an initial target bounding box area;
the step S2 specifically includes:
s2.1, using a visual algorithm to determine the geometric center of a non-cooperative target as a characteristic point;
s2.2, according to the flying height of the unmanned aerial vehicle, the position and the size of the camera internal reference and the non-cooperative target in the geometric center of the image plane, and the bounded acceleration, the motion equation of the target and the motion equation of the follower under the earth coordinate system O= (N, E, D) are expressed as:
the position of the target is D_G(t) and its acceleration is a_G(t); the equation of motion is as follows:
wherein V_{G0} is the initial velocity vector of the target and D_{G0} is the initial position vector of the target.
The position of the tracker is D_F(t) and its acceleration is a_F(t); the equation of motion is as follows:
wherein V_{F0} is the initial velocity vector of the tracker and D_{F0} is the initial position vector of the tracker.
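The two equations of motion referenced above are not reproduced here; a double-integrator form consistent with the quantities just defined (an assumed notation, not the verbatim patent equations) would read:

```latex
% Assumed double-integrator equations of motion in the earth frame
% (notation assumed; the original equations are not reproduced here).
\ddot{D}_G(t) = a_G(t), \qquad \dot{D}_G(0) = V_{G0}, \qquad D_G(0) = D_{G0}
\ddot{D}_F(t) = a_F(t), \qquad \dot{D}_F(0) = V_{F0}, \qquad D_F(0) = D_{F0}
```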
Thereby establishing a relative position model between the follower and the target as follows:
D(t) = D_G(t) - D_F(t)
the control objective here is that, as the time tends to a finite settling time, the relative position tends to the origin, i.e. the follower reaches (intercepts) the target in finite time.
The step S3 specifically includes: acquiring, through the monocular sensor of the unmanned aerial vehicle, the size d_1 of the non-cooperative target on the image plane and the coordinates (y_1, z_1) of its center of gravity in the image plane. The focal length f of the unmanned aerial vehicle on-board sensor is known, and the actual size lies in the known range d_min ≤ d ≤ d_max. From these, the azimuth χ and the elevation angle of the target with respect to the follower are obtained according to the triangulation theorem; the specific expression is:
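The specific expression is not reproduced here; under the usual pinhole geometry (an assumed convention, not the verbatim patent formula), the two angles take a form such as:

```latex
% Illustrative triangulation relations for azimuth and elevation
% (assumed convention for the camera axes).
\chi = \arctan\!\left(\frac{y_1}{f}\right), \qquad
\vartheta = \arctan\!\left(\frac{z_1}{\sqrt{f^{2} + y_1^{2}}}\right)
```

where ϑ denotes the elevation angle of the target with respect to the follower.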
the step S4 specifically includes:
and S4.1, target information is provided by the monocular sensor. The monocular sensor is mounted on a gimbal on the unmanned aerial vehicle, and its optical axis is parallel to the X_B axis of the unmanned aerial vehicle body coordinate system O_B = (X_B, Y_B, Z_B).
Based on the information provided by the monocular sensor, namely the azimuth χ and the elevation angle of the target relative to the follower, the relative position expression in the body coordinate system can be obtained:
according to the transformation matrix between the unmanned aerial vehicle body coordinate system and the earth coordinate system, the relative position expression in the earth coordinate system is then obtained. The transformation matrix is as follows:
wherein φ, θ and ψ are the Euler angles of the system.
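The transformation matrix itself is not reproduced here; for reference, the standard ZYX (yaw-pitch-roll) rotation matrix from the body frame to the earth frame, which is an assumption about the convention used, is:

```latex
% Standard ZYX Euler rotation matrix from body to earth (NED) frame,
% with c(.) = cos(.), s(.) = sin(.); phi = roll, theta = pitch, psi = yaw.
R = \begin{bmatrix}
c\theta c\psi & s\phi s\theta c\psi - c\phi s\psi & c\phi s\theta c\psi + s\phi s\psi \\
c\theta s\psi & s\phi s\theta s\psi + c\phi c\psi & c\phi s\theta s\psi - s\phi c\psi \\
-s\theta & s\phi c\theta & c\phi c\theta
\end{bmatrix}
```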
S4.2, from the image information of the tracked target obtained continuously in real time, the size d_1 of the non-cooperative target in the image plane is obtained. The focal length f of the unmanned aerial vehicle on-board sensor is known, and the actual size lies in the known range d_min ≤ d ≤ d_max. A proportional relative position model of the non-cooperative target based on the monocular sensor is then established.
According to the principle of pinhole imaging, the following ratio information can be obtained:
since visual measurement cannot obtain the actual size of the target, the proportional relative position model can be obtained as follows:
thus, the problem can be reduced to designing a controller for a second order system with unknown parameters and disturbances such that the relative position converges to a small neighborhood of the origin in a finite time.
The step S5 specifically includes: according to the dynamics model established in step S4, a finite time controller is designed by adopting an integral sliding mode algorithm, so that the unmanned aerial vehicle reaches (intercepts) the non-cooperative target in finite time.
S5.1, according to the kinematics and relative dynamics equation of the follower and the target, under the measurement of a monocular sensor, obtaining the following proportional relative dynamics:
wherein r = [r_1 r_2 r_3]^T is the relative position; v = [v_1 v_2 v_3]^T is the relative velocity; d is the actual size of the target, a bounded unknown constant satisfying 0 < d_min < d < d_max; a_G = [a_G1 a_G2 a_G3]^T is the target acceleration, treated as a bounded disturbance; a_Gmax = [a_1max a_2max a_3max]^T is the maximum acceleration of the target.
S5.2, the errors r and v reflect the control effect; when both r and v approach zero, the control objective is considered to be achieved. The integral sliding mode surface s_i, i = 1, 2, 3, is designed as follows:
wherein k_{1i} and k_{2i} are the integral gains, and a ∈ (0, 1) and b ∈ (0, 1) are the exponents of the functions.
sig(∗)^a = |∗|^a sign(∗), where |·| and sign(·) denote the absolute value and the sign function, respectively.
S5.3, according to the defined integral sliding mode surface, a corresponding Lyapunov function is selected and the corresponding finite time integral sliding mode controller u is designed as follows:
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity along the corresponding i-axis; a ∈ (0, 1); b ∈ (1, +∞); r_i, δ_i, k_{1i}, k_{2i} and α_i are positive numbers; c ∈ (0, 1). The integral sliding mode surface is designed so that the controller obtained is a continuous function, which eliminates chattering and singularities.
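The controller expression itself is not reproduced here. The following Python sketch shows a generic finite time integral sliding mode law with the structural ingredients described in S5.2 and S5.3 (sig-function terms, an integral sliding surface, and a continuous reaching term); the particular gain arrangement, the scaling by the size bound d_min and the function names are illustrative assumptions, not the patented control law.

```python
import numpy as np

def sig(x, p):
    """sig(x)^p = |x|^p * sign(x), applied element-wise."""
    return np.abs(x) ** p * np.sign(x)

def ism_controller(r, v, s_int, dt, k1=2.0, k2=2.0, a=0.7, b=1.4,
                   alpha=10.0, delta=1e-3, c=0.5, d_min=0.6):
    """Illustrative finite-time integral sliding-mode law (assumed form).

    r, v   : scaled relative position and velocity (3-vectors)
    s_int  : running integral term of the sliding surface
    dt     : control/integration step
    Returns the follower acceleration command a_F and the updated integral.
    """
    # Integral sliding surface: s = v + int(k1*sig(r)^a + k2*sig(v)^b) dt
    s_int = s_int + (k1 * sig(r, a) + k2 * sig(v, b)) * dt
    s = v + s_int
    # Continuous reaching law (no discontinuous sign term, hence no chattering):
    # a power term sig(s)^c plus a linear term alpha*s; the conservative
    # scaling by d_min is a placeholder for how the unknown size d is handled.
    a_F = d_min * (k1 * sig(r, a) + k2 * sig(v, b)
                   + alpha * s + delta * sig(s, c))
    return a_F, s_int
```

In use, r and v would be taken from the scaled relative position model above, and a_F applied as the follower acceleration command at each control step.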
S5.4, under the controller designed in S5.3, the unmanned aerial vehicle based on the monocular sensor can reach (intercept) a non-cooperative target in a limited time.
wherein k_i = min{k_{1i}, k_{2i}}, k = min{k_i}, i = 1, 2, 3, 0 < β < 1; V is the chosen Lyapunov function, and V_0 = V(s_0) denotes its initial value.
To demonstrate the effectiveness of this embodiment, in other embodiments, the following simulation verification was performed:
in the simulation experiment, the control target is a non-cooperative target, and a monocular-sensor-based unmanned aerial vehicle finite time controller is designed so that the unmanned aerial vehicle reaches (intercepts) the non-cooperative target in finite time. A non-cooperative target is considered here; the initial position of the target is set to D_{G0} = [50, 30, 4]^T m, the initial velocity to V_{G0} = [4, 2, 1]^T m/s, and the acceleration to a_G = a_0 + Δa m/s^2, where Δa = 0.05[sin(2πt/60), sin(2πt/60)]^T m/s^2, and the expression of a_0 is:
the follower is the unmanned aerial vehicle, and initial position and initial speed of the unmanned aerial vehicle are set to be 0. The controller parameters are respectively as follows:k 1i =k 2i =2,δ i =0.001,d min =0.6,d max =1,α i =10。
and the mathematical model established in the control method of this embodiment is simulated with MATLAB software, yielding figs. 4 to 7. Fig. 4 shows the tracking trajectory of the drone and the non-cooperative target in the XOY plane of the earth coordinate system. Fig. 5 shows the tracking trajectory of the drone and the non-cooperative target in three-dimensional space in the earth coordinate system. It can be seen that the drone reaches (intercepts) the non-cooperative target in finite time. Figs. 6 and 7 show the trajectories of the relative position and relative velocity of the unmanned aerial vehicle and the non-cooperative target. It can be seen that the vision-based, monocular-sensor-based unmanned aerial vehicle non-cooperative target finite time tracking method achieves a good tracking effect on a moving target, and realizes finite time tracking control of the unmanned aerial vehicle over the non-cooperative target in an environment without GPS signals.
Stability analysis:
designing Lyapunov function as
The integral sliding mode surface is as given above, the controller is designed as above, and appropriate design parameters are selected. It can then be obtained, with k_i = min(k_{1i}, k_{2i}) and k = min(k_1, k_2, k_3), that the derivative of V satisfies a power-type inequality; according to the Lyapunov finite time stability theorem, the unmanned aerial vehicle can therefore reach (intercept) the non-cooperative target in finite time.
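The inequality invoked above is the standard power-type Lyapunov condition for finite time stability; stated here as an assumption about the omitted step, it reads:

```latex
% Standard finite-time stability lemma (assumed form of the omitted step).
\dot{V}(s) \le -k\,V^{\beta}(s), \quad k > 0,\; 0 < \beta < 1
\;\;\Longrightarrow\;\;
V(s(t)) = 0 \ \text{for all}\ t \ge T, \qquad
T \le \frac{V_0^{\,1-\beta}}{k\,(1-\beta)}, \quad V_0 = V(s_0)
```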
In this embodiment, in the absence of communication and with the relative position of the non-cooperative target unknown, the tracking problem is converted into a second-order system problem with unknown disturbance by utilizing the change of the image. Secondly, the limitation of the traditional back-stepping method is avoided: an integral sliding mode control algorithm is selected, which reduces computation and chattering. Finally, a continuous finite time controller is designed so that the unmanned aerial vehicle reaches (intercepts) the non-cooperative target in finite time, taking the transient process of tracking into account.
Example 2:
the embodiment provides a visual-based unmanned aerial vehicle non-cooperative target finite time tracking system, which comprises the following components:
a data acquisition module configured to: acquiring image information of a non-cooperative target;
the equation of motion establishment module is configured to: establishing the equations of motion of the tracker and the non-cooperative target according to the image information;
a relative dynamics equation establishment module configured to: establishing the relative dynamics equation of the tracker and the non-cooperative target according to the image information;
a proportional relative kinetic equation establishment module configured to: obtaining a proportional relative dynamic equation of the tracker and the non-cooperative target according to the motion equation and the relative dynamic equation;
a controller establishment module configured to: according to a proportional relative dynamics equation, an integral sliding mode algorithm is adopted to establish a finite time controller;
a control module configured to: and controlling the tracker by using the established limited time controller.
The working method of the system is the same as the visual unmanned aerial vehicle non-cooperative target finite time tracking method of embodiment 1, and is not described here again.
Example 3:
the present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the vision-based unmanned aerial vehicle non-cooperative target finite time tracking method described in embodiment 1.
Example 4:
the present embodiment provides an electronic device including a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the visual-based unmanned aerial vehicle non-cooperative target finite time tracking method of embodiment 1 when the program is executed.
The above description is only a preferred embodiment of the present embodiment, and is not intended to limit the present embodiment, and various modifications and variations can be made to the present embodiment by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present embodiment should be included in the protection scope of the present embodiment.
Claims (7)
1. A vision-based unmanned aerial vehicle non-cooperative target finite time tracking method, comprising:
acquiring image information of a non-cooperative target;
acquiring a motion equation of a tracker and a non-cooperative target according to the image information;
establishing a relative dynamics equation of the tracker and the non-cooperative target according to the image information;
obtaining a proportional relative dynamic equation of the tracker and the non-cooperative target according to the motion equation and the relative dynamic equation;
according to a proportional relative dynamics equation, an integral sliding mode algorithm is adopted to establish a finite time controller;
controlling the tracker by using the established limited time controller;
the geometric center of the non-cooperative target is defined as a characteristic point; according to the feature points, establishing a motion equation of a non-cooperative target and a motion equation of a tracker under an earth coordinate system; the difference between the motion equation of the non-cooperative target and the motion equation of the tracker is the relative motion equation of the tracker and the non-cooperative target;
obtaining a proportional relative position equation as a ratio of the difference of the acceleration of the non-cooperative target minus the acceleration of the tracker to the actual size of the non-cooperative target;
the finite time controller is:
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity along the corresponding i-axis; a ∈ (0, 1); b ∈ (1, +∞); r_i, δ_i, k_{1i}, k_{2i} and α_i are positive numbers; c ∈ (0, 1).
2. The vision-based unmanned aerial vehicle non-cooperative target finite time tracking method according to claim 1, wherein the azimuth and elevation angle of the non-cooperative target relative to the follower are obtained according to a triangulation theorem by the size of the non-cooperative target on an image plane, the coordinates of the center of gravity of the non-cooperative target on the image plane, and the focal length of an unmanned aerial vehicle-mounted sensor when the image information of the non-cooperative target is acquired.
3. The vision-based unmanned aerial vehicle non-cooperative target finite time tracking method according to claim 2, wherein a relative position expression of the non-cooperative target relative to the follower in the body coordinate system is obtained according to the azimuth angle and the elevation angle of the non-cooperative target relative to the follower; the relative position expression of the non-cooperative target relative to the follower in the body coordinate system is converted to obtain the relative position expression D of the non-cooperative target relative to the follower in the earth coordinate system;
According to the principle of pinhole imaging, the following is obtained:
wherein f is the focal length of the unmanned aerial vehicle on-board sensor; d is the actual size of the non-cooperative target; d_1 is the size of the non-cooperative target in the image plane; (y_1, z_1) are the coordinates of the center of gravity of the non-cooperative target in the image plane.
4. The vision-based unmanned aerial vehicle non-cooperative target finite time tracking method according to claim 1, wherein a monocular sensor is employed to acquire the image information of the non-cooperative target.
5. A vision-based unmanned aerial vehicle non-cooperative target finite time tracking system, comprising:
a data acquisition module configured to: acquiring image information of a non-cooperative target;
the equation of motion establishment module is configured to: acquiring a motion equation of a tracker and a non-cooperative target according to the image information;
a relative dynamics equation establishment module configured to: establishing a relative dynamics equation of the tracker and the non-cooperative target according to the image information;
a proportional relative kinetic equation establishment module configured to: obtaining a proportional relative dynamic equation of the tracker and the non-cooperative target according to the motion equation and the relative dynamic equation;
a controller establishment module configured to: according to a proportional relative dynamics equation, an integral sliding mode algorithm is adopted to establish a finite time controller;
a control module configured to: controlling the tracker by using the established limited time controller;
the geometric center of the non-cooperative target is defined as a characteristic point; according to the feature points, establishing a motion equation of a non-cooperative target and a motion equation of a tracker under an earth coordinate system; the difference between the motion equation of the non-cooperative target and the motion equation of the tracker is the relative motion equation of the tracker and the non-cooperative target;
obtaining a proportional relative position equation as a ratio of the difference of the acceleration of the non-cooperative target minus the acceleration of the tracker to the actual size of the non-cooperative target;
the finite time controller is:
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity along the corresponding i-axis; a ∈ (0, 1); b ∈ (1, +∞); r_i, δ_i, k_{1i}, k_{2i} and α_i are positive numbers; c ∈ (0, 1).
6. A computer readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the steps of the vision-based unmanned aerial vehicle non-cooperative target finite time tracking method of any of claims 1 to 4.
7. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the vision-based unmanned aerial vehicle non-cooperative target finite time tracking method of any of claims 1 to 4 when the program is executed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211115584.8A CN115357049B (en) | 2022-09-14 | 2022-09-14 | Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211115584.8A CN115357049B (en) | 2022-09-14 | 2022-09-14 | Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115357049A CN115357049A (en) | 2022-11-18 |
CN115357049B true CN115357049B (en) | 2024-04-16 |
Family
ID=84005858
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211115584.8A Active CN115357049B (en) | 2022-09-14 | 2022-09-14 | Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115357049B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107529376B (en) * | 2013-08-01 | 2015-12-30 | 上海新跃仪表厂 | The method of the microsatellite non-cooperative target Relative Navigation of multimodality fusion |
CN109976363A (en) * | 2019-03-20 | 2019-07-05 | 中国科学院深圳先进技术研究院 | Unmanned aerial vehicle (UAV) control method, apparatus, computer equipment and storage medium |
CN111984020A (en) * | 2020-07-21 | 2020-11-24 | 广东工业大学 | SDRE-based adaptive optimal sliding mode control method for transitional flight mode of tilting quad-rotor unmanned aerial vehicle |
CN112947560A (en) * | 2021-02-07 | 2021-06-11 | 广东工业大学 | Sliding mode tracking control method and system for multiple high-rise fire-fighting unmanned aerial vehicles under unknown disturbance |
KR102294829B1 (en) * | 2020-09-23 | 2021-08-27 | 세종대학교산학협력단 | System and flight control method for unmanned aerial vehicle with variable load |
CN113900440A (en) * | 2021-07-21 | 2022-01-07 | 中国电子科技集团公司电子科学研究院 | Unmanned aerial vehicle control law design method and device and readable storage medium |
CN114967729A (en) * | 2022-03-28 | 2022-08-30 | 广东工业大学 | Multi-rotor unmanned aerial vehicle height control method and system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102164372B1 (en) * | 2020-04-03 | 2020-10-12 | 주식회사 파블로항공 | Nonlinear Disturbance Observer Based Path Fol lowing for a Small Fixed Wing UAV |
- 2022-09-14: CN application CN202211115584.8A filed; patent CN115357049B (en), status: Active
Also Published As
Publication number | Publication date |
---|---|
CN115357049A (en) | 2022-11-18 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 
 | GR01 | Patent grant | 