
CN115357049A - Vision-based finite time tracking method and system for non-cooperative target of unmanned aerial vehicle - Google Patents


Info

Publication number
CN115357049A
CN115357049A (application CN202211115584.8A; granted publication CN115357049B)
Authority
CN
China
Prior art keywords
cooperative target
equation
relative
target
tracker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211115584.8A
Other languages
Chinese (zh)
Other versions
CN115357049B (en)
Inventor
李鸿一
佘雪华
周琪
姚得银
鲁仁全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202211115584.8A priority Critical patent/CN115357049B/en
Publication of CN115357049A publication Critical patent/CN115357049A/en
Application granted granted Critical
Publication of CN115357049B publication Critical patent/CN115357049B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention belongs to the technical field of tracking control, and provides a vision-based finite time tracking method and system for a non-cooperative target of an unmanned aerial vehicle. First, a motion equation and a relative kinetic equation of the tracker and the non-cooperative target are established according to image information; then a proportional relative kinetic equation of the tracker and the non-cooperative target is obtained from the motion equation and the relative kinetic equation; finally, according to the proportional relative kinetic equation, an integral sliding mode algorithm is adopted to establish a finite time controller for controlling the unmanned aerial vehicle. The relative position between the target and the follower does not need to be directly measured. Compared with traditional tracking control algorithms, the method adopts a low-cost, low-power monocular sensor as the measuring tool, and can be applied indoors or in suburban areas with no communication or poor signals.

Description

Vision-based finite time tracking method and system for non-cooperative target of unmanned aerial vehicle
Technical Field
The invention belongs to the technical field of tracking control, and particularly relates to a vision-based finite-time tracking method and system for a non-cooperative target of an unmanned aerial vehicle.
Background
Nowadays, drones have high mobility, low acquisition and maintenance costs, and excellent vertical take-off and landing capabilities, even in harsh environments, and are considered the most popular autonomous airborne platform. In addition, rapid advances in navigation, perception sensors, and high-performance battery technology have significantly increased the range and load-bearing capability of drones, making them ideal platforms for performing various tasks, such as search and rescue, area coverage, surveillance, object transportation, and smart farming. In the task execution process of an unmanned aerial vehicle, target tracking is a basic function: it helps a user set a target object as the focus of the unmanned aerial vehicle so that the target object can be observed while being followed, which requires target tracking control. Furthermore, targets can be divided into cooperative and non-cooperative targets; a non-cooperative target does not communicate, avoids the tracking unmanned aerial vehicle, and can produce strong maneuvering changes. However, most tracking control methods in existing unmanned aerial vehicle target tracking systems only consider the target state at the current moment, and lack an active mechanism for dealing with the evasive behavior of the target.
Many navigation sensors, such as GPS and INS, are used during autonomous multi-rotor operation, and such methods are primarily directed at cooperative targets, i.e., target information can be obtained by communicating with the target. For non-cooperative targets, visual sensors have also been employed to obtain target information. Owing to its light weight, small volume, passivity, and low power consumption, the camera is very important in the motion control of unmanned aerial vehicles for accurately monitoring and tracking regions and targets of interest. For unmanned flight systems such as quad-rotor aircraft, this can be easily achieved by mounting the camera sensor directly on the robot, forming a so-called "eye-in-hand" system.
The inventor finds that with continued research on computer technology and vision, the gradual maturation of image processing, and the development of various control methods, visual servo control has shown great value in the development and application of robot systems. Visual servo control can be divided into four main categories: position-based visual servoing (PBVS), in which the selected control error is defined in Cartesian space; image-based visual servoing (IBVS), in which the control error function is defined in image space; 2-1/2D or hybrid visual servoing, in which the control error function is defined partly in Cartesian space and partly in image space; and direct visual servoing, which uses the complete image in the control design without extracting specific features. Each method has advantages and disadvantages, and its efficacy depends largely on the application requirements. Notably, a vision sensor cannot directly measure the relative position between the target and the follower; other estimation methods are needed, which in turn require a radar or an estimator.
Disclosure of Invention
The invention provides a vision-based finite time tracking method and system for a non-cooperative target of an unmanned aerial vehicle, based on a monocular sensor. Under the condition of lacking communication, the non-cooperative target is tracked: the monocular sensor is fixed on a gimbal of the unmanned aerial vehicle, the unmanned aerial vehicle obtains image information of the target through the monocular sensor, and a position model of the unmanned aerial vehicle and the target is established according to changes of the image; on this basis, a finite time controller is designed using an integral sliding mode algorithm, so that the unmanned aerial vehicle can reach or intercept the target in finite time.
In order to realize the purpose, the invention is realized by the following technical scheme:
in a first aspect, the invention provides a vision-based finite time tracking method for a non-cooperative target of an unmanned aerial vehicle, which comprises the following steps:
acquiring image information of a non-cooperative target;
according to the image information, establishing a motion equation of the tracker and the non-cooperative target;
according to the image information, establishing a relative kinetic equation of the tracker and the non-cooperative target;
obtaining a proportional relative kinetic equation of the tracker and the non-cooperative target according to the motion equation and the relative kinetic equation;
according to the proportional relative kinetic equation, establishing a finite time controller by adopting an integral sliding mode algorithm;
and controlling the tracker by utilizing the established finite time controller.
Further, the geometric center of the non-cooperative target is set as a characteristic point; according to the characteristic points, establishing a motion equation of a non-cooperative target and a motion equation of a tracker in a terrestrial coordinate system; the difference between the equation of motion of the non-cooperative target and the equation of motion of the tracker is the equation of relative dynamics of the tracker and the non-cooperative target.
Furthermore, the azimuth angle and the elevation angle of the non-cooperative target relative to the follower are obtained according to the triangulation theorem through the size of the non-cooperative target on an image plane, the coordinate of the gravity center of the non-cooperative target on the image plane and the focal length of an unmanned aerial vehicle airborne sensor when the image information of the non-cooperative target is obtained.
Further, according to the azimuth angle and the elevation angle of the non-cooperative target relative to the follower, a relative position expression of the non-cooperative target relative to the follower is obtained in a machine body coordinate system; converting the relative position expression of the non-cooperative target relative to the follower in the body coordinate system to obtain a relative position expression D of the non-cooperative target relative to the follower in the terrestrial coordinate system;
according to the pinhole imaging principle, the following results are obtained:
d_1 / f = d / X,  y_1 / f = Y / X,  z_1 / f = Z / X
wherein f is the focal length of the airborne sensor of the unmanned aerial vehicle; d is the actual size of the non-cooperative target; d_1 is the size of the non-cooperative target in the image plane; (y_1, z_1) are the coordinates of the center of gravity of the non-cooperative target in the image plane; and (X, Y, Z) is the relative position of the target, with X along the optical axis.
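The pinhole relations above can be sketched numerically. The function below is a minimal illustration, assuming the target sits at depth X along the optical axis; the function name and numbers are illustrative, not taken from the patent:

```python
def pinhole_projection(f, d, rel_pos):
    """Project a target of actual size d at relative position rel_pos = (X, Y, Z)
    (X along the optical axis) onto the image plane of a pinhole camera with
    focal length f. Returns the apparent size d1 and the image-plane
    centre-of-gravity coordinates (y1, z1)."""
    x, y, z = rel_pos
    d1 = f * d / x   # apparent size shrinks linearly with depth
    y1 = f * y / x   # image-plane coordinates scale the same way
    z1 = f * z / x
    return d1, (y1, z1)

# Example: a 1 m target, 10 m ahead, focal length 0.05 m
d1, (y1, z1) = pinhole_projection(0.05, 1.0, (10.0, 2.0, -1.0))
```

Note that d1 depends on the product of the unknown size d and the unknown depth X, which is why the method later normalises the relative position by d instead of trying to recover X directly.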
Further, a proportional relative position equation is obtained, in which the relative acceleration is the difference between the acceleration of the non-cooperative target and the acceleration of the tracker, divided by the actual size of the non-cooperative target.
Further, the finite time controller is:
[formula images: explicit expressions of the finite time controller]
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity corresponding to the i-axis; a ∈ (0, 1); b ∈ (1, ∞); r_i, δ_i, k_1i, k_2i and α_i are positive numbers; c ∈ (0, 1).
Further, a monocular sensor is used for acquiring image information of the non-cooperative target.
In a second aspect, the present invention further provides a vision-based non-cooperative target limited time tracking system for an unmanned aerial vehicle, including:
a data acquisition module configured to: acquiring image information of a non-cooperative target;
an equation of motion establishment module configured to: establish a motion equation of the tracker and the non-cooperative target according to the image information;
a relative kinetic equation establishment module configured to: establish a relative kinetic equation of the tracker and the non-cooperative target according to the image information;
a proportional relative dynamics equation establishment module configured to: obtaining a proportional relative kinetic equation of the tracker and the non-cooperative target according to the motion equation and the relative kinetic equation;
a controller establishment module configured to: according to a proportional relative kinetic equation, establishing a finite time controller by adopting an integral sliding mode algorithm;
a control module configured to: and controlling the tracker by utilizing the established finite time controller.
In a third aspect, the present invention also provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, performs the steps of the method for the limited-time tracking of non-cooperative targets of vision-based drones according to the first aspect.
In a fourth aspect, the present invention further provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor executes the program to implement the steps of the method for tracking non-cooperative target limited time of the vision-based drone according to the first aspect.
Compared with the prior art, the invention has the beneficial effects that:
1. firstly, according to image information, a motion equation and a relative kinetic equation of a tracker and a non-cooperative target are obtained; then, according to the motion equation and the relative kinetic equation, a proportional relative kinetic equation of the tracker and the non-cooperative target is obtained; finally, according to a proportional relative kinetic equation, an integral sliding mode algorithm is adopted to establish a finite time controller for controlling the unmanned aerial vehicle; by the method, the limited time controller can be established only by acquiring the image information in the visual sensor, so that the application of a radar or an estimator and the like is avoided, and the cost is saved;
2. compared with the traditional tracking control algorithm, the method adopts the monocular sensor with low cost and low power consumption as a measuring tool, and can be applied to indoor or suburban areas without communication or poor signals;
3. the target aimed by the method can be non-cooperative, namely the target is maneuvering random but the acceleration is bounded, under the condition of no communication, a monocular sensor is adopted to obtain target image information, a proportional relative position model is established according to image change, the defect that the monocular sensor cannot directly measure the relative position is overcome, and the tracking problem of the non-cooperative target is realized;
4. the transient state process of non-cooperative target tracking of the unmanned aerial vehicle is considered, an integral sliding mode surface is designed, system input is designed by using a sliding mode control method, the continuity of a system controller is ensured, and buffeting is relieved;
5. the invention adopts a finite time stabilization method, which differs from the traditional asymptotic stability theory in which convergence is only guaranteed as time tends to infinity; the finite time stabilization method controls the actual system more effectively, because in real life the time available for controlling an actual system cannot tend to infinity.
Drawings
The accompanying drawings, which form a part hereof, are included to provide a further understanding of the present embodiments, and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the present embodiments and together with the description serve to explain the embodiments and are not intended to limit the embodiments to the proper form disclosed herein.
FIG. 1 is a flowchart of example 1 of the present invention;
FIG. 2 is a model of the relative position between a follower and a target in a global coordinate system according to embodiment 1 of the present invention;
FIG. 3 is an imaging model of a non-cooperative target under a monocular sensor according to embodiment 1 of the present invention;
fig. 4 is a motion trajectory in a two-dimensional plane of an unmanned aerial vehicle and a non-cooperative target according to embodiment 1 of the present invention;
fig. 5 is a three-dimensional plane motion trajectory of the drone and the non-cooperative target of embodiment 1 of the present invention;
fig. 6 is the relative position of the drone and the non-cooperative target of embodiment 1 of the present invention;
fig. 7 is a diagram showing the relative speed of the drone and the non-cooperative target in embodiment 1 of the present invention.
Detailed Description
The invention is further described with reference to the following figures and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
Example 1:
as shown in fig. 1, this embodiment provides a vision-based finite time tracking method for a non-cooperative target of an unmanned aerial vehicle. First, image information of the non-cooperative target is acquired; according to the image information, a motion equation and a relative kinetic equation of the tracker and the non-cooperative target are established; then, according to the motion equation and the relative kinetic equation, a proportional relative kinetic equation of the tracker and the non-cooperative target is obtained; finally, according to the proportional relative kinetic equation, an integral sliding mode algorithm is adopted to establish a finite time controller. The tracker is controlled by the established finite time controller; the tracker in this embodiment is the unmanned aerial vehicle to be controlled. The method comprises the following specific steps:
s1, selecting a non-cooperative target;
s2, after the target is determined, compiling a motion equation of the follower and the target according to the preliminarily obtained follower and target information, and establishing a relative position model;
s3, acquiring the size of the non-cooperative target on an image plane and the gravity center coordinate of the non-cooperative target through a monocular sensor of the unmanned aerial vehicle, and continuously acquiring the image information of the non-cooperative target in real time;
s4, according to the image information change of the non-cooperative target, a proportion relative position model of the tracker and the non-cooperative target is obtained;
and S5, designing a finite time controller by adopting an integral sliding mode algorithm according to the dynamic model established in the step S4, and realizing that the unmanned aerial vehicle reaches or intercepts a non-cooperative target in finite time.
Wherein, step S1 specifically includes: selecting a target, receiving and displaying an image returned by the unmanned aerial vehicle on line by a ground station computer, and operating a target detection algorithm or manually selecting a tracking target to obtain an initial target enclosure frame area;
wherein, step S2 specifically includes:
s2.1, using a visual algorithm to set the geometric center of the non-cooperative target as a feature point;
s2.2, according to the flying height of the unmanned aerial vehicle, the camera internal reference, the position and the size of the non-cooperative target in the geometric center of the image plane and the bounded acceleration, considering that under an earth coordinate system O = (N, E, D), the motion equation of the target and the motion equation of a follower are expressed as follows:
position D of the object G (t) acceleration is a G (t), then the equation of motion is as follows:
Figure BDA0003845405190000081
wherein,
Figure BDA0003845405190000082
a target initial velocity vector is obtained;
Figure BDA0003845405190000083
is the initial position vector of the target.
The position of the tracker is D_F(t) and its acceleration is a_F(t); the equation of motion is then
d²D_F(t)/dt² = a_F(t)
wherein V_F0 = dD_F(0)/dt is the tracker initial velocity vector and D_F0 = D_F(0) is the tracker initial position vector.
Thus, the relative position model between the follower and the target is established as follows:
D(t) = D_G(t) − D_F(t)
d²D(t)/dt² = a_G(t) − a_F(t)
the control objective herein is
Figure BDA0003845405190000088
When the time tends to
Figure BDA0003845405190000089
The relative position tends towards the origin, i.e. the follower reaches (intercepts) the target in a limited time.
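The relative-position model above (two double integrators subtracted from each other) can be sketched as a simple Euler integration. The function and numbers below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def simulate_relative_position(a_G, a_F, D0, V0, dt=0.01, steps=1000):
    """Euler-integrate the relative double-integrator model
        dD/dt = V,  dV/dt = a_G(t) - a_F(t),
    obtained by subtracting the follower's equation of motion from the
    target's. D0 and V0 are the initial relative position/velocity vectors;
    a_G and a_F are callables returning 3-vectors of acceleration."""
    D, V = np.asarray(D0, float), np.asarray(V0, float)
    for k in range(steps):
        t = k * dt
        V = V + (a_G(t) - a_F(t)) * dt   # relative velocity update
        D = D + V * dt                   # relative position update
    return D, V

# Constant relative deceleration drives a 10 m gap shut over ~10 s
D, V = simulate_relative_position(lambda t: np.zeros(3),
                                  lambda t: np.array([0.2, 0.0, 0.0]),
                                  D0=[10.0, 0.0, 0.0], V0=[0.0, 0.0, 0.0],
                                  dt=0.01, steps=1000)
```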
Wherein, step S3 specifically includes: the size d_1 of the non-cooperative target in the image plane and the coordinates (y_1, z_1) of its center of gravity in the image plane are obtained through the monocular sensor of the unmanned aerial vehicle; the focal length f of the airborne sensor of the unmanned aerial vehicle and the actual size range d_min ≤ d ≤ d_max are known. According to the triangulation theorem, the azimuth angle χ and the elevation angle of the target relative to the follower are obtained from (y_1, z_1) and f:
[formula images: expressions for the azimuth and elevation angles]
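A minimal sketch of recovering the bearing angles from the image measurements. The exact angle convention is not reproduced in the text, so the arctangent forms below are assumptions consistent with the triangulation description:

```python
import numpy as np

def bearing_angles(f, y1, z1):
    """Azimuth chi and elevation angle of the target relative to the follower,
    recovered from the image-plane centre of gravity (y1, z1) and the focal
    length f. Convention assumed here: azimuth in the camera's horizontal
    plane, elevation measured out of that plane."""
    chi = np.arctan2(y1, f)                    # horizontal bearing
    elev = np.arctan2(z1, np.hypot(f, y1))     # vertical bearing
    return chi, elev

# Target imaged at (0.05, 0.0) with f = 0.05 sits 45 degrees off-axis
chi, elev = bearing_angles(0.05, 0.05, 0.0)
```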
wherein, step S4 specifically includes:
s4.1, the target information is provided by the monocular sensor. The monocular sensor is arranged on a holder of the unmanned aerial vehicle, and the optical axis of the monocular sensor is parallel to a body coordinate system O of the unmanned aerial vehicle B =(X B ,Y B ,Z B ) X of (2) B A shaft.
Based on the target information provided by the monocular sensor, namely the azimuth angle χ and the elevation angle of the target relative to the follower, the relative position expression in the body coordinate system can be obtained:
[formula image: relative position in the body coordinate system]
according to a conversion matrix between an unmanned aerial vehicle body coordinate system and a terrestrial coordinate system
Figure BDA0003845405190000091
The relative position in the terrestrial coordinate system is thus expressed as
Figure BDA0003845405190000092
The transformation matrix is as follows:
[formula image: Euler-angle rotation matrix]
where φ, θ, ψ are the system Euler angles.
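Since the patent's transformation matrix appears only as an image, the sketch below assumes the common Z-Y-X (yaw-pitch-roll) Euler convention for the body-to-earth rotation:

```python
import numpy as np

def body_to_earth(phi, theta, psi):
    """Rotation matrix from the body frame to the earth frame for roll phi,
    pitch theta, yaw psi, using the Z-Y-X (yaw-pitch-roll) convention.
    The patent's exact matrix is given only as an image, so this common
    aerospace convention is an assumption."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(phi), -np.sin(phi)],
                   [0, np.sin(phi),  np.cos(phi)]])
    Ry = np.array([[ np.cos(theta), 0, np.sin(theta)],
                   [0, 1, 0],
                   [-np.sin(theta), 0, np.cos(theta)]])
    Rz = np.array([[np.cos(psi), -np.sin(psi), 0],
                   [np.sin(psi),  np.cos(psi), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx   # compose yaw, pitch, roll

R = body_to_earth(0.0, 0.0, np.pi / 2)   # pure 90-degree yaw
```

A body-frame relative position D_B maps to the earth frame as R @ D_B.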
S4.2, according to the image information of the tracked target continuously obtained in real time, the size d_1 of the non-cooperative target in the image plane is obtained; the focal length f of the airborne sensor of the unmanned aerial vehicle and the actual size range d_min ≤ d ≤ d_max are known. A monocular sensor-based proportional relative position model of the non-cooperative target is then established.
According to the pinhole imaging principle, the following scale information can be obtained:
d_1 / f = d / X
where X is the depth of the target along the optical axis. Because the actual size of the target cannot be obtained by vision measurement, a proportional relative position model is adopted instead:
r = D(t) / d
i.e. the relative position normalised by the unknown but bounded target size.
thus, the problem can be reduced to designing the controller for a second order system with unknown parameters and perturbations to converge the relative position to a small neighborhood of the origin in a limited time.
Wherein, step S5 specifically includes: and (4) designing a finite time controller by adopting an integral sliding mode algorithm according to the dynamic model established in the step (4), and realizing that the unmanned aerial vehicle reaches (intercepts) a non-cooperative target in finite time.
S5.1, according to the kinematics and the relative kinetic equation of the follower and the target, under the measurement of the monocular sensor, the following proportional relative dynamics are obtained:
dr/dt = v
dv/dt = (a_G − a_F) / d
wherein r = [r_1 r_2 r_3]^T is the proportional relative position; v = [v_1 v_2 v_3]^T is the relative velocity; d is the actual size of the target, a bounded unknown constant satisfying 0 < d_min < d < d_max; a_G = [a_G1 a_G2 a_G3]^T is the target acceleration, regarded as a bounded disturbance with maximum a_Gmax = [a_1max a_2max a_3max]^T.
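A one-step numerical sketch of the proportional relative dynamics above; the step function and numbers are illustrative assumptions:

```python
import numpy as np

def proportional_dynamics_step(r, v, a_G, a_F, d, dt):
    """One Euler step of the proportional (scale-normalised) relative dynamics
        dr/dt = v,  dv/dt = (a_G - a_F) / d,
    where d is the unknown but bounded actual target size. Dividing by d is
    what lets the controller work with image-derived quantities instead of
    metric positions."""
    v_next = v + (a_G - a_F) / d * dt
    r_next = r + v * dt
    return r_next, v_next

# Follower accelerating at 1 m/s^2 along x against a non-accelerating target
r, v = proportional_dynamics_step(np.array([5.0, 0.0, 0.0]), np.zeros(3),
                                  np.zeros(3), np.array([1.0, 0.0, 0.0]),
                                  d=0.8, dt=0.1)
```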
S5.2, the errors r and v reflect the control effect; when both r and v tend to zero, the control objective is considered achieved. An integral sliding mode surface s_i, i = 1, 2, 3, is designed, of the form
s_i = v_i + ∫_0^t ( k_1i · sig(r_i)^a + k_2i · sig(v_i)^b ) dτ
wherein k_1i and k_2i are integral gains, and a ∈ (0, 1) and b ∈ (1, ∞) are function exponents.
sig(∗)^a = |∗|^a sign(∗), where |∗| and sign(∗) denote the absolute value and the sign function, respectively.
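The sig(·) function and a discrete approximation of the integral sliding mode surface can be sketched as follows; the surface form is an assumption consistent with the stated gains and exponents, not the patent's exact (image-only) expression:

```python
import numpy as np

def sig(x, a):
    """sig(x)^a = |x|^a * sign(x): raises the magnitude while preserving sign."""
    return np.abs(x) ** a * np.sign(x)

def sliding_surface(v_i, r_hist, v_hist, k1, k2, a, b, dt):
    """Integral sliding surface s_i = v_i + sum over history of
    (k1*sig(r)^a + k2*sig(v)^b)*dt, a discrete stand-in for the integral.
    r_hist/v_hist are the sampled error trajectories for one axis."""
    integral = dt * np.sum(k1 * sig(np.asarray(r_hist), a)
                           + k2 * sig(np.asarray(v_hist), b))
    return v_i + integral

s = sig(-4.0, 0.5)                                   # = -2.0
ss = sliding_surface(1.0, [0.0], [0.0], 2, 2, 0.5, 1.5, 0.01)
```

With zero error history the surface reduces to the current relative velocity, which is the expected behaviour at the initial instant.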
S5.3, according to the defined integral sliding mode surface, the corresponding finite time integral sliding mode controller u is designed by selecting a corresponding Lyapunov function:
[formula images: expressions of the finite time integral sliding mode controller]
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity corresponding to the i-axis; a ∈ (0, 1); b ∈ (1, ∞); r_i, δ_i, k_1i, k_2i and α_i are positive numbers; c ∈ (0, 1). By designing an integral sliding mode surface, the obtained controller is a continuous function, eliminating chattering and singularity.
S5.4, under the controller designed in S5.3, the unmanned aerial vehicle based on the monocular sensor can reach (intercept) the non-cooperative target in a finite time bounded by
[formula image: settling-time bound]
wherein k_i = min{k_1i, k_2i}, k = min{k_i}, i = 1, 2, 3, 0 < β < 1; V is the chosen Lyapunov function, and V_0 = V(s_0) represents its initial value.
To demonstrate the effectiveness of this embodiment, in other embodiments, the following simulation verifications were performed:
in the simulation experiment, the control target is a non-cooperative target, and the finite time controller of the unmanned aerial vehicle based on the monocular sensor is designed so that the unmanned aerial vehicle reaches (intercepts) the non-cooperative target in a finite time. The target initial position is set to D_G0 = [50, 30, 4]^T m, the initial velocity to V_G0 = [4, 2, 1]^T m/s, and the acceleration to a_G = a_0 + Δa m/s², where Δa = 0.05 [sin(2πt/60), sin(2πt/60), …]^T m/s² and a_0 is given by:
[formula image: expression for a_0]
The follower is the unmanned aerial vehicle; its initial position and initial velocity are both set to 0. The controller parameters are:
[formula image: controller parameters]
k_1i = k_2i = 2, δ_i = 0.001, d_min = 0.6, d_max = 1, α_i = 10.
MATLAB software is used to simulate the mathematical model established in the control method of this embodiment, yielding the simulation results of figs. 4 to 7. Fig. 4 shows the tracking trajectory of the drone and the non-cooperative target on the XOY plane in the terrestrial coordinate system. Fig. 5 shows the three-dimensional tracking trajectory of the drone and the non-cooperative target in the terrestrial coordinate system. It can be seen that the drone reaches (intercepts) the non-cooperative target in a limited time. Figs. 6 and 7 show the relative position and relative velocity trajectories of the drone and the non-cooperative target. Therefore, the vision-based, monocular-sensor finite time tracking method can achieve a good tracking effect on a moving target, and realizes finite time tracking control of a non-cooperative target by the unmanned aerial vehicle in an environment without GPS signals.
Stability analysis:
The Lyapunov function is designed as
[formula image: Lyapunov function]
The integral sliding mode surface is:
[formula image: integral sliding mode surface]
The controller is designed as
[formula image: controller]
Selecting appropriate design parameters, one obtains
[formula image: bound on the derivative of the Lyapunov function]
wherein k_i = min(k_1i, k_2i), k = min(k_1, k_2, k_3), and 0 < β < 1.
According to the Lyapunov finite time stability theorem, the unmanned aerial vehicle can reach (intercept) a non-cooperative target in finite time.
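The finite time stability argument can be illustrated numerically: for a Lyapunov function obeying dV/dt ≤ −k·V^β with 0 < β < 1, the standard settling-time bound is T ≤ V_0^(1−β) / (k·(1−β)). The exact bound in the patent is given only as an image, so the sketch below uses this standard form as an assumption and checks it against direct integration:

```python
def settling_time_bound(V0, k, beta):
    """Standard finite-time-stability settling bound for dV/dt <= -k*V**beta,
    0 < beta < 1 (assumed to match the patent's image-only expression)."""
    return V0 ** (1.0 - beta) / (k * (1.0 - beta))

def simulate_decay(V0, k, beta, dt=1e-4):
    """Euler-integrate dV/dt = -k*V**beta until V hits zero; return elapsed time."""
    V, t = V0, 0.0
    while V > 0.0:
        V = max(V - k * V ** beta * dt, 0.0)  # clamp at zero once reached
        t += dt
    return t

T_bound = settling_time_bound(4.0, 2.0, 0.5)   # analytic: 2.0 s
T_sim = simulate_decay(4.0, 2.0, 0.5)          # numeric, agrees closely
```

The simulated extinction time matches the analytic bound, in contrast with asymptotic stability, where V would only approach zero as t → ∞.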
The embodiment converts the tracking problem into a second-order system problem with unknown disturbance by using the image change condition under the condition of no communication and unknown non-cooperative target relative position. And secondly, the limitation of the traditional backstepping method is avoided, an integral sliding mode control algorithm is selected, and calculation and buffeting are reduced. And a continuous finite time controller is designed by considering the transient process in tracking, so that the unmanned aerial vehicle can reach (intercept) a non-cooperative target in a finite time.
Example 2:
the embodiment provides a vision-based unmanned aerial vehicle non-cooperative target limited time tracking system, which comprises:
a data acquisition module configured to: acquiring image information of a non-cooperative target;
an equation of motion establishment module configured to: establish a motion equation of the tracker and the non-cooperative target according to the image information;
a relative kinetic equation establishment module configured to: establish a relative kinetic equation of the tracker and the non-cooperative target according to the image information;
a proportional relative dynamics equation establishment module configured to: obtaining a proportional relative kinetic equation of the tracker and the non-cooperative target according to the motion equation and the relative kinetic equation;
a controller establishment module configured to: according to a proportional relative kinetic equation, establishing a finite time controller by adopting an integral sliding mode algorithm;
a control module configured to: and controlling the tracker by utilizing the established finite time controller.
The working method of the system is the same as the non-cooperative target limited time tracking method of the unmanned aerial vehicle based on vision in embodiment 1, and is not described again here.
Example 3:
the present embodiment provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of the vision-based drone non-cooperative target limited time tracking method described in embodiment 1.
Example 4:
the embodiment provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the steps of the vision-based drone non-cooperative target limited-time tracking method described in embodiment 1 are implemented.
The above description covers only preferred embodiments of the present invention and is not intended to limit it; those skilled in the art may make various modifications and changes to the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its protection scope.

Claims (10)

1. A vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle, characterized by comprising the following steps:
acquiring image information of a non-cooperative target;
establishing the equations of motion of the tracker and the non-cooperative target according to the image information;
establishing a relative kinetic equation of the tracker and the non-cooperative target according to the image information;
obtaining a proportional relative kinetic equation of the tracker and the non-cooperative target according to the motion equation and the relative kinetic equation;
according to a proportional relative kinetic equation, establishing a finite time controller by adopting an integral sliding mode algorithm;
and controlling the tracker by utilizing the established finite time controller.
2. The vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle as claimed in claim 1, characterized in that the geometric center of the non-cooperative target is defined as a feature point; the equation of motion of the non-cooperative target and the equation of motion of the tracker are established in the terrestrial coordinate system according to the feature point; and the equation of motion of the tracker relative to the non-cooperative target is the difference between the equation of motion of the non-cooperative target and the equation of motion of the tracker.
3. The vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle as claimed in claim 1, characterized in that, when the image information of the non-cooperative target is acquired, the azimuth angle and the elevation angle of the non-cooperative target relative to the follower are obtained, according to the triangulation theorem, from the size of the non-cooperative target in the image plane, the coordinates of the center of gravity of the non-cooperative target in the image plane, and the focal length of the sensor on board the unmanned aerial vehicle.
4. The vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle as claimed in claim 3, characterized in that a relative position expression of the non-cooperative target relative to the follower in the body coordinate system is obtained according to the azimuth angle and the elevation angle of the non-cooperative target relative to the follower; and the relative position expression in the body coordinate system is converted to obtain the relative position expression D of the non-cooperative target relative to the follower in the terrestrial coordinate system;
according to the pinhole imaging principle, the following results are obtained:
Figure FDA0003845405180000021
wherein f is the focal length of the sensor on board the unmanned aerial vehicle; d is the actual size of the non-cooperative target; d1 is the size of the non-cooperative target in the image plane; and (y1, z1) are the coordinates of the center of gravity of the non-cooperative target in the image plane.
5. The vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle as claimed in claim 4, characterized in that the proportional relative kinetic equation is obtained as the ratio of the difference between the acceleration of the non-cooperative target and the acceleration of the tracker to the actual size of the non-cooperative target.
6. The vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle as claimed in claim 1, characterized in that the finite-time controller is:
Figure FDA0003845405180000022
Figure FDA0003845405180000023
wherein d is the actual size of the non-cooperative target; s_i is the integral sliding mode surface; v_i is the relative velocity corresponding to the i-axis; a ∈ (0, 1); b ∈ (1, ∞); r_i, δ_i, k_1i, k_2i and α_i are positive numbers; and c ∈ (0, 1).
7. The vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle as claimed in claim 1, characterized in that the image information of the non-cooperative target is acquired by a monocular sensor.
8. A vision-based finite-time tracking system for a non-cooperative target of an unmanned aerial vehicle, characterized by comprising:
a data acquisition module configured to: acquiring image information of a non-cooperative target;
an equation of motion establishment module configured to: establish the equations of motion of the tracker and the non-cooperative target according to the image information;
a relative kinetic equation establishment module configured to: establish a relative kinetic equation of the tracker and the non-cooperative target according to the image information;
a proportional relative dynamics equation establishment module configured to: obtaining a proportional relative kinetic equation of the tracker and the non-cooperative target according to the motion equation and the relative kinetic equation;
a controller establishment module configured to: according to a proportional relative kinetic equation, establishing a finite time controller by adopting an integral sliding mode algorithm;
a control module configured to: and controlling the tracker by utilizing the established finite time controller.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the steps of the vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle of any one of claims 1 to 7.
10. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the vision-based finite-time tracking method for a non-cooperative target of an unmanned aerial vehicle of any one of claims 1 to 7.
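As an illustration of the image-geometry relations in claims 3 and 4, the sketch below recovers the bearing angles and the scaled relative position from the image-plane measurements. The axis convention (optical axis along the body x-axis, azimuth in the x-y plane, elevation out of it) and the function names are assumptions for this sketch, not the patent's exact expressions.

```python
import math

def bearing_from_pixel(y1, z1, f):
    """Azimuth and elevation of the image-plane centroid (y1, z1) for a
    camera of focal length f whose optical axis is the body x-axis
    (assumed convention)."""
    azimuth = math.atan2(y1, f)
    elevation = math.atan2(z1, math.hypot(f, y1))
    return azimuth, elevation

def relative_position_body_frame(d, d1, f, y1, z1):
    """Pinhole similar triangles give d1/d = f/x, so the known true size d
    fixes the depth x = f*d/d1 and scales the lateral offsets the same way."""
    if d1 <= 0:
        raise ValueError("target must be visible (d1 > 0)")
    scale = d / d1
    return (f * scale, y1 * scale, z1 * scale)
```

A target twice its image size (d/d1 = 2) at focal length f = 0.5 sits at depth 1.0 in this convention, with the lateral offsets (y1, z1) magnified by the same factor of 2.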
CN202211115584.8A 2022-09-14 2022-09-14 Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system Active CN115357049B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211115584.8A CN115357049B (en) 2022-09-14 2022-09-14 Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system

Publications (2)

Publication Number Publication Date
CN115357049A true CN115357049A (en) 2022-11-18
CN115357049B CN115357049B (en) 2024-04-16

Family

ID=84005858

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211115584.8A Active CN115357049B (en) 2022-09-14 2022-09-14 Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system

Country Status (1)

Country Link
CN (1) CN115357049B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107529376B (en) * 2013-08-01 2015-12-30 上海新跃仪表厂 The method of the microsatellite non-cooperative target Relative Navigation of multimodality fusion
CN109976363A (en) * 2019-03-20 2019-07-05 中国科学院深圳先进技术研究院 Unmanned aerial vehicle (UAV) control method, apparatus, computer equipment and storage medium
CN111984020A (en) * 2020-07-21 2020-11-24 广东工业大学 SDRE-based adaptive optimal sliding mode control method for transitional flight mode of tilting quad-rotor unmanned aerial vehicle
CN112947560A (en) * 2021-02-07 2021-06-11 广东工业大学 Sliding mode tracking control method and system for multiple high-rise fire-fighting unmanned aerial vehicles under unknown disturbance
KR102294829B1 (en) * 2020-09-23 2021-08-27 세종대학교산학협력단 System and flight control method for unmanned aerial vehicle with variable load
US20210311503A1 (en) * 2020-04-03 2021-10-07 Pablo Air Co., Ltd. Method in which small fixed-wing unmanned aerial vehicle follows path and lgvf path-following controller using same
CN113900440A (en) * 2021-07-21 2022-01-07 中国电子科技集团公司电子科学研究院 Unmanned aerial vehicle control law design method and device and readable storage medium
CN114967729A (en) * 2022-03-28 2022-08-30 广东工业大学 Multi-rotor unmanned aerial vehicle height control method and system


Similar Documents

Publication Publication Date Title
Nguyen et al. Robust target-relative localization with ultra-wideband ranging and communication
Lange et al. Autonomous corridor flight of a UAV using a low-cost and light-weight RGB-D camera
CN106969784B (en) A kind of combined error emerging system for concurrently building figure positioning and inertial navigation
CN108563236B (en) Target tracking method of nano unmanned aerial vehicle based on concentric circle characteristics
CN110262555B (en) Real-time obstacle avoidance control method for unmanned aerial vehicle in continuous obstacle environment
CN111238469B (en) Unmanned aerial vehicle formation relative navigation method based on inertia/data chain
Lippiello et al. Closed-form solution for absolute scale velocity estimation using visual and inertial data with a sliding least-squares estimation
CN113129377A (en) Three-dimensional laser radar rapid robust SLAM method and device
CN113076634B (en) Multi-machine cooperative passive positioning method, device and system
She et al. Vision-based adaptive fixed-time uncooperative target tracking for QUAV with unknown disturbances
CN108646760B (en) Monocular vision based mobile robot target tracking and platform control system and method
Mills et al. Vision based control for fixed wing UAVs inspecting locally linear infrastructure using skid-to-turn maneuvers
CN115357049B (en) Visual-based unmanned aerial vehicle non-cooperative target limited time tracking method and system
Olivares-Mendez et al. Autonomous landing of an unmanned aerial vehicle using image-based fuzzy control
Araar et al. A new hybrid approach for the visual servoing of VTOL UAVs from unknown geometries
Ho et al. Characterization of flow field divergence for MAVs vertical control landing
Ramirez et al. Stability analysis of a vision-based UAV controller: An application to autonomous road following missions
Ammann et al. Undelayed initialization of inverse depth parameterized landmarks in UKF-SLAM with error state formulation
CN116952229A (en) Unmanned aerial vehicle positioning method, device, system and storage medium
Shastry et al. Autonomous detection and tracking of a high-speed ground vehicle using a quadrotor UAV
Yang et al. Inertial-aided vision-based localization and mapping in a riverine environment with reflection measurements
Jiang et al. Quadrotors' Low-cost Vision-based Autonomous Landing Architecture on a Moving Platform
Liu et al. Sensing via collisions: a smart cage for quadrotors with applications to self-localization
Zhang et al. An unscented Kalman filter-based visual pose estimation method for underwater vehicles
CN114003041A (en) Multi-unmanned vehicle cooperative detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant