
CN106647814B - UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition - Google Patents

UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition

Info

Publication number
CN106647814B
Authority
CN
China
Prior art keywords
information
vector
dimensional code
module
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611092540.2A
Other languages
Chinese (zh)
Other versions
CN106647814A (en)
Inventor
刘磊
谯睿智
王永骥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN201611092540.2A priority Critical patent/CN106647814B/en
Publication of CN106647814A publication Critical patent/CN106647814A/en
Application granted granted Critical
Publication of CN106647814B publication Critical patent/CN106647814B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition. The system comprises a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, a vision-assisted control switching module, an instruction output module, and a camera. By visually extracting two-dimensional code markers arranged at specific positions along the flight route and fusing the result with the inertial navigation system, precise position and attitude information is computed, which assists and improves the accuracy of a conventional GPS integrated navigation system. At the same time, the encoded information in the two-dimensional codes provides diversified guidance information for the UAV, expanding the variety of flight missions. In addition, a cascade flight control system with deviation-based adaptive compensation control is proposed, which achieves a smooth transition between the marker-identified and marker-unidentified states, improves the stability of flight control, and thereby improves the accuracy and speed of recognition.

Description

UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition
Technical field
The invention belongs to the technical field of unmanned aerial vehicles, and more particularly relates to a UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition.
Background art
In recent years, with the development of intelligence science and control science, unmanned aerial vehicles (UAVs) have become a popular research topic. UAVs are now widely used in aerial photography, geodetic surveying and mapping, geological rescue, fire rescue, traffic monitoring, and other fields. UAVs not only have practical social application value but also have important research significance in engineering and science; fields such as agricultural plant protection, power line inspection, forest fire prevention, and disaster inspection have great potential for development.
During autonomous UAV flight, conventional integrated navigation technology is limited by GPS accuracy, with a positioning accuracy of only about ±2 m. In applications with high requirements on route-following and hovering accuracy, such as express logistics delivery, disaster-relief support, shipboard landing, and automatic return for charging, other equipment is usually needed to improve the accuracy of reaching the target point, which imposes certain limitations.
Summary of the invention
In view of the above defects or improvement needs of the prior art, the present invention provides a UAV vision-assisted positioning and flight control system based on precise two-dimensional code landmark recognition. Several markers in the form of quick-response (QR) codes are arranged at specific positions along the flight route as key points. By visually extracting these two-dimensional code markers and fusing the result with the inertial navigation system, precise position and attitude information is computed, which assists and improves the accuracy of a conventional GPS integrated navigation system. Meanwhile, the encoded information of the two-dimensional codes provides diversified guidance information for the UAV and expands the variety of flight missions. In addition, a cascade flight control system with deviation-based adaptive compensation control is proposed, which achieves a smooth transition between the marker-identified and marker-unidentified states, improves the stability of flight control, and thereby improves the accuracy and speed of recognition. This solves the technical problem in the prior art that conventional integrated navigation technology is limited by GPS accuracy, has low positioning accuracy, and requires other equipment to improve the accuracy of reaching the target point.
To achieve the above object, according to one aspect of the present invention, a UAV vision-assisted positioning and flight control system based on two-dimensional code landmark recognition is provided, characterized by comprising: a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, an instruction output module, a vision-assisted control switching module, and a camera:
The sensor module is used to obtain the position information of the UAV body and the first motion velocity vector of the UAV body;
The tracking-trajectory generation module is used to generate a route tracking trajectory according to preset mission waypoint information, and to discretize the route tracking trajectory to obtain N expected waypoints, where N is a positive integer;
The vision processing module is used to obtain the position information, attitude information, and encoded information of a two-dimensional code marker from the image of the marker captured by the camera, and from these to obtain the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker;
The sensor update module is used to fuse the position information of the UAV body, the first motion velocity vector, the deviation distance vector, and the second motion velocity vector by a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector, and target second motion velocity vector of the UAV body;
The flight control module is used to generate a guidance command by deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector, and the target second motion velocity vector, and to send the guidance command to the instruction output module, where the target expected waypoint is the waypoint the UAV is currently heading to, and the guidance command includes a roll angle and a pitch angle;
The vision-assisted control switching module is used to control the flight control module to compute the guidance command from the information obtained by both the sensor module and the vision processing module when the two-dimensional code marker is in the identified state, and to compute the guidance command from the information obtained by the sensor module only when the two-dimensional code marker is in the unidentified state;
The instruction output module is used to output the guidance command.
Preferably, the camera is located at the bottom of the UAV body, and the field of view of the camera points vertically downward.
Preferably, the vision processing module includes an image graying module, an image binarization module, a binary image processing module, a two-dimensional code information extraction module, and a position-attitude acquisition module,
The image graying module is used to convert the image of the two-dimensional code marker into a single-channel grayscale image;
The image binarization module is used to set a fixed threshold based on the single-channel grayscale image and convert the grayscale image into a binary image;
The binary image processing module is used to perform contour detection on the binary image, traverse all polygons with four edges in the binary image, reject polygons whose area is smaller than a preset threshold, and then apply an orthogonal (perspective-rectifying) projection to the remaining four-edge polygons to obtain standard square images;
The two-dimensional code information extraction module is used to extract the binary-coded information and corner-point information from the square image according to preset encoding rules;
The position-attitude acquisition module is used to obtain, from the extracted binary-coded information and corner-point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker.
According to another aspect of the present invention, a UAV vision-assisted positioning and flight control method based on two-dimensional code landmark recognition is provided, characterized by comprising:
S1: obtaining the position information of the UAV and the first motion velocity vector of the UAV;
S2: generating a route tracking trajectory according to preset mission waypoint information, and discretizing the route tracking trajectory to obtain N expected waypoints, where N is a positive integer;
S3: obtaining the position information, attitude information, and encoded information of a two-dimensional code marker from the image of the marker captured by a camera, and from these obtaining the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker;
S4: fusing the position information of the UAV, the first motion velocity vector, the deviation distance vector, and the second motion velocity vector by a Kalman filtering algorithm to obtain the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector, and target second motion velocity vector of the UAV;
S5: generating a guidance command by deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector, and the target second motion velocity vector, where the target expected waypoint is the waypoint the UAV is currently heading to, and the guidance command includes a roll angle and a pitch angle;
S6: outputting the guidance command.
Preferably, the camera is located at the bottom of the UAV, and the field of view of the camera points vertically downward.
Preferably, step S3 specifically includes the following sub-steps:
S301: converting the image of the two-dimensional code marker into a single-channel grayscale image;
S302: setting a fixed threshold based on the single-channel grayscale image and converting the grayscale image into a binary image;
S303: performing contour detection on the binary image, traversing all polygons with four edges in the binary image, rejecting polygons whose area is smaller than a preset threshold, and then applying an orthogonal (perspective-rectifying) projection to the remaining four-edge polygons to obtain standard square images;
S304: extracting the binary-coded information and corner-point information from the square image according to preset encoding rules;
S305: obtaining, from the extracted binary-coded information and corner-point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker.
In general, compared with the prior art, the above technical solutions conceived by the present invention have the following main technical advantages:
(1) By recognizing two-dimensional code markers arranged at specific positions on the ground, landmark information is obtained from the two-dimensional code encoding, and multiple sensor measurements are fused to improve the positioning accuracy of the UAV, thereby assisting and improving a conventional GPS integrated navigation system. Because the two-dimensional encoding can provide rich landmark information and has encryption capability, it can also provide diversified guidance information for the UAV, expanding the variety of flight missions.
(2) The same cascade flight control system is used in both the marker-identified and marker-unidentified states, and a deviation-based adaptive compensation control method is proposed that compensates for the additional position information obtained in the marker-identified state. This achieves a smooth transition between the marker-identified and marker-unidentified states, improves the stability of flight control, and ensures that the rotor UAV can achieve fast and accurate recognition in various interference environments.
Brief description of the drawings
Fig. 1 is a hardware architecture diagram of high-precision autonomous UAV flight disclosed by an embodiment of the present invention;
Fig. 2 is a structural schematic diagram of a UAV vision-assisted positioning and flight control system based on two-dimensional code landmark recognition disclosed by an embodiment of the present invention;
Fig. 3 is an information interaction diagram of the modules of a UAV vision-assisted positioning and flight control system based on two-dimensional code landmark recognition disclosed by an embodiment of the present invention;
Fig. 4 is a flow diagram of a UAV vision-assisted positioning and flight control method based on two-dimensional code landmark recognition disclosed by an embodiment of the present invention;
Fig. 5 is a flow diagram of high-precision autonomous UAV flight disclosed by an embodiment of the present invention.
Detailed description of the embodiments
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only intended to illustrate the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict with one another.
Fig. 1 shows a hardware architecture diagram of high-precision autonomous UAV flight disclosed by an embodiment of the present invention. In the architecture shown in Fig. 1, the upper-left part is the navigation sensor suite, which may include an accelerometer, a gyroscope, an ultrasonic sensor, a barometer, a magnetometer, a GPS module, and so on; each sensor can communicate with the flight control mainboard in the lower-left part of Fig. 1 through I2C and SPI interfaces. The lower-right part of Fig. 1 is the camera on the UAV, which can communicate with the vision processing mainboard in the upper-right part through a USB 2.0 interface, and the vision processing mainboard can communicate with the flight control mainboard through a TTL serial port.
The flight control mainboard may use an STM32F407 embedded processor with a main operating frequency of 168 MHz. The navigation sensors may specifically include an MPU6050 gyroscope and accelerometer, an MS5611 high-precision barometer, an M8N GPS receiver, and a US-100 ultrasonic rangefinder. The vision processing mainboard may use an S5P4418 high-performance processor with a main operating frequency of 1.4 GHz and 1 GB of DDR3 memory. The camera may be a KS2A17, communicating with the vision processing mainboard over USB 2.0, with a maximum frame rate of 120 fps at 640 × 480 resolution. The vision processing mainboard can exchange data with the flight control mainboard through the TTL serial connection.
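As an illustration of the data link between the two boards, the following is a minimal sketch of how the vision processing mainboard might stream its measurements to the flight control mainboard over the TTL serial port. The packet layout, frame marker, checksum, port name, and baud rate are illustrative assumptions, not values specified by the patent.

```python
# Minimal sketch (assumed packet format): the vision board packs the deviation
# distance vector and the relative velocity vector into a framed binary message
# and writes it to the TTL serial link read by the flight control mainboard.
import struct
import serial  # pyserial

PACKET_HEADER = b"\xAA\x55"            # assumed frame marker

def pack_vision_packet(deviation_xyz, velocity_xyz, marker_id):
    payload = struct.pack("<B6f", marker_id, *deviation_xyz, *velocity_xyz)
    checksum = sum(payload) & 0xFF      # simple additive checksum (assumed)
    return PACKET_HEADER + payload + bytes([checksum])

if __name__ == "__main__":
    # Port name and baud rate are placeholders for the actual TTL link.
    link = serial.Serial("/dev/ttyS1", baudrate=115200, timeout=0.01)
    packet = pack_vision_packet((0.12, -0.05, 1.80), (0.01, 0.02, -0.10), marker_id=3)
    link.write(packet)
    link.close()
```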
Fig. 2 is a structural schematic diagram of a UAV vision-assisted positioning and flight control system based on two-dimensional code landmark recognition disclosed by an embodiment of the present invention, and Fig. 3 is an information interaction diagram of the modules of that system. As shown in Figs. 2 and 3, the system of the present invention includes a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, an instruction output module, a vision-assisted control switching module, and a camera.
The sensor module is used to obtain the position information of the UAV body and the first motion velocity vector of the UAV body;
The tracking-trajectory generation module is used to generate a route tracking trajectory according to preset mission waypoint information and to discretize the route tracking trajectory into N expected waypoints, where N is a positive integer; a sketch of such a discretization is given below.
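The following is a minimal sketch of one way this discretization could be carried out, assuming straight-line segments between consecutive mission waypoints and a roughly fixed spacing between expected waypoints. The patent does not prescribe a particular interpolation scheme, so the linear interpolation and the `spacing` parameter here are assumptions.

```python
import numpy as np

def discretize_route(mission_waypoints, spacing=2.0):
    """Interpolate straight segments between mission waypoints into a dense
    list of expected waypoints roughly `spacing` metres apart (assumption)."""
    mission_waypoints = np.asarray(mission_waypoints, dtype=float)
    expected = [mission_waypoints[0]]
    for start, end in zip(mission_waypoints[:-1], mission_waypoints[1:]):
        segment = end - start
        length = np.linalg.norm(segment)
        steps = max(int(np.ceil(length / spacing)), 1)
        for k in range(1, steps + 1):
            expected.append(start + segment * (k / steps))
    return np.array(expected)          # the N expected waypoints

# Example: three mission waypoints in a local ENU frame (metres).
route = discretize_route([(0, 0, 10), (20, 0, 10), (20, 30, 10)], spacing=2.0)
print(route.shape)                     # (N, 3)
```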
The vision processing module is used to obtain the position information, attitude information, and encoded information of a two-dimensional code marker from the image of the marker captured by the camera, and from these to obtain the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker.
The camera is located at the bottom of the UAV body, and the field of view of the camera points vertically downward.
The vision processing module includes an image graying module, an image binarization module, a binary image processing module, a two-dimensional code information extraction module, and a position-attitude acquisition module,
The image graying module is used to convert the image of the two-dimensional code marker into a single-channel grayscale image;
The image binarization module is used to set a fixed threshold based on the single-channel grayscale image and convert the grayscale image into a binary image;
The binary image processing module is used to perform contour detection on the binary image, traverse all polygons with four edges in the binary image, reject polygons whose area is smaller than a preset threshold, and then apply an orthogonal (perspective-rectifying) projection to the remaining four-edge polygons to obtain standard square images;
The two-dimensional code information extraction module is used to extract the binary-coded information and corner-point information from the square image according to preset encoding rules;
The position-attitude acquisition module is used to obtain, from the extracted binary-coded information and corner-point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker. An illustrative sketch of this image-processing pipeline is given below.
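A minimal OpenCV sketch of the graying, binarization, contour-detection, four-edge polygon filtering, and perspective-rectification steps described above. The fixed threshold value, the minimum-area threshold, the polygon-approximation tolerance, and the 200 × 200 pixel output size are illustrative assumptions, and decoding the rectified square against the preset encoding rules is not shown.

```python
import cv2
import numpy as np

FIXED_THRESHOLD = 100      # assumed fixed binarization threshold
MIN_AREA = 1000.0          # assumed minimum contour area in pixels
WARP_SIZE = 200            # assumed side length of the rectified square image

def extract_marker_candidates(bgr_image):
    """Return (rectified square image, image-space corner points) candidates."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)            # graying
    _, binary = cv2.threshold(gray, FIXED_THRESHOLD, 255,
                              cv2.THRESH_BINARY)                  # binarization
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_SIMPLE)       # contour detection
    candidates = []
    for contour in contours:
        approx = cv2.approxPolyDP(contour,
                                  0.03 * cv2.arcLength(contour, True), True)
        # Keep only four-edge polygons with a sufficiently large area.
        if len(approx) != 4 or cv2.contourArea(approx) < MIN_AREA:
            continue
        corners = approx.reshape(4, 2).astype(np.float32)
        target = np.float32([[0, 0], [WARP_SIZE, 0],
                             [WARP_SIZE, WARP_SIZE], [0, WARP_SIZE]])
        # Perspective rectification of the quadrilateral into a standard square.
        H = cv2.getPerspectiveTransform(corners, target)
        square = cv2.warpPerspective(binary, H, (WARP_SIZE, WARP_SIZE))
        candidates.append((square, corners))
    return candidates
```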
The two-dimensional code marker has a size of m cm × m cm, and the corner-point information obtained for the marker gives positions in the image coordinate system of the camera. Since the subsequent processing mainly compensates the waypoint flight error, the real-world coordinates of the four corner points of the two-dimensional code marker are uniformly specified as (m, m, 0), (m, 0, 0), (0, m, 0), and (0, 0, 0). From the camera imaging model s·m' = A[R|T]M, where A is the camera intrinsic matrix (obtainable by calibration experiments), m' is the point in the camera image coordinate system, M is the point in the real-world coordinate system, and [R|T] is the rotation-translation matrix, i.e., the position and attitude of the camera relative to a reference point in the real-world coordinate system, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker can be computed.
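The following sketch shows one standard way to recover [R|T] from the four corner points using the imaging model above, here via OpenCV's solvePnP. The marker side length, the intrinsic matrix values, and the finite-difference velocity estimate are placeholders; the actual implementation on the vision board may differ.

```python
import cv2
import numpy as np

M_SIDE = 0.30   # marker side length in metres (placeholder for "m cm")

# Real-world corner coordinates of the marker as specified in the description;
# the image corners passed in must follow the same ordering.
OBJECT_POINTS = np.float32([[M_SIDE, M_SIDE, 0.0],
                            [M_SIDE, 0.0,    0.0],
                            [0.0,    M_SIDE, 0.0],
                            [0.0,    0.0,    0.0]])

# Placeholder intrinsics A from a calibration experiment (fx, fy, cx, cy).
CAMERA_MATRIX = np.float64([[600.0, 0.0, 320.0],
                            [0.0, 600.0, 240.0],
                            [0.0,   0.0,   1.0]])
DIST_COEFFS = np.zeros(5)

def marker_pose(image_corners):
    """Solve s*m' = A*[R|T]*M for the camera pose relative to the marker."""
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS,
                                  np.float32(image_corners).reshape(4, 1, 2),
                                  CAMERA_MATRIX, DIST_COEFFS)
    if not ok:
        return None, None
    R, _ = cv2.Rodrigues(rvec)
    # Deviation distance vector: camera position expressed in the marker frame.
    camera_position = (-R.T @ tvec).ravel()
    return R, camera_position

def relative_velocity(prev_position, position, dt):
    """Finite-difference estimate of the second motion velocity vector."""
    return (position - prev_position) / dt
```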
The sensor update module is used to fuse the position information of the UAV body, the first motion velocity vector, the deviation distance vector, and the second motion velocity vector by a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector, and target second motion velocity vector of the UAV body.
Measurement accuracy can be improved by designing a Kalman filter to fuse the multi-sensor information.
In the state update equations of the Kalman filter, θ and γ are the pitch and roll angles in the rotation matrix R, V is the UAV velocity vector in real-world coordinates, a is the UAV acceleration in real-world coordinates, ab is the acceleration in the UAV body frame, which can be measured by the accelerometer on the UAV, wb is the angular velocity vector in the UAV body frame, which can be measured by the gyroscope on the UAV, and Δt is the filter update interval.
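A minimal sketch of such a fusion filter, assuming a world-frame position-and-velocity state propagated with the body-frame accelerometer through the rotation matrix R and corrected by position measurements from GPS and from the vision module. The noise covariances, gravity handling, and measurement models are assumptions for illustration; this is not the exact filter given by the patent's state update equations.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])

class FusionFilter:
    """Illustrative 6-state (position, velocity) Kalman filter."""

    def __init__(self, dt):
        self.dt = dt
        self.x = np.zeros(6)                       # [p; V] in the world frame
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)            # p += V * dt
        self.Q = 0.05 * np.eye(6)                  # assumed process noise

    def predict(self, R_body_to_world, accel_body):
        # a = R * ab + g, then V += a * dt (accelerometer-driven propagation).
        accel_world = R_body_to_world @ accel_body + GRAVITY
        self.x = self.F @ self.x
        self.x[3:] += accel_world * self.dt
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, measurement, H, noise_std):
        # Generic linear update used for both GPS and vision measurements.
        R = (noise_std ** 2) * np.eye(len(measurement))
        y = measurement - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

H_POS = np.hstack([np.eye(3), np.zeros((3, 3))])   # position measurement model
H_VEL = np.hstack([np.zeros((3, 3)), np.eye(3)])   # velocity measurement model

# Usage: per cycle, predict with the IMU, then update with GPS (coarse) and,
# when a marker is identified, with the vision position/velocity (fine).
kf = FusionFilter(dt=0.01)
kf.predict(np.eye(3), np.array([0.0, 0.0, 9.81]))
kf.update(np.array([1.0, 2.0, 10.0]), H_POS, noise_std=2.0)   # GPS position
kf.update(np.array([0.9, 2.1, 10.1]), H_POS, noise_std=0.05)  # vision position
```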
The flight control module is used to generate a guidance command by deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector, and the target second motion velocity vector, and to send the guidance command to the instruction output module, where the target expected waypoint is the waypoint the UAV is currently heading to, and the guidance command includes a roll angle and a pitch angle.
The control objective of high-precision flight is to make the position of the UAV converge into a sufficiently small neighborhood of the target expected waypoint. In the vision-assisted high-precision flight phase, because the measurement accuracy of the vision equipment is better than that of the conventional navigation equipment, the input quantities of the controller need to be compensated at this stage:
Perr(t) = [Pd(t) - w1·P(t)] - w2·T(t)
Verr(t) = [Vd(t) - w3·V(t)] - w4·Vvision(t)
where Pd(t) and Vd(t) are the expected position and expected velocity input vectors of the UAV, Perr(t) and Verr(t) are the error input vectors of the position outer-loop controller and the velocity inner-loop controller, respectively, P(t) and V(t) are the position vector and motion velocity vector of the UAV computed by the conventional GPS integrated navigation system, T(t) and Vvision(t) are the deviation distance vector and motion velocity vector of the UAV relative to the target two-dimensional code marker computed by the vision processing module, and w1, w2, w3, w4 are compensation weight coefficients, typically taken such that w1 = w2 and w3 = w4. The compensation weights can take fixed values or be determined adaptively, with
w2 = 1 - w1
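A minimal sketch of how these compensated error inputs might be computed in code. The adaptive rule that sets w1 from the marker-detection state, its smoothing rate, and the symmetric choice w4 = 1 - w3 are illustrative assumptions; the patent fixes only w2 = 1 - w1 and leaves the adaptive determination open.

```python
import numpy as np

def compensated_errors(P_d, V_d, P_gps, V_gps, T_vision, V_vision, w1, w3):
    """Error inputs for the position outer loop and velocity inner loop:
    Perr = [Pd - w1*P] - w2*T,  Verr = [Vd - w3*V] - w4*Vvision,
    with w2 = 1 - w1 and (assumed) w4 = 1 - w3."""
    w2, w4 = 1.0 - w1, 1.0 - w3
    P_err = (P_d - w1 * P_gps) - w2 * T_vision
    V_err = (V_d - w3 * V_gps) - w4 * V_vision
    return P_err, V_err

def adapt_weight(w, marker_identified, rate=0.05):
    """Illustrative adaptive weight: slide the weight toward pure GPS weighting
    (1.0) when no marker is identified, and toward heavier vision compensation
    (here 0.5) when a marker is identified. Targets and rate are assumptions."""
    target = 0.5 if marker_identified else 1.0
    return w + rate * (target - w)
```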
During UAV flight, the limited camera field of view and real-world flight disturbances affect how accurately the marker information can be extracted. A conventional UAV control system adopts different control strategies for the phase before the ground marker is identified and for the phase after successful identification; as a result, the flight control module may switch frequently between the marker-identified and marker-unidentified states, causing control instability. The present invention uses the same cascade flight control module in both the marker-identified and marker-unidentified states, i.e., a single flight control module, and proposes a deviation-based adaptive compensation control method that compensates for the additional position and motion velocity information obtained in the marker-identified state. This achieves a smooth transition between the marker-identified and marker-unidentified states, improves the stability of flight control, and ensures that the UAV can achieve high-precision flight in various environments. A sketch of such a cascade controller with state switching is given after this paragraph.
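The following sketch ties the pieces together: a single position-outer-loop / velocity-inner-loop cascade that consumes the compensated errors in both states, reusing compensated_errors and adapt_weight from the previous sketch, with the vision terms zeroed (and the weights slid back toward GPS-only) when no marker is identified. The PD-style loop gains, the small-angle mapping from horizontal acceleration to roll and pitch, and the attitude limit are illustrative assumptions rather than values from the patent.

```python
import numpy as np

G = 9.81
KP_POS, KP_VEL = 0.8, 1.5                    # assumed loop gains
MAX_TILT = np.radians(20.0)                  # assumed attitude limit

def cascade_guidance(P_err, V_err, yaw):
    """Single cascade used in both states: the position outer loop adds a
    velocity correction, the velocity inner loop commands a horizontal
    acceleration, which is mapped to roll/pitch via small-angle relations."""
    vel_err_total = KP_POS * P_err + V_err            # position outer loop
    accel_cmd = KP_VEL * vel_err_total                # velocity inner loop
    ax, ay = accel_cmd[0], accel_cmd[1]
    # World-frame horizontal acceleration -> body roll/pitch (small angles).
    pitch = (-ax * np.cos(yaw) - ay * np.sin(yaw)) / G
    roll = (-ax * np.sin(yaw) + ay * np.cos(yaw)) / G
    return np.clip(roll, -MAX_TILT, MAX_TILT), np.clip(pitch, -MAX_TILT, MAX_TILT)

def guidance_step(state, P_d, V_d, P_gps, V_gps, vision, yaw):
    """One control cycle; `vision` is (T, V_vision) when a marker is identified,
    or None in the unidentified state. Uses the helpers defined above."""
    identified = vision is not None
    state["w1"] = adapt_weight(state["w1"], identified)
    state["w3"] = adapt_weight(state["w3"], identified)
    T, V_vis = vision if identified else (np.zeros(3), np.zeros(3))
    P_err, V_err = compensated_errors(P_d, V_d, P_gps, V_gps, T, V_vis,
                                      state["w1"], state["w3"])
    return cascade_guidance(P_err, V_err, yaw)

# Usage: state = {"w1": 1.0, "w3": 1.0}; roll, pitch = guidance_step(state, ...)
```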
The instruction output module is used to output the above guidance command.
Fig. 4 is a flow diagram of a UAV vision-assisted positioning and flight control method based on two-dimensional code landmark recognition disclosed by an embodiment of the present invention; the method shown in Fig. 4 includes the following steps:
S1: obtaining the position information of the UAV and the first motion velocity vector of the UAV;
S2: generating a route tracking trajectory according to preset mission waypoint information, and discretizing the route tracking trajectory to obtain N expected waypoints, where N is a positive integer;
S3: precisely recognizing the pre-arranged two-dimensional code marker from the image captured by the camera;
The implementation of step S3 is: obtaining the position information, attitude information, and encoded information of the pre-arranged two-dimensional code marker from the image captured by the camera, and from these obtaining the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker;
S4: fusing the information of the recognized two-dimensional code marker, the position information of the UAV, and the first motion velocity vector by a Kalman filtering algorithm;
The specific implementation of step S4 is: fusing the position information of the UAV, the first motion velocity vector, the deviation distance vector, and the second motion velocity vector by a Kalman filtering algorithm to obtain the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector, and target second motion velocity vector of the UAV.
S5: generating a guidance command from the Kalman-filtered information, where the guidance command includes a roll angle and a pitch angle;
The specific implementation of step S5 is: generating a guidance command by deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector, and the target second motion velocity vector, where the target expected waypoint is the waypoint the UAV is currently heading to, and the guidance command includes a roll angle and a pitch angle.
S6: outputting the above guidance command.
Fig. 5 is a flow diagram of high-precision autonomous UAV flight disclosed by an embodiment of the present invention. In Fig. 5 there are three mission waypoints, of which waypoints (n) and (n+1) are key waypoints with markers deployed on the ground: two-dimensional code 1 and two-dimensional code 2. When flying through waypoint (n-1), the UAV uses only the conventional GPS integrated navigation system. When flying through mission waypoints (n) and (n+1), the position, attitude, and encoded information of the ground markers can be extracted to assist the conventional GPS integrated navigation system.
Those skilled in the art will easily understand that the above description covers only preferred embodiments of the present invention and is not intended to limit the present invention. Any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (6)

1. A UAV vision-assisted positioning and flight control system based on two-dimensional code landmark recognition, characterized by comprising: a UAV body, a sensor module, a tracking-trajectory generation module, a vision processing module, a sensor update module, a flight control module, an instruction output module, a vision-assisted control switching module, and a camera:
the sensor module is used to obtain the position information of the UAV body and the first motion velocity vector of the UAV body;
the tracking-trajectory generation module is used to generate a route tracking trajectory according to preset mission waypoint information, and to discretize the route tracking trajectory to obtain N expected waypoints, where N is a positive integer;
the vision processing module is used to obtain the position information, attitude information, and encoded information of a two-dimensional code marker from the image of the marker captured by the camera, and from the position information, attitude information, and encoded information to obtain the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker;
the sensor update module is used to fuse the position information of the UAV body, the first motion velocity vector, the deviation distance vector, and the second motion velocity vector by a Kalman filtering algorithm, obtaining the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector, and target second motion velocity vector of the UAV body;
the flight control module is used to generate a guidance command by deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector, and the target second motion velocity vector, and to send the guidance command to the instruction output module, where the target expected waypoint is the waypoint the UAV is currently heading to, and the guidance command includes a roll angle and a pitch angle;
the vision-assisted control switching module is used to control the flight control module to compute the guidance command from the information obtained by both the sensor module and the vision processing module when the two-dimensional code marker is in the identified state, and to compute the guidance command from the information obtained by the sensor module only when the two-dimensional code marker is in the unidentified state;
the instruction output module is used to output the guidance command;
wherein, when the flight control module generates the guidance command by deviation-based adaptive compensation, the error input vector Perr(t) of the position outer-loop controller and the error input vector Verr(t) of the velocity inner-loop controller are computed respectively as:
Perr(t) = [Pd(t) - w1·P(t)] - w2·T(t)
Verr(t) = [Vd(t) - w3·V(t)] - w4·Vvision(t)
where Pd(t) and Vd(t) are the expected position and expected velocity input vectors of the UAV, P(t) and V(t) are the position vector and motion velocity vector of the UAV computed by the conventional GPS integrated navigation system, T(t) and Vvision(t) are, respectively, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker, computed by the vision processing module, and w1, w2, w3, and w4 are compensation coefficients.
2. The system according to claim 1, characterized in that the camera is located at the bottom of the UAV body, and the field of view of the camera points vertically downward.
3. The system according to claim 1, characterized in that the vision processing module includes an image graying module, an image binarization module, a binary image processing module, a two-dimensional code information extraction module, and a position-attitude acquisition module,
the image graying module is used to convert the image of the two-dimensional code marker into a single-channel grayscale image;
the image binarization module is used to set a fixed threshold based on the single-channel grayscale image and convert the grayscale image into a binary image;
the binary image processing module is used to perform contour detection on the binary image, traverse all polygons with four edges in the binary image, reject polygons whose area is smaller than a preset threshold, and then apply an orthogonal projection to the remaining four-edge polygons to obtain standard square images;
the two-dimensional code information extraction module is used to extract the binary-coded information and corner-point information from the square image according to preset encoding rules;
the position-attitude acquisition module is used to obtain, from the extracted binary-coded information and corner-point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker.
4. A UAV vision-assisted positioning and flight control method based on two-dimensional code landmark recognition, characterized by comprising:
S1: obtaining the position information of the UAV and the first motion velocity vector of the UAV;
S2: generating a route tracking trajectory according to preset mission waypoint information, and discretizing the route tracking trajectory to obtain N expected waypoints, where N is a positive integer;
S3: obtaining the position information, attitude information, and encoded information of a two-dimensional code marker from the image of the marker captured by a camera, and from the position information, attitude information, and encoded information obtaining the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker;
S4: fusing the position information of the UAV, the first motion velocity vector, the deviation distance vector, and the second motion velocity vector by a Kalman filtering algorithm to obtain the Kalman-filtered target position information, target first motion velocity vector, target deviation distance vector, and target second motion velocity vector of the UAV;
S5: generating a guidance command by deviation-based adaptive compensation from the expected position and expected velocity vector of the target expected waypoint, the target position information, the target first motion velocity vector, the target deviation distance vector, and the target second motion velocity vector, where the target expected waypoint is the waypoint the UAV is currently heading to, and the guidance command includes a roll angle and a pitch angle;
wherein, when the guidance command is generated by deviation-based adaptive compensation, the error input vector Perr(t) of the position outer-loop controller and the error input vector Verr(t) of the velocity inner-loop controller are computed respectively as:
Perr(t) = [Pd(t) - w1·P(t)] - w2·T(t)
Verr(t) = [Vd(t) - w3·V(t)] - w4·Vvision(t)
where Pd(t) and Vd(t) are the expected position and expected velocity input vectors of the UAV, P(t) and V(t) are the position vector and motion velocity vector of the UAV computed by the conventional GPS integrated navigation system, T(t) and Vvision(t) are, respectively, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker, computed by the vision processing module, and w1, w2, w3, and w4 are compensation coefficients;
S6: outputting the guidance command.
5. The method according to claim 4, characterized in that the camera is located at the bottom of the UAV, and the field of view of the camera points vertically downward.
6. The method according to claim 4, characterized in that step S3 specifically includes the following sub-steps:
S301: converting the image of the two-dimensional code marker into a single-channel grayscale image;
S302: setting a fixed threshold based on the single-channel grayscale image and converting the grayscale image into a binary image;
S303: performing contour detection on the binary image, traversing all polygons with four edges in the binary image, rejecting polygons whose area is smaller than a preset threshold, and then applying an orthogonal projection to the remaining four-edge polygons to obtain standard square images;
S304: extracting the binary-coded information and corner-point information from the square image according to preset encoding rules;
S305: obtaining, from the extracted binary-coded information and corner-point information, the deviation distance vector of the camera relative to the two-dimensional code marker and the second motion velocity vector of the camera relative to the two-dimensional code marker.
CN201611092540.2A 2016-12-01 2016-12-01 UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition Active CN106647814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611092540.2A CN106647814B (en) 2016-12-01 2016-12-01 UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611092540.2A CN106647814B (en) 2016-12-01 2016-12-01 UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition

Publications (2)

Publication Number Publication Date
CN106647814A CN106647814A (en) 2017-05-10
CN106647814B true CN106647814B (en) 2019-08-13

Family

ID=58814148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611092540.2A Active CN106647814B (en) 2016-12-01 2016-12-01 UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition

Country Status (1)

Country Link
CN (1) CN106647814B (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107255827A (en) * 2017-07-06 2017-10-17 杨顺伟 Scenic-spot navigation method and device for an unmanned aerial vehicle
CN107194399B (en) * 2017-07-14 2023-05-09 广东工业大学 Visual calibration method, system and unmanned aerial vehicle
CN107703973B (en) * 2017-09-11 2021-08-31 广州视源电子科技股份有限公司 Trajectory tracking method and device
CN109547971A (en) 2017-09-21 2019-03-29 索尼公司 Device and method, computer readable storage medium in wireless communication system
CN108305291B (en) * 2018-01-08 2022-02-01 武汉大学 Monocular vision positioning and attitude determination method utilizing wall advertisement containing positioning two-dimensional code
CN110471403B (en) 2018-05-09 2023-11-03 北京外号信息技术有限公司 Method for guiding an autonomously movable machine by means of an optical communication device
CN108803668B (en) * 2018-06-22 2021-08-24 中国南方电网有限责任公司超高压输电公司广州局 Intelligent inspection unmanned aerial vehicle nacelle system for static target monitoring
CN110325940A (en) * 2018-06-29 2019-10-11 深圳市大疆创新科技有限公司 Flight control method, device, system and storage medium
CN109521781A (en) * 2018-10-30 2019-03-26 普宙飞行器科技(深圳)有限公司 Unmanned aerial vehicle positioning system, unmanned aerial vehicle and unmanned aerial vehicle positioning method
CN111121744A (en) * 2018-10-30 2020-05-08 千寻位置网络有限公司 Positioning method and device based on sensing unit, positioning system and mobile terminal
CN112147995B (en) * 2019-06-28 2024-02-27 深圳市创客工场科技有限公司 Robot motion control method and device, robot and storage medium
CN110446159B (en) * 2019-08-12 2020-11-27 上海工程技术大学 System and method for accurate positioning and autonomous navigation of indoor unmanned aerial vehicle
CN110375747A (en) * 2019-08-26 2019-10-25 华东师范大学 Inertial navigation system for an indoor unmanned aerial vehicle
CN110543989A (en) * 2019-08-29 2019-12-06 中国南方电网有限责任公司 Power transmission line machine patrol operation safety early warning method and device and computer equipment
CN110673619B (en) * 2019-10-21 2022-06-17 深圳市道通智能航空技术股份有限公司 Flight attitude control method and device, unmanned aerial vehicle and storage medium
CN111323789B (en) * 2020-03-19 2023-11-03 陕西思地三维科技有限公司 Ground morphology scanning device and method based on unmanned aerial vehicle and solid-state radar
CN111580551A (en) * 2020-05-06 2020-08-25 杭州电子科技大学 Navigation system and method based on visual positioning
CN111930133A (en) * 2020-07-20 2020-11-13 贵州电网有限责任公司 Transformer substation secondary screen cabinet inspection method based on rotor unmanned aerial vehicle
CN112040175A (en) * 2020-07-31 2020-12-04 深圳供电局有限公司 Unmanned aerial vehicle inspection method and device, computer equipment and readable storage medium
CN112381464A (en) * 2020-12-07 2021-02-19 北京小米松果电子有限公司 Shared vehicle scheduling method and device and storage medium
CN112859923B (en) * 2021-01-25 2022-02-18 西北工业大学 Unmanned aerial vehicle vision formation flight control system
CN113238580B (en) * 2021-06-03 2022-12-13 一飞智控(天津)科技有限公司 Method and system for switching static placement deviation and dynamic flight deviation of unmanned aerial vehicle
CN113657256B (en) * 2021-08-16 2023-09-26 大连海事大学 Unmanned aerial vehicle sea-air cooperative vision tracking and autonomous recovery method
CN113776523B (en) * 2021-08-24 2024-03-19 武汉第二船舶设计研究所 Robot low-cost navigation positioning method, system and application
CN114326766B (en) * 2021-12-03 2024-08-20 深圳先进技术研究院 Cooperative autonomous tracking and landing method for vehicle and machine
CN114237262B (en) * 2021-12-24 2024-01-19 陕西欧卡电子智能科技有限公司 Automatic berthing method and system for unmanned ship on water surface
CN114104310A (en) * 2021-12-31 2022-03-01 重庆高新区飞马创新研究院 Device and method for assisting unmanned aerial vehicle in landing based on GPS and AprilTag
CN114489102A (en) * 2022-01-19 2022-05-13 上海复亚智能科技有限公司 Self-inspection method and device for electric power tower, unmanned aerial vehicle and storage medium
CN115586798B (en) * 2022-12-12 2023-03-24 广东电网有限责任公司湛江供电局 Unmanned aerial vehicle anti-crash method and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184776A (en) * 2015-08-17 2015-12-23 中国测绘科学研究院 Target tracking method
CN105388905A (en) * 2015-10-30 2016-03-09 深圳一电航空技术有限公司 Unmanned aerial vehicle flight control method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9625913B2 (en) * 2014-12-09 2017-04-18 Embry-Riddle Aeronautical University, Inc. System and method for robust nonlinear regulation control of unmanned aerial vehicles synthetic jet actuators

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105184776A (en) * 2015-08-17 2015-12-23 中国测绘科学研究院 Target tracking method
CN105388905A (en) * 2015-10-30 2016-03-09 深圳一电航空技术有限公司 Unmanned aerial vehicle flight control method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于耦合补偿和自抗扰的飞行器姿态控制 (Aircraft attitude control based on coupling compensation and active disturbance rejection); 姚瑶, 刘磊, 王永骥; Proceedings of the 34th Chinese Control Conference; 2015-07-30; pp. 5707-5712

Also Published As

Publication number Publication date
CN106647814A (en) 2017-05-10

Similar Documents

Publication Publication Date Title
CN106647814B (en) UAV vision-assisted positioning and flight control system and method based on two-dimensional code landmark recognition
CN109270953B (en) Multi-rotor unmanned aerial vehicle autonomous landing method based on concentric circle visual identification
AU2018331310B2 (en) A backup navigation system for unmanned aerial vehicles
Oleynikova et al. Reactive avoidance using embedded stereo vision for MAV flight
CN104854428B (en) sensor fusion
CN109901580A (en) Path-following and obstacle-avoidance system and method in which a UAV cooperates with an unmanned ground robot
CN110058602A (en) Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision
CN110077595A (en) Automatic landing and recovery system for unmanned autonomous aircraft under complex dynamic bump conditions
Li et al. UAV autonomous landing technology based on AprilTags vision positioning algorithm
CN107463181A (en) Quadrotor adaptive tracking system based on AprilTag
CN103809598A (en) Autonomous take-off and landing system for rotor UAVs based on a three-layer triangular multi-color landing pad
CN106774386A (en) UAV visual navigation and landing system based on multi-scale markers
CN102190081B (en) Vision-based fixed point robust control method for airship
Bao et al. Vision-based horizon extraction for micro air vehicle flight control
Cho et al. Autonomous ship deck landing of a quadrotor UAV using feed-forward image-based visual servoing
CN104897159B (en) Whole-course navigation method for aircraft based on sequence image matching
Bi et al. A lightweight autonomous MAV for indoor search and rescue
Schofield et al. Autonomous power line detection and tracking system using UAVs
Nguyen et al. Post-mission autonomous return and precision landing of uav
Wang et al. Monocular vision and IMU based navigation for a small unmanned helicopter
CN106780337A (en) Visual simulation method for UAV shipboard landing based on two-dimensional images
CN108445900A (en) UAV visual positioning technique as a replacement for differential positioning
CN109240319A (en) Method and device for controlling an unmanned aerial vehicle to follow
Qiu et al. Design and implementation of an autonomous landing control system of unmanned aerial vehicle for power line inspection
Gomez-Balderas et al. Vision-based autonomous hovering for a miniature quad-rotor

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant