CN103809598B - Rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad - Google Patents
- Publication number
- CN103809598B (application CN201410089860.7A)
- Authority
- CN
- China
- Prior art keywords
- rotor UAV
- unmanned aerial vehicle
- landing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
A rotor UAV autonomous landing system based on a three-layer nested isosceles-triangle multicolor landing pad, comprising a rotor UAV, onboard sensors, a data processing unit, a flight control system, an onboard camera, a ground landing pad, a wireless image transmission module, a wireless data transmission module, and a ground monitoring station. The onboard sensors include an inertial measurement unit (IMU), GPS, a barometer, and an ultrasonic rangefinder; the data processing unit performs filtering and fusion of the sensor data; the flight control system carries out path planning and achieves high-accuracy control of the rotor UAV; the onboard camera captures images of the landing pad; the ground landing pad is formed by nested red, blue, and green isosceles triangles; the wireless image transmission module transmits images to the ground; the wireless data transmission module handles data and command communication between the rotor UAV and the ground; the ground monitoring station consists of a vision processing unit and a display terminal. The invention guarantees the reliability of the rotor UAV navigation information, improves the control accuracy of autonomous landing, and is low-cost and easy to apply, giving it significant engineering value.
Description
Technical field
The present invention relates to a rotor UAV autonomous landing system based on visual navigation. It can be used for autonomous navigation and control of rotor UAVs, and is particularly suitable for military and civilian applications that require precise localization of the landing site and high navigation accuracy.
Background art
Rotor UAVs have the advantages of small size, low cost, and high maneuverability; they can take off and land vertically, hover, and fly at very low altitude, and therefore have very broad application prospects in military, civilian, and scientific-research fields.
Research on rotor UAV takeoff and en-route navigation is already well developed and has produced good results, but high-accuracy navigation and control during the landing phase is still a research focus, and domestic work remains at an early stage. Accurate altitude information is the basis for a safe and stable autonomous landing and has a major impact on rotor UAV performance. The vertical channel of the onboard inertial sensors is unstable, and the vehicle is constrained in size, weight, and cost, so accurate altitude information is usually obtained from small, low-power devices such as a barometric altimeter, a global positioning system (GPS) receiver that outputs altitude, or an ultrasonic altimeter. The barometric altimeter computes altitude from the relationship between atmospheric pressure and height; it is simple in structure, but during landing the rotor downwash disturbs the airflow, so its measurement accuracy is hard to guarantee. GPS offers good positioning accuracy and errors that do not accumulate over time, but its update rate is low and the signal is easily disturbed by the environment; in urban areas with many buildings in particular, occlusion occurs easily and degrades the measurement accuracy. The ultrasonic altimeter can provide accurate measurements only within an altitude range of 4.2 meters.
Relatively mature monocular vision systems have been developed; they require only one camera and one vision processing unit, and guide the aircraft to the landing point by detecting a landing mark. Existing landing marks are mostly combinations of regular polygons, H-, L-, or T-shapes, or circles. Their common feature is a high-contrast black-and-white design, so that the landmark features can be extracted by binarizing the image. Lighting, environment, and camera quality, however, all shift the gray values of the pixels corresponding to the black and white parts of the mark toward each other to varying degrees, so the contrast between them shrinks. How to apply a suitable threshold segmentation to the landing-pad image in order to extract the landmark features therefore becomes critical, and the algorithms become complicated. In addition, the usable range of the vision system is limited by the size of the visual beacon: if the rotor UAV is too far from the landing pad, the mark becomes blurred and the measurement error grows, while if it is too close, part of the pad pattern leaves the camera's field of view and information is lost. The working height range of the vision system is therefore restricted.
Summary of the invention
The technical problem solved by the present invention is to overcome the shortcomings of existing visual navigation techniques and, by means of a three-layer isosceles-triangle multicolor landing pad, to provide a monocular-vision-aided height measurement and autonomous landing system for rotor UAVs that offers high accuracy, a wide measurement range, strong robustness, economy, reliability, and ease of implementation.
The technical solution of the present invention is a rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad, comprising a rotor UAV (1), onboard sensors (2), a data processing unit (3), a flight control system (4), an onboard camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9). The onboard sensors (2), the data processing unit (3), and the flight control system (4) are mounted on the rotor UAV (1); the ground monitoring station (9) consists of a vision processing unit (10) and a display terminal (11), wherein:
The rotor UAV (1) is the carrier of the onboard sensors (2), the data processing unit (3), and the flight control system (4), and is the main subject of the autonomous landing system;
The onboard sensors (2) comprise an inertial measurement unit (IMU), a global positioning system (GPS) receiver, a magnetic compass, a barometric altimeter, and an ultrasonic rangefinder;
The hardware core of the data processing unit (3) is a DSP processor. It first performs outlier rejection and coordinate/unit unification on the measurements of each sensor, and then fuses the multiple measurements with a Kalman filter to provide navigation information (a minimal illustrative sketch of such a fusion step is given after this list of components);
The flight control system (4) performs path planning according to the navigation information and the mission requirements of the rotor UAV, and carries out landing control with an adaptive-neural-network-based method;
The onboard camera (5) is fixed to the rotor UAV (1) with its lens pointing vertically downward. During the descent it captures images of the ground landing pad (6) in real time and transmits them to the ground monitoring station (9) through the wireless image transmission module (7);
The ground landing pad (6) serves as the visual beacon of the rotor UAV and is formed by nested red, blue, and green isosceles triangles;
The wireless image transmission module (7) consists of an image transmitter and an image receiver, both powered by a 12 V DC supply, with a carrier frequency of 1.2 GHz. The transmitter is fixed on the rotor UAV (1) and the receiver is connected to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a data transmitter and a data receiver, both powered by a 5 V DC supply, with a carrier frequency of 900 MHz and a baud rate of 115200 bps;
The ground monitoring station (9) mainly performs monitoring and control tasks and consists of a vision processing unit and a display terminal. To reduce the onboard load, the autonomous landing system sends the landing-pad images captured by the onboard camera (5) to the ground monitoring station (9) through the wireless image transmission module (7); the vision processing unit of the ground monitoring station performs the image computation to obtain the visual navigation information, which is then sent back to the rotor UAV autonomous landing system through the wireless data transmission module (8). At the same time, the ground display terminal shows the position, attitude, and velocity of the rotor UAV in real time, so that ground operators can send control commands to the rotor UAV according to the mission requirements.
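The patent attributes the multi-sensor fusion in the data processing unit to a Kalman filter but gives no filter model. The following is a minimal sketch only, assuming a one-dimensional random-walk altitude state, a single scalar measurement, and arbitrary noise values; none of these details come from the patent.

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter for an altitude-like state. The patent only
    says that Kalman filtering fuses the preprocessed sensor measurements;
    the random-walk model and the noise values here are assumptions."""
    def __init__(self, q=0.01, r=0.5, x0=0.0, p0=1.0):
        self.q, self.r = q, r      # process / measurement noise variances
        self.x, self.p = x0, p0    # state estimate and its variance

    def update(self, z):
        self.p += self.q                    # predict (random-walk model)
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

kf = ScalarKalman(x0=10.0)
for z in (10.2, 9.8, 10.1, 9.9):            # noisy altitude readings in meters
    print(round(kf.update(z), 3))
```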
The principle of the present invention is as follows. The landing pad is placed at the intended landing point of the rotor UAV, its size and feature points are calibrated, and the data are entered into the vision processing unit to initialize the vision system. During the descent the landing-pad image is captured in real time; color-based threshold segmentation separates the triangular landmark, and a moment-invariant check against the pre-calibrated isosceles triangle provides a second confirmation. After this two-stage filtering on color and shape has located the landmark in the image, its feature information is extracted and, combined with the corresponding features in the world coordinate system, yields the current height of the rotor UAV relative to the landing pad. The vision-derived height is then fused with the simultaneous measurements of the other height sensors by adaptive weighted-average filtering to obtain high-accuracy navigation information, and this reliable height is fed to the autonomous landing system to guide a safe and stable landing.
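The text above states that the height relative to the landing pad follows from the extracted landmark features combined with the calibrated world-frame features, but does not spell out the relation. One common way to do this with a calibrated camera is the pinhole relation h = f·L/l (focal length times true side length over imaged side length); the sketch below uses that relation as an illustrative assumption, with made-up numbers.

```python
def height_from_landmark(side_px: float, side_m: float, focal_px: float) -> float:
    """Estimate camera height above the landing pad from the apparent size of
    one triangle side, using the pinhole relation h = f * L / l. The patent
    states only that height follows from the landmark features and the
    calibrated pad size; this specific relation is an illustrative assumption."""
    return focal_px * side_m / side_px

# Example: a 1.0 m triangle side imaged as 150 px with an 800 px focal length
# puts the camera at roughly 5.3 m above the pad.
print(round(height_from_landmark(150.0, 1.0, 800.0), 2))
```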
The advantages of the present invention compared with the prior art are:
(1) The present invention fuses the measurements of the height sensors with an adaptive weighted-average filter, adjusting the weight of each sensor at any time according to its reliability so as to guarantee an optimal fusion result. This not only ensures the overall measurement accuracy of the autonomous landing system, but also keeps the fused height reliable when GPS loses satellites or the landing pad is occluded and the visual information fails, because the corresponding weights are updated autonomously.
(2) The present invention uses strikingly colored, solid-color features to reduce environmental interference and simplify the image processing algorithm. To avoid interference from red, blue, or green objects that may exist in the natural environment, the visual beacon is given the distinctive shape of an isosceles triangle, and red, blue, and green isosceles triangles of decreasing size are nested together, so that an appropriately sized triangle landmark appears in the images captured in every height band, guaranteeing visual navigation accuracy throughout the descent. The landing pad is easy to manufacture, economical, small, and light, and is convenient to carry.
Brief description of the drawings
Fig. 1 is the structural block diagram of the present invention;
Fig. 2 is the vision algorithm flowchart of the present invention;
Fig. 3 is the adaptive weighted-average filtering diagram of the present invention;
Fig. 4 is the information transmission schematic of the present invention.
Detailed description of the invention
As shown in Fig. 1, the invention mainly comprises a rotor UAV (1), onboard sensors (2), a data processing unit (3), a flight control system (4), an onboard camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9). The onboard sensors (2), the data processing unit (3), and the flight control system (4) are mounted on the rotor UAV (1); the ground monitoring station (9) consists of a vision processing unit (10) and a display terminal (11).
The rotor UAV (1) is the carrier of the onboard sensors (2), the data processing unit (3), and the flight control system (4), and is the main subject of the autonomous landing system;
The onboard sensors (2) comprise an inertial measurement unit (IMU), a global positioning system (GPS) receiver, a magnetic compass, a barometric altimeter, and an ultrasonic rangefinder;
The hardware core of the data processing unit (3) is a DSP processor. It first performs outlier rejection and coordinate/unit unification on the measurements of each sensor, and then fuses the multiple measurements with a Kalman filter to provide navigation information. For the height channel, the autonomous landing system is provided with a height-information fusion module based on adaptive weighted averaging, designed according to the characteristics of the height sensors, to obtain the height information;
The flight control system (4) performs path planning according to the navigation information and the mission requirements of the rotor UAV, and carries out landing control with an adaptive-neural-network-based method;
The onboard camera (5) is fixed to the rotor UAV (1) with its lens pointing vertically downward. During the descent it captures images of the ground landing pad (6) in real time and transmits them to the ground monitoring station (9) through the wireless image transmission module (7);
The ground landing pad (6) serves as the visual beacon of the rotor UAV and is formed by nested red, blue, and green isosceles triangles;
The wireless image transmission module (7) consists of an image transmitter and an image receiver, both powered by a 12 V DC supply, with a carrier frequency of 1.2 GHz. The transmitter is fixed on the rotor UAV (1) and the receiver is connected to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a data transmitter and a data receiver, both powered by a 5 V DC supply, with a carrier frequency of 900 MHz and a baud rate of 115200 bps;
The ground monitoring station (9) mainly performs monitoring and control tasks and consists of a vision processing unit and a display terminal. To reduce the onboard load, the autonomous landing system sends the landing-pad images captured by the onboard camera (5) to the ground monitoring station (9) through the wireless image transmission module (7); the vision processing unit of the ground monitoring station performs the image computation to obtain the visual navigation information, which is then sent back to the rotor UAV autonomous landing system through the wireless data transmission module (8). At the same time, the ground display terminal shows the position, attitude, and velocity of the rotor UAV in real time, so that ground operators can send control commands to the rotor UAV according to the mission requirements.
Fig. 2 shows the vision algorithm flowchart of the present invention. Before the rotor UAV begins the landing procedure, the ground monitoring station initializes the vision processing system with the dimensions of the three-color isosceles-triangle landing pad and the camera intrinsics. In the figure, Hp denotes the height of the rotor UAV at the previous instant: when the autonomous landing system starts up it is a rough estimate of the UAV's height above the landing point, and during the descent it is the height obtained after multi-sensor fusion. After the vision system captures a live landmark image, the value of Hp determines which color of landmark pattern to extract, and hence which color-specific image binarization method to select. When the rotor UAV is landing from an altitude above 20 meters, the vision system uses the largest, red isosceles triangle as the visual beacon; when Hp drops below HRB it switches to the blue landmark; and near the ground, when Hp drops below HBG, it switches to the smallest, green landmark. HRB and HBG are determined by the sizes of the three triangle patterns on the landing pad and by the focal length of the camera; in the present invention their values are 700 cm and 180 cm respectively. The moment-invariant algorithm is then applied to the binarized landmark image to judge whether the result lies within the calibrated range of invariant moments. Once the landmark has been located in the image, its features are extracted and the height-calculation method of the corresponding color is selected to obtain the vision-based height; this visual measurement, corrected by the attitude angles, is fused with the onboard height sensors by adaptive filtering to obtain the current fused height value H.
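The switching thresholds (above 20 m use red, below HRB = 700 cm use blue, below HBG = 180 cm use green) come from the paragraph above. The sketch below shows that selection logic together with an OpenCV-style color binarization; the use of HSV segmentation and the specific HSV ranges are assumptions added for illustration, not values from the patent.

```python
import cv2
import numpy as np

# Height thresholds stated in the patent text.
HRB_CM, HBG_CM = 700.0, 180.0

# Hypothetical HSV ranges for the three landmark colors (tune for real footage).
COLOR_RANGES = {
    "red":   (np.array([0, 120, 80]),   np.array([10, 255, 255])),
    "blue":  (np.array([100, 120, 80]), np.array([130, 255, 255])),
    "green": (np.array([45, 120, 80]),  np.array([75, 255, 255])),
}

def select_color(hp_cm: float) -> str:
    """Pick the landmark layer to track from the previous fused height Hp."""
    if hp_cm >= HRB_CM:
        return "red"        # outer, largest triangle (high altitude)
    if hp_cm >= HBG_CM:
        return "blue"       # middle triangle
    return "green"          # inner, smallest triangle (near the ground)

def binarize(frame_bgr, hp_cm: float):
    """Binarize the frame for the color chosen by the current height band."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lo, hi = COLOR_RANGES[select_color(hp_cm)]
    return cv2.inRange(hsv, lo, hi)
```

The moment-invariant confirmation described above could then, for example, be computed on the resulting mask with cv2.HuMoments(cv2.moments(mask)) and compared against the values calibrated for the triangle.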
Fig. 3 shows the adaptive weighted-average filtering module for the height measurements of the present invention. First the height measured by each sensor is preprocessed, with unit unification and smoothing to remove occasional errors. The boundary between the high- and low-altitude stages is set at 4.2 meters: in the high-altitude stage the vision, barometric-altimeter, and GPS information is fused by adaptive weighted-average filtering, while in the low-altitude stage the ultrasonic altimeter is added to these three sensors, so the onboard camera, barometric altimeter, ultrasonic altimeter, and GPS are fused together. In Fig. 3, j denotes the time index; H1(j), H2(j), H3(j), and H4(j) are the measurements of the vision system, barometer, ultrasonic altimeter, and GPS at time j, and w1(j), w2(j), w3(j), and w4(j) are the corresponding weights of the four sensors. The fused height at time j is then

H(j) = w1(j)H1(j) + w2(j)H2(j) + w3(j)H3(j) + w4(j)H4(j).
In the low-altitude stage, the weights are initialized according to the measurement characteristics of the four sensors as follows:
During the descent, if the moment-invariant check does not detect the landing pad in the visual image, the visual information is invalid and w1(j) is set to 0; otherwise w1(j) remains 25%. If the number of satellites received by the GPS is less than 4, the GPS information contains a large error and w4(j) is set to 0; otherwise w4(j) remains 5%. At every time step the four weights are updated according to the following rule:
In the high-altitude stage the ultrasonic measurement is unavailable, and the weights of the other three sensors are initialized according to their measurement characteristics as follows:
During the descent, if the moment-invariant check does not detect the landing pad in the visual image, the visual information is invalid and w1(j) is set to 0; otherwise w1(j) remains 60%. If the number of satellites received by the GPS is less than 4, the GPS information contains a large error and w4(j) is set to 0; otherwise w4(j) remains 15%. At every time step the weights are updated according to the following rule:
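The weight-update rule itself appears only as a figure in the original and is not reproduced in the text. The sketch below implements one plausible reading of it: zero the vision weight when no landmark is detected and the GPS weight when fewer than 4 satellites are tracked (both stated above), then renormalize the surviving weights before taking the weighted average. The renormalization step and the barometer/ultrasonic shares in the example are assumptions.

```python
def fuse_height(measurements, base_weights, vision_ok, gps_sats):
    """Adaptive weighted-average height fusion.
    `measurements` and `base_weights` are ordered [vision, barometer, ultrasonic, GPS].
    Zeroing the vision weight when no landmark is found and the GPS weight when
    fewer than 4 satellites are tracked follows the patent text; renormalizing
    the surviving weights is an assumption about the (figure-only) update rule."""
    w = list(base_weights)
    if not vision_ok:
        w[0] = 0.0
    if gps_sats < 4:
        w[3] = 0.0
    total = sum(w)
    if total == 0.0:
        raise ValueError("no valid height source")
    w = [wi / total for wi in w]                      # renormalize weights
    return sum(wi * hi for wi, hi in zip(w, measurements))

# Low-altitude example: vision 25 % and GPS 5 % are the patent's values; the
# barometer and ultrasonic shares below are placeholders, since the figure with
# the full initialization is not reproduced in the text.
H = fuse_height([3.1, 3.4, 3.0, 2.6], [0.25, 0.35, 0.35, 0.05],
                vision_ok=True, gps_sats=6)
print(round(H, 3))
```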
Fig. 4 shows the information transmission during the descent of the rotor UAV. The landing-pad image is captured in real time during the descent and sent by the 1.2 GHz wireless image transmitter to the ground wireless image receiver, which passes it to the ground-computer vision processing unit through a video capture card. The vision processing unit outputs the height obtained from the visual computation, which is sent through the 900 MHz modem to the rotor UAV autonomous landing system, where it is fused by filtering with the simultaneous height information from the other sensors; the fused height is fed back to the flight control system to guide a safe autonomous landing of the rotor UAV. At the same time, the ground monitoring station can send flight-mode control commands to the rotor UAV through the wireless data transmission system.
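Putting the ground-side steps of Fig. 4 together — read a frame from the video capture card, run the vision height computation, and send the result over the 900 MHz serial modem at 115200 bps — a minimal sketch could look as follows. The capture-device index, serial-port path, message framing, and the placeholder vision_height routine are all assumptions, since the patent does not specify them.

```python
import cv2
import serial  # pyserial

# Ground-station side of the pipeline in Fig. 4: read frames from the video
# capture card and send the vision-derived height over the 900 MHz modem.
# The capture-card index, serial port, and the placeholder height routine
# are assumptions for illustration; the patent does not specify them.
cap = cv2.VideoCapture(0)                       # video capture card
link = serial.Serial("/dev/ttyUSB0", 115200)    # 900 MHz data modem

def vision_height(frame) -> float:
    """Placeholder for the landmark-based height computation described above."""
    return 0.0

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    h = vision_height(frame)
    link.write(f"H,{h:.2f}\n".encode("ascii"))  # hypothetical ASCII framing
```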
Claims (3)
1. A rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad, characterized by comprising: a rotor UAV (1), onboard sensors (2), a data processing unit (3), a flight control system (4), an onboard camera (5), a ground landing pad (6), a wireless image transmission module (7), a wireless data transmission module (8), and a ground monitoring station (9); the onboard sensors (2), the data processing unit (3), and the flight control system (4) are mounted on the rotor UAV (1), and the ground monitoring station (9) consists of a vision processing unit (10) and a display terminal (11), wherein:
The rotor UAV (1) is the carrier of the onboard sensors (2), the data processing unit (3), and the flight control system (4), and is the object of the autonomous landing system;
The onboard sensors (2) comprise an inertial measurement unit (IMU), a global positioning system (GPS) receiver, a magnetic compass, a barometric altimeter, and an ultrasonic rangefinder;
The hardware core of the data processing unit (3) is a DSP processor, which first performs outlier rejection and coordinate and unit unification on the measurements of each sensor, and then fuses the multiple measurements with a Kalman filter to provide navigation information;
The flight control system (4) performs path planning according to the navigation information and the mission requirements of the rotor UAV, and carries out landing control with an adaptive-neural-network-based method;
The onboard camera (5) is fixed to the rotor UAV (1) with its lens pointing vertically downward; during the descent it captures images of the ground landing pad (6) in real time and transmits them to the ground monitoring station (9) through the wireless image transmission module (7);
The ground landing pad (6) serves as the visual beacon of the rotor UAV and is formed by nested red, blue, and green isosceles triangles;
The wireless image transmission module (7) consists of an image transmitter and an image receiver, both powered by a 12 V DC supply, with a carrier frequency of 1.2 GHz; the transmitter is fixed on the rotor UAV (1) and the receiver is connected to the ground monitoring station through a video capture card;
The wireless data transmission module (8) likewise consists of a data transmitter and a data receiver, both powered by a 5 V DC supply, with a carrier frequency of 900 MHz and a baud rate of 115200 bps;
The ground monitoring station (9) mainly performs monitoring and control tasks and consists of a vision processing unit and a display terminal; to reduce the onboard load, the autonomous landing system sends the landing-pad images captured by the onboard camera (5) to the ground monitoring station (9) through the wireless image transmission module (7), and the vision processing unit of the ground monitoring station performs the image computation to obtain the visual navigation information, which is then sent back to the rotor UAV autonomous landing system through the wireless data transmission module (8); at the same time, the ground display terminal shows the position, attitude, and velocity of the rotor UAV in real time, so that ground operators can send control commands to the rotor UAV according to the mission requirements.
2. The rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad according to claim 1, characterized in that: on the basis of the original Kalman-filter-based fusion of multiple kinds of information, the data processing unit (3) is provided with a height-information fusion module based on adaptive weighted averaging for the onboard camera, barometric altimeter, ultrasonic rangefinder, and GPS used by the autonomous landing system, so as to achieve autonomous landing.
3. The rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad according to claim 1, characterized in that: the ground landing pad (6) serves as the visual beacon; because rotor UAVs mostly operate in places rich in gray-level information, and in order to prevent the surroundings from introducing noise, the visual beacon of the autonomous landing system is designed as a solid color with distinctive features; to distinguish it from red, blue, or green objects that may be present, the visual beacon is further designed as a regularly shaped isosceles triangle, so that it can conveniently be recognized with a moment-invariant algorithm; and, to meet the demand for high-accuracy visual navigation at both long and short range, a three-layer nested isosceles-triangle pattern is finally chosen as the landing mark: when the rotor UAV is landing from an altitude above 20 meters, the visual computation is performed by extracting the feature information of the outer large red isosceles triangle; when the height drops below 700 cm, the visual computation is performed by extracting the feature information of the middle blue isosceles triangle; and near the ground, when the height is below 180 cm, the feature information of the inner small green isosceles triangle is extracted for the visual computation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410089860.7A CN103809598B (en) | 2014-03-12 | 2014-03-12 | Rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103809598A CN103809598A (en) | 2014-05-21 |
CN103809598B (en) | 2016-08-10 |
Family
ID=50706530
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410089860.7A Active CN103809598B (en) | Rotor UAV autonomous landing system based on a three-layer isosceles-triangle multicolor landing pad | 2014-03-12 | 2014-03-12 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103809598B (en) |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100842101B1 (en) * | 2007-06-15 | 2008-06-30 | 주식회사 대한항공 | Automatic recovery method of uav using vision information |
WO2010045271A1 (en) * | 2008-10-14 | 2010-04-22 | Joshua Victor Aller | Target and method of detecting, identifying, and determining 3-d pose of the target |
CN101833104A (en) * | 2010-04-27 | 2010-09-15 | 北京航空航天大学 | Three-dimensional visual navigation method based on multi-sensor information fusion |
CN101944295A (en) * | 2010-09-08 | 2011-01-12 | 北京航空航天大学 | Method for arranging traffic pattern of unmanned aerial vehicle |
CN103208206A (en) * | 2013-03-21 | 2013-07-17 | 北京航空航天大学 | Method for arranging traffic patterns of unmanned aerial vehicles on terrain constraint condition |
Non-Patent Citations (1)
Title |
---|
Lei Xusheng et al., "A height information fusion method for a small unmanned rotorcraft" (一种小型无人旋翼机高度信息融合方法), Robot (《机器人》), vol. 34, no. 4, 2012-12-30, pp. 432-439 * |
Also Published As
Publication number | Publication date |
---|---|
CN103809598A (en) | 2014-05-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |