CN110488848A - Unmanned aerial vehicle vision-guided autonomous landing method and system - Google Patents
Unmanned aerial vehicle vision-guided autonomous landing method and system
- Publication number
- CN110488848A CN201910783859.7A
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- pattern
- scene image
- landing point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses an unmanned aerial vehicle vision-guided autonomous landing method, comprising: step 1: the unmanned aerial vehicle flies into the effective area above the landing point, and a monocular camera captures a target scene image, where the target scene image contains an identification pattern with an internal pattern at its center; step 2: the target scene image is processed frame by frame, the identification pattern is recognized, features are extracted, and the position of the unmanned aerial vehicle relative to the landing point is solved, where, when the calculated height of the unmanned aerial vehicle relative to the landing point is greater than a threshold height, the characteristic size of the whole identification pattern is used as the reference for measuring the relative position, and when the calculated height is below the threshold height, the characteristic size of the internal pattern within the identification pattern is used as the reference; step 3: according to the relative position, the unmanned aerial vehicle is controlled to center itself over the landing point and descend to it at a constant speed. The invention can guide an unmanned aerial vehicle to land precisely at a designated place.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle navigation, in particular to a method and a system for unmanned aerial vehicle vision-guided autonomous landing.
Background
With the gradual expansion of unmanned aerial vehicle technology into military and civil fields, unmanned aerial vehicles with a precision landing capability are attracting more and more attention, for applications such as precision landing/carrier landing, fixed-point launching, and the like. Conventional guidance techniques include inertial guidance, radar guidance, and high-precision satellite guidance. The position error of inertial navigation accumulates over time, which degrades navigation accuracy; radar positioning accuracy is limited and the equipment is complex; high-precision satellite guidance depends on satellite signals that are easily interfered with, so reliability is difficult to guarantee in the final descent phase. Visual guidance is based on image processing technology: it processes the images acquired by a camera and solves the pose of the target during motion by combining camera intrinsic parameters, constraint information, and the like. As a novel navigation technology for controlling aircraft landing, it has attracted wide attention for its advantages of high precision, interference resistance, passive imaging, and low cost. Vision measurement is mainly divided into monocular and binocular vision measurement. Binocular vision measurement requires two cameras to be installed on the unmanned aerial vehicle; the installation accuracy requirements are high, implementation is complex, and longer measurement distances require a longer baseline, which is difficult to accommodate on an unmanned aerial vehicle. Monocular vision measurement needs only one camera and, combined with the size information of the target pattern, can achieve relative positioning; the system is simple in structure, its installation requirements are low, and it has great advantages for the designated-landing-point application scenario.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide an unmanned aerial vehicle vision-guided autonomous landing method and system in which a monocular camera acquires a target scene image and the relative position is calculated through image processing, target identification, accurate feature extraction, state switching control, and other technologies, providing accurate guidance data for the unmanned aerial vehicle landing.
One object of the invention is realized by the following technical scheme:
an unmanned aerial vehicle vision-guided autonomous landing method comprises the following steps:
Step 1: after the unmanned aerial vehicle flies into the effective area above the landing point, a monocular camera mounted under the belly of the unmanned aerial vehicle captures a target scene image; wherein: the target scene image contains an identification pattern, and an internal pattern is arranged in the middle of the identification pattern;
Step 2: processing the target scene image frame by frame, identifying the identification pattern, extracting features, and solving the position of the unmanned aerial vehicle relative to the landing point; wherein: when the calculated height of the unmanned aerial vehicle relative to the landing point is greater than the threshold height h', the characteristic size of the whole identification pattern is used as the reference for measuring the relative position when the next frame of the target scene image is identified; when the calculated height of the unmanned aerial vehicle relative to the landing point is lower than the threshold height h', the characteristic size of the internal pattern within the identification pattern is used as the reference for measuring the relative position when the next frame of the target scene image is identified;
Step 3: controlling the unmanned aerial vehicle, according to its position relative to the landing point, to center itself over the landing point and descend to the landing point at a constant speed.
Preferably, the effective area is an inverted circular truncated cone, and the range of the effective area is as follows:
h1=r1/tan(θ/2)
h2=r2/tan(θ/2)
wherein the height of the upper base of the circular truncated cone above the landing point is h1, the height of the lower base above the landing point is h2, the radius of the upper base is r1, the radius of the lower base is r2, the included angle between the optical axis of the camera and the vertical axis of the airframe is θ, the vertical resolution of the camera is D, the side length of the complete identification pattern is w1, the side length of the internal pattern is w2, and the minimum resolution of the identification pattern required for correct visual measurement is Pmin.
Preferably, the internal pattern is provided with features to resist shadow occlusion.
Preferably, extracting features means extracting the edge features of the complete identification pattern or the internal pattern by image processing techniques and using the actual size of the edge features as the measurement parameter.
Preferably, the threshold height h' is:
the other purpose of the invention is realized by the following technical scheme:
the utility model provides an unmanned aerial vehicle vision guide is from system of falling, is including installing the monocular camera, installing at the inside embedded vision processor of unmanned aerial vehicle and the flight control computer in unmanned aerial vehicle ventral bottom, its characterized in that:
when the unmanned aerial vehicle flies into the effective area above the landing point, the monocular camera captures target scene images and transmits them frame by frame to the embedded vision processor; the target scene image contains an identification pattern, and an internal pattern is arranged in the middle of the identification pattern;
the embedded vision processor is used for processing the target scene image frame by frame, identifying the identification pattern, extracting the characteristics, solving the relative position of the unmanned aerial vehicle relative to the landing point, and sending the relative position data of the unmanned aerial vehicle and the landing point to the flight control computer according to a certain protocol format;
and the flight control computer controls the unmanned aerial vehicle, according to its position relative to the landing point, to center itself over the landing point and descend to the landing point at a constant speed.
Preferably, the flight control computer is further configured to control the illumination device to illuminate the identification pattern in the event of insufficient light.
The unmanned aerial vehicle landing method has the advantages that, after the unmanned aerial vehicle flies above the landing point and enters the effective range, the relative position between the unmanned aerial vehicle and the landing point is accurately calculated by monocular vision guidance and sent to the flight control computer of the unmanned aerial vehicle; by illuminating the identification pattern when light is insufficient, the unmanned aerial vehicle can be guided to land accurately at the designated place at any time of day.
Drawings
Fig. 1 is a schematic flowchart of an unmanned aerial vehicle vision-guided autonomous landing method according to an embodiment.
Fig. 2 is a schematic diagram of the effective area above the landing point.
Fig. 3 is a schematic diagram of the layout of the identification pattern.
Fig. 4 is a schematic structural diagram of an unmanned aerial vehicle vision-guided autonomous landing system.
Fig. 5 is a high precision satellite/vision measurement data plot for height Z.
Fig. 6 is a graph of X-direction high precision satellite/vision measurement data.
Fig. 7 is a Y-direction high precision satellite/vision measurement data plot.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Example one
The embodiment provides an unmanned aerial vehicle vision-guided autonomous landing method, which comprises the following steps:
Step 1: after the unmanned aerial vehicle flies into the effective area above the landing point, a monocular camera installed under the belly of the unmanned aerial vehicle captures a target scene image.
Referring to fig. 2, the effective area is an inverted circular truncated cone. The height of the upper base of the circular truncated cone above the landing point is h1, the height of the lower base above the landing point is h2, the radius of the upper base is r1, the radius of the lower base is r2, the included angle between the optical axis of the camera and the vertical axis of the airframe is θ, the vertical resolution of the camera is D, the side length of the complete identification pattern is w1, and the side length of the internal pattern is w2; these quantities are related to the minimum resolution Pmin of the identification pattern required for correct visual measurement. The range of the effective area is:
h1=r1/tan(θ/2)
h2=r2/tan(θ/2)
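As an illustration of this geometry, the following Python sketch (not part of the patent) computes the bounds of the effective area from the formulas above; the numeric values in the example are assumptions chosen only for demonstration.

```python
import math

def effective_area_bounds(r1: float, r2: float, theta_deg: float):
    """Heights of the upper and lower bases of the inverted circular
    truncated cone bounding the effective area, per h = r / tan(theta/2)."""
    half_angle = math.radians(theta_deg) / 2.0
    h1 = r1 / math.tan(half_angle)  # upper base height above the landing point
    h2 = r2 / math.tan(half_angle)  # lower base height above the landing point
    return h1, h2

# Assumed example: theta = 60 degrees, upper radius 10 m, lower radius 0.5 m.
h1, h2 = effective_area_bounds(r1=10.0, r2=0.5, theta_deg=60.0)
print(f"effective area spans {h2:.2f} m to {h1:.2f} m above the landing point")
```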
when the unmanned aerial vehicle enters the effective area, the identification pattern arranged on the landing point is inevitably contained in the target scene image shot by the camera fixed at the bottom of the belly. Referring to fig. 3, the logo is a square image, an internal pattern is arranged in the middle of the logo, and the internal pattern can be added with a shadow blocking resistant feature, so that the proper identification and measurement can be still performed when the shadow of the object is projected on the pattern.
Step 2: the target scene image is processed frame by frame, the identification pattern is identified, features are extracted, and the position of the unmanned aerial vehicle relative to the landing point is solved.
When the calculated height of the unmanned aerial vehicle relative to the landing point is greater than the threshold height h', the characteristic size of the whole identification pattern is used as the reference for measuring the relative position when the next frame of the target scene image is identified; when the calculated height of the unmanned aerial vehicle relative to the landing point is lower than the threshold height h', the characteristic size of the internal pattern within the identification pattern is used as the reference when the next frame of the target scene image is identified.
Extracting features refers to extracting the edge features of the complete identification pattern or the internal pattern by image processing techniques and using the actual size of the edge features as the measurement parameter.
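The patent does not spell out the projection model behind this size-based measurement, but a standard monocular solve consistent with the description uses the pinhole camera model: with the calibrated focal length (in pixels) and the known real side length of whichever pattern is currently the reference, depth follows from the ratio of real size to imaged size. The sketch below is a minimal illustration under that assumption; the function and parameter names are hypothetical, not taken from the patent.

```python
import numpy as np

def relative_position(corners_px: np.ndarray, side_len_m: float,
                      fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Estimate the camera-frame position of a square pattern of known side
    length from its four imaged corners (pinhole model).

    corners_px: 4x2 array of corner pixel coordinates, in order around the square.
    """
    # Mean imaged side length in pixels (the extracted edge-feature size).
    edges = np.roll(corners_px, -1, axis=0) - corners_px
    side_px = np.linalg.norm(edges, axis=1).mean()
    # Depth from similar triangles: Z = f * W_real / w_pixels.
    z = fx * side_len_m / side_px
    # Lateral offsets of the pattern centre from the optical axis.
    u, v = corners_px.mean(axis=0)
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

# Assumed example: a 1 m marker imaged 80 px wide by an fx = fy = 800 px camera.
corners = np.array([[600.0, 400.0], [680.0, 400.0], [680.0, 480.0], [600.0, 480.0]])
print(relative_position(corners, side_len_m=1.0, fx=800.0, fy=800.0, cx=640.0, cy=512.0))
```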
The threshold height h' is the preset height at which recognition switches to the internal pattern, in order to prevent the whole identification pattern from exceeding the camera's field of view as the height decreases; it is calculated as follows:
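The exact expression for h' appears only as a formula image in the original publication and is not reproduced in this text; the sketch below therefore takes the threshold as a precomputed parameter and illustrates only the switching behaviour described above, where the height solved from the current frame selects the reference for the next frame.

```python
WHOLE_PATTERN, INNER_PATTERN = "whole", "inner"

def select_reference(height_m: float, h_threshold_m: float) -> str:
    """Choose which pattern's characteristic size serves as the measurement
    reference for the next frame: the whole identification pattern above the
    threshold height h', the internal pattern below it."""
    return WHOLE_PATTERN if height_m > h_threshold_m else INNER_PATTERN
```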
Step 3: controlling the unmanned aerial vehicle, according to its position relative to the landing point, to center itself over the landing point and descend to the landing point at a constant speed.
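Step 3 amounts to a simple guidance loop: null the lateral offset, then descend at a fixed rate. The following sketch shows one plausible form of that loop; the gains, rates, tolerances, and the send_velocity_command interface are assumptions for illustration, not part of the patent.

```python
def landing_loop(get_relative_position, send_velocity_command,
                 kp: float = 0.5, descent_rate: float = 0.3,
                 center_tol: float = 0.05, touchdown_height: float = 0.1):
    """Center the vehicle over the landing point, then descend at constant
    speed. get_relative_position() returns (x, y, z) in metres, with z the
    height of the vehicle above the landing point."""
    while True:
        x, y, z = get_relative_position()
        if z <= touchdown_height:
            send_velocity_command(0.0, 0.0, 0.0)  # touchdown reached: stop
            break
        vx, vy = -kp * x, -kp * y  # proportional centering commands
        # Hold height until roughly centered, then descend at constant speed.
        vz = -descent_rate if max(abs(x), abs(y)) < center_tol else 0.0
        send_velocity_command(vx, vy, vz)
```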
Example two
Referring to fig. 4, this embodiment provides an unmanned aerial vehicle vision-guided autonomous landing system, comprising a monocular camera installed at the bottom of the unmanned aerial vehicle's belly, an embedded vision processor installed inside the unmanned aerial vehicle, and a flight control computer. The embedded vision processor is connected to the monocular camera through a video interface (such as USB or DVI) and to the flight control computer through a data interface (such as a network port or a serial port).
The focal length of the monocular camera is set to specific values, and the camera's intrinsic parameters are calibrated in advance at each of these focal lengths. The included angle θ between the camera's optical axis and the vertical axis of the unmanned aerial vehicle's body is either known, or can be measured in real time by a mechanical device such as a gimbal and fed back to the unmanned aerial vehicle's flight control equipment. When the unmanned aerial vehicle flies into the effective area above the landing point, the monocular camera captures target scene images and transmits them frame by frame to the embedded vision processor. The target scene image contains the identification pattern; the effective area and the identification pattern are the same as described in the first embodiment and are not repeated here.
The embedded vision processor is used for processing the target scene image frame by frame, identifying the identification pattern, extracting the characteristics, solving the relative position of the unmanned aerial vehicle relative to the landing point, and sending the relative position data of the unmanned aerial vehicle and the landing point to the flight control computer according to a certain protocol format.
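The patent leaves the "certain protocol format" undefined. As one possible illustration of the vision-processor-to-flight-control link, a fixed-layout binary message such as the following could carry the relative position over the serial or network interface; the sync bytes, field layout, and checksum are entirely assumed.

```python
import struct

def pack_position_message(frame_id: int, x: float, y: float, z: float) -> bytes:
    """Pack a hypothetical vision->flight-control message: two sync bytes,
    a frame counter, the relative position in metres, and a 1-byte checksum."""
    body = struct.pack("<I3f", frame_id, x, y, z)  # little-endian uint32 + 3 floats
    checksum = sum(body) & 0xFF
    return b"\xAA\x55" + body + bytes([checksum])

# Example: position (0.12, -0.05, 3.40) m for frame 1024 -> 19-byte message.
msg = pack_position_message(1024, 0.12, -0.05, 3.40)
```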
When the calculated height of the unmanned aerial vehicle relative to the landing point is greater than the threshold height h', the characteristic size of the whole identification pattern is used as the reference for measuring the relative position when the next frame of the target scene image is identified; when the calculated height of the unmanned aerial vehicle relative to the landing point is lower than the threshold height h', the characteristic size of the internal pattern within the identification pattern is used as the reference when the next frame of the target scene image is identified.
Extracting features refers to extracting the edge features of the complete identification pattern or the internal pattern by image processing techniques and using the actual size of the edge features as the measurement parameter.
The threshold height h' is the preset height at which recognition switches to the internal pattern, in order to prevent the whole identification pattern from exceeding the camera's field of view as the height decreases; it is calculated as described in the first embodiment.
The flight control computer controls the unmanned aerial vehicle, according to its position relative to the landing point, to center itself over the landing point and descend to the landing point at a constant speed.
Under the condition of insufficient light, the flight control computer can also control the lighting equipment to illuminate the identification pattern, so that the system can be used at night and in dark environments.
In the present embodiment, high-precision satellite measurement data is taken as the reference over the whole landing process, and the vision measurement data is transformed into the position coordinates (X, Y, Z) of the aircraft's center of mass relative to the landing point by a coordinate-system conversion, as shown in figs. 5-7: fig. 5 is the high-precision satellite/vision measurement data curve for the height Z, fig. 6 for the X direction, and fig. 7 for the Y direction. The figures show that the centering accuracy (X and Y directions) is at the centimeter level, and the height (Z) accuracy is at the decimeter level.
In conclusion, the invention provides a simple, reliable, and high-precision visual guidance means for the autonomous landing of rotary-wing unmanned aerial vehicles, and can guarantee accurate landing of unmanned aerial vehicles day and night.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (7)
1. An unmanned aerial vehicle vision-guided autonomous landing method comprises the following steps:
Step 1: after the unmanned aerial vehicle flies into the effective area above the landing point, a monocular camera mounted under the belly of the unmanned aerial vehicle captures a target scene image; wherein: the target scene image contains an identification pattern, and an internal pattern is arranged in the middle of the identification pattern;
Step 2: processing the target scene image frame by frame, identifying the identification pattern, extracting features, and solving the position of the unmanned aerial vehicle relative to the landing point; wherein: when the calculated height of the unmanned aerial vehicle relative to the landing point is greater than the threshold height h', the characteristic size of the whole identification pattern is used as the reference for measuring the relative position when the next frame of the target scene image is identified; when the calculated height of the unmanned aerial vehicle relative to the landing point is lower than the threshold height h', the characteristic size of the internal pattern within the identification pattern is used as the reference for measuring the relative position when the next frame of the target scene image is identified;
Step 3: controlling the unmanned aerial vehicle, according to its position relative to the landing point, to center itself over the landing point and descend to the landing point at a constant speed.
2. The unmanned aerial vehicle vision-guided autonomous landing method of claim 1, wherein the effective area is an inverted circular truncated cone and its range is as follows:
h1=r1/tan(θ/2)
h2=r2/tan(θ/2)
wherein,the height of the upper bottom of the circular truncated cone from the descending point is h1The height of the lower bottom of the circular truncated cone from the descending point is h2The radius of the upper bottom of the circular truncated cone is r1The radius of the lower bottom of the circular truncated cone is r2The included angle between the optical axis direction of the camera and the vertical axis direction of the machine body is theta, the vertical resolution of the camera is D, and the side length of the complete identification pattern is w1Side length of the internal pattern is w2The minimum resolution of the marking pattern required for correct visual measurement is Pmin。
3. The unmanned aerial vehicle vision-guided autonomous landing method of claim 1, wherein the internal pattern is provided with an anti-shadow-occlusion feature.
4. The unmanned aerial vehicle vision-guided autonomous landing method according to claim 1, wherein the feature extraction extracts the edge features of the complete identification pattern or the internal pattern by image processing techniques and uses the actual size of the edge features as the measurement parameter.
5. The unmanned aerial vehicle vision-guided autonomous landing method according to claim 1, wherein the threshold height h' is:
6. An unmanned aerial vehicle vision-guided autonomous landing system, comprising a monocular camera installed at the bottom of the unmanned aerial vehicle's belly, an embedded vision processor installed inside the unmanned aerial vehicle, and a flight control computer, characterized in that:
when the unmanned aerial vehicle flies into the effective area above the landing point, the monocular camera captures target scene images and transmits them frame by frame to the embedded vision processor; the target scene image contains an identification pattern, and an internal pattern is arranged in the middle of the identification pattern;
the embedded vision processor is used for processing the target scene image frame by frame, identifying the identification pattern, extracting the characteristics, solving the relative position of the unmanned aerial vehicle relative to the landing point, and sending the relative position data of the unmanned aerial vehicle and the landing point to the flight control computer according to a certain protocol format;
and the flight control computer controls the unmanned aerial vehicle, according to its position relative to the landing point, to center itself over the landing point and descend to the landing point at a constant speed.
7. The unmanned aerial vehicle vision-guided autonomous landing system of claim 6, wherein the flight control computer is further configured to control the lighting device to illuminate the identification pattern in case of insufficient light.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910783859.7A CN110488848B (en) | 2019-08-23 | 2019-08-23 | Unmanned aerial vehicle vision-guided autonomous landing method and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910783859.7A CN110488848B (en) | 2019-08-23 | 2019-08-23 | Unmanned aerial vehicle vision-guided autonomous landing method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110488848A true CN110488848A (en) | 2019-11-22 |
CN110488848B CN110488848B (en) | 2022-09-06 |
Family
ID=68553221
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910783859.7A Active CN110488848B (en) | 2019-08-23 | 2019-08-23 | Unmanned aerial vehicle vision-guided autonomous landing method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110488848B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011178186A (en) * | 2010-02-26 | 2011-09-15 | Mitsubishi Heavy Ind Ltd | Landing guide device and method |
CN103226356A (en) * | 2013-02-27 | 2013-07-31 | 广东工业大学 | Image-processing-based unmanned plane accurate position landing method |
CN106371447A (en) * | 2016-10-25 | 2017-02-01 | 南京奇蛙智能科技有限公司 | Controlling method for all-weather precision landing of unmanned aerial vehicle |
CN106502257A (en) * | 2016-10-25 | 2017-03-15 | 南京奇蛙智能科技有限公司 | Anti-interference control method for precise landing of unmanned aerial vehicles |
CN107544550A (en) * | 2016-06-24 | 2018-01-05 | 西安电子科技大学 | Vision-guidance-based autonomous landing method for unmanned aerial vehicles |
CN108919830A (en) * | 2018-07-20 | 2018-11-30 | 南京奇蛙智能科技有限公司 | Flight control method for precise landing of unmanned aerial vehicles |
CN109885084A (en) * | 2019-03-08 | 2019-06-14 | 南开大学 | Autonomous landing method for multi-rotor unmanned aerial vehicles based on monocular vision and fuzzy control |
CN113448345A (en) * | 2020-03-27 | 2021-09-28 | 北京三快在线科技有限公司 | Unmanned aerial vehicle landing method and device |
2019
- 2019-08-23 CN CN201910783859.7A patent/CN110488848B/en active Active
Non-Patent Citations (4)
Title |
---|
VIDYA SUDEVAN et al.: "Vision based autonomous landing of an Unmanned Aerial Vehicle on a stationary target", 2017 17th International Conference on Control, Automation and Systems (ICCAS) *
WU Yichao et al.: "Design and test of a vision guidance system for shipboard landing of unmanned helicopters", Electronics Optics & Control *
XING Boyang et al.: "Autonomous optimized landing technique for quadrotors on moving platforms based on composite landmark navigation", Acta Aeronautica et Astronautica Sinica *
WEI Xianghui: "Research on visual detection of landing areas and autonomous landing guidance of unmanned aerial vehicles", China Masters' Theses Full-text Database, Engineering Science and Technology II *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111412898A (en) * | 2020-04-16 | 2020-07-14 | 中国建筑股份有限公司 | Large-area deformation photogrammetry method based on ground-air coupling |
CN113867387A (en) * | 2021-09-27 | 2021-12-31 | 中国航空无线电电子研究所 | Unmanned aerial vehicle autonomous landing course identification method |
CN113867387B (en) * | 2021-09-27 | 2024-04-12 | 中国航空无线电电子研究所 | Unmanned aerial vehicle autonomous landing course recognition method |
CN113759943A (en) * | 2021-10-13 | 2021-12-07 | 北京理工大学重庆创新中心 | Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system |
Also Published As
Publication number | Publication date |
---|---|
CN110488848B (en) | 2022-09-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104215239B | Vision-based autonomous unmanned aerial vehicle landing guidance device and guidance method | |
CN105335733B (en) | Unmanned aerial vehicle autonomous landing visual positioning method and system | |
KR101494654B1 (en) | Method and Apparatus for Guiding Unmanned Aerial Vehicle and Method and Apparatus for Controlling Unmanned Aerial Vehicle | |
JP5690539B2 (en) | Automatic take-off and landing system | |
EP2413096B1 (en) | Ground-based videometrics guiding method for aircraft landing or unmanned aerial vehicles recovery | |
KR100842104B1 | Guide and control method for automatic landing of UAVs using ADS-B and vision-based information | |
CN110488848B (en) | Unmanned aerial vehicle vision-guided autonomous landing method and system | |
US20190197908A1 (en) | Methods and systems for improving the precision of autonomous landings by drone aircraft on landing targets | |
CN106502257B (en) | Anti-interference control method for precise landing of unmanned aerial vehicle | |
CN110879617A (en) | Infrared-guided unmanned aerial vehicle landing method and device | |
CN107576329B (en) | Fixed wing unmanned aerial vehicle landing guiding cooperative beacon design method based on machine vision | |
CN108710381A | Follow-up landing method for unmanned aerial vehicles | |
CN112462791A (en) | Full-automatic high-precision flight landing system and method for airport of vehicle-mounted unmanned aerial vehicle | |
CN113759943A (en) | Unmanned aerial vehicle landing platform, identification method, landing method and flight operation system | |
KR101733677B1 (en) | Apparatus and method for unmanned plane precision landing using artificial landmark and ultrasonic sensor | |
CN112119428A (en) | Method, device, unmanned aerial vehicle, system and storage medium for acquiring landing position | |
JP6791387B2 (en) | Aircraft, air vehicle control device, air vehicle control method and air vehicle control program | |
RU2722521C1 (en) | Method for accurate landing of unmanned aerial vehicle on landing platform | |
KR101537324B1 (en) | Automatic carrier take-off and landing System based on image processing | |
CN110231836A | Calibration method for guiding an unmanned aerial vehicle to land on a moving target | |
JP7070636B2 (en) | Aircraft, air vehicle control device, air vehicle control method and air vehicle control program | |
CN112904895B (en) | Image-based airplane guiding method and device | |
KR101539065B1 (en) | Method of Automatic carrier take-off and landing based on image processing using light emitter | |
CN209197737U (en) | Detection System for Bridge | |
JP7028247B2 (en) | Aircraft, air vehicle control device, air vehicle control method and air vehicle control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||