
CN105352495B - UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data - Google Patents

UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data Download PDF

Info

Publication number
CN105352495B
CN105352495B (application number CN201510789452.7A; also published as CN105352495A)
Authority
CN
China
Prior art keywords
UAV
optical flow
sensor
acceleration
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510789452.7A
Other languages
Chinese (zh)
Other versions
CN105352495A (en)
Inventor
鲜斌
金鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN201510789452.7A priority Critical patent/CN105352495B/en
Publication of CN105352495A publication Critical patent/CN105352495A/en
Application granted granted Critical
Publication of CN105352495B publication Critical patent/CN105352495B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to UAV positioning methods and aims to provide a new adaptive second-order noise point detection method based on directional information, which significantly reduces the misjudgment probability for non-noise points, removes salt-and-pepper noise from images more effectively, and is more robust when denoising noise of varying strength. To this end, the present invention adopts the following technical scheme: a UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data. The horizontal velocity of the UAV is obtained with an optical flow sensor mounted on the bottom of a quadrotor UAV, the acceleration of the UAV is obtained with an accelerometer on the flight controller PCB, and the two data streams are fused with a complementary filter to obtain a more accurate velocity of the UAV relative to the ground. The present invention is mainly applied to UAV positioning.

Description

UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data
Technical field
The present invention relates to UAV positioning methods, and in particular to a UAV autonomous positioning method based on fusion of accelerometer and optical flow sensor data.
Technical background
UAV positioning mainly refers to determining, with the vehicle's own sensors, the position and attitude of the UAV relative to an inertial coordinate frame in the flight environment. Accurate pose estimation is the premise and foundation of safe quadrotor flight and of complex missions such as trajectory planning and target tracking.
Currently the most widely used UAV navigation systems are based on GPS positioning, but its accuracy is relatively low and there is almost no signal indoors, so GPS sensors cannot be used to achieve indoor positioning flight.
Chiba University in Japan applied a real-time optical flow vision system to the positioning and control of quadrotor UAVs. A single downward-facing camera performs the optical flow computation, which is fused with inertial navigation data using a three-layer nested Kalman filter; with a nonlinear controller, the quadrotor accomplishes complex indoor and outdoor tasks such as autonomous takeoff, fixed-point hover, trajectory tracking and autonomous landing, with good control performance. (Conference: IEEE International Conference on Robotics and Automation; authors: F. Kendoul, I. Fantoni, K. Nonami; published: 2007; title: Three Nested Kalman Filters-based Algorithm for Real-time Estimation of Optical Flow, UAV Motion and Obstacles Detection; pages: 4746-4751.) (Journal: Journal of Field Robotics; authors: F. Kendoul, I. Fantoni, K. Nonami; published: 2010; title: Guidance and Nonlinear Control System for Autonomous Flight of Minirotorcraft Unmanned Aerial Vehicles; pages: 311-334.)
Researchers at ETH Zurich in Switzerland used the PX4FLOW optical flow sensor as a position measurement unit in positioning control systems for indoor and outdoor UAVs. Although that group demonstrated indoor trajectory tracking based on optical flow, the UAV velocity obtained from the optical flow sensor must be integrated over long distances: while tracking a rectangular path with a side length of about 3 meters for two consecutive laps, the maximum positioning error exceeded 0.5 meters, and the position information drifted considerably as time accumulated. (Conference: IEEE International Conference on Robotics and Automation; authors: Honegger D, Meier L, Tanskanen P; published: 2013; title: An Open Source and Open Hardware Embedded Metric Optical Flow CMOS Camera for Indoor and Outdoor Applications; pages: 1736-1741.)
In addition, ETH Zurich used optical flow to accomplish the highly difficult task of quadrotor obstacle-avoidance flight in corridors. Optical flow information was obtained with a 190° fisheye camera, the optical flow computation was performed with the pyramidal Lucas-Kanade method, feature points were extracted with the Shi-Tomasi corner detection method, and the result was fused with inertial navigation data to eliminate the influence of the rotational component of the optical flow. (Conference: IEEE International Conference on Robotics and Automation; authors: S. Zingg, D. Scaramuzza, S. Weiss, et al.; published: 2010; title: MAV navigation through indoor corridors using optical flow; pages: 3361-3368.)
Summary of the invention
To overcome the deficiencies of the prior art, a new adaptive second-order noise point detection method based on directional information is provided; this method significantly reduces the misjudgment probability for non-noise points, removes salt-and-pepper noise from images more effectively, and is more robust when denoising noise of varying strength. To this end, the present invention adopts the following technical scheme: a UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data. The horizontal velocity of the UAV is obtained with an optical flow sensor mounted on the bottom of a quadrotor UAV, the acceleration of the UAV is obtained with an accelerometer on the flight controller PCB, and the above data are fused with a complementary filter to obtain a more accurate velocity of the UAV relative to the ground.
When a quadrotor carrying the optical flow sensor flies, the external scene moves relative to the sensor and forms pixel motion on the sensor's imaging plane. The speed of this pixel motion is denoted v_optical; it is proportional to the UAV's relative velocity v_q and inversely proportional to the UAV's distance h to the ground, a relation that can be expressed as:

v_optical ∝ v_q / h

From this relation, the velocity v_q of the UAV relative to the ground can be deduced.
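Purely as an illustration of this relation (it is not part of the claimed method), converting a flow reading to a ground-relative velocity amounts to multiplying by the height; the function name and the calibration constant k_scale below are assumptions:

    /* Sketch: recover ground-relative velocity from an optical flow
     * reading, assuming v_optical is a calibrated flow rate so that
     * v_q is proportional to v_optical * h. k_scale is a hypothetical
     * sensor-specific calibration constant. */
    float flow_to_ground_velocity(float v_optical, float h, float k_scale)
    {
        /* v_optical is proportional to v_q / h, so v_q = k_scale * v_optical * h */
        return k_scale * v_optical * h;
    }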
Because algorithms based on the accelerometer and on optical flow can only directly or indirectly obtain the UAV's horizontal velocity, only the fusion of horizontal velocity information is considered. In the x direction of the UAV body frame:
Ideally, the horizontal velocity v_x in the UAV's x direction and the corresponding acceleration a_x satisfy:

v̇_x = a_x

Here a_x is the x component of the acceleration a, and the dot above the symbol denotes the first derivative. In actual measurement, because of the limited precision of the sensors themselves and external disturbances, the measurement contains a large amount of noise and interference; the velocity obtained by optical flow is therefore simplified here to the form:
v_optical_x = v_x + μ_x
Here v_optical_x is the velocity read from the optical flow sensor, v_x is the true velocity of the UAV relative to the ground, and μ_x is the measurement noise, taken to be a constant;
The flight state information obtained by the flight controller is discrete; the fused horizontal velocity at each time point can be written v_x(n), where n is the sample index of the UAV data.
In the fusion algorithm, the first fused velocity is set equal to the raw reading of the optical flow sensor, i.e.:
v_x(1) = v_optical_x
After the first time point, the output v_x(k) of the complementary filter in the time domain, i.e. the fused horizontal velocity, is written as:
v_x(k) = ∫a_x dt − K1(v_optical_x − v_x(k−1))
where v_x(k) is the x-direction horizontal velocity output by the complementary filter at the current time, v_x(k−1) is the x-direction horizontal velocity output at the previous time, a_x is the x component of the acceleration a, and K1 is the proportional gain on the x-direction velocity deviation. The filter takes the difference between the raw optical flow reading v_optical_x and the previous output v_x(k−1) and, through the gain K1, constructs proportional feedback; the integral of the accelerometer data a_x forms the forward path. The feedback path compensates the velocity obtained by integration, and differencing the measurement against the previous time point cancels the measurement noise μ_x in the velocity measured by the optical flow sensor, yielding a more accurate x-direction velocity.
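The patent gives no source code; the following is a minimal discrete-time sketch of one filter update. It assumes a fixed sample period dt and uses the feedback form in which a positive gain pulls the estimate toward the optical flow measurement, consistent with Fig. 1 and with the statement below that a larger K1 means more confidence in the optical flow sensor; all names are illustrative:

    /* One update of the x-direction complementary filter (sketch).
     * v_prev: fused velocity v_x(k-1) from the previous step (m/s)
     * a_x:    accelerometer reading in the body x direction (m/s^2)
     * v_opt:  raw optical flow velocity v_optical_x (m/s)
     * k1:     proportional gain on the velocity deviation
     * dt:     sample period in seconds (e.g. 0.001 at 1 kHz)
     * Returns the new fused velocity v_x(k). */
    float complementary_update(float v_prev, float a_x, float v_opt,
                               float k1, float dt)
    {
        /* forward path: integrate acceleration from the previous estimate;
         * feedback path: proportional correction toward the measurement */
        return v_prev + a_x * dt + k1 * (v_opt - v_prev);
    }

With k1 close to 1 the output tracks the optical flow reading almost directly; with k1 close to 0 it relies almost entirely on the integrated acceleration.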
In a practical control system, adjusting the size of K1 changes the complementary filter's relative confidence in the accelerometer and the optical flow sensor: when K1 is larger, the confidence in the optical flow sensor is higher; conversely, the confidence in the accelerometer is higher.
Similarly, the same fusion algorithm as in the x direction is applied in the y direction of the UAV body frame, stated as follows:
The y-direction velocity obtained by optical flow can be expressed as:
v_optical_y = v_y + μ_y
Here v_optical_y is the velocity read from the optical flow sensor, v_y is the true velocity of the UAV relative to the ground, and μ_y is the measurement noise, taken to be a constant;
Likewise, in the fusion algorithm, the first fused velocity is set equal to the raw reading of the optical flow sensor, i.e.:
v_y(1) = v_optical_y
After the first time point, the output v_y(k) of the complementary filter in the time domain, i.e. the fused horizontal velocity, is written as:
v_y(k) = ∫a_y dt − K2(v_optical_y − v_y(k−1))
where v_y(k) is the y-direction horizontal velocity output by the complementary filter at the current time, v_y(k−1) is the y-direction horizontal velocity output at the previous time, a_y is the y-direction acceleration obtained from the accelerometer, and K2 is the proportional gain on the y-direction velocity deviation. As in the x direction of the body frame, adjusting the size of K2 tunes the filter's confidence in the optical flow sensor versus the accelerometer.
Technical features and effects of the present invention:
Using the complementary filter algorithm, the present invention fuses the UAV velocity obtained by optical flow with the acceleration obtained by the accelerometer, achieves highly accurate velocity information over long time spans, and meets the need for stable autonomous indoor hover of the UAV.
Brief description of the drawings:
Fig. 1 is the block diagram of the complementary filter used by the present invention; in the figure a_accel is the accelerometer measurement, v_optical is the optical flow sensor measurement, and v is the velocity output of the complementary filter.
Fig. 2 is the block diagram of the UAV horizontal velocity controller (an illustrative sketch of such a controller follows these figure descriptions); in the figure a_d is the horizontal acceleration commanded by the remote control, v_t is the reference velocity obtained by integrating a_d, v is the actual UAV velocity after complementary filtering, and a_t is the acceleration command output by the velocity controller.
Fig. 3 shows the raw optical flow sensor data and the fused data in the horizontal x and y directions while the UAV is hand-held; Fig. 3(a) shows the fusion result in the x direction, Fig. 3(b) in the y direction.
Fig. 4 shows the target and actual attitude angles during hover; Fig. 4(a) shows the commanded and measured pitch angle, Fig. 4(b) the commanded and measured roll angle.
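The patent describes the controller of Fig. 2 only at block-diagram level; purely as an illustration, a single closed-loop proportional-integral velocity controller with that structure might look as follows. All names, the discretization, and the gains are assumptions, not taken from the patent:

    /* Hypothetical PI velocity controller with the structure of Fig. 2:
     * the commanded acceleration a_d is integrated to a reference
     * velocity v_t, and the output a_t is a PI law on the velocity
     * error between v_t and the fused velocity. */
    typedef struct {
        float v_t;      /* reference velocity, integral of a_d */
        float err_int;  /* integral of the velocity error */
        float kp, ki;   /* PI gains (illustrative) */
        float dt;       /* sample period (s) */
    } vel_ctrl_t;

    float vel_ctrl_step(vel_ctrl_t *c, float a_d, float v_fused)
    {
        c->v_t     += a_d * c->dt;               /* integrate pilot command */
        float err   = c->v_t - v_fused;          /* velocity error */
        c->err_int += err * c->dt;               /* accumulate error */
        return c->kp * err + c->ki * c->err_int; /* output a_t */
    }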
Embodiment
The technical problem to be solved by the present invention is to provide a UAV autonomous positioning method based on fusion of optical flow sensor and accelerometer data, achieving fixed-point hover of a UAV in an indoor environment.
The technical solution adopted by the present invention is: a method of fusing optical flow sensor and accelerometer data for use in a UAV positioning system, comprising the following steps:
Obtain and process the UAV's velocity with the optical flow sensor mounted on the bottom of the quadrotor UAV; obtain the UAV's acceleration with the accelerometer on the flight controller PCB; fuse the above data with a complementary filter to obtain a more accurate velocity of the UAV relative to the ground.
The described acquisition and processing of the UAV velocity with the optical flow sensor:
Optical flow is the apparent motion formed on the retina by the relative motion of the external scene; it is usually obtained by applying an optical flow algorithm to a video stream. When a quadrotor carrying the optical flow sensor flies, the external scene moves relative to the sensor and forms pixel motion on the sensor's imaging plane; the speed of this pixel motion, denoted v_optical, is proportional to the UAV's relative velocity v_q and inversely proportional to the UAV's distance h to the ground, which can be expressed as:

v_optical ∝ v_q / h
From this relation, the velocity v_q of the UAV relative to the ground can be deduced.
What optical flow directly yields is the UAV's horizontal velocity; however, optical flow is affected by indoor light intensity, flight altitude and other factors, so the reading of the optical flow sensor carries strong noise, and using the optical flow sensor alone degrades control precision.
Velocity can also be obtained by integrating the acceleration of the UAV measured by the accelerometer. But integrating the acceleration also integrates the accelerometer's noise, which degrades the precision of the resulting velocity.
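To make the drift concrete, suppose the accelerometer reading carries a constant bias b on top of the true acceleration a (an illustrative model; the patent only states that noise is integrated). Integrating the measurement then gives

    v̂(t) = ∫₀ᵗ (a(τ) + b) dτ = v(t) + b·t

so the velocity error grows linearly without bound: a bias of only 0.05 m/s², for example, produces a 3 m/s error after one minute. This unbounded drift is what the optical flow feedback in the complementary filter removes.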
The described data fusion with the complementary filter is:
The UAV velocity information obtained from the accelerometer and from optical flow is fused with the complementary filter algorithm, yielding a more accurate velocity of the UAV relative to the ground; by then constructing a single closed-loop proportional-integral controller, fixed-point indoor hover of the UAV is achieved. The data fusion algorithm based on the complementary filter is introduced below:
Because positioning algorithms based on the accelerometer and on optical flow can only directly or indirectly obtain the UAV's horizontal velocity, only the fusion of the horizontal velocity is considered. Moreover, since the x and y directions in the horizontal plane are processed in the same way, the x direction is taken as the example below.
Ideally, the horizontal velocity v_x in the UAV's x direction and the corresponding acceleration a_x satisfy:

v̇_x = a_x
In actual measurement, because of the limited precision of the sensors themselves and external disturbances, the measurement contains a large amount of noise and interference; the velocity obtained by optical flow is therefore simplified here to the form:
v_optical_x = v_x + μ_x
Here v_optical_x is the velocity read from the optical flow sensor, v_x is the true velocity of the UAV relative to the ground, and μ_x is the measurement noise, taken to be a constant.
The block diagram of the complementary filter algorithm is shown in Fig. 1; according to this block diagram, the system output v_x(k) is written as:
v_x(k) = ∫a_x dt − K1(v_optical_x − v_x(k−1))
where v_x(k) is the horizontal velocity output by the complementary filter at the current time and v_x(k−1) is the horizontal velocity output by the complementary filter at the previous time point.
In a practical control system, adjusting the size of K1 changes the complementary filter's relative confidence in the accelerometer and the optical flow sensor. When K1 is larger, the confidence in the optical flow sensor is higher; conversely, the confidence in the accelerometer is higher.
For the y direction of the UAV body frame, the fusion algorithm is identical to that of the x direction and can be expressed as:
The y-direction velocity obtained by optical flow is:
v_optical_y = v_y + μ_y
In the above, v_optical_y is the velocity read from the optical flow sensor, v_y is the true velocity of the UAV relative to the ground, and μ_y is the noise of the velocity measurement, taken to be a constant;
In the fusion algorithm, the first fused velocity is set equal to the raw reading of the optical flow sensor, i.e.:
v_y(1) = v_optical_y
After the first time point, the output v_y(k) of the complementary filter in the time domain, i.e. the fused horizontal velocity, is written as:
v_y(k) = ∫a_y dt − K2(v_optical_y − v_y(k−1))
where v_y(k) is the y-direction horizontal velocity output by the complementary filter at the current time point, v_y(k−1) is the y-direction horizontal velocity output at the previous time point, a_y is the y-direction acceleration obtained from the accelerometer, and K2 is the proportional gain on the velocity deviation in the y direction of the body frame. Adjusting K2 yields a more accurate y-direction velocity; as in the x direction of the body frame, adjusting K2 tunes the filter's confidence in the optical flow sensor versus the accelerometer.
The UAV autonomous positioning method based on fusion of optical flow sensor and accelerometer data of the present invention is described in detail below with reference to examples and the accompanying drawings.
The present invention weighs the advantages of optical-flow-based positioning: it makes full use of the fact that optical flow is accurate over short time spans and works indoors, and that the accelerometer updates at a higher rate. The two positioning algorithms are fused by the complementary filter method and verified by hand-held quadrotor experiments in an indoor environment, achieving accurate indoor quadrotor positioning based on data fusion.
The UAV autonomous positioning method of the present invention based on fusion of optical flow sensor and accelerometer data comprises the following steps:
1) Acquire and process the UAV velocity with the optical flow sensor:
What optical flow directly yields is the UAV's horizontal velocity. When the quadrotor carrying the optical flow sensor flies, the external scene moves relative to the sensor and forms pixel motion on the sensor's imaging plane; the speed of this pixel motion, denoted v_flow, is proportional to the UAV's relative velocity v_q and inversely proportional to the UAV's distance h to the ground, which can be expressed as:

v_flow ∝ v_q / h
The horizontal velocity of the quadrotor can thus be obtained. However, because of illumination conditions, ground texture and other factors, the UAV relative velocity obtained directly from the optical flow sensor generally carries high-frequency white Gaussian noise; without fusion, control performance suffers.
2) Obtain the UAV acceleration with the accelerometer:
The accelerometer compensates the noise of the optical flow sensor. Communication between the microcontroller and the accelerometer takes place over the SPI bus; the accelerometer reading is read in a programmable interrupt at a frequency of 1 kHz and integrated to obtain the UAV velocity.
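The patent does not publish driver code; as a sketch only, the 1 kHz acquisition described above could be structured as below. spi_read_accel_x() stands in for the actual K60 SPI driver call and is hypothetical, as is the interrupt name:

    /* Hypothetical 1 kHz timer ISR: read the body-x acceleration over
     * SPI and integrate it into a velocity estimate. */
    #define DT_S 0.001f                      /* 1 kHz sample period */

    extern float spi_read_accel_x(void);     /* hypothetical SPI driver call */

    static volatile float v_accel_x = 0.0f;  /* integrated velocity (m/s) */

    void accel_timer_isr(void)
    {
        float a_x = spi_read_accel_x();      /* m/s^2, body x direction */
        v_accel_x += a_x * DT_S;             /* rectangular integration */
        /* v_accel_x drifts over time; the complementary filter described
         * below removes the drift using the optical flow measurement. */
    }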
3) Fuse the data with the complementary filter:
The UAV horizontal velocities obtained by the optical flow sensor and the accelerometer are fused with the complementary filter algorithm, achieving the goal of accurate UAV velocity information in an indoor environment and meeting the needs of autonomous indoor hover flight control. The data fusion algorithm based on the complementary filter is introduced below.
Because the purpose of positioning algorithms based on the optical flow sensor or the accelerometer is to directly or indirectly obtain the UAV's horizontal velocity, only the fusion of the horizontal velocity is considered here.
Ideally, the velocity v_x in the UAV's x direction, the velocity v_y in the y direction, and the corresponding accelerations a_x, a_y satisfy:

v̇_x = a_x
v̇_y = a_y
But in actual measurement, because of the limited precision of the sensors themselves and external disturbances, the measurement often contains a large amount of noise and interference; the velocities obtained with the optical flow sensor are therefore simplified here to the form:
v_optical_x = v_x + μ_x
v_optical_y = v_y + μ_y
Here v_optical_x, v_optical_y are the x- and y-direction measurements obtained with the optical flow sensor, v_x, v_y are the true values of the UAV's horizontal velocity in the x and y directions, and μ_x, μ_y are the respective measurement noises, taken to be constants.
The block diagram of the complementary filter algorithm is shown in Fig. 1; according to this block diagram, the system outputs v_x(k), v_y(k) are written as:
v_x(k) = ∫a_x dt − K1(v_optical_x − v_x(k−1))
v_y(k) = ∫a_y dt − K2(v_optical_y − v_y(k−1))
where v_x(k) is the x-direction horizontal velocity output by the complementary filter at the current time, v_x(k−1) is the x-direction horizontal velocity output at the previous time, a_x is the x-direction acceleration obtained from the accelerometer, and K1 is the proportional gain on the x-direction velocity deviation; v_y(k) is the y-direction horizontal velocity output at the current time, v_y(k−1) is the y-direction horizontal velocity output at the previous time, a_y is the y-direction acceleration obtained from the accelerometer, and K2 is the proportional gain on the y-direction velocity deviation. In a practical control system, when K1 or K2 is tuned larger, the filter's confidence in the optical flow sensor is higher; conversely, its confidence in the accelerometer is higher.
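Combining the two axes, one fusion step as it might run in the fixed-rate control loop is sketched below; the struct layout and names are illustrative, and the same feedback sign convention as the earlier x-direction sketch is assumed:

    /* Two-axis complementary filter state and one fused update (sketch). */
    typedef struct {
        float vx, vy;   /* fused horizontal velocities v_x(k), v_y(k) */
        float k1, k2;   /* gains on the x and y velocity deviations */
        float dt;       /* sample period (s) */
    } comp_filter_t;

    void comp_filter_step(comp_filter_t *f,
                          float a_x, float a_y,           /* accelerometer */
                          float v_opt_x, float v_opt_y)   /* optical flow */
    {
        f->vx += a_x * f->dt + f->k1 * (v_opt_x - f->vx);
        f->vy += a_y * f->dt + f->k2 * (v_opt_y - f->vy);
    }

Raising K1 or K2 shifts confidence toward the optical flow sensor; lowering them shifts it toward the accelerometer, matching the description above.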
A specific example is given below:
1. System hardware connection and configuration
The vision-based quadrotor autonomous flight control method of the present invention uses a flight control structure based on an embedded architecture. The experimental platform comprises the quadrotor body, a ground station, a remote control, etc. The quadrotor is equipped with an onboard PX4FLOW optical flow sensor and a flight controller (whose core chip is a Freescale K60, with an integrated inertial navigation unit, barometer module, etc.). The ground station is a notebook running a Linux operating system, used to start the onboard program and for remote monitoring. The platform supports manual takeoff and landing via the remote control and can be switched to manual mode immediately in an emergency to ensure experimental safety.
2. Flight experiment results
In this embodiment, multiple groups of flight control experiments were carried out on the above platform in an indoor environment. The control objective is to realize indoor fixed-point hover of the quadrotor.
The data curves of the indoor hand-held experiment are shown in Fig. 3. The hand-held UAV remains stationary, and its raw and fused velocities are as shown; the actual velocity of the UAV can be taken as essentially zero. The mean of the raw x-axis optical flow data is 19.64 cm/s, while the fused value is 6.185 cm/s; for the y-axis the values are 27.07 cm/s and 8.34 cm/s respectively. The fused velocity therefore suppresses the optical flow sensor error, showing that the fusion algorithm is effective. In addition, as shown in Fig. 4, during hover the mean absolute deviation between the target pitch angle and the actual pitch angle is 2.077°, and the corresponding figure for the roll angle is 0.986°; the controller achieves good control performance.

Claims (6)

1. A UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data, characterized in that the horizontal velocity of the UAV is obtained with an optical flow sensor mounted on the bottom of a quadrotor UAV, the acceleration of the UAV is obtained with an accelerometer on the flight controller, and the above data are fused with a complementary filter to obtain a more accurate velocity of the UAV relative to the ground.
2. The UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data of claim 1, characterized in that, when the quadrotor carrying the optical flow sensor flies, the external scene moves relative to the optical flow sensor and forms pixel motion on the sensor's imaging plane; the speed of this pixel motion, denoted v_optical, is proportional to the UAV's relative velocity v_q and inversely proportional to the UAV's distance h to the ground, the relation being expressed as:
v_optical ∝ v_q / h
From this relation, the velocity v_q of the UAV relative to the ground is deduced.
3. The UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data of claim 1, characterized in that, because algorithms based on the accelerometer and on optical flow can only directly or indirectly obtain the UAV's horizontal velocity, only the fusion of the horizontal velocity is considered; in the x direction of the UAV body frame:
Ideally, the UAV horizontal velocity v_x and the corresponding acceleration a_x satisfy:
v̇_x = a_x
Here a_x is the x component of the acceleration a, and the dot above the symbol denotes the first derivative. In actual measurement, because of the limited precision of the sensors themselves and external disturbances, the measurement contains a large amount of noise and interference; the velocity obtained by optical flow is simplified here to the form:
v_optical_x = v_x + μ_x
Here v_optical_x is the raw data value of the optical flow sensor, and μ_x is the measurement noise, taken to be a constant;
The flight state information obtained by the flight controller is discrete; the fused horizontal velocity at each time point can be written v_x(n), where n is the sample index of the UAV data.
4. The UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data of claim 2, characterized in that the above data are fused with a complementary filter: the first fused velocity is set equal to the raw data value of the optical flow sensor:
v_x(1) = v_optical_x
After the first time point, the output v_x(k) of the complementary filter in the time domain, where k is defined as a fixed data-processing time point, i.e. the fused horizontal velocity, is written as:
v_x(k) = ∫a_x dt − K1(v_optical_x − v_x(k−1))
where v_x(k) is the x-direction horizontal velocity output by said complementary filter at time point k, v_x(k−1) is the x-direction horizontal velocity output by the complementary filter at time point k−1, a_x is the x component of the acceleration a, and K1 is the proportional gain on the x-direction velocity deviation; the complementary filter takes the difference between the raw optical flow data v_optical_x and the output v_x(k−1) at time point k−1 and, through the gain K1, constructs proportional feedback, with the integral of the accelerometer data a_x as the forward path; the velocity obtained by integration is compensated through the feedback path, and differencing the measurement at time point k against the complementary filter output at time point k−1 cancels the measurement noise μ_x of the velocity measured by the optical flow sensor, thereby yielding a more accurate x-direction velocity.
5. The UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data of claim 4, characterized in that, in a practical control system, adjusting the size of K1 changes said complementary filter's relative confidence in the accelerometer and the optical flow sensor: when K1 is larger, the confidence in the optical flow sensor is higher; conversely, the confidence in the accelerometer is higher.
6. The UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data of claim 4, characterized in that, for the y direction of the UAV body frame, the same fusion algorithm as in the x direction is used, expressed as follows:
The y-direction velocity obtained by optical flow is expressed as:
v_optical_y = v_y + μ_y
Here v_optical_y is the raw data value of the optical flow sensor, v_y is the true velocity of the UAV relative to the ground, and μ_y is the measurement noise, taken to be a constant;
Likewise, in the fusion algorithm, the first fused velocity is set equal to the raw data value of the optical flow sensor, i.e.:
v_y(1) = v_optical_y
After the first time point, the output of said complementary filter at time point k in the time domain is v_y(k), i.e. the fused horizontal velocity, written as:
v_y(k) = ∫a_y dt − K2(v_optical_y − v_y(k−1))
where v_y(k) is the y-direction horizontal velocity output by said complementary filter at time point k, v_y(k−1) is the y-direction horizontal velocity output by the complementary filter at time point k−1, a_y is the y-direction acceleration obtained from the accelerometer, and K2 is the proportional gain on the y-direction velocity deviation; in the same manner as the complementary filter adjustment in the x direction of the body frame, changing the size of K2 adjusts the complementary filter's confidence in the optical flow sensor versus the accelerometer.
CN201510789452.7A 2015-11-17 2015-11-17 UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data Expired - Fee Related CN105352495B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510789452.7A CN105352495B (en) 2015-11-17 2015-11-17 UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510789452.7A CN105352495B (en) 2015-11-17 2015-11-17 UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data

Publications (2)

Publication Number Publication Date
CN105352495A CN105352495A (en) 2016-02-24
CN105352495B true CN105352495B (en) 2018-03-23

Family

ID=55328505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510789452.7A Expired - Fee Related CN105352495B (en) 2015-11-17 2015-11-17 UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data

Country Status (1)

Country Link
CN (1) CN105352495B (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105806342B * 2016-03-02 2019-02-22 上海交通大学 UAV movement velocity prediction method based on machine learning
CN105807083B * 2016-03-15 2019-03-12 深圳市高巨创新科技开发有限公司 Real-time speed measurement method and system for an unmanned aerial vehicle
CN107346142B * 2016-09-30 2019-02-26 广州亿航智能技术有限公司 Aircraft control method, optical flow module and aircraft
CN107389968B * 2017-07-04 2020-01-24 武汉视览科技有限公司 UAV fixed-point implementation method and device based on optical flow sensor and accelerometer
CN108007474A * 2017-08-31 2018-05-08 哈尔滨工业大学 UAV autonomous positioning and pose alignment technique based on ground markers
CN107727877A * 2017-09-04 2018-02-23 中国航空工业集团公司洛阳电光设备研究所 Ground speed measurement method based on an instrument landing system
CN108052005A * 2017-12-07 2018-05-18 智灵飞(北京)科技有限公司 Speed-limiting and altitude-limiting control method for an indoor UAV, and UAV
CN108196582A * 2018-02-12 2018-06-22 深圳技术大学(筹) Indoor visual navigation UAV cluster flight control system and method
CN110503740B * 2018-05-18 2021-11-26 杭州海康威视数字技术股份有限公司 Vehicle state determination method and device, computer equipment and system
CN109948424A * 2019-01-22 2019-06-28 四川大学 Group abnormal behavior detection method based on acceleration movement feature descriptors
CN110375747A * 2019-08-26 2019-10-25 华东师范大学 Inertial navigation system for an indoor UAV
CN111089595B * 2019-12-30 2021-12-03 珠海一微半导体股份有限公司 Robot detection data fusion method, main control chip and robot
CN111398522B * 2020-03-24 2022-02-22 山东智翼航空科技有限公司 Indoor air quality detection system and detection method based on a micro UAV
CN112414365B * 2020-12-14 2022-08-16 广州昂宝电子有限公司 Displacement compensation method and apparatus and velocity compensation method and apparatus
CN114018241B * 2021-11-03 2023-12-26 广州昂宝电子有限公司 Positioning method and device for unmanned aerial vehicle
CN114545017A * 2022-01-31 2022-05-27 深圳市云鼠科技开发有限公司 Velocity fusion method and device based on optical flow and accelerometer, and computer equipment
CN117518837B * 2024-01-04 2024-03-19 中国科学院长春光学精密机械与物理研究所 Decoupling method based on a parameterized model

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101598557A * 2009-07-15 2009-12-09 北京航空航天大学 Integrated navigation system applied to an unmanned spacecraft
CN101915852A * 2010-08-06 2010-12-15 北京交通大学 Velocity measurement method based on stereoscopic vision
KR20140133994A * 2013-05-13 2014-11-21 현대오트론 주식회사 Apparatus and method for alarming impact
CN104062977A * 2014-06-17 2014-09-24 天津大学 Fully autonomous flight control method for a quadrotor UAV based on visual SLAM
CN104808231A * 2015-03-10 2015-07-29 天津大学 UAV positioning method based on fusion of GPS and optical flow sensor data

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Bias Compensation of Gyroscopes in Mobiles with Optical Flow; Laszlo Kundra et al.; 2014 AASRI Conference on Circuit and Signal Processing (CSP 2014); 2014-12-31; pp. 152-157 *
Three Nested Kalman Filters-Based Algorithm for Real-Time Estimation of Optical Flow, UAV Motion and Obstacles Detection; Farid Kendoul et al.; 2007 IEEE International Conference on Robotics and Automation; 2007-04-14; pp. 4746-4751 *
An adaptive complementary filter attitude estimation algorithm; Wang Li et al.; Control Engineering of China (控制工程); 2015-09-30; vol. 22, no. 5; pp. 881-886 *
Research on dynamic inclination sensors and their transfer characteristics; Fu Yongjie et al.; Instrument Technique and Sensor (仪表技术与传感器); 2012-12-31; no. 9; pp. 6-8 *
Research on the pose and control strategy of a quadrotor micro aerial vehicle; Zhang Hongtao; China Doctoral Dissertations Full-text Database, Engineering Science and Technology II (中国博士学位论文全文数据库工程科技II辑); 2015-01-15; no. 1; pp. 1-118 *

Also Published As

Publication number Publication date
CN105352495A (en) 2016-02-24

Similar Documents

Publication Publication Date Title
CN105352495B (en) UAV horizontal velocity control method based on fusion of accelerometer and optical flow sensor data
Bacik et al. Autonomous flying with quadrocopter using fuzzy control and ArUco markers
CN104808231B (en) UAV positioning method based on fusion of GPS and optical flow sensor data
CN103853156B (en) Small quadrotor aircraft control system and method based on onboard sensors
CN104062977B (en) Fully autonomous flight control method for a quadrotor UAV based on visual SLAM
Beyeler et al. Vision-based control of near-obstacle flight
Wenzel et al. Automatic take off, tracking and landing of a miniature UAV on a moving carrier vehicle
CN105644785B (en) UAV landing method based on optical flow and horizon detection
Zahran et al. A new velocity meter based on Hall effect sensors for UAV indoor navigation
CN100557540C (en) Automatic UAV heading correction method based on a magnetic heading sensor
CN105094138A (en) Low-altitude autonomous navigation system for a rotary-wing UAV
Yun et al. IMU/Vision/Lidar integrated navigation system in GNSS denied environments
CN102508493A (en) Flight control method for small unmanned aerial vehicle
CN106774374B (en) Automatic unmanned aerial vehicle inspection method and system
Jung et al. Robust marker tracking algorithm for precise UAV vision-based autonomous landing
Parfiryev et al. Algorithm for controlling the trajectory of an unmanned aerial vehicle with the possibility of flying around obstacles
Watanabe et al. Simultaneous visual target tracking and navigation in a GPS-denied environment
Wang et al. Monocular vision and IMU based navigation for a small unmanned helicopter
Haddadi et al. Visual-inertial fusion for indoor autonomous navigation of a quadrotor using ORB-SLAM
Mercado et al. Quadrotor's trajectory tracking control using monocular vision navigation
Kendoul et al. A visual navigation system for autonomous flight of micro air vehicles
Denuelle et al. Biologically-inspired visual stabilization of a rotorcraft UAV in unknown outdoor environments
Chen et al. System integration of a vision-guided UAV for autonomous tracking on moving platform in low illumination condition
Lee Helicopter autonomous ship landing system
Ramirez et al. Stability analysis of a vision-based UAV controller: An application to autonomous road following missions

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 2018-03-23

Termination date: 2021-11-17