CN109916394A - Combined navigation algorithm fusing optical flow position and speed information - Google Patents
- Publication number: CN109916394A
- Authority
- CN
- China
- Legal status: Pending
Abstract
The invention provides an integrated navigation algorithm that fuses optical flow position and velocity information. It uses the position and velocity output by an optical flow sensor together with data from a MEMS IMU, a magnetometer, a barometric altimeter and a laser ranging sensor, completes the data fusion with an extended Kalman filter, and solves for the position, velocity and attitude of the carrier. The algorithm adopts a station-center (local ENU) coordinate system as the navigation coordinate system and computes the carrier's position relative to the initial point in real time. It enables navigation and positioning of the carrier under GNSS-denied conditions, provides accurate heading, attitude and velocity information, and effectively slows the drift of pure strapdown inertial positioning errors.
Description
Technical Field
The invention relates to the technical field of integrated navigation algorithms, and in particular to an integrated navigation algorithm that fuses optical flow position and velocity information.
Background
Inertial navigation based on a low-precision MEMS IMU alone cannot provide usable navigation data for a carrier, so GNSS data are commonly used to restrain the divergence of inertial navigation errors. GNSS, however, cannot provide positioning in environments such as indoors, urban canyons, under bridges and in tunnels. In such GNSS-denied environments a so-called pseudo GNSS must be found to aid the inertial navigation: a source that partially or completely replaces GNSS in measuring velocity or position and is fused with the inertial and other sensor data, so that the navigation and positioning functions remain available without GNSS.
Disclosure of Invention
The technical problem addressed by the invention is navigation and positioning in a GNSS-denied environment. To solve it, the invention provides an integrated navigation algorithm that fuses optical flow position and velocity information: the algorithm fuses the data of an optical flow sensor, a MEMS IMU, a magnetometer, a barometric altimeter and a laser ranging sensor, completes the data fusion with an extended Kalman filter, and solves for the carrier's position, velocity and attitude in the station-center coordinate system.
The technical scheme adopted to solve this problem is as follows:
The integrated navigation system of the invention comprises an optical flow sensor, a MEMS IMU (hereinafter IMU), a magnetometer, a barometric altimeter and a laser ranging sensor mounted on a carrier. The camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system coincide with the right-front-up carrier coordinate system. The optical flow sensor and the laser ranging sensor are mounted at the bottom of the carrier, with the measuring direction of the laser ranging sensor parallel to the Z axis of the carrier frame. Specifically:
The IMU comprises a three-axis orthogonal gyroscope and a three-axis orthogonal accelerometer, measuring angular velocity and acceleration (specific force) respectively; the magnetometer is a three-axis orthogonal magnetometer used for geomagnetic measurement; the barometric altimeter measures barometric altitude; the optical flow sensor measures the pixel displacement between two consecutive valid image frames in the pixel coordinate system; the laser ranging sensor measures the one-dimensional distance between the sensor and the reflection point.
The coordinate systems used by the invention are the carrier coordinate system, the station-center coordinate system, the east-north-up (ENU) geographic coordinate system, the navigation coordinate system, the magnetic ENU coordinate system, the camera coordinate system and the pixel coordinate system. The carrier coordinate system is the right-front-up frame of the carrier hosting the navigation system; the sensors with their own frames (IMU, magnetometer, optical flow sensor) are mounted so that their frames coincide with it. The station-center coordinate system is an ENU geographic frame whose origin is the carrier's navigation start point; the position estimate of the integrated navigation is expressed in it. The ENU geographic coordinate system has its origin at the carrier's centre of mass; the velocity estimate is expressed in this frame and is regarded as equivalently expressed in the station-center frame. In this algorithm the navigation coordinate system is the station-center coordinate system. The magnetic ENU coordinate system takes magnetic north as its north axis; rotating the geographic ENU frame by the magnetic declination makes it coincide with the magnetic ENU frame. The camera coordinate system is the right-front-up frame of the camera. The pixel coordinate system has its origin at the top-left corner of the image, with units of pixels; its u axis points along the right axis of the camera frame and its v axis along the front axis of the camera frame.
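To make the frame conventions above concrete, the sketch below builds the attitude matrix that rotates a vector from the carrier (right-front-up) frame to the station-center ENU frame from a unit quaternion. This is a standard formula, not taken from the patent; the function name and the [w, x, y, z] component ordering are illustrative assumptions.

```python
import numpy as np

def quat_to_dcm(q):
    """Attitude matrix C_b^n from a unit quaternion q = [w, x, y, z],
    rotating a vector from the carrier (right-front-up) frame to the
    navigation (station-center ENU) frame. Standard textbook formula."""
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
```

The identity quaternion yields the identity matrix, and a 90-degree yaw rotation maps the carrier's right axis onto the navigation north axis, matching the frame definitions above.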
A combined navigation algorithm that fuses optical flow position and velocity information, comprising the steps of:
S1: the navigation computer reads the data of the optical flow sensor, IMU, magnetometer, barometric altimeter and laser ranging sensor mounted on the carrier: from the optical flow sensor, the pixel displacement between two valid image frames along the u and v axes of the pixel coordinate system; from the IMU, angular velocity and acceleration (specific force); from the magnetometer, geomagnetic intensity; from the barometric altimeter, barometric altitude; from the laser ranging sensor, laser ranging altitude;
S2: fuse the barometric altitude and laser ranging altitude acquired in step S1 with a variable-weight method to obtain a fused height, which serves as the height measurement for the extended Kalman filter (EKF) data fusion;
S3: the navigation algorithm checks the optical flow sensor's data-update flag; if the data have not been updated, go to step S4; if they have, go to step S5;
S4: when the optical flow sensor data are not updated, the navigation algorithm performs pure strapdown inertial recursion and solves for the carrier's position, velocity and attitude;
S5: when the optical flow sensor data are updated, apply angular-motion compensation to the two-dimensional pixel displacement output by the sensor to obtain the pixel displacement due to linear motion; convert it to a physical scale using the camera resolution and the physical distance between the camera and the imaged plane, yielding right and forward displacement in metres in the camera coordinate system; compute the right and forward velocities from the interval between the two valid optical flow outputs; since installation guarantees that the optical flow sensor frame coincides with the carrier frame, these camera-frame displacements and velocities are also the carrier's right and forward displacements and velocities in the carrier coordinate system;
S6: using the result of step S5, fuse the carrier-frame position (i.e. displacement) and velocity computed from the optical flow sensor data with the IMU data, magnetometer data and fused height through the extended Kalman filter, and solve for the carrier's position, velocity and attitude.
In step S2, the fused height is calculated from the barometric altitude and the laser ranging altitude and used as the measurement information for the extended Kalman filter (EKF) data fusion. The process of fusing the two heights is as follows:
(1) Determine the initial height H_Baro0
At the initial navigation time, the barometric altitude output by the barometric altimeter is recorded as the initial barometric altitude H_Baro_T0, and the laser ranging height output by the laser ranging sensor is recorded as the initial laser ranging height H_Laser_T0. The initial height H_Baro0, used for computing the variation of the barometric altitude, is calculated neglecting the non-verticality of the laser ranging height caused by the carrier's horizontal attitude angles at the initial time:
H_Baro0 = H_Baro_T0 - H_Laser_T0    (1)
(2) Determine the vertical height H_Laser_vertical of the laser ranging sensor
Since the height used in the navigation algorithm is a vertical height, and the measuring direction of the laser ranging sensor is parallel to the Z axis of the carrier frame, the sensor measures a slant height whenever the carrier's horizontal attitude angles are nonzero. The laser ranging height H_Laser output by the sensor must therefore be converted to the vertical height H_Laser_vertical using the carrier's horizontal attitude angles:
H_Laser_vertical = H_Laser * cos(theta) * cos(gamma)    (2)
where theta is the carrier pitch angle and gamma is the carrier roll angle, both in radians;
(3) Calculate the fused height H
Using the barometric altitude H_Baro output by the barometric altimeter together with the results of equations (1) and (2), the fused height H used as the measurement height for navigation data fusion is calculated as:
H = H_Laser_vertical * W + (1 - W) * (H_Baro - H_Baro0)    (3)
In equation (3), W is a weight coefficient in the range 0-1. The laser ranging sensor generally outputs a health parameter in the range 0-1: when the health parameter is low, W is set to 0; the higher the health parameter, the larger W. In practice the health parameter is determined from the noise variance of the laser-measured distance values output by the sensor and from the count of out-of-range distance values.
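The height-fusion steps (1)-(3) can be sketched as follows; the function name and argument order are illustrative, not taken from the patent.

```python
import math

def fusion_height(h_baro, h_baro_t0, h_laser_t0, h_laser, theta, gamma, w):
    """Variable-weight fusion of barometric and laser ranging heights,
    following equations (1)-(3). theta and gamma are the carrier pitch
    and roll angles in radians; w in [0, 1] is derived from the laser
    rangefinder's health parameter (w = 0 when the sensor is unhealthy)."""
    h_baro0 = h_baro_t0 - h_laser_t0                                # (1)
    h_laser_vertical = h_laser * math.cos(theta) * math.cos(gamma)  # (2)
    return w * h_laser_vertical + (1.0 - w) * (h_baro - h_baro0)    # (3)
```

With w = 1 the fused height is the tilt-corrected laser height; with w = 0 it degrades to the change in barometric altitude since the start of navigation.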
Further, the angular-motion compensation and physical scale conversion applied in step S5 to the optical flow sensor output, i.e. the pixel displacement along the u and v axes of the pixel coordinate system, proceeds as follows:
In this algorithm the optical flow sensor is mounted at the bottom of the carrier; its camera coordinate system, the IMU coordinate system and the magnetometer coordinate system are aligned with the right-front-up carrier coordinate system, the u axis of the pixel coordinate system is parallel to the right axis of the carrier frame, and the v axis is parallel to the front axis. The optical flow sensor directly outputs the u-axis pixel displacement OpFlowX and the v-axis pixel displacement OpFlowY in the pixel coordinate system. These contain both the pixel displacement produced by linear motion and that produced by angular motion; to extract the linear-motion information, the pixel displacement produced by the carrier's angular motion must be compensated:
In equation (4), OpFlowX_displacement and OpFlowY_displacement are the pixel displacements produced by linear motion; gamma_last and theta_last are the carrier roll and pitch angles, in radians, at the output time of the last valid optical flow data; K is the sensor's angular-motion compensation parameter, a factory value given in the optical flow sensor manual that can also be obtained by an angular-motion test of the sensor at a fixed height.
After the linear-motion pixel displacement of equation (4) is obtained, it is converted into a camera-frame displacement in metres:
In equation (5), OpFlowP_x and OpFlowP_y are the right and forward displacements, in metres, of the optical flow sensor in the camera coordinate system between two frames of valid optical flow data; since the camera frame coincides with the carrier frame, they are also the displacements of the optical flow sensor in the carrier frame. resolution is the resolution of the optical flow sensor, obtained from the sensor manual, or by calibration against the GPS velocity under linear motion, against the velocity obtained by integrating the accelerometer output, or by reciprocating linear motion over a fixed distance. P_U is the fused height produced by the integrated navigation data.
After the metre-unit displacement of equation (5) in the carrier coordinate system is obtained, the velocity of the optical flow sensor in the carrier frame, in metres per second, is calculated:
In equation (6), OpFlowV_x and OpFlowV_y are the velocities of the optical flow sensor along the right and forward axes of the carrier coordinate system; t_now is the output time, in seconds, of the current optical flow frame, and t_last is the output time, in seconds, of the previous frame.
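The optical flow processing of step S5 can be sketched as below. Equations (4)-(6) are not reproduced in the text above, so the exact form of the angular-motion compensation (in particular its sign convention) and the pixels-to-metres scaling by P_U / resolution are assumptions consistent with the variable descriptions; the function and argument names are illustrative.

```python
import math  # kept for parity with the surrounding sketches; not strictly needed here

def flow_to_body_motion(opflow_x, opflow_y, gamma, gamma_last, theta, theta_last,
                        k, resolution, p_u, t_now, t_last):
    """Sketch of steps (4)-(6): angular-motion compensation, physical scale
    conversion and velocity calculation for the optical flow output."""
    # (4) remove the pixel displacement induced by roll/pitch change (assumed sign)
    dx = opflow_x - k * (gamma - gamma_last)
    dy = opflow_y - k * (theta - theta_last)
    # (5) pixels -> metres, scaled by the fused height p_u and the sensor resolution
    p_x = dx * p_u / resolution
    p_y = dy * p_u / resolution
    # (6) carrier-frame right/forward velocity over the valid-frame interval
    dt = t_now - t_last
    return p_x, p_y, p_x / dt, p_y / dt
```

With no attitude change the compensation term vanishes and the output reduces to the pure scale conversion and time differentiation described in the text.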
Further, in step S6, when the navigation data are fused, the magnetometer data are processed to obtain the normalized geomagnetic vector [m_E m_N m_U]^T in the local geographic (ENU) coordinate system, as follows:
The normalized raw magnetometer output [m_x m_y m_z]^T in the carrier coordinate system is converted into the geographic coordinate system:
where [m_E1 m_N1 m_U1]^T is the geomagnetic vector obtained by direct conversion with the attitude matrix into the geographic frame, and the attitude matrix, formed from the quaternion, maps the carrier coordinate system to the navigation coordinate system.
Using the geographic-frame geomagnetic vector [m_E1 m_N1 m_U1]^T of equation (7), the geomagnetic vector [0 m_N2 m_U2]^T is reconstructed in the magnetic ENU coordinate system:
The magnetic-ENU geomagnetic vector [0 m_N2 m_U2]^T of equation (8) is then converted, by compensating the magnetic declination, into the geomagnetic vector [m_E m_N m_U]^T in the geographic coordinate system:
where Mag_dec is the magnetic declination in radians, obtained by lookup from the latitude and longitude.
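The magnetometer processing of equations (7)-(9) can be sketched as follows. The reconstruction in (8), taking the horizontal field magnitude as the magnetic-north component, and the direction of the declination rotation are assumptions based on the descriptions above, since the equations themselves are not reproduced; all names are illustrative.

```python
import math
import numpy as np

def magnetometer_reference(m_body, c_b2n, mag_dec):
    """Sketch of equations (7)-(9): rotate the normalized body-frame
    magnetometer reading into the geographic frame, rebuild it in the
    magnetic ENU frame (zero east component), then compensate the
    declination mag_dec (radians) to return [m_E, m_N, m_U]."""
    m = np.asarray(m_body, dtype=float)
    m = m / np.linalg.norm(m)        # normalize the raw magnetometer output
    m_e1, m_n1, m_u1 = c_b2n @ m     # (7) attitude-matrix conversion to ENU
    m_n2 = math.hypot(m_e1, m_n1)    # (8) horizontal field along magnetic north
    m_u2 = m_u1
    # (9) rotate the magnetic-north vector by the declination (assumed direction)
    return np.array([m_n2 * math.sin(mag_dec), m_n2 * math.cos(mag_dec), m_u2])
```

The reconstruction discards the horizontal direction measured by the disturbed sensor and keeps only field magnitudes, which is what makes the result usable as a reference vector in the measurement model.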
The integrated navigation algorithm fuses the optical flow sensor position and velocity with an EKF based on the station-center coordinate system. Its state equation is:
where X is the system state and W is the system noise. The state X comprises the 3-dimensional position [P_x P_y P_z]^T, the 3-dimensional velocity [V_x V_y V_z]^T, the 4-dimensional attitude quaternion, the 3-dimensional gyroscope bias and the 3-dimensional accelerometer bias, 16 dimensions in total. The system noise W comprises the 3-dimensional gyroscope white noise and the 3-dimensional accelerometer white noise, 6 dimensions in total. The state differential equation involves the quaternion multiplication matrix, the acceleration output by the accelerometer in the IMU and the angular velocity output by the gyroscope in the IMU; the one-step state prediction is solved from the state differential equation (10).
The combined navigation algorithm measurement equation is as follows:
where Z denotes the measurement, h(X) the measurement prediction, and V the measurement noise;
In equation (11), the measurement noise V comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalized magnetometer measurement noise and the 1-dimensional fused-height measurement noise.
In equation (11), Z represents the 8-dimensional measurement, which can be expressed by equation (12):
In equation (12), the measurement comprises the 2-dimensional right and forward carrier-frame displacements OpFlowP_x and OpFlowP_y output by the optical flow sensor, the 2-dimensional right and forward carrier-frame velocities OpFlowV_x and OpFlowV_y output by the optical flow sensor, the 3-dimensional normalized magnetometer measurement, and H, the 1-dimensional fused height of the barometric and laser ranging heights.
In equation (11), h(X) is the nonlinear function relating the measurement to the state; it can be expressed as:
The terms of equation (13) are calculated from equations (14) and (15):
Here the ENU position in the state is converted into the right and forward displacement of the carrier frame, i.e. the components P_x and P_y of the position [P_x P_y P_z]^T in equation (14); the ENU velocity in the state is converted into the right and forward velocity of the carrier frame, i.e. the components V_x and V_y of the velocity [V_x V_y V_z]^T in equation (15); P_U is the height in the state; the ENU position estimate of the navigation data fusion at the time the last optical flow data were valid serves as the displacement reference; and the normalized geomagnetic vector in the local geographic (ENU) coordinate system enters the magnetometer terms. Substituting the one-step state prediction into equation (13) yields the measurement prediction.
For equations (10) and (13), the state transition matrix Φ, the system noise driving matrix Γ and the measurement matrix H are obtained by computing Jacobian matrices.
State transition matrix Φ calculation:
System noise driving matrix Γ calculation:
In equations (16) and (17), T is the navigation period and I is the identity matrix;
Measurement matrix H calculation:
The one-step prediction of the state is obtained by solving the differential equation:
where equation (19) is initialized with the state estimate of the previous navigation period;
Data fusion can then be completed with the extended Kalman filter, solving for the carrier's position, velocity and attitude.
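A generic EKF predict/update cycle of the kind described above can be sketched as follows. For brevity the state prediction is written as the linear product Φx, whereas the patent obtains it by integrating the nonlinear differential equation (10) as in (19); all matrices are caller-supplied and all names are illustrative.

```python
import numpy as np

def ekf_step(x, P, phi, gamma_m, Q, z, h_fun, H, R):
    """One extended Kalman filter cycle: Phi/Gamma as in (16)-(17),
    measurement matrix H as in (18), nonlinear measurement h_fun as in (13)."""
    # time update (prediction)
    x_pred = phi @ x
    P_pred = phi @ P @ phi.T + gamma_m @ Q @ gamma_m.T
    # measurement update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - h_fun(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the algorithm above, z stacks the 8-dimensional measurement of equation (12) and x the 16-dimensional state, with the update run whenever valid optical flow data arrive.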
The beneficial effects of the invention are as follows: the invention provides an integrated navigation algorithm that fuses optical flow position and velocity information, using the position and velocity output by an optical flow sensor together with MEMS IMU, magnetometer, barometric altimeter and laser ranging sensor data, completing the data fusion with an extended Kalman filter and solving for the carrier's position, velocity and attitude. The algorithm adopts the station-center coordinate system as the navigation frame and computes the carrier's position relative to the initial point in real time. It enables navigation and positioning of the carrier under GNSS-denied conditions, provides accurate heading, attitude and velocity data to the carrier, and effectively slows the drift of pure strapdown inertial positioning errors.
Drawings
The invention is further illustrated by the following figures and examples.
FIG. 1 is a schematic block diagram of the algorithm of the present invention.
Fig. 2 is a flow chart of the algorithm of the present invention.
Detailed Description
The present invention will now be described in detail with reference to the accompanying drawings. The figures are simplified schematics that illustrate the basic structure of the invention, and therefore show only the parts relevant to it.
As shown in Fig. 1, the integrated navigation system according to the invention includes an optical flow sensor, a MEMS IMU (hereinafter IMU), a magnetometer, a barometric altimeter and a laser ranging sensor mounted on a carrier, where the camera coordinate system of the optical flow sensor, the IMU coordinate system and the magnetometer coordinate system coincide with the right-front-up carrier coordinate system. Specifically:
The IMU comprises a three-axis orthogonal gyroscope and a three-axis orthogonal accelerometer, measuring angular velocity and acceleration (specific force) respectively; the magnetometer is a three-axis orthogonal magnetometer used for geomagnetic measurement; the barometric altimeter measures barometric altitude; the optical flow sensor measures the pixel displacement between two consecutive valid image frames in the pixel coordinate system; the laser ranging sensor measures the one-dimensional distance between the sensor and the reflection point.
As shown in Fig. 2, the integrated navigation algorithm fusing optical flow position and velocity information comprises the following steps:
S1: the navigation computer reads the data of the optical flow sensor, IMU, magnetometer, barometric altimeter and laser ranging sensor mounted on the carrier: from the optical flow sensor, the pixel displacement between two valid image frames along the u and v axes of the pixel coordinate system; from the IMU, angular velocity and acceleration (specific force); from the magnetometer, geomagnetic intensity; from the barometric altimeter, barometric altitude; from the laser ranging sensor, laser ranging altitude;
S2: fuse the barometric altitude and laser ranging altitude acquired in step S1 with a variable-weight method to obtain a fused height, which serves as the height measurement for the extended Kalman filter (EKF) data fusion;
S3: the navigation algorithm checks the optical flow sensor's data-update flag; if the data have not been updated, go to step S4; if they have, go to step S5;
S4: when the optical flow sensor data are not updated, the navigation algorithm performs pure strapdown inertial recursion and solves for the carrier's position, velocity and attitude;
S5: when the optical flow sensor data are updated, apply angular-motion compensation to the two-dimensional pixel displacement output by the sensor to obtain the pixel displacement due to linear motion; convert it to a physical scale using the camera resolution and the physical distance between the camera and the imaged plane, yielding right and forward displacement in metres in the camera coordinate system; compute the right and forward velocities from the interval between the two valid optical flow outputs; since installation guarantees that the optical flow sensor frame coincides with the carrier frame, these camera-frame displacements and velocities are also the carrier's right and forward displacements and velocities in the carrier coordinate system;
S6: using the result of step S5, fuse the carrier-frame position (i.e. displacement) and velocity computed from the optical flow sensor data with the IMU data, magnetometer data and fused height through the extended Kalman filter, and solve for the carrier's position, velocity and attitude.
In step S2, the fused height is calculated from the barometric altitude and the laser ranging altitude and used as the measurement information for the extended Kalman filter (EKF) data fusion. The process of fusing the two heights is as follows:
(1) Determine the initial height H_Baro0
At the initial navigation time, the barometric altitude output by the barometric altimeter is recorded as the initial barometric altitude H_Baro_T0, and the laser ranging height output by the laser ranging sensor is recorded as the initial laser ranging height H_Laser_T0. The initial height H_Baro0, used for computing the variation of the barometric altitude, is calculated neglecting the non-verticality of the laser ranging height caused by the carrier's horizontal attitude angles at the initial time:
H_Baro0 = H_Baro_T0 - H_Laser_T0    (1)
(2) Determine the vertical height H_Laser_vertical of the laser ranging sensor
Since the height used in the navigation algorithm is a vertical height, and the measuring direction of the laser ranging sensor is parallel to the Z axis of the carrier frame, the sensor measures a slant height whenever the carrier's horizontal attitude angles are nonzero. The laser ranging height H_Laser output by the sensor must therefore be converted to the vertical height H_Laser_vertical using the carrier's horizontal attitude angles:
H_Laser_vertical = H_Laser * cos(theta) * cos(gamma)    (2)
where theta is the carrier pitch angle and gamma is the carrier roll angle, both in radians;
(3) Calculate the fused height H
Using the barometric altitude H_Baro output by the barometric altimeter together with the results of equations (1) and (2), the fused height H used as the measurement height for navigation data fusion is calculated as:
H = H_Laser_vertical * W + (1 - W) * (H_Baro - H_Baro0)    (3)
In equation (3), W is a weight coefficient in the range 0-1. The laser ranging sensor generally outputs a health parameter in the range 0-1: when the health parameter is low, W is set to 0; the higher the health parameter, the larger W. In practice the health parameter is determined from the noise variance of the laser-measured distance values output by the sensor and from the count of out-of-range distance values.
Further, the process of performing angular motion compensation and physical scale conversion on the optical flow sensor data output by the optical flow sensor, i.e. the pixel displacement in the u-axis direction and the v-axis direction of the pixel coordinate system in step S5 is specifically as follows:
in the algorithm, an optical flow sensor is installed at the bottom, a camera coordinate system, an IMU coordinate system and a magnetometer coordinate system of the optical flow sensor are consistent with a carrier coordinate system of front right and upper, a u axis of a pixel coordinate system is parallel to a right axis of the carrier coordinate system, and a v axis of the pixel coordinate system is parallel to a front axis of the carrier coordinate system; the optical flow sensor directly outputs u-axis pixel displacement OpFlowX and v-axis pixel displacement OpFlowY under a pixel coordinate system, the u-axis pixel displacement OpFlowX and the v-axis pixel displacement OpFlowY comprise pixel displacement generated by linear motion and pixel displacement generated by angular motion, and in order to extract line motion information, the pixel displacement generated by angular motion of a carrier needs to be compensated:
in the formula (4), OpFlowX _ displacement and OpFlowY _ displacement are pixel displacements generated by line motion; gamma raylast、θlastThe roll angle and the pitch angle of the carrier are in radian unit when the last effective optical flow sensor data is output; k is an optical flow sensor angular motion compensation parameter, the value of K is a sensor factory parameter given in an optical flow sensor manual, and the optical flow sensor can be obtained by only performing angular motion test at a fixed height.
After the pixel displacement generated by linear motion is obtained from equation (4), it is converted into a displacement in the camera coordinate system, in meters:
In formula (5), OpFlowP_x and OpFlowP_y are the rightward and forward displacements, in meters, of the optical flow sensor in the camera coordinate system between two frames of valid optical flow sensor data; because the camera coordinate system coincides with the carrier coordinate system, they are also the displacements in the carrier coordinate system. Resolution is the resolution of the optical flow sensor; it can be obtained from the optical flow sensor manual, or calibrated against the GPS velocity during linear motion, against the velocity obtained by integrating the accelerometer output, or by reciprocating linear motion over a fixed distance. P_U is the height obtained by the integrated navigation data fusion.
After the displacement of the optical flow sensor in the carrier coordinate system, in meters, is obtained from formula (5), its velocity in the carrier system, in meters per second, is calculated:
in the formula (6), OpFlowV_x and OpFlowV_y are the velocities of the optical flow sensor in the rightward and forward directions of the carrier coordinate system; t_now is the output time, in seconds, of the current optical flow sensor frame; t_last is the output time, in seconds, of the previous optical flow sensor frame.
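The scale conversion of formula (5) and the velocity calculation of formula (6) can be sketched together. Formula (5) is an image in the source; the assumed relation is the usual one for a downward-looking flow sensor, with metric displacement scaling linearly with the fused height P_U and inversely with the sensor Resolution.

```python
def pixels_to_carrier(dx_pix, dy_pix, height, resolution, t_now, t_last):
    """Sketch of formulas (5) and (6): compensated pixel displacement ->
    carrier-system displacement (m) and velocity (m/s). `height` is the
    fused height P_U; `resolution` plays the role of Resolution in (5)."""
    p_x = dx_pix * height / resolution   # rightward displacement, meters
    p_y = dy_pix * height / resolution   # forward displacement, meters
    dt = t_now - t_last                  # interval between valid frames
    return p_x, p_y, p_x / dt, p_y / dt  # formula (6): OpFlowV_x, OpFlowV_y
```

Because the camera and carrier frames coincide here, the outputs feed directly into the carrier-system measurement of formula (12).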
Further, in step S6, when the navigation data are fused, the normalized geomagnetic vector [m_E m_N m_U]^T in the local geographic coordinate system (East-North-Up) is calculated from the magnetometer data as follows:
The normalized magnetometer output [m_x m_y m_z]^T in the carrier coordinate system is converted to the geographic coordinate system:
wherein [m_E1 m_N1 m_U1]^T is the geomagnetic vector in the geographic coordinate system obtained by direct conversion with the attitude matrix, and the conversion matrix is the attitude matrix from the carrier coordinate system to the navigation coordinate system, formed from the quaternion.
Using the geomagnetic vector [m_E1 m_N1 m_U1]^T in the geographic coordinate system from formula (7), the geomagnetic vector [0 m_N2 m_U2]^T in the magnetic East-North-Up coordinate system is reconstructed:
The geomagnetic vector [0 m_N2 m_U2]^T in the magnetic East-North-Up coordinate system from formula (8) is then converted to the geomagnetic vector [m_E m_N m_U]^T in the geographic coordinate system by compensating the magnetic declination:
Here Mag_dec is the magnetic declination in radians, obtained by longitude and latitude lookup.
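Formulas (7)-(9) can be sketched as follows. Formulas (7)-(9) are images in the source; the reconstruction step assumes the horizontal magnitude is preserved when forming [0, m_N2, m_U2]^T (the standard choice), and the declination rotation about the Up axis is written under the east-positive convention.

```python
import numpy as np

def geomag_measurement(m_body, c_b2n, mag_dec):
    """Sketch of formulas (7)-(9): rotate the normalized body-frame
    magnetometer vector into the geographic (ENU) frame with the
    attitude matrix c_b2n, rebuild it in the magnetic-ENU frame as
    [0, m_N2, m_U2], then compensate the declination Mag_dec (radians)."""
    # Formula (7): carrier frame -> geographic frame.
    m_e1, m_n1, m_u1 = c_b2n @ (m_body / np.linalg.norm(m_body))
    # Formula (8): magnetic-ENU vector with zero east component;
    # horizontal magnitude assumed preserved.
    m_n2 = np.hypot(m_e1, m_n1)
    m_u2 = m_u1
    # Formula (9): rotate by the declination about the Up axis.
    m_e = m_n2 * np.sin(mag_dec)
    m_n = m_n2 * np.cos(mag_dec)
    return np.array([m_e, m_n, m_u2])
```

The result is the normalized geomagnetic vector [m_E m_N m_U]^T used as the 3-dimensional magnetometer measurement in formula (12).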
Based on the station-center coordinate system, an integrated navigation algorithm that fuses the optical flow sensor position and velocity with an EKF is adopted; its state equation is:
wherein the state vector comprises the 3-dimensional position, the 3-dimensional velocity, the 4-dimensional attitude quaternion, the 3-dimensional gyroscope zero bias, and the 3-dimensional accelerometer zero bias, 16 dimensions in total; the system noise comprises the 3-dimensional gyroscope white noise and the 3-dimensional accelerometer white noise, 6 dimensions in total. The state differential equation involves the quaternion multiplication matrix, the acceleration output by the accelerometer in the IMU, and the angular velocity output by the gyroscope in the IMU. The one-step state prediction is solved from the state differential equation (10).
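One Euler step of the strapdown recursion implied by the state differential equation (10) can be sketched as follows. Equation (10) itself is an image in the source, so this is the standard station-center-frame (ENU) form under a small-step assumption, with constant-bias models; it also corresponds to the pure inertial recursion of step S4.

```python
import numpy as np

def propagate_state(p, v, q, bg, ba, gyro, accel, dt, g=9.80665):
    """One Euler step of a standard strapdown state equation.
    State: position p, velocity v (ENU), quaternion q (w, x, y, z),
    gyro bias bg, accel bias ba; biases are modeled as constant."""
    w = gyro - bg                       # bias-corrected angular rate
    f = accel - ba                      # bias-corrected specific force
    # Quaternion kinematics: q_dot = 0.5 * q (x) [0, w]
    qw, qx, qy, qz = q
    q_dot = 0.5 * np.array([
        -qx * w[0] - qy * w[1] - qz * w[2],
         qw * w[0] + qy * w[2] - qz * w[1],
         qw * w[1] - qx * w[2] + qz * w[0],
         qw * w[2] + qx * w[1] - qy * w[0]])
    # Rotate specific force into ENU and remove gravity.
    r = np.array([
        [1 - 2 * (qy * qy + qz * qz), 2 * (qx * qy - qw * qz),     2 * (qx * qz + qw * qy)],
        [2 * (qx * qy + qw * qz),     1 - 2 * (qx * qx + qz * qz), 2 * (qy * qz - qw * qx)],
        [2 * (qx * qz - qw * qy),     2 * (qy * qz + qw * qx),     1 - 2 * (qx * qx + qy * qy)]])
    v_dot = r @ f - np.array([0.0, 0.0, g])
    p_new = p + v * dt
    v_new = v + v_dot * dt
    q_new = q + q_dot * dt
    q_new /= np.linalg.norm(q_new)      # renormalize the quaternion
    return p_new, v_new, q_new, bg, ba
```

For a stationary, level carrier the propagated velocity and attitude remain unchanged, which is a quick sanity check on the gravity handling.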
The combined navigation algorithm measurement equation is as follows:
wherein the left-hand side represents the measurement, the first term on the right is the measurement prediction, and the last term represents the measurement noise;
in the formula (11), the measurement noise comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalized magnetometer output measurement noise, and the 1-dimensional fusion height measurement noise;
in the formula (11), the measurement is 8-dimensional and can be expressed by the following formula (12):
in the formula (12), the 2-dimensional rightward and forward displacements of the carrier system output by the optical flow sensor are OpFlowP_x and OpFlowP_y; the 2-dimensional rightward and forward velocities are OpFlowV_x and OpFlowV_y; the 3-dimensional component is the normalized magnetometer measurement; and H is the 1-dimensional fusion height of the barometric height and the laser ranging height.
In the formula (11), the measurement prediction is a nonlinear function of the state and can be expressed as:
The carrier-system displacement and velocity terms in formula (13) are calculated from equations (14) and (15):
wherein formula (14) converts the East-North-Up position in the state into the rightward and forward displacement of the carrier system, i.e. P_x and P_y of the carrier-system position [P_x P_y P_z]^T; formula (15) converts the East-North-Up velocity in the state into the rightward and forward velocity of the carrier, i.e. V_x and V_y of the carrier velocity [V_x V_y V_z]^T. P_U is the height in the state; the reference East-North-Up position is the fused navigation position estimate at the time the last optical flow data was valid; [m_E m_N m_U]^T is the normalized geomagnetic vector in the local geographic coordinate system (East-North-Up). Substituting the one-step state prediction into formula (13) yields the measurement prediction.
Calculating Jacobian matrixes according to the formula (10) and the formula (13) to obtain a state transition matrix phi, a system noise driving matrix gamma and a measurement matrix H;
Calculation of the state transition matrix Φ:
Calculation of the system noise driving matrix Γ:
in the formulas (16) and (17), T is the navigation period and I is the identity matrix;
Calculation of the measurement matrix H:
The one-step state prediction is obtained by solving the differential equation:
in the formula (19), the initial value is the state estimate from the previous navigation period;
Data fusion can then be completed with the extended Kalman filter, and the position, velocity, and attitude information of the carrier are solved.
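The EKF measurement update that completes the fusion can be sketched generically. The function names `h_fun` and `h_jac` are placeholders standing in for the nonlinear measurement function of formula (13) and its Jacobian H of formula (18); the update step itself is the standard extended Kalman filter form.

```python
import numpy as np

def ekf_update(x_pred, p_pred, z, h_fun, h_jac, r):
    """Generic EKF measurement update used to fuse the 8-dimensional
    measurement of formula (12) (optical-flow displacement and velocity,
    normalized magnetometer vector, fusion height) with the one-step
    state prediction x_pred and its covariance p_pred."""
    h = h_jac(x_pred)                      # measurement matrix H
    s = h @ p_pred @ h.T + r               # innovation covariance
    k = p_pred @ h.T @ np.linalg.inv(s)    # Kalman gain
    x = x_pred + k @ (z - h_fun(x_pred))   # corrected state
    p = (np.eye(len(x_pred)) - k @ h) @ p_pred
    return x, p
```

In the algorithm above, this update runs only when valid optical flow data arrive (step S5/S6); otherwise the filter coasts on the pure inertial recursion of step S4.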
In light of the foregoing description of the preferred embodiments of the invention, it is to be understood that numerous changes and modifications may be made by those skilled in the art without departing from the scope of the invention. The technical scope of the invention is not limited to the contents of the specification and must be determined according to the scope of the claims.
Claims (5)
1. A combined navigation algorithm fusing optical flow position and velocity information, characterized in that the navigation system comprises an optical flow sensor, an IMU, a magnetometer, a barometric altimeter, and a laser ranging sensor, and the algorithm comprises the following steps:
s1: a navigation computer reads the data of the optical flow sensor, IMU, magnetometer, barometric altimeter, and laser ranging sensor installed on the carrier: the pixel displacements between two valid image frames along the u axis and v axis of the pixel coordinate system are read from the optical flow sensor, angular velocity and acceleration data are read from the IMU, geomagnetic intensity data are read from the magnetometer, barometric height data are read from the barometric altimeter, and laser ranging height data are read from the laser ranging sensor;
s2: the barometric height and the laser ranging height in the data obtained in step S1 are fused by a variable-weight method to calculate a fusion height, which is used as the height measurement for the extended Kalman filter data fusion;
s3: the navigation algorithm judges whether the optical flow sensor data have been updated according to the optical flow sensor data update flag; if not, proceed to step S4; if so, proceed to step S5;
s4: when the optical flow sensor data are not updated, the navigation algorithm performs strapdown pure inertial navigation recursion to solve the position, velocity, and attitude information of the carrier;
s5: when the optical flow sensor data are updated, angular motion compensation is applied to the two-dimensional pixel displacement output by the optical flow sensor to obtain the pixel displacement corresponding to linear motion; physical scale conversion is then applied to the linear-motion pixel displacement using the camera resolution and the physical distance between the camera and the imaged plane, converting it into rightward and forward displacements, in meters, in the camera coordinate system; the rightward and forward velocities in the camera coordinate system are calculated from the interval between the outputs of the two valid optical flow sensor data frames;
s6: according to the result of step S5, the carrier-system position and velocity calculated from the optical flow sensor data, the IMU data, the magnetometer data, and the fusion height are fused by the extended Kalman filter, and the position, velocity, and attitude information of the carrier are solved.
2. The combined navigation algorithm fusing optical flow position and velocity information according to claim 1, characterized in that the process in step S2 of fusing the barometric height and the laser ranging height to calculate the fusion height is as follows:
(1) determining the initial height H_Baro0:
At the initial navigation time, the barometric height output by the barometric altimeter is recorded as the initial barometric height H_Baro_T0, and the laser ranging height output by the laser ranging sensor is recorded as the initial laser ranging height H_Laser_T0. The initial height H_Baro0, used for calculating the barometric height variation, is computed with the non-verticality of the laser ranging height caused by the carrier's horizontal attitude angle at the initial navigation time neglected; the calculation formula is:
H_Baro0 = H_Baro_T0 - H_Laser_T0    (1)
(2) determining the vertical height H_Laser_vertical of the laser ranging sensor:
Because the height used in the navigation algorithm is the vertical height, and because the measurement direction of the laser ranging sensor is parallel to the Z axis of the carrier system, the laser ranging sensor measures a slant height when the carrier's horizontal attitude angle is not 0; the laser ranging height H_Laser output by the laser ranging sensor must therefore be converted to the vertical height H_Laser_vertical using the carrier's horizontal attitude angles. The calculation formula is:
H_Laser_vertical = H_Laser * cosθ * cosγ    (2)
where θ is the carrier pitch angle in radians and γ is the carrier roll angle in radians;
(3) calculating the fusion height H:
Based on the barometric height H_Baro output by the barometric altimeter, combined with the results of formula (1) and formula (2), the calculation formula for the fusion height H used as the navigation data fusion height measurement is:
H = H_Laser_vertical * W + (1 - W) * (H_Baro - H_Baro0)    (3)
in the formula (3), W is a weight coefficient with a value range of 0 to 1. The laser ranging sensor generally outputs a health parameter in the range 0 to 1: when the health parameter is low, W is set to 0, and the higher the health parameter, the larger the value of W. In practice, the health parameter of the laser ranging sensor is determined from statistics of the noise variance of the measured distance values output by the sensor and a count of out-of-range distance values.
3. The combined navigation algorithm fusing optical flow position and velocity information according to claim 2, characterized in that the process in step S5 of performing angular motion compensation and physical scale conversion on the data output by the optical flow sensor, i.e. the pixel displacements along the u axis and v axis of the pixel coordinate system, is as follows:
in the algorithm, the optical flow sensor is installed at the bottom of the carrier; its camera coordinate system, the IMU coordinate system, and the magnetometer coordinate system are all aligned with the front-right-up carrier coordinate system; the u axis of the pixel coordinate system is parallel to the right axis of the carrier coordinate system, and the v axis of the pixel coordinate system is parallel to the forward axis of the carrier coordinate system. The optical flow sensor directly outputs the u-axis pixel displacement OpFlowX and the v-axis pixel displacement OpFlowY in the pixel coordinate system; these contain both the pixel displacement generated by linear motion and the pixel displacement generated by angular motion. To extract the linear motion information, the pixel displacement generated by the carrier's angular motion must be compensated:
in the formula (4), OpFlowX_displacement and OpFlowY_displacement are the pixel displacements generated by linear motion; γ_last and θ_last are the roll angle and pitch angle of the carrier, in radians, at the time the last valid optical flow sensor data was output; K is the angular motion compensation parameter of the optical flow sensor;
after the pixel displacement generated by linear motion is obtained from equation (4), it is converted into a displacement in the camera coordinate system, in meters:
in formula (5), OpFlowP_x and OpFlowP_y are the rightward and forward displacements, in meters, of the optical flow sensor in the camera coordinate system between two frames of valid optical flow sensor data; Resolution is the resolution of the optical flow sensor; P_U is the height obtained by the integrated navigation data fusion;
after the displacement of the optical flow sensor in the carrier coordinate system, in meters, is obtained from formula (5), its velocity in the carrier system, in meters per second, is calculated:
in the formula (6), OpFlowV_x and OpFlowV_y are the velocities of the optical flow sensor in the rightward and forward directions of the carrier coordinate system; t_now is the output time, in seconds, of the current optical flow sensor frame; t_last is the output time, in seconds, of the previous optical flow sensor frame.
4. The combined navigation algorithm fusing optical flow position and velocity information according to claim 3, characterized in that in step S6, when the navigation data are fused, the normalized geomagnetic vector [m_E m_N m_U]^T in the local geographic coordinate system is calculated from the magnetometer data as follows:
the normalized magnetometer output [m_x m_y m_z]^T in the carrier coordinate system is converted to the geographic coordinate system:
wherein [m_E1 m_N1 m_U1]^T is the geomagnetic vector in the geographic coordinate system obtained by direct conversion with the attitude matrix, and the conversion matrix is the attitude matrix from the carrier coordinate system to the navigation coordinate system, formed from the quaternion;
using the geomagnetic vector [m_E1 m_N1 m_U1]^T in the geographic coordinate system from formula (7), the geomagnetic vector [0 m_N2 m_U2]^T in the magnetic East-North-Up coordinate system is reconstructed:
The geomagnetic vector [0 m_N2 m_U2]^T in the magnetic East-North-Up coordinate system from formula (8) is then converted to the geomagnetic vector [m_E m_N m_U]^T in the geographic coordinate system by compensating the magnetic declination:
Here Mag_dec is the magnetic declination in radians, obtained by longitude and latitude lookup.
5. The combined navigation algorithm fusing optical flow position and velocity information according to claim 4, characterized in that, based on the station-center coordinate system, an integrated navigation algorithm that fuses the optical flow sensor position and velocity with an EKF is adopted, and its state equation is:
wherein the state vector comprises the 3-dimensional position, the 3-dimensional velocity, the 4-dimensional attitude quaternion, the 3-dimensional gyroscope zero bias, and the 3-dimensional accelerometer zero bias, 16 dimensions in total; the system noise comprises the 3-dimensional gyroscope white noise and the 3-dimensional accelerometer white noise, 6 dimensions in total. The state differential equation involves the quaternion multiplication matrix, the acceleration output by the accelerometer in the IMU, and the angular velocity output by the gyroscope in the IMU; the one-step state prediction is solved from the state differential equation (10);
the combined navigation algorithm measurement equation is as follows:
wherein the left-hand side represents the measurement, the first term on the right is the measurement prediction, and the last term represents the measurement noise;
in the formula (11), the measurement noise comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalized magnetometer output measurement noise, and the 1-dimensional fusion height measurement noise;
in the formula (11), the measurement is 8-dimensional and can be expressed by the following formula (12):
in the formula (12), the 2-dimensional rightward and forward displacements of the carrier system output by the optical flow sensor are OpFlowP_x and OpFlowP_y; the 2-dimensional rightward and forward velocities are OpFlowV_x and OpFlowV_y; the 3-dimensional component is the normalized magnetometer measurement; and H is the 1-dimensional fusion height of the barometric height and the laser ranging height;
in the formula (11), the measurement prediction is a nonlinear function of the state and can be expressed as:
the carrier-system displacement and velocity terms in formula (13) are calculated from equations (14) and (15):
wherein formula (14) converts the East-North-Up position in the state into the rightward and forward displacement of the carrier system, i.e. P_x and P_y of the carrier-system position [P_x P_y P_z]^T; formula (15) converts the East-North-Up velocity in the state into the rightward and forward velocity of the carrier, i.e. V_x and V_y of the carrier velocity [V_x V_y V_z]^T; P_U is the height in the state; the reference East-North-Up position is the fused navigation position estimate at the time the last optical flow data was valid; [m_E m_N m_U]^T is the normalized geomagnetic vector in the local geographic coordinate system; substituting the one-step state prediction into formula (13) yields the measurement prediction;
in the formula (11), the measurement noise comprises the 2-dimensional optical flow displacement measurement noise, the 2-dimensional optical flow velocity measurement noise, the 3-dimensional normalized magnetometer output measurement noise, and the 1-dimensional fusion height measurement noise;
Jacobian matrices are calculated from formula (10) and formula (13) to obtain the state transition matrix Φ, the system noise driving matrix Γ, and the measurement matrix H;
calculation of the state transition matrix Φ:
calculation of the system noise driving matrix Γ:
in the formulas (16) and (17), T is the navigation period and I is the identity matrix;
calculation of the measurement matrix H:
the one-step state prediction is obtained by solving the differential equation:
in the formula (19), the initial value is the state estimate from the previous navigation period;
data fusion can then be completed with the extended Kalman filter, and the position, velocity, and attitude information of the carrier are solved.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910270669.5A CN109916394A (en) | 2019-04-04 | 2019-04-04 | Combined navigation algorithm fusing optical flow position and speed information |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109916394A true CN109916394A (en) | 2019-06-21 |
Family
ID=66968658
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910270669.5A Pending CN109916394A (en) | 2019-04-04 | 2019-04-04 | Combined navigation algorithm fusing optical flow position and speed information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109916394A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110206236A1 (en) * | 2010-02-19 | 2011-08-25 | Center Jr Julian L | Navigation method and aparatus |
CN103344218A (en) * | 2013-06-18 | 2013-10-09 | 桂林理工大学 | System and method for measuring altitude of low-altitude unmanned plane |
US20170212529A1 (en) * | 2013-11-27 | 2017-07-27 | The Trustees Of The University Of Pennsylvania | Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (mav) |
CN106989744A (en) * | 2017-02-24 | 2017-07-28 | 中山大学 | A kind of rotor wing unmanned aerial vehicle autonomic positioning method for merging onboard multi-sensor |
CN107074360A (en) * | 2016-11-22 | 2017-08-18 | 深圳市大疆创新科技有限公司 | Control method, flight controller and the unmanned vehicle of unmanned vehicle |
CN109540126A (en) * | 2018-12-03 | 2019-03-29 | 哈尔滨工业大学 | A kind of inertia visual combination air navigation aid based on optical flow method |
Non-Patent Citations (1)
Title |
---|
YANG Tianyu et al.: "Application of inertial/optical flow/magnetic integrated navigation technology in a quadrotor aircraft", Transducer and Microsystem Technologies *
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110428452B (en) * | 2019-07-11 | 2022-03-25 | 北京达佳互联信息技术有限公司 | Method and device for detecting non-static scene points, electronic equipment and storage medium |
CN110428452A (en) * | 2019-07-11 | 2019-11-08 | 北京达佳互联信息技术有限公司 | Detection method, device, electronic equipment and the storage medium of non-static scene point |
CN110873813A (en) * | 2019-12-02 | 2020-03-10 | 中国人民解放军战略支援部队信息工程大学 | Water flow velocity estimation method, integrated navigation method and device |
CN111445491A (en) * | 2020-03-24 | 2020-07-24 | 山东智翼航空科技有限公司 | Three-neighborhood maximum difference value edge detection narrow lane guidance algorithm for micro unmanned aerial vehicle |
CN111445491B (en) * | 2020-03-24 | 2023-09-15 | 山东智翼航空科技有限公司 | Three-neighborhood maximum difference edge detection narrow channel guiding method for miniature unmanned aerial vehicle |
CN112284380A (en) * | 2020-09-23 | 2021-01-29 | 深圳市富临通实业股份有限公司 | Nonlinear estimation method and system based on fusion of optical flow and IMU (inertial measurement Unit) |
CN112254721A (en) * | 2020-11-06 | 2021-01-22 | 南京大学 | Attitude positioning method based on optical flow camera |
CN112923924A (en) * | 2021-02-01 | 2021-06-08 | 杭州电子科技大学 | Method and system for monitoring attitude and position of anchored ship |
CN114216454A (en) * | 2021-10-27 | 2022-03-22 | 湖北航天飞行器研究所 | Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS rejection environment |
CN114216454B (en) * | 2021-10-27 | 2023-09-08 | 湖北航天飞行器研究所 | Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS refusing environment |
CN114018241A (en) * | 2021-11-03 | 2022-02-08 | 广州昂宝电子有限公司 | Positioning method and device for unmanned aerial vehicle |
CN114018241B (en) * | 2021-11-03 | 2023-12-26 | 广州昂宝电子有限公司 | Positioning method and device for unmanned aerial vehicle |
CN114184194A (en) * | 2021-11-30 | 2022-03-15 | 中国电子科技集团公司第二十九研究所 | Unmanned aerial vehicle autonomous navigation positioning method in rejection environment |
CN115435779A (en) * | 2022-08-17 | 2022-12-06 | 南京航空航天大学 | Intelligent body pose estimation method based on GNSS/IMU/optical flow information fusion |
CN118050814A (en) * | 2024-04-16 | 2024-05-17 | 山东省地质矿产勘查开发局第五地质大队(山东省第五地质矿产勘查院) | Low-altitude optical attitude determination three-component magnetic measurement system and method for measuring magnetic field vector data |
CN118050814B (en) * | 2024-04-16 | 2024-07-12 | 山东省地质矿产勘查开发局第五地质大队(山东省第五地质矿产勘查院) | Low-altitude optical attitude determination three-component magnetic measurement system and method for measuring magnetic field vector data |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109916394A (en) | Combined navigation algorithm fusing optical flow position and speed information | |
CN109931926B (en) | Unmanned aerial vehicle seamless autonomous navigation method based on station-core coordinate system | |
CN107655476B (en) | Pedestrian high-precision foot navigation method based on multi-information fusion compensation | |
CN102575933B (en) | System that generates map image integration database and program that generates map image integration database | |
US10337884B2 (en) | Method and apparatus for fast magnetometer calibration | |
US7805244B2 (en) | Attitude correction apparatus and method for inertial navigation system using camera-type solar sensor | |
CN107588769B (en) | Vehicle-mounted strapdown inertial navigation, odometer and altimeter integrated navigation method | |
US20110153198A1 (en) | Method for the display of navigation instructions using an augmented-reality concept | |
US20120078510A1 (en) | Camera and inertial measurement unit integration with navigation data feedback for feature tracking | |
CN109186597B (en) | Positioning method of indoor wheeled robot based on double MEMS-IMU | |
KR100558367B1 (en) | System and method for making digital map using gps and ins | |
US11408735B2 (en) | Positioning system and positioning method | |
CN112432642B (en) | Gravity beacon and inertial navigation fusion positioning method and system | |
CN111025366B (en) | Grid SLAM navigation system and method based on INS and GNSS | |
CN106705966A (en) | Stable platform system capable of realizing high-precision absolute position and posture measurement | |
CN112197765B (en) | Method for realizing fine navigation of underwater robot | |
CN114396943B (en) | Fusion positioning method and terminal | |
CN109725339A (en) | A kind of tightly coupled automatic Pilot cognitive method and system | |
JP4986883B2 (en) | Orientation device, orientation method and orientation program | |
CN114719843A (en) | High-precision positioning method in complex environment | |
CN115790613B (en) | Visual information-assisted inertial/odometer combined navigation method and device | |
Park et al. | Implementation of vehicle navigation system using GNSS, INS, odometer and barometer | |
CN109084755B (en) | Accelerometer zero offset estimation method based on gravity apparent velocity and parameter identification | |
CN115356965B (en) | Loose coupling real-package data acquisition device and data processing method | |
CN115790585A (en) | Visual-aided gait feature constraint pedestrian navigation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
AD01 | Patent right deemed abandoned | Effective date of abandoning: 20230609 ||