CN117075158A - Pose estimation method and system of unmanned deformation motion platform based on laser radar
- Publication number: CN117075158A
- Application number: CN202311065030.6A
- Authority: CN (China)
- Prior art keywords: pose, laser radar, positioning information, point cloud, laser
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/40—Correcting position, velocity or attitude
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position whereby the further system is an inertial position system, e.g. loosely-coupled
- G01S19/53—Determining attitude
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/16—Navigation by integrating acceleration or speed, i.e. inertial navigation
- G01C21/1652—Inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
Abstract
A pose estimation method and system of an unmanned deformation motion platform based on laser radar, relating to the technical field of laser technology applications. The method includes: collecting laser point cloud data of a scene; obtaining a laser radar estimated pose based on the laser point cloud data; acquiring wheel odometer data, IMU pose information, GPS positioning information, and RTK positioning information; fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result. By fusing data across these five dimensions, the method significantly improves the accuracy of pose estimation and scene construction.
Description
Technical Field
The application relates to the technical field of laser technology application.
Background
In recent years, autonomous unmanned systems, typified by unmanned vehicles, have become one of the major development directions in the high-tech field, involving many research areas including automation, computing, photoelectric detection, and hardware control. To complete its tasks, an unmanned vehicle generally needs multiple capabilities, such as sensing the surrounding environment, executing control commands, and analyzing and planning in real time; it is a comprehensive detection system. During actual motion, the unmanned vehicle needs at least four capabilities: locating its own position in real time, constructing a map of its surroundings, planning a path, and controlling its motion. Among these, constructing the surrounding map, i.e., three-dimensional reconstruction of the surroundings, is the most important step.
Scene construction requires estimating the pose of environmental objects to form a three-dimensional scene map. For common scenarios there are two main approaches: the first constructs the scene from surrounding information obtained by a laser radar, and the second reconstructs it by correcting map information with a vision camera. Laser reconstruction is a mature technology whose advantages include high reliability, high precision, no accumulated error, and intuitive map construction, but it is limited by the detection range of the radar and carries structural constraints. Visual reconstruction offers flexible installation, a simple structure, and no sensor detection-distance limitation, but suffers from accumulated error.
Therefore, how to provide a pose estimation method of an unmanned deformation motion platform with smaller error so as to construct a more accurate three-dimensional scene becomes a technical problem to be solved in the field.
Disclosure of Invention
In order to solve the technical problems, the application provides a pose estimation method and a pose estimation system for an unmanned deformation motion platform based on a laser radar.
Based on the same inventive concept, the application has five independent technical schemes:
1. A pose estimation method of an unmanned deformation motion platform based on laser radar, comprising the following steps:
collecting laser point cloud data of a scene;
obtaining a laser radar estimated pose based on the laser point cloud data;
acquiring wheel odometer data, IMU pose information, GPS positioning information and RTK positioning information;
and fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result.
Further, obtaining the laser radar estimated pose based on the laser point cloud data includes:
preprocessing the laser point cloud data;
extracting feature points from the laser point cloud data, and filtering out discrete feature points according to a preset coordinate threshold;
registering the feature points in the laser point cloud data with a previously obtained world point cloud map to perform point cloud odometry (range measurement), and updating the odometry result at 10 Hz;
and performing map construction based on the odometry result, updating the map construction result at 1 Hz, and fusing it with the 10 Hz odometry updates to obtain the laser radar estimated pose.
Further, a Kalman filtering nonlinear stochastic difference equation is used to estimate the current system state from the previous system state, the current system state being represented by the following formula:
X_k = [x_k; y_k; z_k; θ_k; φ_k; ψ_k]
where X_k is the system state representing the three-dimensional position and attitude of the system; x_k, y_k, z_k are the x-, y-, and z-axis coordinates; θ_k, φ_k, ψ_k are, in turn, the heading angle, pitch angle, and roll angle; and v_k is the measurement noise covariance matrix.
Further, the GPS positioning information and the RTK positioning information are fused, specifically comprising the following steps:
judging whether the RTK positioning information is in the fixed-solution state; if so, adopting an RTK-priority fusion mode, otherwise adopting a GPS-priority fusion mode;
and updating the measurement noise covariance matrix in the Kalman filtering nonlinear stochastic difference equation according to the selected fusion mode, where the adaptive state updating function is expressed as:
K_k = P_k^- H^T (H P_k^- H^T + V (φ v_k) V^T)^(-1)
where K_k is the Kalman gain for the current state (the k in the subscript denotes the k-th state); H and V are Jacobian coefficients; v_k is the measurement noise covariance matrix; R is the process noise covariance matrix; φ is the adaptive parameter matrix derived from the covariances of the RTK and the GPS, φ = λ_kt φ_kt + λ_RG φ_RG, where λ_kt is the auxiliary-sensor weighting parameter, φ_kt is the auxiliary-sensor adaptive parameter matrix, λ_RG is the primary-sensor weighting parameter, and φ_RG is the primary-sensor adaptive parameter matrix; and P_k^- is the covariance matrix relating the previous state and the current state.
Further, after fusing the GPS positioning information and the RTK positioning information, the result is combined with the fusion of the laser radar estimated pose and the wheel odometer, specifically comprising the following steps:
judging whether the error between the laser radar estimated pose and the wheel odometer is within 0.2%; if so, adopting the fusion mode giving priority to the laser radar pose, otherwise adopting the fusion mode giving priority to the wheel odometer;
updating the current measurement in the Kalman filtering nonlinear stochastic difference equation according to the selected fusion mode, and updating the state based on the following formula:
x_kt = x_kt^- + K_kt (z_kt - h(x_kt^-, v_kt))
where x_kt is the current state of the laser radar estimated pose; x_kt^- is the previous-state parameter of the laser radar estimated pose; K_kt is the Kalman gain matrix for the laser radar estimated pose; z_kt is the current measurement of the laser radar estimated pose; h is the measurement equation of the laser radar estimated pose; and v_kt is the measurement noise covariance matrix of the laser radar estimated pose;
and then weighting the fusion result of the laser radar estimated pose and the wheel odometer into the adaptive parameter matrix φ, expressed by the following formula:
φ = λ_kt φ_kt + λ_RG φ_RG
where λ_kt is the auxiliary-sensor weighting parameter; λ_RG is the primary-sensor weighting parameter; φ_RG is the primary-sensor adaptive parameter matrix; and φ_kt is the auxiliary-sensor adaptive parameter matrix, indicating that the current measurement z_kt of the laser radar estimated pose is fused to obtain the final state estimate; P_k^- denotes the covariance matrix relating the previous state and the current state, and H_kt denotes the Jacobian matrix coefficients of the laser radar estimated pose.
2. A laser radar-based pose estimation device of an unmanned deformation motion platform, comprising:
the point cloud acquisition module is used for acquiring laser point cloud data of a scene;
the radar pose estimation module is used for obtaining a laser radar estimated pose based on the laser point cloud data;
the positioning information acquisition module is used for acquiring wheel odometer data, IMU pose information, GPS positioning information and RTK positioning information;
and the fusion pose estimation module is used for fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result.
3. An electronic device comprises a processor and a storage device, wherein a plurality of instructions are stored in the storage device, and the processor is used for reading the plurality of instructions in the storage device and executing the method.
4. A computer readable storage medium storing a computer program, characterized in that the computer program is executed by a processor to implement the above method.
5. The pose estimation system of the unmanned deformation motion platform based on the laser radar comprises a sensor, a controller and the unmanned deformation motion platform, wherein the sensor comprises a laser point cloud data acquisition module and a motion data acquisition module, and the controller is used for controlling the unmanned deformation motion platform to move and executing the pose estimation method.
Further, the laser point cloud data acquisition module is a linear laser radar, and the motion data acquisition module is an inertial measurement unit; the controller comprises an STM32 controller and an X86 architecture core processing board, wherein the STM32 controller is used for controlling the unmanned deformation motion platform to move, and the X86 architecture core processing board is used for executing the pose estimation method.
The pose estimation method and system of the unmanned deformation motion platform based on the laser radar provided by the application at least comprise the following beneficial effects:
(1) The method uses extended Kalman filtering to fuse five dimensions of information: the pose estimated by laser radar point cloud matching, acceleration (IMU) information, GPS positioning information, RTK positioning information, and the odometer. The adaptive, weight-coordinated data fusion order and mode significantly reduce the absolute trajectory error and the relative error of scene reconstruction, and the system's effective calibration and tree-structured multistage buffer filtering provide a solution for high-precision data calibration;
(2) The pose estimation system adopting this method realizes laser radar scene reconstruction, can adapt to various types of motion chassis, combines high-precision positioning capability with comprehensive information fusion, and serves as an important tool in scene reconstruction tasks.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments or the description of the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an embodiment of the pose estimation method of an unmanned deformation motion platform based on laser radar;
FIG. 2 is a schematic diagram of obtaining the laser radar estimated pose from the laser point cloud;
FIG. 3 is a flowchart of the algorithm for fusing the RTK positioning information, GPS positioning information, wheel odometer, and laser radar estimated pose;
FIG. 4 is a schematic diagram of the tree structure in which the wheel odometer and the laser radar estimated pose are pre-fused with the IMU pose information;
FIG. 5 is a block diagram of an embodiment of the unmanned deformation motion platform in the pose estimation system of an unmanned deformation motion platform based on laser radar provided by the application;
FIG. 6 is a schematic structural diagram of a four-wheel off-road chassis in the pose estimation system of an unmanned deformation motion platform based on laser radar;
FIG. 7 is a schematic structural diagram of a tracked off-road chassis;
FIG. 8 is a schematic structural diagram of a quad-rotor chassis;
FIG. 9 is a schematic structural diagram of an embodiment of the pose estimation system of an unmanned deformation motion platform based on laser radar provided by the application;
FIG. 10 is a hardware block diagram of an embodiment of the pose estimation system of an unmanned deformation motion platform based on laser radar provided by the application;
FIG. 11 is an effect diagram of the pose estimation method of an unmanned deformation motion platform based on laser radar fusing the five-dimensional data;
FIG. 12 is the three-dimensional reconstruction point cloud map obtained experimentally by the pose estimation method of an unmanned deformation motion platform based on laser radar.
Detailed Description
In order to better understand the above technical solutions, the following detailed description will be given with reference to the accompanying drawings and specific embodiments.
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application, but the present application may be practiced in other ways other than those described herein, and persons skilled in the art will readily appreciate that the present application is not limited to the specific embodiments disclosed below.
Embodiment one:
Referring to FIG. 1, in some embodiments, a pose estimation method of an unmanned deformation motion platform based on laser radar is provided, comprising:
S1, collecting laser point cloud data of a scene;
S2, obtaining a laser radar estimated pose based on the laser point cloud data;
S3, acquiring wheel odometer data, IMU pose information, GPS positioning information and RTK positioning information;
S4, fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result.
Specifically, in step S2, obtaining the estimated pose of the laser radar based on the laser point cloud data includes:
S21, preprocessing the laser point cloud data;
S22, extracting feature points from the laser point cloud data, and filtering out discrete feature points according to a preset coordinate threshold;
S23, registering the feature points in the laser point cloud data with a previously obtained world point cloud map to perform point cloud odometry (range measurement), and updating the odometry result at 10 Hz;
S24, performing map construction based on the odometry result, updating the map construction result at 1 Hz, and fusing it with the 10 Hz odometry updates to obtain the laser radar estimated pose.
To obtain the laser radar pose estimation information used in the information fusion, the system adopts the LOAM algorithm to reconstruct the three-dimensional point cloud map and generates positioning information using the Hector algorithm. The specific implementation is as follows. First, the point cloud data obtained by laser radar scanning is preprocessed. Second, the system extracts feature points and applies a threshold to filter out erroneous or low-quality feature points. Next, motion is estimated using nonlinear optimization. Finally, the data of the current laser radar scan frame is registered against the previously constructed world point cloud map to obtain the pose estimation information. The software block diagram is shown in FIG. 2; based on the algorithm of FIG. 2, the laser radar estimated pose information is obtained.
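As a minimal sketch of the feature-filtering step (S22), assuming feature points are stored as an N x 3 NumPy array; the threshold values, the neighbor test, and the function name are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np

def filter_discrete_features(points: np.ndarray,
                             coord_threshold: float = 30.0,
                             radius: float = 1.0,
                             min_neighbors: int = 3) -> np.ndarray:
    """Drop feature points beyond a coordinate bound or with too few
    neighbors (treated as discrete/outlier feature points).

    points: (N, 3) array of feature-point coordinates.
    """
    # Reject points whose coordinates exceed the preset threshold.
    in_range = np.all(np.abs(points) <= coord_threshold, axis=1)
    kept = points[in_range]

    # Reject isolated points: count neighbors within `radius` (O(N^2) here,
    # fine for a sketch; a KD-tree would be used in practice).
    dists = np.linalg.norm(kept[:, None, :] - kept[None, :, :], axis=2)
    neighbor_counts = np.sum(dists < radius, axis=1) - 1  # exclude self
    return kept[neighbor_counts >= min_neighbors]

# Example: a random cluster plus two far-away outliers.
cloud = np.vstack([np.random.randn(100, 3), [[100.0, 0, 0], [0, 200.0, 0]]])
print(filter_discrete_features(cloud).shape)
```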
In step S3, the wheel odometer and IMU pose information are generated by inverse kinematic solution, and the GPS positioning information and the RTK positioning information use raw data with covariance.
In step S4, the current system state is estimated from the previous system state using the Kalman filtering nonlinear stochastic difference equation; the current system state is expressed by the following formula:
X_k = [x_k; y_k; z_k; θ_k; φ_k; ψ_k]
where X_k is the system state representing the three-dimensional position and attitude of the system; x_k, y_k, z_k are the x-, y-, and z-axis coordinates; θ_k, φ_k, ψ_k are, in turn, the heading angle, pitch angle, and roll angle; and v_k is the measurement noise covariance matrix.
It should be noted that the breakthrough of this embodiment is to fuse five kinds of data, namely the wheel odometer, the IMU pose information, the laser radar estimated pose, the GPS positioning information, and the RTK positioning information, so as to reduce the absolute trajectory error and the relative error of scene reconstruction. The current system state is estimated from the previous system state using the Kalman filtering nonlinear stochastic difference equation; the extended Kalman filter formulas are as follows:
X_k^- = f(X_{k-1}, u_{k-1}, 0)
P_k^- = A P_{k-1} A^T + W R W^T
K_k = P_k^- H^T (H P_k^- H^T + V v_k V^T)^(-1)
where K_k is the Kalman gain matrix for the current state (the lowercase k in the subscript denotes the k-th state), used to fuse the previous state estimate with the current measurement to obtain an updated state estimate; R is the process noise covariance matrix; v_k is the measurement noise covariance matrix; and A, W, H, V are Jacobian coefficients that vary over time. The current state measurement equation is:
z_k = h(x_k, v_k)
The function h(·) is a nonlinear mapping from the system state to the measurement space; z_k denotes the current measurement, x_k is the current system state, and v_k is the measurement noise. The system state and noise are mapped into the measurement space through the measurement equation and fused using the Kalman gain to obtain the updated state estimate.
If the measurement obtained by laser radar point cloud matching is the position and attitude of the vehicle on a two-dimensional plane (e.g., the x and y coordinates and the heading angle), the measurement equation can be defined as:
z_k = [x_k; y_k; θ_k] + v_k
where [x_k; y_k; θ_k] is the system state representing the position and attitude of the vehicle on the two-dimensional plane, and v_k is the measurement noise.
If the measurements obtained by laser radar point cloud matching are the three-dimensional position and attitude of the vehicle (e.g., the x, y, and z coordinates together with the heading, pitch, and roll angles), the measurement equation may be defined as:
z_k = [x_k; y_k; z_k; θ_k; φ_k; ψ_k] + v_k
where [x_k; y_k; z_k; θ_k; φ_k; ψ_k] is the system state representing the three-dimensional position and attitude of the system.
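To make the two measurement equations concrete, here is a small sketch assuming the state ordering [x, y, z, θ, φ, ψ]; the function names and the ordering are assumptions for illustration:

```python
import numpy as np

def h_2d(x_state: np.ndarray, v: float = 0.0) -> np.ndarray:
    """Planar measurement z_k = [x; y; theta] + v_k."""
    x, y, theta = x_state[0], x_state[1], x_state[3]
    return np.array([x, y, theta]) + v

def h_3d(x_state: np.ndarray, v: float = 0.0) -> np.ndarray:
    """Full-pose measurement z_k = [x; y; z; theta; phi; psi] + v_k."""
    return np.asarray(x_state[:6], dtype=float) + v

state = np.array([1.0, 2.0, 0.5, 0.1, 0.0, 0.0])
print(h_2d(state), h_3d(state))
```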
The specific estimation steps are as follows:
(1) State prediction is performed using the system model (i.e., the state transition function), expressed as follows:
X_k^- = f(X_{k-1}, u_{k-1}, 0);
(2) The covariance matrix is updated as follows:
P_k^- = A P_{k-1} A^T + W (φ R) W^T
where φ is the adaptive parameter matrix derived from the covariances of the RTK and the GPS.
(3) The Kalman gain matrix is calculated by the following formula:
K_k = P_k^- H^T (H P_k^- H^T + V v_k V^T)^(-1)
where v_k is the covariance matrix of the laser radar measurement noise.
(4) The state is updated based on the laser radar measurement z_k and the measurement equation h(·), i.e., X_k = X_k^- + K_k (z_k - h(X_k^-, 0)), and the above steps are repeated until the calculation converges.
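The four steps can be condensed into a sketch of one EKF iteration. This is a generic textbook form consistent with the symbols above, not the patent's own code; the identity motion model, noise values, and measurement z are placeholders:

```python
import numpy as np

def ekf_step(x_est, P, z, f, F, h, H, Q_proc, R_meas):
    """One EKF iteration: (1)-(2) prediction, (3) gain, (4) measurement update."""
    x_prior = f(x_est)                       # (1) state prediction
    P_prior = F @ P @ F.T + Q_proc           # (2) covariance update (Q_proc may
                                             #     carry the adaptive matrix phi)
    S = H @ P_prior @ H.T + R_meas           # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)     # (3) Kalman gain
    x_new = x_prior + K @ (z - h(x_prior))   # (4) update from lidar measurement
    P_new = (np.eye(len(x_est)) - K @ H) @ P_prior
    return x_new, P_new

# Toy run: 6-state pose, identity motion model, direct pose observation.
n = 6
x, P = np.zeros(n), np.eye(n)
F = H = np.eye(n)
f = h = lambda s: s
Q, R = 0.01 * np.eye(n), 0.1 * np.eye(n)
z = np.array([1.0, 2.0, 0.0, 0.1, 0.0, 0.0])  # lidar pose measurement
for _ in range(10):
    x, P = ekf_step(x, P, z, f, F, h, H, Q, R)
print(np.round(x, 3))
```

Repeated iterations drive the estimate toward the measurement at a rate set by the ratio of process to measurement noise.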
The wheel odometer and a pure IMU accumulate large errors over time, yet compared with pure laser radar pose estimation or a pure GPS algorithm they capture richer local detail and refresh at a higher rate. Therefore, extended Kalman filtering is used to fuse these data, yielding pose estimation results with more detailed parameters at both macroscopic and microscopic levels.
For the five kinds of information, this embodiment adopts a stepwise fusion scheme: RTK positioning information and GPS positioning information are grouped into one category; the density of surrounding buildings can be computed from the laser radar point cloud, and if buildings are too dense the system switches to GPS-priority mode, while in open areas it switches to RTK-priority mode. The algorithm flow is shown in FIG. 3.
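One possible realization of the density check counts nearby laser returns as a crude proxy for building density; the radius, threshold, and interface below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def select_gnss_mode(cloud: np.ndarray,
                     radius: float = 30.0,
                     density_threshold: float = 0.5) -> str:
    """Return 'GPS' priority in dense built-up areas, 'RTK' in open areas.

    cloud: (N, 3) point cloud in the platform frame; 'density' here is the
    fraction of returns within `radius` of the sensor in the x-y plane.
    """
    near = np.linalg.norm(cloud[:, :2], axis=1) < radius
    return "GPS" if np.mean(near) > density_threshold else "RTK"
```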
Specifically, fusing the GPS positioning information and the RTK positioning information comprises the following steps:
S41, judging whether the RTK positioning information is in the fixed-solution state; if so, adopting the RTK-priority fusion mode, otherwise adopting the GPS-priority fusion mode;
the RTK has four states, namely, when the RTK is measured, the mobile station receives the differential signal of the base station and then has a single point-differential-floating-fixed process. If the solution is not fixed, the accuracy is poor, and at this time the GPS-preferred mode is switched. In the case of fixed wing mode, the flight speed is often greater than 100m/s, the base station switching speed is very fast, and the RTK technique requires a distance of no more than 20km from the base station, in which case the GPS priority mode is used and the weight of the RTK data becomes low.
S42, updating the measurement noise covariance matrix in the Kalman filtering nonlinear stochastic difference equation according to the selected fusion mode, where the adaptive state updating function is expressed as:
K_k = P_k^- H^T (H P_k^- H^T + V (φ v_k) V^T)^(-1)
where K_k is the Kalman gain for the current state (the k in the subscript denotes the k-th state); H and V are Jacobian coefficients; v_k is the measurement noise covariance matrix; R is the process noise covariance matrix; φ is the adaptive parameter matrix derived from the covariances of the RTK and the GPS; and P_k^- is the covariance matrix relating the previous state and the current state.
S43, judging whether the error between the estimated pose of the laser radar and the wheel type odometer is within 0.2%, if so, adopting a fusion mode with the prioritized pose of the laser radar, otherwise, adopting a fusion mode with the prioritized wheel type odometer;
the path and odometer information obtained by laser radar point cloud matching are classified into one category, and the two are preferentially fused to form a tree structure, as shown in figure 4,
S44, updating the current measurement in the Kalman filtering nonlinear stochastic difference equation according to the selected fusion mode, and updating the state based on the following formula:
x_kt = x_kt^- + K_kt (z_kt - h(x_kt^-, v_kt))
where x_kt is the current state of the laser radar estimated pose; x_kt^- is the previous-state parameter of the laser radar estimated pose; K_kt is the Kalman gain matrix for the laser radar estimated pose; z_kt is the current measurement of the laser radar estimated pose; h is the measurement equation of the laser radar estimated pose; and v_kt is the measurement noise covariance matrix of the laser radar estimated pose;
S45, weighting the fusion result of the laser radar estimated pose and the wheel odometer into the adaptive parameter matrix φ, expressed by the following formula:
φ = λ_kt φ_kt + λ_RG φ_RG
where λ_kt is the auxiliary-sensor weighting parameter; λ_RG is the primary-sensor weighting parameter; φ_RG is the primary-sensor adaptive parameter matrix; and φ_kt is the auxiliary-sensor adaptive parameter matrix, indicating that the current measurement z_kt of the laser radar estimated pose is fused to obtain the final state estimate; P_k^- denotes the covariance matrix relating the previous state and the current state, and H_kt denotes the Jacobian matrix coefficients of the laser radar estimated pose.
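The weighting in S45 is a convex combination of the two adaptive parameter matrices. The following sketch transcribes the formula above directly; the weights and matrix sizes are chosen only for illustration:

```python
import numpy as np

def fuse_adaptive_matrix(phi_kt: np.ndarray, phi_RG: np.ndarray,
                         lam_kt: float, lam_RG: float) -> np.ndarray:
    """phi = lam_kt * phi_kt + lam_RG * phi_RG (auxiliary + primary sensor)."""
    return lam_kt * phi_kt + lam_RG * phi_RG

# Example: weight the primary (RTK/GPS) matrix more heavily.
phi = fuse_adaptive_matrix(np.eye(6), 2.0 * np.eye(6), lam_kt=0.3, lam_RG=0.7)
print(phi[0, 0])  # 0.3*1 + 0.7*2 = 1.7
```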
Finally, the pose route is smoothed using the IMU inertial navigation information.
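Such smoothing can be as simple as blending the fused pose with the high-rate IMU-propagated pose. Below is a minimal complementary-filter sketch; the blend factor is an assumption, and angle wraparound is ignored for brevity:

```python
import numpy as np

def smooth_with_imu(fused_pose: np.ndarray, imu_pose: np.ndarray,
                    alpha: float = 0.9) -> np.ndarray:
    """Complementary blend: trust the fused pose for low-frequency accuracy
    and the IMU propagation for short-term smoothness."""
    return alpha * fused_pose + (1.0 - alpha) * imu_pose
```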
Embodiment two:
In some embodiments, a pose estimation device of an unmanned deformation motion platform based on laser radar is provided, comprising:
the point cloud acquisition module is used for acquiring laser point cloud data of a scene;
the radar pose estimation module is used for obtaining a laser radar estimated pose based on the laser point cloud data;
the positioning information acquisition module is used for acquiring wheel odometer data, IMU pose information, GPS positioning information and RTK positioning information;
and the fusion pose estimation module is used for fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result.
Embodiment III:
In some embodiments, an electronic device is provided that includes a processor and a storage device having a plurality of instructions stored therein, the processor being configured to read the instructions in the storage device and perform the method of embodiment one.
Embodiment four:
In some embodiments, a computer-readable storage medium is provided that stores a computer program which, when executed by a processor, implements the method of embodiment one.
Fifth embodiment:
In some embodiments, a pose estimation system of an unmanned deformation motion platform based on laser radar is provided, comprising a sensor, a controller, and the unmanned deformation motion platform, where the sensor comprises a laser point cloud data acquisition module and a motion data acquisition module, and the controller is used for controlling the unmanned deformation motion platform to move and for executing the pose estimation method of embodiment one.
Preferably, the system is constructed from elements such as an X86 architecture core processing board, a 16-line laser radar, an STM32 controller, and an inertial navigation assembly. The device is simple and convenient to install, small in size, and its motion chassis can be freely replaced, e.g., quadruped bionic, tracked, fixed-wing, and the like; it is suitable for real-time three-dimensional reconstruction of various complex scenes. The laser point cloud data acquisition module is a line-type laser radar, and the motion data acquisition module is an inertial measurement unit; the controller comprises an STM32 controller and an X86 architecture core processing board, where the STM32 controller is used for controlling the unmanned deformation motion platform to move, and the X86 architecture core processing board is used for executing the pose estimation method of embodiment one.
The structure of the unmanned deformation motion platform is shown in FIG. 5, and examples of assembled adaptive chassis are shown in FIGS. 6-8, where FIG. 6 is a four-wheel off-road chassis, FIG. 7 is a tracked off-road chassis, and FIG. 8 is a quad-rotor motion chassis.
The system works in the short-wave infrared band and can incrementally output high-resolution range images and intensity images of a scene in real time; it is mainly used for real-time three-dimensional scene reconstruction. The multi-dimensional information fusion technique can be applied to fields with large accumulated errors, such as three-dimensional scene reconstruction and target detection and recognition.
Referring to FIG. 9, as a preferred embodiment, the system includes a deformable motion chassis module providing a variety of motion chassis options, including quadruped bionic, tracked, fixed-wing, and the like. The system consists of three parts: a sensor, a controller, and a data exchanger. The sensor acquires point cloud data and motion data; the controller completes functions such as motion control, task construction, inverse motion solution, program execution, and data aggregation; and the data exchanger transmits and receives data.
the controller module includes an STM32 controller and a PWM signal generator. The STM32 controller is responsible for motion control of the chassis and encoder data reading. The data processing is completed by an X86 architecture core processing board, which receives the original data from a scene data acquisition module, and the module matches and fuses the point cloud data acquired at different times or positions by using a point cloud registration technology so as to generate an accurate three-dimensional scene model. It is capable of efficiently processing large amounts of data and performing complex computing and analysis operations to extract and reconstruct useful information in a scene.
In the sensor module, the scene data acquisition module consists of a linear laser radar and an inertial measurement unit. The laser radar is used for acquiring scene data, and the inertial measurement unit is used for acquiring GPS, RTK, acceleration and angular velocity information.
The system architecture is highly flexible and can adapt to many different types of motion chassis. Whether quadruped bionic, tracked, or fixed-wing, the system can provide the required functions and control capabilities; this design enables the unmanned chassis system to operate flexibly in different application scenarios and to meet the requirements of a specific motion chassis. The hardware framework is shown in FIG. 10. The RTK inertial navigation module in the sensor transmits inertial navigation coordinates and acceleration to the controller (core processing board), the laser radar in the sensor transmits the three-dimensional point cloud to the controller, and the controller controls the adaptive motion chassis of the unmanned deformation motion platform through four-wheel-drive chassis control, photoelectric sensor data, and a speed-loop PID. The controller is connected to the PC, distributes the point cloud to the PC, and receives historical map data sent by the PC.
The beneficial effects of the embodiments provided by the application are further illustrated by the following experiments:
As shown in FIG. 11, statistics over 3 samples show that the accuracy of pose estimation improves by about 1.32% in the x direction and about 3.98% in the y direction. These results indicate that fusing the wheel odometer, pure IMU, and the other data sources via extended Kalman filtering can significantly improve the accuracy and precision of pose estimation. This comprehensive data processing approach provides richer pose parameters at both macroscopic and microscopic levels.
The three-dimensional point cloud map drawn by the pose estimation method using the laser imaging radar device is shown in FIG. 12: the backgrounds of buildings, roads, and jungle can be clearly distinguished; the front-to-back order of objects and their distances from the system can be distinguished from differences in the range image; and the ranging precision is 0.3 m. The image has clear contours and a relatively high signal-to-noise ratio, complex target segmentation is easy to complete, and the advantages of the three-dimensional range profile in target recognition, tracking, and related tasks are demonstrated.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application. It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Claims (10)
1. The pose estimation method of the unmanned deformation motion platform based on the laser radar is characterized by comprising the following steps of:
collecting laser point cloud data of a scene;
obtaining a laser radar estimated pose based on the laser point cloud data;
acquiring wheel odometer data, IMU pose information, GPS positioning information and RTK positioning information;
and fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result.
2. The method of claim 1, wherein obtaining a lidar estimated pose based on the laser point cloud data comprises:
preprocessing the laser point cloud data;
extracting characteristic points in the laser point cloud data, and filtering discrete characteristic points according to a preset coordinate threshold;
registering the feature points in the laser point cloud data with a previously obtained world point cloud map to perform point cloud odometry (range measurement), and updating the odometry result at 10 Hz;
and performing map construction based on the odometry result, updating the map construction result at 1 Hz, and fusing it with the 10 Hz odometry updates to obtain the laser radar estimated pose.
3. The method of claim 1, wherein the current system state is estimated from the previous system state using the Kalman filtering nonlinear stochastic difference equation, the current system state being represented by the following formula:
X_k = [x_k; y_k; z_k; θ_k; φ_k; ψ_k]
where X_k is the system state representing the three-dimensional position and attitude of the system; x_k, y_k, z_k are the x-, y-, and z-axis coordinates; θ_k, φ_k, ψ_k are, in turn, the heading angle, pitch angle, and roll angle; and v_k is the measurement noise covariance matrix.
4. The method of claim 1, wherein fusing the GPS positioning information and the RTK positioning information comprises:
judging whether the RTK positioning information is in the fixed-solution state; if so, adopting an RTK-priority fusion mode, otherwise adopting a GPS-priority fusion mode;
and updating the measurement noise covariance matrix in the Kalman filtering nonlinear stochastic difference equation according to the selected fusion mode, where the adaptive state updating function is expressed as:
K_k = P_k^- H^T (H P_k^- H^T + V (φ v_k) V^T)^(-1)
where K_k is the Kalman gain for the current state (the k in the subscript denotes the k-th state); H and V are Jacobian coefficients; v_k is the measurement noise covariance matrix; R is the process noise covariance matrix; φ is the adaptive parameter matrix derived from the covariances of the RTK and the GPS; and P_k^- is the covariance matrix relating the previous state and the current state.
5. The method of claim 4, wherein fusing the GPS positioning information and the RTK positioning information and then combining the fusion result of the laser radar estimated pose and the wheel odometer specifically comprises the following steps:
judging whether the error between the laser radar estimated pose and the wheel odometer is within 0.2%; if so, adopting the fusion mode giving priority to the laser radar pose, otherwise adopting the fusion mode giving priority to the wheel odometer;
updating the current measurement in the Kalman filtering nonlinear stochastic difference equation according to the selected fusion mode, and updating the state based on the following formula:
x_kt = x_kt^- + K_kt (z_kt - h(x_kt^-, v_kt))
where x_kt is the current state of the laser radar estimated pose; x_kt^- is the previous-state parameter of the laser radar estimated pose; K_kt is the Kalman gain matrix for the laser radar estimated pose; z_kt is the current measurement of the laser radar estimated pose; h is the measurement equation of the laser radar estimated pose; and v_kt is the measurement noise covariance matrix of the laser radar estimated pose;
and then weighting the fusion result of the laser radar estimated pose and the wheel odometer into the adaptive parameter matrix φ, expressed by the following formula:
φ = λ_kt φ_kt + λ_RG φ_RG
where λ_kt is the auxiliary-sensor weighting parameter; λ_RG is the primary-sensor weighting parameter; φ_RG is the primary-sensor adaptive parameter matrix; and φ_kt is the auxiliary-sensor adaptive parameter matrix, indicating that the current measurement z_kt of the laser radar estimated pose is fused to obtain the final state estimate; P_k^- denotes the covariance matrix relating the previous state and the current state, and H_kt denotes the Jacobian matrix coefficients of the laser radar estimated pose.
6. A pose estimation device of an unmanned deformation motion platform based on laser radar, characterized by comprising:
the point cloud acquisition module is used for acquiring laser point cloud data of a scene;
the radar pose estimation module is used for obtaining a laser radar estimated pose based on the laser point cloud data;
the positioning information acquisition module is used for acquiring wheel odometer data, IMU pose information, GPS positioning information and RTK positioning information;
and the fusion pose estimation module is used for fusing the GPS positioning information and the RTK positioning information, combining the result with the fusion of the laser radar estimated pose and the wheel odometer, smoothing the pose route using the IMU pose information, and estimating the current system state from the previous system state using a Kalman filtering nonlinear stochastic difference equation to obtain the pose estimation result.
7. An electronic device comprising a processor and a memory means, wherein a plurality of instructions are stored in the memory means, the processor being arranged to read the plurality of instructions in the memory means and to perform the method of any of claims 1-5.
8. A computer readable storage medium storing a computer program, which when executed by a processor performs the method according to any one of claims 1-5.
9. A pose estimation system of an unmanned deformation motion platform based on a laser radar, which is characterized by comprising a sensor, a controller and the unmanned deformation motion platform, wherein the sensor comprises a laser point cloud data acquisition module and a motion data acquisition module, and the controller is used for controlling the unmanned deformation motion platform to move and executing the pose estimation method according to any one of claims 1-5.
10. The system of claim 9, wherein the laser point cloud data acquisition module is a line-type lidar and the motion data acquisition module is an inertial measurement unit; the controller comprises an STM32 controller and an X86 architecture core processing board, wherein the STM32 controller is used for controlling the unmanned deformation motion platform to move, and the X86 architecture core processing board is used for executing the pose estimation method.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202311065030.6A | 2023-08-23 | 2023-08-23 | Pose estimation method and system of unmanned deformation motion platform based on laser radar
Publications (1)

Publication Number | Publication Date
---|---
CN117075158A | 2023-11-17

Family: ID=88711210
Cited By (2)

Publication number | Priority date | Publication date | Title
---|---|---|---
CN117690194A (en) * | 2023-12-08 | 2024-03-12 | Multi-source AI biodiversity observation method and acquisition system
CN117690194B (en) * | 2023-12-08 | 2024-06-07 | Multi-source AI biodiversity observation method and acquisition system
Legal Events

Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination