CN117572879A - Unmanned aerial vehicle based on laser radar SLAM positioning navigation - Google Patents
Unmanned aerial vehicle based on laser radar SLAM positioning navigation
- Publication number
- CN117572879A (application number CN202210930327.3A)
- Authority
- CN
- China
- Prior art keywords
- lidar
- pose
- information
- motion information
- uav
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/43—Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Navigation (AREA)
Abstract
The invention discloses a UAV based on lidar SLAM positioning and navigation, comprising: a UAV body, a multi-line three-dimensional lidar unit mounted directly in front of the UAV body, an IMU inertial sensor, a high-performance data processing unit, and a flight control system. The high-performance data processing unit performs calculations on the point cloud data detected by the lidar to generate the current relative position of the UAV. The high-performance data processing unit also calibrates and corrects the IMU pose information in real time, fuses the pose and motion information of the IMU and the lidar according to the comparison result, and combines them with the currently updated global map model to obtain the fused pose and motion information of the current frame. The UAV provided by the invention can fly indoors or under any conditions without satellite GPS signal reception; it is safe and reliable, its flight is not easily disturbed by wireless signals, and it can fly autonomously without any communication link.
Description
Technical Field
The invention belongs to the technical field of unmanned aerial vehicles, and specifically relates to a UAV based on lidar SLAM positioning and navigation.
Background
Unmanned aerial vehicles (UAVs), commonly called drones, first appeared in the early 20th century and then developed gradually to meet military needs. A UAV is an unmanned aircraft controlled by radio remote-control equipment and a self-contained program control device, or operated fully or intermittently by an onboard computer. Because of their maneuverability, low sensitivity to ground terrain, low cost, and absence of crew casualties, UAVs are widely used in post-disaster search and rescue, aerial photography, crop monitoring, military operations, and other fields. Different fields place different requirements on UAVs, for example requiring them to perform complex tasks such as positioning, obstacle avoidance, tracking, and trajectory planning.
In recent years, quadrotor UAVs, with their low cost and simple operation, have become an important research direction in the UAV industry. As flying robots, quadrotor UAVs are increasingly required to fly with high precision, intelligence, and autonomy, and the basis of these capabilities is accurate state estimation and environment perception. At present, most aircraft rely on GPS/RTK for positioning; however, in some situations, such as indoor environments or crowded areas, GPS/RTK accuracy degrades or even fails entirely and cannot support autonomous UAV flight. Fixed-point hovering based solely on satellite navigation carries errors that cannot be ignored, making it difficult to hover stably within a fixed area. If a binocular vision system is mounted on the UAV, however, the UAV no longer needs external signal transmission to achieve autonomous control.
Because current UAV flight mostly relies on GPS/RTK signals, which cannot be obtained in indoor environments, enabling a UAV to continue obstacle-avoidance flight when the GPS/RTK signal is lost is a major difficulty and a significant challenge for UAV flight safety. Research and development of autonomous flight for multi-rotor UAV flight systems is therefore of great significance.
Summary of the Invention
Embodiments of the present invention provide a UAV based on lidar SLAM positioning and navigation, which enables the UAV to perform obstacle-avoidance flight without relying on GPS/RTK signals.
Embodiments of the present invention provide a UAV based on lidar SLAM system positioning and navigation, comprising: a UAV body, a multi-line three-dimensional lidar unit mounted directly in front of the UAV body, an IMU inertial sensor, a high-performance data processing unit, and a flight control system. The high-performance data processing unit performs calculations on the point cloud data detected by the lidar, generates the current relative position of the UAV, and transmits the current relative position information to the UAV flight control system; the flight control system generates a flight route in real time from the UAV's position information and performs positioning and navigation for the UAV's flight; wherein,
the high-performance data processing unit also performs the following functions:
solving the point cloud data to obtain the lidar pose and motion information of the current frame and the local map information of the surrounding environment of the current frame; updating the map in the lidar SLAM system according to the local map information, and fusing all local maps to generate a three-dimensional global map;
generating a first motion trajectory by combining the lidar pose and motion information of the previous frame with those of the current frame;
generating a second motion trajectory by combining the IMU pose and motion information of the previous frame with those of the current frame; comparing the first motion trajectory with the second motion trajectory to obtain a comparison error, and comparing the comparison error with the laser SLAM calibration error preset in the system;
fusing the pose and motion information of the IMU and the lidar according to the result of comparing the comparison error with the laser SLAM calibration error, and combining them with the currently updated global map model to obtain the fused pose and motion information of the current frame.
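For illustration, a minimal sketch of the comparison-and-fusion step described above is given below. The pose representation, the threshold value, and the averaging rule used for fusion are assumptions made for the example; the patent states only that the two trajectories are compared against a preset calibration error and that the IMU and lidar information are then fused.

```python
import numpy as np

def trajectory_delta(prev_pose: np.ndarray, curr_pose: np.ndarray) -> np.ndarray:
    """Motion between two consecutive frame poses, here assumed to be (x, y, z, yaw)."""
    return curr_pose - prev_pose

def fuse_frame(prev_lidar, curr_lidar, prev_imu, curr_imu, calib_error=0.05):
    """Compare the lidar and IMU trajectories over one frame and fuse them.

    calib_error is the preset laser-SLAM calibration error threshold (assumed value).
    """
    traj_lidar = trajectory_delta(prev_lidar, curr_lidar)   # first motion trajectory
    traj_imu = trajectory_delta(prev_imu, curr_imu)          # second motion trajectory
    comparison_error = np.linalg.norm(traj_lidar - traj_imu)

    if comparison_error > calib_error:
        # IMU has drifted: overwrite its pose with the lidar estimate for this frame
        curr_imu = curr_lidar.copy()
    # otherwise the current IMU pose is kept (stored) unchanged

    # Hypothetical fusion rule: simple average of the two estimates
    fused_pose = 0.5 * (curr_lidar + curr_imu)
    return fused_pose, curr_imu
```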
Preferably, the high-performance data processing unit also filters and preprocesses the collected point cloud data, and updates the temporary local map information according to the preprocessed point cloud data.
Preferably, the comparison of the comparison error with the laser SLAM calibration error by the high-performance data processing unit further includes the following processing:
when the comparison error is greater than the laser SLAM calibration error, modifying the IMU pose and motion information of the current frame to the lidar pose and motion information of the current frame;
when the comparison error is less than the laser SLAM calibration error, storing the IMU pose and motion information of the current frame.
Preferably, the embodiment of the present application also includes a GPS device and/or an RTK device; when a GPS signal and/or RTK signal is detected, the high-performance data processing unit combines the fused pose and motion information of the current frame with the GPS signal and/or RTK signal to obtain fused pose and motion information in the Earth coordinate system.
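One simple way to express the fused pose in the Earth coordinate system, assuming the local SLAM origin is anchored at a GPS/RTK fix, is a flat-Earth conversion such as the sketch below. The function name and the small-offset approximation are illustrative assumptions and only hold over short distances; the patent does not disclose the actual transformation.

```python
import math

EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius in metres

def local_to_geodetic(x_east, y_north, lat0_deg, lon0_deg):
    """Map a local ENU offset (metres) onto latitude/longitude, assuming the
    SLAM origin coincides with a GPS/RTK fix at (lat0_deg, lon0_deg)."""
    lat0 = math.radians(lat0_deg)
    dlat = y_north / EARTH_RADIUS
    dlon = x_east / (EARTH_RADIUS * math.cos(lat0))
    return lat0_deg + math.degrees(dlat), lon0_deg + math.degrees(dlon)

# Example: fused SLAM position 12 m east and 30 m north of the take-off fix
lat, lon = local_to_geodetic(12.0, 30.0, 31.2304, 121.4737)
```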
Preferably, the embodiment of the present application also includes a route generation unit, which obtains the position information specified by the user and, based on the user-specified position information, map update information, and fused pose and motion information, generates an optimal route for the flight control unit to control the positioning and navigation of the UAV.
Preferably, the three-dimensional lidar unit, the IMU inertial sensor, and the high-performance data processing unit are integrated into one frame.
The UAV provided by the present invention fuses the inertial navigation data of the IMU and uses the heading information measured by inertial navigation as input to the SLAM system, which avoids the problem of being unable to obtain the current direction of motion when there is no GPS signal and improves the reliability of the system's autonomous positioning. At the same time, the invention builds and updates a three-dimensional map in real time from multi-line three-dimensional laser scanning data through the SLAM data system, for use in UAV navigation. The beneficial effects of the present invention are as follows:
1. The UAV can fly indoors or under any conditions without satellite GPS signal reception; it is safe and reliable, its flight is not easily disturbed by wireless signals, and it can fly autonomously without any communication link.
2. The UAV achieves highly reliable hovering and autonomous flight, can perform rapid route planning autonomously, and can fly to the destination on its own.
3. The system's lidar is an actively emitting sensor, so the UAV can fly in completely dark environments.
Brief Description of the Drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Figure 1 shows a UAV based on lidar SLAM positioning and navigation in an embodiment of the present invention.
Figure 2 is a module diagram of the UAV system based on lidar SLAM positioning and navigation in an embodiment of the present invention.
Figure 3 is an overall flow chart of the lidar SLAM positioning and navigation method in an embodiment of the present invention.
Detailed Description of the Embodiments
To make the purpose, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. The components of the embodiments generally described and illustrated in the figures may be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments provided in the accompanying drawings is not intended to limit the scope of the claimed application, but merely represents selected embodiments of the application. All other embodiments obtained by those skilled in the art based on the embodiments of this application without creative effort fall within the scope of protection of this application.
Referring to Figure 1, an embodiment of the present invention provides a UAV based on lidar SLAM system positioning and navigation, comprising: a UAV body 1, a multi-line three-dimensional lidar unit 2 mounted directly in front of the UAV body, an IMU inertial sensor, a high-performance data processing unit 3, and a flight control system. The high-performance data processing unit 3 performs calculations on the point cloud data detected by the lidar, generates the current relative position of the UAV, and transmits the current relative position information to the UAV flight control system; the flight control system generates a flight route in real time from the UAV's position information and performs positioning and navigation for the UAV's flight; wherein,
the high-performance data processing unit 3 also performs the following functions:
solving the point cloud data to obtain the lidar pose and motion information of the current frame and the local map information of the surrounding environment of the current frame; updating the map in the lidar SLAM system according to the local map information, and fusing all local maps to generate a three-dimensional global map;
generating a first motion trajectory by combining the lidar pose and motion information of the previous frame with those of the current frame;
generating a second motion trajectory by combining the IMU pose and motion information of the previous frame with those of the current frame; comparing the first motion trajectory with the second motion trajectory to obtain a comparison error, and comparing the comparison error with the laser SLAM calibration error preset in the system;
fusing the pose and motion information of the IMU and the lidar according to the result of comparing the comparison error with the laser SLAM calibration error, and combining them with the currently updated global map model to obtain the fused pose and motion information of the current frame.
Further, in the embodiment of the present invention, the high-performance data processing unit 3 also filters and preprocesses the collected point cloud data, and updates the temporary local map information according to the preprocessed point cloud data.
Further, in the embodiment of the present invention, the comparison of the comparison error with the laser SLAM calibration error by the high-performance data processing unit 3 further includes the following processing:
when the comparison error is greater than the laser SLAM calibration error, modifying the IMU pose and motion information of the current frame to the lidar pose and motion information of the current frame;
when the comparison error is less than the laser SLAM calibration error, storing the IMU pose and motion information of the current frame.
Further, the embodiment of the present invention also includes a GPS device and/or an RTK device; when a GPS signal and/or RTK signal is detected, the high-performance data processing unit combines the fused pose and motion information of the current frame with the GPS signal and/or RTK signal to obtain fused pose and motion information in the Earth coordinate system.
Further, the embodiment of the present invention also includes a route generation unit, which obtains the position information specified by the user and, based on the user-specified position information, map update information, and fused pose and motion information, generates an optimal route for the flight control unit to control the positioning and navigation of the UAV.
Further, in the embodiment of the present invention, the three-dimensional lidar unit, the IMU inertial sensor, and the high-performance data processing unit are integrated into one frame.
The UAV provided in this embodiment has a multi-line three-dimensional lidar system mounted directly in front. Under the control of its internal scanning system, the radar rapidly scans the forward field of view within a certain angular range and generates dense point cloud data, which reflects the precise positions of all obstacles within tens of meters around the aircraft. The generated point cloud data is transmitted in real time to the high-performance data processing unit, which uses the data to compute and generate flight routes, achieving autonomous obstacle-avoidance flight. The high-performance data processing unit installed in the UAV has a built-in high-speed CPU and GPU; it can preprocess the point cloud scanned by the multi-line lidar in real time and then send it to the SLAM computing unit for position solving and three-dimensional map model construction.
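As an illustration of the kind of real-time preprocessing such a unit could perform before the SLAM solver, the sketch below range-gates a lidar scan and voxel-downsamples it. The voxel size, range limits, and function name are assumptions; the patent does not specify the filtering steps beyond denoising, outlier removal, and data reduction.

```python
import numpy as np

def preprocess_cloud(points: np.ndarray, voxel=0.2, min_range=0.5, max_range=60.0):
    """points: (N, 3) array of lidar returns in the sensor frame."""
    # 1. Range gate: drop returns that are too close or too far to be useful.
    d = np.linalg.norm(points, axis=1)
    points = points[(d > min_range) & (d < max_range)]

    # 2. Voxel-grid downsampling: keep one representative point per voxel.
    keys = np.floor(points / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]
```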
Based on the map model scanned by the three-dimensional lidar, as long as the user specifies the target position the UAV needs to reach, the UAV can automatically and rapidly generate its flight route. The route is built on an accurate map scanned from the surrounding environment, so the routes generated on this basis are very safe, and the algorithm ensures that the flight route is the most efficient.
Compared with the prior art, the embodiments of the present invention continuously improve the UAV's real-time pose information through IMU pose data that is continuously optimized in real time, improving the UAV's positioning information and map-building information, so that the UAV provided by the present invention has the advantages of high-precision positioning, autonomous navigation and obstacle avoidance without relying on GPS, simple operation, and a high safety factor.
To further illustrate the solutions implemented by the present invention, this application also provides more detailed embodiments, described as follows.
In another aspect, an embodiment of the present invention provides a UAV system based on lidar SLAM positioning and navigation. Figure 2 is a module diagram of the UAV system based on lidar SLAM positioning and navigation in an embodiment of the present invention, which includes a three-dimensional laser SLAM system 100 and a flight control unit 200, wherein the three-dimensional laser SLAM system 100 provides the UAV's positioning and navigation information, and the flight control unit 200 controls the positioning and navigation of the UAV's flight according to the positioning and navigation information of the three-dimensional laser SLAM system;
the three-dimensional laser SLAM system 100 further includes a laser information collection unit 110 and a laser information processing unit 120, wherein,
the laser information collection unit 110 is used to collect information for the three-dimensional laser SLAM system 100 and includes an IMU unit 111 and at least one multi-line three-dimensional lidar module 112, wherein,
the IMU unit 111 is used to provide predicted pose and motion information for the three-dimensional laser SLAM system 100;
the lidar 112 is used to collect point cloud data of the surrounding environment;
the laser information processing unit 120 is used to process the information collected by the laser information collection unit and includes:
a point cloud processing module 130, used to filter and preprocess the point cloud data;
a SLAM data processing module 140, used to solve the filtered and preprocessed data and output motion pose information and a three-dimensional map model;
and a multi-sensor fusion module 180, used to fuse the motion pose information obtained by the SLAM data processing module with the three-dimensional map model information to generate the current pose and motion information.
In the embodiment of the present invention, the SLAM data processing module 140 further includes a radar solving module 141, a trajectory generation module 142, a trajectory comparison module 143, a calibration module 144, and a lidar fusion module 145, wherein,
the radar solving module 141 is used to solve the filtered point cloud data to obtain the corresponding lidar pose and motion information and the local map information of the surrounding environment of the current frame;
the trajectory generation module 142 is used to generate the UAV's first motion trajectory by combining the lidar pose and motion information of two consecutive frames, and the UAV's second motion trajectory by combining the IMU pose and motion information of two consecutive frames (see the sketch after this list);
the trajectory comparison module 143 is used to compare the first motion trajectory with the second motion trajectory to obtain a comparison error, and to compare the obtained comparison error with the preset laser SLAM calibration error;
the lidar fusion module 145 fuses the pose and motion information output by the IMU unit 111 and the lidar according to the comparison result of the trajectory comparison module, and outputs stable pose and motion information of the current UAV.
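A minimal sketch of how the trajectory generation module 142 could pair consecutive frame poses from the two sensor streams for the trajectory comparison module 143 is shown below; the planar pose type and class interface are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Pose = Tuple[float, float, float]  # (x, y, yaw) - an assumed planar simplification

@dataclass
class TrajectoryGenerator:
    """Keeps the pose history of each sensor stream and emits per-frame segments."""
    lidar_track: List[Pose] = field(default_factory=list)
    imu_track: List[Pose] = field(default_factory=list)

    def add_frame(self, lidar_pose: Pose, imu_pose: Pose) -> None:
        self.lidar_track.append(lidar_pose)
        self.imu_track.append(imu_pose)

    def latest_segments(self) -> Optional[Tuple[List[Pose], List[Pose]]]:
        """Return the (previous, current) pose pairs of both streams, i.e. the
        first and second motion trajectories of the current frame."""
        if len(self.lidar_track) < 2:
            return None
        return self.lidar_track[-2:], self.imu_track[-2:]
```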
In the embodiment of the present invention, the SLAM data processing module also includes:
a map update module 160, used to update the map information of the three-dimensional laser SLAM system 100 in real time according to the local map information of the surrounding environment of the current frame; and
a global map fusion module 170, used to fuse all the local map information to generate the global three-dimensional map model information, so that the UAV can plan its navigation path within the global map information.
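The global map fusion performed by module 170 can be pictured as transforming each local map into the world frame by its frame pose and accumulating the points, as in the sketch below. The planar pose representation is an assumption, and deduplication or voxel filtering of the accumulated map is omitted.

```python
import numpy as np

def fuse_local_maps(local_maps, poses):
    """local_maps: list of (N_i, 2) point arrays, each in its own frame's coordinates.
    poses: list of (x, y, yaw) poses of those frames in the global frame."""
    global_points = []
    for pts, (x, y, yaw) in zip(local_maps, poses):
        c, s = np.cos(yaw), np.sin(yaw)
        rot = np.array([[c, -s], [s, c]])
        global_points.append(pts @ rot.T + np.array([x, y]))  # rotate, then translate
    return np.vstack(global_points)  # naive global map; duplicate removal omitted
```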
In the embodiment of the present invention, the UAV system based on lidar SLAM positioning and navigation also includes a calibration module 144, used to correct the error of the IMU unit 111 in real time according to the comparison result of the trajectory comparison module.
In the embodiment of the present invention, when GPS and/or RTK signals are available, the multi-sensor fusion module 180 also fuses the pose and motion information output by the lidar fusion module, the GPS and/or RTK position information, and the currently updated map information to obtain fused pose and motion information with Earth coordinates.
Preferably, providing the predicted pose and motion information for the three-dimensional laser SLAM system 100 further includes the IMU unit 111 first collecting odometry information, converting the odometry information into UAV pose change information through a kinematic model of the UAV's inertial odometry, and feeding it into a Bayesian filter to initially compute the predicted pose and motion information; the filtering process includes denoising the point cloud data, removing abnormal points, and reducing redundant point cloud data.
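The Bayesian filtering mentioned above could, in its simplest form, be the prediction step of a Kalman filter driven by the pose change obtained from the inertial odometry model. The state layout, process-noise values, and matrices in the sketch below are assumptions; the patent does not state which Bayesian filter is used.

```python
import numpy as np

def kf_predict(x, P, delta_pose, Q=None):
    """Kalman-filter prediction: apply the IMU-derived pose change to the state.

    x: state [px, py, pz, yaw]; P: 4x4 covariance;
    delta_pose: pose change over one frame from the inertial odometry model.
    """
    if Q is None:
        Q = np.diag([0.02, 0.02, 0.02, 0.01])  # assumed process noise
    F = np.eye(4)                  # identity transition; motion enters as a control input
    x_pred = F @ x + delta_pose
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred
```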
In the embodiment of the present invention, the point cloud processing module 130 also includes a filtering module 131 and a preprocessing module 132, wherein,
the filtering module 131 is used to filter the point cloud data collected by the lidar 112;
the preprocessing module 132 is used to perform preliminary processing on the filtered point cloud data to obtain temporary local map information.
In the embodiment of the present invention, the flight control unit 200 also includes a global planning module 210, a local planning module 220, and a low-level control module 230, wherein,
the global planning module 210 is used to plan the optimal navigation path on the global map;
the local planning module 220 is used to plan the optimal path on the real-time local map information obtained after preprocessing;
the low-level control module 230 is used to perform control allocation for the UAV.
In another aspect, referring to Figure 3, an embodiment of the present invention provides a method for positioning and navigating a UAV based on lidar SLAM, including the following steps:
(1) obtaining the IMU pose and motion information of the current frame collected by the IMU unit, and obtaining the point cloud data of the current frame collected by the three-dimensional lidar;
(2) filtering and preprocessing the collected point cloud data;
(3) solving the filtered point cloud data to obtain the lidar pose and motion information of the current frame and the local map information of the surrounding environment of the current frame in the laser SLAM system;
(4) updating the map in the laser SLAM system according to the local map information, and fusing all local maps to generate a three-dimensional global map;
(5) generating a first motion trajectory by combining the lidar pose and motion information of the previous frame with those of the current frame;
(6) generating a second motion trajectory by combining the IMU pose and motion information of the previous frame with those of the current frame;
(7) comparing the first motion trajectory with the second motion trajectory to obtain a comparison error, and comparing the comparison error with the laser SLAM calibration error preset in the system;
(8) fusing the pose and motion information of the IMU and the lidar according to the result of comparing the comparison error with the laser SLAM calibration error, and combining them with the currently updated global map model to obtain the fused pose and motion information of the current frame;
(9) obtaining the position information specified by the user, generating an optimal route according to the user-specified position information, map update information, and fused pose and motion information, and displaying and controlling the positioning and navigation of the UAV.
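The nine steps above can be read as a per-frame loop. The sketch below strings them together; every object and method name is a placeholder standing in for the processing the patent describes, not an actual API.

```python
def slam_navigation_frame(imu_unit, lidar, system, user_goal):
    imu_pose = imu_unit.read_pose()                    # step (1)
    raw_cloud = lidar.read_cloud()                     # step (1)
    cloud = preprocess_cloud(raw_cloud)                # step (2)
    lidar_pose, local_map = system.solve(cloud)        # step (3)
    system.update_map(local_map)                       # step (4)
    traj_lidar = system.segment(lidar_pose, "lidar")   # step (5)
    traj_imu = system.segment(imu_pose, "imu")         # step (6)
    err = system.compare(traj_lidar, traj_imu)         # step (7)
    fused = system.fuse(lidar_pose, imu_pose, err)     # step (8)
    return system.plan_route(user_goal, fused)         # step (9)
```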
In the embodiment of the present invention, filtering and preprocessing the collected point cloud data in step 2 also includes the following steps:
(2-1) filtering the collected point cloud data;
(2-2) preprocessing the filtered point cloud data;
(2-3) updating the temporary local map according to the preprocessed point cloud data.
In the embodiment of the present invention, comparing the comparison error with the laser SLAM calibration error in step 7 also includes the following steps:
(7-1) if the comparison error is greater than the laser SLAM calibration error, replacing the IMU pose and motion information of the current frame with the lidar pose and motion information of the current frame;
(7-2) if the comparison error is less than the laser SLAM calibration error, storing the IMU pose and motion information of the current frame.
In the embodiment of the present invention, step 8 also includes: when a GPS signal and/or RTK signal is present, combining the fused pose and motion information of the current frame with the GPS signal and/or RTK signal to obtain fused pose and motion information in the Earth coordinate system.
The overall operation of a UAV system based on lidar SLAM positioning and navigation in an embodiment of the present invention is as follows:
the IMU unit 111 acquires the odometry information of the three-dimensional laser SLAM system 100, converts the odometry information into UAV pose change information through the kinematic model of the UAV's inertial odometry, feeds it into a Bayesian filter to initially compute the IMU pose and motion information of the current frame, and outputs it to the trajectory generation module 142; the multi-line three-dimensional laser scanning radar collects the point cloud data of the current frame and outputs it to the filtering module 131;
the filtering module 131 performs filtering on the collected point cloud data, such as denoising, removing abnormal points, and reducing the amount of redundant point cloud data, and outputs the filtered point cloud data to the radar solving module 141 and the preprocessing module 132 respectively;
the preprocessing module 132 performs preliminary processing on the filtered point cloud data to obtain temporary local map information and outputs it to the local planning module 220;
the radar solving module 141 solves the filtered point cloud data to obtain the lidar pose and motion information of the current frame and the local map information of the surrounding environment of the current frame in the laser system, outputs the lidar pose and motion information of the current frame to the trajectory generation module 142, and outputs the local map information of the surrounding environment of the current frame to the map update module 160;
the map update module 160 updates the local map information of two consecutive frames in real time and outputs it to the global map fusion module 170;
the global map fusion module 170 fuses all the local map information to generate global map information; when the UAV takes off from the same location next time, this global map information can be used directly; the global map information is output to the lidar fusion module 145 and the global planning module 210;
the trajectory generation module 142 generates a first motion trajectory by combining the lidar pose and motion information of the previous frame with those of the current frame, generates a second motion trajectory by combining the IMU pose and motion information of the previous frame with those of the current frame, and outputs the first and second motion trajectories to the trajectory comparison module 143;
the trajectory comparison module 143 compares the first motion trajectory with the second motion trajectory to obtain a first comparison error, compares the first comparison error with the preset laser SLAM calibration error, and outputs the comparison result to the calibration module 144;
if the first comparison error is greater than the laser SLAM calibration error, the calibration module 144 replaces the IMU pose and motion information of the current frame with the lidar pose and motion information of the current frame; if the first comparison error is less than the laser SLAM calibration error, the IMU pose and motion information of the current frame is stored;
the pose and motion information of the IMU and the lidar is output to the lidar fusion module 145, which fuses it according to the fusion algorithm to obtain the fused pose and motion information of the current frame and outputs it to the multi-sensor fusion module 180; the multi-sensor fusion module 180 fuses the received fused pose and motion information with the updated global three-dimensional map information, and when external GPS and/or RTK signals are available, it also fuses the GPS and/or RTK information to obtain fused pose and motion information in the Earth coordinate system, which is output to the global planning module 210;
the global planning module 210 plans the optimal navigation path for the UAV to reach the target point, i.e. the shortest obstacle-free path, according to the user-specified position, map update information, and fused pose and motion information, and outputs all the data to the local planning module 220;
the local planning module 220 plans a locally optimal navigation path from the real-time local map information sent by the preprocessing module of the laser information processing unit and the data output by the global planning module 210, compares it with the globally optimal navigation path to obtain the optimal path, and outputs the optimal path to the low-level control module 230;
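The shortest obstacle-free path that the planners search for can be illustrated with a breadth-first search over an occupancy grid derived from the SLAM map, as in the sketch below. The grid representation and the search method are assumptions; the patent does not name the planning algorithm.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """grid: 2D list, 0 = free cell, 1 = obstacle; start/goal: (row, col).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk back through the predecessors
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```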
the low-level control module 230 performs control allocation for the UAV according to the optimal path, controlling the UAV's flight speed, angle, heading, and other quantities;
the UAV receives the commands and begins its flight.
The method and system provided by this application fuse the inertial navigation data of the IMU and use the heading information measured by inertial navigation as input to the SLAM system, which avoids the problem of being unable to obtain the current direction of motion when there is no GPS signal and improves the reliability of autonomous positioning. At the same time, the invention builds and updates a three-dimensional map in real time from multi-line three-dimensional laser scanning data through the SLAM data system, for use in UAV navigation.
The invention can also realize autonomous obstacle-avoidance flight based on the map model scanned by the three-dimensional lidar. Compared with an ordinary planar single-line lidar, a three-dimensional lidar can acquire three-dimensional terrain information of the environment, which carries several orders of magnitude more information than two-dimensional data. This means that a SLAM positioning algorithm based on three-dimensional data can obtain more accurate and stable position information, and it can also obtain the coordinates of any point in three-dimensional space, giving the UAV the ability to generate flight tracks autonomously and fly autonomously. As long as the user specifies the target position the UAV needs to reach, the UAV can automatically and rapidly generate its flight route; the route is built on an accurate map scanned from the surrounding environment, so the routes generated on this basis are very safe, and the algorithm ensures that the flight route is the most efficient.
In addition, through the multi-sensor fusion module, when GPS and/or RTK signals are available, the coordinates generated by the method and system provided by the present invention incorporate the absolute position coordinates of GPS or RTK. Therefore, when the GPS or RTK signal is valid, the system transforms the relative coordinates into global standard Earth coordinates, so the generated coordinate data can be displayed directly in the Earth coordinate system, and the generated routes can also be used by other application systems, achieving the goal of data sharing. Unlike a relative coordinate system, in which losing the origin means losing all data, the system's routes remain permanently valid like GPS coordinates. Furthermore, the system corrects a single set of states simultaneously with GPS and/or RTK, three-dimensional lidar, and IMU data, so that the multi-sensor fusion can switch seamlessly between sensors, compensate for each sensor's shortcomings, and maximize sensor performance.
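The simultaneous correction of one set of states by GPS/RTK, lidar, and IMU data can be pictured as a measurement update applied to the same filter state used in the prediction sketch given earlier. The observation model and noise values below are assumptions for illustration; the patent does not disclose the actual fusion equations.

```python
import numpy as np

def kf_update(x, P, z, R=None):
    """Correct the state [px, py, pz, yaw] with a position fix z = [px, py, pz]
    coming from GPS/RTK, already expressed in the local SLAM frame."""
    if R is None:
        R = np.diag([0.5, 0.5, 1.0])           # assumed GPS measurement noise
    H = np.zeros((3, 4))
    H[:, :3] = np.eye(3)                        # GPS observes position only
    y = z - H @ x                               # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new
```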
The embodiments of the UAV based on lidar SLAM positioning and navigation, the lidar SLAM positioning and navigation method, and the corresponding system provided by the present invention have been described in detail above. Those of ordinary skill in the art may, following the ideas of the embodiments of the present invention, make changes to the specific implementation and the scope of application. In summary, the content of this description should not be understood as limiting the present invention.
Claims (6)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210930327.3A CN117572879A (en) | 2022-08-03 | 2022-08-03 | Unmanned aerial vehicle based on laser radar SLAM positioning navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210930327.3A CN117572879A (en) | 2022-08-03 | 2022-08-03 | Unmanned aerial vehicle based on laser radar SLAM positioning navigation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117572879A true CN117572879A (en) | 2024-02-20 |
Family
ID=89884966
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210930327.3A Pending CN117572879A (en) | 2022-08-03 | 2022-08-03 | Unmanned aerial vehicle based on laser radar SLAM positioning navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117572879A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105759829A (en) * | 2016-04-12 | 2016-07-13 | 深圳市龙云创新航空科技有限公司 | Laser radar-based mini-sized unmanned plane control method and system |
CN107450577A (en) * | 2017-07-25 | 2017-12-08 | 天津大学 | UAV Intelligent sensory perceptual system and method based on multisensor |
CN108827306A (en) * | 2018-05-31 | 2018-11-16 | 北京林业大学 | A kind of unmanned plane SLAM navigation methods and systems based on Multi-sensor Fusion |
CN110187375A (en) * | 2019-06-27 | 2019-08-30 | 武汉中海庭数据技术有限公司 | A kind of method and device improving positioning accuracy based on SLAM positioning result |
CN111240331A (en) * | 2020-01-17 | 2020-06-05 | 仲恺农业工程学院 | Intelligent trolley positioning and navigation method and system based on laser radar and odometer SLAM |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11914369B2 (en) | Multi-sensor environmental mapping | |
CN109029417B (en) | Unmanned aerial vehicle SLAM method based on mixed visual odometer and multi-scale map | |
CN105022401B (en) | Vision-based collaborative SLAM method for multi-quadrotor UAVs | |
EP2895819B1 (en) | Sensor fusion | |
CN108827306A (en) | A kind of unmanned plane SLAM navigation methods and systems based on Multi-sensor Fusion | |
CN105759829A (en) | Laser radar-based mini-sized unmanned plane control method and system | |
CN103926933A (en) | Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle | |
CN110333735B (en) | A system and method for realizing secondary positioning of unmanned aerial vehicle in water and land | |
CN105004336A (en) | Robot positioning method | |
CN113156998A (en) | Unmanned aerial vehicle flight control system and control method | |
CN106155075A (en) | A detachable UAV control system | |
Liu et al. | A survey of computer vision applied in aerial robotic vehicles | |
CN117572879A (en) | Unmanned aerial vehicle based on laser radar SLAM positioning navigation | |
CN117572459A (en) | Unmanned aerial vehicle capable of automatically switching navigation system | |
Cui et al. | Landmark extraction and state estimation for UAV operation in forest | |
CN118362124A (en) | A real-time obstacle avoidance and map updating method for UAV inspection of power distribution lines | |
CN117554989A (en) | Visual fusion laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof | |
CN117554990A (en) | Laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof | |
CN115464620A (en) | Equipment maintenance method and device, operation and maintenance system and operation and maintenance robot | |
Yigit et al. | Visual attitude stabilization of a unmanned helicopter in unknown environments with an embedded single-board computer | |
CN115686045A (en) | Indoor autonomous aircraft obstacle avoidance device based on 3DVFH algorithm | |
US20230150543A1 (en) | Systems and methods for estimating cuboid headings based on heading estimations generated using different cuboid defining techniques | |
Zhang et al. | An integrated UAV navigation system based on geo-registered 3D point cloud | |
CN117629187A (en) | Tightly coupled odometry method and system based on UWB and vision fusion | |
Nonami et al. | Guidance and navigation systems for small aerial robots |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |