CN115964446A - Radar data interaction processing method based on mobile terminal - Google Patents
- Publication number
- CN115964446A (application CN202211628850.7A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Traffic Control Systems (AREA)
Abstract
A radar data interaction processing method based on a mobile terminal, belonging to the field of automatic driving. The original point cloud data is processed quickly and accurately at the calculation generation end, which then sends the radar data to the mobile end for display. This removes the space occupied by dedicated display equipment and the difficulty of moving it, offers high flexibility, and lets more testers participate in debugging work at the same time. In addition, the radar data is processed at the mobile terminal and obstacle information is represented by a simple model, so that more passengers can understand the radar data and fully trust each decision the unmanned vehicle makes while riding in it. The invention solves the problems that the calculation generation end is difficult to move and that ordinary users do not understand radar data.
Description
Technical Field
The invention belongs to the field of automatic driving.
Background
An autonomous vehicle makes correct decisions and control actions from sensed data, which it cannot obtain without sensors such as radar and cameras. The radar transmits a detection signal (a laser beam) toward a target, compares the received signal (the target echo) reflected from the target with the transmitted signal, and after appropriate processing obtains relevant information about the target, such as distance, azimuth, height, speed, attitude, and even shape, so as to detect, track, and identify it. However, the raw radar output is complex point cloud data; this is not difficult to understand for workers in the automatic driving field, but when an automatic driving vehicle runs on the road, a simple and easily understood presentation of radar data is very important to many passengers. The invention therefore provides a radar data interaction processing method based on a mobile terminal: the raw radar data is processed at the calculation generation end, transmitted to the mobile terminal over the UDP protocol, and the obstacles are displayed using a corresponding algorithm, so the method is easy to understand, flexible, and convenient for interactive display.
In a conventional unmanned vehicle, raw data is usually displayed on the central control screen, and a tester must be in the vehicle to know what data the vehicle perceives and whether its real-time decisions are correct. Regarding display interfaces, "Display screen panel with a laser radar data graphical user interface" (publication number CN307549670S) designs a lidar display interface suitable for computers, mobile phones, and tablet devices, but it can only replay and display lidar data; in unmanned driving the vehicle moves fast and decision control must act on data sensed in real time, so that method cannot meet the real-time radar data processing and display requirements. Regarding radar data processing, "Ground point cloud filtering method, system, equipment, and storage medium for a laser radar" (publication number CN115166700A) structurally encodes the point cloud and uses conditional queries on neighboring points to retain environment points while filtering out ground points and noise points.
Aiming at these defects, the invention provides a radar data interaction processing method based on a mobile terminal. The original point cloud data is processed quickly and accurately at the calculation generation end, which sends the radar data to the mobile end for display; this removes the space occupied by display equipment and the difficulty of moving it, offers high flexibility, and lets more testers participate in debugging work at the same time. In addition, the radar data is processed at the mobile terminal and obstacle information is represented by a simple model, so more passengers can understand the radar data and fully trust each decision the unmanned vehicle makes while riding in it. The aim of the invention is a radar data interaction processing method based on a mobile terminal that solves the problems that the calculation generation end is difficult to move and that ordinary users do not understand radar data.
Disclosure of Invention
The whole system comprises a data acquisition unit, a processing unit, and a display unit. The data acquisition unit and the processing unit run at the calculation generation end, and the display unit runs at the mobile end. The data acquisition unit collects two kinds of data: one part is a group of unordered original point cloud data acquired through the vehicle-mounted three-dimensional lidar, and the other is vehicle information acquired through the CAN bus. The processing unit clusters the point cloud data into several independent subsets and performs target classification and identification on that basis; the display unit shows the vehicle information and the recognized obstacles on the mobile terminal.
1. Acquisition of point cloud data
A group of unordered original point cloud data is obtained through the vehicle-mounted three-dimensional lidar. The point cloud contains at least the cloud points of the road area scanned while the vehicle is driving, and each cloud point carries coordinate information, a timestamp, and a beam direction.
2. Processing point cloud data
The processing unit processes the original point cloud data: a preprocessing stage filters ground point cloud data and performs target clustering segmentation, and an identification stage computes geometric data of the clustered point cloud to identify obstacles.
1) The raw point cloud data is converted into a depth image, each pixel of which stores the measured distance from the sensor to the object.
2) Filtering ground point cloud data
(1) Traversing all point cloud points in a point cloud set in the point cloud picture, and acquiring the road surface height of a driving road corresponding to each point cloud point;
(2) Determining the height coordinate of each cloud point relative to the driving road based on the point cloud coordinate information of each cloud point in the point cloud image;
(3) If the height coordinate of any cloud point is less than or equal to the corresponding road surface height, that cloud point is removed from the point cloud set, reducing the data volume of the point cloud.
3) Clustering point cloud data
(1) Point cloud data clustering
The Euclidean distance from each neighborhood point to the target point is calculated through a KD-Tree neighbor query, and points are clustered according to that distance. The calculation is repeated until all new points have been processed.
(2) Obstacle identification
For each cluster, the minimum-area rectangle around the obstacle is obtained by a minimum convex hull method, yielding a cuboid frame. Features are extracted and classified over the cuboid frame region to identify the target obstacle. The obstacle information includes corner coordinates, type, and id.
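The clustering and framing steps above can be sketched as follows. This is an illustrative sketch only: the distance radius, minimum cluster size, and the axis-aligned box (a simplification of the minimum-convex-hull rectangle named in the description) are assumptions, not the patented implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def euclidean_cluster(points, radius=0.5, min_size=3):
    """Region-growing Euclidean clustering over a KD-tree neighbor query."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = [seed], [seed]
        while queue:
            idx = queue.pop()
            # all neighbors within the Euclidean distance threshold
            for nb in tree.query_ball_point(points[idx], radius):
                if nb in unvisited:
                    unvisited.remove(nb)
                    queue.append(nb)
                    cluster.append(nb)
        if len(cluster) >= min_size:
            clusters.append(cluster)
    return clusters

def bounding_box(cluster_points):
    """Axis-aligned cuboid frame around one cluster (simplified stand-in
    for the minimum-area rectangle derived from the convex hull)."""
    cluster_points = np.asarray(cluster_points, dtype=float)
    return cluster_points.min(axis=0), cluster_points.max(axis=0)
```

With two well-separated groups of points, `euclidean_cluster` returns one cluster per group, and `bounding_box` yields the two opposite corners of each cuboid frame.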
3. Vehicle state information acquisition
Vehicle information is obtained through the CAN bus; the vehicle state information comprises steering wheel angle, battery level, speed, gear, accelerator opening, and brake opening.
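Decoding the vehicle state from a CAN payload might look like the following sketch. The 7-byte field layout, scaling, and signedness are purely illustrative assumptions: the description lists the signals but not the actual CAN message format.

```python
import struct

def decode_vehicle_frame(payload: bytes) -> dict:
    """Decode a hypothetical 7-byte vehicle-state payload:
    a signed 16-bit steering angle (0.1-degree units, assumed), then five
    unsigned bytes for battery level, speed, gear, throttle, and brake."""
    angle, battery, speed, gear, throttle, brake = struct.unpack(
        "<hBBBBB", payload[:7])
    return {
        "steering_deg": angle / 10.0,
        "battery_pct": battery,
        "speed_kmh": speed,
        "gear": gear,
        "throttle_pct": throttle,
        "brake_pct": brake,
    }
```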
4. Data communication
1) UDP protocol transmission data
(1) The obstacle information identified from the radar point cloud data is frame-selected. To cope with the huge point cloud data volume, only the corner coordinates of the obstacle's rectangular frame and the obstacle category are transmitted.
(2) Transmission of byte data over UDP protocol
Every eight bytes are specified as one field of obstacle information or vehicle information. Data transmission occupies two ports, each carrying datagrams of 1498 bytes. One port sends vehicle state information, the other sends radar data information, and the two ports send in alternation.
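A minimal send-side sketch of the two-port cycle follows. The port numbers are deployment-specific (the patent does not fix them); the 1498-byte per-datagram budget comes from the description.

```python
import socket

MAX_PAYLOAD = 1498  # per-datagram byte budget given in the description

def send_cycle(sock, host, vehicle_msg, radar_msg, vehicle_port, radar_port):
    """One round of the cycle: vehicle state on one port, radar/obstacle
    data on the other, sent alternately over UDP."""
    for payload, port in ((vehicle_msg, vehicle_port),
                          (radar_msg, radar_port)):
        if len(payload) > MAX_PAYLOAD:
            raise ValueError("datagram exceeds the 1498-byte budget")
        sock.sendto(payload, (host, port))
```

A loopback receiver bound to each port sees the vehicle and radar datagrams arrive on their respective sockets.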
5. Display interaction of data
1) The mobile terminal receives the obstacle data and the vehicle data.
2) Coordinate system conversion for obstacle information
The radar obstacle data takes the central point of the vehicle tail as the coordinate origin, in meters, forming the vehicle coordinate system. The mobile terminal display uses an image coordinate system: a control drawn in the middle of the interface shows the lane information, the upper-left corner of the displayed lane is the coordinate origin, and the unit is the pixel, so the vehicle coordinate system must be converted into the image coordinate system. In the vehicle coordinate system, X points right and Y points up from the origin; in the image coordinate system, x points right and y points down from the origin. The coordinate conversion formula is
x’=w/2+X’*(w/m)
y’=h–Y’*(h/n)
Wherein w is the pixel width of the displayed lane in the screen, h is the pixel height of the displayed lane in the screen, m is the maximum lateral distance (unit: meter) at which an obstacle is displayed from the host vehicle, n is the maximum longitudinal display distance (unit: meter), X' is the actual lateral distance (unit: meter) of the obstacle from the host vehicle, Y' is the actual longitudinal distance (unit: meter), x' is the computed lateral position (unit: pixel) of the obstacle in the interface, and y' is the computed longitudinal position (unit: pixel);
3) A mobile terminal interface is developed to display the vehicle information (vehicle speed, battery level, steering wheel angle, accelerator opening, brake opening, and gear) and the obstacle information.
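The conversion formulas of step 2) can be written directly as a function; the parameter names follow the definitions above:

```python
def vehicle_to_image(X, Y, w, h, m, n):
    """Map vehicle-frame coordinates (meters; origin at the tail center,
    X right, Y toward the head) to interface pixels (origin top-left,
    x right, y down): x' = w/2 + X*(w/m), y' = h - Y*(h/n)."""
    return w / 2 + X * (w / m), h - Y * (h / n)
```

With the Embodiment 3 values (w = 1800, h = 1600, m = 12, n = 50), an obstacle at (2, 3) maps to (1200, 1504), matching the worked example there.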
Drawings
FIG. 1 is a flowchart illustrating a radar data interaction processing method based on a mobile terminal according to the present invention;
FIG. 2 is a schematic coordinate transformation of the radar data interaction processing method based on the mobile terminal according to the present invention;
FIG. 3 is a mobile terminal interface of the radar data interaction processing method based on a mobile terminal according to the present invention;
Detailed Description
The following describes embodiments of the present invention with reference to the accompanying drawings, detailing the shapes and configurations of the components, their mutual positions and connection relationships, their functions and operating principles, and the manufacturing process and operation method, so as to help those skilled in the art understand the inventive concept and technical solution of the invention more completely, accurately, and deeply.
In this embodiment, a Jetson AGX Orin is selected as the calculation generation end; NVIDIA describes this computer as its smallest, most powerful, and most energy-efficient AI supercomputer, capable of 200 trillion operations per second (200 TOPS). The mobile terminal is an M6 tablet, which is compact and basically meets the display requirements.
Embodiment 1: FIG. 1 is the workflow diagram of the radar data interaction processing method based on a mobile terminal. As shown in the figure, the original point cloud data is obtained through the vehicle-mounted three-dimensional lidar, and the vehicle information, including steering wheel angle, battery level, speed, gear, accelerator opening, brake opening, and so on, is obtained through the CAN bus. The ground point cloud data is then filtered to reduce the point cloud volume, a clustering algorithm groups the filtered points belonging to each obstacle, and obstacles are identified from the clustered data. Finally, the vehicle information and the processed, identified point cloud data are transmitted in real time to the mobile terminal for display using a UDP-based data transmission protocol.
Most lidar systems provide raw data as one range reading per laser beam, with a timestamp and beam direction, which can be converted directly into a depth image; each pixel of such a depth image stores the measured distance from the sensor to the object. To filter ground point cloud data, all cloud points in the point cloud set are traversed and the road surface height of the driving road corresponding to each point is queried; the height coordinate of each cloud point relative to the driving road is determined from its coordinate information; and any cloud point whose height coordinate is less than or equal to the corresponding road surface height is removed from the set, completing the ground filtering and reducing the data volume of the point cloud. Euclidean clustering is then performed on the point cloud: the Euclidean distance from each neighborhood point to the target point is calculated through a KD-Tree neighbor query, and points are clustered according to that distance, repeating until all new points have been processed. After the point cloud of each obstacle is obtained, the points around the obstacle are found by a minimum convex hull method and the minimum-area enclosing rectangle is derived from them; enclosing the 3D object yields a cuboid frame. The target obstacle is then identified from the geometric relations of the clusters computed within each cuboid frame.
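The ground-filtering step above can be sketched as follows. For simplicity, the per-point road surface height is passed in directly as an array, whereas the method queries it from the driving road for each cloud point.

```python
import numpy as np

def filter_ground(points, road_height):
    """Keep only cloud points strictly above the corresponding road
    surface height; points at or below it are removed from the set."""
    points = np.asarray(points, dtype=float)
    road_height = np.asarray(road_height, dtype=float)
    keep = points[:, 2] > road_height   # z-coordinate vs. surface height
    return points[keep]
```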
Embodiment 2: UDP protocol data transfer process
The data is transmitted over the UDP protocol on two ports, each port carrying datagrams of 1498 bytes. One port sends vehicle state information, the other sends radar data information, and the two ports send in alternation. In both ports, bytes 1 to 6 describe the source device MAC address, bytes 7 to 14 the destination MAC address, bytes 15 to 24 the source device IP address, bytes 25 to 34 the destination device IP address, bytes 35 to 38 the source device's sending port, and bytes 39 to 42 the destination device's receiving port; the following bytes are data bytes. Each obstacle transmitted by the calculation generation end on the first port occupies 72 bytes: the first 64 bytes carry the coordinates of four corner points, every 8 bytes representing one x or y coordinate, and the last 8 bytes carry the type information. To transmit more obstacle data, only four corner coordinates are transmitted (for example front upper left, front upper right, rear lower left, and rear lower right), from which the obstacle's rectangular frame can be restored. In the vehicle information sent by the calculation generation end on the second port, every 8 bytes represent one value; for example, bytes 43 to 50 carry the steering wheel angle. Since the transmitted vehicle information is limited, the remaining unused byte positions can be used to transmit obstacle data or, later, additional vehicle information.
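A sketch of the 72-byte obstacle record and its parsing; the use of 8-byte doubles for coordinates, a 64-bit integer for the type field, and the 42-byte address header are assumptions read off the layout described above, not a normative wire format.

```python
import struct

# 4 corner (x, y) coordinates as doubles + 8-byte type field = 72 bytes
OBSTACLE_RECORD = struct.Struct("<8dq")
HEADER_LEN = 42  # MAC/IP/port fields described above

def pack_obstacle(corners, obstacle_type):
    """Serialize four (x, y) corner points and a type id into 72 bytes."""
    flat = [v for xy in corners for v in xy]
    return OBSTACLE_RECORD.pack(*flat, obstacle_type)

def unpack_obstacles(datagram):
    """Parse all obstacle records following the 42-byte address header."""
    obstacles = []
    for off in range(HEADER_LEN, len(datagram) - OBSTACLE_RECORD.size + 1,
                     OBSTACLE_RECORD.size):
        *coords, obstacle_type = OBSTACLE_RECORD.unpack_from(datagram, off)
        obstacles.append({"corners": list(zip(coords[0::2], coords[1::2])),
                          "type": obstacle_type})
    return obstacles
```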
Embodiment 3: FIG. 2 is a schematic diagram of the coordinate transformation of the radar data interaction processing method based on a mobile terminal. The image coordinate system's pixel origin (0, 0) is the upper-left corner of the lane displayed in the screen, with the x axis to the right and the y axis down. Let the lane displayed at the mobile end be w pixels wide and h pixels high, i.e. the pixel coordinate of the lane's lower-right corner is (w, h); taking the Huawei M6 tablet as an example, the lane is displayed on the interface with pixel width w = 1800 and pixel height h = 1600. The origin of the actual vehicle coordinate system is the central point of the vehicle tail, with the X axis to the right and the Y axis toward the vehicle head. For example, with 3 lanes, each 4 meters wide and 50 meters long, and the vehicle directly below the middle lane, the displayed lateral distance from obstacle to vehicle ranges over X ∈ [-6, 6] and the longitudinal distance over Y ∈ [0, 50]; the coordinate conversion formula is therefore:
x’=1800/2+X’*(1800/12)
y’=1600–Y’*(1600/50)
For example, if the vehicle receives obstacle coordinates (2, 3), i.e. an actual lateral distance X = 2 m and longitudinal distance Y = 3 m from the host vehicle, the position displayed in the interface lane is
x’=1800/2+2*(1800/12)=1200px
y’=1600–3*(1600/50)=1504px
Embodiment 4: FIG. 3 is the mobile terminal interface of the radar data interaction processing method based on a mobile terminal. Three lanes are displayed in the interface, each 4 meters wide and 50 meters long, with the vehicle directly below the middle lane; for visual appeal, the lane plane is rotated 30 degrees into the interface. The top of the interface shows, from left to right, the battery level, vehicle speed, and steering wheel angle, displayed as integers. The left side shows the accelerator opening and brake opening as progress bars in percent; for example, at 34% throttle, 34% of the progress bar is blue. The right side shows the vehicle gear information. As for obstacles, those within 50 meters ahead of the vehicle and ±6 meters to its left and right are displayed in the interface. A corresponding obstacle model is displayed according to the obstacle type sent by the calculation generation end, and the model is translated between frames by associating obstacle ids across transmitted data frames, ensuring smooth obstacle motion.
Claims (1)
1. A radar data interaction processing method based on a mobile terminal is characterized in that:
1. Acquisition of point cloud data
Acquiring a group of disordered original point cloud data through a vehicle-mounted three-dimensional laser radar, wherein the point cloud data at least comprises point cloud points of a road area scanned by a vehicle in the driving process, and each point cloud point data comprises coordinate information and is provided with a timestamp and a light beam direction;
2. Processing point cloud data
The processing unit processes the original point cloud data: a preprocessing stage filters ground point cloud data and performs target clustering segmentation, and an identification stage computes geometric data of the clustered point cloud to identify obstacles;
1) Converting the raw point cloud data into a depth image, each pixel of such depth image storing a measured distance from the sensor to the object;
2) Filtering ground point cloud data
(1) Traversing all point cloud points in a point cloud set in the point cloud picture, and acquiring the road surface height of a driving road corresponding to each point cloud point;
(2) Determining the height coordinate of each cloud point relative to the driving road based on the point cloud coordinate information of each cloud point in the point cloud image;
(3) If the height coordinate of any point cloud point is less than or equal to the corresponding road surface height, removing the point cloud point from the point cloud set, and reducing the data volume of the point cloud;
3) Clustering point cloud data
(1) Point cloud data clustering
The Euclidean distance from each neighborhood point to the target point is calculated through a KD-Tree neighbor query, and points are clustered according to that distance; the calculation is repeated until all new points have been processed;
(2) Obstacle identification
For each cluster, the minimum-area rectangle around the obstacle is obtained by a minimum convex hull method, yielding a cuboid frame; features are extracted and classified over the cuboid frame region to identify the target obstacle; the obstacle information comprises corner coordinates, types, and ids;
3. Vehicle state information acquisition
Obtaining vehicle information through a can bus, wherein the vehicle state information comprises steering wheel turning angle, electric quantity, speed, gear, accelerator opening and brake opening;
4. Data communication
1) UDP protocol transmission data
(1) The obstacle information identified from the radar point cloud data is frame-selected; during data transmission, the corner coordinates of the obstacle's rectangular frame and the obstacle category are transmitted;
(2) Transmission of byte data over UDP protocol
Every eight bytes are defined as one field of obstacle information or vehicle information; data transmission occupies two ports, each carrying datagrams of 1498 bytes; one port sends vehicle state information, the other sends radar data information, and the two ports send in alternation;
5. Display interaction of data
1) The mobile terminal receives the obstacle data and the vehicle data;
2) Coordinate system conversion for obstacle information
The radar obstacle data takes the central point of the vehicle tail as the coordinate origin, in meters, forming the vehicle coordinate system; the mobile terminal display uses an image coordinate system: a control drawn in the middle of the interface shows the lane information, the upper-left corner of the displayed lane is the coordinate origin, and the unit is the pixel, so the vehicle coordinate system must be converted into the image coordinate system; in the vehicle coordinate system, X points right and Y points up from the origin; in the image coordinate system, x points right and y points down from the origin; the coordinate conversion formula is
x’=w/2+X’*(w/m)
y’=h–Y’*(h/n)
Wherein w is the pixel width of the displayed lane in the screen, h is the pixel height of the displayed lane in the screen, m is the maximum lateral display distance of an obstacle from the host vehicle, n is the maximum longitudinal display distance, X' is the actual lateral distance of the obstacle from the host vehicle, Y' is the actual longitudinal distance, x' is the computed lateral pixel position of the obstacle in the interface, and y' is the computed longitudinal pixel position;
3) A mobile terminal interface is developed to display the vehicle information and the obstacle information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211628850.7A CN115964446B (en) | 2022-12-18 | 2022-12-18 | Radar data interaction processing method based on mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115964446A true CN115964446A (en) | 2023-04-14 |
CN115964446B CN115964446B (en) | 2024-07-02 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103226833A (en) * | 2013-05-08 | 2013-07-31 | 清华大学 | Point cloud data partitioning method based on three-dimensional laser radar |
CN110726993A (en) * | 2019-09-06 | 2020-01-24 | 武汉光庭科技有限公司 | Obstacle detection method using single line laser radar and millimeter wave radar |
CN110816527A (en) * | 2019-11-27 | 2020-02-21 | 奇瑞汽车股份有限公司 | Vehicle-mounted night vision safety method and system |
CN111469764A (en) * | 2020-04-15 | 2020-07-31 | 厦门华厦学院 | Prediction control method based on mathematical model |
CN111985322A (en) * | 2020-07-14 | 2020-11-24 | 西安理工大学 | Road environment element sensing method based on laser radar |
CN113887276A (en) * | 2021-08-20 | 2022-01-04 | 苏州易航远智智能科技有限公司 | Image-based forward main target detection method |
WO2022022694A1 (en) * | 2020-07-31 | 2022-02-03 | 北京智行者科技有限公司 | Method and system for sensing automated driving environment |
CN114488073A (en) * | 2022-02-14 | 2022-05-13 | 中国第一汽车股份有限公司 | Method for processing point cloud data acquired by laser radar |
CN114488194A (en) * | 2022-01-21 | 2022-05-13 | 常州大学 | Method for detecting and identifying targets under structured road of intelligent driving vehicle |
CN115166700A (en) * | 2022-06-30 | 2022-10-11 | 上海西井信息科技有限公司 | Ground point cloud filtering method, system, equipment and storage medium for laser radar |
Non-Patent Citations (3)
Title |
---|
- YIMING MIAO; YUAN TANG; BANDER A. ALZAHRANI; AHMED BARNAWI; TARIK ALAFIF: "Airborne LiDAR Assisted Obstacle Recognition and Intrusion Detection Towards Unmanned Aerial Vehicle: Architecture, Modeling and Evaluation", IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 1 October 2020 (2020-10-01), pages 4531, XP011865597, DOI: 10.1109/TITS.2020.3023189 *
- WANG NING: "Research on obstacle detection method based on lidar", CHINA MASTER'S THESES FULL-TEXT DATABASE, 15 February 2022 (2022-02-15), pages 035-267 *
- WANG CAN; KONG BIN; YANG JING; WANG ZHILING; ZHU HUI: "Road boundary extraction and obstacle detection algorithm based on three-dimensional lidar", PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, no. 04, 15 April 2020 (2020-04-15), pages 70-79 *
Also Published As
Publication number | Publication date |
---|---|
CN115964446B (en) | 2024-07-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||