CN111127582A - Method, device and system for identifying trajectory overlap segments and storage medium
- Publication number: CN111127582A
- Application number: CN201811283506.2A
- Authority: CN (China)
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
Abstract
The invention provides a method, a device, a system and a storage medium for identifying trajectory overlap segments. The method comprises: acquiring a plurality of trajectories; drawing a trajectory line for each of the plurality of trajectories in the same drawing coordinate system and at the same drawing scale, so as to generate a plurality of trajectory images with the same resolution and the same size; determining a trajectory overlap portion from the plurality of trajectory images; and determining the trajectory overlap segments from the trajectory overlap portion. Because the identification is performed with machine vision, the overlapping road segments in any number of trajectories can be identified accurately and efficiently, which can greatly improve the accuracy and efficiency of road network updating.
Description
Technical Field
The invention relates to the field of digital maps, and in particular to a method, device, system and storage medium for identifying trajectory overlap segments.
Background
The road network in a digital map needs to be updated promptly to keep the map data accurate. Road network updating compares continuously collected real-time driving trajectory data with the existing road network trajectory data in order to merge repeated road segments and add missing ones. It is generally done either manually or automatically. Manual updating guarantees the quality of the resulting road network, but is time-consuming, labor-intensive and costly; current research therefore focuses on automatic road network updating, which offers far higher production efficiency and considerable labor savings. The core of automatic road network updating is the identification of overlap segments: only after the overlapping segments in the trajectories have been identified accurately can overlapping segments be merged and missing segments be added, that is, can the road network be updated.
Existing techniques for identifying trajectory overlap segments fall into two main categories. The first performs matching calculations based on computational geometry, using quantities such as distance and direction. The thresholds involved are often hard to determine, mismatches occur easily, and all points and lines must be traversed, which is computationally expensive. The second performs matching calculations based on topology, such as connection relationships. Since it is an improvement on the computational-geometry approach, it inherits that approach's drawbacks: it is still prone to mismatches and cannot identify trajectory overlap segments accurately.
A new technique for identifying trajectory overlap segments is therefore needed to solve the above problems.
Disclosure of Invention
The present invention has been made in view of the above problems, and provides a method, a device, a system and a storage medium for identifying trajectory overlap segments.
According to one aspect of the present invention, a method for identifying trajectory overlap segments comprises:
acquiring a plurality of trajectories;
drawing a trajectory line for each of the plurality of trajectories in the same drawing coordinate system and at the same drawing scale, so as to generate a plurality of trajectory images with the same resolution and the same size;
determining a trajectory overlap portion from the plurality of trajectory images; and
determining trajectory overlap segments from the trajectory overlap portion.
Illustratively, the determining a trajectory overlap portion from the plurality of trajectory images comprises:
determining the trajectory overlap portion from the pixel positions common to all trajectory lines in the plurality of trajectory images.
Illustratively, the determining the trajectory overlap portion from the pixel positions common to all trajectory lines in the plurality of trajectory images comprises:
converting the plurality of trajectory images into binary grayscale images respectively; and
determining the trajectory overlap portion from the pixel positions common to all trajectory lines in the binary grayscale images.
Illustratively, the pixel value of trajectory-line pixels in the binary grayscale images is 0 and the pixel value of non-trajectory-line pixels is non-zero;
and the determining the trajectory overlap portion from the pixel positions common to all trajectory lines in the binary grayscale images comprises:
superimposing the pixel values at the same pixel positions of the binary grayscale images, the pixels whose superimposed pixel value is 0 forming the trajectory overlap portion.
Illustratively, the superimposing the pixel values at the same pixel positions of the binary grayscale images comprises:
directly superimposing the pixel values at the same pixel positions of the binary grayscale images, any pixel whose superimposed value exceeds the maximum gray value of the binary grayscale images being set to that maximum gray value; or
superimposing the pixel values at the same pixel positions of the binary grayscale images with weights of 1/N each, N being the number of trajectories.
Illustratively, the drawing a trajectory line for each of the plurality of trajectories in the same drawing coordinate system and at the same drawing scale so as to generate a plurality of trajectory images with the same resolution and the same size comprises:
drawing the trajectory line with a preset line width, the line width being wide enough to accommodate the deviation of vehicles travelling in the same lane.
Illustratively, the determining trajectory overlap segments from the trajectory overlap portion comprises:
determining, for every two adjacent pixels in the trajectory overlap portion, the real distance in the X-axis direction and the real distance in the Y-axis direction;
determining two adjacent pixels to be segmentation points when the real distance in the X-axis direction or in the Y-axis direction between them exceeds a preset distance; and
determining the trajectory overlap segments from the segmentation points.
Illustratively, the determining, for every two adjacent pixels in the trajectory overlap portion, the real distance in the X-axis direction and the real distance in the Y-axis direction comprises:
converting the pixels of the trajectory overlap portion into a trajectory coordinate system to obtain the real coordinate values of those pixels; and
calculating, for every two adjacent pixels in the trajectory overlap portion, the real distance in the X-axis direction and the real distance in the Y-axis direction from the real coordinate values of the two pixels.
Illustratively, the converting the pixels of the trajectory overlap portion into a trajectory coordinate system to obtain the real coordinate values of those pixels comprises:
calibrating the conversion parameters from the drawing coordinate system to the trajectory coordinate system using the uppermost vertex, the lowermost vertex, the leftmost vertex and the rightmost vertex of the plurality of trajectories; and
converting the pixels of the trajectory overlap portion into the trajectory coordinate system using the conversion parameters, so as to obtain their real coordinate values.
Illustratively, the determining the trajectory overlap segments from the segmentation points comprises:
decomposing the trajectory overlap portion into trajectory overlap line segments at the segmentation points; and,
for each trajectory overlap line segment:
finding, for each of its pixels and according to the pixel's real coordinate value, the closest trajectory point in each of the plurality of trajectories, as a coincident trajectory point;
sorting all coincident trajectory points found in each trajectory by timestamp to obtain a plurality of track segments; and
judging, from the timestamp order of the head and tail points of the plurality of track segments, whether their directions of travel are consistent, and determining the plurality of track segments to be a trajectory overlap segment when the directions are consistent.
Illustratively, the method further comprises:
when the plurality of trajectories are not based on the same trajectory coordinate system, converting the plurality of trajectories into the same trajectory coordinate system before the trajectory lines are drawn.
Illustratively, the method further comprises:
when any of the plurality of trajectories itself contains an overlap segment, decomposing that trajectory into a plurality of trajectories containing no overlap segment before the trajectory lines are drawn.
According to another aspect of the present invention, there is also provided a trajectory overlap segment identification device, comprising:
an acquisition module for acquiring a plurality of trajectories;
a drawing module for drawing a trajectory line for each of the plurality of trajectories in the same drawing coordinate system and at the same drawing scale, so as to generate a plurality of trajectory images with the same resolution and the same size;
an overlap module for determining a trajectory overlap portion from the plurality of trajectory images of the same size; and
a segmentation module for determining the trajectory overlap segments from the trajectory overlap portion.
According to yet another aspect of the present invention, there is also provided a trajectory overlap segment identification system, comprising a processor and a memory, the memory storing computer program instructions which, when executed by the processor, perform the trajectory overlap segment identification method described above.
According to yet another aspect of the present invention, there is also provided a storage medium having program instructions stored thereon which, when executed, perform the trajectory overlap segment identification method described above.
With the method, device, system and storage medium for identifying trajectory overlap segments described above, the identification is performed with machine vision, so the overlapping road segments in any number of trajectories can be identified accurately and efficiently, which can greatly improve the accuracy and efficiency of road network updating.
The foregoing is only an overview of the technical solutions of the present invention. Embodiments of the invention are described below so that its technical means can be understood more clearly, and so that the above and other objects, features and advantages of the invention become more readily apparent.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 shows a schematic flow diagram of a trajectory overlap segment identification method according to one embodiment of the invention;
FIG. 2 shows a schematic flow diagram of determining a trajectory overlap portion from the pixel positions common to all trajectory lines in a plurality of trajectory images, according to one embodiment of the present invention;
FIGS. 3A and 3B each show a trajectory image according to an embodiment of the present invention, and FIG. 3C shows a superposition of the trajectory images of FIG. 3A and FIG. 3B;
FIG. 4 shows a schematic flow diagram of determining trajectory overlap segments from a trajectory overlap portion, according to one embodiment of the present invention;
FIG. 5 shows a schematic flow diagram of determining the real distance in the X-axis direction and the real distance in the Y-axis direction for every two adjacent pixels in the trajectory overlap portion, according to an embodiment of the present invention;
FIG. 6 shows a schematic flow diagram of converting the pixels of the trajectory overlap portion into the trajectory coordinate system to obtain their real coordinate values, according to one embodiment of the present invention;
FIG. 7 illustrates the conversion parameters from the drawing coordinate system to the trajectory coordinate system, according to one embodiment of the invention;
FIG. 8 shows a schematic flow diagram of determining a trajectory overlap segment from a trajectory overlap line segment, according to one embodiment of the present invention; and
FIG. 9 shows a schematic block diagram of a trajectory overlap segment identification device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the present invention clearer, exemplary embodiments of the invention are described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention, and the invention is not limited by the example embodiments described here. All other embodiments obtained by a person skilled in the art from the embodiments described herein without inventive effort shall fall within the scope of protection of the invention.
To solve the above problems, the invention provides a machine-vision-based method for identifying trajectory overlap segments. A trajectory overlap segment identification method according to an embodiment of the invention is described next with reference to FIG. 1, which shows a schematic flow diagram of a trajectory overlap segment identification method 1000 according to one embodiment of the present invention.
As shown in FIG. 1, the method 1000 includes steps S1100, S1200, S1300 and S1400.
In step S1100, a plurality of trajectories are acquired.
A road in a digital map consists of a time-stamped sequence of points. According to the topology of the road network, a road network can be divided into nodes and directed edges: the directed edges represent drivable roads, the nodes represent intersections or turning points in the roads, and the directed edges are connected through the nodes.
A trajectory is likewise a time-stamped sequence of points, and a section of drivable road can be represented by a trajectory. A trajectory may be collected by an in-vehicle terminal, extracted as the point sequence of a road segment from an existing digital map, or collected by an aerial device, such as a drone, flying along the direction of road travel.
To keep the map data accurate, trajectories must be collected for situations such as newly opened roads and changed routes, and the road network must be updated against the existing road network trajectory data.
In step S1200, for each of the plurality of trajectories acquired in step S1100, a trajectory line is drawn in the same drawing coordinate system and at the same drawing scale, so as to generate a plurality of trajectory images with the same resolution and the same size.
For each of the trajectories acquired in step S1100, the point sequence of the trajectory is drawn as a trajectory line from the coordinates of its points, using drawing software or a drawing device. All the trajectories are drawn in the same drawing coordinate system and at the same drawing scale, and each trajectory line is stored as a trajectory image with the same resolution and the same size. This yields a plurality of mutually referenced trajectory images of the same resolution and size, each corresponding to one of the trajectories, so that the same pixel position in the trajectory images refers to the same geographic position on the actual map.
In one example, step S1200 comprises drawing the trajectory line with a preset line width, the line width being wide enough to accommodate the deviation of vehicles travelling in the same lane.
Since a lane has a certain width, a vehicle travelling in it may drift somewhat to the left and right, so the collected trajectories may also drift left and right. To accommodate this deviation of vehicles travelling in the same lane, the trajectory line may be drawn with a preset line width. Specifically, the preset line width may be set according to the drawing scale and the resolution. For example, with an in-lane left-right deviation of 1.5 meters, a drawing scale of 1:500 and a resolution of 100 dpi, the preset line width may be 12 pixels. The trajectory line width in the trajectory image can then absorb the deviation of vehicles travelling in the same lane, so that trajectories passing along the same stretch of road produce a trajectory overlap portion and can be identified accurately, as the sketch below illustrates.
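As an illustration of step S1200, the following is a minimal sketch of drawing one trajectory as a binary grayscale image, assuming Python with NumPy and OpenCV; the function name, the `to_pixel` mapping and the default image size are illustrative assumptions, not part of the patent.

```python
import numpy as np
import cv2

def draw_trajectory_image(trajectory, to_pixel, size=(2000, 2000), line_width=12):
    # Start from an all-white image: non-trajectory pixels have value 255.
    img = np.full(size, 255, dtype=np.uint8)
    # to_pixel applies the shared drawing coordinate system and drawing scale,
    # so every trajectory is rendered into a mutually referenced image.
    pts = np.array([to_pixel(x, y) for (x, y) in trajectory], dtype=np.int32)
    pts = pts.reshape((-1, 1, 2))
    # line_width=12 matches the example in the text: a 1.5 m in-lane deviation
    # at drawing scale 1:500 and 100 dpi is about 12 pixels.
    cv2.polylines(img, [pts], isClosed=False, color=0, thickness=line_width)
    return img
```

Calling this once per trajectory, with the same `to_pixel` mapping and the same `size`, yields the mutually referenced trajectory images of equal resolution and size described above.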
In step S1300, a trajectory overlap portion is determined from the plurality of trajectory images generated in step S1200.
From the plurality of trajectory images generated in step S1200, the overlapping portions of the trajectory lines can be "seen" by image processing, based on the principle of machine vision. Specifically, the portion where trajectory lines coexist in the trajectory images is determined by image processing to be the trajectory overlap portion. This is fast to compute, avoids the mismatches that computational-geometry methods are prone to, and is particularly suitable for determining the trajectory overlap of more than two trajectories.
In step S1400, the trajectory overlap segments are determined from the trajectory overlap portion determined in step S1300.
The trajectory overlap portion determined in step S1300 may consist of several discontinuous sub-portions, which correspond to one or more trajectory overlap segments. The distance between the sub-portions can be obtained from the correspondence between the trajectory overlap portion and the trajectory points, and from this distance it is determined whether sub-portions belong to the same trajectory overlap segment or to different ones. The trajectory overlap segments are thereby finally determined.
The trajectory overlap segment identification method described above is based on machine vision and can identify the overlapping road segments in any number of trajectories accurately and efficiently, thereby greatly improving the accuracy and efficiency of road network updating.
Illustratively, when the plurality of trajectories acquired in step S1100 are not based on the same trajectory coordinate system, they are converted into the same trajectory coordinate system before the trajectory lines are drawn in step S1200, and the subsequent steps are then performed to identify the trajectory overlap.
Illustratively, when any of the trajectories acquired in step S1100 itself contains an overlap segment, that trajectory is decomposed into a plurality of trajectories containing no overlap segment before the trajectory lines are drawn in step S1200. For example, one of the acquired trajectories may record several laps of a loop road. Such a trajectory must either be decomposed into a plurality of trajectories that contain no overlap segment, or have the repeatedly driven laps removed so that only a single pass over the loop road remains. The subsequent steps are then performed to identify the trajectory overlap.
In one embodiment, the determining of the trajectory overlap portion from the plurality of trajectory images in step S1300 comprises: determining the trajectory overlap portion from the pixel positions common to all trajectory lines in the plurality of trajectory images.
The same pixel position in the trajectory images refers to the same geographic position on the actual map, so a pixel position common to all trajectory lines is a position where overlapping trajectories exist and corresponds to a road at the same location on the map. Various image processing algorithms may be used to determine the trajectory overlap portion. In some embodiments, the pixel positions are traversed and each position is checked for whether it is common to all trajectory lines in the trajectory images. In other embodiments, the pixel values are summed and the overlap portion is screened out from the sums. The trajectory overlap portion of all trajectory lines can thus be determined accurately with simple image processing.
FIG. 2 shows a schematic flow diagram of determining a trajectory overlap portion from the pixel positions common to all trajectory lines in a plurality of trajectory images, according to one embodiment of the present invention. As shown in FIG. 2, this determination comprises steps S1310 and S1320.
In step S1310, the plurality of trajectory images rendered in step S1200 are converted into binary grayscale images, respectively.
Each trajectory image drawn in step S1200 contains only trajectory-line and non-trajectory-line information, so the trajectory images drawn in step S1200 can be converted into binary grayscale images in which all trajectory lines of an image share one gray value and all non-trajectory pixels share another. In some embodiments, the trajectory lines and non-trajectory pixels are represented by 0 and 1; other value pairs may also be used, for example a gray value of 10 for all trajectory lines and 200 for all non-trajectory pixels.
In step S1320, the trajectory overlap portion is determined from the pixel positions common to all trajectory lines in the binary grayscale images converted in step S1310.
The trajectory overlap portion can be determined from the gray value that represents the trajectory lines in the binary grayscale images converted in step S1310. For example, if the trajectory lines have a gray value of 10, the overlap portion consists of the pixel positions whose gray value is 10 in all of the trajectory images.
According to this embodiment, converting the trajectory images into binary grayscale images before confirming the trajectory overlap portion simplifies the image algorithm for confirming the overlap and speeds up the identification of trajectory overlap segments.
Illustratively, the pixel value of trajectory-line pixels in the binary grayscale images is 0 and the pixel value of non-trajectory pixels is non-zero, for example 255. Determining the trajectory overlap portion from the pixel positions common to all trajectory lines in the binary grayscale images then comprises: superimposing the pixel values at the same pixel positions of the binary grayscale images, the pixels whose superimposed pixel value is 0 forming the trajectory overlap portion.
Since the trajectory lines have pixel value 0, the pixel positions where the trajectory images overlap still have pixel value 0 after the superposition. This further simplifies the image algorithm for confirming the trajectory overlap portion and speeds up the identification of trajectory overlap segments.
For example, the pixel values at the same pixel positions of the binary grayscale images may be superimposed directly, with any pixel whose superimposed value exceeds the maximum gray value of the binary grayscale image set to that maximum gray value. Suppose the binary grayscale image consists of trajectory lines with value 0, displayed in black, and non-trajectory pixels with value 255, displayed in white; its gray value range is 0-255, that is, its maximum gray value is 255. After superposition, the pixels of the trajectory overlap portion still have value 0 and the other pixels have value 255. The trajectory lines and non-trajectory parts of the superimposed image thus differ clearly, and the computation is simple, which helps determine the trajectory overlap segments accurately and quickly later on.
Illustratively, the pixel values at the same pixel positions of the binary grayscale images may instead be superimposed with weights of 1/N each, N being the number of trajectories. With 1/N weights, the pixels of the trajectory overlap portion still have value 0 after superposition, while the superimposed value of any other pixel does not exceed the original pixel value of a non-trajectory pixel. The overlap and non-overlap portions can therefore be distinguished by whether the pixel value is 0, as in the sketch below.
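A minimal sketch of the weighted superposition just described, assuming the binary grayscale images from the previous step (trajectory pixels 0, non-trajectory pixels 255); the function name is an illustrative assumption.

```python
import numpy as np

def trajectory_overlap_mask(images):
    # Superimpose the pixel values at the same positions with weights of 1/N.
    n = len(images)
    superimposed = sum(img.astype(np.float64) / n for img in images)
    # A pixel that is 0 (a trajectory line) in every image is still exactly 0
    # after the weighted superposition; those pixels form the overlap portion.
    return superimposed == 0
```

The direct-superposition variant differs only in clamping the summed value at the maximum gray value of the binary grayscale image instead of weighting.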
FIGS. 3A, 3B and 3C show how a trajectory overlap portion is determined from multiple trajectory images according to one embodiment of the invention. FIGS. 3A and 3B are trajectory images of the same resolution and size, generated by drawing trajectory lines for a first trajectory and a second trajectory respectively, in the same drawing coordinate system, at the same drawing scale and with the same preset line width. Preferably, the binary grayscale images are drawn directly, the trajectory lines having pixel value 0 (the black portions of FIGS. 3A and 3B) and the non-trajectory pixels having value 255 (the white portions). Superimposing the pixel values at the same pixel positions of FIG. 3A and FIG. 3B with weights of 1/2 each yields the superimposed image of FIG. 3C: the black portion with pixel value 0 is the trajectory overlap portion of the first and second trajectories, the gray portion with a pixel value of about 127 is the non-overlapping trajectory portion, and the white portion with pixel value 255 contains no trajectory.
The above examples give two image processing methods for confirming the trajectory overlap portion. Both are simple to implement and improve the processing efficiency of trajectory overlap segment identification.
FIG. 4 shows a schematic flow diagram of step S1400, determining the trajectory overlap segments from the trajectory overlap portion, according to an embodiment of the present invention. As shown in FIG. 4, step S1400 includes steps S1410, S1420 and S1430.
In step S1410, the real distance in the X-axis direction and the real distance in the Y-axis direction are determined for every two adjacent pixels in the trajectory overlap portion determined in step S1300.
Step S1300 determines the trajectory overlap portion of the trajectory images. When the distance between adjacent pixels in the trajectory overlap portion is within a certain range, the two pixels are truly adjacent and belong to the same trajectory overlap segment, whereas a larger distance between adjacent pixels indicates that they may belong to different trajectory overlap segments. Optionally, the real distance in the X-axis direction and in the Y-axis direction between two adjacent pixels is calculated, and from these real distances it is determined whether the two pixels belong to the same trajectory overlap segment or to different ones. The real distance between two adjacent pixels is the actual distance between their corresponding positions in the trajectory coordinate system (for example, a world coordinate system). It can be back-calculated from the drawing scale and the resolution, so that the overlap portion can be decomposed into different overlap segments according to parameters of the actual road, such as the turning radius at a fork. For example, if two adjacent pixels are 200 pixels apart in the X-axis direction, the resolution is 100 dpi and the drawing scale is 1:500, the real distance between them in the X-axis direction is 200 ÷ 100 × 500 × 0.0254 = 25.4 meters (1 inch = 0.0254 meters); this back-calculation is sketched below.
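The back-calculation in this example can be written as a short sketch; the function name and default parameter values are assumptions taken from the example figures in the text.

```python
def pixel_distance_to_metres(pixels, dpi=100, scale_denominator=500):
    # pixels / dpi gives the distance in inches on the drawing; the scale
    # denominator converts to real-world inches; 1 inch = 0.0254 metres.
    return pixels / dpi * scale_denominator * 0.0254

# Matches the worked example above: 200 pixels correspond to 25.4 metres.
assert abs(pixel_distance_to_metres(200) - 25.4) < 1e-9
```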
In step S1420, when the real distance in the X-axis direction or the real distance in the Y-axis direction between two adjacent pixels, as determined in step S1410, exceeds the preset distance, the two adjacent pixels are determined to be segmentation points.
Step S1410 determines the real distance in the X-axis direction and in the Y-axis direction between two adjacent pixels, and it is judged whether either real distance exceeds the preset distance. If it does, the two adjacent pixels belong to different trajectory overlap segments and are determined to be segmentation points, at which the trajectory overlap portion is decomposed into different trajectory overlap segments. If neither distance exceeds the preset distance, the two adjacent pixels belong to the same trajectory overlap segment. The preset distance may be, for example, the turning radius at an actual road fork, and may be set to 3 meters, say. Deciding whether two adjacent pixels are segmentation points from their real X-axis or Y-axis distance keeps the distance test between adjacent pixels simple to implement and improves the processing efficiency of trajectory overlap identification; a sketch follows.
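A sketch of the segmentation-point test of steps S1410 and S1420, assuming the overlap pixels have already been converted to real coordinate values (step S1411 below) and are ordered along the overlap portion; the names and the 3-metre default are illustrative.

```python
def find_segmentation_points(real_coords, preset_distance=3.0):
    # real_coords: real (x, y) values of the overlap pixels, in order, so that
    # consecutive entries are adjacent pixels of the trajectory overlap portion.
    segmentation_points = []
    for (x1, y1), (x2, y2) in zip(real_coords, real_coords[1:]):
        # Two adjacent pixels are segmentation points when the real X-axis or
        # Y-axis distance between them exceeds the preset distance.
        if abs(x2 - x1) > preset_distance or abs(y2 - y1) > preset_distance:
            segmentation_points.append(((x1, y1), (x2, y2)))
    return segmentation_points
```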
In step S1430, the trajectory overlap segments are determined from the segmentation points determined in step S1420.
The trajectory overlap portion is decomposed at the segmentation points determined in step S1420 to obtain the trajectory overlap segments. As shown in FIGS. 3A and 3B, the first trajectory and the second trajectory both travel the first road from the first intersection to the second intersection, and then the second road from the third intersection to the fourth intersection. Step S1300 determines the trajectory overlap portion of the first trajectory image (FIG. 3A) and the second trajectory image (FIG. 3B), shown as the black portions of FIG. 3C. The pixel of the second intersection and the pixel of the third intersection are adjacent pixels in the overlap portion of FIG. 3C; step S1410 determines the real X-axis and Y-axis distances between these two adjacent pixels, step S1420 finds the real X-axis distance to exceed the preset distance, and the two pixels are determined to be segmentation points. The real X-axis and Y-axis distances of the other adjacent pixels do not exceed the preset distance, so they are not segmentation points. The trajectory overlap portion can therefore be decomposed into two trajectory overlap segments at the pixels of the second and third intersections.
In this embodiment, the segmentation points are determined from the real distances between adjacent pixels in the trajectory overlap portion, and the overlap portion is decomposed accordingly. The algorithm is simple to implement, yields equally accurate identification results for trajectories over complex and simple road layouts alike, and improves the accuracy of trajectory overlap segment identification.
FIG. 5 shows a schematic flow diagram of step S1410, determining the real distance in the X-axis direction and the real distance in the Y-axis direction for every two adjacent pixels in the trajectory overlap portion, according to an embodiment of the present invention. As shown in FIG. 5, step S1410 includes steps S1411 and S1412.
In step S1411, the pixels of the trajectory overlap portion determined in step S1300 are converted into the trajectory coordinate system to obtain the real coordinate values of those pixels.
The pixels of the trajectory overlap portion determined in step S1300 are converted into the trajectory coordinate system according to the conversion relationship between the drawing coordinate system and the trajectory coordinate system, giving the real coordinate values of the pixels of the trajectory overlap portion.
In step S1412, for every two adjacent pixels in the trajectory overlap portion, the real distance in the X-axis direction and the real distance in the Y-axis direction are calculated from the real coordinate values of the two pixels obtained in step S1411.
Once step S1411 has provided the real coordinate values of every two adjacent pixels in the trajectory overlap portion, the real distance in the X-axis direction and in the Y-axis direction can be calculated from those coordinate values.
Accurate real X-axis and Y-axis distances between every two adjacent pixels of the trajectory overlap portion are thus obtained, improving the accuracy of trajectory overlap identification.
FIG. 6 shows a schematic flow diagram of step S1411, converting the pixels of the trajectory overlap portion into the trajectory coordinate system to obtain their real coordinate values, according to an embodiment of the present invention. As shown in FIG. 6, step S1411 includes steps S1411a and S1411b.
In step S1411a, the conversion parameters from the drawing coordinate system to the trajectory coordinate system are calibrated using the uppermost vertex, the lowermost vertex, the leftmost vertex and the rightmost vertex of the plurality of trajectories.
The trajectory images generated in step S1200 are based on the same drawing coordinate system. The uppermost, lowermost, leftmost and rightmost vertices of the plurality of trajectory images are used as 4 calibration reference points for calibrating the conversion parameters from the drawing coordinate system to the trajectory coordinate system. It is to be understood that these vertices are the extreme points of the trajectory lines taken over all of the trajectory images. FIG. 7 illustrates the conversion parameters from the drawing coordinate system to the trajectory coordinate system according to one embodiment of the invention. As shown in FIG. 7, {x, y} is the drawing coordinate system and {x', y'} is the trajectory coordinate system; the origin offset between the two coordinate systems is {x0, y0} and the rotation angle is θ, so that Formula 1, the rotation-plus-translation relation between the two coordinate systems, reads:

x' = x·cos θ - y·sin θ + x0
y' = x·sin θ + y·cos θ + y0        (Formula 1)

Substituting the coordinate values of the 4 calibration reference points in the drawing coordinate system for {x, y}, and the coordinate values of the corresponding trajectory points for {x', y'}, into Formula 1 solves for the conversion parameters {x0, y0, θ} from the drawing coordinate system to the trajectory coordinate system.
In step S1411b, the pixels of the trajectory overlap portion are converted into the trajectory coordinate system using the conversion parameters calibrated in step S1411a, so as to obtain the real coordinate values of those pixels.
From the coordinate value of each pixel of the trajectory overlap portion in the drawing coordinate system, and based on the conversion parameters {x0, y0, θ} calibrated in step S1411a, the coordinate value of the pixel in the trajectory coordinate system, that is, its real coordinate value, is calculated with Formula 1, as the sketch below illustrates.
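A sketch of steps S1411a and S1411b under the Formula 1 reconstruction given above (rotation by θ plus origin offset {x0, y0}); the function name is an illustrative assumption.

```python
import math

def drawing_to_trajectory(x, y, x0, y0, theta):
    # Formula 1: rotate a drawing-coordinate point by theta and shift it by
    # the origin offset {x0, y0} to get its trajectory-coordinate value.
    xp = x * math.cos(theta) - y * math.sin(theta) + x0
    yp = x * math.sin(theta) + y * math.cos(theta) + y0
    return xp, yp
```

The calibration of {x0, y0, θ} in step S1411a then amounts to fitting this transformation to the 4 reference points, for example with a least-squares solve.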
Calibrating the conversion parameters from the drawing coordinate system to the trajectory coordinate system with the uppermost, lowermost, leftmost and rightmost vertices of the trajectory images as calibration reference points exploits the maximum extent of the existing trajectories, yielding highly accurate conversion parameters and improving the accuracy of trajectory overlap segment identification.
As described above, in step S1430 the trajectory overlap segments are determined from the segmentation points determined in step S1420. In one example, the trajectory overlap portion is decomposed into trajectory overlap line segments at the determined segmentation points. It will be appreciated that a trajectory overlap line segment only represents the spatially coincident portions of the trajectories in the trajectory images; because of the direction of travel and similar factors, it is not necessarily a desired trajectory overlap segment, so the trajectory overlap segment still has to be determined from the trajectory overlap line segment. FIG. 8 shows a schematic flow diagram of determining a trajectory overlap segment from a trajectory overlap line segment, according to one embodiment of the present invention. As shown in FIG. 8, for each trajectory overlap line segment, steps S1431, S1432 and S1433 are performed to determine a trajectory overlap segment.
In step S1431, for each pixel of the trajectory overlap line segments obtained from step S1420, the closest trajectory point in each of the plurality of trajectories is found according to the pixel's real coordinate value, as a coincident trajectory point.
It should be understood that a pixel in a trajectory overlap line segment does not necessarily have an exactly corresponding trajectory point in every trajectory; that is, the real coordinate value obtained by converting the pixel into the trajectory coordinate system is not necessarily exactly a trajectory point of each trajectory. In particular, a pixel of an overlap line segment determined from trajectory lines widened to the preset line width is likely to have no exactly corresponding trajectory point. Therefore, in each trajectory, the trajectory point closest to the real coordinate value of the pixel is taken as a coincident trajectory point. It will be appreciated that several pixels of a trajectory overlap line segment may correspond to the same coincident trajectory point.
In step S1432, all the coincident trajectory points found in each trajectory in step S1431 are sorted by timestamp to obtain a plurality of track segments.
Each coincident trajectory point carries timestamp information. Sorting all coincident trajectory points found in a trajectory by timestamp gives the track segment of that trajectory corresponding to the trajectory overlap line segment. This yields, across the plurality of trajectories, the track segments corresponding to the same trajectory overlap line segment; a sketch of steps S1431 and S1432 follows.
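Steps S1431 and S1432 can be sketched as follows, assuming SciPy's k-d tree for the nearest-point search and trajectory points stored as (x, y, timestamp) tuples; all names are illustrative assumptions.

```python
from scipy.spatial import cKDTree

def coincident_track_segment(overlap_real_coords, trajectory):
    # trajectory: list of (x, y, timestamp) points of one trajectory.
    tree = cKDTree([(x, y) for (x, y, _) in trajectory])
    # For each overlap pixel, take the closest trajectory point (step S1431);
    # several pixels may map to the same coincident trajectory point.
    _, indices = tree.query(overlap_real_coords)
    coincident = {trajectory[i] for i in indices}
    # Sort the coincident trajectory points by timestamp (step S1432).
    return sorted(coincident, key=lambda point: point[2])
```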
In step S1433, it is judged from the timestamp order of the head and tail points of the track segments obtained in step S1432 whether the directions of travel of the track segments are consistent, and the track segments are determined to be a trajectory overlap segment when the directions are consistent.
Trajectory overlap segment identification also involves identifying the direction of travel: only trajectories that traverse the same road segment in the same direction can form a trajectory overlap segment. Using the timestamp order of the head and tail points of the track segments obtained in step S1432, it can be judged whether the directions of travel of the track segments are consistent; when they are, the track segments are determined to be a trajectory overlap segment, as sketched below.
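The direction-consistency judgment of step S1433 can be sketched as below; comparing head-to-tail displacement vectors is one assumed way of realizing the timestamp-order check described in the text.

```python
def travel_directions_consistent(track_segments):
    # track_segments: per-trajectory lists of (x, y, timestamp) points, each
    # sorted by timestamp (the output of the previous step).
    def head_to_tail(segment):
        (x1, y1, _), (x2, y2, _) = segment[0], segment[-1]
        return (x2 - x1, y2 - y1)
    vectors = [head_to_tail(segment) for segment in track_segments]
    ax, ay = vectors[0]
    # Directions are consistent when every segment advances the same way as
    # the first one, i.e. the displacement vectors have a positive dot product.
    return all(ax * bx + ay * by > 0 for (bx, by) in vectors[1:])
```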
According to the above embodiment of the present invention, the trajectory overlap portion obtained by image operations is mapped back into the trajectory coordinate system, so that the trajectory overlap segments are identified accurately. The algorithm is simple to implement and identifies the overlapping road segments in any number of trajectories accurately and efficiently, which can greatly improve the accuracy and efficiency of road network updating.
According to another embodiment of the present invention, a trajectory overlap segment identification device is also provided. FIG. 9 shows a schematic block diagram of a trajectory overlap segment identification device 900 according to an embodiment of the present invention. As shown in FIG. 9, the trajectory overlap segment identification device 900 includes an acquisition module 910, a drawing module 920, an overlap module 930 and a segmentation module 940.
The acquisition module 910 acquires a plurality of trajectories.
The drawing module 920 draws a trajectory line for each of the plurality of trajectories in the same drawing coordinate system and at the same drawing scale, so as to generate a plurality of trajectory images with the same resolution and the same size.
The overlap module 930 determines a trajectory overlap portion from the plurality of trajectory images of the same size.
The segmentation module 940 determines the trajectory overlap segments from the trajectory overlap portion.
In summary, each module of the trajectory overlap segment identification device 900 performs the corresponding step of the trajectory overlap segment identification method described above. From the description of the method, a person skilled in the art can understand the specific implementation and technical effects of the device 900.
According to yet another aspect of the present invention, there is also provided a trajectory overlap segment identification system, comprising a processor and a memory, the memory storing computer program instructions for implementing the steps of the trajectory overlap segment identification method according to an embodiment of the present invention. The processor executes the computer program instructions stored in the memory to perform the corresponding steps of the method, and implements the acquisition module 910, the drawing module 920, the overlap module 930 and the segmentation module 940 of the trajectory overlap segment identification device according to an embodiment of the present invention.
Furthermore, according to still another aspect of the present invention, there is also provided a storage medium on which program instructions are stored; when the program instructions are executed by a computer or processor, they cause the computer or processor to perform the corresponding steps of the trajectory overlap segment identification method according to an embodiment of the present invention and to implement the corresponding modules of the trajectory overlap segment identification device according to an embodiment of the present invention. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the above. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or digital signal processor (DSP) may be used in practice to implement some or all of the functions of some of the modules of the trajectory overlap segment identification device according to embodiments of the present invention. The present invention may also be embodied as programs (for example, computer programs and computer program products) for performing part or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may take the form of one or more signals; such a signal may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A trajectory overlap segment identification method, comprising:
acquiring a plurality of trajectories;
drawing a trajectory line for each of the plurality of trajectories in the same drawing coordinate system and at the same drawing scale, so as to generate a plurality of trajectory images with the same resolution and the same size;
determining a trajectory overlap portion from the plurality of trajectory images; and
determining trajectory overlap segments from the trajectory overlap portion.
2. The method of claim 1, wherein said determining a trajectory coincidence from the plurality of trajectory images comprises:
and determining the track superposition part according to the pixel point position shared by all the track lines in the plurality of track images.
3. The method of claim 2, wherein said determining the trajectory overlap from pixel point locations common to all trajectory lines in the plurality of trajectory images comprises:
converting the plurality of track images into binary gray-scale images respectively; and
and determining the track coincidence part according to the pixel point position shared by all the track lines in the binary gray-scale map.
4. The method of claim 3, wherein, in the binary gray-scale images, the pixels of a track line have a pixel value of 0 and the pixels off a track line have a non-zero pixel value; and
wherein the determining the track overlapping part from the pixel positions shared by all the track lines in the binary gray-scale images comprises:
superimposing the pixel values at each identical pixel position across the binary gray-scale images, wherein the pixels whose superimposed value is 0 form the track overlapping part.
5. The method of claim 4, wherein the superimposing the pixel values at each identical pixel position across the binary gray-scale images comprises:
directly summing the pixel values at each identical pixel position, wherein a pixel whose summed value exceeds the maximum gray value of the binary gray-scale images is set to the maximum gray value; or
summing the pixel values at each identical pixel position with weights, wherein each weight is 1/N and N is the number of the tracks.
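A minimal sketch of the two superposition variants of claims 4 and 5, assuming the binary gray-scale images are NumPy arrays in which track-line pixels are 0 and all other pixels are 255; the function names and the mask extraction are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def overlay_clamped(binary_images, max_gray=255):
    # Direct superposition: sum the pixel values at each identical pixel
    # position, clamping values that exceed the maximum gray value.
    total = np.zeros(binary_images[0].shape, dtype=np.int64)
    for img in binary_images:
        total += img
    return np.clip(total, 0, max_gray).astype(np.uint8)

def overlay_weighted(binary_images):
    # Weighted superposition with weights 1/N, N being the number of tracks.
    n = len(binary_images)
    total = sum(img.astype(np.float64) / n for img in binary_images)
    return total.astype(np.uint8)

# Either way, a pixel stays 0 only if it is 0 (a track-line pixel) in
# every image, so the 0-valued pixels form the track overlapping part:
# overlap_mask = overlay_clamped(binary_images) == 0
```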
6. The method of any one of claims 1 to 5, wherein the drawing a track line for each track of the plurality of tracks using the same drawing coordinate system and the same drawing scale to generate a plurality of track images of the same resolution and the same size comprises:
drawing the track line with a preset line width, the line width being large enough to cover the lateral deviation between vehicles traveling in the same lane.
7. The method of any one of claims 1 to 5, wherein the determining the track overlapping section from the track overlapping part comprises:
determining, for every two adjacent pixels in the track overlapping part, the real distance in the X-axis direction and the real distance in the Y-axis direction;
determining two adjacent pixels as segmentation points when the real distance in the X-axis direction or the real distance in the Y-axis direction between them exceeds a preset distance; and
determining the track overlapping section from the segmentation points.
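As a hedged illustration of the segmentation test in claim 7: assuming the pixels of the track overlapping part have already been converted to real coordinates (see claim 8) and ordered along the overlapping part, the preset distance is represented here by the assumed parameter max_gap.

```python
def find_segmentation_points(real_coords, max_gap=10.0):
    # real_coords: ordered list of (x, y) real coordinate values of the
    # pixels of the track overlapping part; max_gap is the preset distance.
    splits = []
    for i in range(len(real_coords) - 1):
        (x0, y0), (x1, y1) = real_coords[i], real_coords[i + 1]
        # Adjacent pixels farther apart than max_gap along either axis
        # become segmentation points.
        if abs(x1 - x0) > max_gap or abs(y1 - y0) > max_gap:
            splits.append(i + 1)  # a new line segment starts at index i + 1
    return splits
```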
8. The method of claim 7, wherein the determining the real distance in the X-axis direction and the real distance in the Y-axis direction for every two adjacent pixels in the track overlapping part comprises:
converting the pixels of the track overlapping part into a track coordinate system to obtain the real coordinate values of those pixels; and
calculating, for every two adjacent pixels in the track overlapping part, the real distance in the X-axis direction and the real distance in the Y-axis direction from the real coordinate values of the two adjacent pixels.
9. The method of claim 8, wherein the converting the pixels of the track overlapping part into a track coordinate system to obtain the real coordinate values of those pixels comprises:
calibrating conversion parameters from the drawing coordinate system to the track coordinate system using the uppermost, lowermost, leftmost and rightmost vertices of the plurality of tracks; and
converting the pixels of the track overlapping part into the track coordinate system according to the conversion parameters, so as to obtain the real coordinate values of those pixels.
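Assuming the drawing step used a linear mapping (as in the earlier sketch), the calibration of claim 9 reduces to recovering a scale and an offset per axis from the extreme vertices; the following is a sketch under that assumption, not the patent's stated formula.

```python
def calibrate(min_x, max_x, min_y, max_y, width, height):
    # min_x/max_x come from the leftmost/rightmost vertices of the tracks,
    # min_y/max_y from the lowermost/uppermost ones; width and height are
    # the track image dimensions in pixels.
    scale_x = (max_x - min_x) / (width - 1)
    scale_y = (max_y - min_y) / (height - 1)
    return scale_x, scale_y, min_x, min_y

def pixel_to_track(px, py, params):
    # Invert the drawing mapping: pixel position -> real coordinate value.
    scale_x, scale_y, min_x, min_y = params
    return min_x + px * scale_x, min_y + py * scale_y
```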
10. The method of claim 7, wherein the determining the track overlapping section from the segmentation points comprises:
decomposing the track overlapping part into track overlapping line segments according to the segmentation points; and
for each track overlapping line segment:
searching, for each pixel of the line segment and according to the real coordinate value of that pixel, each of the plurality of tracks for the track point with the shortest distance to the pixel, as an overlapping track point;
sorting the overlapping track points found in each track by their timestamps to obtain a plurality of track segments; and
judging, from the timestamp order of the head and tail points of the plurality of track segments, whether the track segments travel in the same direction, and determining the plurality of track segments as the track overlapping section when the travel directions are consistent.
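A rough sketch of the per-segment check in claim 10, assuming each track is a list of (x, y, timestamp) tuples; the nearest-point search, the set-based deduplication, and the coarse direction test are assumptions of this illustration, not the claimed procedure itself.

```python
import math

def nearest_track_point(px, py, track):
    # track: list of (x, y, timestamp) tuples; returns the track point
    # with the shortest distance to the pixel's real coordinates.
    return min(track, key=lambda p: math.hypot(p[0] - px, p[1] - py))

def segments_travel_same_direction(segment_pixels, tracks):
    directions = []
    for track in tracks:
        # Overlapping track points for this segment, sorted by timestamp.
        points = sorted(
            {nearest_track_point(px, py, track) for px, py in segment_pixels},
            key=lambda p: p[2],
        )
        if len(points) < 2:
            return False  # too few points to establish a travel direction
        head, tail = points[0], points[-1]
        # Coarse travel direction from head to tail of the track segment.
        directions.append((tail[0] - head[0] >= 0, tail[1] - head[1] >= 0))
    # The track segments form a track overlapping section only if all
    # travel directions agree.
    return all(d == directions[0] for d in directions)
```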
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811283506.2A CN111127582B (en) | 2018-10-31 | 2018-10-31 | Track overlapping section identification method, device, system and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---
CN111127582A | 2020-05-08
CN111127582B (en) | 2023-06-23
Family ID: 70485120
Family Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---|
CN201811283506.2A Active CN111127582B (en) | 2018-10-31 | 2018-10-31 | Track overlapping section identification method, device, system and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111127582B (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04190283A (en) * | 1990-11-24 | 1992-07-08 | Hitachi Ltd | Car running position display method and its device |
JP2003330361A (en) * | 2002-05-10 | 2003-11-19 | Mitsubishi Electric Corp | Map data updating system and map data editing system |
JP2004258981A (en) * | 2003-02-26 | 2004-09-16 | Hitachi Ltd | Vehicle monitoring method and device |
US20060217879A1 (en) * | 2004-07-16 | 2006-09-28 | Tomoya Ikeuchi | Map information processing device |
CN1967151A (en) * | 2005-11-16 | 2007-05-23 | 日产自动车株式会社 | Map data updating system and map data updating method |
JP2008039687A (en) * | 2006-08-09 | 2008-02-21 | Denso Corp | Road map updating system and vehicle-side device used for the same |
WO2015063422A2 (en) * | 2013-11-04 | 2015-05-07 | Renault S.A.S. | Device for detecting the lateral position of a pedestrian relative to the trajectory of the vehicle |
CN104732789A (en) * | 2015-04-08 | 2015-06-24 | 山东大学 | Method for generating road network map based on bus GPS data |
US20170236284A1 (en) * | 2016-02-13 | 2017-08-17 | University Of Rochester | Registration of aerial imagery to vector road maps with on-road vehicular detection and tracking |
CN108564657A (en) * | 2017-12-28 | 2018-09-21 | 达闼科技(北京)有限公司 | A kind of map constructing method, electronic equipment and readable storage medium storing program for executing based on high in the clouds |
Non-Patent Citations (1)
Title |
---|
YANG Wei; AI Tinghua: "Research on a road network updating method based on vehicle trajectory big data", no. 12 *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111784730A (en) * | 2020-07-01 | 2020-10-16 | 杭州海康威视数字技术股份有限公司 | Object tracking method and device, electronic equipment and storage medium |
CN111784730B (en) * | 2020-07-01 | 2024-05-03 | 杭州海康威视数字技术股份有限公司 | Object tracking method and device, electronic equipment and storage medium |
CN113436248A (en) * | 2021-06-18 | 2021-09-24 | 黑龙江惠达科技发展有限公司 | Method and device for calculating operating area of agricultural machine |
CN114372313A (en) * | 2022-01-07 | 2022-04-19 | 上海盎维信息技术有限公司 | Image processing method and system for actual measurement and laser scanner |
Also Published As
Publication number | Publication date |
---|---|
CN111127582B (en) | 2023-06-23 |
Similar Documents
Publication | Title
---|---
EP3792901B1 (en) | Ground mark extraction method, model training method, device and storage medium
EP3109842B1 (en) | Map-centric map matching method and apparatus
CN107341453B (en) | Lane line extraction method and device
Borkar et al. | A novel lane detection system with efficient ground truth generation
JP5057183B2 (en) | Reference data generation system and position positioning system for landscape matching
WO2018068653A1 (en) | Point cloud data processing method and apparatus, and storage medium
CN108279016B (en) | Smoothing processing method and device for HAD map, navigation system and automatic driving system
CN112560747B (en) | Lane boundary interactive extraction method based on vehicle-mounted point cloud data
CN111127582B (en) | Track overlapping section identification method, device, system and storage medium
CN110969592B (en) | Image fusion method, automatic driving control method, device and equipment
US20220219700A1 | Apparatus, method, and computer program for generating map
CN113033497B (en) | Lane line identification method, device, equipment and computer readable storage medium
CN112418193B (en) | Lane line identification method and system
CN115131363A (en) | Positioning method and device based on semantic information and terminal equipment
Lu et al. | A lightweight real-time 3D LiDAR SLAM for autonomous vehicles in large-scale urban environment
CN115035251A (en) | Bridge deck vehicle real-time tracking method based on domain-enhanced synthetic data set
Yuan et al. | Estimation of vehicle pose and position with monocular camera at urban road intersections
US11835359B2 | Apparatus, method and computer program for generating map
Lee et al. | Semi-automatic framework for traffic landmark annotation
CN112381726B (en) | Construction method and device for global map of underground garage
CN111238524B (en) | Visual positioning method and device
CN102938156B (en) | Planar note configuration method based on integral images
KR102425346B1 | Apparatus for determining position of vehicle and method thereof
CN115249223A (en) | Dynamic target detection method and device, storage medium and terminal
Sai et al. | Detection of Lanes and Objects Using Deep Learning Techniques
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||