CN112712040A - Method, device and equipment for calibrating lane line information based on radar and storage medium
- Publication number: CN112712040A (application number CN202011641279.3A)
- Authority: CN (China)
- Prior art keywords: lane line, line information, road, information, curve
- Prior art date: 2020-12-31
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06V20/588: Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
- G06F18/2321: Non-hierarchical clustering techniques using statistics or function optimisation, e.g. modelling of probability density functions
- G06V10/757: Matching configurations of points or features
- Y02T10/40: Engine management systems (climate change mitigation technologies related to transportation)
Abstract
The invention discloses a method, an apparatus, a device and a storage medium for calibrating lane line information based on radar. The method comprises: obtaining static target information on both sides of a road through a radar device mounted on a vehicle; clustering the static target information and performing curve fitting on the resulting clusters; and performing feature matching between the fitted curve and the lane line information identified by the camera, executing a normal vehicle control command if the matching is successful, and calibrating the lane line information according to the fitted curve if the matching is unsuccessful. In the method for calibrating lane line information provided by the embodiments of the disclosure, a millimeter-wave radar is used to extract road information, which can be cross-checked against the lane line information identified by the camera without adding any sensor to the system, so the accuracy of lane line recognition is greatly improved.
Description
Technical Field
The invention relates to the technical field of intelligent vehicles, and in particular to a method, device, equipment and storage medium for calibrating lane line information based on radar.
Background
Accurately identifying the lane line information of a road is an important technical problem in the field of intelligent driving. At present, lane keeping assistance (LKA) is mainly based on a single-camera scheme: lane lines are identified by a single camera mounted on the vehicle, the deviation between the vehicle and the lane center is calculated, and the deviation is reduced through steering control so that the vehicle drives centered in the lane. However, this function senses lane information only through a single forward-facing camera; blurred or damaged lane lines on real roads strongly interfere with the camera, false recognition occasionally occurs, and this is very dangerous in high-speed scenarios. There are also schemes based on two cameras, but a second camera increases cost, and both cameras still identify the same lane line information; if the lane line is damaged, both detections may be inaccurate, so a dual-camera scheme cannot provide good redundancy.
Disclosure of Invention
The embodiment of the disclosure provides a method, a device, equipment and a storage medium for calibrating lane line information based on radar. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is intended to neither identify key/critical elements nor delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
In a first aspect, an embodiment of the present disclosure provides a method for calibrating lane line information based on a radar, including:
obtaining static target information on two sides of a road through radar equipment arranged on a vehicle;
clustering static target information, and performing curve fitting on the classified clusters;
and performing feature matching between the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control command, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve.
In one embodiment, before obtaining the stationary target information on both sides of the road by the radar device mounted on the vehicle, the method further comprises:
and acquiring the lane line information identified by the vehicle camera and the confidence information of the lane line.
In one embodiment, after acquiring the lane line information identified by the vehicle camera and the confidence information of the lane line, the method further includes:
judging the road type according to the lane line information;
if the road type is a straight road, directly executing a vehicle control instruction without calibrating lane line information;
and if the road type is a curved road, acquiring the information of the static targets at the two sides of the road through radar equipment installed on the vehicle.
In one embodiment, clustering stationary target information and curve fitting the classified clusters includes:
clustering static target information through a clustering algorithm based on density growth;
and performing curve fitting on each classified cluster to obtain a fitted cubic function curve.
In one embodiment, after curve fitting the classified clusters, the method further includes:
determining the confidence coefficient of the fitting curve according to the distance between the vehicle and the fitting curve and the number of the static targets in the cluster where the fitting curve is located;
when the confidence coefficient is greater than or equal to a preset first threshold value, performing feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve;
and when the confidence coefficient is smaller than a preset first threshold value, extracting obstacle information output by the radar and the camera, determining a track to be traveled by the vehicle according to a control instruction to be executed, if no obstacle exists in the track within a preset time period, executing the control instruction, otherwise, not executing, and sending alarm information.
In one embodiment, the matching the fitted curve with the lane line information identified by the camera includes:
determining a first curvature and a first curvature change rate of the road according to the lane line information identified by the camera;
determining a second curvature and a second curvature change rate of the road according to the fitted curve;
and if the difference value between the first curvature and the second curvature is smaller than a preset second threshold value and the difference value between the first curvature change rate and the second curvature change rate is smaller than a preset third threshold value, determining that the matching is successful.
In one embodiment, calibrating lane line information from the fitted curve comprises:
obtaining the confidence coefficient of the fitted curve and the confidence coefficient of the lane line identified by the camera;
and correcting the characteristic information of the lane line by adopting a weighted average mode according to the confidence degrees of the two.
In a second aspect, an embodiment of the present disclosure provides an apparatus for calibrating lane line information based on a radar, including:
the acquisition module is used for acquiring static target information on two sides of a road through radar equipment installed on a vehicle;
the curve fitting module is used for clustering the static target information and performing curve fitting on the classified clusters;
and the calibration module is used for carrying out feature matching on the fitted curve and the lane line information identified by the camera, executing a normal vehicle control command if the matching is successful, and calibrating the lane line information according to the fitted curve if the matching is unsuccessful.
In a third aspect, the present disclosure provides an apparatus for calibrating lane line information based on radar, including a processor and a memory storing program instructions, where the processor is configured to execute the method for calibrating lane line information based on radar provided in the foregoing embodiments when executing the program instructions.
In a fourth aspect, the disclosed embodiments provide a computer-readable medium having computer-readable instructions stored thereon, where the computer-readable instructions are executable by a processor to implement a method for calibrating lane line information based on radar provided in the above embodiments.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the method for calibrating lane line information provided by the embodiment of the disclosure, the millimeter wave radar is adopted to extract the static target information at two sides of a road, the road characteristics are extracted by utilizing the information of road guardrails, trees and the like, the road characteristics are mutually verified with the lane line information identified by the camera, and different road information is adopted for verification, so that the verification accuracy and the system safety are greatly improved, and the problem of inaccurate lane line identification caused by interference on the identification of the camera lane line due to the situations of fuzzy lane line, damage and the like of an actual road in the prior art is solved. In addition, the method in the embodiment of the disclosure does not increase system sensors, and reduces cost.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow diagram illustrating a method for calibrating lane-line information based on radar in accordance with an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method for calibrating lane-line information based on radar in accordance with an exemplary embodiment;
FIG. 3 is a schematic diagram illustrating clustering of stationary objects in accordance with an exemplary embodiment;
FIG. 4 is a schematic diagram illustrating a curve fitting of clustered clusters in accordance with an exemplary embodiment;
FIG. 5 is a block diagram illustrating an apparatus for calibrating lane-line information based on radar in accordance with an exemplary embodiment;
FIG. 6 is a block diagram illustrating an apparatus for calibrating lane-line information based on radar in accordance with an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating a computer storage medium in accordance with an exemplary embodiment.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of systems and methods consistent with certain aspects of the invention, as detailed in the appended claims.
In the description of the present invention, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified. "And/or" describes the association relationship of the associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The embodiments of the disclosure provide a method in which radar assists in calibrating the lane lines detected by a camera: static target information on both sides of the road is extracted by a forward-facing millimeter-wave radar on the vehicle, the static targets are fitted to a cubic equation according to the extracted information, and the lane line information identified by the camera is calibrated against this equation. If the camera result matches the cubic equation, the system is allowed to perform large-angle steering control of the vehicle; if it does not, the vehicle is only allowed to perform small-range control. Because the millimeter-wave radar extracts the road information, it can be cross-checked with the camera without adding a sensor, which greatly improves system safety and removes the functional-safety bottleneck of the system on the sensor side.
The method for calibrating lane line information based on radar according to the embodiments of the present application will be described in detail below with reference to FIGS. 1 to 4. Referring to FIG. 1, the method includes the following steps:
s101, static target information on two sides of a road is obtained through radar equipment installed on a vehicle.
In a possible implementation manner, before executing step S101, the method further includes obtaining lane line information identified by a forward-facing camera installed on the vehicle, and in the prior art, the identified lane line information and confidence of the lane line may be output by the camera installed on the vehicle.
Further, the road type is determined according to the identified lane lines. For example, the road curvature is calculated from the identified lane line information; the larger the curvature, the more curved the road, and when the road curvature is larger than a preset curvature threshold the road is determined to be a curved road.
If the road type is judged to be a straight road, the road condition is simple, the lane line information does not need to be further checked, and the vehicle control command is executed directly. If the road type is judged to be a curved road, the road condition is relatively complex, and to improve the accuracy of lane line recognition, the radar device on the vehicle is introduced to further verify the lane line information.
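As a small illustration (not the patent's implementation), this road-type decision can be sketched as follows; the function name and the curvature threshold are placeholders, since no value is given in the text.

```python
def is_curved_road(camera_curvature_1pm: float, threshold_1pm: float = 0.003) -> bool:
    """Classify the road as curved when the curvature reported by the camera
    exceeds a preset threshold. The threshold (~1/330 m here) is a placeholder."""
    return abs(camera_curvature_1pm) > threshold_1pm
```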
Specifically, the information on both sides of the road can be acquired by the millimeter-wave radar device installed on the vehicle, and the static target information on both sides of the road, including trees planted along the road, guardrails and street lamps on both sides, is then extracted from the data collected by the millimeter-wave radar.
Using the millimeter-wave radar already on the vehicle to obtain the static target information on both sides of the road greatly reduces system cost, and extracting road information from a different source adds another dimension to the information.
S102, clustering the static target information, and performing curve fitting on the classified clusters.
In order to determine road features from the extracted static target information, the static target information is first clustered. In one possible implementation, the static target information is clustered by a clustering algorithm based on density growth. Commonly used clustering algorithms generally require the number of clusters to be specified in advance; in this application scenario, however, the environment is complex and the number of static targets is unknown, so clustering algorithms with a fixed number of clusters are not suitable. In one possible implementation, following the ideas of the DBSCAN, OPTICS and DENCLUE clustering algorithms and of region growing, the method starts from a seed point, processes the points within a certain spatial radius, adds the points that meet the requirement to the same cluster, and discards the points that do not, yielding the clustered static target clusters.
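The Python sketch below illustrates this region-growing idea under assumed parameters (the search radius and the minimum cluster size are placeholders); the patent does not prescribe a specific implementation.

```python
import numpy as np

def cluster_by_density_growth(points, radius=2.0, min_points=3):
    """Group 2-D radar detections (x, y) into clusters by region growing:
    start from a seed point, absorb every unassigned point within `radius`,
    and keep growing the cluster from the newly added points."""
    points = np.asarray(points, dtype=float)
    unassigned = set(range(len(points)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        frontier, members = [seed], [seed]
        while frontier:
            idx = frontier.pop()
            rest = np.array(sorted(unassigned), dtype=int)
            if rest.size == 0:
                break
            dists = np.linalg.norm(points[rest] - points[idx], axis=1)
            for j in rest[dists <= radius]:        # points close enough to join
                unassigned.discard(int(j))
                frontier.append(int(j))
                members.append(int(j))
        if len(members) >= min_points:             # drop sparse noise clusters
            clusters.append(points[members])
    return clusters
```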
FIG. 3 is a schematic diagram illustrating the clustering of stationary targets according to an exemplary embodiment. As shown in FIG. 3, the left half shows the clustering of static targets on both sides of a road: two clusters are obtained, one on each side of the road. The right half shows the clustering of static targets on both sides of a curved road: again, two clusters are obtained, one on each side of the road.
Further, in order to represent the road information conveniently, curve fitting is performed on each classified cluster to obtain a fitted cubic function curve. FIG. 4 is a diagram illustrating curve fitting of the clusters according to an exemplary embodiment: the two thick vertical lines in the left half are fitted curves, and the thick curved line in the right half is also a fitted curve. The fitted curves are cubic curves of the form y = ax³ + bx² + cx + d, and the information of the fitted curve is obtained by solving for the coefficients of this equation.
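A minimal sketch of the per-cluster fit, assuming a least-squares polynomial fit over points given as (longitudinal x, lateral y) pairs; the patent only states that a cubic curve is fitted, so the fitting method is an assumption.

```python
import numpy as np

def fit_cubic(cluster: np.ndarray):
    """Least-squares fit of y = a*x**3 + b*x**2 + c*x + d to one cluster of
    radar points (column 0: longitudinal x, column 1: lateral y)."""
    x, y = cluster[:, 0], cluster[:, 1]
    a, b, c, d = np.polyfit(x, y, deg=3)   # coefficients, highest order first
    return a, b, c, d

# e.g. fitted_curves = [fit_cubic(c) for c in clusters]
```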
S103, carrying out feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control command, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve.
In one embodiment, after curve fitting is performed on the classified clusters, the confidence of the fitted curve is further determined according to the distance between the vehicle and the fitted curve and the number of the stationary targets in the cluster where the fitted curve is located.
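The patent names the two inputs to this confidence (the distance between the vehicle and the curve, and the number of points in the cluster) but not the mapping; the sketch below uses an assumed linear scoring purely for illustration, and its parameter values are placeholders.

```python
def curve_confidence(curve, cluster, max_offset_m: float = 10.0, full_count: int = 20) -> float:
    """Illustrative confidence in [0, 1]: a curve that passes closer to the
    vehicle and is supported by more radar points scores higher."""
    a, b, c, d = curve
    offset = abs(d)                                    # curve's lateral offset at x = 0
    dist_score = max(0.0, 1.0 - offset / max_offset_m)
    count_score = min(1.0, len(cluster) / full_count)
    return 0.5 * dist_score + 0.5 * count_score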
Further, the confidence of the fitted curve is compared with a preset first threshold; the preset first threshold may be set by a person skilled in the art and is not specifically limited in the embodiments of the disclosure. When the confidence is smaller than the preset first threshold, the reliability of the fitted curve is low, and the lane line information is not calibrated with the fitted curve. Instead, the obstacle information output by the radar and the camera is extracted, the trajectory the vehicle is about to travel is pre-traced according to the steering control command to be executed, and if no obstacle appears in that trajectory within a preset time period, the steering control command is executed; otherwise it is not executed and warning information is sent to notify the driver.
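A rough sketch of this low-confidence fallback, with an assumed path format, look-ahead horizon and clearance; none of these names or values come from the patent.

```python
from math import hypot

def command_is_safe(planned_path, obstacles, horizon_s: float = 3.0,
                    clearance_m: float = 1.5) -> bool:
    """Pre-trace the trajectory implied by the pending steering command and
    reject it if any obstacle reported by radar or camera comes within
    `clearance_m` of it inside the look-ahead horizon."""
    for t, x, y in planned_path:              # (time_s, x_m, y_m) samples
        if t > horizon_s:
            break
        for ox, oy in obstacles:              # fused obstacle positions (x_m, y_m)
            if hypot(x - ox, y - oy) < clearance_m:
                return False                  # do not execute; warn the driver instead
    return True
```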
And when the confidence coefficient is greater than or equal to a preset first threshold value, performing characteristic matching on the fitted curve and the lane line information identified by the camera, and if the matching is successful, executing a normal vehicle control command. And if the matching is unsuccessful, calibrating lane line information according to the fitted curve.
Specifically, if the confidence of the fitted curve reaches the threshold, the fitted curve is considered reliable, and the lane line information can be further calibrated according to the fitted curve information. The fitted curve with the highest confidence on each side of the road is selected, and a second curvature, a second curvature change rate and similar quantities are calculated from it. A first curvature, a first curvature change rate and similar quantities are then calculated from the lane line information identified by the camera, and the road features obtained by the two methods are matched: the difference between the first curvature and the second curvature and the difference between the first curvature change rate and the second curvature change rate are computed, and if the former is smaller than a preset second threshold and the latter is smaller than a preset third threshold, the differences are within the preset ranges, the lane information identified by the two methods is consistent, and the road features match. The second threshold and the third threshold define the allowed difference ranges and can be set by a person skilled in the art.
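Under the usual lane-model approximation (curvature ≈ 2b and curvature change rate ≈ 6a at the vehicle position for a cubic y = ax³ + bx² + cx + d), this matching step could look as follows; the formulas and threshold values are assumptions, since the patent does not give them.

```python
def curvature_features(curve):
    """Approximate curvature and curvature change rate at x = 0 for a fitted
    cubic y = a*x**3 + b*x**2 + c*x + d, assuming a small heading term c."""
    a, b, _, _ = curve
    return 2.0 * b, 6.0 * a

def features_match(radar_curve, camera_curve,
                   curv_thresh: float = 0.01, rate_thresh: float = 0.001) -> bool:
    """Both the curvature difference and the curvature-rate difference must stay
    below preset thresholds (the second and third thresholds in the text);
    the numeric values here are placeholders."""
    k_r, dk_r = curvature_features(radar_curve)
    k_c, dk_c = curvature_features(camera_curve)
    return abs(k_r - k_c) < curv_thresh and abs(dk_r - dk_c) < rate_thresh
```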
If the road features match, the normal vehicle control command can be executed and large-angle steering control of the vehicle is allowed. If the road features do not match, the lane line information identified by the camera can be corrected according to the fitted curve information, and the fused result is used to control the vehicle.
In one possible implementation, the confidence of the fitted curve and the confidence of the lane line identified by the camera are obtained, and the feature information of the lane line is corrected by a weighted average according to these two confidences. For example, if the road curvature calculated from the fitted curve is 0.8 with a confidence of 0.7, and the road curvature calculated from the lane line identified by the camera is 0.4 with a confidence of 0.3, the fused road curvature is 0.8 × 0.7 + 0.4 × 0.3 = 0.68, so the final road curvature is 0.68.
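A sketch of this confidence-weighted correction; it reproduces the numeric example above when the two confidences sum to 1, and normalising by the confidence sum is an assumption for the general case.

```python
def fuse_feature(radar_value: float, radar_conf: float,
                 cam_value: float, cam_conf: float) -> float:
    """Confidence-weighted correction of one lane-line feature (e.g. curvature)."""
    return (radar_value * radar_conf + cam_value * cam_conf) / (radar_conf + cam_conf)

# fuse_feature(0.8, 0.7, 0.4, 0.3) == 0.68, as in the worked example above
```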
Using the fused result, the vehicle is controlled: for example, the distances between the vehicle and the lane lines on both sides are judged from information such as the distance to the lane lines and the road curvature change rate, the deviation between the vehicle and the lane center is calculated, and the deviation is reduced by steering control so that the vehicle drives centered in the lane.
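For completeness, a minimal sketch of the centring step, with an assumed sign convention and proportional gain; the patent describes the goal (reduce the deviation from the lane center by steering) but not a control law.

```python
def centering_steer(y_left_m: float, y_right_m: float, gain: float = 0.1) -> float:
    """Given the lateral positions of the left and right lane lines in the vehicle
    frame, the lane centre sits at their mean; a proportional command steers
    toward it to reduce the deviation."""
    lane_center_offset = 0.5 * (y_left_m + y_right_m)   # lane centre relative to the vehicle
    return gain * lane_center_offset                     # steer toward the lane centre
```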
With this method, mutual verification is possible: different sensors detect different road features from which the lane line information is extracted, so in terms of functional safety the two sensing paths are independent systems, and their mutual verification improves the verification accuracy to the greatest extent.
In order to facilitate understanding of the lane line calibration method provided in the embodiment of the present application, the following description is made with reference to fig. 2. As shown in fig. 2, the method mainly includes the following steps:
First, the lane line information identified by the vehicle camera is acquired, and the road type is classified according to the lane line information. If the road type is a straight road, the vehicle control command is executed directly; if the road type is a curved road, data on both sides of the road are acquired by the millimeter-wave radar on the vehicle, and the lane lines are calibrated according to the data extracted by the millimeter-wave radar.
Specifically, the millimeter-wave radar is used to obtain the static target information on both sides of the road, the static target information is clustered by a density-growth clustering algorithm, and curve fitting is performed on each classified cluster to obtain a fitted cubic function curve. The confidence of the fitted curve is determined according to the distance between the vehicle and the fitted curve and the number of static targets in the cluster where the fitted curve is located. When the confidence is smaller than the preset first threshold, obstacle and pre-traced-trajectory analysis is performed: the obstacle information output by the radar and the camera is extracted, the trajectory the vehicle is about to travel is determined according to the control command to be executed, and if no obstacle appears in the trajectory within a preset time period, the control command is executed; otherwise it is not executed and warning information is sent.
When the confidence is greater than or equal to the preset first threshold, feature matching is performed between the fitted curve and the lane line information identified by the camera. If the matching succeeds, the normal vehicle control command is executed; if it fails, the confidence of the fitted curve and the confidence of the lane line identified by the camera are obtained, the feature information of the lane line is corrected by a weighted average according to these confidences, and the vehicle is controlled according to the fused data.
According to the method for calibrating lane line information provided by the embodiments of the disclosure, a millimeter-wave radar is used to extract the static target information on both sides of the road, road features are extracted from objects such as road guardrails and trees, and these features are cross-checked against the lane line information identified by the camera. Because different kinds of road information are used for the check, the verification accuracy and the system safety are greatly improved, and the problem in the prior art that blurred or damaged lane lines on real roads interfere with camera-based lane line recognition and cause inaccurate results is solved. In addition, the method in the embodiments of the disclosure does not add system sensors, which reduces cost.
The disclosed embodiment also provides a device for calibrating lane line information based on radar, which is used for executing the method for calibrating lane line information based on radar of the above embodiment, as shown in fig. 5, the device includes:
an obtaining module 501, configured to obtain stationary target information on two sides of a road through a radar device installed in a vehicle;
a curve fitting module 502, configured to cluster the static target information and perform curve fitting on the classified clusters;
and the calibration module 503 is configured to perform feature matching on the fitted curve and the lane line information identified by the camera, execute a normal vehicle control instruction if the matching is successful, and calibrate the lane line information according to the fitted curve if the matching is unsuccessful.
It should be noted that, when the apparatus for calibrating lane line information based on radar provided in the foregoing embodiment executes the method for calibrating lane line information based on radar, only the division of the functional modules is illustrated, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the apparatus may be divided into different functional modules, so as to complete all or part of the functions described above. In addition, the device for calibrating lane line information based on the radar provided by the embodiment and the method embodiment for calibrating lane line information based on the radar belong to the same concept, and details of the implementation process are shown in the method embodiment and are not described herein again.
The embodiment of the present disclosure further provides an electronic device corresponding to the method for calibrating lane line information based on a radar provided in the foregoing embodiment, so as to execute the method for calibrating lane line information based on a radar.
Please refer to fig. 6, which illustrates a schematic diagram of an electronic device according to some embodiments of the present application. As shown in fig. 6, the electronic apparatus includes: the processor 600, the memory 601, the bus 602 and the communication interface 603, wherein the processor 600, the communication interface 603 and the memory 601 are connected through the bus 602; the memory 601 stores a computer program that can be executed on the processor 600, and the processor 600 executes the computer program to execute the method for calibrating lane line information based on radar provided in any of the foregoing embodiments of the present application.
The Memory 601 may include a high-speed Random Access Memory (RAM) and may further include a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 603 (which may be wired or wireless), and the internet, a wide area network, a local network, a metropolitan area network, and the like can be used.
The electronic device provided by the embodiment of the application and the method for calibrating lane line information based on radar provided by the embodiment of the application have the same beneficial effects as the method adopted, operated or realized by the electronic device.
An embodiment of the present application further provides a computer-readable storage medium. Referring to FIG. 7, the computer-readable storage medium is shown as an optical disc 700 on which a computer program (i.e., a program product) is stored; when executed by a processor, the computer program performs the method for calibrating lane line information based on radar provided in any of the foregoing embodiments.
It should be noted that examples of the computer-readable storage medium may also include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory, or other optical and magnetic storage media, which are not described in detail herein.
The computer-readable storage medium provided by the above-mentioned embodiment of the present application and the method for calibrating lane line information based on radar provided by the embodiment of the present application have the same beneficial effects as the method adopted, operated or implemented by the application program stored in the computer-readable storage medium.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples only show some embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (10)
1. A method for calibrating lane line information based on radar, comprising:
obtaining static target information on two sides of a road through radar equipment arranged on a vehicle;
clustering the static target information, and performing curve fitting on the classified clusters;
and performing feature matching between the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control command, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve.
2. The method according to claim 1, before acquiring the stationary target information on both sides of the road by the radar device mounted on the vehicle, further comprising:
and acquiring lane line information identified by a vehicle camera and confidence information of the lane line.
3. The method of claim 2, after obtaining lane line information identified by a vehicle camera and confidence information of the lane line, further comprising:
judging the road type according to the lane line information;
if the road type is a straight road, directly executing a vehicle control instruction without calibrating lane line information;
and if the road type is a curved road, acquiring static target information on two sides of the road through radar equipment installed on the vehicle.
4. The method of claim 1, wherein clustering the stationary target information and curve fitting the classified clusters comprises:
clustering the static target information through a clustering algorithm based on density growth;
and performing curve fitting on each classified cluster to obtain a fitted cubic function curve.
5. The method of claim 1, wherein after curve fitting the classified clusters, further comprising:
determining the confidence coefficient of the fitting curve according to the distance between the vehicle and the fitting curve and the number of the static targets in the cluster where the fitting curve is located;
when the confidence coefficient is greater than or equal to a preset first threshold value, performing feature matching on the fitted curve and the lane line information identified by the camera, if the matching is successful, executing a normal vehicle control instruction, and if the matching is unsuccessful, calibrating the lane line information according to the fitted curve;
and when the confidence coefficient is smaller than a preset first threshold value, extracting obstacle information output by the radar and the camera, determining a track to be traveled by the vehicle according to a control instruction to be executed, if no obstacle exists in the track within a preset time period, executing the control instruction, otherwise, not executing, and sending alarm information.
6. The method of claim 5, wherein the matching the fitted curve with the lane line information identified by the camera comprises:
determining a first curvature and a first curvature change rate of the road according to the lane line information identified by the camera;
determining a second curvature and a second curvature change rate of the road according to the fitted curve;
and if the difference value between the first curvature and the second curvature is smaller than a preset second threshold value and the difference value between the first curvature change rate and the second curvature change rate is smaller than a preset third threshold value, determining that the matching is successful.
7. The method of claim 5, wherein calibrating the lane line information from the fitted curve comprises:
obtaining the confidence coefficient of the fitting curve and the confidence coefficient of the lane line identified by the camera;
and correcting the characteristic information of the lane line by adopting a weighted average mode according to the confidence degrees of the two.
8. An apparatus for calibrating lane line information based on radar, comprising:
the acquisition module is used for acquiring static target information on two sides of a road through radar equipment installed on a vehicle;
the curve fitting module is used for clustering the static target information and performing curve fitting on the classified clusters;
and the calibration module is used for carrying out feature matching on the fitted curve and the lane line information identified by the camera, executing a normal vehicle control command if the matching is successful, and calibrating the lane line information according to the fitted curve if the matching is unsuccessful.
9. An apparatus for radar-based calibration of lane line information, comprising a processor and a memory having stored thereon program instructions, the processor being configured to, when executing the program instructions, perform the method for radar-based calibration of lane line information according to any one of claims 1 to 7.
10. A computer readable medium having computer readable instructions stored thereon which are executable by a processor to implement a method of radar-based calibration of lane line information according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011641279.3A CN112712040B (en) | 2020-12-31 | 2020-12-31 | Method, device, equipment and storage medium for calibrating lane line information based on radar |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112712040A true CN112712040A (en) | 2021-04-27 |
CN112712040B CN112712040B (en) | 2023-08-22 |
Family
- Family ID: 75547999
- Family application: CN202011641279.3A, filed 2020-12-31, granted as CN112712040B (status: Active)
- Country: CN
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108519605A (en) * | 2018-04-09 | 2018-09-11 | 重庆邮电大学 | Curb detection method based on laser radar and video camera |
CN108960183A (en) * | 2018-07-19 | 2018-12-07 | 北京航空航天大学 | A kind of bend target identification system and method based on Multi-sensor Fusion |
CN109532826A (en) * | 2017-09-21 | 2019-03-29 | 天津所托瑞安汽车科技有限公司 | A kind of radar anticollision method for early warning based on the optimization of lane line Visual identification technology |
CN110174113A (en) * | 2019-04-28 | 2019-08-27 | 福瑞泰克智能系统有限公司 | A kind of localization method, device and the terminal in vehicle driving lane |
CN110962858A (en) * | 2019-12-23 | 2020-04-07 | 潍柴动力股份有限公司 | Target identification method and device |
CN111247525A (en) * | 2019-01-14 | 2020-06-05 | 深圳市大疆创新科技有限公司 | Lane detection method and device, lane detection equipment and mobile platform |
CN112132109A (en) * | 2020-10-10 | 2020-12-25 | 北京百度网讯科技有限公司 | Lane line processing and lane positioning method, device, equipment and storage medium |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113706731A (en) * | 2021-07-15 | 2021-11-26 | 北京无限感测科技有限公司 | ETC integrated antenna power self-adaptive adjusting method and device |
CN113569800A (en) * | 2021-08-09 | 2021-10-29 | 北京地平线机器人技术研发有限公司 | Lane recognition and verification method and device, readable storage medium and electronic equipment |
WO2023024516A1 (en) * | 2021-08-23 | 2023-03-02 | 上海商汤智能科技有限公司 | Collision early-warning method and apparatus, and electronic device and storage medium |
CN113791414A (en) * | 2021-08-25 | 2021-12-14 | 南京市德赛西威汽车电子有限公司 | Scene recognition method based on millimeter wave vehicle-mounted radar view |
CN113791414B (en) * | 2021-08-25 | 2023-12-29 | 南京市德赛西威汽车电子有限公司 | Scene recognition method based on millimeter wave vehicle-mounted radar view |
CN113990052A (en) * | 2021-10-29 | 2022-01-28 | 南京慧尔视防务科技有限公司 | Incoming vehicle early warning method and device, storage medium and road protection vehicle |
CN113990052B (en) * | 2021-10-29 | 2023-08-15 | 南京慧尔视防务科技有限公司 | Incoming vehicle early warning method and device, storage medium and road protection vehicle |
CN114280582A (en) * | 2021-12-31 | 2022-04-05 | 中国第一汽车股份有限公司 | Calibration and calibration method and device for laser radar, storage medium and electronic equipment |
CN115116017A (en) * | 2022-06-27 | 2022-09-27 | 北京旋极信息技术股份有限公司 | Lane recognition method, device and storage medium |
CN115236627A (en) * | 2022-09-21 | 2022-10-25 | 深圳安智杰科技有限公司 | Millimeter wave radar data clustering method based on multi-frame Doppler velocity dimension expansion |
CN115236627B (en) * | 2022-09-21 | 2022-12-16 | 深圳安智杰科技有限公司 | Millimeter wave radar data clustering method based on multi-frame Doppler velocity dimension expansion |
Also Published As
Publication number | Publication date |
---|---|
CN112712040B (en) | 2023-08-22 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |