
CN114332225A - Lane line matching positioning method, electronic device and storage medium - Google Patents


Info

Publication number
CN114332225A
CN114332225A
Authority
CN
China
Prior art keywords
lane line
positioning
map
error
matching
Prior art date
Legal status
Pending
Application number
CN202111646702.3A
Other languages
Chinese (zh)
Inventor
覃梓雨
张昭
余隆山
刘杰
张南杰
Current Assignee
Dongfeng Nissan Passenger Vehicle Co
Original Assignee
Dongfeng Nissan Passenger Vehicle Co
Priority date
Filing date
Publication date
Application filed by Dongfeng Nissan Passenger Vehicle Co
Priority to CN202111646702.3A
Publication of CN114332225A
Legal status: Pending

Landscapes

  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a lane line matching positioning method, an electronic device and a storage medium. The method comprises the following steps: acquiring the lane line identified by a vehicle camera device, and taking points on the identified lane line as captured lane line points; obtaining the map lane line at the vehicle positioning position in a map, and taking points on the map lane line as map lane line points; matching the captured lane line points with the map lane line points; constructing an error function and an objective function; iteratively solving the objective function; and correcting the vehicle positioning position based on the transformation matrix obtained when the iteration ends, and obtaining the confidence of the positioning position. The invention obtains the optimal matching result by optimizing the objective function. In addition, the method can calculate the confidence of lane line recognition in real time from the optimized matching error, adjust the weight of the camera lane line recognition result in fusion positioning according to the real-time confidence, and reduce the influence of the randomness of the recognition error on the fusion positioning accuracy.

Description

Lane line matching positioning method, electronic device and storage medium
Technical Field
The invention relates to the technical field of automobiles, and in particular to a lane line matching positioning method, an electronic device and a storage medium.
Background
High-level automatic driving depends on high-precision maps. To ensure that the vehicle can be accurately positioned on the high-precision map, a multi-sensor fusion positioning method based on matching lane lines against the map can be used.
When the vehicle is positioned by using a camera to identify lane lines, the positioning effect is affected by the lane line identification error, so the identification error needs to be corrected during positioning. The error of camera lane line identification is random, and it varies especially noticeably with weather, illumination, water or dust on the road surface, and worn or blurred lane lines.
The existing lane line matching positioning method matches a discrete map point set against a lane line point set. However, because lane lines on a straight road provide no longitudinal positioning feature, the confidence calculation is affected, and the sampling distance of the discrete point set limits the matching accuracy. Therefore, current evaluation methods cannot effectively evaluate the positioning effect in real time.
Disclosure of Invention
Therefore, it is necessary to provide a lane line matching positioning method, an electronic device and a storage medium to solve the technical problem of the prior art that the lane line matching positioning accuracy is not high.
The invention provides a lane line matching positioning method, comprising the following steps:
acquiring the lane line identified by a vehicle camera device, and taking points on the identified lane line as captured lane line points;
obtaining the map lane line at the vehicle positioning position in a map, and taking points on the map lane line as map lane line points;
matching the captured lane line points with the map lane line points;
constructing, with the transformation matrix as the variable, an error function of the lateral position error between the captured lane line points and the matched map lane line points, and constructing an objective function of the error function;
iteratively solving the objective function by a gradient descent algorithm;
and correcting the vehicle positioning position based on the transformation matrix obtained when the iteration ends, to obtain a corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device, and obtaining the confidence of the corrected positioning position.
Further, matching the captured lane line points with the map lane line points specifically comprises:
for each captured lane line point, drawing a perpendicular from the point to the map lane line, and taking the foot of the perpendicular on the map lane line as the map lane line point corresponding to that captured lane line point.
Further, constructing, with the transformation matrix as the variable, the error function of the lateral position error between the captured lane line points and the matched map lane line points specifically comprises:
constructing the error function

$$E(R,t) = \sum_{i=1}^{n} \left\| w_i \left( R\, p_i + t - q_i \right) \right\|^2$$

where n is the number of captured lane line points, R is the rotation matrix, t is the lateral translation variable, $p_i$ is the coordinate of the i-th captured lane line point, $q_i$ is the coordinate of the map lane line point corresponding to the i-th captured lane line point, each captured lane line point and its corresponding map lane line point form a point pair, $w_i$ is the weight of the i-th point pair, $w_i = (1 - p_{ix}/\mathrm{MaxX})/\mathrm{SumW}$, $p_{ix}$ is the longitudinal distance of the i-th lane line point, MaxX is the maximum longitudinal distance over all point pairs, and

$$\mathrm{SumW} = \sum_{i=1}^{n} \left( 1 - \frac{p_{ix}}{\mathrm{MaxX}} \right)$$
Further, constructing the objective function of the error function specifically comprises:
constructing the objective function

$$T^* = (R^*, t^*) = \arg\min_{R,\,t} E(R, t)$$

where E is the error function.
Further, correcting the vehicle positioning position based on the transformation matrix obtained when the iteration ends, to obtain the corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device, specifically comprises:
constructing the transformation matrix from the rotation matrix R and the lateral translation variable t obtained when the iteration ends, and correcting the vehicle positioning position according to the transformation matrix to obtain the corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device.
Further, obtaining the confidence of the corrected positioning position specifically comprises:
calculating the confidence of the transformation matrix as the confidence of the corrected positioning position.
Further, calculating the confidence of the transformation matrix specifically comprises:
obtaining the error value of the error function at the last iteration of solving the objective function as the data error;
constructing, from the data error, a positioning error solving equation expressed through the Hessian matrix of the objective function with respect to the transformation matrix;
and solving the positioning error equation to determine the positioning error, and determining the confidence of the transformation matrix from the positioning error.
Further, the method further comprises:
performing position fusion of the corrected positioning position with other positioning sources, wherein the fusion weight of the corrected positioning position is calculated from the confidence.
The present invention provides an electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the lane line matching positioning method as described above.
The present invention provides a storage medium storing computer instructions which, when executed by a computer, perform all the steps of the lane line matching positioning method described above.
In the method, an error function is constructed from the lateral matching position error between the map point set and the camera lane line recognition result, and the optimal matching result is obtained by reducing the error function. The optimal matching result is combined with satellite positioning, inertial navigation positioning and the like to estimate the vehicle position and attitude as the vehicle fusion positioning result. Furthermore, the method can calculate the confidence of lane line identification in real time from the optimized residual matching error, adjust the weight of the camera lane line identification result in fusion positioning according to the real-time confidence, and reduce the influence of the randomness of the identification error on the fusion positioning accuracy.
Drawings
FIG. 1 is a flowchart illustrating a method for matching and positioning lane lines according to the present invention;
fig. 2 is a flowchart illustrating a method for matching and positioning lane lines according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating the operation of the matching location module according to the preferred embodiment of the present invention;
FIG. 4 is a flowchart of the operation of the confidence evaluation module in accordance with the preferred embodiment of the present invention;
FIG. 5 is a flowchart illustrating the operation of lane line matching and positioning in accordance with the preferred embodiment of the present invention;
FIG. 6 shows lane line correction simulation test results using the lane line matching positioning of the present invention;
FIG. 7 shows GPS correction simulation test results using the lane line matching positioning of the present invention;
fig. 8 is a schematic diagram of a hardware structure of an electronic device according to the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the accompanying drawings. In which like parts are designated by like reference numerals. It should be noted that the terms "front," "back," "left," "right," "upper" and "lower" used in the following description refer to directions in the drawings, and the terms "inner" and "outer" refer to directions toward and away from, respectively, the geometric center of a particular component.
Example one
Fig. 1 is a flowchart illustrating the lane line matching positioning method according to the present invention, which includes:
Step S101, acquiring the lane line identified by a vehicle camera device, and taking points on the identified lane line as captured lane line points;
Step S102, obtaining the map lane line at the vehicle positioning position in a map, and taking points on the map lane line as map lane line points;
Step S103, matching the captured lane line points with the map lane line points;
Step S104, constructing, with the transformation matrix as the variable, an error function of the lateral position error between the captured lane line points and the matched map lane line points, and constructing an objective function of the error function;
Step S105, iteratively solving the objective function by a gradient descent algorithm;
Step S106, correcting the vehicle positioning position based on the transformation matrix obtained when the iteration ends, obtaining a corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device, and obtaining the confidence of the corrected positioning position.
Specifically, the present invention can be applied to an Electronic Control Unit (ECU) of a vehicle.
In step S101, a camera, for example, photographs the lane line outside the vehicle, and points on the photographed lane line are taken as captured lane line points. All captured lane line points form one point set. Step S102 then acquires the vehicle positioning position, for example Global Positioning System (GPS) positioning information, obtains the map lane line at that position from a preset high-precision map, and takes points on the map lane line as map lane line points. All map lane line points form another point set. Step S103 determines the map lane line point corresponding to each captured lane line point, and step S104 constructs the error function. The error function calculates the lateral position error between the captured lane line points and the matched map lane line points, with the transformation matrix as the variable. An objective function of the error function is constructed at the same time; the objective function solves for the transformation matrix that minimizes the error function.
In step S105, a gradient descent algorithm iteratively solves the objective function. When the number of iterations exceeds a threshold or the error falls below a threshold, the iteration ends, and the transformation matrix obtained at that point is the one that minimizes the value of the error function.
Finally, step S106 corrects the vehicle positioning position based on the transformation matrix, for example by multiplying the transformation matrix with the vehicle positioning position to obtain the corrected positioning position, and calculates the confidence of the corrected positioning position.
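As an illustrative sketch (not part of the patent text) of the correction in step S106, the transformation matrix can be applied to the GPS fix as a 2D homogeneous transform; the matrix layout follows the transformation matrix T = [R t; 0 1] defined in Example two, and all names are hypothetical:

```python
import numpy as np

def correct_position(T: np.ndarray, gps_xy) -> np.ndarray:
    """Apply the 3x3 homogeneous transform T from the matching step
    to a 2D GPS position (step S106). Illustrative only."""
    p = np.array([gps_xy[0], gps_xy[1], 1.0])
    return (T @ p)[:2]

# Example: a pure lateral shift of 0.4 m, no rotation.
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.4],
              [0.0, 0.0, 1.0]])
print(correct_position(T, (10.0, 2.0)))   # -> [10.   2.4]
```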
In the method, an error function is constructed from the lateral position errors between the point sets and the confidence error is estimated; the confidence calculation error caused by lane lines having no longitudinal positioning feature can thus be eliminated, and the positioning effect can be measured accurately.
Example two
Fig. 2 is a flowchart illustrating a method for matching and positioning lane lines according to an embodiment of the present invention, which includes:
in step S201, the lane line recognized by the vehicle camera is acquired, and the points of the lane line recognized by the vehicle camera are taken as the captured lane line points.
Step S202, a map lane line of the vehicle positioning position in the map is obtained, and points on the map lane line are used as map lane line points.
In step S203, for each captured lane line point, a perpendicular line from the captured lane line point to the map lane line is drawn, and a point of a foot on the map lane line is taken as a map lane line point corresponding to the captured lane line point.
Step S204, the transformation matrix comprises a rotation matrix and a lateral translation variable; the error function is constructed as

$$E(R,t) = \sum_{i=1}^{n} \left\| w_i \left( R\, p_i + t - q_i \right) \right\|^2$$

where n is the number of captured lane line points, R is the rotation matrix, t is the lateral translation variable, $p_i$ is the coordinate of the i-th captured lane line point, $q_i$ is the coordinate of the map lane line point corresponding to the i-th captured lane line point, each captured lane line point and its corresponding map lane line point form a point pair, $w_i$ is the weight of the i-th point pair, $w_i = (1 - p_{ix}/\mathrm{MaxX})/\mathrm{SumW}$, $p_{ix}$ is the longitudinal distance of the i-th lane line point, MaxX is the maximum longitudinal distance over all point pairs, and

$$\mathrm{SumW} = \sum_{i=1}^{n} \left( 1 - \frac{p_{ix}}{\mathrm{MaxX}} \right)$$

Step S205, constructing the objective function

$$T^* = (R^*, t^*) = \arg\min_{R,\,t} E(R, t)$$

where E is the error function.
Step S206, iteratively solving the objective function by a gradient descent algorithm.
Step S207, constructing the transformation matrix from the rotation matrix R and the lateral translation variable t obtained when the iteration ends, and correcting the vehicle positioning position according to the transformation matrix to obtain the corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device.
Step S208, calculating the confidence of the transformation matrix as the confidence of the corrected positioning position.
In one embodiment, calculating the confidence of the transformation matrix specifically comprises:
obtaining the error value of the error function at the last iteration of solving the objective function as the data error;
constructing, from the data error, a positioning error solving equation expressed through the Hessian matrix of the objective function with respect to the transformation matrix;
and solving the positioning error equation to determine the positioning error, and determining the confidence of the transformation matrix from the positioning error.
Step S209, performing position fusion of the corrected positioning position with other positioning sources, wherein the fusion weight of the corrected positioning position is calculated from the confidence.
Specifically, this embodiment is composed of a matching positioning module and a confidence evaluation module, wherein:
Matching positioning module: this module uses the lane line identified by the camera, the GPS positioning information and a preset high-precision map to match the lane line against the high-precision map in the lateral position, so as to obtain the position of the vehicle on the map.
1) Matching the lane line points with the map vectors. Steps S201 to S203 match the captured lane line points with the map lane line points through the map vectors. Specifically: the captured lane line points are matched against the vector features of the preset high-precision map. The lane line features in the map are represented by line segments, each stored as its end points and its equation. The lane line points captured by the camera are output in the vehicle coordinate system and are transferred, using the vehicle coordinate system and the GPS, into the coordinate system of the map. In the map coordinate system, a perpendicular is dropped from each captured lane line point to the map line segment, and the foot of the perpendicular on the segment is the matching point of that captured lane line point on the map, i.e. the map lane line point corresponding to the captured lane line point. In general, the GPS positioning error does not exceed one lane, so after positioning on the map with the GPS, the matching point can be searched for within the current lane. When the camera identifies the lane lines, it distinguishes the left and right lane lines, and the corresponding left or right map lane line is then selected for the perpendicular.
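For illustration only, a minimal Python sketch of this matching step under simplified assumptions (2D points, a single map segment, heading taken from the GPS pose; all names hypothetical):

```python
import numpy as np

def vehicle_to_map(p_veh, gps_xy, heading):
    """Transfer a camera lane line point from the vehicle frame to the
    map frame using the GPS pose (position plus heading in radians)."""
    c, s = np.cos(heading), np.sin(heading)
    R = np.array([[c, -s], [s, c]])
    return R @ p_veh + np.asarray(gps_xy)

def foot_of_perpendicular(p, a, b):
    """Foot of the perpendicular from point p onto map segment a-b,
    clamped to the segment ends (step S203)."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return a + t * ab

# A left-lane point 1.8 m to the left, 5 m ahead of the vehicle.
p_map = vehicle_to_map(np.array([5.0, 1.8]), gps_xy=(300.0, 120.0), heading=0.0)
seg_a, seg_b = np.array([295.0, 121.9]), np.array([315.0, 121.9])
print(foot_of_perpendicular(p_map, seg_a, seg_b))   # matched map lane line point
```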
2) Constructing the error function. Steps S204 to S206 construct the error function and the objective function and iterate to obtain the transformation matrix. Specifically: an error function E, i.e. the lateral position error between the two point sets, is constructed from all captured lane line points p and the corresponding map lane line points q:

$$E(R,t) = \sum_{i=1}^{n} \left\| w_i \left( R\, p_i + t - q_i \right) \right\|^2$$

where n is the number of captured lane line points, R is the rotation matrix, t is the lateral translation variable, $p_i$ is the coordinate of the i-th captured lane line point, $q_i$ is the coordinate of the map lane line point corresponding to the i-th captured lane line point, each captured lane line point and its corresponding map lane line point form a point pair, $w_i$ is the weight of the i-th point pair, $w_i = (1 - p_{ix}/\mathrm{MaxX})/\mathrm{SumW}$, $p_{ix}$ is the longitudinal distance of the i-th lane line point, MaxX is the maximum longitudinal distance over all point pairs, and

$$\mathrm{SumW} = \sum_{i=1}^{n} \left( 1 - \frac{p_{ix}}{\mathrm{MaxX}} \right)$$
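For illustration, the weighting scheme above can be sketched in a few lines of Python (names hypothetical; this is not part of the patent text). It has the effect of giving nearer points, whose longitudinal distance $p_{ix}$ is smaller, larger weights:

```python
import numpy as np

def point_pair_weights(p_x: np.ndarray) -> np.ndarray:
    """w_i = (1 - p_ix / MaxX) / SumW: nearer points receive larger
    weights, and the weights sum to 1 by construction."""
    max_x = p_x.max()
    raw = 1.0 - p_x / max_x          # 1 - p_ix / MaxX
    return raw / raw.sum()           # divide by SumW

p_x = np.array([2.0, 5.0, 10.0, 20.0])   # longitudinal distances in meters
w = point_pair_weights(p_x)
print(w, w.sum())   # the farthest point is weighted least (here zero)
```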
The objective function is

$$T^* = (R^*, t^*) = \arg\min_{R,\,t} E(R, t)$$
The objective function is solved iteratively with a gradient descent algorithm; when the error value of the error function is smaller than a threshold or the iteration count is reached, the solution is considered finished and the optimal transformation matrix, i.e. the required transformation matrix T, is obtained. The initial GPS position is then transformed with T to obtain the positioning position of the vehicle on the map.
Because lane lines are straight-line features, the longitudinal displacement is not observable. In the solving process, the point set matching is therefore changed to lateral distance matching between the captured lane line points and the map lane line vectors, and the unobservable longitudinal direction is removed in the iterative solution and the covariance calculation. Here, lateral refers to the direction perpendicular to the current driving direction of the vehicle, and longitudinal is the current driving direction of the vehicle.
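As a minimal sketch of the iterative solution in steps S204 to S206 under the assumptions just stated (longitudinal translation fixed to zero, numeric gradients in place of analytic ones; all names hypothetical):

```python
import numpy as np

def residuals(y, a, P, Q, w):
    """Weighted point-pair errors G_i = w_i (R p_i + t - q_i), with the
    longitudinal translation fixed to 0 so that t = [0, y]."""
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    return w[:, None] * (P @ R.T + np.array([0.0, y]) - Q)

def solve_y_a(P, Q, w, lr=0.1, iters=500, tol=1e-10):
    """Minimize E = sum ||G_i||^2 over X = [y, a] by gradient descent;
    central-difference gradients keep the sketch short."""
    def E(x):
        return float(np.sum(residuals(x[0], x[1], P, Q, w) ** 2))
    x, eps = np.zeros(2), 1e-6
    for _ in range(iters):
        g = np.array([(E(x + eps * np.eye(2)[k]) - E(x - eps * np.eye(2)[k]))
                      / (2 * eps) for k in range(2)])
        x_new = x - lr * g
        if abs(E(x) - E(x_new)) < tol:   # iteration-exit condition
            return x_new, E(x_new)
        x = x_new
    return x, E(x)

# Toy example: map points are the captured points shifted 0.3 m laterally.
P = np.array([[1.0, 0.0], [3.0, 0.1], [6.0, -0.1]])
Q = P + np.array([0.0, 0.3])
w = np.ones(len(P)) / len(P)
print(solve_y_a(P, Q, w))    # y approaches 0.3, a stays near 0
```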
Step S208 is performed by the confidence evaluation module: this module obtains the positioning error cov(X) from the obtained positioning result and its data error cov(Z), and determines the confidence of the positioning based on cov(X). The positioning error can be converted into the corresponding confidence with an existing error-to-confidence conversion.
Specifically, the method comprises the following steps:
1) Obtaining the data error cov(Z): the data error comprises the lane line identification error and the lane line-to-map matching error. The matching positioning module exits the iteration when the iteration count of the objective function meets the condition or the error value of the error function meets the threshold. The error value of the error function at the last iteration can be considered to contain the identification errors that cannot be eliminated and the mathematical error of the algorithm; therefore, the data error cov(Z) is set to the normalized error value of the error function at the last iteration.
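A one-line sketch of this data error (the patent does not specify the normalization; dividing by the number of point pairs is an assumption made here):

```python
def data_error(final_E: float, n_points: int) -> float:
    """cov(Z) proxy: the error value of E at the last iteration,
    normalized. Dividing by the point count is an assumed choice."""
    return final_E / max(n_points, 1)
```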
2) Since the objective function is expressed as a least-squares problem, the relationship between the data error cov(Z) and the positioning error cov(X) can be expressed through the Hessian matrix of the objective function J with respect to the transformation matrix T, which gives the following positioning error solving equation:

$$\mathrm{cov}(X) = \left(\frac{\partial^2 J}{\partial X^2}\right)^{-1} \frac{\partial^2 J}{\partial X\,\partial Z}\; \mathrm{cov}(Z)\; \left(\frac{\partial^2 J}{\partial X\,\partial Z}\right)^{\!T} \left(\frac{\partial^2 J}{\partial X^2}\right)^{-1}$$

The specific solving method is as follows.

Set the total objective function $J = \sum_i \|G_i\|^2$ (i.e. the error function E above) and set $F_i = \|G_i\|$.

Translation vector and rotation matrix:

$$t = \begin{bmatrix} x \\ y \end{bmatrix}, \qquad R = \begin{bmatrix} \cos a & -\sin a \\ \sin a & \cos a \end{bmatrix}$$

Transformation matrix:

$$T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$

where x is the vehicle longitudinal translation expressed in the map coordinate system, y is the vehicle lateral translation expressed in the map coordinate system, and a is the change of the vehicle heading angle obtained between two iterations.

Corresponding lane line points and map points: $p_i = [p_{ix};\, p_{iy}]$ and $q_i = [q_{ix};\, q_{iy}]$, where $p_i$ and $q_i$ are the coordinates of the points in the map.

Error between two corresponding points: $G_i = w_i\,(R\,p_i + t - q_i)$, where R is the rotation matrix above, $w_i$ is the weight of each point pair, $w_i = (1 - p_{ix}/\mathrm{MaxX})/\mathrm{SumW}$, $p_{ix}$ is the longitudinal distance of the i-th lane line point, MaxX is the maximum longitudinal distance over all point pairs, and $\mathrm{SumW} = \sum_{i=1}^{n}(1 - p_{ix}/\mathrm{MaxX})$.

Let $X = [y, a]$, the state quantity (a vector). The Hessian matrix of J with respect to $X = [y, a]$ is

$$\frac{\partial^2 J}{\partial X^2} = \begin{bmatrix} \dfrac{\partial^2 J}{\partial y^2} & \dfrac{\partial^2 J}{\partial y\,\partial a} \\[2mm] \dfrac{\partial^2 J}{\partial a\,\partial y} & \dfrac{\partial^2 J}{\partial a^2} \end{bmatrix}$$

(1) Since the longitudinal displacement is not observable, only the derivatives with respect to y and a are solved. Because there are many point pairs,

$$\frac{\partial J}{\partial y} = \sum_i \frac{\partial J_i}{\partial y}$$

and the other derivatives are obtained on the same principle. The partial derivative of each $J_i$ is then taken:

$$\frac{\partial J_i}{\partial y} = 2\,F_i \cdot \frac{\partial F_i}{\partial y}$$

where $\cdot$ is the dot-multiplication sign. Likewise, since the longitudinal displacement is not considered, x is set to 0. Expanding the derivatives and substituting in, the Jacobian entries for y and a are solved in the same way, and the Hessian matrix is then assembled from these Jacobian entries as above. The positioning error is solved from

$$\mathrm{cov}(X) = \left(\frac{\partial^2 J}{\partial X^2}\right)^{-1} \frac{\partial^2 J}{\partial X\,\partial Z}\; \mathrm{cov}(Z)\; \left(\frac{\partial^2 J}{\partial X\,\partial Z}\right)^{\!T} \left(\frac{\partial^2 J}{\partial X^2}\right)^{-1}$$

where X is the state quantity and Z is the state quantity of the data error. Similarly, the Jacobian matrices of G with respect to $p_i$ and $q_i$, namely $J_G(q_x)$, $J_G(q_y)$, $J_G(p_x)$ and $J_G(p_y)$, are solved, and from them the mixed second derivative $\partial^2 J/\partial X\,\partial Z$ is obtained. Finally, cov(Z) is constructed from the normalized error value of the last iteration, and cov(X) is calculated from the equation above.
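To make the error-propagation equation concrete, the following is a minimal numerical sketch (finite differences instead of the analytic Jacobians derived above; all names hypothetical) that evaluates cov(X) for any least-squares objective J(X, Z):

```python
import numpy as np

def hess_xx(J, x, z, eps=1e-4):
    """d2J/dX2 at (x, z) by central differences."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            def f(si, sj):
                xp = x.astype(float).copy()
                xp[i] += si * eps
                xp[j] += sj * eps
                return J(xp, z)
            H[i, j] = (f(1, 1) - f(1, -1) - f(-1, 1) + f(-1, -1)) / (4 * eps ** 2)
    return H

def hess_xz(J, x, z, eps=1e-4):
    """Mixed second derivative d2J/dXdZ at (x, z)."""
    H = np.zeros((len(x), len(z)))
    for i in range(len(x)):
        for j in range(len(z)):
            def f(si, sj):
                xp, zp = x.astype(float).copy(), z.astype(float).copy()
                xp[i] += si * eps
                zp[j] += sj * eps
                return J(xp, zp)
            H[i, j] = (f(1, 1) - f(1, -1) - f(-1, 1) + f(-1, -1)) / (4 * eps ** 2)
    return H

def positioning_cov(J, x_opt, z, cov_z):
    """cov(X) = Hxx^-1 Hxz cov(Z) Hxz^T Hxx^-1, evaluated at the optimum."""
    Hxx, Hxz = hess_xx(J, x_opt, z), hess_xz(J, x_opt, z)
    Hxx_inv = np.linalg.inv(Hxx)
    return Hxx_inv @ Hxz @ cov_z @ Hxz.T @ Hxx_inv

# Sanity check with J(x, z) = ||x - z||^2: cov(X) should equal cov(Z).
J = lambda x, z: float(np.sum((x - z) ** 2))
print(positioning_cov(J, np.zeros(2), np.zeros(2), np.diag([0.04, 0.01])))
```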
Finally, step S209 uses the corrected positioning position for multi-sensor fusion positioning, with the confidence used as the weight in fusion; for example, the confidence is used directly as the weight of the corrected positioning position. The multi-sensor fusion positioning can adopt an existing fusion positioning method, with the corrected positioning position serving as one positioning source.
The specific fusion can be implemented with an existing position fusion method.
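As one minimal illustration of such a fusion (the patent defers to existing fusion methods, so this confidence-weighted average is only an assumed scheme; all names hypothetical):

```python
import numpy as np

def fuse_positions(positions, confidences):
    """Confidence-weighted average of positions from several sources
    (corrected lane line position, GNSS, inertial navigation, ...)."""
    w = np.asarray(confidences, dtype=float)
    w = w / w.sum()                       # normalize the fusion weights
    return w @ np.asarray(positions, dtype=float)

# The corrected lane line fix (higher confidence) pulls the estimate.
print(fuse_positions([[300.2, 120.4], [300.9, 120.0]], [0.8, 0.3]))
```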
The method provided by the invention uses the matching of lane line points with map features to obtain accurate matching points; it constructs an error function from the lateral matching position error between the map point set and the camera lane line recognition result, and obtains the optimal matching result by reducing the error function. The optimized residual matching error can further be used to calculate the confidence of lane line identification in real time, and the weight of the camera lane line identification result in subsequent fusion positioning is adjusted according to this real-time confidence, reducing the influence of the randomness of the identification error on the fusion positioning accuracy.
Fig. 3 is a flowchart illustrating the operation of the matching positioning module according to the preferred embodiment of the present invention, which includes:
Step S301, receiving the lane line identified by the camera;
Step S302, initializing the positioning with the GPS positioning information;
Step S303, obtaining the map vector data of the corresponding position;
Step S304, matching the lane line points with the map vectors one by one to obtain the matched map points;
Step S305, constructing the error function, i.e. the lateral position error between the lane line point set and the map point set;
Step S306, iteratively solving the transformation matrix from the lane line to the map with a gradient descent algorithm;
Step S307, if the error is smaller than the threshold, executing step S308; otherwise, executing step S306 again;
Step S308, obtaining the transformation matrix, and obtaining the position of the vehicle on the map from the transformation matrix and the initial GPS position;
Step S309, solving the positioning confidence.
FIG. 4 is a flowchart of the operation of the confidence evaluation module according to the preferred embodiment of the present invention, which includes:
Step S401, expressing the data error present in lane line identification and matching as cov(Z);
Step S402, expressing the obtained positioning error as cov(X); since the objective function is a least-squares problem, the relationship between the data error cov(Z) and the positioning error cov(X) can be expressed through the Hessian matrix of the objective function J with respect to the positioning result X:

$$\mathrm{cov}(X) = \left(\frac{\partial^2 J}{\partial X^2}\right)^{-1} \frac{\partial^2 J}{\partial X\,\partial Z}\; \mathrm{cov}(Z)\; \left(\frac{\partial^2 J}{\partial X\,\partial Z}\right)^{\!T} \left(\frac{\partial^2 J}{\partial X^2}\right)^{-1}$$

Step S403, both the result of the error function and the data error are reflected in the positioning error cov(X), and the confidence is determined from cov(X). When the positioning error is large, the confidence is correspondingly small, and the smaller weight during fusion makes the fused positioning result more accurate.
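The patent defers the error-to-confidence conversion to existing methods; one plausible mapping, shown here only as an assumption, shrinks the confidence toward zero as the positioning error grows:

```python
import numpy as np

def confidence_from_cov(cov_x: np.ndarray) -> float:
    """Assumed error-to-confidence mapping: stays in (0, 1] and
    decreases monotonically with the positioning error."""
    return float(1.0 / (1.0 + np.sqrt(np.trace(cov_x))))

print(confidence_from_cov(np.diag([0.04, 0.01])))   # small error -> ~0.82
```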
As shown in fig. 5, the work flow of lane line matching positioning in the preferred embodiment of the present invention includes:
Step S501, acquiring the lane line identified by the camera;
Step S502, matching the lane line points with the map vectors to obtain the matching points, and constructing the lateral position error function of the two point sets and the objective function;
Step S503, iteratively solving the objective function to obtain the positioning result;
Step S504, obtaining the positioning confidence from the Hessian matrix of the objective function.
Fig. 6 shows the lane line correction simulation test results using the lane line matching positioning of the present invention, including the map lane line 61, the identified lane line 62 after GPS disturbance, and the corrected identified lane line 63.
The map lane line 61 is the projection of the map lane line onto the image; the identified lane line 62 after GPS disturbance is the camera-identified lane line positioned on the map with the erroneous GPS and projected back onto the image; the corrected identified lane line 63 is the camera-identified lane line positioned on the map with the corrected GPS and projected back onto the image. It can be seen that when the GPS has an error, the corrected identified lane line 63, corrected by the lane line matching positioning method of the present invention, coincides with the map lane line 61.
As shown in fig. 7, the GPS correction simulation test results using the lane line matching positioning of the present invention include: the true GPS 71, the corrected GPS 72 and the post-disturbance GPS 73.
The true GPS 71 is the true positioning; the post-disturbance GPS 73 is the GPS positioning with added error; the corrected GPS 72 is the positioning obtained after the camera identifies the lane line and the position is corrected. It can be seen that when the GPS has an error, the corrected GPS 72, corrected by the lane line matching positioning method of the present invention, coincides with the true GPS 71.
In high-level automatic driving, multiple sensors are fused for positioning; the fusion weights of the different sensors must be determined, and the positioning information from each source must provide a confidence.
Example six
Fig. 8 is a schematic diagram of a hardware structure of an electronic device according to the present invention, which includes:
at least one processor 801; and
a memory 802 communicatively connected to the at least one processor 801; wherein
the memory 802 stores instructions executable by the at least one processor to enable the at least one processor to perform the lane line matching positioning method as described above.
Fig. 8 takes one processor 801 as the example.
The electronic device may further include: an input device 803 and a display device 804.
The processor 801, the memory 802, the input device 803 and the display device 804 may be connected by a bus or in other ways; connection by a bus is taken as the example here.
The memory 802, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the lane line matching positioning method in the embodiment of the present application, for example, the method flow shown in fig. 1. The processor 801 executes various functional applications and data processing by running nonvolatile software programs, instructions, and modules stored in the memory 802, that is, implements the lane line matching positioning method in the above-described embodiment.
The memory 802 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the lane line matching positioning method, and the like. Further, the memory 802 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory 802 optionally includes memory located remotely from the processor 801, which may be connected via a network to a device that performs the lane line matching location method. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 803 may receive an input of a user click and generate signal inputs related to user settings and function control of the lane line matching positioning method. The display device 804 may include a display screen or the like.
When the one or more modules are stored in the memory 802 and executed by the one or more processors 801, the lane line matching positioning method in any of the above method embodiments is performed.
An embodiment of the present invention provides a storage medium storing computer instructions which, when executed by a computer, perform all the steps of the lane line matching positioning method described above.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the invention, all of which fall within the protection scope of the invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A lane line matching positioning method, characterized by comprising the following steps:
acquiring the lane line identified by a vehicle camera device, and taking points on the identified lane line as captured lane line points;
obtaining the map lane line at the vehicle positioning position in a map, and taking points on the map lane line as map lane line points;
matching the captured lane line points with the map lane line points;
constructing, with the transformation matrix as the variable, an error function of the lateral position error between the captured lane line points and the matched map lane line points, and constructing an objective function of the error function;
iteratively solving the objective function by a gradient descent algorithm;
and correcting the vehicle positioning position based on the transformation matrix obtained when the iteration ends, to obtain a corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device, and obtaining the confidence of the corrected positioning position.
2. The lane line matching positioning method according to claim 1, wherein matching the captured lane line points with the map lane line points specifically comprises:
for each captured lane line point, drawing a perpendicular from the point to the map lane line, and taking the foot of the perpendicular on the map lane line as the map lane line point corresponding to that captured lane line point.
3. The lane line matching positioning method according to claim 1, wherein the transformation matrix comprises a rotation matrix and a lateral translation variable, and constructing, with the transformation matrix as the variable, the error function of the lateral position error between the captured lane line points and the matched map lane line points specifically comprises:
constructing the error function

$$E(R,t) = \sum_{i=1}^{n} \left\| w_i \left( R\, p_i + t - q_i \right) \right\|^2$$

where n is the number of captured lane line points, R is the rotation matrix, t is the lateral translation variable, $p_i$ is the coordinate of the i-th captured lane line point, $q_i$ is the coordinate of the map lane line point corresponding to the i-th captured lane line point, each captured lane line point and its corresponding map lane line point form a point pair, $w_i$ is the weight of the i-th point pair, $w_i = (1 - p_{ix}/\mathrm{MaxX})/\mathrm{SumW}$, $p_{ix}$ is the longitudinal distance of the i-th lane line point, MaxX is the maximum longitudinal distance over all point pairs, and

$$\mathrm{SumW} = \sum_{i=1}^{n} \left( 1 - \frac{p_{ix}}{\mathrm{MaxX}} \right)$$
4. The lane line matching positioning method according to claim 3, wherein constructing the objective function of the error function specifically comprises:
constructing the objective function

$$T^* = (R^*, t^*) = \arg\min_{R,\,t} E(R, t)$$

where E is the error function.
5. The lane line matching positioning method according to claim 3, wherein correcting the vehicle positioning position based on the transformation matrix obtained when the iteration ends, to obtain the corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device, specifically comprises:
constructing the transformation matrix from the rotation matrix R and the lateral translation variable t obtained when the iteration ends, and correcting the vehicle positioning position according to the transformation matrix to obtain the corrected positioning position in which the vehicle positioning position is corrected based on the lane line identified by the camera device.
6. The lane line matching positioning method according to claim 1, wherein obtaining the confidence of the corrected positioning position specifically comprises:
calculating the confidence of the transformation matrix as the confidence of the corrected positioning position.
7. The lane line matching positioning method according to claim 6, wherein calculating the confidence of the transformation matrix specifically comprises:
obtaining the error value of the error function at the last iteration of solving the objective function as the data error;
constructing, from the data error, a positioning error solving equation expressed through the Hessian matrix of the objective function with respect to the transformation matrix;
and solving the positioning error equation to determine the positioning error, and determining the confidence of the transformation matrix from the positioning error.
8. The lane line matching positioning method according to claim 6, further comprising:
performing position fusion of the corrected positioning position with other positioning sources, wherein the fusion weight of the corrected positioning position is calculated from the confidence.
9. An electronic device, characterized by comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the lane line matching positioning method of any one of claims 1 to 8.
10. A storage medium, characterized in that the storage medium stores computer instructions which, when executed by a computer, perform all the steps of the lane line matching positioning method of any one of claims 1 to 8.
CN202111646702.3A 2021-12-30 2021-12-30 Lane line matching positioning method, electronic device and storage medium Pending CN114332225A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111646702.3A CN114332225A (en) 2021-12-30 2021-12-30 Lane line matching positioning method, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN114332225A true CN114332225A (en) 2022-04-12

Family

ID=81016967

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111646702.3A Pending CN114332225A (en) 2021-12-30 2021-12-30 Lane line matching positioning method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN114332225A (en)


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114972829A (en) * 2022-05-17 2022-08-30 国汽智控(北京)科技有限公司 Vehicle positioning method, device, equipment and storage medium
CN114889633A (en) * 2022-06-13 2022-08-12 东风汽车集团股份有限公司 Display method for displaying lane line and front vehicle in front of intelligent driving automobile
CN115203352A (en) * 2022-09-13 2022-10-18 腾讯科技(深圳)有限公司 Lane level positioning method and device, computer equipment and storage medium
CN115267868A (en) * 2022-09-27 2022-11-01 腾讯科技(深圳)有限公司 Positioning point processing method and device and computer readable storage medium
CN115267868B (en) * 2022-09-27 2023-09-19 腾讯科技(深圳)有限公司 Positioning point processing method and device and computer readable storage medium
CN116793369A (en) * 2023-02-10 2023-09-22 北京斯年智驾科技有限公司 Path planning method, device, equipment and computer readable storage medium
CN116793369B (en) * 2023-02-10 2024-03-08 北京斯年智驾科技有限公司 Path planning method, device, equipment and computer readable storage medium
CN116202538A (en) * 2023-05-05 2023-06-02 广州小鹏自动驾驶科技有限公司 Map matching fusion method, device, equipment and storage medium
CN116202538B (en) * 2023-05-05 2023-08-29 广州小鹏自动驾驶科技有限公司 Map matching fusion method, device, equipment and storage medium
CN117330097A (en) * 2023-12-01 2024-01-02 深圳元戎启行科技有限公司 Vehicle positioning optimization method, device, equipment and storage medium
CN117330097B (en) * 2023-12-01 2024-05-10 深圳元戎启行科技有限公司 Vehicle positioning optimization method, device, equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination