CN114407883A - Method and system for fusing direction angles of obstacles - Google Patents
- Publication number
- CN114407883A (application CN202210025996.6A)
- Authority
- CN
- China
- Prior art keywords
- obstacle
- confidence
- track
- fusion
- parameter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0953—Predicting travel path or likelihood of collision the prediction being responsive to vehicle dynamic parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0011—Planning or execution of driving tasks involving control alternatives for a single driving scenario, e.g. planning several paths to avoid obstacles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
- B60W60/0027—Planning or execution of driving tasks using trajectory prediction for other traffic participants
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/50—Barriers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4045—Intention, e.g. lane change or imminent movement
Abstract
The embodiment of the invention provides a method and a system for fusing direction angles of obstacles. At least two of the obstacle profile, the obstacle speed and the obstacle movement track are determined as obstacle parameters, and an estimated movement track of the obstacle is calculated based on each obstacle parameter; a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters is determined based on the estimated movement track of the obstacle; a confidence corresponding to the first direction angle of each parameter included in the obstacle parameters is determined based on the estimated movement track of the obstacle; according to the determined confidence corresponding to each parameter included in the obstacle parameters, confidence fusion processing is performed on the confidence of each parameter in a priority gradient fusion manner to obtain a total confidence; and confidence fusion processing is performed on the total confidence to obtain a fusion confidence. The method and the system can improve the accuracy of the calculation of the moving direction of the obstacle and reduce the probability of vehicle collision.
Description
Technical Field
The invention relates to the technical field of intelligent driving, in particular to a method and a system for fusing direction angles of obstacles.
Background
In the field of intelligent driving, in order to avoid collision of a vehicle during driving, it is necessary to determine the moving direction of an obstacle (e.g., a surrounding vehicle) and to adjust the moving trajectory of the vehicle using that direction. Currently, the moving direction of an obstacle can be determined by vehicle speed measurement or by calculating an Oriented Bounding Box (OBB). However, when the vehicle speed is low or the bounding box framed around the obstacle is inaccurate, the accuracy of the calculated moving direction is low, and a risk of vehicle collision remains.
Disclosure of Invention
The embodiment of the invention aims to provide a method and a system for fusing direction angles of obstacles, which can improve the accuracy of calculation of the moving direction of the obstacles and reduce the probability of vehicle collision. The specific technical scheme is as follows:
the invention provides a method for fusing direction angles of obstacles, which comprises the following steps:
determining at least two of the obstacle profile, the obstacle speed and the obstacle moving track as obstacle parameters, and respectively calculating the estimated moving track of the obstacle based on each obstacle parameter; determining a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement trajectory of the obstacle;
determining the confidence degree of the first direction angle corresponding to each parameter included by the obstacle parameters based on the estimated movement track of the obstacle, wherein the confidence degree of the first direction angle of each parameter included by the obstacle parameters is determined in different manners;
according to the determined confidence degree corresponding to each parameter included by the obstacle parameters, carrying out confidence degree fusion processing on the confidence degree of each parameter according to a priority gradient fusion mode to obtain a total confidence degree; and
and performing confidence fusion processing on the total confidence to obtain fusion confidence.
Optionally, calculating an estimated movement track of the obstacle based on the movement track of the obstacle includes:
acquiring a plurality of historical position coordinates of the obstacle based on the movement track of the obstacle;
fitting a movement track to the plurality of historical position coordinates based on an Nth-order polynomial to obtain a first movement track, wherein N is an integer greater than 1;
when the curvature of a point on the first movement track is smaller than a preset curvature, fitting a movement track to the plurality of historical position coordinates based on a first-order polynomial to obtain the estimated movement track of the obstacle;
when the curvature of a point on the first movement track is not smaller than the preset curvature, fitting a movement track to the plurality of historical position coordinates based on an Mth-order polynomial to obtain a second movement track, wherein M is an integer greater than 1 and not equal to N; and
selecting, based on the historical position coordinates, the movement track with the higher fitting degree from the first movement track and the second movement track as the estimated movement track of the obstacle.
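The branching fit described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, the choice of NumPy polynomial fitting, the curvature formula for a curve given as y = f(x), and the use of residual variance as the "fitting degree" (as the later embodiment describes) are all assumptions; the example orders N = 3, M = 2 and the threshold 0.02 are taken from values the text mentions.

```python
import numpy as np

def estimate_trajectory(xs, ys, n=3, m=2, curvature_thresh=0.02):
    """Fit the obstacle's position history; fall back to a straight
    line when the Nth-order fit is nearly straight everywhere."""
    # Step 1: fit an Nth-order polynomial to the historical coordinates
    first = np.polyfit(xs, ys, n)
    # Curvature of y = f(x): |f''| / (1 + f'^2)^(3/2), sampled at the history
    d1, d2 = np.polyder(first), np.polyder(first, 2)
    curv = np.abs(np.polyval(d2, xs)) / (1.0 + np.polyval(d1, xs) ** 2) ** 1.5
    if np.all(curv < curvature_thresh):
        # Nearly straight: a first-order fit is more robust to noise points
        return np.polyfit(xs, ys, 1)
    # Step 2: otherwise also fit an Mth-order polynomial and keep the
    # better fit, judged by the variance of residuals against the history
    second = np.polyfit(xs, ys, m)
    var_first = np.var(ys - np.polyval(first, xs))
    var_second = np.var(ys - np.polyval(second, xs))
    return first if var_first < var_second else second
```

On a straight history the curvature check triggers and the function returns the two coefficients of a first-order fit.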
Optionally, the determining a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement trajectory of the obstacle includes:
when the obstacle parameter is the obstacle profile, clustering based on point cloud information of the estimated movement track of the obstacle to obtain a clustering result, framing out an oriented bounding box of the obstacle based on the clustering result, determining the direction of the obstacle based on the oriented bounding box, and obtaining a first direction angle in the global coordinate system based on the direction of the obstacle;
when the obstacle parameter is the obstacle speed, obtaining the speeds of the obstacle in the X direction and the Y direction in the global coordinate system based on the estimated movement track of the obstacle, and obtaining a first direction angle by using an inverse trigonometric function; and
when the obstacle parameter is the obstacle movement track, fitting the estimated movement track of the obstacle to obtain an analytic expression representing the obstacle track, and calculating a first direction angle corresponding to the current position according to the slope of the analytic expression of the obstacle track.
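The speed and trajectory branches above can be sketched as below; the bounding-box branch is omitted because it requires point-cloud clustering. The function names are illustrative, and representing the trajectory as polynomial coefficients (highest order first) is an assumption.

```python
import math
import numpy as np

def angle_from_velocity(vx, vy):
    # First direction angle from the obstacle's X and Y speeds in the
    # global coordinate system, via the inverse trigonometric function
    return math.atan2(vy, vx)

def angle_from_trajectory(coeffs, x_now):
    # First direction angle from the slope of the fitted trajectory
    # polynomial evaluated at the current position
    slope = float(np.polyval(np.polyder(coeffs), x_now))
    return math.atan(slope)
```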
Optionally, the determining the confidence of the first direction angle corresponding to each parameter included in the obstacle parameters based on the estimated movement track of the obstacle includes:
when the obstacle parameter is an obstacle outline, searching a first confidence coefficient corresponding to the length of the directional bounding box of the obstacle from a first data table, wherein the first data table comprises the corresponding relation between the length of the directional bounding box of the obstacle and the first confidence coefficient, and the first confidence coefficient is increased along with the increase of the length of the directional bounding box of the obstacle;
when the obstacle parameter is an obstacle speed, obtaining the obstacle speed based on the speeds of the obstacle in the X direction and the Y direction respectively under a global coordinate system, and searching a second confidence coefficient corresponding to the obstacle speed from a second data table, wherein the second data table comprises the corresponding relation between the obstacle speed and the second confidence coefficient, and the second confidence coefficient is increased along with the increase of the obstacle speed;
when the obstacle parameter is an obstacle moving track, obtaining a track parameter of the obstacle moving track based on the analytic expression representing the obstacle moving track, and searching a third confidence corresponding to the track parameter of the obstacle moving track from a third data table, wherein the third data table comprises a corresponding relation between any track parameter of the obstacle moving track and the third confidence, the track parameter comprises at least one of track length, track fitting degree and curvature of a point on the track, the third confidence increases with the increase of the track length or the track fitting degree, and the third confidence decreases with the increase of the curvature of the point on the track.
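The data-table lookups for the first and second confidences can be sketched with linear interpolation. The breakpoints below are invented for illustration (the text only requires the confidences to grow with bounding-box length and with speed); the real tables are left to the implementer.

```python
import numpy as np

# Illustrative data tables; the numbers are assumptions, chosen only so
# that confidence grows monotonically as the text requires.
_BOX_LEN, _BOX_CONF = [0.0, 2.0, 5.0], [0.1, 0.5, 0.9]
_SPEED, _SPEED_CONF = [0.0, 1.0, 10.0], [0.0, 0.3, 0.9]

def first_confidence(box_length):
    # First confidence: grows with the oriented bounding box length
    return float(np.interp(box_length, _BOX_LEN, _BOX_CONF))

def second_confidence(vx, vy):
    # Second confidence: speed magnitude from the X/Y components,
    # then looked up in the speed table
    return float(np.interp(np.hypot(vx, vy), _SPEED, _SPEED_CONF))
```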
Optionally, the performing, according to the determined confidence corresponding to each parameter included in the obstacle parameters, a confidence fusion process on the confidence of each parameter according to a priority gradient fusion manner includes:
and calculating to obtain a first pre-fusion confidence coefficient according to the following formula:
Conf_FitFinal = (1 - Conf_Vel) × Conf_Fit
where Conf_FitFinal represents the first pre-fusion confidence, Conf_Vel represents the second confidence, and Conf_Fit represents the third confidence.
Optionally, the performing, according to the determined confidence corresponding to each parameter included in the obstacle parameters, a confidence fusion process on the confidence of each parameter according to a priority gradient fusion manner further includes:
and calculating to obtain a second pre-fusion confidence coefficient according to the following formula:
Conf_BoundingBoxFinal = Conf_BoundingBox × min(1 - Conf_Vel, 1 - Conf_FitFinal)
where Conf_BoundingBoxFinal represents the second pre-fusion confidence and Conf_BoundingBox represents the first confidence.
Optionally, the performing, according to the determined confidence corresponding to each parameter included in the obstacle parameters, a confidence fusion process on the confidence of each parameter according to a priority gradient fusion manner further includes:
the total confidence is calculated according to the following formula:
EstimateConfidence = Conf_Vel + Conf_FitFinal + Conf_BoundingBoxFinal
where EstimateConfidence represents the total confidence.
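The three priority-gradient formulas above fit in a few lines; this sketch only restates them as code (the function name is illustrative). Velocity confidence has top priority, and the fit and bounding-box confidences are attenuated by the confidences ranked above them.

```python
def priority_gradient_fuse(conf_boundingbox, conf_vel, conf_fit):
    # First pre-fusion confidence: fit attenuated by velocity confidence
    conf_fit_final = (1.0 - conf_vel) * conf_fit
    # Second pre-fusion confidence: bounding box attenuated by both
    conf_boundingbox_final = conf_boundingbox * min(1.0 - conf_vel,
                                                    1.0 - conf_fit_final)
    # Total confidence is the sum of the three terms
    total = conf_vel + conf_fit_final + conf_boundingbox_final
    return conf_fit_final, conf_boundingbox_final, total
```

Note that when the velocity confidence is 1, both attenuated terms vanish and the total confidence is exactly the velocity confidence.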
Optionally, performing confidence fusion processing on the total confidence to obtain a fusion confidence specifically includes:
performing confidence fusion processing on the total confidence according to the following formula to obtain an estimated fusion direction angle:
HeadingEstimate = (Conf_Vel × Heading_Vel + Conf_FitFinal × Heading_Fit + Conf_BoundingBoxFinal × Heading_BoundingBox) / EstimateConfidence
where HeadingEstimate represents the estimated fusion direction angle, Heading_Fit represents the first direction angle of the obstacle corresponding to the obstacle movement track, Heading_Vel represents the first direction angle of the obstacle corresponding to the obstacle speed, and Heading_BoundingBox represents the first direction angle of the obstacle corresponding to the obstacle profile.
Optionally, performing confidence fusion processing on the total confidence to obtain a fusion confidence specifically includes:
performing confidence regression fusion processing on the total confidence according to the following formula to obtain the fusion result:
HeadingEstimateFinal = [ Σ_{i=0..n} DecayCoeff_i × EstimateConfidence_i × HeadingEstimate_i ] / [ Σ_{i=0..n} DecayCoeff_i × EstimateConfidence_i ]
where HeadingEstimateFinal represents the fusion result, HeadingEstimate_i represents the estimated fusion direction angle at the i-th time (i = 0 denotes the current time and i = n denotes the n-th historical time), EstimateConfidence_i is the total confidence at the i-th time, and DecayCoeff_i is the decay coefficient at the i-th time, which decreases as the difference between the historical time and the current time increases.
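The text describes a regression fusion over the current and historical estimates, weighted by total confidence and by a decay coefficient that shrinks with age. One plausible reading is the confidence- and decay-weighted average sketched below; the weighted form, the geometric decay, and the function name are assumptions, since the exact regression formula is not reproduced here.

```python
def temporal_fuse(headings, total_confidences, decay=0.8):
    """Decay- and confidence-weighted average of the current estimate
    (i = 0) and the n most recent historical estimates."""
    num = den = 0.0
    for i, (h, c) in enumerate(zip(headings, total_confidences)):
        w = c * decay ** i  # the decay coefficient shrinks with age
        num += w * h
        den += w
    return num / den
```

With equal confidences the result stays at the common heading; when the history disagrees, the current estimate carries the largest weight.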
The present invention also provides a system for fusing direction angles of obstacles, comprising:
the estimated movement track generation module is used for determining at least two of the obstacle outline, the obstacle speed and the obstacle movement track as obstacle parameters and calculating the estimated movement track of the obstacle based on each obstacle parameter; determining a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement trajectory of the obstacle;
a confidence coefficient determining module, configured to determine, based on the estimated movement trajectory of the obstacle, a confidence coefficient of the first direction angle corresponding to each parameter included in the obstacle parameters, where the confidence coefficient of the first direction angle of each parameter included in the obstacle parameters is determined differently;
the first fusion module is used for performing confidence fusion processing on the confidence of each parameter, according to the determined confidence corresponding to each parameter included in the obstacle parameters and in a priority gradient fusion manner, to obtain a total confidence;
and the second fusion module is used for performing confidence fusion processing on the total confidence to obtain fusion confidence.
According to the method and the system for fusing the direction angles of obstacles provided by the embodiment of the invention, at least two of the obstacle profile, the obstacle speed and the obstacle movement track are determined as obstacle parameters, and an estimated movement track of the obstacle is calculated based on each obstacle parameter; a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters is determined based on the estimated movement track of the obstacle; a confidence corresponding to the first direction angle of each parameter included in the obstacle parameters is determined based on the estimated movement track of the obstacle; according to the determined confidence corresponding to each parameter included in the obstacle parameters, confidence fusion processing is performed on the confidence of each parameter in a priority gradient fusion manner to obtain a total confidence; and confidence fusion processing is performed on the total confidence to obtain a fusion confidence. The method and the system can improve the accuracy of the calculation of the moving direction of the obstacle and reduce the probability of collision with an obstacle.
Of course, it is not necessary for any product or method of practicing the invention to achieve all of the above-described advantages at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic view of an orientation angle provided by an embodiment of the present invention;
FIG. 2 is a schematic diagram of a direction angle obtained by framing out the oriented bounding box of an obstacle according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a moving track according to an embodiment of the present invention;
FIG. 4 is a flowchart of a method for fusing direction angles of obstacles according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of determining a direction angle from a fitted curve according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of confidence degradation provided by an embodiment of the present invention;
FIG. 7 is a schematic diagram of a fusion process of direction angles of obstacles according to an embodiment of the present invention;
FIG. 8 is a block diagram of a system for fusing directional angles of obstacles according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the field of intelligent driving, in order to avoid collision of vehicles during driving, the moving direction of an obstacle needs to be determined. The obstacle may be a surrounding vehicle whose distance from the vehicle is smaller than a preset distance, and may be in a moving state or a stationary state. Alternatively, the obstacle moving direction may be calculated by an obstacle direction angle. As shown in fig. 1, the direction angle may be an angle between the direction in which the obstacle travels and the positive direction of the x-axis of the global coordinate system in which the obstacle is located. The global coordinate system can be a preset coordinate system, the position relation between the vehicle and the obstacle can be obtained under the global coordinate system, whether collision risk exists or not can be judged by utilizing the direction angle of the obstacle, and if the collision risk exists, the moving track of the vehicle can be adjusted.
The obstacle direction angle is calculated by the following three methods:
the first method is to calculate the obstacle direction angle using a directional bounding box.
The method performs clustering using information such as laser point clouds and frames out the Oriented Bounding Box (OBB) of the obstacle based on the clustering result. Optionally, the oriented bounding box may be a rectangular parallelepiped; fig. 2 is a schematic diagram of the direction angle obtained from the framed oriented bounding box, where the direction angle of the obstacle is α. Because the clustering algorithm tends to split a large target into small ones when clustering the laser point cloud information, an inaccurate clustering result degrades the framing of the obstacle, and a framed obstacle profile that is too large or too small reduces the accuracy of the calculated direction angle.
The second method is to calculate the obstacle direction angle using the obstacle velocity.
The method utilizes the speed of the obstacle in the X direction and the Y direction of the global coordinate system and adopts an inverse trigonometric function to calculate and obtain the direction angle of the obstacle. The method depends on the accuracy of speed measurement, and because the amplitude of the noise sensed by the speed sensor is fixed, when the speed is lower, the signal-to-noise ratio of speed information is very low, the problem of serious distortion under a low-speed working condition exists, and the accuracy of the calculation of the direction angle of the obstacle is influenced.
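The low-speed distortion can be made concrete with a toy calculation: a fixed-amplitude noise on one speed component (0.1 m/s here, an assumed value) swings the computed angle by tens of degrees at low speed but under a degree at high speed.

```python
import math

NOISE = 0.1  # fixed sensor noise amplitude on the Y speed, in m/s (assumed)

def angle_error_deg(vx):
    # True heading is along +X (angle 0); the noisy angle comes from atan2,
    # so the full error is the angle induced by the noise component alone
    return abs(math.degrees(math.atan2(NOISE, vx)))

low_speed_err = angle_error_deg(0.2)    # large error at 0.2 m/s
high_speed_err = angle_error_deg(10.0)  # small error at 10 m/s
```

This is why the method below gives the velocity-based angle a confidence that grows with speed rather than trusting it unconditionally.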
The third method is to calculate the obstacle direction angle using a fitted curve.
The method comprises the steps of fitting a moving curve of an obstacle by using historical position coordinates of the obstacle to obtain an analytic expression capable of representing a moving track of the obstacle, and then calculating a direction angle corresponding to the current position of the obstacle according to the slope of the analytic expression. As shown in fig. 3, fig. 3(a) is a movement trajectory based on a cubic polynomial fitting, fig. 3(b) is a movement trajectory based on a quadratic polynomial fitting, and fig. 3(c) is a movement trajectory based on a first-order polynomial fitting. The higher the order of the polynomial, the greater the change in slope of the fitted trajectory at the end of the curve. When the method is used for calculating the direction angle of the obstacle at the current position, the robustness is poor, the obstacle is easily interfered by noise points to generate large variation, and the real direction angle of the obstacle cannot be reflected.
Based on this, the present invention provides a method for fusing direction angles of obstacles, as shown in fig. 4, the method comprising:
step 401: determining at least two of the obstacle profile, the obstacle speed and the obstacle moving track as obstacle parameters, and respectively calculating the estimated moving track of the obstacle based on each obstacle parameter; a first direction angle of the obstacle corresponding to each of the parameters included in the obstacle parameter is determined based on the estimated movement trajectory of the obstacle.
In this embodiment, the obstacle contour may be obtained based on a directed bounding box, and the obstacle contour may be a contour of the directed bounding box; the barrier speed can be obtained based on a speed sensor, and the barrier speed is the speed in the X direction and the Y direction under the global coordinate system; the obstacle movement trajectory may be obtained by curve fitting based on the historical position coordinates of the obstacle.
Optionally, if at least two of the obstacle profile, the obstacle speed, and the obstacle moving trajectory are determined as the obstacle parameters, the obstacle parameters may include the obstacle profile and the obstacle speed, the obstacle parameters may also include the obstacle profile and the obstacle moving trajectory, the obstacle parameters may also include the obstacle speed and the obstacle moving trajectory, and of course, the obstacle parameters may also be the obstacle profile, the obstacle speed, and the obstacle moving trajectory.
As an optional implementation, calculating the estimated movement track of the obstacle based on the movement track of the obstacle includes: acquiring a plurality of historical position coordinates of the obstacle based on the movement track of the obstacle; fitting a movement track to the plurality of historical position coordinates based on an Nth-order polynomial to obtain a first movement track, wherein N is an integer greater than 1; when the curvature of a point on the first movement track is smaller than a preset curvature, fitting a movement track to the plurality of historical position coordinates based on a first-order polynomial to obtain the estimated movement track of the obstacle; when the curvature of a point on the first movement track is not smaller than the preset curvature, fitting a movement track to the plurality of historical position coordinates based on an Mth-order polynomial to obtain a second movement track, wherein M is an integer greater than 1 and not equal to N; and selecting, based on the historical position coordinates, the movement track with the higher fitting degree from the first movement track and the second movement track as the estimated movement track of the obstacle.
In this embodiment, a first movement track is first obtained by Nth-order polynomial fitting. If the curvature of every point on the fitted curve is smaller than a preset curvature (optionally in the range 0 to 0.1, for example 0.02), the movement track of the obstacle can be regarded as a straight-line trajectory. In that case, a first-order polynomial fit yields a more accurate movement track than the Nth-order fit (N > 1), and avoids the problems that a curve fitted by a high-order polynomial has poor robustness, is easily disturbed by noise points into large variations, and therefore yields an inaccurate direction angle.
If, instead, the curvature of a point on the curve fitted by the Nth-order polynomial is not smaller than the preset curvature (optionally in the range 0 to 0.1, for example 0.02), the movement track of the obstacle can be regarded as a curved trajectory. In that case, at least two high-order polynomials (such as an Nth-order and an Mth-order polynomial) can be used to fit the plurality of historical position coordinates, and the movement track with the higher fitting degree is selected as the estimated movement track of the obstacle. Comparing fitting degrees and keeping the better fit reduces the impact of the poor robustness and noise sensitivity of high-order polynomial fits on the accuracy of the calculated direction angle.
Optionally, the method for selecting a movement trajectory with a high degree of fitting from the first movement trajectory and the second movement trajectory as the movement trajectory of the obstacle specifically includes: calculating the variance of the deviation value between the coordinate of the point on the first moving track and the corresponding historical position coordinate to obtain a first variance; calculating the variance of the coordinate of the point on the second moving track and the deviation value of the corresponding historical position coordinate to obtain a second variance; when the first variance is smaller than the second variance, taking the first moving track as the estimated moving track of the obstacle; and when the second variance is smaller than the first variance, taking the second movement track as the estimated movement track of the obstacle.
When determining the first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement track of the obstacle, optionally, if two of the obstacle profile, the obstacle speed, and the obstacle movement track are determined as the obstacle parameters, then each obstacle parameter obtains one first direction angle, and a total of the two first direction angles can be obtained. Of course, if the obstacle profile, the obstacle speed, and the obstacle movement trajectory are all determined as the obstacle parameters, a total of three first direction angles can be obtained.
In this embodiment, the method for determining the first direction angle of the obstacle corresponding to the obstacle profile based on the estimated movement trajectory of the obstacle may be: cluster the point cloud information based on the estimated movement trajectory of the obstacle to obtain a clustering result, extract the directional bounding box of the obstacle from the clustering result, determine the direction of the obstacle based on the directional bounding box, and obtain the first direction angle in the global coordinate system based on that direction.
The method for determining the first direction angle of the obstacle corresponding to the speed of the obstacle based on the estimated movement track of the obstacle may be to obtain the speeds of the obstacle in the X direction and the Y direction respectively in the global coordinate system based on the estimated movement track of the obstacle, and obtain the first direction angle by using an inverse trigonometric function.
The method for determining the first direction angle of the obstacle corresponding to the obstacle movement trajectory based on the estimated movement trajectory may be: fit the estimated movement trajectory to obtain an analytic expression representing the obstacle movement trajectory, and calculate the first direction angle at the current position from the slope of that analytic expression, where the current position may be the end position of the estimated movement trajectory. As shown in fig. 5, the first direction angle may also be obtained from the slope of the straight line through the two points closest to the end position of the estimated movement trajectory. The solid dots in fig. 5 indicate the positions of the obstacle.
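The two angle computations just described, the speed direction via an inverse trigonometric function and the trajectory direction via the slope near the end point, might be sketched as follows; the function names are illustrative.

```python
import math

def heading_from_velocity(vx, vy):
    # First direction angle from the X/Y velocity components in the
    # global frame, using the inverse trigonometric function atan2.
    return math.atan2(vy, vx)

def heading_from_trajectory_end(points):
    # Slope-based angle from the straight line through the two points
    # closest to the end of the estimated trajectory (cf. fig. 5).
    (x0, y0), (x1, y1) = points[-2], points[-1]
    return math.atan2(y1 - y0, x1 - x0)
```

`atan2` is used rather than `atan(vy/vx)` so that the angle is correct in all four quadrants and well defined when `vx` is zero.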
Step 402: and determining the confidence degree of the first direction angle corresponding to each parameter included by the obstacle parameters based on the estimated movement track of the obstacle, wherein the confidence degree of the first direction angle of each parameter included by the obstacle parameters is determined in different ways.
As an optional implementation manner, step 402 specifically includes:
when the obstacle parameter is an obstacle outline, searching a first confidence coefficient corresponding to the length of the directional bounding box of the obstacle from a first data table, wherein the first data table comprises the corresponding relation between the length of the directional bounding box of the obstacle and the first confidence coefficient, and the first confidence coefficient is increased along with the increase of the length of the directional bounding box of the obstacle.
According to the principle by which the directional bounding box of the obstacle is obtained from laser point cloud clustering, the clustering tends to over-segment, that is, a large target is prone to being split into several small targets in the clustering result. Therefore, the larger the obstacle framed by the directional bounding box and the longer its longitudinal length, the closer the framed result is to the actual obstacle, and the higher the confidence of the obstacle direction angle obtained from that result.
Optionally, the first data table may be obtained by methods such as manual calibration and machine learning. As an alternative embodiment, the first data table is shown in table 1.
TABLE 1 First data table

Vehicle bounding-box longitudinal length | First confidence
4 m | 1
3 m | 0.75
2 m | 0.25
1 m | 0
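A possible realization of the Table 1 lookup is shown below. Linear interpolation between the calibrated entries is an assumption, since the table only specifies discrete lengths; a step-wise lookup would also be consistent with the text.

```python
def first_confidence(box_length_m):
    # Table 1 entries: (longitudinal length in metres, first confidence).
    table = [(1.0, 0.0), (2.0, 0.25), (3.0, 0.75), (4.0, 1.0)]
    # Clamp outside the calibrated range.
    if box_length_m <= table[0][0]:
        return table[0][1]
    if box_length_m >= table[-1][0]:
        return table[-1][1]
    # Linear interpolation between neighbouring calibrated entries.
    for (l0, c0), (l1, c1) in zip(table, table[1:]):
        if l0 <= box_length_m <= l1:
            return c0 + (c1 - c0) * (box_length_m - l0) / (l1 - l0)
```

The same table-plus-interpolation pattern applies to the second and third data tables described below.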
When the obstacle parameter is the obstacle speed, the obstacle speed is obtained based on the speeds of the obstacle in the X direction and the Y direction under the global coordinate system, and a second confidence coefficient corresponding to the obstacle speed is searched from a second data table, wherein the second data table comprises the corresponding relation between the obstacle speed and the second confidence coefficient, and the second confidence coefficient is increased along with the increase of the obstacle speed.
As can be seen from the principle of calculating the direction angle from the velocity, the accuracy of the direction-angle calculation depends on the accuracy of the velocity measurement. Because the amplitude of the noise sensed by the speed sensor is fixed, the signal-to-noise ratio of the speed information is very low at low speed, and the accuracy of the calculated obstacle direction angle is correspondingly low. Optionally, the second data table may be obtained by manually calibrating the noise amplitude against the speed information, or by machine learning. The second confidence can then be obtained online by looking up a speed-interval-to-confidence data table generated offline.
When the obstacle parameter is the obstacle moving track, obtaining the track parameter of the obstacle moving track based on an analytic expression representing the obstacle moving track, and searching a third confidence coefficient corresponding to the track parameter of the obstacle moving track from a third data table, wherein the third data table comprises a corresponding relation between any track parameter of the obstacle moving track and the third confidence coefficient, the track parameter comprises at least one of track length, track fitting degree and curvature of a point on the track, the third confidence coefficient is increased along with the increase of the track length or the track fitting degree, and the third confidence coefficient is decreased along with the increase of the curvature of the point on the track.
A short obstacle movement trajectory reflects a low obstacle speed; at a low speed the possibility of collision is lower than at a high speed, and the influence of the obstacle on the vehicle is reduced. Therefore, the smaller the track length of the obstacle movement trajectory, the lower the confidence. Optionally, the third confidence is determined using the variance, in the X and Y dimensions, of the position points at each historical time on the estimated movement trajectory. When the variance is small, the points can be considered concentrated, noise strongly affects the curve fitting, and the resulting estimated trajectory can hardly reflect the real direction angle; when the variance in the X dimension and/or the Y dimension is large, the points are dispersed, the trajectory is long, and the direction angle obtained from the obstacle movement trajectory has a high confidence. The third data table can be obtained by methods such as manual calibration and machine learning.
Of course, the confidence coefficient can also be determined according to the track fitting degree, and the higher the track fitting degree is, the higher the confidence coefficient is. The track fitting degree can reflect the degree of correlation between the point on the estimated movement track and the actual coordinate point, and the degree of correlation can be a determination coefficient.
In addition, the confidence level can be determined through the dynamic indexes of the vehicle, for example, when the curvature of the predicted moving track violates the limitation of the minimum turning radius of the vehicle, the fitted curve cannot represent the moving track of the obstacle. Therefore, the confidence when the curvature of the point on the trajectory exceeds the preset curvature is smaller than the confidence when the curvature of the point on the trajectory does not exceed the preset curvature. Wherein the preset curvature may be a curvature corresponding to a point at a turn on a moving trajectory generated by the vehicle with a minimum turning radius. Optionally, the confidence level when the preset curvature is exceeded is 0, and the confidence level when the preset curvature is not exceeded is 1.
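The turning-radius gate just described might look like this. The 5 m minimum turning radius is a placeholder value, and kappa_max = 1/R_min is the usual relation between a vehicle's minimum turning radius and the maximum curvature it can produce.

```python
def trajectory_curvature_confidence(curvature, min_turn_radius_m=5.0):
    # A curvature beyond what the minimum turning radius allows cannot
    # represent a real obstacle trajectory, so the confidence is gated
    # to 0; otherwise it is 1, as in the optional binary scheme above.
    preset_curvature = 1.0 / min_turn_radius_m  # kappa_max = 1 / R_min
    return 0.0 if curvature > preset_curvature else 1.0
```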
Step 403: according to the determined confidence corresponding to each parameter included in the obstacle parameters, performing confidence fusion processing on the confidence of each parameter in a priority gradient fusion manner to obtain a total confidence.
As an optional implementation manner, step 403 includes:
and calculating to obtain a first pre-fusion confidence coefficient according to the following formula:
ConfFitFinal = (1 - ConfVel) * ConfFit

where ConfFitFinal represents the first pre-fusion confidence, ConfVel represents the second confidence, and ConfFit represents the third confidence.
As another optional implementation manner, step 403 further includes:
and calculating to obtain a second pre-fusion confidence coefficient according to the following formula:
ConfBoundingBoxFinal = ConfBoundingBox * min((1 - ConfVel), (1 - ConfFitFinal))

where ConfBoundingBoxFinal represents the second pre-fusion confidence and ConfBoundingBox represents the first confidence.
As another optional implementation manner, step 403 further includes:
the total confidence is calculated according to the following formula:
EstimateConfidence = ConfVel + ConfFitFinal + ConfBoundingBoxFinal

where EstimateConfidence represents the total confidence.
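The three formulas of step 403 combine directly into one small function; the sketch below is a direct transcription of the equations above.

```python
def priority_gradient_fusion(conf_vel, conf_fit, conf_bbox):
    # Priority gradient fusion: the speed confidence (ConfVel) has top
    # priority; the trajectory and bounding-box confidences only get the
    # headroom the higher-priority terms leave behind.
    conf_fit_final = (1.0 - conf_vel) * conf_fit
    conf_bbox_final = conf_bbox * min(1.0 - conf_vel, 1.0 - conf_fit_final)
    total = conf_vel + conf_fit_final + conf_bbox_final
    return conf_fit_final, conf_bbox_final, total
```

Note that when `conf_vel` is 1 the other two contributions vanish entirely, which is exactly the priority behaviour described in the next paragraph.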
In this embodiment, the confidences enter the fusion in order of priority: the second confidence (obstacle speed) first, then the third confidence (obstacle movement trajectory), then the first confidence (obstacle profile). When the confidence of the first direction angle corresponding to the obstacle speed is high, the first confidence and the third confidence participate in the fusion only with low weight even if they are themselves high, because a high second confidence attenuates them through the (1 - ConfVel) factors.
Step 404: and performing confidence fusion processing on the total confidence to obtain fusion confidence.
As an optional implementation manner, step 404 specifically includes:
performing confidence fusion processing on the total confidence according to the following formula to obtain an estimated fusion confidence:

HeadingEstimate = (ConfVel*HeadingVel + ConfFitFinal*HeadingFit + ConfBoundingBoxFinal*HeadingBoundingBox) / EstimateConfidence

where HeadingEstimate represents the estimated fusion confidence, HeadingFit represents the first direction angle of the obstacle corresponding to the obstacle movement trajectory, HeadingVel represents the first direction angle of the obstacle corresponding to the obstacle speed, and HeadingBoundingBox represents the first direction angle of the obstacle corresponding to the obstacle profile.
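Since the fusion formula itself appears in the original only as an image, the sketch below assumes a confidence-weighted average of the three first direction angles, normalized by the total confidence; that normalization is an assumption, inferred from the terms defined for step 404.

```python
def fuse_heading(h_vel, h_fit, h_bbox,
                 conf_vel, conf_fit_final, conf_bbox_final):
    # Hypothetical confidence-weighted fusion of the three first
    # direction angles; weights are the priority-gradient confidences.
    total = conf_vel + conf_fit_final + conf_bbox_final
    return (conf_vel * h_vel
            + conf_fit_final * h_fit
            + conf_bbox_final * h_bbox) / total
```

A plain weighted average is only safe when the candidate angles do not straddle the ±π wrap-around; a production implementation would average unit vectors or unwrap the angles first.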
After the estimated fusion confidence is obtained, it may still rest only on methods whose individual confidences are low; in that case the weighting yields the best estimate available, but its actual confidence remains low. Therefore, as another embodiment of the present invention, confidence regression fusion processing is performed on the total confidence to obtain the fusion confidence.
Based on this, in the method provided by the present invention, the confidence regression fusion processing is performed on the total confidence to obtain the fusion confidence, which specifically includes:
performing confidence regression fusion processing on the total confidence according to the following formula to obtain the fusion confidence:

HeadingEstimateFinal = Σ(i=0..n)(HeadingEstimate_i * EstimateConfidence_i * DecayCoeff_i) / Σ(i=0..n)(EstimateConfidence_i * DecayCoeff_i)

where HeadingEstimateFinal represents the fusion confidence, HeadingEstimate_i represents the estimated fusion confidence at the i-th time (i = 0 denotes the current time and i = n the n-th historical time), EstimateConfidence_i is the total confidence at the i-th time, and DecayCoeff_i is the decay coefficient at the i-th time, which decreases as the gap between the historical time and the current time increases.
The above method multiplies the confidence by a decay coefficient, a coefficient that becomes smaller over time: the older the information, the more severely its confidence decays. Specifically, as shown in fig. 6, the confidence optionally keeps its original value for the first 100 ms and is then attenuated to 0 with a fixed slope, reaching 0 after 300 ms. Of course, the value of the decay coefficient may change in practical applications with the vehicle speed and the specific scene; for example, a steeper decay slope may be used when the vehicle moves at high speed, because historical information is genuinely less trustworthy at high speed.
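The decay profile described for fig. 6, full weight for the first 100 ms followed by a linear decay reaching 0 at 300 ms, can be sketched as:

```python
def decay_coefficient(age_ms, hold_ms=100.0, zero_ms=300.0):
    # Fig. 6 profile: hold the original confidence for hold_ms, then
    # decay linearly with a fixed slope to 0 at zero_ms.
    if age_ms <= hold_ms:
        return 1.0
    if age_ms >= zero_ms:
        return 0.0
    return 1.0 - (age_ms - hold_ms) / (zero_ms - hold_ms)
```

At high vehicle speed, `zero_ms` could simply be reduced to realise the steeper slope the text suggests.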
Fig. 7 further illustrates the method for fusing the direction angles of the obstacle provided by the present invention, and as shown in fig. 7, the obstacle state information is the estimated movement trajectory of the obstacle.
The Bounding Box frame angle is the first direction angle of the obstacle corresponding to the obstacle profile, determined based on the estimated movement trajectory of the obstacle; the speed direction estimation angle is the first direction angle of the obstacle corresponding to the obstacle speed; the historical track fitting estimation angle is the first direction angle of the obstacle corresponding to the obstacle movement trajectory.
The confidence degree calculation based on the Bounding Box frame angle is to search a first confidence degree corresponding to the length of the directed Bounding Box of the obstacle from a first data table; the confidence degree calculation based on the speed direction estimation angle is to obtain the speed of the obstacle based on the speeds of the obstacle in the X direction and the Y direction respectively under the global coordinate system, and search a second confidence degree corresponding to the speed of the obstacle from a second data table; the confidence calculation based on the historical track fitting estimation angle is to obtain the track parameter of the moving track of the obstacle based on the analytic expression representing the moving track of the obstacle, and look up a third confidence corresponding to the track parameter of the moving track of the obstacle from a third data table.
The result fusion of various methods based on the current time information is to perform confidence fusion processing on the total confidence to obtain the estimated fusion confidence HeadingEstimate.
The historical optimal estimation takes the estimated fusion confidence HeadingEstimate_i at the i-th time as input, and the confidence regression fusion based on the optimal historical estimation performs confidence regression fusion processing on the total confidence to obtain the fusion confidence HeadingEstimateFinal.
And finally outputting the optimal obstacle direction angle estimation HeadingEstimateFinal and the confidence corresponding to the optimal obstacle direction angle estimation.
The present invention also provides a system for fusing direction angles of obstacles, as shown in fig. 8, the system comprising:
an estimated movement track generation module 801, configured to determine at least two of an obstacle profile, an obstacle speed, and an obstacle movement track as obstacle parameters, and calculate an estimated movement track of the obstacle based on each obstacle parameter; a first direction angle of the obstacle corresponding to each of the parameters included in the obstacle parameter is determined based on the estimated movement trajectory of the obstacle.
The estimated movement trajectory generating module 801 specifically includes:
the estimated movement track generation unit is used for acquiring a plurality of historical position coordinates of the obstacle based on the movement track of the obstacle; fitting the moving tracks of the plurality of historical position coordinates based on an Nth-order polynomial to obtain a first moving track, wherein N is an integer greater than 1; when the curvature of a point on the first moving track is smaller than the preset curvature, performing moving track fitting on a plurality of historical position coordinates based on a first-order polynomial to obtain an estimated moving track of the barrier; when the curvature of a point on the first moving track is not smaller than the preset curvature, performing moving track fitting on a plurality of historical position coordinates based on an M-th-order polynomial to obtain a second moving track, wherein M is an integer which is larger than 1 and not equal to N; and selecting a moving track with high fitting degree from the first moving track and the second moving track as an estimated moving track of the obstacle based on the historical position coordinates.
The system comprises a first direction angle generating unit, which is used for clustering point cloud information of the estimated movement track of the obstacle to obtain a clustering result when the obstacle parameter is the obstacle profile, extracting the directional bounding box of the obstacle from the clustering result, determining the direction of the obstacle based on the directional bounding box, and obtaining a first direction angle in the global coordinate system based on that direction; when the obstacle parameter is the obstacle speed, obtaining the speeds of the obstacle in the X direction and the Y direction respectively under the global coordinate system based on the estimated movement track of the obstacle, and obtaining a first direction angle by adopting an inverse trigonometric function; and when the obstacle parameter is the obstacle movement track, fitting the estimated movement track of the obstacle to obtain an analytic expression representing the obstacle track, and calculating a first direction angle corresponding to the current position according to the slope of the analytic expression of the obstacle track.
A confidence determining module 802, configured to determine, based on the estimated movement trajectory of the obstacle, a confidence corresponding to the first direction angle of each parameter included in the obstacle parameters, where the confidence of the first direction angle of each parameter included in the obstacle parameters is determined differently.
The confidence determining module 802 specifically includes:
the first confidence determining unit is used for searching a first confidence corresponding to the length of the directional bounding box of the obstacle from a first data table when the obstacle parameter is the obstacle outline, wherein the first data table comprises the corresponding relation between the length of the directional bounding box of the obstacle and the first confidence, and the first confidence increases along with the increase of the length of the directional bounding box of the obstacle.
And the second confidence determining unit is used for obtaining the speed of the obstacle based on the speeds of the obstacle in the X direction and the Y direction respectively under the global coordinate system when the obstacle parameter is the speed of the obstacle, and searching a second confidence corresponding to the speed of the obstacle from a second data table, wherein the second data table comprises the corresponding relation between the speed of the obstacle and the second confidence, and the second confidence increases along with the increase of the speed of the obstacle.
And the third confidence determining unit is used for obtaining the track parameter of the obstacle moving track based on the analytic expression representing the obstacle moving track when the obstacle parameter is the obstacle moving track, and searching a third confidence corresponding to the track parameter of the obstacle moving track from a third data table, wherein the third data table comprises a corresponding relation between any track parameter of the obstacle moving track and the third confidence, the track parameter comprises at least one of track length, track fitting degree and curvature of a point on the track, the third confidence increases along with the increase of the track length or the track fitting degree, and the third confidence decreases along with the increase of the curvature of the point on the track.
The first fusion module 803 is configured to perform confidence fusion processing on the confidence of each parameter, according to the determined confidence corresponding to each parameter included in the obstacle parameters and in a priority gradient fusion manner, to obtain a total confidence.
The first fusion module 803 specifically includes:
the first pre-fusion confidence calculation unit is used for calculating and obtaining a first pre-fusion confidence according to the following formula:
ConfFitFinal = (1 - ConfVel) * ConfFit

where ConfFitFinal represents the first pre-fusion confidence, ConfVel represents the second confidence, and ConfFit represents the third confidence.
The second pre-fusion confidence calculation unit is used for calculating and obtaining a second pre-fusion confidence according to the following formula:
ConfBoundingBoxFinal = ConfBoundingBox * min((1 - ConfVel), (1 - ConfFitFinal))

where ConfBoundingBoxFinal represents the second pre-fusion confidence and ConfBoundingBox represents the first confidence.
The total confidence computing unit is used for computing and obtaining the total confidence according to the following formula:
EstimateConfidence = ConfVel + ConfFitFinal + ConfBoundingBoxFinal

where EstimateConfidence represents the total confidence.
And the second fusion module 804 is configured to perform confidence fusion processing on the total confidence to obtain a fusion confidence.
The second fusion module 804 specifically includes:
and the estimated fusion confidence generating unit is used for performing confidence fusion processing on the total confidence according to the following formula to obtain an estimated fusion confidence:
where the Heading estimate represents the estimated fusion confidence, HeadingFitRepresenting a first direction angle, Heading, of an obstacle corresponding to a trajectory of movement of the obstacleVelRepresenting a first direction angle, Heading, of the obstacle corresponding to the speed of the obstacleBoundingboxA first direction angle of the obstacle corresponding to the obstacle profile is indicated.
The fusion confidence generating unit is used for performing confidence regression fusion processing on the total confidence according to the following formula to obtain the fusion confidence:

HeadingEstimateFinal = Σ(i=0..n)(HeadingEstimate_i * EstimateConfidence_i * DecayCoeff_i) / Σ(i=0..n)(EstimateConfidence_i * DecayCoeff_i)

where HeadingEstimateFinal represents the fusion confidence, HeadingEstimate_i represents the estimated fusion confidence at the i-th time (i = 0 denotes the current time and i = n the n-th historical time), EstimateConfidence_i is the total confidence at the i-th time, and DecayCoeff_i is the decay coefficient at the i-th time, which decreases as the gap between the historical time and the current time increases.
An embodiment of the present invention provides a computer-readable storage medium, on which a program is stored, which, when executed by a processor, implements the above-described method for fusing obstacle direction angles.
An embodiment of the present invention provides an electronic device, as shown in fig. 9, an electronic device 90 includes at least one processor 901, at least one memory 902 connected to the processor 901, and a bus 903; the processor 901 and the memory 902 complete communication with each other through the bus 903; the processor 901 is configured to call program instructions in the memory 902 to execute the above-mentioned method for fusing the direction angles of the obstacle. The electronic device herein may be a server, a PC, a PAD, a mobile phone, etc.
The present application also provides a computer program product which, when executed on a data processing device, is adapted to execute a program that performs the steps of the above method for fusing obstacle direction angles.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, systems and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a device includes one or more processors (CPUs), memory, and a bus. The device may also include input/output interfaces, network interfaces, and the like.
The memory may include volatile memory in a computer readable medium, Random Access Memory (RAM) and/or nonvolatile memory such as Read Only Memory (ROM) or flash memory (flash RAM), and the memory includes at least one memory chip. The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
Claims (10)
1. A method for fusing direction angles of obstacles, comprising:
determining at least two of the obstacle profile, the obstacle speed and the obstacle moving track as obstacle parameters, and respectively calculating the estimated moving track of the obstacle based on each obstacle parameter; determining a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement trajectory of the obstacle;
determining the confidence degree of the first direction angle corresponding to each parameter included by the obstacle parameters based on the estimated movement track of the obstacle, wherein the confidence degree of the first direction angle of each parameter included by the obstacle parameters is determined in different manners;
according to the determined confidence degree corresponding to each parameter included by the obstacle parameters, carrying out confidence degree fusion processing on the confidence degree of each parameter according to a priority gradient fusion mode to obtain a total confidence degree; and
and performing confidence fusion processing on the total confidence to obtain fusion confidence.
2. The method for fusing obstacle direction angles according to claim 1, wherein calculating the estimated movement trajectory of the obstacle based on the obstacle movement trajectory comprises:
acquiring a plurality of historical position coordinates of the obstacle based on the movement track of the obstacle;
fitting a moving track of the plurality of historical position coordinates based on an Nth-order polynomial to obtain a first moving track, wherein N is an integer greater than 1;
when the curvature of a point on the first moving track is smaller than a preset curvature, performing moving track fitting on the plurality of historical position coordinates based on a first-order polynomial to obtain an estimated moving track of the obstacle;
when the curvature of a point on the first moving track is not smaller than the preset curvature, performing moving track fitting on the plurality of historical position coordinates based on an M-th-order polynomial to obtain a second moving track, wherein M is an integer which is larger than 1 and not equal to N;
and selecting a movement track with high fitting degree from the first movement track and the second movement track as an estimated movement track of the obstacle based on the historical position coordinates.
3. The method for fusing obstacle direction angles according to claim 1, wherein determining the first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement track of the obstacle comprises:
when the obstacle parameter is the obstacle outline, performing clustering based on point cloud information of the estimated movement track of the obstacle to obtain a clustering result, framing an oriented bounding box of the obstacle based on the clustering result, determining the orientation of the obstacle based on the oriented bounding box of the obstacle, and obtaining the first direction angle in a global coordinate system based on the orientation of the obstacle;
when the obstacle parameter is the obstacle speed, obtaining the speeds of the obstacle in the X direction and the Y direction in the global coordinate system based on the estimated movement track of the obstacle, and obtaining the first direction angle using an inverse trigonometric function; and
when the obstacle parameter is the obstacle movement track, fitting the estimated movement track of the obstacle to obtain an analytic expression representing the obstacle track, and calculating the first direction angle corresponding to the current position from the slope of the analytic expression of the obstacle track.
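The speed and movement-track branches of claim 3 can be illustrated with short Python helpers. The use of `atan2` for the inverse trigonometric function and the highest-order-first coefficient convention are assumptions for the sketch:

```python
import math

def heading_from_velocity(vx, vy):
    # Claim 3, speed branch: first direction angle from the X/Y velocity
    # components via an inverse trigonometric function (atan2 form, radians).
    return math.atan2(vy, vx)

def heading_from_track(coeffs, x):
    # Claim 3, movement-track branch: the slope of the fitted analytic
    # expression at the current position x gives the first direction angle.
    # coeffs are polynomial coefficients, highest order first.
    order = len(coeffs) - 1
    slope = sum(c * (order - i) * x ** (order - i - 1)
                for i, c in enumerate(coeffs[:-1]))
    return math.atan(slope)
```

`atan2` is preferable to a plain arctangent for the speed branch because it preserves the quadrant of the motion direction.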
4. The method according to claim 3, wherein determining the confidence of the first direction angle corresponding to each parameter included in the obstacle parameters based on the estimated movement track of the obstacle comprises:
when the obstacle parameter is the obstacle outline, searching a first data table for a first confidence corresponding to the length of the oriented bounding box of the obstacle, wherein the first data table includes the correspondence between the length of the oriented bounding box of the obstacle and the first confidence, and the first confidence increases as the length of the oriented bounding box of the obstacle increases;
when the obstacle parameter is the obstacle speed, obtaining the obstacle speed based on the speeds of the obstacle in the X direction and the Y direction in the global coordinate system, and searching a second data table for a second confidence corresponding to the obstacle speed, wherein the second data table includes the correspondence between the obstacle speed and the second confidence, and the second confidence increases as the obstacle speed increases; and
when the obstacle parameter is the obstacle movement track, obtaining a track parameter of the obstacle movement track based on the analytic expression representing the obstacle movement track, and searching a third data table for a third confidence corresponding to the track parameter, wherein the third data table includes the correspondence between any track parameter of the obstacle movement track and the third confidence, the track parameter comprises at least one of track length, track fitting degree, and curvature of a point on the track, the third confidence increases as the track length or the track fitting degree increases, and the third confidence decreases as the curvature of a point on the track increases.
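A step-table lookup of the kind described in claim 4 can be sketched in Python with `bisect`. The table contents below are hypothetical placeholders; the patent does not disclose the actual thresholds or confidence values:

```python
import bisect

def table_confidence(value, breakpoints, confidences):
    """Step-table lookup as in claim 4's data tables: breakpoints are the
    ascending thresholds and confidences holds one more entry than
    breakpoints, so every value falls in exactly one bucket."""
    return confidences[bisect.bisect_right(breakpoints, value)]

# Hypothetical second data table: second confidence grows with obstacle speed.
SPEED_BREAKS = [0.5, 2.0, 5.0]        # m/s thresholds (illustrative)
SPEED_CONFS = [0.1, 0.4, 0.7, 0.9]    # monotonically increasing

# Hypothetical first data table: first confidence grows with box length.
LENGTH_BREAKS = [1.0, 3.0]            # metres (illustrative)
LENGTH_CONFS = [0.2, 0.5, 0.8]
```

Monotone tables like these encode the claim's requirement that confidence increases with bounding-box length and speed without committing to a particular functional form.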
5. The method for fusing obstacle direction angles according to claim 4, wherein performing confidence fusion processing on the confidence of each parameter in a priority gradient fusion manner according to the determined confidence corresponding to each parameter included in the obstacle parameters comprises:
calculating a first pre-fusion confidence according to the following formula:
ConfFitFinal = (1 - ConfVel) * ConfFit
where ConfFitFinal represents the first pre-fusion confidence, ConfVel represents the second confidence, and ConfFit represents the third confidence.
6. The method according to claim 5, wherein performing confidence fusion processing on the confidence of each parameter in a priority gradient fusion manner according to the determined confidence corresponding to each parameter included in the obstacle parameters further comprises:
calculating a second pre-fusion confidence according to the following formula:
ConfBoundingBoxFinal = ConfBoundingBox * min((1 - ConfVel), (1 - ConfFitFinal))
where ConfBoundingBoxFinal represents the second pre-fusion confidence and ConfBoundingBox represents the first confidence.
7. The method for fusing obstacle direction angles according to claim 6, wherein performing confidence fusion processing on the confidence of each parameter in a priority gradient fusion manner according to the determined confidence corresponding to each parameter included in the obstacle parameters further comprises:
calculating the total confidence according to the following formula:
EstimateConfidence = ConfVel + ConfFitFinal + ConfBoundingBoxFinal
where EstimateConfidence represents the total confidence.
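The priority gradient fusion of claims 5 through 7 composes directly into one Python function; the three formulas are taken from the claims, while the function and argument names are illustrative:

```python
def priority_gradient_fuse(conf_vel, conf_fit, conf_bbox):
    """Claims 5-7 in code: the speed confidence takes priority, and the
    trajectory-fit and bounding-box confidences are attenuated by the
    headroom the higher-priority terms leave behind."""
    # Claim 5: first pre-fusion confidence.
    conf_fit_final = (1.0 - conf_vel) * conf_fit
    # Claim 6: second pre-fusion confidence.
    conf_bbox_final = conf_bbox * min(1.0 - conf_vel, 1.0 - conf_fit_final)
    # Claim 7: total confidence.
    total = conf_vel + conf_fit_final + conf_bbox_final
    return conf_fit_final, conf_bbox_final, total
```

The gradient structure means a high speed confidence suppresses the other two terms, which suits fast-moving obstacles whose velocity direction is the most reliable heading cue.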
8. The method according to claim 7, wherein performing confidence fusion processing on the total confidence to obtain the fusion confidence comprises:
performing confidence fusion processing on the total confidence according to the following formula to obtain an estimated fusion confidence:
HeadingEstimate = (ConfVel * HeadingVel + ConfFitFinal * HeadingFit + ConfBoundingBoxFinal * HeadingBoundingBox) / EstimateConfidence
where HeadingEstimate represents the estimated fusion confidence, HeadingFit represents the first direction angle of the obstacle corresponding to the obstacle movement track, HeadingVel represents the first direction angle of the obstacle corresponding to the obstacle speed, and HeadingBoundingBox represents the first direction angle of the obstacle corresponding to the obstacle outline.
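Claim 8's combination of the three first direction angles can be sketched as a confidence-weighted average. The published formula appears only as an image in the original and may differ from this weighting, so treat it as an assumption; note also that naive averaging ignores angle wrap-around and only holds for angles in a common, unwrapped range:

```python
def fuse_heading(h_vel, h_fit, h_bbox, conf_vel, conf_fit_final, conf_bbox_final):
    # Assumed confidence-weighted combination of the three first direction
    # angles (radians), normalised by the total confidence of claim 7.
    total = conf_vel + conf_fit_final + conf_bbox_final
    return (conf_vel * h_vel + conf_fit_final * h_fit
            + conf_bbox_final * h_bbox) / total
```

A production implementation would average on the unit circle (e.g. via sine/cosine components) to handle headings near the ±π boundary.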
9. The method according to claim 8, wherein performing confidence fusion processing on the total confidence to obtain the fusion confidence further comprises:
performing confidence regression fusion processing on the total confidence according to the following formula to obtain the fusion confidence:
HeadingEstimateFinal = Σ(i=0..n) (HeadingEstimate_i * EstimateConfidence_i * DecayCoeff_i) / Σ(i=0..n) (EstimateConfidence_i * DecayCoeff_i)
where HeadingEstimateFinal represents the fusion confidence, HeadingEstimate_i represents the estimated fusion confidence at the ith time (i = 0 denotes the current time and i = n denotes the nth historical time), EstimateConfidence_i is the total confidence at the ith time, and DecayCoeff_i is the decay coefficient at the ith time, wherein the decay coefficient decreases as the difference between the historical time and the current time increases.
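The temporal regression fusion of claim 9 can be sketched as a decay-weighted average over the last n+1 frames. The weighted-average form and the geometric decay value are assumptions, since the published formula appears only as an image in the original:

```python
def temporal_heading_fusion(estimates, total_confs, decay=0.8):
    """Sketch of claim 9: combine per-frame estimated fusion results,
    weighting each by its total confidence and a decay coefficient
    decay**i that shrinks for older frames (index 0 = current time,
    higher indices = older historical times)."""
    weights = [conf * decay ** i for i, conf in enumerate(total_confs)]
    return sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
```

The geometric decay satisfies the claim's only stated constraint: the coefficient decreases as the gap between the historical time and the current time grows.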
10. An obstacle direction angle fusion system, comprising:
an estimated movement track generation module, configured to determine at least two of the obstacle outline, the obstacle speed, and the obstacle movement track as obstacle parameters, calculate the estimated movement track of the obstacle based on each obstacle parameter, and determine a first direction angle of the obstacle corresponding to each parameter included in the obstacle parameters based on the estimated movement track of the obstacle;
a confidence determining module, configured to determine, based on the estimated movement track of the obstacle, a confidence of the first direction angle corresponding to each parameter included in the obstacle parameters, wherein the confidence of the first direction angle is determined in a different manner for each of the parameters;
a first fusion module, configured to perform confidence fusion processing on the confidence of each parameter in a priority gradient fusion manner according to the determined confidence corresponding to each parameter included in the obstacle parameters, to obtain a total confidence; and
a second fusion module, configured to perform confidence fusion processing on the total confidence to obtain a fusion confidence.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210025996.6A CN114407883B (en) | 2022-01-11 | 2022-01-11 | Fusion method and system of obstacle direction angle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114407883A true CN114407883A (en) | 2022-04-29 |
CN114407883B CN114407883B (en) | 2024-03-08 |
Family
ID=81271540
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210025996.6A Active CN114407883B (en) | 2022-01-11 | 2022-01-11 | Fusion method and system of obstacle direction angle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114407883B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018024338A (en) * | 2016-08-10 | 2018-02-15 | 日産自動車株式会社 | Travel track estimation method and travel track estimation apparatus |
CN109910878A (en) * | 2019-03-21 | 2019-06-21 | 山东交通学院 | Automatic driving vehicle avoidance obstacle method and system based on trajectory planning |
CN110764509A (en) * | 2019-11-11 | 2020-02-07 | 北京百度网讯科技有限公司 | Task scheduling method, device, equipment and computer readable storage medium |
CN111190427A (en) * | 2020-04-10 | 2020-05-22 | 北京三快在线科技有限公司 | Method and device for planning track |
CN111186437A (en) * | 2019-12-25 | 2020-05-22 | 北京三快在线科技有限公司 | Vehicle track risk determination method and device |
US20210188263A1 (en) * | 2019-12-23 | 2021-06-24 | Baidu International Technology (Shenzhen) Co., Ltd. | Collision detection method, and device, as well as electronic device and storage medium |
CN113895459A (en) * | 2021-11-11 | 2022-01-07 | 北京经纬恒润科技股份有限公司 | Method and system for screening obstacles |
Also Published As
Publication number | Publication date |
---|---|
CN114407883B (en) | 2024-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102463720B1 (en) | System and Method for creating driving route of vehicle | |
CN109521757B (en) | Static obstacle identification method and device | |
CN106651901B (en) | Object tracking method and device | |
KR101504252B1 (en) | System for controlling speed of vehicle on curved road and method thereof | |
CN111830979A (en) | Trajectory optimization method and device | |
WO2020233436A1 (en) | Vehicle speed determination method, and vehicle | |
US11567501B2 (en) | Method and system for fusing occupancy maps | |
US20220314980A1 (en) | Obstacle tracking method, storage medium and unmanned driving device | |
WO2022141116A1 (en) | Three-dimensional point cloud segmentation method and apparatus, and movable platform | |
CN110426714B (en) | Obstacle identification method | |
CN114407883A (en) | Method and system for fusing direction angles of obstacles | |
WO2022099620A1 (en) | Three-dimensional point cloud segmentation method and apparatus, and mobile platform | |
WO2023236476A1 (en) | Lane line-free method and apparatus for determining tracking trajectory | |
WO2023201951A1 (en) | Method and apparatus for generating lane centerline | |
CN113203424B (en) | Multi-sensor data fusion method and device and related equipment | |
CN117490710A (en) | Parameter adjusting method and device for target fusion and target fusion system | |
CN112562391B (en) | Parking space updating method and device | |
CN114147707A (en) | Robot docking method and device based on visual identification information | |
CN114987478A (en) | Vehicle lane changing method, device, storage medium and processor | |
CN115031755A (en) | Automatic driving vehicle positioning method and device, electronic equipment and storage medium | |
CN116382308B (en) | Intelligent mobile machinery autonomous path finding and obstacle avoiding method, device, equipment and medium | |
CN112183358B (en) | Training method and device for target detection model | |
CN117392172A (en) | Target tracking method and device based on laser point cloud | |
CN117315616A (en) | Environment sensing method, device, computer equipment and medium | |
CN115511975A (en) | Distance measurement method of monocular camera and computer program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||