CN113189583B - Time-space synchronization millimeter wave radar and visual information fusion method - Google Patents
- Publication number
- CN113189583B (application CN202110455091.8A)
- Authority
- CN
- China
- Prior art keywords
- millimeter wave
- wave radar
- point
- coordinate
- track
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Computational Biology (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Train Traffic Observation, Control, And Security (AREA)
Abstract
The invention discloses a time-space synchronized millimeter wave radar and visual information fusion method comprising three main steps. First, preliminary positioning of the same-rail train ahead is completed by parsing the message data of a millimeter wave radar sensor. Second, detection of the travelling track of the running train and of the image position of the same-rail train ahead is completed with a vision sensor based on image processing. Third, millimeter wave radar information and visual information are fused by a joint calibration method based on time-space synchronization to complete accurate identification and positioning of the same-rail train ahead. The method overcomes problems such as insufficient precision and low adaptability in single-sensor detection, realizes real-time monitoring of the distance between the running train and the same-rail train ahead, and improves the safety of train operation.
Description
Technical Field
The invention relates to the field of millimeter wave radar and vision measurement, in particular to a track target ranging method based on millimeter wave radar and vision information fusion.
Background
To improve the safety of train operation in a rail transit system, the distance between a train and the target ahead must be monitored in real time. Detection with a single sensor suffers from a series of defects such as insufficient precision and poor adaptability. For example, a millimeter wave radar sensor can accurately output distance information for all targets ahead, but because this information is not displayed visually, the targets ahead cannot be accurately identified; moreover, owing to its high sensitivity to metal, the millimeter wave radar is easily disturbed by noise, producing target position deviations and missed detections that seriously affect the real-time performance and stability of target tracking. A camera can acquire real-time images of the target ahead, but capturing and localizing the target and obtaining its actual relative position are difficult, so real-time position detection of the detected target ahead is hard to achieve.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a time-space synchronized millimeter wave radar and vision sensor information fusion method that improves target detection accuracy and the real-time performance of the ranging result.
The invention relates to a time-space synchronous millimeter wave radar and visual sensor information fusion method, which comprises the following steps:
the method comprises the following steps of firstly, completing primary positioning of a target to be detected by analyzing message data of a millimeter wave radar sensor, and specifically comprising the following steps:
firstly, a millimeter wave radar sensor is arranged at the head of the running train. Taking the geometric center of the largest plane of the millimeter wave radar as the coordinate origin, the travelling direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis, a millimeter wave radar three-dimensional rectangular coordinate system is established; the pitch angle, yaw angle and roll angle of the millimeter wave radar sensor in this coordinate system are all zero. The millimeter wave radar sensor is connected with a computer through a CAN bus and used for acquiring the message data obtained by detecting all target trains ahead; radar message parsing is then completed using the computer's MFC functionality and the millimeter wave radar communication protocol. All target trains ahead comprise the same-rail train ahead and adjacent-rail trains ahead, and the message data comprise the transverse distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin;
Secondly, the detection of the advancing track of the running train and the detection of the position of the front same-track train are completed by utilizing a camera based on an image processing technology, and the method specifically comprises the following steps:
firstly, a camera is arranged on the head of the running train directly below the millimeter wave radar sensor, and a camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin. Each coordinate axis of the camera coordinate system is parallel to the corresponding axis of the millimeter wave radar coordinate system, the Z_rw axis coincides with the Z_cw axis, and the pitch angle, yaw angle and roll angle of the camera in the camera coordinate system are all zero. The camera is connected with the millimeter wave radar, and with the computer, through USB data lines, and is used to collect real-time images of the scene in front of the running train; the collected front scene images contain all target trains ahead and the travelling track of the running train. An image coordinate system X_p-Y_p is established whose coordinate origin is located at the intersection of the camera optical axis and the image plane, with X_p and Y_p along the length and width directions of the front scene image respectively;
secondly, on the front scene image collected by the camera in the first step, straight-line detection of the travelling track of the running train is completed based on the progressive probabilistic Hough transform, and preliminary screening of the travelling-track lines is completed based on the line slope;
thirdly, track-line screening based on DBSCAN probability density clustering and queue-based track-line correction are performed on the several lines, including the left and right rails, obtained in the second step, giving corrected line position information for the rails on both sides. The lines of the left and right rails are denoted l_left and l_right, their slopes k_left and k_right respectively, and the intersection point of the left and right rail lines l_left and l_right is denoted p_0;
fourthly, using the corrected line position information of the two rails obtained in the third step, a logarithm-based line traversal of the track is performed along the track direction, with high-density traversal of points near the same-rail train ahead and low-density traversal far from it, to obtain the coordinates p_left(x_left, y_left) and p_right(x_right, y_right) of the traversal points of the left and right rail lines in the front scene image;
fifthly, using the coordinates p_left(x_left, y_left) and p_right(x_right, y_right) of the left and right rail traversal points in the front scene image obtained in the fourth step, identification of the same-rail train ahead is completed based on the gray-value gradient change at the traversal points of the left and right rail lines: a position where the gray value changes abruptly among the left and right rail traversal points is determined to be the position of the same-rail train ahead;
step three, fusing millimeter wave radar information and visual information by a time-space synchronization-based combined calibration method to finish accurate identification and distance measurement of the front same-rail train, and specifically comprising the following steps:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
when data acquisition is carried out, a three-thread fusion mode of a millimeter wave radar data receiving thread, a camera receiving thread and a computer data processing thread is selected to realize multithreading time synchronization based on millimeter wave radar and visual information;
secondly, using the translation and rotation relationships among the millimeter wave radar coordinate system, the camera coordinate system and the image coordinate system, the conversion of any radar point in the millimeter wave radar coordinate system into the image coordinate system is obtained; then the image position of the bottom middle point p(x_bottom, y_bottom) of the same-rail train ahead, obtained in the sixth step of step two, is converted into coordinates in the millimeter wave radar coordinate system, and finally the relative distance d_w of the bottom middle point p(x_bottom, y_bottom) of the same-rail train in the millimeter wave radar coordinate system is calculated;
thirdly, the transverse distance d_x and longitudinal distance d_y of all targets obtained by the radar in step one are first converted into the camera image coordinate system X_p-Y_p and displayed in the front scene image as radar point coordinates p_i(x_p, y_p); then the transverse distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin in step one are converted into the relative distance d_r in the millimeter wave radar coordinate system; finally, the spatial distance information of the millimeter wave radar sensor and the camera is fused to complete the screening of the radar point coordinates p_i(x_p, y_p), yielding the radar point position of the same-rail train ahead.
The invention has the following beneficial effects:
1. the invention realizes track target ranging based on millimeter wave radar and visual information fusion, overcomes the defects of single-sensor ranging, and improves target detection accuracy;
2. the algorithms used in the invention run fast, meeting the computation-rate requirements of the ranging process and improving the real-time performance of the ranging result.
Drawings
FIG. 1 is a flow chart of track target ranging based on millimeter wave radar and visual information fusion;
FIG. 2 is a schematic diagram of clustering results based on probability density;
FIG. 3 is a schematic view of a straight line point traversal of the left and right tracks;
FIG. 4 is a schematic diagram of millimeter wave radar and camera joint calibration;
fig. 5 is a schematic diagram of radar point screening by fusion of millimeter wave radar and visual space distance information.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
As shown in the attached drawings, the time-space synchronous millimeter wave radar and visual information fusion method comprises the following steps:
the method comprises the following steps of firstly, completing primary positioning of a target to be detected by analyzing message data of a millimeter wave radar sensor, and specifically comprising the following steps:
firstly, a millimeter wave radar sensor is arranged at the head of the running train. Taking the geometric center of the largest plane of the millimeter wave radar as the coordinate origin, the travelling direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis, a millimeter wave radar three-dimensional rectangular coordinate system is established. The pitch angle, yaw angle and roll angle of the millimeter wave radar sensor in this coordinate system are all zero, as shown at 8 in fig. 4. The millimeter wave radar sensor is connected with the computer through a CAN bus and used for acquiring the message data obtained by detecting all target trains ahead. Radar message parsing is then completed using the computer's MFC functionality and the millimeter wave radar communication protocol (see ARS404/SRR308 communication protocol [M]. Technical Documentation, 2019.10.01). All target trains ahead comprise the same-rail train ahead (a stationary same-rail train or a same-rail train travelling toward the running train) and adjacent-rail trains ahead (a stationary train or an adjacent-rail train travelling toward the running train). The message data comprise the transverse distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin;
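The message-parsing step above can be sketched as follows. The bit layout, field widths, and scale factors here are illustrative stand-ins only; the real values come from the vendor's ARS404/SRR308 communication protocol:

```python
def parse_radar_frame(frame: bytes):
    """Decode one 8-byte radar object frame into (d_x, d_y) in metres.

    The bit layout below is hypothetical -- real offsets and scale
    factors must be taken from the radar's communication protocol.
    """
    raw = int.from_bytes(frame, byteorder="big")
    # hypothetical layout: 13-bit longitudinal field, 11-bit lateral field
    lon_raw = (raw >> 48) & 0x1FFF
    lat_raw = (raw >> 37) & 0x7FF
    d_y = lon_raw * 0.2              # longitudinal distance, assumed 0.2 m resolution
    d_x = lat_raw * 0.2 - 204.6      # lateral distance, signed about the origin
    return d_x, d_y
```

In practice each CAN frame received over the bus would be passed through such a decoder before the (d_x, d_y) pairs are handed to the fusion stage.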
Secondly, the detection of the advancing track of the running train and the detection of the position of the front same-track train are completed by utilizing a camera based on an image processing technology, and the method specifically comprises the following steps:
firstly, a camera is arranged on the head of the running train directly below the millimeter wave radar sensor, usually at a distance of 5 cm, and a camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin. Each coordinate axis of the camera coordinate system is parallel to the corresponding axis of the millimeter wave radar coordinate system, and the Z_rw axis coincides with the Z_cw axis. The pitch angle, yaw angle and roll angle of the camera are all zero in the camera coordinate system, as shown at 8 in fig. 4. The camera and the millimeter wave radar, and the camera and the computer, are connected through USB data lines respectively. The camera collects real-time images of the scene in front of the running train; the collected front scene images contain all target trains ahead and the travelling track of the running train. An image coordinate system X_p-Y_p is established whose coordinate origin is located at the intersection of the camera optical axis and the image plane, as shown at 9 in fig. 4, with X_p and Y_p along the length and width directions of the front scene image respectively.
And secondly, straight-line detection of the travelling track of the running train is performed on the front scene image acquired by the camera in the first step based on the progressive probabilistic Hough transform (see Shan Dong, Weng Meng, Yang Hongtao. A fast lane line detection method based on improved probabilistic Hough transform. Computer Technology and Development [J]. 2020, 30(05)), and preliminary screening of the travelling-track lines is performed based on the line slope.
The preliminary screening of the travelling-track lines proceeds as follows: according to the position information (slope and starting point) of the travelling track of the running train in the front scene image and the installation position of the camera, a threshold on the travelling-track slope is selected. The selection principle is to include the slopes of the rails on both the left and right sides of the travelling track while removing unwanted lines, such as transverse lines, as far as possible. This finally yields several lines, expressed in point-slope form and including the left and right rails, completing the preliminary screening of the travelling-track lines detected by the progressive probabilistic Hough transform.
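The slope-based preliminary screening can be sketched as below. The segment source (OpenCV's `cv2.HoughLinesP`) and the slope bounds are assumptions for illustration, not values from the patent:

```python
def screen_track_lines(segments, k_min=0.3, k_max=5.0):
    """Keep only segments whose slope magnitude is plausible for a rail.

    `segments` are (x1, y1, x2, y2) tuples, e.g. produced by
    cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                    minLineLength=40, maxLineGap=10).
    Near-horizontal lines (|k| < k_min) are rejected; the bounds are
    illustrative and would be tuned to the camera mounting.
    """
    kept = []
    for x1, y1, x2, y2 in segments:
        if x2 == x1:          # vertical segment: keep, slope effectively infinite
            kept.append(((x1, y1, x2, y2), float("inf")))
            continue
        k = (y2 - y1) / (x2 - x1)
        if k_min <= abs(k) <= k_max:
            kept.append(((x1, y1, x2, y2), k))
    return kept
```

The surviving (segment, slope) pairs are then passed to the clustering stage of the third step.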
Thirdly, track-line screening based on DBSCAN probability density clustering and queue-based track-line correction are performed on the several lines, including the left and right rails, obtained in the second step, giving corrected line position information for the rails on both sides. The lines of the left and right rails are denoted l_left and l_right, their slopes k_left and k_right respectively, and the intersection point of the left and right rail lines l_left and l_right is denoted p_0. The specific implementation is as follows:
step 101, the several lines obtained from the preliminary screening of the travelling-track lines are taken, and the left and right rails of the travelling track are accurately identified based on the DBSCAN probability density clustering algorithm (see Abdellah Idrissi, Altaf Alaoui. A Multi-criteria Decision Method in the DBSCAN Algorithm for Better Clustering [J]. The International Journal of Advanced Computer Science and Applications, 2016), obtaining initial line position information for the rails on both sides expressed in point-slope form, with slopes k_j, where j indexes the lines initially obtained by the DBSCAN clustering algorithm, j = 1, 2, ..., n (n < 5). Meanwhile, since detection runs in real time, the line position information of the two rails deviates in some periods during detection, so step 102 is executed;
step 102, a queue with an empirical length of 5 is set for the preliminary line position information of each rail, ensuring that every line obtained by the DBSCAN clustering algorithm enters a queue each time;
step 103, the line slopes k_j of the preliminary line position information of the two rails, obtained by the DBSCAN probability density clustering algorithm, enter the queue in sequence, and the queue mean k_mean is selected as the comparison value. Then, according to the pixel width between the two rails in the front scene image, an empirical distance threshold Distance = 5 (in pixels) is set, and the following criterion is applied to the preliminary line slope of each rail: if |k_j - k_mean| <= Distance, k_j enters the queue; otherwise it is discarded. The mean k_mean of the finally updated queue is taken as the slope of the rail line, completing the correction of the rail line slopes on both sides and giving the corrected line position information of the two rails in point-slope form. The lines of the left and right rails are denoted l_left and l_right, their slopes k_left and k_right respectively, and the intersection point of the left and right rail lines l_left and l_right is denoted p_0, as indicated by black dot 5 in fig. 3.
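Steps 102-103 amount to a gated running mean over a fixed-length queue. A minimal sketch follows, with the queue length 5 and the admission threshold taken from the empirical values quoted above (the text states the threshold in pixels; applying it directly to slope values is a simplification made here):

```python
from collections import deque

class TrackSlopeFilter:
    """Queue-based correction of one rail's slope (steps 102-103 above).

    A fixed-length queue holds recent slopes; a new slope only enters
    the queue when it is close to the current queue mean, so a single
    bad detection cannot drag the estimate away.
    """
    def __init__(self, length=5, threshold=5.0):
        self.queue = deque(maxlen=length)
        self.threshold = threshold

    def update(self, k):
        if not self.queue:
            self.queue.append(k)
        else:
            k_mean = sum(self.queue) / len(self.queue)
            if abs(k - k_mean) <= self.threshold:
                self.queue.append(k)   # admitted: close enough to the mean
            # else: outlier slope discarded, queue unchanged
        return sum(self.queue) / len(self.queue)   # corrected slope
```

One such filter would be maintained per rail, fed with the slope produced by each detection period.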
Fourthly, using the corrected line position information of the two rails obtained in the third step, a logarithm-based line traversal of the track is performed, with high-density traversal of points near the same-rail train ahead and low-density traversal far from it, to obtain the coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the traversal points of the left and right rail lines in the front scene image. The specific implementation is as follows:
step 101, the intersection point p_0 of the two rail lines l_left, l_right is taken as the initial traversal point;
step 102, the two rail lines are traversed in the front scene image along the Y_p axis direction of fig. 4, with the same traversal distance L = Width - y_0 for both rails, where y_0 is the coordinate of the intersection point p_0 along the Y_p axis and Width is the width of the front scene image;
step 103, the Y_p-axis coordinates of the traversal points of the left and right rail lines are acquired:
y_left(i) = y_0 + i·Δy,  y_right(i) = y_0 + i·Δy,  i = 1, 2, ..., n   (1)
In formula (1): y_left, y_right are respectively the coordinates of the left and right rail traversal points along the Y_p axis; y_0 is the coordinate of the initial traversal point p_0 of the left and right rail lines l_left, l_right along the Y_p axis; k_left, k_right are respectively the slopes of the left and right rails. The line traversal intervals of the two rails are the same, with value Δy = log_a L; the number of traversal points is n = L / log_a L; and L is the traversal distance;
step 104, from steps 101-103 the X_p-axis coordinates of the traversal points of the left and right rails are obtained:
x_left = x_0 + (y_left - y_0)/k_left,  x_right = x_0 + (y_right - y_0)/k_right   (2)
In formula (2): x_left, x_right are respectively the X_p-axis coordinates of the left and right rail traversal points; y_left, y_right are respectively the Y_p-axis coordinates of the left and right rail traversal points; y_0 is the Y_p-axis coordinate of the initial traversal point p_0 of the left and right rails; x_0 is the X_p-axis coordinate of the initial traversal point p_0; and k_left, k_right are respectively the slopes of the left and right rail lines l_left, l_right. This finally gives the coordinates p_left(x_left, y_left), p_right(x_right, y_right) of all logarithm-based traversal points of the left and right rails in the front scene image; the implementation result is shown in fig. 3, where 6 and 7 denote the traversal points of the traversed left and right rail lines respectively;
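One plausible realization of the logarithm-based traversal is sketched below. The exact spacing law is not fully recoverable from the text, so the exponential mapping here is an assumption: step sizes grow away from the intersection point p_0, giving dense points near p_0 (where a distant same-rail train appears) and sparse points toward the image bottom:

```python
def traverse_track_line(p0, k, img_width, n=50):
    """Generate traversal points along one rail with growing spacing.

    p0 = (x0, y0) is the rails' intersection, k the rail slope, and
    img_width the image extent along Y_p; the traversal distance is
    L = img_width - y0.  The exponential progression is an assumed
    stand-in for the patent's logarithm-based scheme.
    """
    x0, y0 = p0
    L = img_width - y0
    pts = []
    for i in range(1, n + 1):
        dy = (L + 1) ** (i / n) - 1     # step offsets grow with i
        y = y0 + dy
        x = x0 + dy / k                 # point-slope form of the rail line
        pts.append((x, y))
    return pts
```

Running this once per rail (with k_left and k_right) yields the p_left and p_right point sets used in the fifth step.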
fifthly, using the coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right rail traversal points in the front scene image obtained in the fourth step, identification of the same-rail train ahead is completed based on the gray-value gradient change at the traversal points of the left and right rail lines. The gray value of a rail is high while the gray value of the bottom of the same-rail train ahead is low, so a position where the gray value changes abruptly among the left and right rail traversal points is determined to be the position of the same-rail train ahead. The specific implementation is as follows:
step 101, the gray values of the rail-line traversal points are homogenized. To eliminate jitter in the traversal-point gray values, the mean gray value of every 4 consecutive traversal points is computed as a new mean traversal point, with coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right);
step 102, the gray-value mutation position is determined. The coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right) are taken as the left and right bottom position points of the same-rail train ahead, and their arithmetic mean is taken as the coordinate of the bottom middle position point of the same-rail train ahead:
x_bottom = (x_mean_left + x_mean_right)/2,  y_bottom = (y_mean_left + y_mean_right)/2   (3)
In formula (3): x_bottom, y_bottom are the coordinates of the bottom middle position point of the same-rail train ahead along the X_p and Y_p axes; x_mean_left, y_mean_left are the coordinates of the left bottom position point along the X_p and Y_p axes; and x_mean_right, y_mean_right are the coordinates of the right bottom position point along the X_p and Y_p axes. This finally gives the bottom position point coordinate p(x_bottom, y_bottom) of the same-rail train ahead, as indicated by black dot 12 in fig. 5;
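The gray-value homogenization and mutation search of steps 101-102 can be sketched as follows: average groups of 4 consecutive traversal-point gray values, then report the first group-to-group drop larger than a threshold (the drop value here is an assumption):

```python
def find_train_bottom(gray_values, drop=40):
    """Locate the first abrupt gray-value drop along one rail's traversal.

    Rails image bright and the underside of a train images dark, so the
    train bottom is taken where the 4-point smoothed gray value falls by
    more than `drop` from one group to the next.  Returns the index of
    the group where the drop occurs, or None if no mutation is found.
    """
    means = [sum(gray_values[i:i + 4]) / 4
             for i in range(0, len(gray_values) - 3, 4)]
    for i in range(1, len(means)):
        if means[i - 1] - means[i] > drop:
            return i
    return None
```

The returned group index maps back to a traversal-point coordinate on each rail; averaging the left and right hits gives p(x_bottom, y_bottom) as in formula (3).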
the sixth step: the bottom middle position point of the same-rail train ahead, with coordinate p(x_bottom, y_bottom), is position-corrected based on Kalman filtering to obtain a smoothly transitioning bottom middle position point of the same-rail train ahead in each period. The specific implementation is as follows:
to handle position-point deviations of the same-rail train ahead and periods in which it goes undetected, an empirical Kalman-filtering distance threshold d_threshold = 50 is first set (see Alessio Gagliardi, Francesco de Gioia, Sergio Saponara. A real-time video detection algorithm based on Kalman filter and CNN [J]. Journal of Real-Time Image Processing, 2021). When the Euclidean distance between the bottom middle point p(x_bottom, y_bottom)_i of the same-rail train ahead in the i-th period and the bottom middle point p(x_bottom, y_bottom)_{i-1} in the (i-1)-th period is greater than the distance threshold d_threshold, the bottom middle target position point of that period is discarded and replaced with the bottom middle target position point of the previous period. Meanwhile, Kalman filtering is applied to all bottom middle target position points within the distance threshold to realize smooth transition of the point positions;
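The gating logic of this step can be sketched as below; the patent uses a full Kalman filter, and simple exponential smoothing stands in for it here, with d_threshold = 50 taken from the text:

```python
import math

class GatedSmoother:
    """Outlier gating plus smoothing for the train-bottom midpoint.

    If a new point jumps more than d_threshold from the previously
    accepted point, it is discarded and the previous point is reused;
    accepted points are smoothed (exponential smoothing here is a
    simplified stand-in for the Kalman filter of the patent).
    """
    def __init__(self, d_threshold=50.0, alpha=0.5):
        self.d_threshold = d_threshold
        self.alpha = alpha
        self.state = None

    def update(self, p):
        if self.state is None:
            self.state = p
            return p
        dx, dy = p[0] - self.state[0], p[1] - self.state[1]
        if math.hypot(dx, dy) > self.d_threshold:
            return self.state                  # reject the jump, reuse last point
        self.state = (self.state[0] + self.alpha * dx,
                      self.state[1] + self.alpha * dy)
        return self.state
```

Feeding one p(x_bottom, y_bottom) per period into `update` yields the smoothly transitioning midpoint used by the fusion stage.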
step three, fusing millimeter wave radar information and visual information by a time-space synchronization-based combined calibration method to finish accurate identification and distance measurement of the front same-rail train, and specifically comprising the following steps:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
during data acquisition, to enable the millimeter wave radar and the camera to acquire target data at the same time, a three-thread fusion mode comprising a millimeter wave radar data receiving thread, a camera receiving thread, and a computer data processing thread is selected to realize multithreaded time synchronization of the millimeter wave radar and visual information (see Lu Bin et al. Research and application of multithreading technology [J]. Computer Research and Development, 2000, (04));
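The three-thread layout can be sketched with Python's queue and threading modules; the stand-in producers below replace the real radar/camera acquisition callbacks:

```python
import queue
import threading
import time

# One thread per sensor pushes timestamped samples into its queue, and
# the processing thread pairs the radar and camera samples in arrival
# order.  The producers here are stand-ins for real acquisition code.
radar_q, camera_q, fused = queue.Queue(), queue.Queue(), []

def producer(q, label, n=3):
    for i in range(n):
        q.put((time.monotonic(), f"{label}-{i}"))
        time.sleep(0.001)

def processor(n=3):
    for _ in range(n):
        t_r, r = radar_q.get()      # blocks until a radar sample arrives
        t_c, c = camera_q.get()     # blocks until a camera sample arrives
        fused.append((r, c))        # pair the two samples for fusion

threads = [threading.Thread(target=producer, args=(radar_q, "radar")),
           threading.Thread(target=producer, args=(camera_q, "frame")),
           threading.Thread(target=processor)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The timestamps carried with each sample would let a real processing thread drop stale data and keep only near-simultaneous radar/camera pairs.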
secondly, using the translation and rotation relationships among the millimeter wave radar coordinate system, the camera coordinate system and the image coordinate system, the conversion of any radar point in the millimeter wave radar coordinate system into the image coordinate system is obtained; the conversion relationship is as follows (see Luo Xiao, Yao Yuan, Zhang Jinshi. A joint calibration method for millimeter wave radar and camera [N]. Journal of Tsinghua University, Vol. 54, No. 3, 2014):
x_p = f_x·(x_rw + L_x)/(y_rw + L_y) + C_x,  y_p = f_y·H/(y_rw + L_y) + C_y   (4)
In formula (4): x_p, y_p are the X_p-axis and Y_p-axis coordinates of a millimeter wave radar point in the image coordinate system; x_rw, y_rw are the X_rw-axis and Y_rw-axis coordinates of the radar point in the millimeter wave radar coordinate system; C_x is the offset of the camera optical axis along the X_p axis; C_y is the offset of the camera optical axis along the Y_p axis; f_x is the camera focal length along the X_p axis; f_y is the camera focal length along the Y_p axis; L_x is the spacing between the X axes of the radar projection coordinate system and the camera projection coordinate system; L_y is the spacing between the Y axes of the radar projection coordinate system and the camera projection coordinate system; and H is the camera mounting height.
Finally, the formula (4) is utilized to obtain the middle position point p (x) at the bottom of the front common rail train obtained in the sixth step in the second stepbottom,ybottom) The image position information is converted into coordinates under a millimeter wave radar coordinate system, and then the middle position point p (x) at the bottom of the same-rail train is obtained through calculationbottom,ybottom) Relative distance d in millimeter wave radar coordinate systemw。
The specific conversion steps are as follows: first, the X_p-axis position x_bottom and Y_p-axis position y_bottom of the bottom middle position point p(x_bottom, y_bottom) of the same-rail train are obtained; then x_bottom, y_bottom are converted into coordinates x_w, y_w in the millimeter wave radar coordinate system, from which the relative distance of the point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system is calculated as d_w = sqrt(x_w^2 + y_w^2).
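Because formula (4) itself appears only as an image in the original, the sketch below assumes a standard pinhole-style mapping built from the parameters the text defines (f_x, f_y, C_x, C_y, L_x, L_y, H); the exact form used in the patent may differ, and all numeric values here are hypothetical.

```python
import math

# Hypothetical calibration values; the patent does not state numbers.
fx, fy = 800.0, 800.0   # focal lengths along X_p, Y_p (pixels)
Cx, Cy = 320.0, 240.0   # optical-axis offsets (pixels)
Lx, Ly = 0.0, 0.1       # radar-to-camera projection axis spacings (m)
H = 1.2                 # camera mounting height (m)

def radar_to_image(x_rw, y_rw):
    """Project a radar point (x_rw, y_rw) on the ground plane into the image
    plane: a sketch of the forward direction of formula (4), assuming a
    pinhole-style mapping."""
    xp = fx * (x_rw + Lx) / y_rw + Cx
    yp = fy * (H - Ly) / y_rw + Cy
    return xp, yp

def image_to_radar(xp, yp):
    """Invert the sketch above: recover radar-frame coordinates from an image
    point, as done for the train-bottom midpoint p(x_bottom, y_bottom)."""
    yw = fy * (H - Ly) / (yp - Cy)
    xw = (xp - Cx) * yw / fx - Lx
    return xw, yw

# Round trip: a point 1 m right and 30 m ahead survives projection + inversion.
xw, yw = image_to_radar(*radar_to_image(1.0, 30.0))
dw = math.hypot(xw, yw)   # relative distance in the radar coordinate system
print(round(xw, 6), round(yw, 6), round(dw, 3))
```

The round trip checks that the inverse mapping is consistent with the forward one, which is the property the patent relies on when moving p(x_bottom, y_bottom) between frames.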
Thirdly: first, the lateral distance d_x and longitudinal distance d_y of all targets obtained by the radar in step one are converted into the image coordinate system X_p-Y_p of the camera using formula (4), and each radar point is displayed in the front scene image at its coordinates p_i(x_p, y_p), shown as the black squares 10, 11 and 13 in fig. 5. Then the lateral distance d_x and longitudinal distance d_y between each front target from step one and the millimeter wave radar coordinate origin are converted into the relative distance d_r = sqrt(d_x^2 + d_y^2) in the millimeter wave radar coordinate system. Finally, the spatial distance information of the millimeter wave radar sensor and the camera is fused to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the front same-rail train. The screening method is as follows:
Step 101 addresses the radar point phenomenon, caused by secondary reflection during radar detection, in which one target produces multiple groups of points at multiple distances, as shown by 10 and 11 in fig. 5, where the actual relative distance of 10 is much greater than that of 11. First, the lateral distance d_x and longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin are converted into the relative distance d_r in the millimeter wave radar coordinate system. Then d_r is compared with the relative distance d_w of the bottom middle position point p(x_bottom, y_bottom) of the same-rail train in the millimeter wave radar coordinate system: if the absolute value of the difference satisfies |d_w - d_r| < Δd_threshold, the radar point corresponding to the relative distance d_r is preserved; otherwise it is deleted. This realizes the coarse screening of the radar point coordinates p_i(x_p, y_p), yielding the coarsely screened radar point coordinates p_j(x_p, y_p). Since secondary reflection at least doubles the apparent relative distance of a radar point, the distance threshold is set to 1.5 times the vision-derived relative distance, Δd_threshold = 1.5 d_w.
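The coarse screening of step 101 can be sketched as follows; the coordinates and distances are made-up values chosen so a ghost return (analogous to point 10) fails the |d_w - d_r| < 1.5*d_w test while a true return (analogous to point 11) passes.

```python
def coarse_screen(radar_points, d_w):
    """Step 101 sketch: keep a radar point only when its radar-frame relative
    distance d_r is within the threshold 1.5*d_w of the vision-derived
    distance d_w; secondary-reflection ghosts are dropped."""
    threshold = 1.5 * d_w
    kept = []
    for (xp, yp, dx, dy) in radar_points:
        d_r = (dx**2 + dy**2) ** 0.5      # relative distance in radar frame
        if abs(d_w - d_r) < threshold:
            kept.append((xp, yp, d_r))
    return kept

# Hypothetical points: a ghost at ~80 m (more than double the true range)
# and the true return at ~30.5 m, against a vision distance d_w = 30 m.
points = [(300, 200, 0.5, 80.0),   # ghost, analogous to point 10
          (305, 260, 0.4, 30.5)]   # true return, analogous to point 11
print(coarse_screen(points, d_w=30.0))
```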
Step 102 addresses the phenomenon, shown as 13 in fig. 5, in which radar points of adjacent-track vehicles are detected during radar detection. The coarsely screened radar point coordinates p_j(x_p, y_p) in the front scene image are taken; with the bottom middle position point p(x_bottom, y_bottom) of the same-rail train in the front scene image, obtained in step two, as the reference center, the radar point coordinate p_j(x_p, y_p) closest to p(x_bottom, y_bottom) is selected according to the Euclidean distance minimum constraint principle as the final radar point screening result, and the relative distance d_r corresponding to that point is taken as the final ranging result.
The Euclidean distance of formula (5) is d = sqrt((x_pi - x_bottom)^2 + (y_pi - y_bottom)^2), where x_pi, y_pi are the horizontal and vertical coordinates of a radar point in the image coordinate system and x_bottom, y_bottom are the horizontal and vertical coordinates of the bottom middle position point of the front same-rail train. The bottom middle position point of the front same-rail train is shown as 12 in fig. 5; the radar point closest to it is selected as the final radar point detection result, which screens out the adjacent-vehicle radar point 13 and finally yields the visually matched radar point 11, whose corresponding relative distance d_r is taken as the final ranging result.
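Step 102's Euclidean minimum selection can be sketched directly; the candidate coordinates below are hypothetical stand-ins for points 11 and 13 of fig. 5.

```python
import math

def select_radar_point(candidates, p_bottom):
    """Step 102 sketch: among coarse-screened radar points p_j(x_p, y_p),
    pick the one closest (formula (5), Euclidean distance) to the
    train-bottom midpoint p(x_bottom, y_bottom)."""
    xb, yb = p_bottom
    return min(candidates, key=lambda p: math.hypot(p[0] - xb, p[1] - yb))

# Hypothetical image coordinates: the first point belongs to an
# adjacent-track vehicle (point 13), the second matches the target (point 11).
candidates = [(150, 260), (305, 260)]
print(select_radar_point(candidates, p_bottom=(300, 265)))
```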
Claims (5)
1. A time-space synchronization millimeter wave radar and visual information fusion method is characterized by comprising the following steps:
Step one: preliminary positioning of the target to be detected is completed by analyzing the message data of the millimeter wave radar sensor, specifically comprising the following steps:
First, a millimeter wave radar sensor is arranged at the head position of the running train. With the geometric center of the largest plane of the millimeter wave radar as the coordinate origin, the advancing direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis, a millimeter wave radar three-dimensional rectangular coordinate system is established; the pitch angle, yaw angle and roll angle of the millimeter wave radar sensor in this coordinate system are all zero. The millimeter wave radar sensor is connected with a computer through a CAN bus and is used for acquiring the message data information obtained by detecting all target trains in front; radar message data analysis is then completed by using the MFC function of the computer and the millimeter wave radar communication protocol. All target trains in front comprise the front same-rail train and front adjacent-rail trains, and the message data information comprises the lateral distance d_x and longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin;
Step two: detection of the advancing track of the running train and of the position of the front same-track train is completed by using a camera based on image processing technology, specifically comprising the following steps:
First, a camera is arranged on the head of the running train directly below the millimeter wave radar sensor, and a camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin. Each coordinate axis of the camera coordinate system is parallel to the corresponding axis of the millimeter wave radar coordinate system, the Z_rw axis coincides with the Z_cw axis, and the pitch angle, yaw angle and roll angle of the camera in the camera coordinate system are all zero. The camera is connected with the millimeter wave radar and with the computer through USB data lines and is used for collecting real-time images of the scene in front of the running train; the collected front scene images comprise all target trains in front and the running track of the running train. An image coordinate system X_p-Y_p is established whose coordinate origin is located at the intersection of the camera optical axis and the image plane, with X_p and Y_p respectively along the length direction and the width direction of the front scene image;
secondly, on the front scene image collected by the camera in the first sub-step, straight-line detection of the advancing track of the running train is completed based on the progressive probabilistic Hough transform, and preliminary screening of the track lines is completed based on the straight-line slope;
thirdly, for the plurality of straight lines, including the left and right rails, obtained in the second sub-step, track line screening based on DBSCAN probability density clustering and queue-based track line correction are performed to obtain the corrected line position information of the rails on both sides; the line positions of the left and right rails are denoted l_left and l_right, their slopes are k_left and k_right respectively, and the intersection point of the lines l_left and l_right is denoted p_0;
fourthly, using the corrected line position information of the rails on both sides obtained in the third sub-step, a logarithm-based track line traversal is used to achieve high-density traversal of points near the same-track train along the track direction and low-density traversal far from it, obtaining the point coordinates p_left(x_left, y_left) and p_right(x_right, y_right) of the traversal points of the left and right track lines in the front scene image;
fifthly, using the point coordinates p_left(x_left, y_left) and p_right(x_right, y_right) of the left and right track traversal points in the front scene image obtained in the fourth sub-step, identification of the front same-rail train is completed based on the gray value gradient changes at the traversal points of the left and right track lines: a position where the gray values of the left and right traversal points show an abrupt change is determined as the position of the front same-rail train;
sixthly, the middle position point with coordinates p(x_bottom, y_bottom) at the bottom of the front same-rail train is corrected based on Kalman filtering to obtain smoothly transitioning information of the bottom middle position point of the front same-rail train in each period;
Step three: the millimeter wave radar information and the visual information are fused by a joint calibration method based on time-space synchronization to complete accurate identification and ranging of the front same-rail train, specifically comprising the following steps:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
when data acquisition is carried out, a three-thread fusion mode of a millimeter wave radar data receiving thread, a camera receiving thread and a computer data processing thread is selected to realize multithreading time synchronization based on the millimeter wave radar and visual information;
secondly, the position of any radar point in the millimeter wave radar coordinate system converted into the image coordinate system is obtained by using the translation and rotation relationships among the millimeter wave radar coordinate system, the camera coordinate system and the image coordinate system; the image position information of the bottom middle position point p(x_bottom, y_bottom) of the front same-rail train obtained in the sixth sub-step of step two is then converted into coordinates in the millimeter wave radar coordinate system, and finally the relative distance d_w of that point in the millimeter wave radar coordinate system is calculated;
thirdly: first, the lateral distance d_x and longitudinal distance d_y of all targets obtained by the radar in step one are converted into the image coordinate system X_p-Y_p of the camera and displayed in the front scene image as radar point coordinates p_i(x_p, y_p); then the lateral distance d_x and longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin in step one are converted into the relative distance d_r in the millimeter wave radar coordinate system; finally, the spatial distance information of the millimeter wave radar sensor and the camera is fused to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the front same-rail train.
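Claim 1's line-detection sub-step (progressive probabilistic Hough transform followed by slope-based preliminary screening) can be sketched as follows. The Hough stage itself is available as `cv2.HoughLinesP` in OpenCV; here only the slope screen is shown, and the 0.5 slope cutoff and the synthetic segments are illustrative assumptions.

```python
def prescreen_tracks(segments, min_abs_slope=0.5):
    """Preliminary slope screening of candidate rail lines. `segments` are
    (x1, y1, x2, y2) line segments such as those returned by a progressive
    probabilistic Hough transform (e.g. cv2.HoughLinesP). Near-horizontal
    segments cannot be rails seen from the train head, so they are dropped;
    the 0.5 cutoff is an assumed value, not taken from the patent."""
    kept = []
    for x1, y1, x2, y2 in segments:
        if x2 == x1:                      # vertical segment: always a candidate
            kept.append((x1, y1, x2, y2))
            continue
        slope = (y2 - y1) / (x2 - x1)
        if abs(slope) >= min_abs_slope:
            kept.append((x1, y1, x2, y2))
    return kept

segments = [(100, 400, 200, 100),   # steep: plausible left rail
            (500, 400, 420, 100),   # steep: plausible right rail
            (0, 250, 600, 255)]     # near-horizontal: e.g. platform edge
print(len(prescreen_tracks(segments)))
```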
2. The time-space synchronized millimeter wave radar and visual information fusion method of claim 1, wherein:
the specific process of the third sub-step of step two is as follows:
step 101, from the plurality of straight lines obtained by the preliminary screening of the running track lines, the rails on the left and right sides are accurately identified based on the DBSCAN probability density clustering algorithm, giving the preliminary line position information of the rails on both sides expressed in point-slope form, with slopes k_j, where j indexes the lines initially obtained by the DBSCAN clustering algorithm, j = 1, 2, ..., with fewer than 5 lines;
step 102, a queue with an empirical Length of 5 is set for the preliminary line position information of each side's rail, ensuring that all straight lines obtained by the DBSCAN clustering algorithm can enter the queue each time;
step 103, the line slopes k_j of the preliminary line position information of the rails on both sides obtained by the DBSCAN probability density clustering algorithm enter the queue in sequence, and the mean value k_mean of the queue is selected as the comparison value; then, according to the pixel width between the rails on both sides in the front scene image, an empirical Distance threshold of 5 is set, and the criterion is applied to each side's preliminary line slope, admitting a slope only when its deviation from k_mean is within the threshold; the mean value k_mean of the finally updated queue is the slope of the rail line, completing the correction of the rail line slopes on both sides and giving the corrected line position information of the rails on both sides expressed in point-slope form.
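A minimal sketch of the queue-based slope correction of claim 2. Because the exact admission criterion formula is not legible in this text, the sketch assumes a slope enters the queue only when it deviates from the queue mean k_mean by less than the Distance threshold, with outliers replaced by k_mean.

```python
from collections import deque

Length, Distance = 5, 5   # empirical queue length and threshold from claim 2

def correct_slope(q, k_new):
    """Sketch of claim 2, step 103: a new DBSCAN-clustered slope k_new enters
    the fixed-length queue only if it stays close to the queue mean k_mean;
    an outlier is replaced by k_mean itself, so the rail slope stays smooth."""
    k_mean = sum(q) / len(q)
    q.append(k_new if abs(k_new - k_mean) < Distance else k_mean)
    return sum(q) / len(q)   # updated k_mean = corrected rail slope

q = deque([2.0, 2.1, 1.9, 2.0, 2.0], maxlen=Length)
print(round(correct_slope(q, 2.05), 3))   # consistent slope is accepted
print(round(correct_slope(q, 40.0), 3))   # spurious line is rejected
```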
3. The time-space synchronized millimeter wave radar and visual information fusion method according to claim 1 or 2, characterized in that the fourth sub-step of step two is realized by the following specific method:
step 101, the intersection point p_0 of the lines l_left, l_right of the two side rails is taken as the initial traversal point;
step 102, the lines of the two side rails in the front scene image are traversed along the Y_p-axis direction, both rails using the same traversal distance L, which is determined from y_p0, the Y_p-axis coordinate of the intersection point p_0, and Width, the image width of the front scene;
step 103, the Y_p-axis coordinates of the line traversal points of the left and right rails are acquired, as follows:
In formula (1): y_left, y_right are the Y_p-axis coordinates of the line traversal points of the left and right rails respectively; y_p0 is the Y_p-axis coordinate of the initial traversal point p_0 of the left and right rail lines l_left, l_right; k_left, k_right are the slopes of the left and right rails; the line traversal interval is the same for both rails, with value Δy = log_a L, where L is the traversal distance and n is the number of traversal points;
step 104, the X_p-axis coordinates of the traversal points of the left and right rails are obtained from steps 101-103, finally giving the logarithm-based point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of all traversal points of the left and right rails in the front scene image:
In formula (2): x_left, x_right are the X_p-axis coordinates of the traversal points of the left and right rails; y_left, y_right are their Y_p-axis coordinates; y_p0 is the Y_p-axis coordinate and x_p0 the X_p-axis coordinate of the initial traversal point p_0 of the left and right rails; k_left, k_right are the slopes of the left and right rail lines l_left, l_right.
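The logarithm-based traversal of claim 3 can be sketched as follows. Because formula (1)'s exact step rule is only partially legible (Δy = log_a L), the step schedule below, with logarithmically growing steps and an assumed base a = 1.5, is an assumption that preserves the stated behavior: dense sampling near p_0 (where the same-track train appears) and sparse sampling farther away.

```python
import math

def traverse_rail(p0, k, length, a=1.5):
    """Generate traversal points along a rail line given in point-slope form
    (sketch of claim 3). Each Y_p step grows logarithmically with the step
    index, so sampling is dense near the intersection point p0 and sparse
    away from it. The base a = 1.5 is an assumed value."""
    x0, y0 = p0
    points, offset, i = [], 0.0, 1
    while offset < length:
        offset += math.log(i + a, a)                   # logarithmically growing step
        points.append((x0 + offset / k, y0 + offset))  # x from point-slope form
        i += 1
    return points

pts = traverse_rail(p0=(320.0, 100.0), k=2.0, length=200.0)
print(len(pts), pts[0], pts[-1])
```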
4. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein the fifth and sixth sub-steps of step two are realized as follows:
step 101, a homogenization operation is performed on the gray values of the track line traversal points, namely, the mean gray value of every 4 consecutive traversal points is calculated to give a new mean traversal point, with coordinate values p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right);
step 102, the position of the gray value mutation is determined: the coordinates p_mean_left(x_mean_left, y_mean_left) and p_mean_right(x_mean_right, y_mean_right) at the mutation are taken as the coordinates of the left and right bottom position points of the front same-rail train, and the arithmetic mean of these left and right bottom position points is then taken as the coordinate of the bottom middle position point of the front same-rail train:
In formula (3): x_bottom, y_bottom are the X_p-axis and Y_p-axis coordinates of the bottom middle position point of the front same-rail train; x_mean_left, y_mean_left are the X_p-axis and Y_p-axis coordinates of the left bottom position point; x_mean_right, y_mean_right are the X_p-axis and Y_p-axis coordinates of the right bottom position point; finally the coordinate p(x_bottom, y_bottom) of the bottom position point of the front same-rail train is obtained;
in the sixth sub-step, the middle position point with coordinates p(x_bottom, y_bottom) at the bottom of the front same-rail train is corrected based on Kalman filtering to obtain smoothly transitioning bottom middle position point information in each period, with distance threshold d_threshold = 50: when the Euclidean distance between p(x_bottom, y_bottom)_i and p(x_bottom, y_bottom)_{i-1} is greater than the distance threshold d_threshold, the bottom middle target position point of this period is discarded and replaced with that of the previous period; meanwhile, Kalman filtering is applied to all bottom middle target position points whose inter-period distance is smaller than the threshold, realizing smooth transition of the point positions.
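Formula (3) and the gating-plus-Kalman correction of claim 4 can be sketched together. The midpoint average and the d_threshold = 50 gate follow the claim; the constant-position filter model and its noise values q, r are illustrative assumptions, since the patent does not state its filter parameters.

```python
import math

D_THRESHOLD = 50.0   # empirical gate from claim 4

def bottom_midpoint(p_left, p_right):
    """Formula (3): the train-bottom midpoint is the arithmetic mean of the
    left and right bottom position points."""
    return ((p_left[0] + p_right[0]) / 2.0, (p_left[1] + p_right[1]) / 2.0)

class MidpointFilter:
    """Gate and smooth the per-period midpoint: a measurement jumping more
    than D_THRESHOLD pixels from the last estimate is replaced by the
    previous period's point; accepted points pass through a minimal
    constant-position Kalman filter applied per axis (assumed model)."""
    def __init__(self, q=1.0, r=4.0):
        self.x = None            # current estimate (x, y)
        self.p = 1.0             # estimate variance, shared by both axes
        self.q, self.r = q, r    # process / measurement noise (assumed)

    def update(self, z):
        if self.x is None:       # first period: initialize from measurement
            self.x = z
            return self.x
        if math.dist(self.x, z) > D_THRESHOLD:
            z = self.x           # gate: reuse the previous period's point
        self.p += self.q
        k = self.p / (self.p + self.r)                      # Kalman gain
        self.x = tuple(xi + k * (zi - xi) for xi, zi in zip(self.x, z))
        self.p *= (1.0 - k)
        return self.x

f = MidpointFilter()
track = [(300, 260), (302, 261), (500, 400), (303, 262)]  # 3rd period is an outlier
est = [f.update(bottom_midpoint((x - 40.0, y), (x + 40.0, y))) for x, y in track]
print([tuple(round(v, 2) for v in p) for p in est])
```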
5. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein the third sub-step of step three comprises the following steps:
step 101, first, the lateral distance d_x and longitudinal distance d_y between each front target and the millimeter wave radar coordinate origin are converted into the relative distance d_r in the millimeter wave radar coordinate system; then d_r is compared with the relative distance d_w of the bottom middle position point p(x_bottom, y_bottom) of the same-rail train in the millimeter wave radar coordinate system: if the absolute value of the difference satisfies |d_w - d_r| < Δd_threshold, the radar point corresponding to d_r is preserved, otherwise it is deleted, realizing the coarse screening of the radar point coordinates p_i(x_p, y_p) and giving the coarsely screened radar point coordinates p_j(x_p, y_p);
step 102, the coarsely screened radar point coordinates p_j(x_p, y_p) in the front scene image are taken; with the bottom middle position point p(x_bottom, y_bottom) of the same-rail train in the front scene image obtained in step two as the reference center, the radar point coordinate p_j(x_p, y_p) closest to p(x_bottom, y_bottom) is selected according to the Euclidean distance minimum constraint principle as the final radar point screening result, and its corresponding relative distance d_r is taken as the final ranging result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110455091.8A CN113189583B (en) | 2021-04-26 | 2021-04-26 | Time-space synchronization millimeter wave radar and visual information fusion method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113189583A CN113189583A (en) | 2021-07-30 |
CN113189583B true CN113189583B (en) | 2022-07-01 |
Family
ID=76978999
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110455091.8A Active CN113189583B (en) | 2021-04-26 | 2021-04-26 | Time-space synchronization millimeter wave radar and visual information fusion method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113189583B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114708585B (en) * | 2022-04-15 | 2023-10-10 | 电子科技大学 | Attention mechanism-based millimeter wave radar and vision fusion three-dimensional target detection method |
CN115169452B (en) * | 2022-06-30 | 2023-04-28 | 北京中盛国芯科技有限公司 | Target information system and method based on space-time synchronous queue characteristic radar fusion |
CN115877328B (en) * | 2023-03-06 | 2023-05-12 | 成都鹰谷米特科技有限公司 | Signal receiving and transmitting method of array radar and array radar |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107818557A (en) * | 2016-09-12 | 2018-03-20 | 德尔福技术有限公司 | Enhanced camera object detection for automated vehicles
CN108960183A (en) * | 2018-07-19 | 2018-12-07 | 北京航空航天大学 | Curve target identification system and method based on multi-sensor fusion
WO2020134512A1 (en) * | 2018-12-29 | 2020-07-02 | 南京慧尔视智能科技有限公司 | Traffic detection system based on millimeter wave radar and video |
CN111368706A (en) * | 2020-03-02 | 2020-07-03 | 南京航空航天大学 | Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision |
CN111461088A (en) * | 2020-06-17 | 2020-07-28 | 长沙超创电子科技有限公司 | Rail transit obstacle avoidance system based on image processing and target recognition |
CN111546328A (en) * | 2020-04-02 | 2020-08-18 | 天津大学 | Hand-eye calibration method based on three-dimensional vision measurement |
CN111856441A (en) * | 2020-06-09 | 2020-10-30 | 北京航空航天大学 | Train positioning method based on fusion of vision and millimeter wave radar |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9052393B2 (en) * | 2013-01-18 | 2015-06-09 | Caterpillar Inc. | Object recognition system having radar and camera input |
CN107609522B (en) * | 2017-09-19 | 2021-04-13 | 东华大学 | Information fusion vehicle detection system based on laser radar and machine vision |
US11287523B2 (en) * | 2018-12-03 | 2022-03-29 | CMMB Vision USA Inc. | Method and apparatus for enhanced camera and radar sensor fusion |
CN110208793B (en) * | 2019-04-26 | 2022-03-11 | 纵目科技(上海)股份有限公司 | Auxiliary driving system, method, terminal and medium based on millimeter wave radar |
CN111832410B (en) * | 2020-06-09 | 2022-09-20 | 北京航空航天大学 | Forward train detection method based on fusion of vision and laser radar |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107818557A (en) * | 2016-09-12 | 2018-03-20 | 德尔福技术有限公司 | Enhanced camera object detection for automated vehicles
CN108960183A (en) * | 2018-07-19 | 2018-12-07 | 北京航空航天大学 | Curve target identification system and method based on multi-sensor fusion
WO2020134512A1 (en) * | 2018-12-29 | 2020-07-02 | 南京慧尔视智能科技有限公司 | Traffic detection system based on millimeter wave radar and video |
CN111368706A (en) * | 2020-03-02 | 2020-07-03 | 南京航空航天大学 | Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision |
CN111546328A (en) * | 2020-04-02 | 2020-08-18 | 天津大学 | Hand-eye calibration method based on three-dimensional vision measurement |
CN111856441A (en) * | 2020-06-09 | 2020-10-30 | 北京航空航天大学 | Train positioning method based on fusion of vision and millimeter wave radar |
CN111461088A (en) * | 2020-06-17 | 2020-07-28 | 长沙超创电子科技有限公司 | Rail transit obstacle avoidance system based on image processing and target recognition |
Non-Patent Citations (4)
Title |
---|
A Train Positioning Method Based-On Vision and Millimeter-Wave Radar Data Fusion; Z. Wang, G. Yu, B. Zhou, P. Wang and X. Wu; IEEE Transactions on Intelligent Transportation Systems; 2021-02-03; pp. 1-11 *
Ding Yabin, Peng Xiang, Liu Zeyi, Niu Hanben. Multi-field-of-view depth image fusion based on generalized isosurface extraction. Journal of Engineering Graphics. 2004 *
Yao Wentao, Shen Chunfeng, Dong Wensheng. An adaptive joint calibration algorithm for camera and lidar. Control Engineering of China. 2017 *
Zheng Yunshui, Guo Shuangquan, Dong Yu. Research on obstacle detection in front of running trains based on radar measurement data. Journal of the China Railway Society. 2021 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||