
CN109459750A - Method for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision - Google Patents


Info

Publication number
CN109459750A
CN109459750A (application CN201811219589.9A)
Authority
CN
China
Prior art keywords
millimetre-wave radar
vehicle
coordinate system
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811219589.9A
Other languages
Chinese (zh)
Other versions
CN109459750B (en)
Inventor
金立生
闫福刚
司法
石健
夏海鹏
朱菲婷
冯成浩
孙栋先
王禹涵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN201811219589.9A priority Critical patent/CN109459750B/en
Publication of CN109459750A publication Critical patent/CN109459750A/en
Application granted granted Critical
Publication of CN109459750B publication Critical patent/CN109459750B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 Radar-tracking systems; Analogous systems for two-dimensional tracking by using numerical data
    • G01S13/726 Multiple target tracking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a method for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision. Data on the scene ahead are acquired with a millimetre-wave radar; based on the echo reflection intensity and width, invalid returns are discarded and only information on vehicles ahead is retained. Combining the millimetre-wave radar with a camera, motion trajectories are generated and track association is performed by filtering the radar data and applying an online tracking model. Vehicles whose tracks have been associated are recorded and numbered. For vehicles that already have a generated, numbered track, the data of the next cycle need only be reprocessed through the above steps and checked for consistency before being appended to the numbered track. For newly appearing vehicles, track generation, track association and numbering are carried out from the initial steps. By combining the strengths of millimetre-wave radar and deep vision learning, the present invention effectively improves the accuracy and robustness of tracking multiple vehicle targets ahead.

Description

Method for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision
Technical field
The invention belongs to the field of multi-target tracking and relates to an assisted-driving method for intelligent vehicles, specifically to a method for tracking multiple vehicle targets ahead through information fusion, and in particular to a method for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision.
Background technique
Driverless cars have become a hot research field, and environment perception is an important link in realizing intelligent driving. Tracking, as a key part of environment perception, has attracted increasing attention from researchers. When a single sensor is used for perception and tracking, problems of low precision, poor stability and a high false-alarm rate always arise. Fusing multiple sensors to realize tracking has therefore become a research hotspot. Millimetre-wave radar works stably and reliably in a wide range of environments and has a long detection range, but its target-recognition ability is poor. Multi-target tracking based on deep learning is a method that has risen in recent years; trained on large numbers of samples, it recognizes objects well and can accurately identify the class of objects ahead and generate their motion trajectories. However, when the neural network has many layers, recognition is good but computation is complex and slow, and performance degrades at longer distances. Fusing millimetre-wave radar with a shallow deep-learning neural network to track multiple vehicles ahead is therefore a new attempt with broad application prospects.
Summary of the invention
The object of the invention is to address the deficiencies and defects of the above single-type sensors in the prior art and to provide a method for tracking multiple vehicles ahead that fuses millimetre-wave radar with deep-learning vision.
The object of the present invention is achieved through the following technical solution:
A method for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision, comprising the following steps:
A. Establish the coordinate transformation between the millimetre-wave radar coordinate system and the visual-sensor (camera) coordinate system and unify the two; sample at the lower of the two sampling rates of the radar and the camera to maintain temporal consistency;
B. Receive the millimetre-wave radar data, resolve them according to the radar's protocol and process them accordingly, so as to filter out vehicles ahead and discard invalid targets;
C. Filter the radar data received in step B, track the vehicles ahead with a Kalman filter, and generate and number their tracks; when a new vehicle appears and has been confirmed from the radar data, track it, generate a new track and assign a number;
D. Pre-process the images acquired by the camera;
E. Train a deep-learning neural network offline to recognize vehicles ahead;
F. Feed the images pre-processed in step D into the deep-learning neural network pre-trained in step E, detect and locate the vehicle targets ahead, obtain the image positions and detection confidences of the vehicles ahead (vehicle signals with confidence below M% are deleted), and number the vehicles;
G. Track the vehicles detected in step F with an online tracking model, and generate and number the vehicle trajectories;
H. The data-processing centres of the millimetre-wave radar and the camera each send their local tracks to the data-fusion centre, which fuses the distance and coordinate data output by the radar and the camera, performing track association with the millimetre-wave radar tracks as the reference;
I. Repeat the above steps and update the tracks to obtain the tracking result.
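The per-cycle flow of steps A-I can be sketched as follows. This is a minimal illustrative skeleton only: the helper bodies, field names and thresholds are stand-ins, not the patent's actual algorithms.

```python
def resolve_and_filter(frame, u0=10.0, v0=1.0):
    """Step B sketch: keep radar returns whose reflected intensity (power)
    and width clear the thresholds u0 and v0 (values illustrative)."""
    return [t for t in frame if t["power"] >= u0 and t["width"] >= v0]

def tracking_cycle(radar_frame, detections, tracks):
    """One cycle of steps B-I. The radar-track, vision-track and fusion
    steps below are trivial stand-ins for the patent's sub-steps."""
    radar_targets = resolve_and_filter(radar_frame)               # step B
    radar_tracks = [{"z": t["range"]} for t in radar_targets]     # step C (stub)
    vision_tracks = [{"z": d["z"]} for d in detections]           # steps D-G (stub)
    # Step H (stub): keep radar tracks confirmed by a nearby camera track.
    fused = [r for r in radar_tracks
             if any(abs(r["z"] - v["z"]) <= 1.0 for v in vision_tracks)]
    tracks.extend(fused)                                          # step I: update
    return tracks

frame = [{"range": 30.0, "power": 15.0, "width": 1.8},   # vehicle-like return
         {"range": 8.0, "power": 4.0, "width": 0.2}]     # clutter
tracks = tracking_cycle(frame, [{"z": 30.4}], [])
```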
Further, in step A the coordinate systems are unified as follows:
A1. Establish the transformation between the millimetre-wave radar coordinate system and the three-dimensional world coordinate system, where the radar coordinate system is a two-dimensional coordinate system in the horizontal plane;
A2. Establish the transformation between the camera coordinate system and the three-dimensional world coordinate system, where the camera coordinate system is a two-dimensional coordinate system in the vertical plane;
A3. Combining the radar-to-world and camera-to-world coordinate relationships derived in steps A1 and A2, derive the coordinate relationship between the millimetre-wave radar and the camera image, as follows:
Further, step A1 specifically includes the following steps:
A11. X0O0Z is the coordinate system of the millimetre-wave radar; its coordinate plane is parallel to the XOZ plane of the three-dimensional world coordinate system O-XYZ, and the X0O0Z plane lies a distance Y1 below the XOZ plane, Y1 being the mounting height of the radar. Project the XOZ plane of O-XYZ onto the radar coordinate system X0O0Z; the OX axis and O0X0 are a distance Z0 apart. O is the origin of the world coordinate system, and O0 is the origin of the radar coordinate system, i.e. the mounting position of the radar;
A12. Suppose a vehicle M ahead is found within the scanning range of the radar, at relative distance R (in mm) and relative angle α (in degrees) from the radar, i.e. MO0 = R and ∠MO0Z = α;
A13. Transfer the vehicle target from the radar coordinate system into the three-dimensional world coordinate system, giving X = R·sin α and Z = Z0 + R·cos α.
Further, step A2 specifically includes the following steps:
A21. The camera coordinate system is a two-dimensional coordinate system xoy in the vertical plane, with o as its origin; its coordinate plane is parallel to the XOY plane of the three-dimensional world coordinate system O-XYZ. O is the origin of the world coordinate system and also the optical centre of the camera, i.e. Oo = f, where f is the effective focal length of the camera in mm;
A22. The camera is installed with its optical axis parallel to the ground, so the Y value in the world coordinate system remains constant, i.e. Y = Y0, where Y0 is the mounting height of the camera in mm;
A23. Transform the vehicle target M(X, Y0, Z) in the world coordinate system onto the image plane of the camera coordinate system; the transformation is as follows:
where f is the effective focal length of the camera in mm.
Further, step B specifically includes the following steps, as shown in Fig. 4:
B1. The data received by the millimetre-wave radar for the scene ahead include the distance (range), angle, relative speed (range rate), reflected intensity (power) and width of each target;
B2. Resolve the received data with the resolution protocol specified for the radar, and discard stationary and invalid targets;
B3. Filter targets by the reflected intensity and width of the objects ahead: set a reflected-intensity threshold u0 and a width threshold v0; when power ≥ u0 and width ≥ v0, the target is confirmed as a vehicle.
Further, step C specifically includes the following steps:
C1. Using a Kalman filter, predict the vehicle state in the next cycle, read the radar measurements of the next cycle, and match the predicted state against the measured state (a cycle is four frames, with a step of two frames);
C2. For a newly appearing vehicle, repeat step C1, renumber it, and generate a new tracking trajectory.
Further, step C1 specifically includes the following steps:
C12. Predict the next-cycle state of each detected vehicle target ahead with the Kalman filtering algorithm;
C13. Compare the actual measurements of the vehicle targets in the next cycle with the predictions from the previous cycle, and perform a consistency check;
C14. For targets that pass the consistency check, update their data and predict the next cycle; when two consecutive cycles both pass, generate the motion trajectory. Targets that fail the check are regarded as newly appearing vehicles and retained temporarily; if such a vehicle is not detected in the next cycle, the target is considered to have disappeared.
Further, step G specifically includes the following steps:
G1. Compute the Euclidean distance between the vehicles detected in two consecutive frames, and assign weights to the nearest targets in order of increasing distance, 1, 0.9, 0.8, ... 0, denoted w1;
G2. Compute the intersection-over-union of each pair of bounding boxes in the two frames, and assign weights in order of decreasing overlap, 1, 0.9, 0.8, ... 0, denoted w2;
G3. Add w1 and w2; the candidate with the largest sum is the most probable target and is recorded;
G4. When the same vehicle is detected in at least three frames out of three or four consecutive frames, generate the vehicle's motion trajectory, number it and record its ID. When the vehicle is missed for two or three consecutive frames, retain the record; if it is missed for more than five frames, determine that the vehicle has left the field of view and delete its trajectory information.
Further, step H specifically includes the following steps:
H1. Compare the distance and coordinate relationships acquired by the radar and the camera. When the two results agree, or their difference is not significant (the range difference does not exceed a threshold Q1 and the pixel difference does not exceed 24x24), fuse them and record the number again (the difference in the XOZ plane is measured by the range difference, and the difference in the XOY plane by the pixel difference);
H2. When the two differ significantly, classify by longitudinal distance.
Further, step H2 specifically includes the following steps:
H21. When the longitudinal distance to the vehicle measured by the radar is less than d1, the camera track is taken as the primary trajectory and the radar track is used to check it. The plots of the two trajectories are compared; if their shapes are similar the tracks are considered identical, and if they are inconsistent, independent tracking is maintained. If the data show for more than m frames that the tracks cannot be fused, they are judged to be two targets and handled as a newly appearing target;
H22. When the longitudinal distance measured by the radar is between d1 and d2, and the longitudinal-distance difference does not exceed a threshold Q2 and the image-coordinate difference does not exceed 48x48 pixels, the median is taken for fusion; if the longitudinal-distance difference exceeds Q2, the track is deleted directly;
H23. When the longitudinal distance measured by the radar is between d2 and d3, the radar track is taken as the primary trajectory and the camera track is used to check it. The plots of the two trajectories are compared; if their shapes are similar the track is considered accurate, and if they are inconsistent, independent tracking is maintained. If the data show for more than m frames that the tracks cannot be fused, they are judged to be two targets and handled as a newly appearing target.
Compared with the prior art, the beneficial effects of the present invention are:
1. Unlike earlier approaches in which the millimetre-wave radar forms regions of interest on the image, the present method uses decision-level fusion, designing a different fusion strategy for the characteristics of each sensor; this makes full use of each sensor's advantages and improves the accuracy of tracking multiple vehicles ahead;
2. The powerful feature-learning ability of deep learning avoids the hand-crafted feature selection of conventional machine learning; the extracted features are richer and more expressive, and the results are more accurate;
3. The deep-learning model used in the present invention has relatively few layers, so tracking runs closer to real time and the method is better suited to the driverless field.
Detailed description of the invention
Fig. 1 is the flow chart of the method of the present invention for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision;
Fig. 2 is the transformation diagram between the millimetre-wave radar coordinate system and the vehicle coordinate system;
Fig. 3 is the transformation diagram between the camera coordinate system and the vehicle coordinate system;
Fig. 4 is the flow chart for generating tracking trajectories from millimetre-wave radar data;
Fig. 5 is the flow chart for fusing the millimetre-wave radar track with the camera track.
Specific embodiment
The present invention will be further described with reference to the accompanying drawings:
The present invention fuses millimetre-wave radar and deep learning to track multiple vehicle targets ahead in the field of automatic driving. A millimetre-wave radar acquires data on the scene ahead, including distance, angle, speed, echo reflected intensity and width. Based on the echo reflected intensity and width of the acquired data, invalid information is discarded and only vehicle information ahead is retained. Then, by fusing the millimetre-wave radar with deep learning (the camera), motion trajectories are generated and track association is performed through filtering of the radar data and an online tracking model, improving the accuracy and robustness of multi-target tracking and reducing the false-alarm rate. Vehicles whose tracks have been associated are then recorded and numbered. For vehicles that already have a generated, numbered track, the data of the next cycle need only be reprocessed through the above steps and checked for consistency before being appended to the numbered track. For newly appearing vehicles, track generation, track association and numbering are performed from the initial steps.
As shown in Figs. 1, 2 and 3, the method of the present invention for tracking multiple vehicles ahead by fusing millimetre-wave radar with deep-learning vision includes the following steps:
A. Establish the coordinate transformation between the millimetre-wave radar coordinate system and the camera coordinate system:
The millimetre-wave radar coordinate system is a two-dimensional coordinate system in the horizontal plane, and the camera coordinate system is a two-dimensional coordinate system in the vertical plane. By establishing the transformation between the radar and the three-dimensional world coordinate system and the transformation between the camera and the three-dimensional world coordinate system, the transformation between the radar coordinate system and the camera coordinate system is obtained.
A1. Establish the transformation between the millimetre-wave radar coordinate system and the three-dimensional world coordinate system; the detailed steps are as follows:
A11. The millimetre-wave radar coordinate system is a two-dimensional coordinate system in the horizontal plane. As shown in the figure, X0O0Z is the radar coordinate system; its coordinate plane is parallel to the XOZ plane of the three-dimensional world coordinate system O-XYZ, and the X0O0Z plane lies a distance Y1 below the XOZ plane, Y1 being the mounting height of the radar. Project the XOZ plane of O-XYZ onto the radar coordinate system X0O0Z; the OX axis and O0X0 are a distance Z0 apart. O is the origin of the world coordinate system, and O0 is the origin of the radar coordinate system, i.e. the mounting position of the radar.
A12. Suppose a vehicle M ahead is found within the scanning range of the radar, at relative distance R (in mm) and relative angle α (in degrees) from the radar, i.e. MO0 = R and ∠MO0Z = α.
A13. Transfer the vehicle target from the radar coordinate system into the three-dimensional world coordinate system, giving X = R·sin α and Z = Z0 + R·cos α.
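The polar-to-Cartesian conversion of step A13 can be sketched as follows; the numeric values used for R, α and Z0 are illustrative only, not taken from the patent.

```python
import math

def radar_to_world(r_mm: float, alpha_deg: float, z0_mm: float) -> tuple:
    """Convert a radar measurement (range R in mm, azimuth alpha in degrees)
    into world-frame coordinates: X = R*sin(alpha), Z = Z0 + R*cos(alpha)."""
    a = math.radians(alpha_deg)
    x = r_mm * math.sin(a)
    z = z0_mm + r_mm * math.cos(a)
    return x, z

# A target 20 m ahead, 5 degrees off axis, with a 500 mm origin offset Z0:
x, z = radar_to_world(20000.0, 5.0, 500.0)
```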
A2. Establish the transformation between the camera coordinate system and the three-dimensional world coordinate system; the detailed steps are as follows:
A21. The camera coordinate system is a two-dimensional coordinate system xoy in the vertical plane, with o as its origin; its coordinate plane is parallel to the XOY plane of the three-dimensional world coordinate system O-XYZ. O is the origin of the world coordinate system and also the optical centre of the camera, i.e. Oo = f, where f is the effective focal length of the camera in mm.
A22. The camera is installed with its optical axis parallel to the ground, so the Y value in the world coordinate system remains constant, i.e. Y = Y0, where Y0 is the mounting height of the camera in mm.
A23. Transform the vehicle target M(X, Y0, Z) in the world coordinate system onto the image plane of the camera coordinate system; the transformation is as follows:
where f is the effective focal length of the camera in mm.
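The patent's projection formula is not reproduced in this text. Under the stated assumptions (optical axis parallel to the ground, target at height Y0, effective focal length f), a standard pinhole camera model would give x = f·X/Z and y = f·Y0/Z. The sketch below implements that standard model as an assumption, not the patent's exact formula.

```python
def world_to_image(x_mm: float, y0_mm: float, z_mm: float, f_mm: float) -> tuple:
    """Project world point M(X, Y0, Z) onto the image plane with a standard
    pinhole model (an assumption; the patent's own formula is not shown here):
    x_img = f*X/Z, y_img = f*Y0/Z, both in mm on the sensor plane."""
    return f_mm * x_mm / z_mm, f_mm * y0_mm / z_mm

# Target 20 m ahead, 2 m off axis, camera mounted 1.2 m high, f = 8 mm:
xi, yi = world_to_image(2000.0, 1200.0, 20000.0, 8.0)
```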
A3. Combining the radar-to-world and camera-to-world coordinate relationships derived in steps A1 and A2, the coordinate relationship between the millimetre-wave radar and the camera image is derived as follows:
B. Receive the millimetre-wave radar data, resolve them according to the radar's protocol and process them accordingly, so as to filter out vehicles ahead and discard invalid targets, as follows:
B1. The data received by the millimetre-wave radar for the scene ahead include the distance (range), angle, relative speed (range rate), reflected intensity (power) and width of each target.
B2. Resolve the received data with the resolution protocol specified for the radar, and discard stationary and invalid targets.
B3. Filter targets by the reflected intensity and width of the objects ahead: set a reflected-intensity threshold u0 and a width threshold v0; when power ≥ u0 and width ≥ v0, the target is confirmed as a vehicle.
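The threshold filter of step B3 can be sketched as follows; the field names, threshold values and sample data are illustrative, not taken from the patent.

```python
def filter_vehicle_targets(targets, u0: float, v0: float):
    """Step B3 sketch: keep only radar returns confirmed as vehicles, i.e.
    reflected intensity (power) >= u0 and width >= v0."""
    return [t for t in targets if t["power"] >= u0 and t["width"] >= v0]

raw = [
    {"range": 35.0, "angle": 2.1, "rangerate": -1.2, "power": 18.0, "width": 1.9},  # car-like
    {"range": 12.0, "angle": -4.0, "rangerate": 0.0, "power": 6.0, "width": 0.3},   # clutter
]
vehicles = filter_vehicle_targets(raw, u0=10.0, v0=1.0)
```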
C. Filter the millimetre-wave radar data received in step B, track the vehicles ahead with a Kalman filter, and generate and number their tracks; when a new vehicle appears and has been confirmed from the radar data, track it, generate a new track and assign a number. The specific steps are as follows:
C1. Using a Kalman filter, predict the vehicle state in the next cycle, read the radar measurements of the next cycle, and match the predicted state against the measured state (a cycle is four frames, with a step of two frames):
C12. Predict the next-cycle state of each detected vehicle target ahead with the Kalman filtering algorithm;
C13. Compare the actual measurements of the vehicle targets in the next cycle with the predictions from the previous cycle, and perform a consistency check;
C14. For targets that pass the consistency check, update their data and predict the next cycle; when two consecutive cycles both pass, generate the motion trajectory. Targets that fail the check are regarded as newly appearing vehicles and retained temporarily; if such a vehicle is not detected in the next cycle, the target is considered to have disappeared.
C2. For a newly appearing vehicle, repeat step C1, renumber it, and generate a new tracking trajectory.
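The prediction and consistency check of steps C12-C13 can be sketched with a one-dimensional constant-velocity Kalman filter. The state model, cycle time and chi-square gate below are illustrative assumptions, since the patent does not specify them.

```python
import numpy as np

def kalman_predict(x, P, F, Q):
    """Step C12 sketch: predict the next-cycle state, x' = F x, P' = F P F^T + Q."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def consistent(z, x_pred, H, P_pred, R, gate=9.21):
    """Step C13 sketch: chi-square gate on the innovation (measurement minus
    prediction); the gate value 9.21 is an illustrative 99% threshold."""
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    return (y.T @ np.linalg.solve(S, y)).item() <= gate

dt = 0.1                                   # assumed cycle time, not from the patent
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
Q = np.eye(2) * 0.01                       # process noise (illustrative)
H = np.array([[1.0, 0.0]])                 # we measure position only
R = np.array([[0.25]])                     # measurement noise (illustrative)

x = np.array([[20.0], [-2.0]])             # 20 m ahead, closing at 2 m/s
P = np.eye(2)
x_pred, P_pred = kalman_predict(x, P, F, Q)
ok = consistent(np.array([[19.75]]), x_pred, H, P_pred, R)
```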
D. Pre-process the images acquired by the camera. To guarantee temporal consistency when the millimetre-wave radar is fused with deep learning (the camera), sampling is performed at the lower of the two sampling rates of the radar and the camera, and greyscale processing is applied; image acquisition and pre-screening are thus carried out while the camera captures images.
E. Train the deep-learning neural network offline. The network is trained on the ImageNet database and then tested.
F. Feed the images pre-processed in step D into the deep-learning neural network pre-trained in step E, detect and locate the vehicle targets ahead, finally obtaining the image positions and detection confidences of the vehicles ahead (vehicle signals with confidence below M% are deleted), and number the vehicles.
G. Track the vehicles detected in step F with the tracking algorithm, generate the vehicle trajectories, and complete track association and numbering. The specific steps are as follows:
G1. Compute the Euclidean distance between the vehicles detected in two consecutive frames, and assign weights to the nearest targets in order of increasing distance, 1, 0.9, 0.8, ... 0, denoted w1;
G2. Compute the intersection-over-union of each pair of bounding boxes in the two frames, and assign weights in order of decreasing overlap, 1, 0.9, 0.8, ... 0, denoted w2;
G3. Add w1 and w2; the candidate with the largest sum is the most probable target and is recorded.
G4. When the same vehicle is detected in at least three frames out of three or four consecutive frames, generate the vehicle's motion trajectory, number it and record its ID. When the vehicle is missed for two or three consecutive frames, retain the record; if it is missed for more than five frames, determine that the vehicle has left the field of view and delete its trajectory information.
H. The data-processing centres of the millimetre-wave radar and the camera each send their local tracks to the data-fusion centre, which performs decision-level data fusion on the distance and coordinate data output by the radar and the camera, carrying out track association with the millimetre-wave radar tracks as the reference, as shown in Fig. 5.
H1. Compare the distance and coordinate relationships acquired by the radar and the camera. When the two results agree, or their difference is not significant (the range difference does not exceed a threshold Q1 and the pixel difference does not exceed 24x24), fuse them and record the number again (the difference in the XOZ plane is measured by the range difference, and the difference in the XOY plane by the pixel difference).
H2. When the two differ significantly, classify by longitudinal distance:
H21. When the longitudinal distance to the vehicle measured by the radar is less than d1, the camera track is taken as the primary trajectory and the radar track is used to check it. The plots of the two trajectories are compared; if their shapes are similar the tracks are considered identical, and if they are inconsistent, independent tracking is maintained. If the data show for more than m frames that the tracks cannot be fused, they are judged to be two targets and handled as a newly appearing target;
H22. When the longitudinal distance measured by the radar is between d1 and d2, and the longitudinal-distance difference does not exceed a threshold Q2 and the image-coordinate difference does not exceed 48x48 pixels, the median is taken for fusion; if the longitudinal-distance difference exceeds Q2, the track is deleted directly;
H23. When the longitudinal distance measured by the radar is between d2 and d3, the radar track is taken as the primary trajectory and the camera track is used to check it. The plots of the two trajectories are compared; if their shapes are similar the track is considered accurate, and if they are inconsistent, independent tracking is maintained. If the data show for more than m frames that the tracks cannot be fused, they are judged to be two targets and handled as a newly appearing target.
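The distance-banded fusion decision of steps H1-H2 can be sketched as follows. All thresholds (Q1, Q2, d1-d3) are illustrative placeholders, as the patent leaves them unspecified, and the trajectory-shape comparison of H21/H23 is reduced to a label.

```python
def fuse_decision(radar_z, cam_z, pixel_dx, pixel_dy, q1, q2, d1, d2, d3):
    """Decision-level fusion per steps H1-H2 (thresholds illustrative).
    Returns one of: 'fuse', 'camera-primary', 'median-fuse', 'delete',
    'radar-primary', 'out-of-range'."""
    # H1: results agree within the range and 24x24 pixel gates -> fuse directly.
    if abs(radar_z - cam_z) <= q1 and abs(pixel_dx) <= 24 and abs(pixel_dy) <= 24:
        return "fuse"
    # H2: otherwise classify by the radar's longitudinal distance.
    if radar_z < d1:
        return "camera-primary"          # H21: camera track is the reference
    if radar_z <= d2:
        # H22: fuse with the median within the looser gates, else delete.
        if abs(radar_z - cam_z) <= q2 and abs(pixel_dx) <= 48 and abs(pixel_dy) <= 48:
            return "median-fuse"
        return "delete"
    if radar_z <= d3:
        return "radar-primary"           # H23: radar track is the reference
    return "out-of-range"

decision = fuse_decision(radar_z=55.0, cam_z=58.0, pixel_dx=30, pixel_dy=10,
                         q1=1.0, q2=5.0, d1=30.0, d2=70.0, d3=150.0)
```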
I, repeat above step, and carry out track update, obtain tracking result.
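The H1 consistency gate described above can be sketched as follows. This is a minimal illustration: only the Q1 range gate and the 24x24 pixel gate come from the text; the dictionary layout, the Q1 value, and the example coordinates are assumptions.

```python
# Hedged sketch of the fusion decision in steps H1-H2. Threshold names
# (Q1, the 24x24 pixel window) come from the text; the concrete values
# and the track representation below are illustrative assumptions.

def fuse_tracks(radar, camera, Q1=2.0, pixel_win=(24, 24)):
    """Return 'fuse' when radar and camera agree (H1), otherwise defer
    to the longitudinal-distance classification of H2 (H21-H23)."""
    range_diff = abs(radar["range"] - camera["range"])   # XOZ-plane difference
    du = abs(radar["u"] - camera["u"])                   # XOY (image) difference
    dv = abs(radar["v"] - camera["v"])
    if range_diff <= Q1 and du <= pixel_win[0] and dv <= pixel_win[1]:
        return "fuse"                   # H1: merge and renumber
    return "classify_by_distance"       # H2: handled by H21/H22/H23

near = {"range": 20.0, "u": 310, "v": 242}
cam = {"range": 20.5, "u": 318, "v": 250}
print(fuse_tracks(near, cam))  # -> fuse
```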
To sum up, the present invention provides the more vehicle tracking sides in front that a kind of millimetre-wave radar is merged with deep learning vision Method, combine millimetre-wave radar distance measurement precision height with influenced small advantage by environmental change and space or depth perception study is being examined The accuracy with tracking aspect is surveyed, the accuracy and robustness of vehicle target tracking more for front are improved.

Claims (10)

1. A front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision, characterized by comprising the following steps:
A. Establishing the coordinate transformation relationship between the millimeter-wave radar coordinate system and the camera coordinate system to unify the two coordinate systems, and sampling at the lower of the sampling rates of the millimeter-wave radar and the camera to maintain consistency in time;
B. Receiving the millimeter-wave radar data, resolving it according to the protocol, and processing it accordingly, so as to screen out the vehicles ahead and eliminate invalid targets;
C. Filtering the millimeter-wave radar data received in step B, tracking the vehicles ahead using Kalman filtering, and generating and numbering their tracks; when a new vehicle appears and is confirmed by calculation on the radar data, tracking it to generate a new track and assigning a new number;
D. Pre-processing the images acquired by the camera;
E. Training a deep-learning neural network offline to recognize the vehicles ahead;
F. Feeding the images pre-processed in step D into the deep-learning neural network pre-trained in step E, detecting and locating the vehicle targets ahead, obtaining the positions of the vehicles in the image and the detection confidences (vehicle signals with confidence below M% are deleted), and numbering the vehicles;
G. Tracking the vehicles detected in step F with an online tracking model, and generating the running tracks and numbers of the vehicles;
H. The data processing centers of the millimeter-wave radar and the camera respectively send their local tracks to the data fusion center, which fuses the distance and coordinate relationships output by the millimeter-wave radar and the camera and performs radar-camera track association;
I. Repeating the above steps and updating the tracks to obtain the tracking result.
2. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 1, characterized in that the step of unifying the coordinate systems in step A comprises:
A1. Establishing the transformation relationship between the millimeter-wave radar coordinate system and the three-dimensional world coordinate system, wherein the millimeter-wave radar coordinate system is a two-dimensional coordinate system in the horizontal plane;
A2. Establishing the transformation relationship between the camera coordinate system and the three-dimensional world coordinate system, wherein the camera coordinate system is a two-dimensional coordinate system in the vertical plane;
A3. Combining the radar-to-world coordinate relationship derived in step A1 with the camera-to-world coordinate relationship derived in step A2 to obtain the coordinate relationship between the millimeter-wave radar and the camera image, as follows:
3. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 2, characterized in that step A1 specifically comprises the following steps:
A11. X0O0Z is the coordinate system of the millimeter-wave radar; its coordinate plane is parallel to the XOZ plane of the three-dimensional world coordinate system O-XYZ, and the X0O0Z plane lies a distance Y1 below the XOZ plane, where Y1 is the mounting height of the millimeter-wave radar; the XOZ plane of O-XYZ is projected onto the radar coordinate system X0O0Z, with the OX axis and O0X0 separated by Z0; O is the origin of the world coordinate system, and O0 is the origin of the millimeter-wave radar coordinate system, i.e., the installation position of the radar;
A12. Suppose a preceding vehicle M is found within the scanning range of the millimeter-wave radar, at relative distance R and relative angle α to the radar, i.e., MO0 = R (in mm) and ∠MO0Z = α (in degrees);
A13. Transferring the vehicle target from the millimeter-wave radar coordinate system into the three-dimensional world coordinate system gives X = R·sin α and Z = Z0 + R·cos α.
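The polar-to-world conversion of step A13 can be sketched directly from the stated relations X = R·sin α, Z = Z0 + R·cos α. The Z0 value used below is an illustrative assumption.

```python
import math

# Sketch of step A13: converting a radar detection (R, alpha) into the
# three-dimensional world frame. The relations X = R*sin(a), Z = Z0 + R*cos(a)
# come from the text; the example Z0 offset is an assumption.

def radar_to_world(R_mm, alpha_deg, Z0_mm=0.0):
    a = math.radians(alpha_deg)
    X = R_mm * math.sin(a)          # lateral offset
    Z = Z0_mm + R_mm * math.cos(a)  # longitudinal distance plus origin offset
    return X, Z

X, Z = radar_to_world(10000.0, 30.0, Z0_mm=500.0)
print(round(X), round(Z))  # -> 5000 9160
```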
4. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 2, characterized in that step A2 specifically comprises the following steps:
A21. The camera coordinate system is a two-dimensional coordinate system xoy in the vertical plane, where o is its origin and the coordinate plane is parallel to the XOY plane of the three-dimensional world coordinate system O-XYZ; O is the origin of the world coordinate system and also the optical center of the camera, i.e., Oo = f, where f is the effective focal length of the camera in mm;
A22. The camera is installed with its optical axis parallel to the ground, so the Y value in the world coordinate system remains unchanged, i.e., Y = Y0, where Y0 is the mounting height of the camera in mm;
A23. Transforming the vehicle target M(X, Y0, Z) in the world coordinate system onto the image plane of the camera coordinate system, the transformation relationship being as follows:
where f is the effective focal length of the camera in mm.
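The transformation formula of step A23 is an image in the original record and is not reproduced here; assuming the standard pinhole model consistent with the setup in A21-A22 (optical center at O, focal length f, constant height Y0), the projection can be sketched as:

```python
# Sketch of step A23 under an assumed standard pinhole model: the patent's
# formula image is not reproduced in the text, so x = f*X/Z and y = f*Y0/Z
# are assumptions, not the patent's literal equations.

def world_to_image(X_mm, Z_mm, f_mm, Y0_mm):
    """Project world point M(X, Y0, Z) onto the camera image plane."""
    x = f_mm * X_mm / Z_mm    # horizontal image coordinate
    y = f_mm * Y0_mm / Z_mm   # vertical image coordinate (fixed height Y0)
    return x, y

x, y = world_to_image(2000.0, 20000.0, 8.0, 1500.0)
print(x, y)  # -> 0.8 0.6
```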
5. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 1, characterized in that step B specifically comprises the following steps:
B1. The data received by the millimeter-wave radar for the front area include the distance (range), angle, relative velocity (range rate), reflected intensity (power), and width of the objects ahead;
B2. Resolving the received data according to the resolution protocol specified by the millimeter-wave radar, and eliminating stationary and invalid targets;
B3. Filtering targets according to the reflected intensity and width of the objects ahead: setting a reflected-intensity threshold u0 and a width threshold v0, a target is confirmed as a vehicle target when its reflected intensity ≥ u0 and its width ≥ v0.
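The B3 gate can be sketched as a simple predicate over the decoded radar fields of B1. The threshold names u0 and v0 come from the text; their values and the field layout are illustrative assumptions.

```python
# Sketch of step B3: keep only radar returns whose reflected intensity
# (power) and width both exceed the thresholds u0 and v0. The example
# threshold values and target records are assumptions.

def is_vehicle(target, u0=10.0, v0=1.2):
    return target["power"] >= u0 and target["width"] >= v0

targets = [
    {"range": 35.0, "power": 14.0, "width": 1.8},  # vehicle-sized return
    {"range": 12.0, "power": 6.0, "width": 0.4},   # roadside clutter
]
vehicles = [t for t in targets if is_vehicle(t)]
print(len(vehicles))  # -> 1
```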
6. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 1, characterized in that step C specifically comprises the following steps:
C1. Using Kalman filtering, predicting the vehicle state in the next period, reading the measured millimeter-wave radar data of the next period, and matching the predicted state with the measured state, with four frames as one period and a step length of two frames;
C2. For newly appearing vehicles, repeating step C1, renumbering them, and generating new tracks.
7. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 6, characterized in that step C1 specifically comprises the following steps:
C12. Predicting the state of the detected multi-vehicle targets ahead in the next period using the Kalman filtering algorithm;
C13. Comparing the actual measurements of the targets in the next period with the predictions of the previous period, and performing a consistency check;
C14. For targets satisfying the consistency requirement, updating their data and predicting the next period; when two consecutive periods both satisfy the consistency requirement, generating a motion track; targets not satisfying the consistency requirement are regarded as newly appearing vehicles and retained temporarily, and a vehicle not detected in the next period is considered to have disappeared.
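The predict/compare/update loop of C12-C14 can be sketched for one target with a constant-velocity Kalman filter. Only the loop structure comes from the claims; the state model, noise settings, and consistency gate below are assumptions.

```python
import numpy as np

# Sketch of C12-C14: predict the next-period state, check it against the
# measurement, and update only when the consistency check passes. The
# constant-velocity model, noise covariances, and gate are assumptions.

F = np.array([[1.0, 1.0], [0.0, 1.0]])  # transition (position, velocity), dt = 1 period
H = np.array([[1.0, 0.0]])              # position-only measurement
Q = np.eye(2) * 0.01                    # process noise (assumed)
Rn = np.array([[0.5]])                  # measurement noise (assumed)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q       # C12: next-period prediction

def consistent(x_pred, z, gate=3.0):
    return abs(float(z) - float(x_pred[0, 0])) < gate  # C13: compare

def update(x_pred, P_pred, z):
    y = z - H @ x_pred
    S = H @ P_pred @ H.T + Rn
    K = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

x = np.array([[30.0], [-1.0]])          # 30 m ahead, closing 1 m per period
P = np.eye(2)
x_pred, P_pred = predict(x, P)
z = np.array([[28.8]])                  # next-period radar measurement
if consistent(x_pred, z):               # C14: update only on consistency
    x, P = update(x_pred, P_pred, z)
print(round(float(x[0, 0]), 1))  # -> 28.8
```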
8. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 1, characterized in that step G specifically comprises the following steps:
G1. Calculating the Euclidean distances between the vehicles detected in two consecutive frames, and assigning weights to the nearest several targets in order of distance as 1, 0.9, 0.8, ..., 0, denoted w1;
G2. Calculating the intersection-over-union of each pair of bounding boxes in the two frames, and assigning weights in order of overlap ratio as 1, 0.9, 0.8, ..., 0, denoted w2;
G3. Adding w1 and w2; the pair with the maximum sum is the most probable match and is recorded;
G4. When the same vehicle is detected in three frames out of three or four consecutive frames, generating and numbering the motion track of the vehicle and recording its ID; when the detected vehicle cannot be found for two or three consecutive frames, retaining the record; if it cannot be found for more than five frames, determining that the vehicle has disappeared from the field of view and deleting its motion track information.
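The G1-G3 scoring can be sketched as follows. The weight ladder 1, 0.9, 0.8, ..., 0 comes from the text; the box format and example coordinates are made-up illustrations.

```python
import math

# Sketch of G1-G3: score candidate matches between consecutive frames by
# combining a distance-rank weight w1 with an IoU-rank weight w2.

def iou(a, b):
    # boxes as (x1, y1, x2, y2)
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def rank_weights(scores, reverse=False):
    """Assign the ladder 1, 0.9, 0.8, ..., 0 by rank."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=reverse)
    w = [0.0] * len(scores)
    for rank, i in enumerate(order):
        w[i] = max(0.0, 1.0 - 0.1 * rank)
    return w

prev_box = (100, 100, 150, 140)
cands = [(104, 102, 154, 142), (300, 90, 350, 130)]
center = lambda b: ((b[0] + b[2]) / 2, (b[1] + b[3]) / 2)
dists = [math.dist(center(prev_box), center(c)) for c in cands]
w1 = rank_weights(dists)                                   # G1: closer -> higher
w2 = rank_weights([iou(prev_box, c) for c in cands], reverse=True)  # G2
scores = [a + b for a, b in zip(w1, w2)]                   # G3
print(scores.index(max(scores)))  # -> 0 (the nearby, overlapping box)
```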
9. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 1, characterized in that step H specifically comprises the following steps:
H1. Comparing the distance and coordinate relationships obtained by the millimeter-wave radar and the camera; when the two results are consistent or their difference is not significant, i.e., the range difference does not exceed threshold Q1 and the pixel difference does not exceed 24x24, fusing them and renumbering, wherein the difference in the XOZ plane is measured by the range difference and the difference in the XOY plane by the pixel difference;
H2. When the two differ significantly, classifying according to longitudinal distance.
10. The front multi-vehicle tracking method fusing millimeter-wave radar with deep-learning vision according to claim 9, characterized in that step H2 specifically comprises the following steps:
H21. When the longitudinal distance to the vehicle measured by the millimeter-wave radar is less than d1, basing the track mainly on the track obtained by the camera and using the track obtained by the radar as a check on the camera track; comparing the plot diagrams of the two tracks, and if their shapes are similar, considering them identical; if inconsistent, maintaining independent tracking; if the data of more than m frames show that they cannot be fused, regarding them as two targets and handling them as newly appearing targets;
H22. When the longitudinal distance measured by the millimeter-wave radar is between d1 and d2, and the longitudinal distance difference of the two does not exceed threshold Q2 and the coordinate difference in the image does not exceed 48x48 pixels, taking the median for fusion; if the longitudinal distance difference exceeds Q2, deleting the track directly;
H23. When the longitudinal distance measured by the millimeter-wave radar is between d2 and d3, basing the track on the track formed by the radar and using the track obtained by the camera as a check on the radar track; comparing the plot diagrams of the two tracks, and if their shapes are similar, considering the track accurate; if inconsistent, maintaining independent tracking; if the data of more than m frames show that they cannot be fused, regarding them as two targets and handling them as newly appearing targets.
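The distance-banded sensor selection of H21-H23 can be sketched as follows. The band edges d1/d2/d3 are named in the claims but their values are not given; the numbers below are illustrative assumptions.

```python
# Sketch of H21-H23: choose the primary sensor for track fusion by
# longitudinal distance band. The d1/d2/d3 values are assumptions.

def fusion_mode(longitudinal_m, d1=30.0, d2=70.0, d3=150.0):
    if longitudinal_m < d1:
        return "camera_primary"   # H21: camera track, radar as check
    if longitudinal_m <= d2:
        return "median_fusion"    # H22: take the median if within thresholds
    if longitudinal_m <= d3:
        return "radar_primary"    # H23: radar track, camera as check
    return "out_of_range"

print(fusion_mode(20.0), fusion_mode(50.0), fusion_mode(100.0))
# -> camera_primary median_fusion radar_primary
```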
CN201811219589.9A 2018-10-19 2018-10-19 Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision Expired - Fee Related CN109459750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811219589.9A CN109459750B (en) 2018-10-19 2018-10-19 Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811219589.9A CN109459750B (en) 2018-10-19 2018-10-19 Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision

Publications (2)

Publication Number Publication Date
CN109459750A true CN109459750A (en) 2019-03-12
CN109459750B CN109459750B (en) 2023-05-23

Family

ID=65607929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811219589.9A Expired - Fee Related CN109459750B (en) 2018-10-19 2018-10-19 Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision

Country Status (1)

Country Link
CN (1) CN109459750B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070080850A1 (en) * 2003-09-11 2007-04-12 Kyoichi Abe Object detection system and object detection method
JP2006292621A (en) * 2005-04-13 2006-10-26 Toyota Motor Corp Object detection apparatus
CN1940591A (en) * 2005-09-26 2007-04-04 通用汽车环球科技运作公司 System and method of target tracking using sensor fusion
CN102609953A (en) * 2010-12-02 2012-07-25 通用汽车环球科技运作有限责任公司 Multi-object appearance-enhanced fusion of camera and range sensor data
CN102508246A (en) * 2011-10-13 2012-06-20 吉林大学 Method for detecting and tracking obstacles in front of vehicle
US20140071249A1 (en) * 2012-09-13 2014-03-13 California Institute Of Technology Coherent camera
CN106461774A (en) * 2014-02-20 2017-02-22 御眼视觉技术有限公司 Advanced driver assistance system based on radar-cued visual imaging
CN107076842A (en) * 2014-08-25 2017-08-18 兰普洛克斯公司 Positioned using the indoor location of delayed sweep beam reflector
CN107238834A (en) * 2016-01-19 2017-10-10 德尔福技术有限公司 Target Tracking System for use radar/vision fusion of automotive vehicle
CN107103276A (en) * 2016-02-19 2017-08-29 德尔福技术有限公司 The vision algorithm merged using low layer sensor is performed
CN107862287A (en) * 2017-11-08 2018-03-30 吉林大学 A kind of front zonule object identification and vehicle early warning method
CN108613679A (en) * 2018-06-14 2018-10-02 河北工业大学 A kind of mobile robot Extended Kalman filter synchronous superposition method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHU YUAN: "Research on autonomous lane-changing decision mechanism and control of intelligent vehicles based on vision and radar", China Doctoral Dissertations Full-text Database (Engineering Science and Technology II) *
XIE XIANYI et al.: "Research on LQR rear-wheel active steering control of vehicles based on variable weight coefficients", Journal of Zhejiang University (Engineering Science) *

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109960264A (en) * 2019-03-28 2019-07-02 潍柴动力股份有限公司 A kind of target identification method and system
CN115004273A (en) * 2019-04-15 2022-09-02 华为技术有限公司 Digital reconstruction method, device and system for traffic road
CN110068818A (en) * 2019-05-05 2019-07-30 中国汽车工程研究院股份有限公司 The working method of traffic intersection vehicle and pedestrian detection is carried out by radar and image capture device
CN110264586A (en) * 2019-05-28 2019-09-20 浙江零跑科技有限公司 L3 grades of automated driving system driving path data acquisitions, analysis and method for uploading
WO2021003896A1 (en) * 2019-07-10 2021-01-14 南京慧尔视智能科技有限公司 Multi-target information fusion and visual presentation method based on microwaves and videos
CN110288832A (en) * 2019-07-10 2019-09-27 南京慧尔视智能科技有限公司 It is merged based on microwave with the multiple-object information of video and visual presentation method
CN110422173A (en) * 2019-07-11 2019-11-08 惠州市德赛西威智能交通技术研究院有限公司 A kind of environment recognition methods
CN110398720A (en) * 2019-08-21 2019-11-01 深圳耐杰电子技术有限公司 A kind of anti-unmanned plane detection tracking interference system and photoelectric follow-up working method
CN110398720B (en) * 2019-08-21 2024-05-03 深圳耐杰电子技术有限公司 Anti-unmanned aerial vehicle detection tracking interference system and working method of photoelectric tracking system
CN110794392B (en) * 2019-10-15 2024-03-19 上海创昂智能技术有限公司 Vehicle positioning method and device, vehicle and storage medium
CN110794392A (en) * 2019-10-15 2020-02-14 上海创昂智能技术有限公司 Vehicle positioning method and device, vehicle and storage medium
CN110632589A (en) * 2019-10-17 2019-12-31 安徽大学 Radar photoelectric information fusion technology
CN110632589B (en) * 2019-10-17 2022-12-06 安徽大学 Radar photoelectric information fusion technology
CN110794397A (en) * 2019-10-18 2020-02-14 北京全路通信信号研究设计院集团有限公司 Target detection method and system based on camera and radar
CN110736982B (en) * 2019-10-28 2022-04-05 江苏集萃智能传感技术研究所有限公司 Underground parking lot vehicle tracking method and device based on radar monitoring
CN110736982A (en) * 2019-10-28 2020-01-31 江苏集萃智能传感技术研究所有限公司 Underground parking lot vehicle tracking method and device based on radar monitoring
CN111104960A (en) * 2019-10-30 2020-05-05 武汉大学 Sign language identification method based on millimeter wave radar and machine vision
CN111104960B (en) * 2019-10-30 2022-06-14 武汉大学 Sign language identification method based on millimeter wave radar and machine vision
CN111090095B (en) * 2019-12-24 2023-03-14 上海汽车工业(集团)总公司 Information fusion environment perception system and perception method thereof
CN111090095A (en) * 2019-12-24 2020-05-01 联创汽车电子有限公司 Information fusion environment perception system and perception method thereof
CN113095345A (en) * 2020-01-08 2021-07-09 富士通株式会社 Data matching method and device and data processing equipment
CN111398923A (en) * 2020-04-28 2020-07-10 东风汽车集团有限公司 Multi-millimeter wave radar combined self-calibration method and system
CN111731272A (en) * 2020-06-17 2020-10-02 重庆长安汽车股份有限公司 Obstacle collision avoidance method based on automatic parking system
CN111880196A (en) * 2020-06-29 2020-11-03 安徽海博智能科技有限责任公司 Unmanned mine car anti-interference method, system and computer equipment
CN111967498A (en) * 2020-07-20 2020-11-20 重庆大学 Night target detection and tracking method based on millimeter wave radar and vision fusion
CN111862157B (en) * 2020-07-20 2023-10-10 重庆大学 Multi-vehicle target tracking method integrating machine vision and millimeter wave radar
CN111862157A (en) * 2020-07-20 2020-10-30 重庆大学 Multi-vehicle target tracking method integrating machine vision and millimeter wave radar
CN112034445A (en) * 2020-08-17 2020-12-04 东南大学 Vehicle motion trail tracking method and system based on millimeter wave radar
CN111814769A (en) * 2020-09-02 2020-10-23 深圳市城市交通规划设计研究中心股份有限公司 Information acquisition method and device, terminal equipment and storage medium
CN112033429A (en) * 2020-09-14 2020-12-04 吉林大学 Target-level multi-sensor fusion method for intelligent automobile
CN112033429B (en) * 2020-09-14 2022-07-19 吉林大学 Target-level multi-sensor fusion method for intelligent automobile
CN112201040A (en) * 2020-09-29 2021-01-08 同济大学 Traffic data cleaning method and system based on millimeter wave radar data
CN112380927A (en) * 2020-10-29 2021-02-19 中车株洲电力机车研究所有限公司 Track identification method and device
CN112380927B (en) * 2020-10-29 2023-06-30 中车株洲电力机车研究所有限公司 Rail identification method and device
CN112346046B (en) * 2020-10-30 2022-09-06 合肥中科智驰科技有限公司 Single-target tracking method and system based on vehicle-mounted millimeter wave radar
CN112346046A (en) * 2020-10-30 2021-02-09 合肥中科智驰科技有限公司 Single-target tracking method and system based on vehicle-mounted millimeter wave radar
CN112415517A (en) * 2020-11-03 2021-02-26 上海泽高电子工程技术股份有限公司 Rail identification method based on millimeter wave radar
CN113030944A (en) * 2021-04-16 2021-06-25 深圳市众云信息科技有限公司 Radar target tracking method
CN113030944B (en) * 2021-04-16 2024-02-02 深圳市众云信息科技有限公司 Radar target tracking method
CN113343849A (en) * 2021-06-07 2021-09-03 西安恒盛安信智能技术有限公司 Fusion sensing equipment based on radar and video
CN114137512A (en) * 2021-11-29 2022-03-04 湖南大学 Front multi-vehicle tracking method based on fusion of millimeter wave radar and deep learning vision
CN114137512B (en) * 2021-11-29 2024-04-26 湖南大学 Front multi-vehicle tracking method integrating millimeter wave radar and deep learning vision
CN114299112A (en) * 2021-12-24 2022-04-08 萱闱(北京)生物科技有限公司 Multi-target-based track identification method, device, medium and computing equipment
CN114814823A (en) * 2022-01-06 2022-07-29 上海道麒实业发展有限公司 Rail vehicle detection system and method based on integration of millimeter wave radar and camera
CN114518573A (en) * 2022-04-21 2022-05-20 山东科技大学 Vehicle tracking method, equipment and medium for multiple radars
CN115184917B (en) * 2022-09-13 2023-03-10 湖南华诺星空电子技术有限公司 Regional target tracking method integrating millimeter wave radar and camera
CN115184917A (en) * 2022-09-13 2022-10-14 湖南华诺星空电子技术有限公司 Regional target tracking method integrating millimeter wave radar and camera
CN116453205A (en) * 2022-11-22 2023-07-18 深圳市旗扬特种装备技术工程有限公司 Method, device and system for identifying stay behavior of commercial vehicle
CN117687029A (en) * 2024-02-01 2024-03-12 深圳市佰誉达科技有限公司 Millimeter wave radar-based vehicle motion trail tracking method and system
CN117687029B (en) * 2024-02-01 2024-05-03 深圳市佰誉达科技有限公司 Millimeter wave radar-based vehicle motion trail tracking method and system

Also Published As

Publication number Publication date
CN109459750B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
CN109459750A (en) A kind of more wireless vehicle trackings in front that millimetre-wave radar is merged with deep learning vision
CN109444911B (en) Unmanned ship water surface target detection, identification and positioning method based on monocular camera and laser radar information fusion
CN110414396B (en) Unmanned ship perception fusion algorithm based on deep learning
CN109670411B (en) Ship point cloud depth image processing method and system based on generation countermeasure network
CN105866790B (en) A kind of laser radar obstacle recognition method and system considering lasing intensity
Bertozzi et al. Obstacle detection and classification fusing radar and vision
CN110850403A (en) Multi-sensor decision-level fused intelligent ship water surface target feeling knowledge identification method
CN109472831A (en) Obstacle recognition range-measurement system and method towards road roller work progress
CN109490890A (en) A kind of millimetre-wave radar towards intelligent vehicle and monocular camera information fusion method
CN110488811B (en) Method for predicting pedestrian track by robot based on social network model
CN109085838A (en) A kind of dynamic barrier rejecting algorithm based on laser positioning
CN109564285A (en) Method and system for detecting ground marks in a traffic environment of a mobile unit
CN103149939A (en) Dynamic target tracking and positioning method of unmanned plane based on vision
CN111045000A (en) Monitoring system and method
CN111781608A (en) Moving target detection method and system based on FMCW laser radar
CN107796373B (en) Distance measurement method based on monocular vision of front vehicle driven by lane plane geometric model
Drews et al. Deepfusion: A robust and modular 3d object detector for lidars, cameras and radars
CN115184917B (en) Regional target tracking method integrating millimeter wave radar and camera
CN112862858A (en) Multi-target tracking method based on scene motion information
CN108596117B (en) Scene monitoring method based on two-dimensional laser range finder array
CN109213204A (en) AUV sub-sea floor targets based on data-driven search navigation system and method
CN114280611A (en) Road side sensing method integrating millimeter wave radar and camera
CN106447698B (en) A kind of more pedestrian tracting methods and system based on range sensor
CN110298271A (en) Seawater method for detecting area based on critical point detection network and space constraint mixed model
Meier et al. Object detection and tracking in range image sequences by separation of image features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20230523