
CN111427036A - Short-baseline multi-radar signal level fusion detection method - Google Patents


Info

Publication number
CN111427036A
Authority
CN
China
Prior art keywords
radar
frame
value
video data
sweep
Prior art date
Legal status
Pending
Application number
CN202010290262.1A
Other languages
Chinese (zh)
Inventor
赵玉丽
商凯
翟海涛
陈硕
刘文
徐勇
吴贝贝
陈凌
龙超
童建文
Current Assignee
Nanjing Laisi Electronic Equipment Co ltd
Original Assignee
Nanjing Laisi Electronic Equipment Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Laisi Electronic Equipment Co., Ltd.
Priority to CN202010290262.1A
Publication of CN111427036A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66: Radar-tracking systems; Analogous systems
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/02: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 13/00
    • G01S 7/41: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00 of systems according to group G01S 13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S 7/418: Theoretical aspects

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiment of the application discloses a short-baseline multi-radar signal level fusion detection method comprising the following steps: selecting a registration radar from among multiple radars and converting the other radars into the registration radar's coordinate system for spatial registration; performing noise floor normalization on the other radars' echo detection values; normalizing the azimuth quantization values and range quantization values of all the radars; establishing the inter-frame time relationship of each resolution cell across the radars, and solving and storing a value function by designing a corresponding state transfer window; performing threshold detection on the final-state value function to obtain the target states of the targets to be detected; and backtracking all the target states to obtain the detection tracks of all the targets to be detected. With this method, constant-false-alarm detection yields the tracks of the detected targets, the original detection information is fully used, and the joint detection performance is improved to the greatest extent.

Description

Short-baseline multi-radar signal level fusion detection method
Technical Field
The invention relates to the field of multi-radar signal joint detection, in particular to a short-baseline multi-radar signal level fusion detection method.
Background
Multi-radar joint detection fully exploits the characteristics and advantages of each radar and improves comprehensive detection capability by fusing the observations, processing and decisions of multiple radars. Multi-radar joint detection is now widely applied, but attention has mainly been paid to the performance of individual radars. In the prior art, multi-radar data fusion is performed after detection, so much useful information in the original signal is inevitably lost at the detection stage and the advantages of multiple radars are not fully exploited.
In 1985, Y. Barniv proposed a track-before-detect algorithm based on dynamic programming (DP-TBD), which greatly improves the detection power of a single radar and its ability to detect weak, small targets.
Disclosure of Invention
The purpose of the invention is as follows: the invention aims to provide a short-baseline multi-radar signal level fusion detection method that solves the problem of joint detection of weak, small targets in a radar networking environment, improving the detection of weak, small targets in complex environments and increasing radar detection power.
In order to solve the technical problem, the invention provides a short-baseline multi-radar signal level fusion detection method comprising the following steps:
Step 1: selecting a registration radar from two or more radars, converting the other radars into the coordinate system of the registration radar, and carrying out spatial registration of the radars;
Step 2: performing noise floor normalization on the echo detection values of the other radars according to the noise statistics of the registration radar's noise region;
Step 3: carrying out azimuth-quantization normalization and range-quantization normalization on the echo detection values of the common detection area of all the radars;
Step 4: establishing the inter-frame time relationship of each resolution cell across the radars, designing a corresponding state transfer window according to the time relationship, and solving and storing the value function;
Step 5: performing threshold detection on the final-state value function to obtain the target states of the targets to be detected;
Step 6: backtracking all the target states to obtain the detection tracks of all the targets to be detected.
Further, in an implementation, the radars in step 1 are short-baseline radars comprising radar A and other radars, and step 1 includes:
if radar A is the registration radar, its coordinate system is the registration radar coordinate system, i.e. the registration coordinate system, and the video signals of the other radars are converted into the registration coordinate system according to the following formulas:

$$\theta'_C = \arctan\frac{y_B + \rho_C \sin\theta_C}{x_B + \rho_C \cos\theta_C}$$

$$\rho'_C = \sqrt{x_B^2 + y_B^2 + \rho_C^2 + 2 x_B \rho_C \cos\theta_C + 2 y_B \rho_C \sin\theta_C}$$

where point O is the center of the registration radar, i.e. the origin of the registration radar coordinate system, $O_B(x_B, y_B)$ is the rectangular coordinate of the other radar's center in the registration coordinate system, point C is a point in the other radar's coordinate system, $(\rho_C, \theta_C)$ is the polar coordinate of point C in the other radar's coordinate system, and $(\rho'_C, \theta'_C)$ is the polar coordinate of point C in the registration coordinate system.
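As an illustration, a minimal numerical sketch of this conversion might look as follows (Python; the function and argument names are ours, not part of the patent). It places point C in the registration frame's Cartesian coordinates and converts back to polar form, which reproduces the two formulas above:

```python
import numpy as np

def to_registration_frame(rho_c, theta_c, x_b, y_b):
    """Convert point C, given as (rho_c, theta_c) in another radar's polar
    frame, to the registration radar's polar frame; (x_b, y_b) is the other
    radar's center in the registration radar's Cartesian frame."""
    # Cartesian position of C in the registration frame
    x = x_b + rho_c * np.cos(theta_c)
    y = y_b + rho_c * np.sin(theta_c)
    rho_p = np.hypot(x, y)                    # rho'_C, law-of-cosines form
    theta_p = np.arctan2(y, x) % (2 * np.pi)  # theta'_C in [0, 2*pi)
    return rho_p, theta_p
```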
Further, in an implementation, step 2 includes:
if radar A is the registration radar, performing noise floor normalization on the echo detection values of the other radars according to the following formula:

$$s'_r = s_r \cdot \frac{\hat{\mu}_A}{\hat{\mu}_r}$$

where $s'_r$ is the noise-floor-normalized echo detection value of another radar, $s_r$ is that radar's echo detection value, $\hat{\mu}_A$ is the noise mean estimate of the registration radar, and $\hat{\mu}_r$ is the noise mean estimate of the other radar; the registration radar's noise mean estimate $\hat{\mu}_A$ is the noise statistic of the registration radar's noise region.
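A hedged sketch of this normalization (Python; the names are ours, and the noise regions are assumed to be target-free far-range blocks, as the embodiment below describes):

```python
import numpy as np

def normalize_noise_floor(s_other, noise_block_reg, noise_block_other):
    """Scale another radar's echo detection values so that its estimated
    noise mean matches the registration radar's. noise_block_* are arrays
    of detection values taken from target-free regions of each radar."""
    mu_reg = float(np.mean(noise_block_reg))      # registration noise mean
    mu_other = float(np.mean(noise_block_other))  # other radar noise mean
    return s_other * (mu_reg / mu_other)
```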
Further, in an implementation, the azimuth-quantization and range-quantization normalization of step 3 normalizes each frame of echo detection values of the common detection area of all the radars to M × N sample data, where the echo detection values comprise azimuth sample data and range sample data, M is the total number of azimuth cells, N is the total number of range cells, and M and N are positive integers.
Step 3 includes: Step 3-1: quantize one frame of azimuth sample data to M azimuth cells, correspondingly converting each radar's sweep echo azimuth code from 0-360 degrees to 0 to M-1;
normalize the azimuth quantization values, i.e. normalize the azimuth sample data to M azimuth cells, according to the following formula:

$$\theta' = \left\lfloor \frac{\theta}{360} \cdot M \right\rfloor$$

where $\theta'$ is the azimuth code after azimuth-quantization normalization, with value range 0 to M-1, and $\theta$ is the azimuth angle of each radar echo sweep, in degrees;
Step 3-2: select the radar with the smallest sampling range cell size among all the radars and normalize the range quantization values of the other radars. If the smallest sampling range cell size is $R_{unit\text{-}min}$ and the range cell size of a radar other than that radar is $R_{unit\text{-}other}$, normalize the range quantization values of the other radars according to the following formula:

$$d'_r = \left\lfloor d_r \cdot \frac{R_{unit\text{-}other}}{R_{unit\text{-}min}} \right\rfloor$$

where $d'_r$ is the range value of a range sample of such a radar after range-quantization normalization, with value range 0 to N-1, and $d_r$ is the range sample value of that radar before normalization.
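The two quantization normalizations can be sketched as follows (Python; the rounding convention is our assumption, since the extracted formulas did not survive):

```python
import math

def normalize_azimuth(theta_deg, m_cells=8192):
    """Map a sweep azimuth in degrees [0, 360) to an azimuth code 0..M-1."""
    return int(theta_deg / 360.0 * m_cells) % m_cells

def normalize_range_index(d, r_unit_other, r_unit_min):
    """Rescale a range-cell index sampled at r_unit_other meters per cell
    onto the common grid whose cell size is r_unit_min meters."""
    return math.floor(d * r_unit_other / r_unit_min)
```

For example, with radar B's 45 m cells mapped onto a 30 m grid, normalize_range_index(4000, 45, 30) gives cell 6000.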
Further, in an implementation, in step 4, if the size of each radar's per-frame video data after the azimuth-quantization and range-quantization normalization of step 3 is M × N, the K normalized frames of each radar's video data are arranged in due-north time order, recorded as $t_1, t_2, \dots, t_K$, where the time of each sweep of each frame of video data is denoted $t_{ki}$; k denotes the frame number, k = 1, 2, ..., K; i denotes the sweep number, i.e. the azimuth cell number, i = 0, 1, 2, ..., M-1; j denotes the range cell number, j = 0, 1, 2, ..., N-1; and (i, j) denotes the resolution cell with sweep number i and range cell number j.
Step 4 includes: Step 4-1: according to each radar's video echo arrangement order, starting from frame number k = 1, i.e. time t = $t_1$, initialize the value function $I_1(i, j)$ of each resolution cell (i, j) of the frame-1 radar video data:
for all sweeps i = 0, 1, 2, ..., M-1, determine the order in which the i-th sweep arrives among all the radars, and initialize the frame-1 value function and state function with the echo detection value of the i-th sweep of the radar it reaches first:

$$I_1(i,j) = Z_1(i,j)$$
$$\Psi_1(i,j) = 0$$

where $I_1(i, j)$ is the value function of the frame-1 video data at resolution cell (i, j) and $Z_1(i, j)$ is the echo detection value of the frame-1 video data at resolution cell (i, j). $\Psi_k(i, j)$ is the state function of the frame-k video data at resolution cell (i, j): the association between the current frame's state variable and the previous frame's state variable, where the state variable is the position state of the target to be detected, with value range (i, j), i ∈ {0, 1, ..., M-1}, j ∈ {0, 1, ..., N-1}. The state function $\Psi_1(i, j)$ of the frame-1 video data at resolution cell (i, j) is initialized to 0;
Step 4-2: value function recursion; at frame number k = 2, i.e. time t = $t_2$, begin computing the frame-2 value function:
for all sweeps i = 0, 1, 2, ..., M-1, determine the order in which the i-th sweep arrives among all the radars, take the echo detection value of the i-th sweep of the radar it reaches first as the frame-2 i-th sweep data, and compute the time difference $\Delta t_{2i}$ between the same sweep of the frame-2 and frame-1 video data:

$$\Delta t_{2i} = t_{2i} - t_{1i}, \quad i = 0, 1, \dots, M-1$$

where $t_{2i}$ is the time of the frame-2 video data at sweep i and $t_{1i}$ is the time of the frame-1 video data at sweep i;
according to the motion model of the target to be detected, compute the target transfer window $q_{2i}$ of all radar detection values in the i-th sweep of the frame-2 video data:

$$q_{2i} = f(\Delta t_{2i})$$

where $q_{2i}$, the target transfer window of all radar detection values in the i-th sweep of the frame-2 video data, is the range of positions over which the target to be detected can move within time $\Delta t_{2i}$, and f(·) is the motion model of the target to be detected. Solve the frame-2 value function and state function according to:

$$I_2(i,j) = Z_2(i,j) + \max_{x_1 \in q_{2i}} I_1(x_1)$$
$$\Psi_2(i,j) = \arg\max_{x_1 \in q_{2i}} I_1(x_1)$$

where $I_2(i, j)$ is the value function of the frame-2 video data at resolution cell (i, j), $Z_2(i, j)$ is the echo detection value of the frame-2 video data at resolution cell (i, j), $\Psi_2(i, j)$ is the state function of the frame-2 video data at (i, j), $I_1(x_1)$ denotes the value function of the frame-1 video data at $x_1$, and $x_1$ is a value of the state variable within the target transfer window $q_{2i}$ in the frame-1 video data;
Step 4-3: at frame k, time t = $t_k$, begin computing the frame-k value function:
for all sweeps i = 0, 1, 2, ..., M-1, determine the order in which the i-th sweep arrives among all the radars, take the echo detection value of the i-th sweep of the radar it reaches first as the frame-k i-th sweep data, and compute the time difference $\Delta t_{ki}$ between the same sweep of the frame-k and frame-(k-1) video data:

$$\Delta t_{ki} = t_{ki} - t_{(k-1)i}$$

where $t_{ki}$ is the time of the frame-k video data at sweep i and $t_{(k-1)i}$ is the time of the frame-(k-1) video data at sweep i;
compute the target transfer window $q_{ki}$ of all radar detection values in the i-th sweep of the frame-k video data:

$$q_{ki} = f(\Delta t_{ki})$$

where $q_{ki}$ is the target transfer window of all radar detection values in the i-th sweep of the frame-k video data;
compute the frame-k value function and state function:

$$I_k(i,j) = Z_k(i,j) + \max_{x_{k-1} \in q_{ki}} I_{k-1}(x_{k-1})$$
$$\Psi_k(i,j) = \arg\max_{x_{k-1} \in q_{ki}} I_{k-1}(x_{k-1})$$

where $I_k(i, j)$ is the value function of the frame-k video data at resolution cell (i, j), $Z_k(i, j)$ is the echo detection value of the frame-k video data at resolution cell (i, j), $\Psi_k(i, j)$ is the state function of the frame-k video data at resolution cell (i, j), $I_{k-1}(x_{k-1})$ denotes the value function of the frame-(k-1) video data at $x_{k-1}$, and $x_{k-1}$ is a value of the state variable within the target transfer window $q_{ki}$ in the frame-(k-1) video data.
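A compact sketch of this value-function recursion (Python; the array layout and names are ours, azimuth wrap-around is ignored for brevity, and the transfer window is taken as a rectangle of per-sweep half-widths):

```python
import numpy as np

def dp_tbd_value_function(Z, windows):
    """Value-function recursion of steps 4-1 to 4-3. Z[k] is the k-th
    normalized M x N video frame; windows[k][i] gives the (azimuth, range)
    half-widths of the transfer window q_ki for sweep i of frame k."""
    K, M, N = Z.shape
    I = np.zeros((K, M, N))
    Psi = np.zeros((K, M, N, 2), dtype=np.int64)  # back-pointers (i, j)
    I[0] = Z[0]                                   # step 4-1: I_1 = Z_1
    for k in range(1, K):                         # steps 4-2 and 4-3
        for i in range(M):
            di, dj = windows[k][i]
            for j in range(N):
                i0, i1 = max(0, i - di), min(M, i + di + 1)
                j0, j1 = max(0, j - dj), min(N, j + dj + 1)
                win = I[k - 1, i0:i1, j0:j1]      # window into frame k-1
                bi, bj = np.unravel_index(np.argmax(win), win.shape)
                I[k, i, j] = Z[k, i, j] + win[bi, bj]  # accumulate value
                Psi[k, i, j] = (i0 + bi, j0 + bj)      # best previous state
    return I, Psi
```

At the embodiment's 8192 × 9000 grid this brute-force loop would be far too slow in pure Python; it only fixes the structure of the recursion.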
Further, in an implementation, in step 5, at the final state, i.e. the frame at time t = $t_K$, threshold detection is performed on the final-state value function of the target to be detected; if the final-state value function of a cell under test exceeds the threshold, a target plot is confirmed at that cell and the target state of the target to be detected is obtained:

$$\hat{X}_K = \left\{ x_K : I_K(x_K) > V_T \right\}$$

where $\hat{X}_K$ is the set of all target state variable values at the position points where the frame-K value function $I_K(x_K)$ exceeds the threshold $V_T$, and the threshold $V_T$ is a constant-false-alarm threshold computed from the distribution of the value function in the noise region and the false alarm rate.
Further, in an implementation, for the set of all target state variable values $\hat{X}_K$, the detection tracks of all the targets to be detected are backtracked through the following formula:

$$\hat{x}_k = \Psi_{k+1}(\hat{x}_{k+1}), \quad k = K-1, \dots, 1$$

where $\hat{x}_k$ is the target state estimate of the frame-k video data, $\Psi_{k+1}$ is the association between the state variable of the frame-(k+1) video data and the state variable of frame k, and $\hat{x}_{k+1}$ is the target state estimate of the frame-(k+1) video data;
this yields the track estimate of the target to be detected over times 1 to K:

$$\hat{X} = \{\hat{x}_1, \hat{x}_2, \dots, \hat{x}_K\}$$
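Steps 5 and 6 together amount to thresholding the last value-function frame and walking the back-pointers, roughly as follows (Python; continues the sketch above, names ours):

```python
import numpy as np

def detect_and_backtrack(I, Psi, v_t):
    """Threshold the final-state value function I[K-1] against a CFAR
    threshold v_t, then trace each detection back through the Psi
    back-pointers to recover its track over frames 1..K."""
    K = I.shape[0]
    tracks = []
    for i, j in zip(*np.where(I[-1] > v_t)):      # target states at frame K
        track = [(int(i), int(j))]
        for k in range(K - 1, 0, -1):             # x_k = Psi_{k+1}(x_{k+1})
            ci, cj = track[-1]
            track.append(tuple(int(v) for v in Psi[k, ci, cj]))
        tracks.append(track[::-1])                # chronological order
    return tracks
```

How v_t is obtained from the noise-region value-function distribution and the desired false alarm rate is left to the CFAR calculation the patent refers to.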
has the advantages that: the invention provides a short-baseline multi-radar signal level fusion detection method, which adopts a DP-TBD algorithm of multi-radar video signals, and utilizes the antenna asynchronism of a plurality of radars to calculate the inter-frame interval according to the sweep timestamp information of inter-radar correlation processing, and adaptively sets different search criteria, so that the target tracking of the previous frame is more accurate, and the accuracy of the correlation track is improved compared with the single-radar DP-TBD algorithm which adopts a fixed search criterion to perform correlation accumulation on inter-frame signals.
Compared with the conventional multi-radar data fusion detection method, the short-baseline multi-radar signal level fusion detection method provided by the invention can realize the fusion of multi-radar video signals in a signal layer. More frames of radar video signals can be received in the same time, so that the available information is increased; the fused video has short interframe space and small target position transfer area, so that the performance of correlation and accumulation of interframe data of TBD is improved; the fluctuation characteristics of the target can be changed due to different working parameters of a plurality of radars, the detection performance is further improved, and the detection power is increased.
Drawings
In order to illustrate the technical solution of the present invention more clearly, the drawings needed in the embodiments are briefly described below; those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of two-radar DP-TBD signal fusion in a short-baseline multi-radar signal level fusion detection method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a common detection region of two radars in the short-baseline multi-radar signal level fusion detection method according to the embodiment of the present invention;
fig. 3 is a schematic diagram illustrating the detection effect of two-radar DP-TBD on a simulated target in the short-baseline multi-radar signal level fusion detection method according to the embodiment of the present invention;
fig. 4 is a schematic diagram of a coordinate transformation relationship between two radars in the short-baseline multi-radar signal level fusion detection method provided in the embodiment of the present invention;
fig. 5 is a schematic diagram of a state transition window in a short-baseline multi-radar signal level fusion detection method according to an embodiment of the present invention;
fig. 6 is a schematic diagram illustrating the sequencing of two radar videos in the short-baseline multi-radar signal level fusion detection method according to the embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
In this embodiment, fusing the echo signals of multiple radars refers to fusing the echo signal data of the radars' common detection area. The method applies to multiple radars deployed as a short-baseline station, i.e. radars arranged close together so that they are located at approximately the same origin. Without loss of generality, taking two radars as an example, the short-baseline multi-radar signal level fusion detection method provided by the invention, realized on the basis of the DP-TBD algorithm, is described in detail below with reference to figs. 1-6.
DP-TBD-based signal fusion detection of multiple radars preprocesses the two or more channels of received radar video data (spatial registration, noise power normalization, azimuth-quantization normalization and range-quantization normalization), associates and accumulates the echoes along hypothesized tracks in sweep order without setting a detection threshold, and, after a certain number of frames has been obtained, detects weak targets and their track points according to a specific detection criterion.
fig. 1 is a schematic diagram of two-radar DP-TBD signal fusion in the short-baseline multi-radar signal level fusion detection method according to the embodiment of the present invention. The short-baseline multi-radar signal level fusion detection method described in this embodiment includes:
Step 1: selecting a registration radar from two or more radars, converting the other radars into the coordinate system of the registration radar, and carrying out spatial registration of the radars. In this embodiment, without loss of generality, taking two radars, radar A and radar B, as an example, radar A is selected as the registration radar and radar B is converted into the registration radar's coordinate system, i.e. the radars are spatially registered.
Step 2: performing noise floor normalization on the echo detection values of the other radars according to the noise statistics of the registration radar's noise region. In this embodiment, noise floor normalization is performed on the echo detection values of radar B according to the noise statistics of radar A's noise region; the echo detection values of a far-range region of radar A are selected as the noise region.
Step 3: carrying out azimuth-quantization normalization and range-quantization normalization on the echo detection values of the common detection area of all the radars;
Step 4: establishing the inter-frame time relationship of each resolution cell across the radars, designing a corresponding state transfer window according to the time relationship, and solving and storing the value function;
Step 5: performing threshold detection on the final-state value function to obtain the target states of the targets to be detected;
Step 6: backtracking all the target states to obtain the detection tracks of all the targets to be detected.
Specifically, and without loss of generality, this embodiment sets up signal-level fusion of two radars: short-baseline radar A and radar B are deployed 100 meters apart. One frame of radar A video echo detection values has $M_A \times N_A$ = 8192 × 10000 samples with range cell size $R_{unit\text{-}A}$ = 30 m; one frame of radar B video echo detection values has $M_B \times N_B$ = 4096 × 6000 samples with range cell size $R_{unit\text{-}B}$ = 45 m. The common detection area of the two radars is therefore within 270 km, as shown in fig. 2, a schematic diagram of the common detection area of the two radars in the short-baseline multi-radar signal level fusion detection method provided in the embodiment of the present invention. Two simulated targets of -1 dB are generated; the radar A video echo detection values obey a Rayleigh distribution with distribution coefficient 10, the radar B video echo detection values obey a Rayleigh distribution with distribution coefficient 8, and K is set to 6 frames. The specific implementation steps are as follows:
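For reference, the embodiment's parameters can be collected in one place (a hypothetical configuration sketch; the names are ours):

```python
# Hypothetical parameter set mirroring the embodiment's figures.
RADAR_A = dict(azimuth_cells=8192, range_cells=10000, range_unit_m=30,
               noise_dist="rayleigh", noise_scale=10)
RADAR_B = dict(azimuth_cells=4096, range_cells=6000, range_unit_m=45,
               noise_dist="rayleigh", noise_scale=8)
BASELINE_M = 100        # station separation of radar A and radar B
COMMON_AREA_KM = 270    # common detection area of the two radars
K_FRAMES = 6            # number of fused frames
TARGET_SNR_DB = -1      # the two simulated weak targets
```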
in the short-baseline multi-radar signal level fusion detection method according to this embodiment, spatial registration is performed in step 1, the radar in step 1 is a short-baseline radar, the radar includes a radar a and other radars, and step 1 includes:
if the radar A is a registration radar, the coordinate system of the radar A is a registration radar coordinate system, namely a registration coordinate system, and the video signals of other radars are converted into the registration coordinate system according to the following formula:
Figure BDA0002450121940000091
ρ′C=xB 2+yB 2C 2+2xBρCcosθC+2yBρCsinθC
wherein the point O is the center of the circle of the registration radar, i.e. the origin of the coordinate system of the registration radar, OB(xB,yB) Is rectangular coordinate of the center of the other radar in the coordinate system, and the point C is a point in the coordinate system of the other radar, (rho)CC) Is the polar coordinate of point C in the other radar coordinate system, (ρ'C,θ′C) The polar coordinates of point C in the registered coordinate system.
Specifically, in this embodiment, as shown in fig. 4, a schematic diagram of a coordinate transformation relationship between two radars in the short-baseline multi-radar signal level fusion detection method provided in the embodiment of the present invention is shown. In the short-baseline multi-radar signal level fusion detection method according to this embodiment, if the radar a is a registration radar, a coordinate system of the radar a is a registration radar coordinate system, that is, a registration coordinate system, and a video echo detection value of the radar B is converted into the registration coordinate system according to the following formula:
Figure BDA0002450121940000092
ρ′C=xB 2+yB 2C 2+2xBρCcosθC+2yBρCsinθC
wherein the point O is the center of the circle of the registration radar, i.e. the origin of the coordinate system of the registration radar, OB(xB,yB) Is rectangular coordinate of the center of the radar B in the coordinate system of the registration, and the point C is a point in the coordinate system of the radar B (rho)CC) Is the polar coordinate of point C in the radar B coordinate system, (ρ'C,θ′C) The polar coordinates of point C in the registered coordinate system.
In the short-baseline multi-radar signal level fusion detection method according to this embodiment, step 2 normalizes the noise floors of the multiple radar echoes using the noise statistics of a far-range region.
Specifically, in this embodiment, denote the radar A echo detection values by $s_A(i, j)$; a far-range block of 8192 × 1000 radar echo detection values, i.e. 8192 azimuth samples × 1000 range samples, is selected, and the noise mean of radar A is estimated as $\hat{\mu}_A$. Denote the radar B echo detection values by $s_B(i, j)$; a far-range block of 4096 × 1000 radar echo detection values, i.e. 4096 azimuth samples × 1000 range samples, is selected, and the noise mean of radar B is estimated as $\hat{\mu}_B$. The noise floor of the radar B echo detection values is then normalized according to the following formula:

$$s'_B(i,j) = s_B(i,j) \cdot \frac{\hat{\mu}_A}{\hat{\mu}_B}$$
in the short-baseline multi-radar signal level fusion detection method of this embodiment, azimuth quantization and distance quantization normalization are performed through step 3, the normalization of the azimuth quantization value and the normalization of the distance quantization value in step 3 are performed, that is, each frame of echo detection values in a common detection area of all radars is normalized to M × N sample data, the echo detection values include azimuth sample data and distance sample data, in the sample data, M is the total number of azimuth cells, N is the total number of distance cells, and M and N are positive integers;
the step 3 comprises the following steps: step 3-1, if one frame of azimuth sampling data is quantized to M azimuth units, correspondingly converting the sweep echo azimuth code of each radar from 0-360 degrees to 0-M-1; specifically, in this embodiment, the step 3-1 quantizes one frame of azimuth sampling data to 8192 azimuth units, and correspondingly converts the sweep echo azimuth code of each radar from 0 to 360 degrees to 0 to 8191;
normalizing the azimuth quantized values, i.e. normalizing the azimuth sample data to M azimuth cells, according to the following formula:
Figure BDA0002450121940000111
theta' is an azimuth code subjected to azimuth quantization value normalization, the value range is 0-M-1, theta is an azimuth angle swept by each radar echo, and the unit is degree;
specifically, in this embodiment, the code of the radar a is 0 to 8191, normalization is not required, and the azimuth quantization value of the radar B is normalized according to the following formula, that is, the azimuth sampling data is normalized to 8192 azimuth units:
Figure BDA0002450121940000112
the azimuth code after azimuth quantization value normalization is carried out, the value range is 0-8191, theta is the azimuth angle swept by the radar B echo, and the unit is degree;
step 3-2: selecting the radar with the minimum sampling distance unit size from all radars, normalizing the distance quantization values of the radars except the radar with the minimum sampling distance unit size, and if the sampling distance unit size of the radar with the minimum sampling distance unit size is Runit-minThe radar other than the radar having the smallest sampling range cell size has a range cell size value of Runit-otherThe range cell size value R of the radar other than the radar whose sampling range cell size is the smallest is expressed by the following formulaunit-otherNormalization of the distance quantization values is performed:
Figure BDA0002450121940000113
wherein d isr' is a distance value of radar except the radar with the minimum sampling distance unit size after distance quantization value normalization of certain distance sampling data of the radar, the value range is 0-N-1, and drAnd sampling distance values before normalization for radar distance data except the radar with the smallest sampling distance unit size.
Specifically, in this embodiment, in step 3-2, a value with the minimum sampling distance unit size is selected from the radar to perform normalization of the distance quantization value, and this embodiment is describedIn the embodiment, the smallest sampling distance unit size in the radar is a sampling Runit-min30m, the common detection area is 270km, and the distance quantization value is normalized
Figure BDA0002450121940000121
The distance cell size value R of the radar B is calculated according to the following formulaunit-BNormalization of the distance quantization values was performed as 45 m:
Figure BDA0002450121940000122
wherein d isr' is a distance value of radar B after certain distance sampling data is normalized by a distance quantization value, the value range is 0-8999, drAnd normalizing the distance value before the radar certain distance sampling data is acquired.
In the short-baseline multi-radar signal level fusion detection method according to this embodiment, the value function of step 4 is computed exactly as set out in steps 4-1 to 4-3 above. In this embodiment, the values of i and j in the state variable's value range (i, j) correspond one-to-one with the values of i and j of the resolution cell (i, j); the state variable refers to the target position state, and (i, j) can represent the position of a target.
Specifically, in this embodiment, after the two radars' azimuth and range quantization values are normalized in step 3, the K = 6 normalized frames of radar video data, each of size M × N = 8192 × 9000, are arranged in due-north time order, recorded as $t_1, t_2, \dots, t_K$; fig. 6 is a schematic diagram illustrating the sequencing of the two radar videos in the short-baseline multi-radar signal level fusion detection method provided in the embodiment of the present invention. The time of each sweep of each frame of video data is denoted $t_{ki}$; k denotes the frame number, k = 1, 2, ..., K; i denotes the sweep number, i.e. the azimuth cell number, i = 0, 1, 2, ..., 8191; j denotes the range cell number, j = 0, 1, 2, ..., 8999; (i, j) denotes the resolution cell with sweep number i and range cell number j.
Step 4-1: according to the video echo arrangement order of the two radars, starting from frame number k = 1, i.e. time t = $t_1$, initialize the value function of each resolution cell (i, j) of the two radars:
for all sweeps i = 0, 1, 2, ..., 8191, determine the order in which the i-th sweep of the two radars arrives, and initialize the frame-1 value function and state function with the echo detection value of the i-th sweep that arrives first:

$$I_1(i,j) = Z_1(i,j)$$
$$\Psi_1(i,j) = 0$$

where $I_1(i, j)$ is the value function of the frame-1 video data at resolution cell (i, j), $Z_1(i, j)$ is the echo detection value of the frame-1 video data at resolution cell (i, j), and $\Psi_k(i, j)$ is the state function of the frame-k video data at resolution cell (i, j), i.e. the association between the state variable and the previous frame's state variable; the state variable here refers to the position state of the target, with value range (i, j), i ∈ {0, 1, ..., M-1}, j ∈ {0, 1, ..., N-1}; the state function $\Psi_1(i, j)$ of the frame-1 video data at resolution cell (i, j) is initialized to 0.
Step 4-2: value function recursion; at frame number k = 2, i.e. time t = $t_2$, begin computing the frame-2 value function:
for all sweeps i = 0, 1, 2, ..., 8191, determine the order in which the i-th sweep of the two radars arrives, take the i-th sweep data of the radar it reaches first as the frame-2 i-th sweep data, and compute the time difference $\Delta t_{2i}$ between the same sweep of the frame-2 and frame-1 video data:

$$\Delta t_{2i} = t_{2i} - t_{1i}, \quad i = 0, 1, \dots, M-1$$

where $t_{2i}$ is the time of the frame-2 video data at sweep i and $t_{1i}$ is the time of the frame-1 video data at sweep i;
according to the motion model of the target to be detected, compute the target transfer window $q_{2i}$ of all radar detection values in the i-th sweep of the frame-2 video data:

$$q_{2i} = f(\Delta t_{2i})$$

where $q_{2i}$, the target transfer window of all radar detection values in the i-th sweep of the frame-2 video data, is the range of positions over which the target to be detected can move within time $\Delta t_{2i}$, and f(·) is the motion model of the target to be detected. Solve the frame-2 value function and state function according to:

$$I_2(i,j) = Z_2(i,j) + \max_{x_1 \in q_{2i}} I_1(x_1)$$
$$\Psi_2(i,j) = \arg\max_{x_1 \in q_{2i}} I_1(x_1)$$

where $I_2(i, j)$ is the value function of the frame-2 video data at resolution cell (i, j), $Z_2(i, j)$ is the echo detection value of the frame-2 video data at resolution cell (i, j), $\Psi_2(i, j)$ is the state function of the frame-2 video data at (i, j), $I_1(x_1)$ denotes the value function of the frame-1 video data at $x_1$, and $x_1$ is a value of the state variable within the target transfer window $q_{2i}$ in the frame-1 video data.
the process of calculating the target transfer window by the target motion model comprises the following steps: assume that the state transition window of the state unit (i, j) of the k-th frame is qki={(L1,L2)},k=2,...,K,L1,L2The unit is meter, and the value is determined according to the speed range of the target and the inter-frame time interval Δ t, as shown in fig. 5, which is a schematic diagram of a state transition window in the short-baseline multi-radar signal level fusion detection method provided by the embodiment of the invention. Where i is the azimuth cell variable and j is the range cell variable. Suppose that the maximum radial velocity of an object of interest in a detection scene is vr-maxMaximum tangential velocity vt-maxThen the state transition window is calculated as f (Δ t) { (L)1,L2)}={(vr-max×Δt,vt-max×Δt)}。
In this embodiment, assuming that the time of one sweep is constant, in practical application, each sweep of the radar echo is marked with a timestamp, and the timestamps of the ith, i-0, 1,2, …, and 8191 sweeps in the kth frame are tkiThen the inter-frame spacing time difference of the sweepModel is Δ tki=tki-t(k-1)i,k=1,2,…,K
The state transition window model is qki=f(Δtki)={(vt-max×Δtki,vr-max×Δtki) V, value v in this embodimentr-max=100m/s,vt-max=100m/s。
Step 4-3: at the k-th frame, t is tkCalculating the kth frame value function at the moment;
for all sweeps i of the k-th frame radar video data, 0,1,2, …,8191, the following operations are performed: determining the sequence of the arrival of the ith sweep in the two radars, taking the ith sweep data which arrives first as the ith sweep data of the kth frame, and calculating the time difference delta t between the kth frame data and the same sweep of the kth-1 frame dataki
Δtki=tki-t(k-1)i
Calculating a target transfer window size q for all radar detection values in an ith sweep in a kth frame of video dataki
qki=f(Δtki)
Calculating a kth frame value function and a state value:
Figure BDA0002450121940000171
Figure BDA0002450121940000172
wherein, Ik(i, j) is a function of the value of the k-th frame of video data at the resolution element (i, j), Zk(i, j) is the echo detection value of the k frame video data at the resolution unit (i, j), Ψk(I, j) is the association relation between the state variable of the k frame video data at (I, j) and the state variable of the previous frame, Ik-1(xk-1) Indicating that the k-1 frame video data is in xk-1Value function of (a), xk-1For the target transfer window q in the k-1 frame video datakiThe value of the state variable in.
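The adaptive window can be sketched as follows (Python; converting the tangential extent into azimuth cells requires the local arc length per cell, which depends on range and is not specified by the patent, so azimuth_arc_m below is an assumption):

```python
import math

def transfer_window(t_ki, t_prev_i, v_r_max=100.0, v_t_max=100.0,
                    range_unit_m=30.0, azimuth_arc_m=100.0):
    """Adaptive transfer window q_ki = f(dt): between two timestamps of the
    same sweep, the target moves at most v * dt meters in each direction.
    azimuth_arc_m is the assumed arc length per azimuth cell at the
    target's range."""
    dt = t_ki - t_prev_i
    l1_m = v_r_max * dt                   # radial extent L1 in meters
    l2_m = v_t_max * dt                   # tangential extent L2 in meters
    dj = math.ceil(l1_m / range_unit_m)   # half-width in range cells
    di = math.ceil(l2_m / azimuth_arc_m)  # half-width in azimuth cells
    return di, dj
```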
In the short-baseline multi-radar signal level fusion detection method according to this embodiment, in step 5, at the final state, i.e. the frame at time t = $t_K$, threshold detection is performed on the final-state value function of the target to be detected; if the final-state value function of a cell under test exceeds the threshold, a target plot is confirmed at that cell and the target state of the target to be detected is obtained:

$$\hat{X}_K = \left\{ x_K : I_K(x_K) > V_T \right\}$$

where $\hat{X}_K$ is the set of all target state variable values at the position points where the frame-K value function $I_K(x_K)$ exceeds the threshold $V_T$, and the threshold $V_T$ is a constant-false-alarm threshold computed from the distribution of the value function in the noise region and the false alarm rate. Specifically, in the present embodiment this detection is carried out at the frame time t = $t_K$ with K = 6.
In the short-baseline multi-radar signal level fusion detection method of this embodiment, in step 6, for the set of all target state variable values $\hat{X}_K$, the detection tracks of all the targets to be detected are backtracked through the following formula:

$$\hat{x}_k = \Psi_{k+1}(\hat{x}_{k+1}), \quad k = K-1, \dots, 1$$

where $\hat{x}_k$ is the target state estimate of the frame-k video data, $\Psi_{k+1}$ is the association between the state variable of the frame-(k+1) video data and the state variable of frame k, and $\hat{x}_{k+1}$ is the target state estimate of the frame-(k+1) video data;
this yields the track estimate of the target to be detected over times 1 to K:

$$\hat{X} = \{\hat{x}_1, \hat{x}_2, \dots, \hat{x}_K\}$$

Specifically, in this embodiment, the detection tracks of all detected targets are backtracked with this same formula, giving the track estimates of the targets to be detected over times 1 to K.
fig. 3 is a schematic diagram illustrating the detection effect of two-radar DP-TBD on simulated targets in the short-baseline multi-radar signal level fusion detection method according to the embodiment of the present invention; K is set to 6 in this embodiment, and fig. 3 shows the detection result for the two simulated targets of -1 dB.
The invention adopts a DP-TBD algorithm for multi-radar video signals. Compared with the single-radar DP-TBD algorithm, which associates and accumulates inter-frame signals with a fixed search criterion, the invention exploits the antenna asynchrony of multiple radars, computes the inter-frame interval from the sweep timestamp information of the inter-radar association processing, and adaptively sets different search criteria, so that tracking from the previous frame is more accurate and the accuracy of the associated track is improved.
Compared with the conventional multi-radar data fusion detection method, the short-baseline multi-radar signal level fusion detection method provided by the invention fuses the multi-radar video signals at the signal level. More frames of radar video signal are received in the same time, increasing the available information; the fused video has a short inter-frame interval and a small target position transfer region, improving the association and accumulation performance of TBD inter-frame data; and the differing operating parameters of the radars change the target's fluctuation characteristics, further improving detection performance and increasing detection power.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program, and when the program is executed, the program may include some or all of the steps in each embodiment of the short-baseline multi-radar signal level fusion detection method provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a Random Access Memory (RAM), or the like.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts in the various embodiments in this specification may be referred to each other. The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (7)

1. A short-baseline multi-radar signal level fusion detection method, characterized by comprising the following steps:
step 1: selecting a registration radar from two or more radars, converting the other radars into the coordinate system of the registration radar, and carrying out spatial registration of the radars;
step 2: performing noise floor normalization on the echo detection values of the other radars according to the noise statistics of the registration radar's noise region;
step 3: carrying out azimuth-quantization normalization and range-quantization normalization on the echo detection values of the common detection area of all the radars;
step 4: establishing the inter-frame time relationship of each resolution cell across the radars, designing a corresponding state transfer window according to the time relationship, and solving and storing the value function;
step 5: performing threshold detection on the final-state value function to obtain the target states of the targets to be detected;
step 6: backtracking all the target states to obtain the detection tracks of all the targets to be detected.
2. The short-baseline multi-radar signal level fusion detection method according to claim 1, wherein the radars in step 1 are short-baseline radars comprising radar A and other radars, and step 1 includes:
if radar A is the registration radar, its coordinate system is the registration radar coordinate system, i.e. the registration coordinate system, and the video signals of the other radars are converted into the registration coordinate system according to the following formulas:

$$\theta'_C = \arctan\frac{y_B + \rho_C \sin\theta_C}{x_B + \rho_C \cos\theta_C}$$

$$\rho'_C = \sqrt{x_B^2 + y_B^2 + \rho_C^2 + 2 x_B \rho_C \cos\theta_C + 2 y_B \rho_C \sin\theta_C}$$

where point O is the center of the registration radar, i.e. the origin of the registration radar coordinate system, $O_B(x_B, y_B)$ is the rectangular coordinate of the other radar's center in the registration coordinate system, point C is a point in the other radar's coordinate system, $(\rho_C, \theta_C)$ is the polar coordinate of point C in the other radar's coordinate system, and $(\rho'_C, \theta'_C)$ is the polar coordinate of point C in the registration coordinate system.
3. The short-baseline multi-radar signal level fusion detection method of claim 2, wherein step 2 comprises:
if radar A is the registration radar, performing noise floor normalization on the echo detection values of the other radars according to the following formula:

$$s'_r = s_r \cdot \frac{\hat{\mu}_A}{\hat{\mu}_r}$$

where $s'_r$ is the noise-floor-normalized echo detection value of another radar, $s_r$ is that radar's echo detection value, $\hat{\mu}_A$ is the noise mean estimate of the registration radar, and $\hat{\mu}_r$ is the noise mean estimate of the other radar; the registration radar's noise mean estimate $\hat{\mu}_A$ is the noise statistic of the registration radar's noise region.
4. The method according to claim 1, wherein the azimuth-quantization and range-quantization normalization of step 3 normalizes each frame of echo detection values of the common detection area of all the radars to M × N sample data, the echo detection values comprising azimuth sample data and range sample data, where M is the total number of azimuth cells, N is the total number of range cells, and M and N are positive integers;
step 3 comprises: step 3-1: quantizing one frame of azimuth sample data to M azimuth cells, correspondingly converting each radar's sweep echo azimuth code from 0-360 degrees to 0 to M-1;
normalizing the azimuth quantization values, i.e. normalizing the azimuth sample data to M azimuth cells, according to the following formula:

$$\theta' = \left\lfloor \frac{\theta}{360} \cdot M \right\rfloor$$

where $\theta'$ is the azimuth code after azimuth-quantization normalization, with value range 0 to M-1, and $\theta$ is the azimuth angle of each radar echo sweep, in degrees;
step 3-2: selecting the radar with the smallest sampling range cell size among all the radars and normalizing the range quantization values of the other radars; if the smallest sampling range cell size is $R_{unit\text{-}min}$ and the range cell size of a radar other than that radar is $R_{unit\text{-}other}$, normalizing the range quantization values of the other radars according to the following formula:

$$d'_r = \left\lfloor d_r \cdot \frac{R_{unit\text{-}other}}{R_{unit\text{-}min}} \right\rfloor$$

where $d'_r$ is the range value of a range sample of such a radar after range-quantization normalization, with value range 0 to N-1, and $d_r$ is the range sample value of that radar before normalization.
5. The method according to claim 1, wherein in step 4, if the normalized azimuth quantization value and the normalized distance quantization value of each radar in step 3 have a video data size of M × N per frame, the normalized K frames of video data of each radar are arranged in a due north time sequence, and the due north time sequence is denoted as t1,t2,…,tKWherein the time of each sweep of each frame of video data is denoted as tkiK denotes a frame number, K is 1,2, …, K, i denotes a sweep number, i.e., an azimuth cell number, i is 0,1,2, …, M-1, j denotes a distance cell number, j is 0,1,2, …, N-1, and (i, j) denotes a resolution cell whose sweep number is i and distance cell number is j;
the step 4 comprises the following steps: step 4-1, according to the video echo arrangement order of each radar, starting from frame number k = 1, i.e. time t = t1, initializing the value function I1(i, j) of each resolution cell (i, j) of the 1st frame of radar video data, comprising:
for all sweeps with sweep number i = 0, 1, 2, …, M-1, determining the order in which the ith sweep arrives among all radars, and initializing the 1st-frame value function and state function with the echo detection value of the first-arriving ith sweep:
I1(i,j)=Z1(i,j)
Ψ1(i,j)=0
where I1(i, j) is the value function of the 1st frame of video data at resolution cell (i, j), Z1(i, j) is the echo detection value of the 1st frame of video data at resolution cell (i, j), and Ψk(i, j) is the state function of the kth frame of video data at resolution cell (i, j); the state function is the association between the current-frame state variable and the previous-frame state variable, the state variable refers to the position state of the target to be detected and takes values (i, j), with i ∈ (0, 1, …, M-1) and j ∈ (0, 1, …, N-1); the state function Ψ1(i, j) of the 1st frame of video data at resolution cell (i, j) is initialized to 0;
step 4-2, recursion of the value function: at frame number k = 2, i.e. time t = t2, calculation of the 2nd-frame value function begins;
for all sweeps with sweep number i = 0, 1, 2, …, M-1, determining the order in which the ith sweep arrives among all radars, taking the echo detection value of the first-arriving ith sweep as the sweep data of the ith sweep of the 2nd frame, and calculating the time difference Δt2i between the same sweep of the 2nd frame and the 1st frame of video data:
Δt2i=t2i-t1i,i=0,1,...,M-1
where t2i is the time of the 2nd frame of video data at sweep number i, and t1i is the time of the 1st frame of video data at sweep number i;
calculating the target transfer window q2i of all radar detection values in the ith sweep of the 2nd frame of video data according to the motion model of the target to be detected:
q2i=f(Δt2i)
where q2i is the target transfer window of all radar detection values in the ith sweep of the 2nd frame of video data, i.e. the range of positions over which the target to be detected can move within the time Δt2i, and f(·) is the motion model of the target to be detected; the 2nd-frame value function and state function are solved according to the following formulas:
I2(i, j) = Z2(i, j) + max_{x1 ∈ q2i} I1(x1)
Ψ2(i, j) = arg max_{x1 ∈ q2i} I1(x1)
where I2(i, j) is the value function of the 2nd frame of video data at resolution cell (i, j), Z2(i, j) is the echo detection value of the 2nd frame of video data at resolution cell (i, j), Ψ2(i, j) is the state function of the 2nd frame of video data at (i, j), I1(x1) denotes the value function of the 1st frame of video data at x1, and x1 is a value of the state variable within the target transfer window q2i in the 1st frame of video data;
step 4-3, at frame k, i.e. time t = tk, calculation of the kth-frame value function begins;
for all sweeps with sweep number i = 0, 1, 2, …, M-1, determining the order in which the ith sweep arrives among all radars, taking the echo detection value of the first-arriving ith sweep as the sweep data of the ith sweep of the kth frame, and calculating the time difference Δtki between the same sweep of the kth frame and the (k-1)th frame of video data:
Δtki=tki-t(k-1)i
where tki is the time of the kth frame of video data at sweep number i, and t(k-1)i is the time of the (k-1)th frame of video data at sweep number i;
calculating the target transfer window qki of all radar detection values in the ith sweep of the kth frame of video data:
qki = f(Δtki)
where qki is the target transfer window of all radar detection values in the ith sweep of the kth frame of video data;
calculating the kth-frame value function and state function:
Ik(i, j) = Zk(i, j) + max_{xk-1 ∈ qki} Ik-1(xk-1)
Ψk(i, j) = arg max_{xk-1 ∈ qki} Ik-1(xk-1)
where Ik(i, j) is the value function of the kth frame of video data at resolution cell (i, j), Zk(i, j) is the echo detection value of the kth frame of video data at resolution cell (i, j), Ψk(i, j) is the state function of the kth frame of video data at resolution cell (i, j), Ik-1(xk-1) denotes the value function of the (k-1)th frame of video data at xk-1, and xk-1 is a value of the state variable within the target transfer window qki in the (k-1)th frame of video data.
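Illustrative sketch (not part of the claims): the value-function recursion of steps 4-1 to 4-3 is a dynamic-programming (Viterbi-style) accumulation. The sketch below assumes the first-arriving sweeps have already been merged into per-frame M × N echo maps Z[k], and simplifies the transfer window qki = f(Δtki) to a fixed half-width q in both azimuth and range; all names are illustrative.

    import numpy as np

    def dp_tbd(Z, q):
        # Z: (K, M, N) echo detection values per frame and resolution cell.
        # q: transfer window half-width (cells the target may move per frame).
        K, M, N = Z.shape
        I = Z[0].copy()                          # I1(i, j) = Z1(i, j)
        Psi = np.zeros((K, M, N, 2), dtype=int)  # Psi1(i, j) = 0
        for k in range(1, K):
            I_new = np.empty((M, N))
            for i in range(M):
                for j in range(N):
                    # States of frame k-1 inside the transfer window of (i, j).
                    i0, i1 = max(0, i - q), min(M, i + q + 1)
                    j0, j1 = max(0, j - q), min(N, j + q + 1)
                    win = I[i0:i1, j0:j1]
                    bi, bj = np.unravel_index(np.argmax(win), win.shape)
                    # Ik(i, j) = Zk(i, j) + max of Ik-1 over the window;
                    # Psik(i, j) stores the maximizing predecessor state.
                    I_new[i, j] = Z[k, i, j] + win[bi, bj]
                    Psi[k, i, j] = (i0 + bi, j0 + bj)
            I = I_new
        return I, Psi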
6. The method according to claim 1, wherein in step 5, at the final frame time t = tK, threshold detection is performed on the final-state value function of the target to be detected; if the final-state value function of a cell under test is greater than the threshold, it is confirmed that a target plot exists at the cell under test, and the target state of the target to be detected is obtained, comprising:
X̂K = {xK : IK(xK) > VT}

where X̂K is the set of all target state variable values xK at which the Kth-frame value function IK(xK) is greater than the threshold VT, and the threshold VT is a constant false alarm threshold calculated from the distribution characteristic of the value function over the noise region and the false alarm rate.
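Illustrative sketch (not part of the claims): the constant false alarm threshold detection of step 5, assuming VT is realized as an empirical quantile of the final value function over a pure-noise region, which is one simple reading of "calculated according to the distribution characteristic of the noise region value function and the false alarm rate"; names are illustrative.

    import numpy as np

    def cfar_threshold(noise_value_fn, pfa):
        # Empirical (1 - pfa) quantile of the final value function over
        # a pure-noise region (assumed realization of the threshold VT).
        return np.quantile(noise_value_fn, 1.0 - pfa)

    def detect_targets(I_final, v_t):
        # All state variable values whose final value function exceeds VT.
        return [tuple(x) for x in np.argwhere(I_final > v_t)]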
7. The method of claim 1, wherein in step 6, for the set X̂K of all target state variable values, the detection tracks of all targets to be detected are backtracked by the following formula:
x̂k = Ψk+1(x̂k+1), k = K-1, …, 1

where x̂k is the target state estimate of the kth frame of video data, Ψk+1 is the association between the state variable of the (k+1)th frame of video data and the state variable of the kth frame, and x̂k+1 is the target state estimate of the (k+1)th frame of video data, k = K-1, …, 1;
obtaining the track estimate of the target to be detected over times 1 to K:

X̂ = {x̂1, x̂2, …, x̂K}.
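Illustrative sketch (not part of the claims): the track backtracking of step 6 follows the stored state functions from each detected final state back to frame 1; it reuses the Psi array from the recursion sketch above (0-based frame indices).

    def backtrack(Psi, x_final):
        # x_final: detected state (i, j) at the last frame.
        K = Psi.shape[0]
        track = [tuple(x_final)]
        for k in range(K - 1, 0, -1):
            i, j = track[-1]
            # x_k = Psi_{k+1}(x_{k+1}): predecessor stored during recursion.
            track.append(tuple(Psi[k, i, j]))
        track.reverse()
        return track  # [x1_hat, ..., xK_hat]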
CN202010290262.1A 2020-04-14 2020-04-14 Short-baseline multi-radar signal level fusion detection method Pending CN111427036A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010290262.1A CN111427036A (en) 2020-04-14 2020-04-14 Short-baseline multi-radar signal level fusion detection method

Publications (1)

Publication Number Publication Date
CN111427036A true CN111427036A (en) 2020-07-17

Family

ID=71557936

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010290262.1A Pending CN111427036A (en) 2020-04-14 2020-04-14 Short-baseline multi-radar signal level fusion detection method

Country Status (1)

Country Link
CN (1) CN111427036A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190321719A1 (en) * 2015-10-06 2019-10-24 Google Llc Radar-Enabled Sensor Fusion
CN106842165A (en) * 2017-03-16 2017-06-13 电子科技大学 One kind is based on different distance angular resolution radar centralization asynchronous fusion method
CN108089183A (en) * 2017-11-28 2018-05-29 西安电子科技大学 A kind of detecting and tracking integral method for asynchronous multi-static radar system
JP2019219373A (en) * 2018-06-20 2019-12-26 ラプソド ピーティーイー リミテッド Radar and camera-based data fusion
CN110988808A (en) * 2019-12-11 2020-04-10 中国电子科技集团公司第二十研究所 Two-coordinate shipborne radar signal level fusion method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHAI HAITAO: "Adaptive rain and cloud clutter suppression technology based on video signals", Command Information System and Technology *
ZHAO ZONGGUI: "Information Fusion Engineering Practice: Technology and Methods", 31 July 2015, National Defense Industry Press *
GUO KAIDE: "Research on cooperative detection technology of netted radar", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113608213A (en) * 2021-08-03 2021-11-05 哈尔滨工业大学 A joint detection method of marine targets based on information fusion of marine radar

Similar Documents

Publication Publication Date Title
CN107861107B (en) Double-threshold CFAR (computational fluid dynamics) and trace point agglomeration method suitable for continuous wave radar
CN106054169B (en) Multistation Radar Signal Fusion detection method based on tracking information
CN108363054B (en) Passive radar multi-target tracking method for single-frequency network and multi-path propagation
CN107436427B (en) Spatial target motion track and radiation signal correlation method
CN106468771B (en) A kind of multi-target detection and tracking method under high clutter conditions of low Observable
CN107576959A (en) Tracking before a kind of Gao Zhongying Radar Targets'Detection based on area maps ambiguity solution
CN106443598A (en) Convolutional neural network based cooperative radar network track deception jamming discrimination method
CN112731307B (en) RATM-CFAR detector based on distance-angle joint estimation and detection method
CN111398948B (en) Maneuvering small target track association method under strong clutter background
CN110146850B (en) Particle filter centralized tracking method for multi-base radar out-of-sequence measurement fusion
CN107346020B (en) A Distributed Batch Estimation Fusion Method for Asynchronous Multistatic Radar Systems
CN111999735B (en) Dynamic and static target separation method based on radial speed and target tracking
CN106772299B (en) A Dynamic Programming Detection Method for PD Radar Weak Targets Based on Distance Matching
CN107436434B (en) Track Inception Method Based on Bidirectional Doppler Estimation
CN114415123B (en) Non-coherent neighborhood based weighting pulse accumulation processing method and system
CN110954895A (en) Tracking method before speed filtering detection based on complex pseudo-spectrum
CN113537417A (en) Target identification method and device based on radar, electronic equipment and storage medium
CN110308442A (en) GM-PHD target tracking method for phased array radar in strong clutter environment
CN109521420A (en) Based on the matched multi-object tracking method of multiple features
CN111413693A (en) A Combination Method of TBD and Conventional Tracking Based on Dual Threshold Split Processing in MIMO Radar
CN111427036A (en) Short-baseline multi-radar signal level fusion detection method
CN113608193A (en) Radar multi-target distance and speed estimation method based on UNet
CN108828584B (en) Tracking-before-detection method for multi-frequency target based on track folding factor deblurring
CN106950550B (en) High dynamic deviation on-line estimation method based on cross-fuzzy interval judgment under condition of range finding and speed measuring ambiguity
CN117310642A (en) Multi-radar dense-cluster target track association method based on multi-scale clustering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200717)