
CN117784069A - Real-time target fusion recognition method based on motion and scattering characteristics - Google Patents

Real-time target fusion recognition method based on motion and scattering characteristics

Info

Publication number
CN117784069A
CN117784069A
Authority
CN
China
Prior art keywords
target
scattering
rocket
motion
targets
Prior art date
Legal status
Pending
Application number
CN202311841980.3A
Other languages
Chinese (zh)
Inventor
宋艳琴
李春雨
张晓东
杨晓骞
郝云胜
陈勇
王艳旭
王君博
任俊平
吴海超
Current Assignee
No63729 Troops Pla
Original Assignee
No63729 Troops Pla
Priority date
Filing date
Publication date
Application filed by No63729 Troops Pla
Priority to CN202311841980.3A
Publication of CN117784069A
Legal status: Pending


Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a real-time target fusion recognition method based on motion and scattering characteristics, which comprises the following steps. Data preprocessing: analyzing the frame format of the real-time data of the phased array radar and screening the data. Feature extraction: processing the R, A, E and RCS data transmitted by the phased array radar in real time and extracting the target motion characteristics and scattering characteristics. Single-feature judgment: normalizing the extracted features of different types and selecting a corresponding algorithm for each to perform recognition judgment. Multi-feature fusion: carrying out decision-layer fusion judgment on the recognition results of the different feature categories of the same target. Multi-target display: transmitting the recognition results of the multiple targets to a three-dimensional display system and identifying different target types with different colors. The method effectively combines the motion and scattering characteristics, can recognize multiple targets in multiple scenes in real time, and improves the recognition accuracy of the phased array radar.

Description

Real-time target fusion recognition method based on motion and scattering characteristics
Technical Field
The invention relates to the field of phased array radar target recognition, in particular to a real-time target fusion recognition method based on motion and scattering characteristics.
Background
With the development of aerospace technology, space launch activities have become more frequent. A carrier rocket must perform multiple separation actions during flight: each sub-stage of the rocket (including parts such as the boosters, the first sub-stage, the second sub-stage and the fairing) is jettisoned in succession, and a large number of small separation objects such as explosion bolts and nuts are generated. The phased array radar can simultaneously track and measure multiple targets such as the carrier rocket body, rocket sub-stages and fragments, and it is an important measurement means in a space launch task. In order to send the spacecraft into space accurately, the phased array radar needs to recognize the multiple targets after detecting the separated objects, so that accurate tracking is realized and measurement data of the specific targets are acquired.
One existing method performs space target recognition using narrow-band phased array radar measurement information; it relies mainly on matching against electromagnetic scattering characteristics, taking mutually separated confidence intervals as the standard set and recognizing targets by the minimum-Mahalanobis-distance matching principle. This method has three defects. First, the structure of the test target data set is simple and the scattering properties of the measured data are idealized. Second, only the scattering characteristics of the targets are used; when targets fly densely in formation, multiple targets are mixed, their scattering characteristics are aliased, and the targets are difficult to judge accurately from the RCS alone. Third, the confidence intervals and feature vectors must be determined from measured data, so the method cannot be applied to unknown scenes. Another existing method performs space target recognition using narrow-band phased array radar measurement information by matching sequence curves of scattering features. This method also has three defects. First, it is strongly limited by the model: it can only recognize rocket models present in the simulation library and cannot recognize unknown models. Second, when the change of the target attitude angle is small, the matched section of the change curve is short, and a failure to recognize or a mismatch easily occurs. Third, in engineering applications the data accumulation time is long, so the recognition result is not output in a timely manner.
Therefore, the invention provides a real-time target fusion recognition method based on motion and scattering characteristics.
Disclosure of Invention
In order to overcome the defects and shortcomings in the prior art, the invention provides a real-time target fusion identification method based on motion and scattering characteristics.
The technical scheme adopted by the invention is that the real-time target fusion recognition method based on the motion and scattering characteristics comprises the following steps:
step S1, data preprocessing: the real-time data of the phased array radar are transmitted in a preset format; firstly, data analysis is carried out according to the frame-format convention and abnormal values are removed, obtaining the screened target track and scattering data, and sliding-window processing is carried out on the real-time data stream according to the window length and step size in a sequential mode;
step S2, feature extraction: r, A, E, RCS data transmitted by the phased array radar in real time are processed, and target motion characteristics and scattering characteristics are extracted;
step S3, single feature judgment: normalizing the extracted different types of features and respectively selecting corresponding algorithms to perform identification judgment;
step S4, multi-feature fusion: carrying out decision layer fusion judgment on the different types of feature recognition results of the same target;
step S5, multi-target display: and sending the recognition result of the multiple targets to a three-dimensional display system, and identifying different target types by using different colors.
Further, the step S2 includes:
S2.1, analyzing the scattering characteristics of the phased array radar: target characteristics are analyzed in combination with the physical characteristics presented by different targets during a space launch, and a separability analysis of the scattering features of multi-scene multi-target data is carried out using a separability criterion based on intra-class and inter-class distances, obtaining a feature combined mean and a cumulative autocorrelation mean;
and S2.2, analyzing the motion characteristics: the motion characteristics are those of the particle motion and of the rotation around the target's centre of mass, described by the position, speed, acceleration and mechanical-energy characteristic quantities of the target's motion.
Further, the step S2.1 includes:
step S2.1.1, phased array radar scattering feature extraction: the scattering cross-sectional area measures the target's ability to scatter the phased array radar echo, and the scattering cross-sectional area of a complex target is expressed as:
$$\sigma = A_T \cdot r_T \cdot D_T$$
where $A_T$ is the projected cross-sectional area of the target, i.e. the projected area of the target on a plane perpendicular to the line of sight of the phased array radar, which depends on the radar observation direction and on the geometric outline and size of the target; $r_T$ is the "reflectivity" of the material constituting the target surface, defined as the ratio of the reflected power density to the incident power density at any point of the surface, a ratio not greater than 1; and $D_T$ is the "directivity coefficient" of the target's scattering, which depends on the pose of the target relative to the phased array radar;
S2.1.2, analyzing the scattering characteristics of different targets: in a space launch task the fragment targets are small in volume, so their mean scattering cross-sectional area is small;
step S2.1.3, scattering cross-sectional area feature evaluation: feature separability between the two targets, rocket body and rocket sub-stage, is measured using a separability criterion based on intra-class and inter-class distances, and the cumulative autocorrelation mean is obtained, which distinguishes the two targets well.
Further, the step S3 includes:
Step S3.1, scattering feature judgment: the multiple targets are classified and recognized with a support vector machine classification algorithm based on the extracted combined mean and cumulative autocorrelation mean of the multi-target scattering features; first, features are extracted from the accumulated standard test data set, then suitable classifier parameters are tuned, and training generates a recognition template used for real-time target recognition and result-probability evaluation of the different targets.
Step S3.2, motion feature judgment: space targets, thrust targets and non-thrust targets are distinguished by height, speed and mechanical energy; the thrust target is the rocket body, and the non-thrust targets are divided between rocket fragments and rocket sub-stages according to this result, with the probability shared equally between the two (probability equipartition).
Further, the step S4 includes:
step S4.1, establishing the sample space Ω: the recognition frame sample space is {rocket body, rocket sub-stage, rocket fragments, space target}, whose elements enumerate all possible outcomes of the uncertainty problem; the recognition result is atomic, with only four possible results {rocket body}, {rocket sub-stage}, {rocket fragments} and {space target}, each called a proposition A, and the power set is not considered;
step S4.2, establishing the basic probability assignment function M of the evidence: each of the four propositions $A \subseteq \Omega$ corresponds to a number $M(A) \in [0,1]$, called the basic probability number of proposition A, i.e. probability 1 is distributed over the four subsets of the frame Ω; there are evidences from two different information channels, motion and scattering, i.e. two mutually independent basic probability assignment functions $M_1$ and $M_2$; after a target is recognized from its motion features, its basic probability assignment $M_1$ is obtained, with $M_1(\{\text{rocket body}\}) + M_1(\{\text{rocket sub-stage}\}) + M_1(\{\text{rocket fragments}\}) + M_1(\{\text{space target}\}) = 1$; after the target is recognized from its scattering features, its basic probability assignment $M_2$ is obtained, with $M_2(\{\text{rocket body}\}) + M_2(\{\text{rocket sub-stage}\}) + M_2(\{\text{rocket fragments}\}) + M_2(\{\text{space target}\}) = 1$.
Further, the step S4 further includes:
S4.3, calculating the evidence-theory normalization constant K, which represents the degree of conflict among multiple independent evidences;
S4.4, calculating the combined basic probability assignment (BPA), i.e. the mass function, using the conclusion-uncertainty fusion rule and the Dempster combination rule; the BPA of the rocket body is calculated as $M(\{\text{rocket body}\}) = \frac{M_1(\{\text{rocket body}\})\,M_2(\{\text{rocket body}\})}{1 - K}$, and the BPA of the other targets is calculated in the same way;
step S4.5, calculating the belief function value and the likelihood function value of each element of the current target sample space; for the recognition system the belief function, the likelihood function and the combined mass function values are equal, i.e. $Bel(A) = Pl(A) = M(A)$.
the beneficial effects are that:
the invention provides a real-time target fusion recognition method based on motion and scattering characteristics, which utilizes a phased array radar to observe real-time data flow to recognize a space target, solves the problems that unknown scenes cannot be recognized and algorithm engineering cannot be realized due to the limitation of models, explores the universality characteristics of scattering characteristics of different targets, can apply the unknown scenes, effectively combines the motion and scattering characteristics, can recognize multiple targets in multiple scenes in real time, has the recognition accuracy of more than 90 percent, improves the accuracy of phased array radar recognition, and can be applied to the existing phased array radar equipment and convergence center target recognition system.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a rocket body normalized RCS sequence diagram of the present invention;
FIG. 3 is a rocket-grade normalized RCS sequence diagram of the present invention;
FIG. 4 is a schematic diagram of the feature separability of the present invention;
FIG. 5 is a plot of a multi-objective normalized feature profile of the present invention;
FIG. 6 is a flow chart of the classification of the target motion feature support vector machine of the present invention;
FIG. 7 is a flow chart of object motion feature identification in accordance with the present invention;
fig. 8 is a block diagram of real-time fusion identification in accordance with the present invention.
Detailed Description
It should be noted that, provided they do not conflict, the embodiments of the present application and the features in the embodiments may be combined with each other; the present application will be further described in detail below with reference to the drawings and the specific embodiments.
As shown in fig. 1, a real-time target fusion recognition method based on motion and scattering features comprises the following specific steps:
Step S1, data preprocessing: the real-time data of the phased array radar are transmitted in a preset format. First, data analysis is carried out according to the frame-format convention and abnormal values are removed, obtaining the screened target track and scattering data; the real-time data stream is then processed with a sliding window of given window length and step size in a sequential mode.
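As an illustration of this sequential sliding-window step, the following Python sketch segments a screened real-time stream; the window length, step size and the 3-sigma outlier screen are illustrative assumptions, not values fixed by the method:

```python
import numpy as np

def sliding_windows(stream, win_len=64, step=16):
    """Cut a 1-D measurement stream into overlapping windows
    of length win_len, advancing step samples at a time."""
    windows = []
    for start in range(0, len(stream) - win_len + 1, step):
        windows.append(stream[start:start + win_len])
    return np.asarray(windows)

# Example: reject outliers with a 3-sigma test, then window the RCS stream.
rcs = np.random.randn(1000)                      # placeholder RCS samples (dBsm)
mask = np.abs(rcs - rcs.mean()) < 3 * rcs.std()  # simple abnormal-value screen
segments = sliding_windows(rcs[mask])
print(segments.shape)                            # (n_windows, 64)
```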
Step S2, feature extraction: the R, A, E and RCS data transmitted by the phased array radar in real time are processed, and the target motion characteristics and scattering characteristics are extracted.
Step S2.1, the radar cross section (RCS) carries the main information of the scattering characteristics; the RCS is a random variable forming a random process, and the radar echo intensities of different targets have different statistical characteristics. The target characteristics are analyzed in combination with the physical characteristics presented by different targets during a space launch, and a separability analysis of the scattering features of multi-scene multi-target data is carried out using a separability criterion based on intra-class and inter-class distances, obtaining a feature combined mean and a cumulative autocorrelation mean with strong universality.
Step S2.1.1, phased array radar scattering cross section (RCS) feature extraction: the RCS measures the target's ability to scatter the radar echo. When a target is illuminated by a radar wave, energy is scattered in all directions; the sum of the scattered field and the incident field forms the total spatial field, and, from the viewpoint of induced currents, the scattered field is the secondary radiation of the induced electromagnetic currents and charges on the object's surface. This spatial distribution of energy, called scattering, depends mainly on the shape, size and structure of the object and on the frequency of the incident wave. It is characterized by the scattering cross-sectional area of the target, a fictitious area that quantitatively represents the intensity with which the target scatters radar waves. Physically, the RCS of a complex target can be expressed as:
$$\sigma = A_T \cdot r_T \cdot D_T$$
where $A_T$ is the projected cross-sectional area of the target, i.e. the projected area of the target on a plane perpendicular to the radar line of sight, which depends on the radar observation direction and on the geometric outline and size of the target; $r_T$ is the "reflectivity" of the material of the target surface, defined as the ratio of the reflected power density at any point of the surface to the incident power density, a ratio not greater than 1; and $D_T$ is the "directivity coefficient" of the target's scattering, which depends on the pose of the target relative to the radar.
The RCS of a target changes with the relative attitude between the target and the radar, so the radar echo intensities of different targets have different statistical characteristics. These statistical characteristics reflect different physical information about the targets and are a means of recognizing them. Let $\{X_k \mid 1 \le k \le L\}$ denote the L values of the target RCS scattering sequence; the analyzed features include the following:
1) Mean: $X_{mean} = \frac{1}{L}\sum_{k=1}^{L} X_k$, where L is the number of RCS sequence values;
2) Maximum: $X_{max} = \max\{X_1, X_2, \ldots, X_L\}$;
3) Minimum: $X_{min} = \min\{X_1, X_2, \ldots, X_L\}$;
4) Median: the RCS sequence is sorted by size, and the middle value is the median;
5) Range: $X_{range} = X_{max} - X_{min}$, where $X_{max}$ is the maximum and $X_{min}$ is the minimum;
6) Standard deviation: $X_{std} = \sqrt{\frac{1}{L}\sum_{k=1}^{L} (X_k - X_{mean})^2}$, where $X_{mean}$ is the mean;
7) Standard mean deviation: $X_{smd} = \frac{1}{L}\sum_{k=1}^{L} \frac{\lvert X_k - X_{mean}\rvert}{X_{std}}$, where $X_{mean}$ is the mean and $X_{std}$ is the standard deviation;
8) Coefficient of variation: $X_{cv} = X_{std} / X_{mean}$, where $X_{mean}$ is the mean and $X_{std}$ is the standard deviation;
9) Kurtosis coefficient: $X_{kur} = \frac{1}{L}\sum_{k=1}^{L} \left(\frac{X_k - X_{mean}}{X_{std}}\right)^4$, where $X_{mean}$ is the mean and $X_{std}$ is the standard deviation;
10) Skewness coefficient: $X_{ske} = \frac{1}{L}\sum_{k=1}^{L} \left(\frac{X_k - X_{mean}}{X_{std}}\right)^3$, where $X_{mean}$ is the mean and $X_{std}$ is the standard deviation;
11) Spectrum mean: $X_{spec} = \frac{1}{N}\sum_{n=1}^{N} \bar{f}(n)$, where N is the number of Fourier-transform points of the RCS sequence, $\bar{f}(n) = f(n)/\max_n f(n)$ is the normalized spectral amplitude of the RCS sequence, and $f(n)$ is the spectral amplitude of the RCS sequence after the N-point Fourier transform;
12) Autocorrelation mean: $X_{acf} = \frac{1}{L}\sum_{k} r(k)$, where $r(k)$ is the symmetric autocorrelation function, $r(k+1) = \sum_{n=1}^{L-k} x(n)\,x(n+k)$, and $x(n)$ is the normalized RCS subsequence;
13) Cumulative autocorrelation mean: $X_{cacf} = \frac{1}{M}\sum_{m=1}^{M} X_{acf}^{(m)}$, where M is the number of currently accumulated segments of the RCS sequence, L is the number of values of the RCS sequence, and $r(k)$ is the symmetric autocorrelation function defined in item 12.
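A compact sketch of these statistics in Python; the function names and the exact normalization of the autocorrelation terms are assumptions made for illustration:

```python
import numpy as np

def rcs_features(x):
    """Statistical features of one RCS window x (1-D array)."""
    m, s = x.mean(), x.std()
    z = (x - m) / s                                   # standardized sequence
    spec = np.abs(np.fft.rfft(x))                     # spectral amplitudes
    r = np.correlate(z, z, mode="full") / len(x)      # symmetric autocorrelation
    return {
        "mean": m, "max": x.max(), "min": x.min(),
        "median": np.median(x), "range": np.ptp(x),
        "std": s, "cv": s / m,
        "smd": np.abs(z).mean(),                      # standard mean deviation
        "kurtosis": (z**4).mean(), "skewness": (z**3).mean(),
        "spectral_mean": (spec / spec.max()).mean(),
        "acf_mean": r.mean(),
    }

def cumulative_acf_mean(segments):
    """Cumulative autocorrelation mean over the M segments seen so far."""
    return np.mean([rcs_features(seg)["acf_mean"] for seg in segments])
```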
Step S2.1.2, scattering feature analysis of different targets: in a space launch task, fragment targets are small in volume, which is reflected in a small RCS mean. Under the action of the attitude-control system, the rocket body target keeps a stable attitude, which is reflected in small fluctuations of the RCS values. Simulation tests show that for a rocket sub-stage the pitch angle gradually decreases in the negative direction until stable, the yaw angle fluctuates regularly during the fall, and the roll angle keeps rolling at the initial rotational angular velocity. Because the rocket sub-stage rolls freely in space and its yaw angle fluctuates regularly, the projected cross-sectional area of the target fluctuates regularly; the RCS is therefore unstable, its values fluctuate strongly, and a certain periodic regularity exists. The normalized rocket body and sub-stage RCS sequences are shown in FIGS. 2 and 3.
Step S2.1.3, RCS feature evaluation: fragment targets are small and easily distinguished, while the rocket body and the rocket sub-stage have similar structures, so feature separability between these two targets is measured using a separability criterion based on intra-class and inter-class distances. The intra-class distance characterizes the degree of aggregation of samples within a class and can be measured by the sum of squared distances between each sample and the centre of its cluster. The inter-class distance characterizes the degree of dispersion between clusters and can be measured by the weighted sum of squared distances between the centre of each cluster and the centre of the whole sample population. When the intra-class distance of a feature clustering result is small and the inter-class distance is large, the classification result has high distinguishability. For this purpose a criterion function J reflecting both the intra-class and the inter-class distance is constructed.
Let $\{X_k^{(j)} \mid 1 \le k \le n_j\}$ be the feature sequence of the class-j target RCS, with $n_j$ samples.
Intra-class distance: $S_w^{(j)} = \sum_{k=1}^{n_j} \left(X_k^{(j)} - \bar{X}^{(j)}\right)^2$, where $\bar{X}^{(j)}$ is the mean of the class-j feature combination;
total intra-class distance: $S_w = \frac{1}{N}\sum_{j=1}^{C} S_w^{(j)}$, where C is the total number of feature classes and N is the total number of all feature samples;
inter-class distance: $S_b = \frac{1}{N}\sum_{j=1}^{C} n_j \left(\bar{X}^{(j)} - \bar{X}\right)^2$, where $\bar{X}^{(j)}$ is the mean of the class-j feature combination and $\bar{X}$ is the mean over all feature samples, C being the total number of feature classes.
Separability criterion function: $J = S_b / S_w$.
The separability measurement for the two types of targets, rocket body and rocket sub-stage, is shown in fig. 4. It can be seen from the figure that the cumulative autocorrelation mean separates the two types of targets better than the other features.
A classifier test on the three standard targets, rocket body, rocket sub-stage and rocket fragments, shows that the two types of features can effectively distinguish the three targets; the normalized feature distribution of the targets is shown in fig. 5. A sketch of the separability criterion follows below.
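The criterion J can be computed as follows; the class data and feature values are placeholders, and the normalization of $S_b$ and $S_w$ follows the reconstruction above:

```python
import numpy as np

def separability_J(classes):
    """Intra/inter-class separability criterion J = S_b / S_w for one
    feature, given a list of 1-D sample arrays (one per class)."""
    all_x = np.concatenate(classes)
    overall_mean = all_x.mean()
    n_total = len(all_x)
    s_w = sum(((c - c.mean())**2).sum() for c in classes) / n_total
    s_b = sum(len(c) * (c.mean() - overall_mean)**2 for c in classes) / n_total
    return s_b / s_w

# Larger J means the feature separates rocket body from sub-stage better.
body = np.random.normal(0.8, 0.05, 200)   # toy cumulative-ACF values
stage = np.random.normal(0.5, 0.05, 200)
print(separability_J([body, stage]))
```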
Step S2.2, motion characteristic analysis
The motion characteristics are the characteristics of the particle motion and of the rotation around the target's centre of mass; they can be described by characteristic quantities such as the position, speed, acceleration and mechanical energy of the target's motion, computed from the measured position sequence $(x_i, y_i, z_i)$, $i = 1, 2, \ldots, M$.
Resultant speed: $v = \sqrt{v_x^2 + v_y^2 + v_z^2}$, where $v_x$, $v_y$ and $v_z$ are the velocity components in the three directions;
height: the geocentric distance of any point on the track is $r = \sqrt{x^2 + y^2 + z^2}$, where $(x, y, z)$ is the position coordinate of the point;
assuming the earth is a biaxial ellipsoid of revolution, the geocentric distance of the ellipsoid surface point at the geocentric latitude $\phi$ of the sub-satellite point is
$$R = \frac{a_e\, b_e}{\sqrt{a_e^2 \sin^2\phi + b_e^2 \cos^2\phi}}$$
where $a_e$ is the semi-major axis of the earth and $b_e$ is the semi-minor axis of the earth;
the height is then calculated as: $h = r - R$.
Mechanical energy: considering that the mass of the target cannot be obtained, in practical applications the change of the target's speed, height and related quantities is mainly reflected through the change of this feature, which serves to divide the target's motion into stages. The target mass is therefore taken as unit mass, i.e. m = 1, and the mechanical energy calculation becomes:
$$E = \frac{1}{2}v^2 + g\,h$$
where $g$ is the gravitational acceleration.
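A minimal sketch of these motion features, assuming unit target mass and a constant gravitational acceleration as above; the ellipsoid constants are standard WGS-84-like values used here only for illustration:

```python
import numpy as np

A_E, B_E = 6378137.0, 6356752.3   # earth semi-major/semi-minor axes (m)
G0 = 9.80665                      # gravitational acceleration (m/s^2)

def motion_features(pos, vel):
    """pos, vel: (M, 3) arrays of ECEF positions (m) and velocities (m/s)."""
    speed = np.linalg.norm(vel, axis=1)               # resultant speed v
    r = np.linalg.norm(pos, axis=1)                   # geocentric distance
    phi = np.arcsin(pos[:, 2] / r)                    # geocentric latitude
    R = A_E * B_E / np.sqrt((A_E * np.sin(phi))**2 + (B_E * np.cos(phi))**2)
    h = r - R                                         # height above ellipsoid
    energy = 0.5 * speed**2 + G0 * h                  # mechanical energy, m = 1
    return speed, h, energy
```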
step S3: single feature judgment: and carrying out normalization processing on the extracted different types of features, and respectively selecting corresponding algorithms to carry out identification judgment.
Step S3.1, scattering feature judgment: the multiple targets are classified and recognized with a support vector machine classification algorithm based on the extracted combined mean and cumulative autocorrelation mean of the multi-target scattering features. First, features are extracted from the accumulated standard test data set; then suitable classifier parameters are tuned, and training generates a recognition template used for real-time target recognition and result-probability evaluation of the different targets. The specific flow is shown in fig. 6.
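A hedged sketch of this scattering-feature classifier using scikit-learn, an assumed toolchain that the patent does not name; the features, labels and classifier parameters are placeholders:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: rows of [combined feature mean, cumulative autocorrelation mean];
# y: labels 0 = rocket body, 1 = rocket sub-stage, 2 = rocket fragments.
X_train = np.random.rand(300, 2)              # placeholder training features
y_train = np.random.randint(0, 3, 300)        # placeholder labels

clf = make_pipeline(StandardScaler(),
                    SVC(kernel="rbf", C=10.0, probability=True))
clf.fit(X_train, y_train)                      # the "recognition template"

x_new = np.array([[0.62, 0.81]])               # one real-time feature vector
print(clf.predict(x_new), clf.predict_proba(x_new))
```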
Step S3.2, motion feature judgment: during a rocket launch the multiple targets span a wide range of heights, so the targets can be roughly classified by height and speed, separating out the targets relevant to the launch. Without considering the change of target mass, by the law of conservation of energy, while the rocket booster is working the rocket body target obtains stable thrust, its speed and height both rise, and its mechanical energy increases. After separation, a rocket sub-stage is in a thrust-free state; its kinetic and potential energy interconvert depending on the ballistic position, and its mechanical energy is maintained or decreases. Therefore space targets, thrust targets and non-thrust targets can be distinguished by height, speed and mechanical energy; the thrust target is the rocket body, and the non-thrust targets are divided between rocket fragments and rocket sub-stages according to this result, with the probability shared equally (probability equipartition). The specific recognition flow is shown in fig. 7.
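This motion-feature judgment can be sketched as a simple rule over height, speed and the mechanical-energy trend; the thresholds below are placeholders, not values from the method:

```python
def classify_by_motion(height, d_energy, space_h=200e3, e_eps=1e3):
    """Coarse motion-feature judgment for one track window.
    height: current height (m); d_energy: mechanical-energy trend (J/kg per window).
    Returns a basic probability assignment over the propositions."""
    if height > space_h and abs(d_energy) < e_eps:
        return {"space target": 1.0}          # high, energy-conserving orbit
    if d_energy > e_eps:
        return {"rocket body": 1.0}           # thrust present: energy rising
    # No thrust: fragments and sub-stage are indistinguishable by motion,
    # so the probability is shared equally (probability equipartition).
    return {"rocket fragments": 0.5, "rocket sub-stage": 0.5}
```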
Step S4, multi-feature fusion: decision-layer fusion judgment is carried out on the recognition results of the different feature categories of the same target.
Multi-feature fusion can be performed, from low level to high, at the data acquisition layer, the feature layer, the score layer or the decision layer. The lower the fusion level, the more complex the implementation and the higher the requirement on data correlation. Data-acquisition-layer fusion fuses the device data directly at the sample level and is the most difficult; feature-layer fusion carries a large amount of information but easily causes the curse of dimensionality; score-layer fusion has low data correlation. The result fusion is therefore performed at the highest level, the decision layer. Decision-layer fusion adopts D-S evidence theory, a combination rule that raises the confidence of an event by simultaneously using evidence from mutually independent information sources. The specific implementation steps are as follows:
1) Establish the sample space Ω. The recognition frame sample space is {rocket body, rocket sub-stage, rocket fragments, space target}; the elements of the set enumerate all possible outcomes of the uncertainty problem. The recognition result of the problem is atomic: there are only four possible results, {rocket body}, {rocket sub-stage}, {rocket fragments} and {space target}, each called a proposition A, and the power set is not considered.
2) Establish the basic probability assignment function M of the evidence. Each of the four propositions $A \subseteq \Omega$ corresponds to a number $M(A) \in [0,1]$, called the basic probability number of proposition A; i.e. probability 1 is distributed over the four subsets of the frame Ω. There are evidences from two different information channels, motion and scattering, i.e. two mutually independent basic probability assignment functions $M_1$ and $M_2$. After a target is recognized from its motion features, its basic probability assignment $M_1$ is obtained, with $M_1(\{\text{rocket body}\}) + M_1(\{\text{rocket sub-stage}\}) + M_1(\{\text{rocket fragments}\}) + M_1(\{\text{space target}\}) = 1$; after the target is recognized from its scattering features, its basic probability assignment $M_2$ is obtained, with $M_2(\{\text{rocket body}\}) + M_2(\{\text{rocket sub-stage}\}) + M_2(\{\text{rocket fragments}\}) + M_2(\{\text{space target}\}) = 1$.
3) Compute the evidence-theory normalization constant K, which represents the degree of conflict among the independent evidences: $K = \sum_{B \cap C = \emptyset} M_1(B)\,M_2(C)$.
4) Compute the combined basic probability assignment (BPA), i.e. the mass function, using the conclusion-uncertainty fusion rule and the Dempster combination rule. The BPA of the rocket body is calculated as $M(\{\text{rocket body}\}) = \frac{M_1(\{\text{rocket body}\})\,M_2(\{\text{rocket body}\})}{1 - K}$, and the BPA of the other targets is calculated in the same way.
5) Compute the belief function value and the likelihood (plausibility) function value of each element of the current target sample space. For this recognition system the belief function, the likelihood function and the combined mass function values are equal, i.e. $Bel(A) = Pl(A) = M(A)$.
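A self-contained sketch of this decision-layer fusion over the four singleton propositions; the example BPA values are illustrative, and the code is one plausible reading of Dempster's rule for this atomic frame:

```python
CLASSES = ("rocket body", "rocket sub-stage", "rocket fragments", "space target")

def dempster_combine(m1, m2):
    """Combine two BPAs over singleton propositions with Dempster's rule."""
    # Conflict K: total mass assigned to incompatible (different) singletons.
    k = sum(m1.get(a, 0.0) * m2.get(b, 0.0)
            for a in CLASSES for b in CLASSES if a != b)
    assert k < 1.0, "evidences are totally conflicting"
    # For singletons, Bel(A) = Pl(A) = M(A) = m1(A) * m2(A) / (1 - K).
    return {c: m1.get(c, 0.0) * m2.get(c, 0.0) / (1.0 - k) for c in CLASSES}

m_motion  = {"rocket body": 0.7, "rocket sub-stage": 0.15,
             "rocket fragments": 0.15}                 # from step S3.2
m_scatter = {"rocket body": 0.6, "rocket sub-stage": 0.3,
             "rocket fragments": 0.1}                  # from step S3.1
fused = dempster_combine(m_motion, m_scatter)
print(max(fused, key=fused.get), fused)                # category attribution
```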
therefore, the credibility of each trust interval can be obtained, and the category attribution of the target can be judged. The whole system algorithm flow is shown in fig. 8:
Step S5, multi-target display: the recognition results of the multiple targets are sent to the three-dimensional display system, and different target types are identified with different colors.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various equivalent changes, modifications, substitutions and alterations can be made to these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (6)

1. The real-time target fusion recognition method based on the motion and scattering characteristics is characterized by comprising the following steps of:
step S1, data preprocessing: the real-time data of the phased array radar are transmitted in a preset format; data analysis is carried out according to the frame-format convention and abnormal values are removed, obtaining the screened target track and scattering data, and sliding-window processing is carried out on the real-time data stream according to the window length and step size in a sequential mode;
step S2, feature extraction: r, A, E, RCS data transmitted by the phased array radar in real time are processed, and target motion characteristics and scattering characteristics are extracted;
step S3, single feature judgment: carrying out normalization processing on the extracted different types of features, and respectively selecting corresponding algorithms to carry out identification judgment;
step S4, multi-feature fusion: carrying out decision layer fusion judgment on the characteristic recognition results of different categories of the same target;
step S5, multi-target display: and sending the recognition result of the multiple targets to a three-dimensional display system, and identifying different target types by using different colors.
2. The method for real-time object fusion recognition based on motion and scattering features as set forth in claim 1, wherein the step S2 includes:
S2.1, analyzing the scattering characteristics of the phased array radar: target characteristics are analyzed in combination with the physical characteristics presented by different targets during a space launch, and a separability analysis of the scattering features of multi-scene multi-target data is carried out using a separability criterion based on intra-class and inter-class distances, obtaining a feature combined mean and a cumulative autocorrelation mean;
and S2.2, analyzing the motion characteristics: the motion characteristics are those of the particle motion and of the rotation around the target's centre of mass, described by the position, speed, acceleration and mechanical-energy characteristic quantities of the target's motion.
3. The method for real-time object fusion recognition based on motion and scattering features according to claim 2, wherein the step S2.1 comprises:
step S2.1.1, phased array radar scattering feature extraction: the scattering cross-sectional area measures the target's ability to scatter the phased array radar echo, and the scattering cross-sectional area of a complex target is expressed as:
$$\sigma = A_T \cdot r_T \cdot D_T$$
where $A_T$ is the projected cross-sectional area of the target, i.e. the projected area of the target on a plane perpendicular to the line of sight of the phased array radar, which depends on the observation direction of the phased array radar and on the geometric outline and size of the target; $r_T$ is the "reflectivity" of the material constituting the target surface, defined as the ratio of the reflected power density to the incident power density at any point of the surface, a ratio not greater than 1; and $D_T$ is the "directivity coefficient" of the target's scattering, which depends on the pose of the target relative to the phased array radar;
S2.1.2, analyzing the scattering characteristics of different targets: in a space launch task the fragment targets are small in volume, so their mean scattering cross-sectional area is small;
step S2.1.3, phased array radar scattering cross-section feature evaluation: feature separability between the two targets, rocket body and rocket sub-stage, is measured using a separability criterion based on intra-class and inter-class distances, and the cumulative autocorrelation mean is obtained, which distinguishes the two targets well.
4. The method for real-time object fusion recognition based on motion and scattering features as set forth in claim 1, wherein the step S3 includes:
step S3.1, scattering feature judgment: the multiple targets are classified and recognized with a support vector machine classification algorithm using the extracted combined mean and cumulative autocorrelation mean of the multi-target scattering features; features are extracted from the accumulated standard test data set, suitable classifier parameters are tuned, and training generates a recognition template for real-time target recognition and result-probability evaluation of the different targets;
step S3.2, motion feature judgment: space targets, thrust targets and non-thrust targets are distinguished by height, speed and mechanical energy; the thrust target is the rocket body, and the non-thrust targets are divided between rocket fragments and rocket sub-stages according to this result, with the probability shared equally (probability equipartition).
5. The method for real-time object fusion recognition based on motion and scattering features as set forth in claim 1, wherein the step S4 includes:
step S4.1, establishing the sample space Ω, which comprises {rocket body, rocket sub-stage, rocket fragments, space target}, whose elements enumerate all possible outcomes of the uncertainty problem; the recognition result is atomic, with only four possible results {rocket body}, {rocket sub-stage}, {rocket fragments} and {space target}, each called a proposition A, and the power set is not considered;
step S4.2, establishing the basic probability assignment function M of the evidence: each of the four propositions $A \subseteq \Omega$ corresponds to a number $M(A) \in [0,1]$, called the basic probability number of proposition A, i.e. probability 1 is distributed over the four subsets of the frame Ω; there are evidences from two different information channels, motion and scattering, i.e. two mutually independent basic probability assignment functions $M_1$ and $M_2$; after the target is recognized from its motion features, its basic probability assignment $M_1$ is obtained, with $M_1(\{\text{rocket body}\}) + M_1(\{\text{rocket sub-stage}\}) + M_1(\{\text{rocket fragments}\}) + M_1(\{\text{space target}\}) = 1$; after the target is recognized from its scattering features, its basic probability assignment $M_2$ is obtained, with $M_2(\{\text{rocket body}\}) + M_2(\{\text{rocket sub-stage}\}) + M_2(\{\text{rocket fragments}\}) + M_2(\{\text{space target}\}) = 1$.
6. The method for real-time object fusion identification based on motion and scattering features of claim 1, wherein the step S4 further comprises:
S4.3, calculating the evidence-theory normalization constant K, which represents the degree of conflict among multiple independent evidences, with the expression: $K = \sum_{B \cap C = \emptyset} M_1(B)\,M_2(C)$;
S4.4, calculating the combined basic probability assignment (BPA), i.e. the mass function, using the conclusion-uncertainty fusion rule and the Dempster combination rule respectively, where the BPA of the rocket body is calculated by the expression: $M(\{\text{rocket body}\}) = \frac{M_1(\{\text{rocket body}\})\,M_2(\{\text{rocket body}\})}{1 - K}$;
step S4.5, calculating the belief function value and the likelihood function value of each element of the current target sample space, where, for the recognition system, the belief function, the likelihood function and the combined mass function values are equal, with the expression: $Bel(A) = Pl(A) = M(A)$.
CN202311841980.3A 2023-12-28 2023-12-28 Real-time target fusion recognition method based on motion and scattering characteristics Pending CN117784069A (en)

Priority Applications (1)

Application Number: CN202311841980.3A
Priority Date / Filing Date: 2023-12-28
Title: Real-time target fusion recognition method based on motion and scattering characteristics

Publications (1)

Publication Number: CN117784069A
Publication Date: 2024-03-29

Family ID: 90386974



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination