CN112784916B - Air target micro-motion parameter real-time extraction method based on multitask convolutional network


Info

Publication number
CN112784916B
CN112784916B (application CN202110134778.1A)
Authority
CN
China
Prior art keywords
layer
micro
multitask
blade
pooling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110134778.1A
Other languages
Chinese (zh)
Other versions
CN112784916A (en)
Inventor
王鹏辉
范雪欣
刘宏伟
丁军
陈渤
纠博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University
Priority to CN202110134778.1A
Publication of CN112784916A
Application granted
Publication of CN112784916B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S 13/50 Systems of measurement based on relative movement of target
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S 7/417 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/048 Activation functions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F 2218/08 Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a real-time method for extracting the micro-motion parameters of aerial targets based on a multitask convolutional network, which aims to solve the problems that traditional estimation methods for aerial target micro-motion parameters have poor real-time performance and cannot extract the parameters independently. The main steps of the invention are: (1) generate a training set; (2) construct a multitask convolutional neural network; (3) train the multitask convolutional neural network; (4) obtain the extracted micro-motion parameter values of the aerial target; (5) acquire the micro-motion parameters. The invention has the advantage of extracting each micro-motion parameter of an aerial target in real time and independently.

Description

Air target micro-motion parameter real-time extraction method based on multitask convolutional network
Technical Field
The invention belongs to the radar technical field and further relates, within the field of radar target recognition, to a real-time method for extracting the micro-motion parameters of aerial targets based on a multitask convolutional network. The invention can be used to extract the physical and motion parameters of the micro-motion components of aerial targets that have such components, such as helicopters, propeller aircraft, and jet aircraft; the extracted parameters can then be used to identify the aerial target.
Background
The physical and motion parameters of a target's micro-motion components can be extracted from the micro-motion signal contained in the target's radar echo. Provided the extraction is sufficiently accurate, the extracted parameters can be compared with the parameters of the micro-motion components of typical aerial target models to accurately identify the aerial target. Common methods for extracting the micro-motion parameters of aerial targets include the Hough transform, the inverse Radon transform, and the short-time Fourier transform. Each of these methods has advantages and disadvantages, and the algorithm and thresholds must be chosen according to the parameter being extracted.
The patent document "Plane target classification method based on rotor wing physical parameter estimation" (application No. 201410662970.8, publication No. CN104330784B), filed by Xidian University, discloses an aircraft target classification method based on rotor physical parameter estimation. Its specific steps are: separate the rotor echo signal from the aircraft radar echo signal and estimate the rotor rotation speed from the time-frequency domain of the rotor echo; perform two-dimensional rotor imaging in that time-frequency domain; preprocess the two-dimensional imaging result; estimate the rotor length and the number of blades from the preprocessed imaging result; and compare the estimated rotor physical parameters (rotation speed ω, rotor length L, and blade number N) with the rotor physical parameters of aircraft in a standard type library to determine the aircraft target type. The drawback of this method is that the curve search performed by the Hough transform is so time-consuming that it cannot meet the real-time requirement of aerial target identification.
The patent document "A helicopter rotor physical parameter extraction method" (application No. 201910519253.2, publication No. CN110133600A), filed by the University of Electronic Science and Technology of China, discloses a helicopter rotor physical parameter extraction method based on time-frequency spectrum image processing. Its specific steps are: first, filter and segment the narrow-band RCS sequence time-frequency spectrum of the helicopter to reduce background noise, improve image clarity, and accurately extract the time-frequency signal lines; then estimate the helicopter rotor rotation period with a least squares method and estimate the number of blades by counting the maximum-bandwidth lines within a single rotation period; finally, estimate the blade length from the relation between spectrum width and blade length. The drawback of this method is that the parameters must be extracted one by one: the micro-motion parameters are mutually dependent, so a parameter that depends on other parameters cannot be extracted independently.
Disclosure of Invention
The invention aims to provide a method for extracting the micro-motion parameters of aerial targets in real time based on a multitask convolutional network, in order to solve the prior-art problems that micro-motion parameter estimation for aerial targets has poor real-time performance and that the extracted parameters are not independent.
The idea behind the invention is as follows. The strong nonlinear mapping capability of a multitask convolutional neural network allows the micro-motion parameters to be extracted quickly, and the separation between the multitask unshared modules allows each parameter to be extracted without depending on the other unshared modules. A multitask convolutional neural network consisting of one multitask shared module and three multitask unshared modules is constructed so that the micro-motion parameters of aerial targets can be extracted in real time. At the same time, because each unshared module is independent of the others, the parameter corresponding to each module is extracted independently, meeting the independence requirement of micro-motion parameter extraction. Convergence of the total loss value ensures that the three parameters are extracted simultaneously and with adequate precision.
The specific steps for realizing the purpose of the invention are as follows:
(1) generating a training set:
(1a) selecting a sample set containing M aerial targets, each with a different number of blades, and at least 1000 radar echo samples per target; the total number of samples in the sample set is N, where M ≥ 3 and N ≥ M × 1000;
(1b) extracting a micro-motion signal in each radar echo sample in the sample set by adopting a CLEAN algorithm;
(1c) carrying out modulus operation on each micro-motion signal to obtain an amplitude sequence of the micro-motion signal;
(1d) combining the attitude angle of the aerial target, calculating the projection of the blade length corresponding to each amplitude sequence onto the radar line of sight to obtain the projected blade length for that sequence, and taking the blade rotation speed, the projected blade length, and the number of blades of the target type corresponding to the sequence as the target values of the three micro-motion parameters for that sequence;
(1e) taking the target values of the three micro-motion parameters corresponding to each amplitude sequence as the label of that sequence;
(1f) forming a training set from the amplitude sequences of all micro-motion signals and their corresponding labels;
(2) constructing a multitask convolutional neural network:
(2a) building a 12-layer multitask shared module whose structure is, in order: first convolution layer, ReLU activation layer, first pooling layer, second convolution layer, ReLU activation layer, second pooling layer, third convolution layer, ReLU activation layer, third pooling layer, fourth convolution layer, ReLU activation layer, fourth pooling layer; the numbers of convolution kernels of the first to fourth convolution layers are set to 16, 64, 128, and 128 in sequence, with all kernel sizes 1 × 5; the first to fourth pooling layers all use max pooling with pooling kernel size 1 × 2 and pooling stride 2;
(2b) building a first multitask unshared module whose structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, fully connected layer; the numbers of convolution kernels of the first and second convolution layers are set to 64 and 32 in sequence, with both kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1;
(2c) building a second multitask unshared module whose structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, third convolution layer, ReLU activation layer, fully connected layer; the numbers of convolution kernels of the first to third convolution layers are set to 64, 32, and 16 in sequence, with all kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1;
(2d) building a third multitask unshared module whose structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, fully connected layer; the numbers of convolution kernels of the first and second convolution layers are set to 64 and 32 in sequence, with both kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1;
(2e) connecting the multitask shared module in series with each of the first, second, and third multitask unshared modules to form the multitask convolutional neural network;
(3) training a multitask convolutional neural network:
inputting the amplitude sequences of the micro-motion signals in the training set into the multitask convolutional neural network, and iteratively updating the parameters of each layer by back-propagation gradient descent until the total loss value of the network converges, yielding the trained multitask convolutional neural network;
(4) obtaining the extracted micro-motion parameter values of the aerial target:
(4a) processing each aerial target radar echo acquired in real time by the same method as steps (1b) and (1c) to obtain the micro-motion signal amplitude sequence of the echo;
(4b) inputting all micro-motion signal amplitude sequences into the trained multitask convolutional neural network, where the first multitask unshared module outputs the extracted blade rotation speed, the second outputs the extracted projected blade length, and the third outputs the extracted blade number;
(4c) combining the extracted projected blade length with the attitude angle of the aerial target, calculating the extracted true blade length, and at the same time rounding the extracted blade number to obtain an integer blade-number value;
(5) obtaining the micro-motion parameters:
taking the extracted blade rotation speed, the extracted true blade length, and the integer blade number as the micro-motion parameters corresponding to the amplitude sequence extracted in real time.
Compared with the prior art, the invention has the following advantages:
First, the invention constructs a multitask convolutional neural network and uses its nonlinear mapping capability to extract the micro-motion parameters quickly. This overcomes the poor real-time performance of prior-art extraction, which must search over many groups of candidate micro-motion parameters or fit curves, so the invention extracts the micro-motion parameters of aerial targets in real time.
Second, the multitask convolutional neural network constructed by the invention contains three multitask unshared modules and uses the total loss value as its loss function, so the three parameters of the micro-motion component are extracted simultaneously and independently. This overcomes the dependency between parameters in prior-art extraction, so the invention extracts each micro-motion parameter of an aerial target independently while remaining real-time.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
The specific steps of the invention are further described below with reference to Fig. 1.
Step 1, generating a training set.
A sample set is selected containing M aerial targets, each with a different number of blades, and at least 1000 radar echo samples per target; the total number of samples in the sample set is N, where M ≥ 3 and N ≥ M × 1000.
The micro-motion signal in each radar echo sample in the sample set is extracted using the CLEAN algorithm.
Specifically, the CLEAN algorithm reconstructs the fuselage component in the radar echo sample and takes the result of subtracting the fuselage component from the radar echo sample as the micro-motion signal.
The modulus operation is then performed on each micro-motion signal to obtain its amplitude sequence.
The modulus operation is calculated as follows:

s_i(t) = |a_i(t)|

where s_i(t) denotes the amplitude of the i-th micro-motion signal of the received aerial target at time t, i = 1, 2, …, N, |·| denotes the modulus operation, and a_i(t) denotes the complex echo value of the i-th micro-motion signal of the received aerial target at time t.
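To make steps (1b) and (1c) concrete, the following is a minimal NumPy sketch, not the patent's exact CLEAN implementation: it assumes the fuselage return dominates the echo spectrum, reconstructs that single spectral component as the fuselage signal, subtracts it, and takes the modulus of the residual as the amplitude sequence.

```python
import numpy as np

def micro_motion_amplitude(echo):
    """echo: 1-D complex radar echo -> amplitude sequence of the micro-motion residual."""
    spec = np.fft.fft(echo)
    k = np.argmax(np.abs(spec))        # strongest spectral line ~ fuselage Doppler
    body = np.zeros_like(spec)
    body[k] = spec[k]                  # reconstructed fuselage (body) component
    micro = echo - np.fft.ifft(body)   # CLEAN-style subtraction of the body return
    return np.abs(micro)               # modulus operation: s_i(t) = |a_i(t)|
```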
Combining the attitude angle of the aerial target, the projection of the blade length corresponding to each amplitude sequence onto the radar line of sight is calculated to obtain the projected blade length for that sequence, and the blade rotation speed, the projected blade length, and the number of blades of the corresponding target type are taken as the target values of the three micro-motion parameters for that sequence.
The projection of the blade length corresponding to each amplitude sequence onto the radar line of sight, combined with the attitude angle of the aerial target, is calculated as:

D_i = L_i · cos β

where D_i denotes the projected blade length after the blade length corresponding to the i-th micro-motion signal amplitude sequence is projected onto the radar line of sight, L_i denotes the blade length corresponding to the i-th micro-motion signal amplitude sequence, cos denotes the cosine operation, and β denotes the pitch angle between the radar and the micro-motion component.
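As a small illustration of the label construction in step (1d), a sketch under the assumption that the rotation speed, blade length, blade count, and pitch angle β (in radians) are known for each simulated sample:

```python
import numpy as np

def make_label(rotation_speed, blade_length, n_blades, beta):
    """Target values for one sample: speed, projected length D_i = L_i * cos(beta), blade count."""
    return (rotation_speed, blade_length * np.cos(beta), float(n_blades))
```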
The target values of the three micro-motion parameters corresponding to each amplitude sequence are taken as the label of that micro-motion signal amplitude sequence.
The amplitude sequences of all micro-motion signals and their corresponding labels form the training set.
Step 2, constructing a multitask convolutional neural network.
A 12-layer multitask shared module is built; its structure is, in order: first convolution layer, ReLU activation layer, first pooling layer, second convolution layer, ReLU activation layer, second pooling layer, third convolution layer, ReLU activation layer, third pooling layer, fourth convolution layer, ReLU activation layer, fourth pooling layer. The numbers of convolution kernels of the first to fourth convolution layers are set to 16, 64, 128, and 128 in sequence, with all kernel sizes 1 × 5; the first to fourth pooling layers all use max pooling with pooling kernel size 1 × 2 and pooling stride 2.
A first multitask unshared module is built; its structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, fully connected layer. The numbers of convolution kernels of the first and second convolution layers are set to 64 and 32 in sequence, with both kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1.
A second multitask unshared module is built; its structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, third convolution layer, ReLU activation layer, fully connected layer. The numbers of convolution kernels of the first to third convolution layers are set to 64, 32, and 16 in sequence, with all kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1.
A third multitask unshared module is built; its structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, fully connected layer. The numbers of convolution kernels of the first and second convolution layers are set to 64 and 32 in sequence, with both kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1.
The multitask shared module is connected in series with each of the first, second, and third multitask unshared modules to form the multitask convolutional neural network.
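For illustration, here is a minimal sketch of this architecture, assuming PyTorch and an 800-sample amplitude sequence (0.2 s dwell × 4000 Hz pulse repetition frequency, as in the simulation below); the patent's 1 × 5 convolution kernels and 1 × 2 pooling map naturally onto one-dimensional layers, and 'same' padding is an assumption made here to keep the layer-size arithmetic simple.

```python
import torch
import torch.nn as nn

def conv_block(c_in, c_out, pool=True):
    """Conv1d(kernel 5, 'same' padding) + ReLU, optionally followed by MaxPool1d(2, 2)."""
    layers = [nn.Conv1d(c_in, c_out, kernel_size=5, padding=2), nn.ReLU()]
    if pool:
        layers.append(nn.MaxPool1d(kernel_size=2, stride=2))
    return layers

class MultiTaskCNN(nn.Module):
    def __init__(self, seq_len=800):
        super().__init__()
        # 12-layer shared module: 4 x (conv + ReLU + pool) with 16/64/128/128 kernels.
        self.shared = nn.Sequential(
            *conv_block(1, 16), *conv_block(16, 64),
            *conv_block(64, 128), *conv_block(128, 128))
        n = seq_len // 16                      # length after the four shared poolings
        m = n // 2                             # length after each head's pooling layer
        # Head 1 (blade rotation speed): conv + pool, conv, fully connected -> 1.
        self.speed_head = nn.Sequential(
            *conv_block(128, 64), *conv_block(64, 32, pool=False),
            nn.Flatten(), nn.Linear(32 * m, 1))
        # Head 2 (projected blade length): conv + pool, conv, conv, fully connected -> 1.
        self.length_head = nn.Sequential(
            *conv_block(128, 64), *conv_block(64, 32, pool=False),
            *conv_block(32, 16, pool=False),
            nn.Flatten(), nn.Linear(16 * m, 1))
        # Head 3 (blade count): same layout as head 1.
        self.count_head = nn.Sequential(
            *conv_block(128, 64), *conv_block(64, 32, pool=False),
            nn.Flatten(), nn.Linear(32 * m, 1))

    def forward(self, x):                      # x: (batch, 1, seq_len)
        z = self.shared(x)
        return self.speed_head(z), self.length_head(z), self.count_head(z)
```

Because each head sees only the shared features, any one parameter can be read from its own head without evaluating the other two, which is the independence property the unshared modules provide.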
Step 3, training the multitask convolutional neural network.
The amplitude sequences of the micro-motion signals in the training set are input into the multitask convolutional neural network, and the parameters of each layer are iteratively updated by back-propagation gradient descent until the total loss value of the network converges, yielding the trained multitask convolutional neural network.
The total loss value of the multitask convolutional neural network is calculated as:

Loss = Σ_{k=1}^{m} W_k · (1/S) · Σ_{i=1}^{S} (y_{k,i} − y′_{k,i})²

where Loss denotes the loss value of the multitask convolutional neural network, S denotes the total number of micro-motion signal amplitude sequences input into the network, m denotes the total number of multitask unshared modules (m = 3 in the invention), Σ denotes summation, k denotes the index of a multitask unshared module, k = 1, 2, …, m, W_k denotes the empirically set mean-square-error weight for the parameter extracted by the k-th module, y_{k,i} denotes the target value of the micro-motion parameter corresponding to the i-th micro-motion signal amplitude sequence in the training set whose extracted value is produced by the k-th module, and y′_{k,i} denotes the extracted value of that parameter produced by the k-th module.
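Continuing the sketch above, the total loss can be written as a weighted sum of per-task mean square errors; the weights (0.4, 0.4, 0.2) are the empirical values given in the simulation setup below, while the SGD optimizer, learning rate, and random batch are assumptions, since the patent specifies only back-propagation gradient descent.

```python
def total_loss(preds, targets, weights=(0.4, 0.4, 0.2)):
    """Weighted sum over tasks of the mean square error between outputs and labels."""
    return sum(w * torch.mean((p.squeeze(1) - t) ** 2)
               for w, p, t in zip(weights, preds, targets))

model = MultiTaskCNN()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)      # assumed optimizer settings
amplitudes = torch.rand(200, 1, 800)                          # one batch (batch size 200)
labels = (torch.rand(200), torch.rand(200), torch.rand(200))  # speed, length, blade count

optimizer.zero_grad()
loss = total_loss(model(amplitudes), labels)
loss.backward()                                               # back-propagation
optimizer.step()                                              # gradient descent update
```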
Step 4, obtaining the extracted micro-motion parameter values of the aerial target.
Each aerial target radar echo acquired in real time is processed by the same method as step 1 to obtain the micro-motion signal amplitude sequence of the echo.
All micro-motion signal amplitude sequences are input into the trained multitask convolutional neural network; the first multitask unshared module outputs the extracted blade rotation speed, the second outputs the extracted projected blade length, and the third outputs the extracted blade number.
Combining the extracted projected blade length with the attitude angle of the aerial target, the extracted true blade length is calculated; the extracted blade number is rounded to obtain an integer blade-number value.
The extracted true blade length is calculated as:

F_p = H_p / cos β

where F_p denotes the true blade length extracted in real time for the p-th amplitude sequence, p = 1, 2, …, P, P denotes the total number of acquired aerial target radar echoes whose micro-motion parameters are to be extracted in real time, H_p denotes the projected blade length output by the network for the p-th amplitude sequence, cos denotes the cosine operation, and β denotes the pitch angle between the radar and the micro-motion component.
The extracted blade number is rounded as:

N_p = [G_p]

where N_p denotes the blade number of the p-th amplitude sequence extracted in real time, [·] denotes rounding to the nearest integer, and G_p denotes the blade-number value output by the network for the p-th amplitude sequence.
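Putting steps (4b) through (5) together, a small sketch of the inference-time post-processing under the same assumptions as above: the projection is inverted with the known pitch angle, and the blade-count output is rounded to the nearest integer.

```python
import math

def extract_parameters(model, amplitude, beta):
    """amplitude: (1, 1, seq_len) tensor; beta: radar-to-rotor pitch angle in radians."""
    with torch.no_grad():
        speed, proj_len, count = (out.item() for out in model(amplitude))
    true_length = proj_len / math.cos(beta)   # F_p = H_p / cos(beta)
    n_blades = round(count)                   # N_p = [G_p]
    return speed, true_length, n_blades
```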
Step 5, acquiring the micro-motion parameters.
The extracted blade rotation speed, the extracted true blade length, and the integer blade number are taken as the micro-motion parameters corresponding to the amplitude sequence extracted in real time.
The effect of the invention is further explained below in combination with a simulation experiment.
1. Simulation experiment conditions:
The hardware platform of the simulation experiment is: an Intel i7-8700K CPU with a clock frequency of 3.2 GHz and 16 GB of memory.
The software platform of the simulation experiment is: the Windows 10 operating system and Python 3.6.
The training and test sets used in the simulation experiment are micro-motion signal amplitude sequences derived from simulation data generated by a radar one-dimensional echo parameter model, with the following simulation parameters: radar carrier frequency 1 GHz, dwell time 0.2 s, pulse repetition frequency 4000 Hz, and signal-to-noise ratio 10 dB. Without considering the blade angle, the helicopter rotor rotation speed is set from 200 to 400 rpm at intervals of 10 rpm; the number of blades is set to 2, 3, 4, 5, or 6; the blade length ranges from 5.64 to 9.14 m at intervals of 0.5 m; the pitch angle ranges from 5 to 45 degrees at intervals of 5 degrees; the azimuth angle is generated randomly between 0 and 360 degrees; and the initial phases of the blades are all 45 degrees. The initial distance between the radar and the target is 30004 m.
The training set contains 216000 micro-motion signal amplitude sequences and the corresponding micro-motion parameter target values. The first test set contains the radar echoes of 216000 aerial targets and the corresponding target values, with 43200 echoes and target values per blade-number type. The second test set contains the radar echoes of 100 aerial targets and the corresponding target values, with 20 echoes and target values per blade-number type. The first test set is used to evaluate the micro-motion parameter extraction capability of the invention, and the second to evaluate its real-time performance.
The configuration of the multitask convolutional neural network in the simulation experiment is: training batch size 200 and 1000 training rounds in total, with the network parameters recorded after 200, 400, 600, 800, and 1000 rounds of training on the training set. In the total loss value, the mean-square-error weights for the parameters extracted by the first, second, and third multitask unshared modules are empirically set to 0.4, 0.4, and 0.2, respectively.
2. Simulation content and result analysis:
The simulation experiment applies the method of the invention to extract the micro-motion parameters corresponding to the 216000 aerial target radar echoes in the first test set, yielding the extraction performance of the invention. At the same time, the method of the invention and the prior-art SRDI curve extraction method are each used to extract the micro-motion parameters corresponding to the 100 aerial target radar echoes in the second test set, yielding the extraction times.
The prior-art SRDI curve extraction method used in the simulation experiment refers to:
the short-time-sparsity-based method for estimating aircraft target rotor parameters under low-repetition-frequency conditions proposed in the master's thesis "Research on aircraft target classification methods based on physically driven and data-driven features" (Xidian University, master's thesis, 2019), referred to for short as the short-time rotor parameter estimation method.
The micro-motion parameter extraction capability of the invention is evaluated as follows:
The method of the invention is used to extract three micro-motion parameters from each aerial target radar echo in the first test set: the blade rotation speed, blade length, and blade number of the micro-motion component. In the multitask convolutional neural network, the first multitask unshared module extracts the blade rotation speed, the second extracts the blade length, and the third extracts the blade number. Extraction is considered successful for an echo when the relative error between each micro-motion parameter extracted in real time and the corresponding true value is below 5%. The first test set is evaluated with the networks trained for 200, 400, 600, 800, and 1000 rounds; dividing the number of echoes whose micro-motion parameters are successfully extracted by the total number of echoes in the first test set gives the success proportions in Table 1 below:
Table 1. Proportion of successfully extracted micro-motion parameters (unit: percent)

Training rounds              200     400     600     800     1000
Rotation speed               96.42   96.51   96.93   96.96   97.38
Blade length                 93.30   93.94   94.48   95.86   96.20
Number of blades             99.90   99.91   99.91   99.93   99.96
All three simultaneously     91.74   92.02   93.08   94.58   95.06
As Table 1 shows, after 1000 training rounds the method of the invention simultaneously extracts the three micro-motion parameters of each aerial target in the first test set with a simultaneous success proportion of 95.06%, demonstrating that the micro-motion parameter extraction results of the invention are accurate.
The real-time performance of the present invention was evaluated as follows:
and respectively adopting the method and the short-time rotor parameter estimation method to extract three micro-motion parameters from each aerial target radar echo in the second test set, and evaluating the real-time performance of the method according to the average consumed time. The average time consumed for extracting the micro-motion parameters of each aerial target radar echo in the second test set is 13 milliseconds, and the average time consumed for extracting the parameters of each aerial target radar echo in the second test set by the short-time rotor wing parameter estimation method is 32495 milliseconds: according to the average time consumption for extracting the micro-motion parameters, the time required for extracting the micro-motion parameters is shorter, and only millisecond level is required under the condition of loading the network model in advance. The method provided by the invention is proved to be stronger in real-time property.

Claims (5)

1. A method for extracting micro-motion parameters of aerial targets in real time based on a multitask convolutional network, characterized in that a multitask convolutional neural network is constructed and the total loss value is used as the convergence condition when training the network; the method comprises the following steps:
(1) generating a training set:
(1a) selecting a sample set containing M aerial targets, each with a different number of blades, and at least 1000 radar echo samples per target; the total number of samples in the sample set is N, where M ≥ 3 and N ≥ M × 1000;
(1b) extracting a micro-motion signal in each radar echo sample in the sample set by adopting a CLEAN algorithm;
(1c) carrying out modulus operation on each micro-motion signal to obtain an amplitude sequence of the micro-motion signal;
(1d) combining the attitude angle of the aerial target, calculating the projection of the blade length corresponding to each amplitude sequence onto the radar line of sight to obtain the projected blade length for that sequence, and taking the blade rotation speed, the projected blade length, and the number of blades of the target type corresponding to the sequence as the target values of the three micro-motion parameters for that sequence;
(1e) taking the target values of the three micro-motion parameters corresponding to each amplitude sequence as the label of that sequence;
(1f) forming a training set from the amplitude sequences of all micro-motion signals and their corresponding labels;
(2) constructing a multitask convolutional neural network:
(2a) building a 12-layer multitask shared module whose structure is, in order: first convolution layer, ReLU activation layer, first pooling layer, second convolution layer, ReLU activation layer, second pooling layer, third convolution layer, ReLU activation layer, third pooling layer, fourth convolution layer, ReLU activation layer, fourth pooling layer; the numbers of convolution kernels of the first to fourth convolution layers are set to 16, 64, 128, and 128 in sequence, with all kernel sizes 1 × 5; the first to fourth pooling layers all use max pooling with pooling kernel size 1 × 2 and pooling stride 2;
(2b) building a first multitask unshared module whose structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, fully connected layer; the numbers of convolution kernels of the first and second convolution layers are set to 64 and 32 in sequence, with both kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1;
(2c) building a second multitask unshared module whose structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, third convolution layer, ReLU activation layer, fully connected layer; the numbers of convolution kernels of the first to third convolution layers are set to 64, 32, and 16 in sequence, with all kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1;
(2d) building a third multitask unshared module whose structure is, in order: first convolution layer, ReLU activation layer, pooling layer, second convolution layer, ReLU activation layer, fully connected layer; the numbers of convolution kernels of the first and second convolution layers are set to 64 and 32 in sequence, with both kernel sizes 1 × 5; the pooling layer uses max pooling with pooling kernel size 1 × 2 and pooling stride 2, and the output dimension of the fully connected layer is 1;
(2e) connecting the multitask shared module in series with each of the first, second, and third multitask unshared modules to form the multitask convolutional neural network;
(3) training a multitask convolutional neural network:
inputting the amplitude sequences of the micro-motion signals in the training set into the multitask convolutional neural network, and iteratively updating the parameters of each layer by back-propagation gradient descent until the total loss value of the network converges, yielding the trained multitask convolutional neural network;
(4) obtaining the extracted micro-motion parameter values of the aerial target:
(4a) processing each aerial target radar echo acquired in real time by the same method as steps (1b) and (1c) to obtain the micro-motion signal amplitude sequence of the echo;
(4b) inputting all micro-motion signal amplitude sequences into the trained multitask convolutional neural network, where the first multitask unshared module outputs the extracted blade rotation speed, the second outputs the extracted projected blade length, and the third outputs the extracted blade number;
(4c) combining the extracted projected blade length with the attitude angle of the aerial target, calculating the extracted true blade length, and at the same time rounding the extracted blade number to obtain an integer blade-number value;
(5) obtaining the micro-motion parameters:
taking the extracted blade rotation speed, the extracted true blade length, and the integer blade number as the micro-motion parameters corresponding to the amplitude sequence extracted in real time.
2. The method for extracting micro-motion parameters of aerial targets in real time based on a multitask convolutional network according to claim 1, characterized in that the CLEAN algorithm in step (1b) reconstructs the fuselage signal in each radar echo sample and takes the result of subtracting the fuselage signal from the radar echo sample as the micro-motion signal.
3. The method for extracting micro-motion parameters of aerial targets in real time based on a multitask convolutional network according to claim 1, characterized in that the projection of the blade length corresponding to each amplitude sequence onto the radar line of sight in step (1d), combined with the attitude angle of the aerial target, is obtained by the following formula:

D_i = L_i · cos β

where D_i denotes the projected blade length after the blade length corresponding to the i-th micro-motion signal amplitude sequence is projected onto the radar line of sight, L_i denotes the blade length corresponding to the i-th micro-motion signal amplitude sequence, cos denotes the cosine operation, and β denotes the pitch angle between the radar and the micro-motion component.
4. The method for extracting micro-motion parameters of aerial targets in real time based on a multitask convolutional network according to claim 1, characterized in that the total loss value of the multitask convolutional neural network in step (3) is obtained by the following formula:

Loss = Σ_{k=1}^{m} W_k · (1/S) · Σ_{i=1}^{S} (y_{k,i} − y′_{k,i})²

where Loss denotes the loss value of the multitask convolutional neural network, S denotes the total number of micro-motion signal amplitude sequences input into the network, m denotes the total number of multitask unshared modules (m = 3 in the invention), Σ denotes summation, k denotes the index of a multitask unshared module, k = 1, 2, …, m, W_k denotes the empirically set mean-square-error weight for the parameter extracted by the k-th module, y_{k,i} denotes the target value of the micro-motion parameter corresponding to the i-th micro-motion signal amplitude sequence in the training set whose extracted value is produced by the k-th module, and y′_{k,i} denotes the extracted value of that parameter produced by the k-th module.
5. The method for extracting micro-motion parameters of aerial targets in real time based on a multitask convolutional network according to claim 1, characterized in that the extracted true blade length in step (4c) is obtained by the following formula:

F_p = H_p / cos β

where F_p denotes the true blade length extracted in real time for the p-th amplitude sequence, p = 1, 2, …, P, P denotes the total number of acquired aerial target radar echoes whose micro-motion parameters are to be extracted in real time, H_p denotes the projected blade length output by the multitask convolutional neural network for the p-th amplitude sequence, cos denotes the cosine operation, and β denotes the pitch angle between the radar and the micro-motion component.
CN202110134778.1A 2021-01-29 2021-01-29 Air target micro-motion parameter real-time extraction method based on multitask convolutional network Active CN112784916B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110134778.1A CN112784916B (en) 2021-01-29 2021-01-29 Air target micro-motion parameter real-time extraction method based on multitask convolutional network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110134778.1A CN112784916B (en) 2021-01-29 2021-01-29 Air target micro-motion parameter real-time extraction method based on multitask convolutional network

Publications (2)

Publication Number Publication Date
CN112784916A CN112784916A (en) 2021-05-11
CN112784916B (en) 2022-03-04

Family

ID=75760188

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110134778.1A Active CN112784916B (en) 2021-01-29 2021-01-29 Air target micro-motion parameter real-time extraction method based on multitask convolutional network

Country Status (1)

Country Link
CN (1) CN112784916B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113311406B (en) * 2021-05-28 2023-06-30 西安电子科技大学 Aircraft time-frequency domain rotor wing parameter estimation method based on multichannel attention network

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104330784A (en) * 2014-11-19 2015-02-04 西安电子科技大学 Plane target classification method based on rotor wing physical parameter estimation
CN106529402A (en) * 2016-09-27 2017-03-22 中国科学院自动化研究所 Multi-task learning convolutional neural network-based face attribute analysis method
CN108765417A * 2018-06-15 2018-11-06 Femur X-ray image generation system and method based on deep learning and digitally reconstructed radiographs
CN109031219A (en) * 2018-06-14 2018-12-18 西安电子科技大学 Wideband radar Ballistic Target fine motion geometric parameter estimation method based on phase ranging
CN110161480A (en) * 2019-06-18 2019-08-23 西安电子科技大学 Radar target identification method based on semi-supervised depth probabilistic model
CN110363219A (en) * 2019-06-10 2019-10-22 南京理工大学 Midcourse target fine motion form recognition methods based on convolutional neural networks
CN110490052A (en) * 2019-07-05 2019-11-22 山东大学 Face datection and face character analysis method and system based on cascade multi-task learning
CN110516576A (en) * 2019-08-20 2019-11-29 西安电子科技大学 Near-infrared living body faces recognition methods based on deep neural network
CN111367660A (en) * 2020-02-25 2020-07-03 北京思特奇信息技术股份有限公司 Method and system for sharing group shared resources
CN111598232A (en) * 2020-04-30 2020-08-28 南京理工大学 Method for estimating complex micro-motion space cone target parameters by using deep learning convolutional neural network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111220958B (en) * 2019-12-10 2023-05-26 西安宁远电子电工技术有限公司 Radar target Doppler image classification and identification method based on one-dimensional convolutional neural network

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104330784A (en) * 2014-11-19 2015-02-04 西安电子科技大学 Plane target classification method based on rotor wing physical parameter estimation
CN106529402A (en) * 2016-09-27 2017-03-22 中国科学院自动化研究所 Multi-task learning convolutional neural network-based face attribute analysis method
CN109031219A (en) * 2018-06-14 2018-12-18 西安电子科技大学 Wideband radar Ballistic Target fine motion geometric parameter estimation method based on phase ranging
CN108765417A * 2018-06-15 2018-11-06 Femur X-ray image generation system and method based on deep learning and digitally reconstructed radiographs
CN110363219A (en) * 2019-06-10 2019-10-22 南京理工大学 Midcourse target fine motion form recognition methods based on convolutional neural networks
CN110161480A (en) * 2019-06-18 2019-08-23 西安电子科技大学 Radar target identification method based on semi-supervised depth probabilistic model
CN110490052A (en) * 2019-07-05 2019-11-22 山东大学 Face datection and face character analysis method and system based on cascade multi-task learning
CN110516576A (en) * 2019-08-20 2019-11-29 西安电子科技大学 Near-infrared living body faces recognition methods based on deep neural network
CN111367660A (en) * 2020-02-25 2020-07-03 北京思特奇信息技术股份有限公司 Method and system for sharing group shared resources
CN111598232A (en) * 2020-04-30 2020-08-28 南京理工大学 Method for estimating complex micro-motion space cone target parameters by using deep learning convolutional neural network

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Direct Force Feedback Control and Online Multi-Task Optimization for Aerial Manipulators; Gabriele Nava et al.; IEEE Robotics and Automation Letters; 30 April 2020; Vol. 5, No. 2; pp. 331-338 *
Kang Li et al.; End-to-end cubic phase signal recovery method based on deep convolutional neural network; IET Radar, Sonar & Navigation; 2020 *
Research on convolutional neural networks for UAV detection in low-altitude airspace; Gan Yutao; China Masters' Theses Full-text Database, Engineering Science and Technology II; 15 January 2020; C031-241 *
Micro-motion parameter estimation of ballistic targets by wideband radar three-dimensional interferometry; Wei Jiaqi et al.; Journal of Electronics & Information Technology; 30 April 2019; Vol. 41, No. 4; pp. 787-794 *

Also Published As

Publication number Publication date
CN112784916A (en) 2021-05-11

Similar Documents

Publication Publication Date Title
CN107728142B (en) Radar high-resolution range profile target identification method based on two-dimensional convolutional network
CN107728143B (en) Radar high-resolution range profile target identification method based on one-dimensional convolutional neural network
CN104077787B Aircraft target classification method based on time domain and Doppler domain
CN112882009B (en) Radar micro Doppler target identification method based on amplitude and phase dual-channel network
CN104330784B (en) Plane target classification method based on rotor wing physical parameter estimation
CN103885043B Noise-robust classification method for aircraft targets based on broadened matched filtering
CN111273285B (en) Micro Doppler spectrum correlation matrix characteristic extraction method for multi-rotor unmanned aerial vehicle
CN112731330B (en) Radar carrier frequency parameter change steady target identification method based on transfer learning
CN112784916B (en) Air target micro-motion parameter real-time extraction method based on multitask convolutional network
CN112882011A (en) Radar carrier frequency variation robust target identification method based on frequency domain correlation characteristics
CN107390193A (en) Frequency modulated continuous wave radar Aircraft Targets sorting technique based on the fusion of more range cells
CN112835003B (en) Radar repetition frequency variation steady target recognition method based on resampling preprocessing
CN112882012B (en) Radar target noise robust identification method based on signal-to-noise ratio matching and echo enhancement
CN104535982B (en) Aircraft target classification method based on angular domain division
CN111458688A (en) Radar high-resolution range profile target identification method based on three-dimensional convolution network
Zhu et al. Radar HRRP group-target recognition based on combined methods in the background of sea clutter
CN116660851A (en) Method and system for distinguishing targets of birds and rotor unmanned aerial vehicle under low signal-to-noise ratio condition
CN113311406B (en) Aircraft time-frequency domain rotor wing parameter estimation method based on multichannel attention network
CN109061586B (en) Target micro-motion characteristic modeling method based on dynamic RCS model
CN112731331B (en) Micro-motion target noise steady identification method based on signal-to-noise ratio adaptive network
CN115856811A (en) Micro Doppler feature target classification method based on deep learning
CN113866739A (en) Multi-rotor target parameter estimation method based on GLCT-GPTF
Shreyamsha Kumar et al. Target identification using harmonic wavelet based ISAR imaging
CN113281714A (en) Bird target detection method based on radar micro Doppler feature enhancement
CN114492505B (en) Air group target and extension target identification method based on semi-actual measurement data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant