CN114943286B - Unknown target discrimination method based on fusion of time domain features and space domain features - Google Patents

Unknown target discrimination method based on fusion of time domain features and space domain features

Info

Publication number
CN114943286B
CN114943286B
Authority
CN
China
Prior art keywords
time domain
dimensional range
training
range profile
feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210548825.1A
Other languages
Chinese (zh)
Other versions
CN114943286A (en)
Inventor
周代英
易传莉雯
王特起
何彬宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN202210548825.1A
Publication of CN114943286A
Application granted
Publication of CN114943286B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/253Fusion techniques of extracted features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/213Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/2135Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods based on approximation criteria, e.g. principal component analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/243Classification techniques relating to the number of classes
    • G06F18/2431Multiple classes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/048Activation functions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Abstract

The invention belongs to the technical field of unknown target identification, and particularly relates to an unknown target discrimination method based on the fusion of time-domain features and spatial-domain features. The method extracts deep time-domain correlation features from preprocessed one-dimensional range profile (HRRP) data through a bidirectional gated recurrent unit (GRU) network and fuses them with the spatial-domain principal energy features by vector concatenation to discriminate unknown targets. The fused features exploit not only the spatial-domain principal energy information in the one-dimensional range profiles but also the time-domain correlation information between them, so the discrimination rate for unknown targets is improved; simulation results verify the effectiveness of the method.

Description

Unknown target discrimination method based on fusion of time domain features and space domain features
Technical Field
The invention belongs to the technical field of unknown target identification, and particularly relates to an unknown target discrimination method based on the fusion of time-domain features and spatial-domain features.
Background
A radar one-dimensional range profile is a one-dimensional vector formed by the echo sequence of the range cells and carries information such as target size and structure. Traditional one-dimensional range profile target recognition first acquires a training data set for each target to build a feature library, against which targets are then classified and recognized. In practice, however, the target to be recognized may not have appeared in training, which leads to misclassification of its type. The target to be recognized therefore needs to be screened first, so that the subsequent category recognition is correct.
The subspace discrimination method and the deep long short-term memory (LSTM) network discrimination method are effective unknown target discrimination methods. However, the subspace method uses only the spatial-domain feature information in the one-dimensional range profile data and ignores the time-domain correlation information between the target's one-dimensional range profiles, while the deep LSTM method can extract time-domain correlation features but does not use the spatial information in the one-dimensional range profile data. Both methods therefore leave room for improvement.
Disclosure of Invention
The invention provides an unknown target discrimination method based on the fusion of deep time-domain correlation features and spatial-domain principal energy features.
The technical scheme of the invention is as follows:
an unknown target discrimination method based on fusion of time domain features and space domain features comprises the following steps:
s1, acquiring high-resolution one-dimensional range profiles and obtaining, after normalization, a training one-dimensional range profile data set $X = [X_1, X_2, \ldots, X_t, \ldots, X_m]$, where $X_t$ denotes the t-th one-dimensional range profile in the training data set $X$, $1 \le t \le m$, $m$ is the number of training one-dimensional range profile samples, and $X$ contains samples of $k$ classes of targets;
s2, extracting time domain features through a bidirectional GRU network, wherein the bidirectional GRU network sequentially comprises a bidirectional GRU layer, a dropout layer, a flatten layer, a first dense layer, a second dense layer and a third dense layer; the processing mode of the bidirectional GRU layer on input data is as follows:
in the forward pass there are two inputs at each time step, $X_t$ and $H_{t-1}$, where $H_{t-1}$ is the output of the GRU unit at the previous time step, so the hidden state feature $H_t$ at the current time is correlated with the timing feature $H_{t-1}$ output by the GRU unit at the previous time; in the GRU unit, the update gate $u_t$ first selects the degree to which the time-domain feature $H_{t-1}$ of the previous time is remembered:
$$u_t = \sigma(W_u X_t + Z_u H_{t-1})$$
wherein σ (·) is sigmoid function, W u 、Z u Updating the gate correspondence weight matrix; by a reset gate r t Choosing to ignore the previous time domain feature H t-1 Degree of information:
$$r_t = \sigma(W_r X_t + Z_r H_{t-1})$$
where $W_r$, $Z_r$ are the weight matrices of the reset gate; in the GRU, the feature $H_{t-1}$ of the previous time is multiplied by the output $r_t$ of the reset gate and then used as a parameter in computing the hidden state of the current time, and after the tanh function the hidden state feature $H_t$ of the range profile at the current time is obtained:
$$H_t = \tanh(W X_t + Z(r_t \odot H_{t-1}))$$
where the symbol $\odot$ denotes element-wise multiplication of the matrices and $W$, $Z$ are weight matrices; the current hidden state feature $H_t$ and the historical time-domain feature $H_{t-1}$ are weighted by the update gate $u_t$ to obtain the finally output forward time-domain correlation feature:

$$\overrightarrow{H}_{ot} = u_t \odot H_{t-1} + (1 - u_t) \odot H_t$$
in the reverse pass there are likewise two inputs at each time step, $X_t$ and $H_{t+1}$, and in the same way the reverse-extracted time-domain correlation feature $\overleftarrow{H}_{ot}$ is obtained;
the forward and reverse feature vectors $\overrightarrow{H}_{ot}$ and $\overleftarrow{H}_{ot}$ are superposed to output the time-domain correlation feature $H_{ot}$ extracted by the bidirectional GRU layer:

$$H_{ot} = [\overrightarrow{H}_{ot}, \overleftarrow{H}_{ot}]$$
the time-domain correlation feature $H_{ot}$ is then passed through the dropout layer with parameter 0.3 and one flatten layer, which connects all the feature maps into a single feature vector, and the feature vector extracted by the three dense layers is used as the deep time-domain correlation feature $F_{Dt}$;
s3, extracting the spatial-domain principal energy features:
performing spatial-domain principal component analysis on the training data set $X$ of one-dimensional range profiles to obtain an eigenvector matrix $A$, and projecting each one-dimensional range profile sample $X_t$ onto $A$:

$$F_{St} = A^T X_t$$

where $F_{St}$ is the spatial-domain principal energy feature corresponding to $X_t$;
s4, fusing the time-domain feature $F_{Dt}$ and the spatial-domain principal energy feature $F_{St}$ to obtain the fused feature vector $F_t$ corresponding to $X_t$:

$$F_t = [F_{Dt}, F_{St}]$$
where $1 \le t \le m$; the training sample data set has $k$ classes, and during training the class-$q$ samples are taken as one class and the non-class-$q$ samples as the other class in turn, $1 \le q \le k$, so that $k$ classifiers are constructed; from the fused feature vectors $F_1, F_2, \ldots, F_m$ corresponding to all training samples, the decision functions of the $k$ classifiers are constructed, the decision function $D_q(\cdot)$ of the q-th classifier being

$$D_q(F_{any}) = \sum_{t=1}^{m} \alpha_{q,t} Y_{q,t} K(F_t, F_{any}) + b_q$$

where $X_{any}$ is any one-dimensional range profile sample, $F_{any}$ is the fused feature vector corresponding to $X_{any}$, $D_q(F_{any})$ is the decision function value corresponding to $X_{any}$, and $Y_{q,t}$ is the class label of training sample $X_t$ in the q-th classifier: $Y_{q,t} = 1$ when $X_t$ belongs to class $q$ and $Y_{q,t} = -1$ when $X_t$ belongs to the other classes; $\alpha_{q,t}$ and $b_q$ are coefficients obtained by optimization on the training samples, and $K(\cdot,\cdot)$ is a Gaussian kernel function;
s5, taking the one-dimensional range profile of the target to be recognized as input $X_e$, extracting the deep time-domain correlation feature $F_{De}$ with the bidirectional GRU network and fusing it with the corresponding spatial-domain principal energy feature $F_{Se}$ to obtain the fused feature vector $F_e$:

$$F_e = [F_{De}, F_{Se}]$$

substituting the fused feature vector $F_e$ corresponding to $X_e$ into the decision functions $D_q(\cdot)$, computing the decision function values $D_1(F_e), D_2(F_e), \ldots, D_k(F_e)$ of the $k$ classifiers and sorting them, and comparing the maximum decision function value with the set decision threshold $d_{th}$: if it is greater than the threshold, the target to be recognized is judged to be a known target; otherwise it is judged to be an unknown target.
The benefit of the method is that the fused features exploit not only the spatial-domain principal energy information in the one-dimensional range profiles but also the time-domain correlation information between them, so that the discrimination rate for unknown targets is improved.
Drawings
Fig. 1 is a schematic diagram of a bidirectional GRU network structure.
Fig. 2 is a two-way GRU time development diagram.
Fig. 3 is a diagram of a GRU unit structure.
Detailed Description
The invention is described in detail below with reference to the following figures and simulations:
as shown in fig. 1, in the bidirectional GRU network structure constructed by the present invention, energy normalization preprocessing is performed on sample high resolution one-dimensional range profile (HRRP) data, and after the preprocessing, all training one-dimensional range profile data sets are X = [ X ] 1 ,X 2 ,..,X t ,...,X m ],X t And (3) representing the t-th one-dimensional range profile (t is more than or equal to 1 and less than or equal to m) in the training data set X, wherein m is the number of training one-dimensional range profile samples. The training data set X includes samples of class k targets.
The preprocessed HRRP sample data are passed through the bidirectional GRU layer to obtain the time-domain correlation feature $H_{ot}$, which is then followed by a five-layer BP network: a dropout layer with parameter 0.3, then one flatten layer that connects all feature maps into a single feature vector, and then three dense layers whose extracted feature vector serves as the deep time-domain correlation feature. The feature extracted by the deep bidirectional GRU network from a training one-dimensional range profile sample $X_t$ is the deep time-domain correlation feature $F_{Dt}$.
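As a concrete illustration of this pipeline, the following is a minimal TensorFlow/Keras sketch of the network of fig. 1. The layer widths (64 GRU units, 128/64/32 dense units) and the ReLU activations are illustrative assumptions; the text fixes only the layer types, their order, and the dropout parameter 0.3.

```python
import tensorflow as tf

def build_time_domain_extractor(seq_len, n_range_cells):
    """Maps a sequence of range profiles to the deep time-domain feature F_Dt."""
    inputs = tf.keras.Input(shape=(seq_len, n_range_cells))
    # Bidirectional GRU layer: concatenates forward and backward features into H_ot
    x = tf.keras.layers.Bidirectional(
        tf.keras.layers.GRU(64, return_sequences=True))(inputs)
    x = tf.keras.layers.Dropout(0.3)(x)                    # dropout layer, parameter 0.3
    x = tf.keras.layers.Flatten()(x)                       # flatten layer
    x = tf.keras.layers.Dense(128, activation="relu")(x)   # first dense layer
    x = tf.keras.layers.Dense(64, activation="relu")(x)    # second dense layer
    x = tf.keras.layers.Dense(32, activation="relu")(x)    # third dense layer -> F_Dt
    return tf.keras.Model(inputs, x)
```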
The structure of the bidirectional GRU layer in fig. 1 is shown in fig. 2: the input data are processed in the forward and reverse directions through the GRU layers to obtain the forward and reverse feature vectors $\overrightarrow{H}_{ot}$ and $\overleftarrow{H}_{ot}$, which are superposed to output the time-domain correlation feature extracted by the bidirectional GRU layer:

$$H_{ot} = [\overrightarrow{H}_{ot}, \overleftarrow{H}_{ot}]$$
The structure of each GRU unit in fig. 2 is shown in fig. 3. In the forward pass there are two inputs at each time step, $X_t$ and $H_{t-1}$, where $H_{t-1}$ is the output of the GRU unit at the previous time step, so the hidden state feature $H_t$ at the current time is correlated with the timing feature $H_{t-1}$ output by the GRU unit at the previous time.
In the GRU unit, as shown in fig. 3, the update gate $u_t$ first selects the degree to which the time-domain feature $H_{t-1}$ of the previous time is remembered:

$$u_t = \sigma(W_u X_t + Z_u H_{t-1}) \tag{1}$$

where $\sigma(\cdot)$ is the sigmoid function and $W_u$, $Z_u$ are the weight matrices of the update gate.
The reset gate $r_t$ selects the degree to which the information of the previous time-domain feature $H_{t-1}$ is ignored:

$$r_t = \sigma(W_r X_t + Z_r H_{t-1}) \tag{2}$$

where $W_r$, $Z_r$ are the weight matrices of the reset gate. In the GRU, the feature $H_{t-1}$ of the previous time is multiplied by the output $r_t$ of the reset gate and then used as a parameter in computing the hidden state of the current time; after the tanh function, the hidden state feature $H_t$ of the range profile at the current time is obtained:

$$H_t = \tanh(W X_t + Z(r_t \odot H_{t-1})) \tag{3}$$

where the symbol $\odot$ denotes element-wise multiplication of the matrices and $W$, $Z$ are weight matrices. The current hidden state feature $H_t$ and the historical time-domain feature $H_{t-1}$ are weighted by the update gate $u_t$ to obtain the finally output time-domain correlation feature:

$$\overrightarrow{H}_{ot} = u_t \odot H_{t-1} + (1 - u_t) \odot H_t \tag{4}$$

The forward pass thus yields $\overrightarrow{H}_{ot}$, the time-domain correlation feature extracted in the forward direction. In the reverse pass the inputs are $X_t$ and $H_{t+1}$, and in the same way the reverse-extracted time-domain correlation feature is obtained:

$$\overleftarrow{H}_{ot} = u_t \odot H_{t+1} + (1 - u_t) \odot H_t \tag{5}$$
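To make the gate arithmetic of equations (1) to (5) concrete, a literal numpy sketch of one forward GRU step follows. The weight matrices are taken as given here (in practice they are learned by backpropagation), so this is an illustrative rendering of the formulas rather than a trained layer.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_forward_step(x_t, h_prev, Wu, Zu, Wr, Zr, W, Z):
    """One forward GRU step for input x_t and previous state h_prev."""
    u_t = sigmoid(Wu @ x_t + Zu @ h_prev)          # update gate, eq. (1)
    r_t = sigmoid(Wr @ x_t + Zr @ h_prev)          # reset gate, eq. (2)
    h_t = np.tanh(W @ x_t + Z @ (r_t * h_prev))    # current hidden state, eq. (3)
    return u_t * h_prev + (1.0 - u_t) * h_t        # gate-weighted output, eq. (4)
```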
Spatial-domain principal component analysis is performed on the training data set $X$ of one-dimensional range profiles to obtain the eigenvector matrix $A$, and each preprocessed one-dimensional range profile sample $X_t$ is projected onto $A$:

$$F_{St} = A^T X_t \tag{6}$$

where $F_{St}$ is the spatial-domain principal energy feature corresponding to $X_t$.
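A minimal numpy sketch of this spatial-domain principal component analysis follows; the number of retained principal components, n_comp, is an assumption, since the text does not fix it.

```python
import numpy as np

def pca_basis(X_train, n_comp):
    """X_train: (m, n) matrix of m preprocessed one-dimensional range profiles."""
    Xc = X_train - X_train.mean(axis=0)       # center the training set
    cov = Xc.T @ Xc / (X_train.shape[0] - 1)  # spatial covariance matrix
    _, eigvecs = np.linalg.eigh(cov)          # eigenvectors, eigenvalues ascending
    return eigvecs[:, ::-1][:, :n_comp]       # eigenvector matrix A (top directions)

def spatial_feature(A, x_t):
    return A.T @ x_t                          # F_St = A^T X_t, formula (6)
```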
The deep time-domain correlation feature $F_{Dt}$ extracted from the training one-dimensional range profile sample $X_t$ by the bidirectional GRU network is fused with the corresponding spatial-domain principal energy feature $F_{St}$ to obtain the fused feature vector $F_t$ corresponding to $X_t$:

$$F_t = [F_{Dt}, F_{St}] \tag{7}$$
where $1 \le t \le m$. The training sample data set has $k$ classes; during training the class-$q$ samples are taken as one class and the non-class-$q$ samples as the other class in turn ($1 \le q \le k$), and $k$ classifiers are constructed. From the fused feature vectors $F_1, F_2, \ldots, F_m$ corresponding to all the training samples, the decision functions of the $k$ classifiers are constructed, the decision function $D_q(\cdot)$ of the q-th classifier being

$$D_q(F_{any}) = \sum_{t=1}^{m} \alpha_{q,t} Y_{q,t} K(F_t, F_{any}) + b_q \tag{8}$$

where $X_{any}$ is any one-dimensional range profile sample, $F_{any}$ is the fused feature vector corresponding to $X_{any}$, $D_q(F_{any})$ is the decision function value corresponding to $X_{any}$, and $Y_{q,t}$ is the class label of training sample $X_t$ in the q-th classifier: $Y_{q,t} = 1$ when $X_t$ belongs to class $q$ and $Y_{q,t} = -1$ when $X_t$ belongs to the other classes. $\alpha_{q,t}$ and $b_q$ are coefficients obtained by optimization on the training samples, and $K(\cdot,\cdot)$ is a Gaussian kernel function.
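The decision function of formula (8) has the form of a support vector machine with a Gaussian kernel, so one plausible sketch of the k one-vs-rest classifiers uses scikit-learn's SVC; leaving the kernel width and regularization constant at the library defaults is an assumption.

```python
import numpy as np
from sklearn.svm import SVC

def train_classifiers(F_train, labels, k):
    """F_train: (m, d) fused feature vectors; labels in {0, ..., k-1}."""
    classifiers = []
    for q in range(k):
        y_q = np.where(labels == q, 1, -1)   # class q vs. all other classes
        clf = SVC(kernel="rbf")              # Gaussian kernel K(., .)
        clf.fit(F_train, y_q)
        classifiers.append(clf)              # decision_function plays the role of D_q
    return classifiers
```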
The one-dimensional range profile of the target to be recognized is taken as input $X_e$; the deep time-domain correlation feature $F_{De}$ is extracted with the bidirectional GRU network and fused with the corresponding spatial-domain principal energy feature $F_{Se}$ to obtain the fused feature vector $F_e$:

$$F_e = [F_{De}, F_{Se}] \tag{9}$$

Substituting the fused feature vector $F_e$ corresponding to $X_e$ into formula (8), the decision function values $D_1(F_e), D_2(F_e), \ldots, D_k(F_e)$ of the $k$ classifiers are computed and sorted; the maximum decision function value is compared with the threshold $d_{th}$, and if it is greater than the threshold the target to be recognized is judged to be a known target, otherwise an unknown target. All training samples are used to determine the discrimination threshold $d_{th}$.
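The rule for deriving $d_{th}$ from the training samples is not given in the text; one plausible sketch, stated purely as an assumption, takes a low percentile of the maximum decision values that the k classifiers produce on the known training targets, and then applies the known/unknown test itself:

```python
import numpy as np

def pick_threshold(classifiers, F_train, percentile=1.0):
    # Assumed rule: d_th is a low percentile of the max decision values on known targets
    scores = np.column_stack([clf.decision_function(F_train) for clf in classifiers])
    return np.percentile(scores.max(axis=1), percentile)

def is_known_target(classifiers, F_e, d_th):
    # Compare the maximum of D_1(F_e) ... D_k(F_e) with the threshold d_th
    values = [clf.decision_function(F_e.reshape(1, -1))[0] for clf in classifiers]
    return max(values) > d_th
```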
Simulation example
HRRP data of five types of military aircraft (AH64, AN26, F15, B1B and B52), obtained with electromagnetic characteristic calculation software, are used in the experiments. The measurement radar has a carrier frequency of 6 GHz and a signal bandwidth of 400 MHz; the aircraft target elevation angle is 3°, and profiles are collected at 0.1° intervals over the 0-180° azimuth range, so that 1801 HRRP samples with 320 range cells each are collected for every aircraft; the HRRP simulation data of each aircraft is thus essentially a 1801 × 320 matrix.
From the five aircraft HRRP data sets, 1400 one-dimensional range profiles per class are taken in a 1:2 proportion over the 0-160° azimuth range at 0.1° intervals; noise is introduced to bring the signal-to-noise ratio to -5 dB. Three of the five classes are randomly selected as known targets to form the training data set, and the other two classes are used as unknown targets in the experiment. A learning rate of 0.001, the cross-entropy loss function and the Adam optimizer are used.
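As an aside, a small numpy sketch of the noise injection used in the experiment: additive white Gaussian noise scaled so that each profile reaches the stated -5 dB signal-to-noise ratio.

```python
import numpy as np

def add_noise(hrrp, snr_db=-5.0, seed=0):
    """Adds white Gaussian noise to a profile to reach the requested SNR (in dB)."""
    rng = np.random.default_rng(seed)
    p_signal = np.mean(hrrp ** 2)                   # average signal power
    p_noise = p_signal / (10.0 ** (snr_db / 10.0))  # noise power for the target SNR
    return hrrp + rng.normal(0.0, np.sqrt(p_noise), size=hrrp.shape)
```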
Table 1. Average discrimination rate for unknown targets (%)
[Table 1 is provided as an image in the original publication.]
The results in Table 1 show that, with three aircraft types randomly drawn as database (known) targets and the other two treated as unknown targets, the deep convolutional neural network discriminates poorly at 0.1° sampling intervals and a signal-to-noise ratio of -5 dB, whereas the fusion method based on the bidirectional GRU network, by serially fusing the extracted time-domain correlation features with the spatial-domain energy features, obtains information with better discriminating power and achieves a high discrimination rate for unknown targets even at low signal-to-noise ratio, with an accuracy above 94%, verifying the effectiveness of the method.

Claims (1)

1. An unknown target discrimination method based on fusion of time domain features and space domain features is characterized by comprising the following steps:
s1, acquiring high-resolution one-dimensional range profiles and obtaining, after normalization, a training one-dimensional range profile data set $X = [X_1, X_2, \ldots, X_t, \ldots, X_m]$, where $X_t$ denotes the t-th one-dimensional range profile in the training data set $X$, $1 \le t \le m$, $m$ is the number of training one-dimensional range profile samples, and $X$ contains samples of $k$ classes of targets;
s2, extracting time domain features through a bidirectional GRU network, wherein the bidirectional GRU network sequentially comprises a bidirectional GRU layer, a dropout layer, a flatten layer, a first dense layer, a second dense layer and a third dense layer; the processing mode of the bidirectional GRU layer on input data is as follows:
in the forward pass there are two inputs at each time step, $X_t$ and $H_{t-1}$, where $H_{t-1}$ is the output of the GRU unit at the previous time step, so the hidden state feature $H_t$ at the current time is correlated with the timing feature $H_{t-1}$ output by the GRU unit at the previous time; in the GRU unit, the update gate $u_t$ first selects the degree to which the time-domain feature $H_{t-1}$ of the previous time is remembered:
$$u_t = \sigma(W_u X_t + Z_u H_{t-1})$$
wherein σ (·) is sigmoid function, W u 、Z u Updating the gate correspondence weight matrix; by a reset gate r t Choosing to ignore the previous time domain feature H t-1 Degree of information:
$$r_t = \sigma(W_r X_t + Z_r H_{t-1})$$
where $W_r$, $Z_r$ are the weight matrices of the reset gate; in the GRU, the feature $H_{t-1}$ of the previous time is multiplied by the output $r_t$ of the reset gate and then used as a parameter in computing the hidden state of the current time, and after the tanh function the hidden state feature $H_t$ at the current time is obtained:
$$H_t = \tanh(W X_t + Z(r_t \odot H_{t-1}))$$
where the symbol $\odot$ denotes element-wise multiplication of the matrices and $W$, $Z$ are weight matrices; the current hidden state feature $H_t$ and the historical time-domain feature $H_{t-1}$ are weighted by the update gate $u_t$ to obtain the finally output forward time-domain correlation feature:

$$\overrightarrow{H}_{ot} = u_t \odot H_{t-1} + (1 - u_t) \odot H_t$$
in the reverse pass there are likewise two inputs at each time step, $X_t$ and $H_{t+1}$, and in the same way the reverse-extracted time-domain correlation feature $\overleftarrow{H}_{ot}$ is obtained;
the forward and reverse feature vectors $\overrightarrow{H}_{ot}$ and $\overleftarrow{H}_{ot}$ are superposed to output the time-domain correlation feature $H_{ot}$ extracted by the bidirectional GRU layer:

$$H_{ot} = [\overrightarrow{H}_{ot}, \overleftarrow{H}_{ot}]$$
the time-domain correlation feature $H_{ot}$ is then passed through the dropout layer with parameter 0.3 and one flatten layer, which connects all the feature maps into a single feature vector, and the feature vector extracted by the three dense layers is used as the deep time-domain correlation feature $F_{Dt}$;
s3, extracting the spatial-domain principal energy features:
performing spatial-domain principal component analysis on the training data set $X$ of one-dimensional range profiles to obtain an eigenvector matrix $A$, and projecting each one-dimensional range profile sample $X_t$ onto $A$:

$$F_{St} = A^T X_t$$

where $F_{St}$ is the spatial-domain principal energy feature corresponding to $X_t$;
s4, fusing the time-domain feature $F_{Dt}$ and the spatial-domain principal energy feature $F_{St}$ to obtain the fused feature vector $F_t$ corresponding to $X_t$:

$$F_t = [F_{Dt}, F_{St}]$$
where $1 \le t \le m$; the training sample data set has $k$ classes, and during training the class-$q$ samples are taken as one class and the non-class-$q$ samples as the other class in turn, $1 \le q \le k$, so that $k$ classifiers are constructed; from the fused feature vectors $F_1, F_2, \ldots, F_m$ corresponding to all training samples, the decision functions of the $k$ classifiers are constructed, the decision function $D_q(\cdot)$ of the q-th classifier being

$$D_q(F_{any}) = \sum_{t=1}^{m} \alpha_{q,t} Y_{q,t} K(F_t, F_{any}) + b_q$$

where $X_{any}$ is any one-dimensional range profile sample, $F_{any}$ is the fused feature vector corresponding to $X_{any}$, $D_q(F_{any})$ is the decision function value corresponding to $X_{any}$, and $Y_{q,t}$ is the class label of training sample $X_t$ in the q-th classifier: $Y_{q,t} = 1$ when $X_t$ belongs to class $q$ and $Y_{q,t} = -1$ when $X_t$ belongs to the other classes; $\alpha_{q,t}$ and $b_q$ are coefficients obtained by optimization on the training samples, and $K(\cdot,\cdot)$ is a Gaussian kernel function;
s5, taking the one-dimensional range profile of the target to be recognized as input $X_e$, extracting the deep time-domain correlation feature $F_{De}$ with the bidirectional GRU network and fusing it with the corresponding spatial-domain principal energy feature $F_{Se}$ to obtain the fused feature vector $F_e$:

$$F_e = [F_{De}, F_{Se}]$$

substituting the fused feature vector $F_e$ corresponding to $X_e$ into the decision functions $D_q(\cdot)$, computing the decision function values $D_1(F_e), D_2(F_e), \ldots, D_k(F_e)$ of the $k$ classifiers and sorting them, and comparing the maximum decision function value with the set decision threshold $d_{th}$: if it is greater than the threshold, the target to be recognized is judged to be a known target; otherwise it is judged to be an unknown target.
CN202210548825.1A 2022-05-20 2022-05-20 Unknown target discrimination method based on fusion of time domain features and space domain features Active CN114943286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210548825.1A CN114943286B (en) 2022-05-20 2022-05-20 Unknown target discrimination method based on fusion of time domain features and space domain features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210548825.1A CN114943286B (en) 2022-05-20 2022-05-20 Unknown target discrimination method based on fusion of time domain features and space domain features

Publications (2)

Publication Number Publication Date
CN114943286A CN114943286A (en) 2022-08-26
CN114943286B true CN114943286B (en) 2023-04-07

Family

ID=82909914

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210548825.1A Active CN114943286B (en) 2022-05-20 2022-05-20 Unknown target discrimination method based on fusion of time domain features and space domain features

Country Status (1)

Country Link
CN (1) CN114943286B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7724960B1 (en) * 2006-09-08 2010-05-25 University Of Central Florida Research Foundation Inc. Recognition and classification based on principal component analysis in the transform domain
CN107463966A (en) * 2017-08-17 2017-12-12 电子科技大学 Radar range profile's target identification method based on dual-depth neutral net
CN108764084A (en) * 2018-05-17 2018-11-06 西安电子科技大学 Video classification methods based on spatial domain sorter network and the time domain network integration
CN111079594A (en) * 2019-12-04 2020-04-28 成都考拉悠然科技有限公司 Video action classification and identification method based on double-current cooperative network
CN111273288A (en) * 2020-03-06 2020-06-12 电子科技大学 Radar unknown target identification method based on long-term and short-term memory network
CN112580615A (en) * 2021-02-26 2021-03-30 北京远鉴信息技术有限公司 Living body authentication method and device and electronic equipment
CN113095386A (en) * 2021-03-31 2021-07-09 华南师范大学 Gesture recognition method and system based on three-axis acceleration space-time feature fusion
CN113411566A (en) * 2021-05-17 2021-09-17 杭州电子科技大学 No-reference video quality evaluation method based on deep learning
CN113487564A (en) * 2021-07-02 2021-10-08 杭州电子科技大学 Double-current time sequence self-adaptive selection video quality evaluation method for user original video
CN114428234A (en) * 2021-12-23 2022-05-03 西安电子科技大学 Radar high-resolution range profile noise reduction identification method based on GAN and self-attention

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10068171B2 (en) * 2015-11-12 2018-09-04 Conduent Business Services, Llc Multi-layer fusion in a convolutional neural network for image classification


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Ning Wang等.Construction and Analysis of Cross-layer Aggregation Neural Network for AMI Intrusion Detection.《2022 4th Asia Energy and Electrical Engineering Symposium (AEEES)》.2022,148-153. *
Yuchen Chu等.Attention Enhanced Spatial Temporal Neural Network For HRRP Recognition.《ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)》.2021,3805-3809. *
Yudong Cao等.Transfer learning for remaining useful life prediction of multi-conditions bearings based on bidirectional-GRU network.《Measurement》.2021,第178卷1-14. *
冯贤洋. Fault diagnosis of key gearbox components based on the Internet of Things and a CNN-BiGRU-AM neural network. China Master's Theses Full-text Database, Engineering Science and Technology II. 2022, C029-193. *
郑纯丹 et al. Application of sparse decomposition to radar one-dimensional range profiles. Radar Science and Technology. 2013, vol. 11, 55-58. *
陈景霞 et al. Emotion classification of EEG spatiotemporal features based on hybrid neural networks. Journal of Software. 2021, vol. 32, 3869-3883. *

Also Published As

Publication number Publication date
CN114943286A (en) 2022-08-26

Similar Documents

Publication Publication Date Title
CN109086700B (en) Radar one-dimensional range profile target identification method based on deep convolutional neural network
CN107564025B (en) Electric power equipment infrared image semantic segmentation method based on deep neural network
CN110334741B (en) Radar one-dimensional range profile identification method based on cyclic neural network
CN106599797B (en) A kind of infrared face recognition method based on local parallel neural network
CN108256436B (en) Radar HRRP target identification method based on joint classification
CN111123257B (en) Radar moving target multi-frame joint detection method based on graph space-time network
CN111913156B (en) Radar radiation source individual identification method based on deep learning model and feature combination
CN110033473B (en) Moving target tracking method based on template matching and depth classification network
CN113050042B (en) Radar signal modulation type identification method based on improved UNet3+ network
CN106443632B (en) The radar target identification method of multitask Factor Analysis Model is kept based on label
CN109002848B (en) Weak and small target detection method based on feature mapping neural network
CN109948722B (en) Method for identifying space target
CN107977683B (en) Joint SAR target recognition method based on convolution feature extraction and machine learning
CN113657491A (en) Neural network design method for signal modulation type recognition
CN114595732B (en) Radar radiation source sorting method based on depth clustering
CN110929842B (en) Accurate intelligent detection method for non-cooperative radio signal burst time region
CN111880158A (en) Radar target detection method and system based on convolutional neural network sequence classification
CN111273288A (en) Radar unknown target identification method based on long-term and short-term memory network
CN112965062A (en) Radar range profile target identification method based on LSTM-DAM network
CN112560596A (en) Radar interference category identification method and system
CN110033043B (en) Radar one-dimensional range profile rejection method based on condition generation type countermeasure network
CN110223342B (en) Space target size estimation method based on deep neural network
CN113109782B (en) Classification method directly applied to radar radiation source amplitude sequence
CN112835008B (en) High-resolution range profile target identification method based on attitude self-adaptive convolutional network
CN112946600B (en) Method for constructing radar HRRP database based on WGAN-GP

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant