CN115856881A - Millimeter wave radar behavior sensing method based on dynamic lightweight network - Google Patents

Millimeter wave radar behavior sensing method based on dynamic lightweight network

Info

Publication number
CN115856881A
Authority
CN
China
Prior art keywords
dynamic
data
layer
branch
network
Prior art date
Legal status
Granted
Application number
CN202310041358.8A
Other languages
Chinese (zh)
Other versions
CN115856881B (en)
Inventor
盛碧云
包燕
肖甫
桂林卿
蔡惠
Current Assignee
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN202310041358.8A
Publication of CN115856881A
Application granted
Publication of CN115856881B
Legal status: Active
Anticipated expiration

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention belongs to the technical field of millimeter wave radar signal behavior perception and discloses a millimeter wave radar behavior perception method based on a dynamic lightweight network. Radar micro-Doppler data are denoised, the three two-dimensional matrices of each frame are stacked into a three-dimensional matrix, and the three-dimensional matrices of all frames in a sample are concatenated along the time dimension to generate video-like data, overcoming the limitation that a traditional heat map contains only a single type of information. At the same time, a dynamic lightweight module based on an attention mechanism and channel shuffle is designed, and a dynamic lightweight Slowfast network is proposed, improving model perception accuracy and working efficiency while reducing computational complexity.

Description

Millimeter wave radar behavior sensing method based on dynamic lightweight network
Technical Field
The invention belongs to the technical field of millimeter wave radar signal behavior perception, and particularly relates to a millimeter wave radar behavior perception method based on a dynamic lightweight network.
Background
Millimeter wave radar offers high range and velocity resolution. With the increasing integration level of millimeter wave radar, small-volume radar can now be integrated into consumer devices such as smartphones and smart speakers, and using millimeter wave radar signals to recognize human activities has become a hot research direction.
At present, sensing methods based on millimeter wave radar micro-Doppler data directly input a heat map into a deep learning algorithm to identify human activities. However, the generated heat map data often contain only partial information and cannot comprehensively characterize the measured target in terms of range, velocity, and angle (including azimuth and elevation). To obtain relatively complete target heat map information, the prior art typically inputs different types of information into different networks and fuses them across networks, which increases operational complexity. Moreover, existing models usually focus only on accuracy and neglect model complexity and computational cost.
Disclosure of Invention
To solve the above technical problem, the invention provides a millimeter wave radar behavior sensing method based on a dynamic lightweight network, which converts the three-view signals of each frame of radar micro-Doppler data into three-dimensional sequence data and concatenates the multiple frames of each sample to generate video-like data. The Slowfast structure performs well in video sensing and can fully mine the spatio-temporal characteristics of the data and improve feature representation capability. At the same time, a dynamic lightweight module based on an attention mechanism and channel shuffle is designed and a dynamic lightweight Slowfast network is proposed, improving the sensing accuracy and working efficiency of the model while reducing computational complexity.
The invention relates to a millimeter wave radar behavior sensing method based on a dynamic lightweight network, which comprises the following steps:
step 1, collecting radar micro-Doppler data of different actions from different volunteers in different scenes, and dividing the data into a training set and a test set;
step 2, denoising the acquired radar micro-Doppler data;
step 3, further processing the denoised data to generate range-velocity, range-elevation and range-azimuth matrix data;
step 4, further processing the matrices of the three dimensions to generate video-like three-view data;
step 5, constructing a dynamic lightweight Slowfast network, inputting the training set data into the network, and training the network model;
and step 6, inputting the test samples into the trained model and outputting their action categories to realize human activity recognition.
Further, in step 2, the interference from stationary targets is estimated and the average frequency response of the frequency domain is removed, generating the denoised data.
Further, in step 3, fast Fourier transforms (FFT) along the range, Doppler and antenna dimensions are performed on the original IQ signal to obtain the range-velocity matrix, range-azimuth matrix and range-elevation matrix of the target.
Further, in step 4, the range-velocity matrix, the range-azimuth matrix and the range-elevation matrix of each frame are connected into a three-dimensional matrix, and the three-dimensional matrices of all frames in a sample are then connected along the time dimension to generate video-like data.
Further, an attention mechanism based on dynamic convolution and channel shuffle are integrated into the Slowfast network to construct a dynamic lightweight Slowfast structural model. The dynamic lightweight Slowfast network comprises a fast branch and a slow branch; the fast branch and the slow branch have the same structure and each comprise five stages. The first stage comprises a depthwise separable convolution layer, an activation layer and a max pooling layer arranged in sequence; the second stage comprises a dynamic lightweight module Block A; the third and fourth stages each comprise a dynamic lightweight module Block B and a dynamic lightweight module Block C arranged in sequence; the fifth stage comprises a dynamic lightweight module Block B, an activation layer and a global average pooling layer arranged in sequence. After each stage, the data of the fast branch and the slow branch are fused: in the first four stages the fast-branch data pass through a depthwise separable convolution layer and are then concatenated with the slow-branch data, and in the fifth stage the fast-branch data and the slow-branch data are concatenated directly.
Further, the dynamic lightweight module Block A and the dynamic lightweight module Block B have the same structure and each comprise a left branch and a right branch, with the data input to both branches. The left branch comprises a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer arranged in sequence; the right branch comprises a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer arranged in sequence. The feature data obtained by the left and right branches of the dynamic lightweight modules Block A and Block B are concatenated and the channels are shuffled to serve as the input of the next layer.
Further, the dynamic lightweight module Block C comprises two branches; the input data is split into two halves, one half of which is input into the right branch of Block C and sequentially passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer. The other half is left unprocessed and is concatenated with the feature data obtained by the right branch, and the channels are shuffled to serve as the input of the next layer.
The invention has the beneficial effects that: according to the characteristics of millimeter wave radar signals, the invention innovatively provides a three-view signal processing mode based on radar micro-Doppler data. The two-dimensional range-velocity, range-elevation and range-azimuth matrices of each frame are directly stacked to generate three-dimensional data, and all frames form a four-dimensional video-like sample; this type of data fuses multi-view information at the initial data-generation stage, overcoming the limitation that a traditional heat map contains only a single type of information. In addition, according to the characteristics of the video-like data, the attention mechanism and the channel-shuffle design are embedded into the Slowfast architecture, and three dynamic lightweight modules are designed. These modules achieve lightweight network parameters by using depthwise separable convolution and a channel-shuffle mechanism, while dynamic convolution assigns weights to different convolution kernels to improve sensing accuracy. On this basis, a dynamic lightweight Slowfast network is proposed, in which dynamic depthwise convolution layers (combining dynamic convolution with depthwise convolution) and dynamic point convolution layers (combining dynamic convolution with point convolution) are fused into the three modules, further reducing network parameters and complexity and improving both network lightness and model accuracy, so that the human activity recognition task can be accomplished with small parameter and computation cost.
Drawings
FIG. 1 is a basic framework diagram of the millimeter wave radar human activity recognition method based on a dynamic lightweight network in an embodiment of the present invention;
FIG. 2 is a flowchart of the millimeter wave radar human activity recognition method based on a dynamic lightweight network in an embodiment of the present invention;
FIG. 3 is a schematic diagram of video-like data generation in an embodiment of the present invention;
FIG. 4 is a structural diagram of the dynamic lightweight modules in an embodiment of the present invention;
FIG. 5 is a model structure diagram of the dynamic lightweight Slowfast network in an embodiment of the present invention;
FIG. 6 is a schematic diagram of confusion matrices for the recognition results in different scenes in an embodiment of the present invention.
Detailed Description
In order that the present invention may be more readily and clearly understood, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments that are illustrated in the appended drawings.
The invention relates to a millimeter wave radar behavior sensing method based on a dynamic lightweight network, which comprises the following steps:
step 1, collecting radar micro-Doppler data of different actions from different volunteers in different scenes, and dividing the data into a training set and a test set;
step 2, denoising the acquired radar micro-Doppler data;
step 3, further processing the denoised data to generate range-velocity, range-elevation and range-azimuth matrix data;
step 4, further processing the matrices of the three dimensions to generate video-like three-view data;
step 5, constructing a dynamic lightweight Slowfast network, inputting the training set data into the network, and training the network model;
and step 6, inputting the test samples into the trained model and outputting their action categories to realize human activity recognition.
According to the characteristics of millimeter wave radar signals, the invention innovatively provides a three-view signal processing mode based on radar micro-Doppler data and generates video-like data that fuses multi-view information at the initial data-generation stage, overcoming the limitation that a traditional heat map contains only a single type of information. In addition, according to the characteristics of the video-like data, the attention mechanism and the channel-shuffle design are embedded into the Slowfast network and three dynamic lightweight modules are designed; on this basis a dynamic lightweight Slowfast network is proposed that can accomplish the human activity recognition task with small parameter and computation cost.
Fig. 1 is a basic framework diagram of the millimeter wave radar behavior sensing method based on a dynamic lightweight network according to an embodiment of the present invention. The overall framework of the embodiment comprises millimeter wave radar signal acquisition, data preprocessing, dynamic lightweight Slowfast network model construction, and action recognition.
The method for sensing the behaviors of the millimeter wave radar based on the dynamic lightweight network according to the present invention will be described in detail below.
Fig. 2 is a flowchart of a millimeter wave radar behavior sensing method based on a dynamic lightweight network according to an embodiment of the present invention.
In step 1, an IWR6843AOP evaluation board produced by TI, with a working frequency band of 60-64 GHz, is used to transmit and receive millimeter wave radar signals; a PC running TI's mmWave Studio converts the analog signals into digital signals, and frequency-modulated continuous-wave signals of the subject's actions are collected. Using this equipment, radar micro-Doppler data of different actions are collected from different volunteers in different scenes, and a training set and a test set are divided.
In step 2, because the frequency response associated with stationary objects in the range-Doppler domain remains consistent over time, the interference from stationary objects is roughly estimated by the average frequency response of the range-Doppler domain; this average frequency response is subtracted from the range-Doppler data, and the result is used as the denoised data.
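As a concrete illustration of this clutter-removal step, the following is a minimal numpy sketch, assuming the data has already been arranged as a per-sample range-Doppler cube; the function name, array layout and shapes are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def remove_static_clutter(range_doppler: np.ndarray) -> np.ndarray:
    """range_doppler: complex cube of shape (num_frames, num_range_bins, num_doppler_bins)."""
    # Stationary targets give a frequency response that is (nearly) constant over
    # time, so the per-bin mean over frames approximates their contribution.
    clutter_estimate = range_doppler.mean(axis=0, keepdims=True)
    # Subtracting the estimate leaves mainly the micro-Doppler of moving targets.
    return range_doppler - clutter_estimate
```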
In step 3, a fast Fourier transform (FFT) along the range dimension is performed on the original IQ signal; on this basis, a Doppler-dimension FFT is further performed to obtain the velocity of the target over multiple range bins, i.e., the range-velocity result. Angle FFTs are then applied along all horizontal receiving antennas and along all vertical receiving antennas to obtain the range-azimuth and range-elevation results respectively, and the range-velocity, range-azimuth and range-elevation matrices are stored directly.
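The range, Doppler and angle FFT chain can be sketched as follows; the raw-cube layout, the zero-padded angle FFT length of 64, and the non-coherent summation over antennas are assumptions made for illustration and do not reproduce the IWR6843AOP antenna geometry or the patent's exact processing.

```python
import numpy as np

def frame_to_three_views(iq: np.ndarray):
    """iq: complex raw cube of one frame, shape (chirps, rx_az, rx_el, adc_samples)."""
    # Range FFT over fast-time samples, then Doppler FFT over chirps.
    rng = np.fft.fft(iq, axis=-1)
    rd_cube = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)
    # Range-velocity map: non-coherent sum over all antennas.
    range_velocity = np.abs(rd_cube).sum(axis=(1, 2)).T    # (samples, chirps)
    # Angle FFTs over the horizontal and vertical antenna axes give azimuth / elevation.
    az_cube = np.fft.fftshift(np.fft.fft(rd_cube, n=64, axis=1), axes=1)
    el_cube = np.fft.fftshift(np.fft.fft(rd_cube, n=64, axis=2), axes=2)
    range_azimuth = np.abs(az_cube).sum(axis=(0, 2)).T     # (samples, 64)
    range_elevation = np.abs(el_cube).sum(axis=(0, 1)).T   # (samples, 64)
    return range_velocity, range_azimuth, range_elevation
```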
In step 4, one action sample comprises multiple frames; the range-velocity, range-azimuth and range-elevation matrices contained in each frame are stacked into three-dimensional data, generating three views that describe the observed changes in horizontal and vertical space. Finally, the three-dimensional data generated for each frame are concatenated into multi-frame video-like data.
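A short sketch of the stacking described above, assuming the three per-frame maps have already been brought to a common size (a detail the text does not specify here):

```python
import numpy as np

def build_video_like_sample(frames):
    """frames: list of (range_velocity, range_azimuth, range_elevation) tuples,
    each a 2-D map resized/cropped to a common (H, W) beforehand."""
    per_frame = [np.stack(view_triplet, axis=0) for view_triplet in frames]  # (3, H, W) each
    return np.stack(per_frame, axis=0)  # (num_frames, 3, H, W): the video-like sample
```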
In step 5, an attention mechanism based on dynamic convolution and channel shuffle are integrated into the Slowfast network to construct a dynamic lightweight Slowfast structural model. The dynamic lightweight network is composed of dynamic lightweight modules, whose specific structure is shown in fig. 4; the overall construction of the dynamic lightweight Slowfast network is shown in fig. 5. Training data and class labels are input into the dynamic lightweight Slowfast network to extract features and judge the type of human action; the loss between the predicted and true values is then back-propagated to update the model parameters and train the network model.
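The attention-based dynamic convolution referred to in this step can be sketched in PyTorch as a per-sample weighted combination of K candidate kernels; the kernel count, the attention head design, and the use of 2-D rather than 3-D convolution are illustrative assumptions, not the patent's reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicConv2d(nn.Module):
    """Attention-weighted combination of K candidate convolution kernels."""
    def __init__(self, in_ch, out_ch, kernel_size, num_kernels=4, groups=1, stride=1):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, kernel_size, stride=stride,
                      padding=kernel_size // 2, groups=groups, bias=False)
            for _ in range(num_kernels)
        )
        # Squeeze-and-excite style attention that scores the K candidate kernels.
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, num_kernels)
        )

    def forward(self, x):
        weights = F.softmax(self.attention(x), dim=1)             # (B, K)
        outs = torch.stack([conv(x) for conv in self.convs], 1)   # (B, K, C_out, H, W)
        return (weights[:, :, None, None, None] * outs).sum(dim=1)
```

Setting groups=in_ch with out_ch=in_ch corresponds to a dynamic depthwise convolution, and kernel_size=1 to a dynamic point convolution, so both layer types used in the modules below can be obtained from this single sketch.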
Fig. 4 is a structural diagram of the dynamic lightweight modules in the embodiment of the present invention. Fig. 4 includes the three modules Block A-C, where DyDWConv denotes dynamic depthwise convolution and DyConv denotes dynamic point convolution; Blocks A and B both feed all input features into two branches for feature extraction, but the steps in the gray region differ. In modules Block A and B, the data of the previous layer is input to the left branch, which passes through a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer; the right branch passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer. The feature data obtained by the left and right branches of the module are concatenated and the channels are shuffled to serve as the input of the next layer. In Block C, the data of the previous layer is split equally; one half is input to the right branch, which passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer, while the other half is left unprocessed and is concatenated with the feature data obtained by the right branch, after which the channels are shuffled to serve as the input of the next layer. The dynamic depthwise convolution layer combines dynamic convolution with depthwise convolution, and the dynamic point convolution layer combines dynamic convolution with point convolution, improving both network lightness and model accuracy.
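Building on the DynamicConv2d sketch above, the Block C pattern (channel split, one processed branch, concatenation, channel shuffle) might look as follows; the layer widths and the ReLU activation are assumptions, and Blocks A and B would differ by processing both halves through their own branches.

```python
import torch
import torch.nn as nn

def channel_shuffle(x, groups=2):
    # Interleave channels across the two concatenated branches.
    b, c, h, w = x.shape
    return x.view(b, groups, c // groups, h, w).transpose(1, 2).reshape(b, c, h, w)

class DynamicBlockC(nn.Module):
    def __init__(self, channels):
        super().__init__()
        half = channels // 2
        self.right = nn.Sequential(
            DynamicConv2d(half, half, 1), nn.BatchNorm2d(half), nn.ReLU(inplace=True),
            DynamicConv2d(half, half, 3, groups=half), nn.BatchNorm2d(half),
            DynamicConv2d(half, half, 1), nn.BatchNorm2d(half), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        left, right = x.chunk(2, dim=1)                # split channels into two halves
        out = torch.cat([left, self.right(right)], 1)  # left half passes through untouched
        return channel_shuffle(out)
```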
Fig. 5 is a model structure diagram of the dynamic lightweight Slowfast network in the embodiment of the present invention. As can be seen from fig. 5, the dynamic lightweight Slowfast network is a five-stage network composed of a fast branch and a slow branch, which process data at fast and slow frame rates respectively; the fast branch is continuously merged into the slow branch so as to extract feature information from the input data. In the first stage, the two branches each extract a certain proportion of the data frames and pass them through a depthwise separable convolution layer, an activation layer and a max pooling layer; the second stage inputs the data into module Block A; the third and fourth stages each input the data into modules Block B and Block C in sequence; and the fifth stage inputs the data into module Block B, followed by an activation layer and a global average pooling layer. After each stage, the data of the fast and slow branches are fused: in the first four stages the fast-branch data pass through a depthwise separable convolution layer and are then concatenated with the slow-branch data, while in the fifth stage the fast-branch and slow-branch data are concatenated directly. Finally, the classification result is output by a fully connected layer and a softmax layer.
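A highly simplified two-pathway stage with a lateral fusion connection, in the spirit of the structure described above, is sketched below; the channel widths, the temporal stride of the lateral convolution, and the use of plain 3-D convolutions in place of the dynamic lightweight modules are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class TwoPathwayStage(nn.Module):
    def __init__(self, fast_ch, slow_ch):
        super().__init__()
        self.fast_stage = nn.Conv3d(fast_ch, fast_ch, 3, padding=1)
        self.slow_stage = nn.Conv3d(slow_ch + fast_ch, slow_ch, 3, padding=1)
        # Lateral connection: compress the fast branch in time to match the slow branch.
        self.lateral = nn.Conv3d(fast_ch, fast_ch, kernel_size=(5, 1, 1),
                                 stride=(4, 1, 1), padding=(2, 0, 0))

    def forward(self, fast, slow):
        fast = self.fast_stage(fast)
        slow = self.slow_stage(torch.cat([slow, self.lateral(fast)], dim=1))
        return fast, slow

# Toy shapes: the fast branch keeps all frames with few channels,
# the slow branch keeps 1/4 of the frames with more channels.
fast = torch.randn(1, 8, 32, 16, 16)    # (B, C, T, H, W)
slow = torch.randn(1, 32, 8, 16, 16)
fast, slow = TwoPathwayStage(fast_ch=8, slow_ch=32)(fast, slow)
```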
Further, step 6 is specifically to input the test sample into the trained model, output the motion category of the test sample, and implement human activity recognition.
Action samples were collected in an outdoor scene, comprising five actions: box, wave, stand, walk and square; samples were also collected indoors, comprising five actions: wave, jump, throw, walk and square. Fig. 6 shows the confusion matrices of the recognition results in the different scenes, where the horizontal axis is the predicted label and the vertical axis is the true label. Each value in the confusion matrices of fig. 6 represents the proportion of samples of a given action predicted as that action or as other actions; the larger the diagonal values, the higher the recognition accuracy. As can be seen from the two confusion matrices in fig. 6, the method of this embodiment achieves high-accuracy recognition of various actions in different scenes.
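For reference, the per-class proportions shown in such a confusion matrix can be computed by normalizing over the true labels, as in the following sketch; the toy labels and predictions are placeholders, not the experimental data.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

labels = ["box", "wave", "stand", "walk", "square"]           # outdoor action classes from the text
y_true = ["box", "wave", "stand", "walk", "square", "walk"]   # dummy ground truth
y_pred = ["box", "wave", "stand", "walk", "square", "stand"]  # dummy predictions
cm = confusion_matrix(y_true, y_pred, labels=labels, normalize="true")
print(np.round(cm, 2))  # diagonal entries are the per-class recognition rates
```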
A comparison of the parameter counts of the original Slowfast model and the lightweight network of the invention is shown in Table 1. As can be seen from Table 1, the parameter count and floating-point operations of the dynamic lightweight network are greatly reduced, realizing a lightweight network design.
TABLE 1 model complexity description of dynamic lightweight networks in the examples of the invention
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and all equivalent variations made by using the contents of the present specification and the drawings are within the protection scope of the present invention.

Claims (7)

1. A millimeter wave radar behavior perception method based on a dynamic lightweight network is characterized by comprising the following steps:
step 1, collecting radar micro-Doppler data of different actions from different volunteers in different scenes, and dividing the data into a training set and a test set;
step 2, denoising the acquired radar micro-Doppler data;
step 3, further processing the denoised data to generate range-velocity, range-elevation and range-azimuth matrix data;
step 4, further processing the matrices of the three dimensions to generate video-like three-view data;
step 5, constructing a dynamic lightweight Slowfast network, inputting the training set data into the network, and training the network model;
and step 6, inputting the test samples into the trained model and outputting their action categories to realize human activity recognition.
2. The method as claimed in claim 1, wherein in step 2, the interference from stationary targets is estimated and the average frequency response of the frequency domain is removed to generate the denoised data.
3. The method as claimed in claim 1, wherein in step 3, fast Fourier transforms (FFT) along the range, Doppler and antenna dimensions are performed on the original IQ signals to obtain the range-velocity matrix, range-azimuth matrix and range-elevation matrix of the target.
4. The method as claimed in claim 1, wherein in step 4, the range-velocity matrix, the range-azimuth matrix and the range-elevation matrix of each frame are connected into a three-dimensional matrix, and the three-dimensional matrices of all frames in a sample are connected along the time dimension to generate the video-like data.
5. The millimeter wave radar behavior sensing method based on the dynamic lightweight network according to claim 1, wherein the dynamic lightweight Slowfast network comprises a fast branch and a slow branch; the fast branch and the slow branch have the same structure and each comprise five stages; the first stage comprises a depthwise separable convolution layer, an activation layer and a max pooling layer arranged in sequence; the second stage comprises a dynamic lightweight module Block A; the third and fourth stages each comprise a dynamic lightweight module Block B and a dynamic lightweight module Block C arranged in sequence; the fifth stage comprises a dynamic lightweight module Block B, an activation layer and a global average pooling layer arranged in sequence; and after each stage, the data of the fast branch and the slow branch are fused, wherein in the first four stages the fast-branch data pass through a depthwise separable convolution layer and are then concatenated with the slow-branch data, and in the fifth stage the fast-branch data and the slow-branch data are concatenated directly.
6. The millimeter wave radar behavior sensing method based on the dynamic lightweight network according to claim 5, wherein the dynamic lightweight module Block A and the dynamic lightweight module Block B each comprise a left branch and a right branch, and the data is input to the left branch and the right branch respectively; the left branch comprises a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer arranged in sequence; the right branch comprises a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer arranged in sequence; and the feature data obtained by the left and right branches of the dynamic lightweight modules Block A and Block B are concatenated and the channels are shuffled to serve as the input of the next layer.
7. The millimeter wave radar behavior sensing method based on the dynamic lightweight network, wherein the dynamic lightweight module Block C comprises two branches; the input data is split into two halves, one half of which is input into the right branch of Block C and sequentially passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depthwise convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer; and the other half is left unprocessed and is concatenated with the feature data obtained by the right branch, and the channels are shuffled to serve as the input of the next layer.
CN202310041358.8A 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network Active CN115856881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310041358.8A CN115856881B (en) 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310041358.8A CN115856881B (en) 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network

Publications (2)

Publication Number Publication Date
CN115856881A true CN115856881A (en) 2023-03-28
CN115856881B CN115856881B (en) 2023-05-12

Family

ID=85657295

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310041358.8A Active CN115856881B (en) 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network

Country Status (1)

Country Link
CN (1) CN115856881B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110765974A (en) * 2019-10-31 2020-02-07 复旦大学 Micro-motion gesture recognition method based on millimeter wave radar and convolutional neural network
CN115048951A (en) * 2021-03-09 2022-09-13 深圳市万普拉斯科技有限公司 Millimeter wave radar-based gesture recognition method and device and terminal equipment
CN113963441A (en) * 2021-10-25 2022-01-21 中国科学技术大学 Cross-domain enhancement-based millimeter wave radar gesture recognition method and system
CN113989718A (en) * 2021-10-29 2022-01-28 南京邮电大学 Human body target detection method facing radar signal heat map
CN114445914A (en) * 2022-01-26 2022-05-06 厦门大学 Millimeter wave data automatic labeling method and system based on video
CN115063884A (en) * 2022-06-14 2022-09-16 电子科技大学 Millimeter wave radar head action recognition method based on multi-domain fusion deep learning
CN115343704A (en) * 2022-07-29 2022-11-15 中国地质大学(武汉) Gesture recognition method of FMCW millimeter wave radar based on multi-task learning
CN115294656A (en) * 2022-08-23 2022-11-04 南京邮电大学 FMCW radar-based hand key point tracking method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHANJUN HAO ET AL.: "Millimeter wave radar-based method for detecting typical dangerous driving manoeuvres", Research Square *
YUAN ZHIAN ET AL.: "Human fall detection method based on RDSNet using millimeter wave radar", Journal of Radars *
LI YONG ET AL.: "A survey of deep learning-based human behavior recognition and detection", Science Technology and Engineering *

Also Published As

Publication number Publication date
CN115856881B (en) 2023-05-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant