CN115856881B - Millimeter wave radar behavior sensing method based on dynamic lightweight network - Google Patents

Millimeter wave radar behavior sensing method based on dynamic lightweight network

Info

Publication number
CN115856881B
CN115856881B (application CN202310041358.8A)
Authority
CN
China
Prior art keywords
dynamic
data
layer
branch
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310041358.8A
Other languages
Chinese (zh)
Other versions
CN115856881A (en)
Inventor
盛碧云 (Sheng Biyun)
包燕 (Bao Yan)
肖甫 (Xiao Fu)
桂林卿 (Gui Linqing)
蔡惠 (Cai Hui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN202310041358.8A priority Critical patent/CN115856881B/en
Publication of CN115856881A publication Critical patent/CN115856881A/en
Application granted granted Critical
Publication of CN115856881B publication Critical patent/CN115856881B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention belongs to the technical field of millimeter wave radar behavior sensing and discloses a millimeter wave radar behavior sensing method based on a dynamic lightweight network. After the radar micro-Doppler data are denoised, the three two-dimensional matrices of each frame are stacked into a three-dimensional matrix, and the three-dimensional matrices of all frames in one sample are concatenated along the time dimension to generate video-like data, overcoming the drawback that a traditional heat map can only carry a single type of information. Meanwhile, a dynamic lightweight module based on an attention mechanism and channel shuffle is designed and a dynamic lightweight Slowfast network is proposed, which improves perception accuracy and working efficiency while reducing computational complexity.

Description

Millimeter wave radar behavior sensing method based on dynamic lightweight network
Technical Field
The invention belongs to the technical field of millimeter wave radar signal behavior sensing, and particularly relates to a millimeter wave radar behavior sensing method based on a dynamic lightweight network.
Background
Millimeter wave radars offer high range resolution and velocity resolution, and as their integration level improves, small-volume millimeter wave radars can be integrated into consumer devices such as smartphones and smart speakers; human activity recognition using millimeter wave radar signals has therefore become a popular research direction.
At present, sensing methods based on millimeter wave radar micro-Doppler data input a heat map directly into a deep learning algorithm for human activity recognition. However, the generated heat map image often contains only part of the information and cannot simultaneously characterize the measured target in terms of distance, speed and angle (including horizontal angle and pitch angle). To obtain the full target heat map information, the prior art typically feeds different information into different networks and fuses the information across networks, which increases operational complexity. Moreover, existing models focus only on accuracy and ignore model complexity and computational cost.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a millimeter wave radar behavior sensing method based on a dynamic lightweight network, which converts the three-view signals of each frame of radar micro-Doppler data into three-dimensional sequence data and connects the multiple frames of each sample to generate video-like data. The Slowfast structure performs well in video perception, fully mining the spatio-temporal characteristics of the data and improving the representation capability of the features; meanwhile, the invention designs a dynamic lightweight module based on an attention mechanism and channel shuffle and proposes a dynamic lightweight Slowfast network, which improves perception accuracy and working efficiency while reducing computational complexity.
The invention relates to a millimeter wave radar behavior sensing method based on a dynamic lightweight network, which comprises the following steps:
step 1, collecting radar micro Doppler data of different actions for different volunteers in different scenes, and dividing a training set and a testing set at the same time;
step 2, denoising the acquired radar micro Doppler data;
step 3, further processing the denoised data to generate matrix data of distance-speed, distance-elevation angle and distance-azimuth angle;
step 4, further processing the three-dimensional matrix to generate video-like three-view data;
step 5, constructing a dynamic lightweight Slowfast network, inputting data of a training set into the network, and training a network model;
and 6, inputting the test sample into the trained model, and outputting the action category of the test sample to realize human activity recognition.
Further, in step 2, the interference from stationary targets is estimated and the average frequency response of the frequency domain is subtracted, so as to generate the denoised data.
Further, in step 3, fast Fourier transforms (FFT) along the distance, Doppler and antenna dimensions are performed on the original IQ signal, so as to obtain the target's distance-speed matrix, distance-azimuth matrix and distance-pitch matrix.
In step 4, the distance-speed matrix, the distance-azimuth matrix and the distance-pitch matrix of each frame are connected to form a three-dimensional matrix, and then the three-dimensional matrices of all frames in one sample are connected along the time dimension to generate video-like data.
Furthermore, the dynamic-convolution-based attention mechanism and channel shuffle are integrated into a Slowfast network to construct a dynamic lightweight Slowfast structure model; the dynamic lightweight Slowfast network comprises a fast branch and a slow branch, which have the same structure and each comprise five stages; the first stage comprises a depth separable convolution layer, an activation layer and a maximum pooling layer which are sequentially arranged; the second stage comprises a dynamic lightweight module Block A; the third stage and the fourth stage each comprise a dynamic lightweight module Block B and a dynamic lightweight module Block C which are sequentially arranged; the fifth stage comprises a dynamic lightweight module Block B, an activation layer and a global average pooling layer which are sequentially arranged; after each stage, the data of the fast branch and the slow branch are fused: in the first four stages the fast branch data is passed through a depth separable convolution layer and then concatenated with the slow branch data, while in the fifth stage the fast and slow branch data are concatenated directly.
Further, the dynamic lightweight module Block A and the dynamic lightweight module Block B have the same structure, each comprising a left branch and a right branch into which the data are respectively input; the left branch comprises a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer which are sequentially arranged; the right branch comprises a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer which are sequentially arranged; the feature data obtained by the left and right branches of Block A and Block B are concatenated and the channels are shuffled to serve as the input of the next layer.
Further, the dynamic lightweight module Block C comprises two branches which bisect the data of the previous layer; half of the data is input into the right branch of Block C and sequentially passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer; the other half is left unprocessed and is concatenated with the feature data obtained by the right branch, and the channels are shuffled to serve as the input of the next layer.
The beneficial effects of the invention are as follows. According to the characteristics of millimeter wave radar signals, the invention innovatively proposes a three-view signal processing mode based on radar micro-Doppler data: the two-dimensional distance-speed, distance-elevation angle and distance-azimuth angle matrices of each frame are directly stacked to generate three-dimensional data, and all frames form a four-dimensional video-like sample; this type of data fuses multi-view information at the initial data generation stage and overcomes the drawback that a traditional heat map can only carry a single type of information. In addition, according to the characteristics of the video-like data, three dynamic lightweight modules are designed on the Slowfast model architecture by embedding an attention mechanism and channel shuffle into the Slowfast network; these modules combine depth separable convolution with the channel shuffle mechanism to lighten the network parameters, while dynamic convolution weights different convolution kernels to improve perception accuracy. On this basis, a dynamic lightweight Slowfast network is proposed, in which a dynamic depth convolution layer (combining dynamic convolution and depth convolution) and a dynamic point convolution layer (combining dynamic convolution and point convolution) are fused into the three modules, further reducing network parameters and complexity while improving model accuracy, so that the network can realize human activity recognition tasks with a smaller parameter count and lower computational cost.
Drawings
FIG. 1 is a basic frame diagram of a human activity recognition method of a millimeter wave radar based on a dynamic lightweight network in an embodiment of the invention;
FIG. 2 is a flow chart of a human activity recognition method of millimeter wave radar based on a dynamic lightweight network in an embodiment of the invention;
FIG. 3 is a schematic diagram of video-like data generation in an embodiment of the present invention;
FIG. 4 is a diagram of a dynamic lightweight module architecture in an embodiment of the invention;
FIG. 5 is a diagram of a model structure of a dynamic lightweight Slowfast network in an embodiment of the present invention;
fig. 6 is a schematic diagram illustrating a confusion matrix of different scene recognition effects in an embodiment of the present invention.
Detailed Description
In order that the invention may be more readily understood, a more particular description of the invention will be rendered by reference to specific embodiments that are illustrated in the appended drawings.
The invention discloses a millimeter wave radar behavior sensing method based on a dynamic lightweight network, which comprises the following steps:
step 1, collecting radar micro Doppler data of different actions for different volunteers in different scenes, and dividing a training set and a testing set at the same time;
step 2, denoising the acquired radar micro Doppler data;
step 3, further processing the denoised data to generate matrix data of distance-speed, distance-elevation angle and distance-azimuth angle;
step 4, further processing the three-dimensional matrix to generate video-like three-view data;
step 5, constructing a dynamic lightweight Slowfast network, inputting data of a training set into the network, and training a network model;
and 6, inputting the test sample into the trained model, and outputting the action category of the test sample to realize human activity recognition.
According to the characteristics of millimeter wave radar signals, the invention innovatively proposes a three-view signal processing mode based on radar micro-Doppler data to generate video-like data, which fuses multi-view information at the initial data generation stage and overcomes the drawback that a traditional heat map can only carry a single type of information. In addition, according to the characteristics of the video-like data, the invention designs three dynamic lightweight modules by embedding an attention mechanism and channel shuffle into the Slowfast network, and on this basis proposes a dynamic lightweight Slowfast network that can realize human activity recognition tasks with a smaller parameter count and lower computational cost.
Referring to fig. 1, a basic frame diagram of a millimeter wave radar behavior sensing method based on a dynamic lightweight network according to an embodiment of the present invention is shown. The whole framework of the embodiment comprises millimeter wave radar signal acquisition, data preprocessing, dynamic lightweight Slowfast network model construction and action recognition.
The millimeter wave radar behavior sensing method based on the dynamic lightweight network is described in detail below.
Referring to fig. 2, a flowchart of a millimeter wave radar behavior sensing method based on a dynamic lightweight network according to an embodiment of the present invention is shown.
In step 1, an IWR6843AOP evaluation board manufactured by Texas Instruments (TI), with a working frequency band of 60-64 GHz, is used to transmit and receive millimeter wave radar signals, and a PC equipped with TI's mmWave Studio converts the analog signals into digital signals and collects the frequency-modulated continuous wave signals of the tested person's actions. Radar micro-Doppler data of different actions by different volunteers in different scenes are acquired with this equipment and divided into a training set and a testing set.
In step 2, since the frequency response of a stationary object in the range-Doppler frequency domain remains constant over time, the average frequency response of the range-Doppler frequency domain is used to roughly estimate the interference from stationary objects; this average frequency response is subtracted from the range-Doppler frequency domain, and the result is used as the denoised data.
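The clutter-removal step above can be sketched in a few lines (an illustrative NumPy sketch; the array shapes and the synthetic data are assumptions, not the patent's implementation):

```python
import numpy as np

def remove_static_clutter(rd_frames):
    """Subtract the temporal mean of the range-Doppler response.

    rd_frames: array of shape (T, n_range, n_doppler). A stationary
    target's response is approximately constant over the T frames, so the
    per-bin mean over time estimates the static interference.
    """
    clutter = rd_frames.mean(axis=0, keepdims=True)
    return rd_frames - clutter

# Illustration: a constant "wall" return plus a zero-mean moving component.
T, R, D = 8, 4, 4
rng = np.random.default_rng(0)
static = rng.standard_normal((1, R, D))           # time-invariant clutter
moving = rng.standard_normal((T, R, D)) * 0.1
moving -= moving.mean(axis=0, keepdims=True)      # zero mean over time
frames = static + moving
denoised = remove_static_clutter(frames)          # recovers `moving`
```

On this toy data the subtraction removes the static component exactly; on real radar data it only suppresses it approximately, which is why the description calls it a rough estimate.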
In step 3, a fast Fourier transform (FFT) along the distance dimension is performed on the original IQ signal, followed by an FFT along the Doppler dimension, which yields the target's speed over the distance bins, i.e., distance-speed; the distance-azimuth and distance-pitch results are then obtained by angle FFTs along all horizontal and vertical receiving antennas, and the distance-speed, distance-azimuth and distance-pitch matrices are stored directly.
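The FFT chain of step 3 can be illustrated as follows. The cube shape, the number of angle bins, and the collapsing of all antennas into a single angle view are hypothetical simplifications; a real IWR6843AOP array would split its receive channels into horizontal and vertical sub-arrays to obtain separate azimuth and elevation views:

```python
import numpy as np

def radar_cube_to_views(iq, n_angle_bins=16):
    """Hypothetical sketch: turn one frame's raw IQ cube into 2-D views.

    iq: complex cube of shape (n_samples, n_chirps, n_rx), i.e. fast time,
    slow time, receive antennas.
    """
    rng_fft = np.fft.fft(iq, axis=0)                          # range FFT
    dopp = np.fft.fftshift(np.fft.fft(rng_fft, axis=1), axes=1)
    range_velocity = np.abs(dopp).sum(axis=2)                 # collapse rx
    ang = np.fft.fftshift(np.fft.fft(rng_fft, n_angle_bins, axis=2), axes=2)
    range_angle = np.abs(ang).sum(axis=1)                     # collapse chirps
    # A real array would use horizontal/vertical sub-arrays for separate
    # azimuth and elevation views; here one view stands in for both.
    return range_velocity, range_angle, range_angle.copy()

rng = np.random.default_rng(1)
iq = rng.standard_normal((64, 32, 4)) + 1j * rng.standard_normal((64, 32, 4))
rv, raz, rel = radar_cube_to_views(iq)   # (64, 32), (64, 16), (64, 16)
```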
In step 4, one action sample comprises multiple frames; the distance-speed, distance-azimuth and distance-pitch matrices contained in each frame are stacked into three-dimensional data, generating three views that describe the observed changes in horizontal and vertical space; finally, the three-dimensional data generated for each frame are concatenated into multi-frame video-like data.
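The assembly of step 4, stacking the three per-frame views as channels and then stacking frames along time, can be sketched as follows, under the assumption (not stated in the patent) that the three views have been resampled to a common grid:

```python
import numpy as np

def frames_to_video(per_frame_views):
    """Stack each frame's three views (distance-speed, distance-azimuth,
    distance-pitch) as channels, then stack frames along a leading time
    axis, yielding video-like data of shape (T, 3, H, W). Assumes the
    three views share a common H x W grid.
    """
    return np.stack([np.stack(views, axis=0) for views in per_frame_views],
                    axis=0)

T, H, W = 5, 32, 16
rng = np.random.default_rng(3)
views = [(rng.random((H, W)), rng.random((H, W)), rng.random((H, W)))
         for _ in range(T)]
video = frames_to_video(views)   # shape (5, 3, 32, 16)
```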
In step 5, the dynamic-convolution-based attention mechanism and channel shuffle are integrated into a Slowfast network to construct a dynamic lightweight Slowfast structure model. The dynamic lightweight network is composed of dynamic lightweight modules, whose specific structure is shown in fig. 4; the specific construction of the dynamic lightweight Slowfast network is shown in fig. 5. Training data and category labels are input into the dynamic lightweight Slowfast network to extract features for judging the type of human action; the network model is then trained by back-propagating the loss between the predicted and actual values to update the model parameters.
FIG. 4 is a block diagram of the dynamic lightweight modules in an embodiment of the invention. Fig. 4 shows the three blocks Block A-C, where DyDWConv denotes dynamic depth convolution and DyConv denotes dynamic point convolution. Block A and Block B both feed all input features into two branches for feature extraction, differing only in the steps in the gray area. In Block A and Block B, the data from the previous layer enters the left branch, which passes through a dynamic depth convolution layer, a normalization layer and a dynamic point convolution layer; the right branch passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer. The feature data obtained by the left and right branches are concatenated and the channels are shuffled to serve as the input of the next layer. Block C bisects the data of the previous layer: one half passes through the right branch, consisting of a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer; the other half is left unprocessed, concatenated with the feature data obtained by the right branch, and the channels are shuffled to serve as the input of the next layer. The dynamic depth convolution layer combines dynamic convolution with depth convolution and the dynamic point convolution layer combines dynamic convolution with point convolution, realizing a lightweight network and improved model accuracy.
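Two ingredients of these modules can be illustrated in isolation: the channel shuffle applied after concatenating the two branches, and the attention-weighted aggregation of candidate kernels that underlies the dynamic convolution layers. This is a NumPy sketch with hypothetical shapes; a full dynamic convolution layer would also generate the attention logits from the input via pooling and a small fully connected head:

```python
import numpy as np

def channel_shuffle(x, groups):
    """ShuffleNet-style channel shuffle on an (N, C, H, W) tensor: split C
    into `groups`, transpose the group axes, and flatten, so that features
    from the two concatenated branches are interleaved."""
    n, c, h, w = x.shape
    assert c % groups == 0
    return (x.reshape(n, groups, c // groups, h, w)
             .transpose(0, 2, 1, 3, 4)
             .reshape(n, c, h, w))

def dynamic_kernel(kernels, attention_logits):
    """Dynamic-convolution aggregation: softmax-weight K candidate kernels
    into one input-dependent kernel."""
    a = np.exp(attention_logits - attention_logits.max())
    a /= a.sum()
    return np.tensordot(a, kernels, axes=1)   # weighted sum over K

x = np.arange(2 * 4 * 1 * 1, dtype=float).reshape(2, 4, 1, 1)
y = channel_shuffle(x, groups=2)              # channels 0,1,2,3 -> 0,2,1,3
K = np.random.default_rng(4).standard_normal((3, 8, 8, 3, 3))  # 3 candidates
w = dynamic_kernel(K, np.zeros(3))            # equal logits -> plain mean
```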
FIG. 5 is a diagram of the model structure of the dynamic lightweight Slowfast network in an embodiment of the present invention. As can be seen from fig. 5, the dynamic lightweight Slowfast network is a five-stage network composed of a fast branch and a slow branch, which respectively process data at fast and slow frame rates; the fast branch is continuously fused into the slow branch, so as to extract feature information from the input data. In the first stage, the two branches each extract data frames at a certain ratio and send them through a depth separable convolution layer, an activation layer and a maximum pooling layer; the second stage inputs module Block A; the third and fourth stages each input module Block B followed by module Block C; and the fifth stage inputs module Block B, an activation layer and a global average pooling layer. After each stage, the data of the fast and slow branches are fused: in the first four stages the fast branch data is passed through a depth separable convolution layer and then concatenated with the slow branch data, while in the fifth stage the concatenation is performed directly. Finally, a fully connected layer and a softmax layer output the final classification result.
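The lateral fast-to-slow fusion can be sketched structurally as follows. Here a simple time stride stands in for the depth separable projection of the embodiment, and the tensor shapes and speed ratio `alpha` are illustrative assumptions:

```python
import numpy as np

def lateral_fuse(slow_feat, fast_feat, alpha=4):
    """Sketch of a Slowfast lateral connection: the fast branch runs at
    alpha times the slow frame rate, so its features are time-strided
    down to the slow clock and concatenated onto the slow branch's
    channel axis. Tensors are (N, C, T, H, W).
    """
    fast_to_slow = fast_feat[:, :, ::alpha]   # match the slow time axis
    return np.concatenate([slow_feat, fast_to_slow], axis=1)

slow = np.zeros((1, 16, 4, 8, 8))   # slow branch: 16 channels, 4 frames
fast = np.ones((1, 2, 16, 8, 8))    # fast branch: 2 channels, 16 frames
fused = lateral_fuse(slow, fast)    # (1, 18, 4, 8, 8)
```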
Further, in step 6, the test sample is input into the trained model, and the action category of the test sample is output, so as to realize human activity recognition.
Action samples are collected in an outdoor scene, comprising the five actions box, wave, stand, walk and square; samples collected indoors comprise the five actions wave, jump, throw, walk and square. Fig. 6 shows the confusion matrices of the recognition results in the different scenes, with the predicted label on the horizontal axis and the true label on the vertical axis. Each value in the confusion matrices of fig. 6 represents the proportion of a certain action class predicted as that class or another class; the larger the diagonal values, the higher the recognition accuracy. As the two confusion matrices of fig. 6 show, the method of the embodiment achieves high-precision recognition of various actions in different scenes.
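A row-normalized confusion matrix of the kind shown in fig. 6 can be computed as follows (toy labels for illustration, not the patent's measured data):

```python
import numpy as np

def confusion_matrix(true_labels, pred_labels, n_classes):
    """Row-normalized confusion matrix: entry (i, j) is the fraction of
    class-i samples predicted as class j, so diagonal values near 1 mean
    high per-class recognition accuracy."""
    m = np.zeros((n_classes, n_classes))
    for t, p in zip(true_labels, pred_labels):
        m[t, p] += 1
    row_sums = m.sum(axis=1, keepdims=True)
    return np.divide(m, row_sums, out=np.zeros_like(m), where=row_sums > 0)

# Toy three-class example.
true = [0, 0, 0, 1, 1, 2, 2, 2, 2]
pred = [0, 0, 1, 1, 1, 2, 2, 2, 0]
cm = confusion_matrix(true, pred, 3)
```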
A parameter comparison of the original Slowfast model and the lightweight network of the invention is shown in table 1. As can be seen from table 1, the parameter count and floating-point operations of the dynamic lightweight network are greatly reduced, realizing the lightweight design of the network.
table 1 model complexity description of dynamic lightweight networks in embodiments of the invention
The foregoing is merely a preferred embodiment of the present invention, and is not intended to limit the present invention, and all equivalent variations using the description and drawings of the present invention are within the scope of the present invention.

Claims (5)

1. The millimeter wave radar behavior sensing method based on the dynamic lightweight network is characterized by comprising the following steps of:
step 1, collecting radar micro Doppler data of different actions for different volunteers in different scenes, and dividing a training set and a testing set at the same time;
step 2, denoising the acquired radar micro Doppler data;
step 3, further processing the denoised data to generate matrix data of distance-speed, distance-elevation angle and distance-azimuth angle;
step 4, further processing the three-dimensional matrix to generate video-like three-view data;
in step 4, the distance-speed matrix, the distance-azimuth angle matrix and the distance-pitch angle matrix of each frame are connected into a three-dimensional matrix, and then the three-dimensional matrices of all frames in one sample are connected along the time dimension to generate video-like data;
step 5, constructing a dynamic lightweight Slowfast network, inputting data of a training set into the network, and training a network model;
the dynamic lightweight sfast network comprises a fast branch and a slow branch, and the fast branch and the slow branch have the same structure and all comprise five stages; the first stage comprises a depth separable convolution layer, an activation layer and a maximum pooling layer which are sequentially arranged; the second stage comprises a dynamic lightweight module Block A; the third stage and the fourth stage comprise a dynamic lightweight module Block B and a dynamic lightweight module Block C which are sequentially arranged; the fifth stage comprises a dynamic lightweight module Block B, an activation layer and a global average pooling layer which are sequentially arranged; after each stage is finished, the data of the fast branch and the slow branch are fused, wherein the fusion of the first four stages is that the data of the fast branch is connected with the data of the slow branch after being subjected to depth separable layer, and the fusion of the fifth stage is that the data of the fast branch and the data of the slow branch are directly connected;
and 6, inputting the test sample into the trained model, and outputting the action category of the test sample to realize human activity recognition.
2. The millimeter wave radar behavior sensing method based on a dynamic lightweight network according to claim 1, wherein in step 2, interference from a stationary target is estimated, and an average frequency response of the frequency domain is eliminated, so as to generate denoised data.
3. The millimeter wave radar behavior sensing method based on the dynamic lightweight network according to claim 1, wherein in step 3, the original IQ signal is subjected to fast fourier transform of distance, doppler and antenna dimensions to obtain a distance-speed matrix, a distance-azimuth matrix and a distance-pitch angle matrix of the target.
4. The millimeter wave radar behavior sensing method based on the dynamic lightweight network according to claim 1, wherein the dynamic lightweight module Block a and the dynamic lightweight module Block B each comprise a left branch and a right branch, and data are respectively input into the left branch and the right branch; the left branch comprises a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer which are sequentially arranged; the right branch comprises a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer which are sequentially arranged; and connecting characteristic data obtained by the left branch and the right branch of the dynamic lightweight modules Block A and Block B, and disturbing channels to be used as input of the next layer.
5. The millimeter wave radar behavior sensing method based on the dynamic lightweight network according to claim 1, wherein the dynamic lightweight module Block C comprises two branches, the two branches of which bisect the data of the previous layer, wherein half of the data is input into the right branch of the Block C and sequentially passes through a dynamic point convolution layer, a normalization layer, an activation layer, a dynamic depth convolution layer, a normalization layer, a dynamic point convolution layer, a normalization layer and an activation layer; the other half of the data is not processed and is connected with the characteristic data obtained by the right branch, and the channel is disturbed to be used as the input of the next layer.
CN202310041358.8A 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network Active CN115856881B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310041358.8A CN115856881B (en) 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network


Publications (2)

Publication Number Publication Date
CN115856881A CN115856881A (en) 2023-03-28
CN115856881B (en) 2023-05-12

Family

ID=85657295

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310041358.8A Active CN115856881B (en) 2023-01-12 2023-01-12 Millimeter wave radar behavior sensing method based on dynamic lightweight network

Country Status (1)

Country Link
CN (1) CN115856881B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113963441A (en) * 2021-10-25 2022-01-21 中国科学技术大学 Cross-domain enhancement-based millimeter wave radar gesture recognition method and system
CN115048951A (en) * 2021-03-09 2022-09-13 深圳市万普拉斯科技有限公司 Millimeter wave radar-based gesture recognition method and device and terminal equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110765974B (en) * 2019-10-31 2023-05-02 复旦大学 Micro gesture recognition method based on millimeter wave radar and convolutional neural network
CN113989718A (en) * 2021-10-29 2022-01-28 南京邮电大学 Human body target detection method facing radar signal heat map
CN114445914A (en) * 2022-01-26 2022-05-06 厦门大学 Millimeter wave data automatic labeling method and system based on video
CN115063884B (en) * 2022-06-14 2024-04-23 电子科技大学 Millimeter wave radar head action recognition method based on multi-domain fusion deep learning
CN115343704A (en) * 2022-07-29 2022-11-15 中国地质大学(武汉) Gesture recognition method of FMCW millimeter wave radar based on multi-task learning
CN115294656A (en) * 2022-08-23 2022-11-04 南京邮电大学 FMCW radar-based hand key point tracking method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115048951A (en) * 2021-03-09 2022-09-13 深圳市万普拉斯科技有限公司 Millimeter wave radar-based gesture recognition method and device and terminal equipment
CN113963441A (en) * 2021-10-25 2022-01-21 中国科学技术大学 Cross-domain enhancement-based millimeter wave radar gesture recognition method and system

Also Published As

Publication number Publication date
CN115856881A (en) 2023-03-28

Similar Documents

Publication Publication Date Title
CN111123257B (en) Radar moving target multi-frame joint detection method based on graph space-time network
CN108710863A (en) Unmanned plane Scene Semantics dividing method based on deep learning and system
CN108764298B (en) Electric power image environment influence identification method based on single classifier
CN109063569A (en) A kind of semantic class change detecting method based on remote sensing image
CN109376589A (en) ROV deformation target and Small object recognition methods based on convolution kernel screening SSD network
CN113610905B (en) Deep learning remote sensing image registration method based on sub-image matching and application
He et al. Deep learning applications based on SDSS photometric data: detection and classification of sources
CN110619373A (en) Infrared multispectral weak target detection method based on BP neural network
CN111781599B (en) SAR moving ship target speed estimation method based on CV-EstNet
CN113435254A (en) Sentinel second image-based farmland deep learning extraction method
CN111104850A (en) Remote sensing image building automatic extraction method and system based on residual error network
CN114943902A (en) Urban vegetation unmanned aerial vehicle remote sensing classification method based on multi-scale feature perception network
Guo et al. Salient object detection from low contrast images based on local contrast enhancing and non-local feature learning
CN108875555A (en) Video interest neural network based region and well-marked target extraction and positioning system
Lu et al. Citrus green fruit detection via improved feature network extraction
CN115856881B (en) Millimeter wave radar behavior sensing method based on dynamic lightweight network
CN116883360B (en) Multi-scale double-channel-based fish shoal counting method
Jia et al. Polar-Net: Green fruit instance segmentation in complex orchard environment
CN116338628B (en) Laser radar sounding method and device based on learning architecture and electronic equipment
CN117173556A (en) Small sample SAR target recognition method based on twin neural network
CN116189021A (en) Multi-branch intercrossing attention-enhanced unmanned aerial vehicle multispectral target detection method
Feng et al. Fish abundance estimation from multi-beam sonar by improved MCNN
CN112668615B (en) Satellite cloud picture prediction method based on depth cross-scale extrapolation fusion
CN112099018B (en) Moving object detection method and device based on combination of radial speed and regional energy
CN107038706A (en) Infrared image confidence level estimation device and method based on adaptive mesh

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant