CN114972041A - Polarization radar image super-resolution reconstruction method and device based on residual error network - Google Patents
- Publication number
- CN114972041A (application number CN202210900724.6A)
- Authority
- CN
- China
- Prior art keywords
- resolution
- data
- image
- image data
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4053—Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
- G06T3/4076—Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution using the original low-resolution images to iteratively correct the high-resolution images
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4046—Scaling of whole images or parts thereof, e.g. expanding or contracting using neural networks
Abstract
The application relates to a polarization radar image super-resolution reconstruction method and device based on a residual error network, belonging to the technical field of radar imaging remote sensing. The method comprises the following steps: constructing a polarimetric radar image training data set; inputting the low-resolution image data in the training data set into a polarization radar image super-resolution reconstruction network model based on a residual error network for training, to obtain a trained model; and inputting the low-resolution polarimetric radar image to be reconstructed into the trained model for super-resolution reconstruction, to obtain a high-resolution polarimetric radar image. In this method, the grouped residual convolution module in the super-resolution network model exploits the correlation between the real and imaginary parts of the image data, while the residual attention backbone network module exploits the correlations between the spatial domain and the multiple channels of the shallow and deep image features, so that the super-resolution reconstruction accuracy is effectively improved.
Description
Technical Field
The application relates to the technical field of radar imaging remote sensing, in particular to a polarized radar image super-resolution reconstruction method and device based on a residual error network.
Background
A polarimetric radar can acquire multi-polarization scattering information of a target, which is beneficial to interpreting the target's scattering mechanism and inverting its characteristic parameters. As a typical imaging radar, inverse synthetic aperture radar (ISAR) can observe and monitor space targets by transmitting large-bandwidth signals, and plays an important role in maintaining space security.
A high-resolution polarimetric radar image contains more target detail, which facilitates target detection and classification. However, generating high-resolution (HR) radar images requires a large bandwidth and a large coherent integration angle, both of which are limited by the radar system. In addition, constrained by the hardware of the radar system, increasing the resolution reduces the imaging swath. It is therefore of great significance to improve the resolution of polarimetric radar images without increasing cost while maintaining the original imaging swath. Currently, supervised learning is widely used in the super-resolution field of computer vision; it requires constructing low-resolution (LR) and high-resolution image data pairs so that a network model can learn the mapping from LR image data to HR image data. To generate the LR image data, the HR image data are usually processed with a down-sampling function or a constructed degradation network. However, none of these studies generate corresponding LR and HR image data pairs from radar system parameters.
In addition, compared with optical images, polarimetric radar images are characterized by a high dynamic range, complex-valued data and multiple channels; directly migrating a network model from computer vision to polarimetric radar image processing therefore does not achieve good results. It is thus necessary to construct a network model that simultaneously accounts for the high dynamic range, complex-valued data and multi-channel characteristics of polarimetric radar images when performing super-resolution reconstruction.
Disclosure of Invention
In view of the foregoing, it is necessary to provide a method and an apparatus for polarization radar image super-resolution reconstruction based on a residual error network.
A polarized radar image super-resolution reconstruction method based on a residual error network comprises the following steps:
imaging processing is carried out on echo data of an observation target, and a polarimetric radar image training data set is obtained; the polarimetric radar image training data set comprises a plurality of groups of data pairs consisting of high-resolution image data and low-resolution image data obtained by imaging, and the high-resolution image data and the low-resolution image data are complex data;
inputting low-resolution image data into a pre-constructed polarization radar image super-resolution reconstruction network model based on a residual error network, and outputting high-resolution reconstruction image data; the polarization radar image super-resolution reconstruction network model comprises a grouping residual error convolution module, a residual error attention backbone network module, an up-sampling module and a post-processing module; performing data recombination on a real part and an imaginary part in the low-resolution image data according to a grouping residual convolution module to obtain total real data, and performing feature extraction on the total real data to obtain image shallow layer features; according to the residual attention backbone network module, deep feature extraction, spatial attention weighting and channel attention weighting are carried out on the image shallow feature to obtain weighted features; according to the up-sampling module, up-sampling is carried out on the weighted features to obtain up-sampling features; processing the up-sampling characteristics according to a post-processing module to obtain high-resolution reconstructed image data;
inputting the high-resolution reconstructed image data and the high-resolution image data into a pre-constructed loss function, and training the polarization radar image super-resolution reconstruction network model according to the loss function to obtain a trained polarization radar image super-resolution reconstruction network model;
and inputting the low-resolution polarization radar image to be reconstructed into the trained polarization radar image super-resolution reconstruction network model for super-resolution reconstruction to obtain a high-resolution polarization radar image.
In one embodiment, the method further comprises:
establishing a polarized radar observation model; wherein the polarized radar observation model comprises two or more basic scattering structures;
and performing data simulation on the observation target, which is composed of the basic scattering structures in the polarized radar observation model, by means of electromagnetic simulation software, to obtain echo data of the observation target with different bandwidths at different observation angles.
In one embodiment, the imaging processing of the echo data of the observation target to obtain a polarimetric radar image training data set includes:
and imaging the echo data according to the central imaging angle, the imaging aperture and the bandwidth of the echo data to obtain multiple groups of data pairs including high-resolution image data and low-resolution image data, and forming a polarimetric radar image training data set by the multiple groups of data pairs.
In one embodiment, the imaging processing of the echo data according to the central imaging angle, the imaging aperture and the bandwidth of the echo data to obtain a plurality of sets of data pairs including high resolution image data and low resolution image data includes:
calculating to obtain an imaging aperture required by echo data according to the azimuth resolution, and determining a coherent accumulation angle required by the echo data according to the imaging aperture and the central imaging angle;
calculating a bandwidth value required by the echo data according to the distance direction resolution;
imaging the echo data according to the angle interval of the coherent accumulation angle and the bandwidth value to obtain a plurality of groups of data pairs including high-resolution image data and low-resolution image data; the angle interval comprises an angle interval corresponding to the high-resolution image data and an angle interval corresponding to the low-resolution image data, and the central imaging angles of the same group of data pairs formed by the high-resolution image data and the low-resolution image data are the same.
In one embodiment, the data reconstruction of the real part and the imaginary part in the low-resolution image data according to the grouping residual convolution module to obtain total real data, and the feature extraction of the total real data to obtain the image shallow feature includes:
splitting the real part and the imaginary part of the low-resolution image data and computing modulus values to obtain total real data, and grouping the total real data to obtain a plurality of groups of real data;
normalizing the multiple groups of real number data according to the maximum value and the minimum value in the multiple groups of real number data to obtain multiple groups of normalized real number data;
inputting a plurality of groups of normalized real number data into a residual error attention unit for feature extraction to obtain a plurality of groups of shallow layer features, wherein the residual error attention unit comprises two convolution layers, an activation layer and an attention module;
and splicing the multiple groups of shallow features in the channel dimension to obtain the image shallow features.
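The grouped processing described in the steps above can be sketched in PyTorch as follows. This is a minimal illustrative sketch, not the patent's exact configuration: the channel counts, the use of ReLU, and the simple channel-attention design are assumptions chosen for concreteness.

```python
# Hypothetical sketch of the grouped residual convolution module: the complex
# multi-channel input is split into real parts, imaginary parts, and modulus
# values; each group is min-max normalized and passed through its own residual
# attention unit; the per-group shallow features are concatenated on channels.
import torch
import torch.nn as nn

class ResidualAttentionUnit(nn.Module):
    """Two conv layers, an activation layer and an attention module,
    wrapped in a residual connection (as described in the embodiment)."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        # simple channel attention: global pool -> 1x1 conv -> sigmoid gate
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        y = self.body(x)
        return x + y * self.attn(y)

class GroupedResidualConv(nn.Module):
    def __init__(self, in_channels: int = 6, feat: int = 32):
        super().__init__()
        # one branch per data group: real parts, imaginary parts, modulus
        self.branches = nn.ModuleList([
            nn.Sequential(nn.Conv2d(in_channels, feat, 3, padding=1),
                          ResidualAttentionUnit(feat))
            for _ in range(3)
        ])

    @staticmethod
    def _norm(x):
        # min-max normalization; the min/max would be cached for the
        # inverse normalization in the post-processing module
        lo, hi = x.amin(), x.amax()
        return (x - lo) / (hi - lo + 1e-12)

    def forward(self, z: torch.Tensor):
        # z: complex tensor of shape (B, 6, H, W)
        groups = [z.real, z.imag, z.abs()]
        feats = [b(self._norm(g)) for b, g in zip(self.branches, groups)]
        return torch.cat(feats, dim=1)  # image shallow features
```

With the default sizes, a (2, 6, 16, 16) complex input produces a (2, 96, 16, 16) real feature map.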
In one embodiment, the residual attention backbone network module comprises a plurality of residual attention units;
according to the residual attention backbone network module, deep feature extraction, space attention weighting and channel attention weighting are carried out on the image shallow feature to obtain weighted features, wherein the weighted features comprise:
and sequentially carrying out deep feature extraction, space attention weighting and channel attention weighting on the image shallow feature according to the residual attention units to obtain a weighted image deep feature, and adding the weighted image deep feature and the image shallow feature to obtain a weighted feature.
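The backbone described above can be sketched as a chain of residual attention units followed by a global skip connection. A minimal PyTorch sketch, assuming a particular spatial-attention and channel-attention design (the embodiment does not fix their internal layers):

```python
import torch
import torch.nn as nn

class SpatialChannelAttentionBlock(nn.Module):
    """Deep feature extraction followed by spatial attention weighting and
    channel attention weighting, with a local residual connection."""
    def __init__(self, c: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(c, c, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(c, c, 3, padding=1),
        )
        # spatial attention: a 1-channel gate over the H x W plane
        self.spatial = nn.Sequential(nn.Conv2d(c, 1, 7, padding=3), nn.Sigmoid())
        # channel attention: per-channel gates from global pooling
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(c, c, 1), nn.Sigmoid())

    def forward(self, x):
        y = self.conv(x)          # deep feature extraction
        y = y * self.spatial(y)   # spatial attention weighting
        y = y * self.channel(y)   # channel attention weighting
        return x + y

class ResidualAttentionBackbone(nn.Module):
    def __init__(self, c: int = 96, n_units: int = 4):
        super().__init__()
        self.units = nn.Sequential(*[SpatialChannelAttentionBlock(c)
                                     for _ in range(n_units)])

    def forward(self, shallow):
        # global skip: weighted deep features added back to shallow features
        return shallow + self.units(shallow)
```

The output keeps the shallow-feature shape, so it feeds directly into the up-sampling module.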
In one embodiment, the upsampling module comprises two convolution layers, a sub-pixel rearrangement layer;
according to the up-sampling module, up-sampling the weighted features to obtain up-sampling features, including:
and sequentially carrying out channel expansion, channel reduction and characteristic dimension reduction on the weighted characteristics according to the first convolutional layer, the sub-pixel rearrangement layer and the second convolutional layer to obtain the up-sampling characteristics.
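The three-layer up-sampling path (channel expansion, sub-pixel rearrangement, feature reduction) maps directly onto a conv / PixelShuffle / conv stack. A minimal PyTorch sketch; the kernel sizes and the scale factor of 2 are illustrative assumptions:

```python
import torch
import torch.nn as nn

def make_upsampler(c: int, scale: int = 2, out_c: int = 6) -> nn.Sequential:
    """First conv expands channels by scale**2, PixelShuffle rearranges them
    into a spatially larger grid, second conv reduces the feature dimension."""
    return nn.Sequential(
        nn.Conv2d(c, c * scale ** 2, 3, padding=1),  # channel expansion
        nn.PixelShuffle(scale),                      # sub-pixel rearrangement
        nn.Conv2d(c, out_c, 3, padding=1),           # feature reduction
    )
```

For example, a (1, 96, 8, 8) weighted feature map becomes a (1, 6, 16, 16) up-sampled feature map at scale 2.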
In one embodiment, processing the up-sampling features according to the post-processing module to obtain high-resolution reconstructed image data includes:
the post-processing module carries out inverse normalization processing on the up-sampling characteristics according to the maximum value and the minimum value in the multiple groups of real number data in the grouped residual convolution module to obtain inverse normalization data;
and performing data recombination on the reverse normalized data to obtain high-resolution reconstructed image data.
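The two post-processing steps can be sketched in NumPy as follows. The interleaved real/imaginary channel layout is an assumption made for illustration; the embodiment only specifies that real and imaginary parts are recombined into complex data.

```python
import numpy as np

def postprocess(up_feats: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Inverse min-max normalization using the extrema cached by the grouped
    residual convolution module, then recombination of real/imaginary channel
    pairs into complex high-resolution image data.
    up_feats: real array of shape (2*C, H, W), channels interleaved as
    (re_0, im_0, re_1, im_1, ...) -- an illustrative layout."""
    denorm = up_feats * (hi - lo) + lo           # inverse normalization
    real, imag = denorm[0::2], denorm[1::2]      # split channel pairs
    return real + 1j * imag                      # complex HR image, (C, H, W)
```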
In one embodiment, inputting the high-resolution reconstructed image data and the high-resolution image data into a pre-constructed loss function, and training the polarized radar image super-resolution reconstruction network model according to the loss function to obtain a trained polarized radar image super-resolution reconstruction network model, includes:
inputting the high-resolution reconstructed image data and the high-resolution image data into a pre-constructed loss function, and calculating a single-channel loss function in the polarimetric radar image training data set according to the loss function;
carrying out weighted summation on the single-channel loss function to obtain a multi-channel weighted loss function;
and training the polarization radar image super-resolution reconstruction network model according to the multi-channel weighting loss function to obtain the trained polarization radar image super-resolution reconstruction network model.
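The weighted summation of single-channel losses described above can be sketched as follows; the embodiment does not fix the base loss, so the per-channel L1 loss used here is an assumption.

```python
import torch

def multichannel_weighted_loss(pred: torch.Tensor,
                               target: torch.Tensor,
                               weights) -> torch.Tensor:
    """Single-channel L1 losses (an assumed base loss), weighted and summed
    over channels to form the multi-channel weighted loss."""
    # pred, target: (B, C, H, W); weights: length-C iterable of channel weights
    per_channel = (pred - target).abs().mean(dim=(0, 2, 3))  # one loss per channel
    w = torch.as_tensor(weights, dtype=per_channel.dtype)
    return (w * per_channel).sum()
```

Unequal weights let the training emphasize, for example, the diagonal coherency-matrix channels over the cross terms.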
A polarized radar image super-resolution reconstruction apparatus based on a residual error network, the apparatus comprising:
the training set construction module is used for carrying out imaging processing on echo data of an observation target to obtain a polarimetric radar image training data set; the polarimetric radar image training data set comprises a plurality of groups of data pairs consisting of high-resolution image data and low-resolution image data obtained by imaging, and the high-resolution image data and the low-resolution image data are complex data;
the model construction module is used for inputting low-resolution image data into a pre-constructed polarization radar image super-resolution reconstruction network model based on a residual error network and outputting high-resolution reconstruction image data; the polarization radar image super-resolution reconstruction network model comprises a grouping residual error convolution module, a residual error attention backbone network module, an up-sampling module and a post-processing module; performing data recombination on a real part and an imaginary part in the low-resolution image data according to a grouping residual convolution module to obtain total real data, and performing feature extraction on the total real data to obtain image shallow layer features; according to the residual attention backbone network module, deep feature extraction, space attention weighting and channel attention weighting are carried out on the image shallow feature to obtain weighted features; according to the up-sampling module, up-sampling is carried out on the weighted features to obtain up-sampling features; processing the up-sampling characteristics according to a post-processing module to obtain high-resolution reconstructed image data;
the model training module is used for inputting the high-resolution reconstructed image data and the high-resolution image data into a loss function which is constructed in advance, and training the polarized radar image super-resolution reconstruction network model according to the loss function to obtain a trained polarized radar image super-resolution reconstruction network model;
and the test module is used for inputting the low-resolution polarization radar image to be reconstructed into the trained polarization radar image super-resolution reconstruction network model for super-resolution reconstruction to obtain the high-resolution polarization radar image.
According to the polarization radar image super-resolution reconstruction method and device based on the residual error network, the polarization radar image training data set comprising high-resolution image data and low-resolution image data is obtained by imaging the echo data of the observation target; inputting a polarized radar image training data set into a pre-constructed polarized radar image super-resolution reconstruction network model based on a residual error network for training to obtain a trained polarized radar image super-resolution reconstruction network model, wherein a grouped residual error convolution module in the polarized radar image super-resolution reconstruction network model performs data reconstruction and feature extraction on low-resolution image data according to the correlation between the real part and the imaginary part of the low-resolution image data, so that the error of image shallow feature extraction is effectively reduced; according to the correlation between the spatial domain and multiple channels of the image shallow feature, effective reconstruction of spatial information and channel information is achieved by deep feature extraction, spatial attention weighting and channel attention weighting on the image shallow feature, so that the super-resolution reconstruction precision is effectively improved, and important technical support is provided for detection and identification of subsequent radar targets.
Drawings
FIG. 1 is a schematic flowchart of a super-resolution reconstruction method for a polarization radar image based on a residual error network in an embodiment;
FIG. 2 is a schematic structural diagram of a polarization radar image super-resolution reconstruction network model based on a residual error network in an embodiment;
FIG. 3 is a diagram illustrating a residual attention unit in one embodiment;
fig. 4 is a schematic structural diagram of a residual attention backbone network module in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a polarization radar image super-resolution reconstruction method based on a residual error network, including the following steps:
step S1, imaging the echo data of the observation target to obtain a polarimetric radar image training data set; the polarimetric radar image training data set comprises a plurality of groups of data pairs consisting of high-resolution image data and low-resolution image data obtained by imaging, and the high-resolution image data and the low-resolution image data are complex data.
It can be understood that echo data of different imaging apertures and different bandwidths of an observation target under different observation angles are subjected to imaging processing to obtain high-resolution and low-resolution image data pairs corresponding to polarization radar system parameters, and a plurality of groups of image data pairs are integrated into a training set to provide a super-resolution data set capable of being trained and tested for a polarization radar image super-resolution reconstruction network model. For the constructed polarized radar image training data set, the data set can be enhanced or amplified by using data enhancement means such as rotation, cropping and translation.
Step S2, inputting low-resolution image data into a pre-constructed polarization radar image super-resolution reconstruction network model based on a residual error network, and outputting high-resolution reconstructed image data; the polarization radar image super-resolution reconstruction network model comprises a grouping residual error convolution module, a residual error attention backbone network module, an up-sampling module and a post-processing module.
It can be understood that the grouped residual convolution module in the polarized radar image super-resolution reconstruction network model performs data recombination on the real and imaginary parts of the low-resolution image data, fully exploiting both parts so that the image shallow features in the low-resolution image data are effectively extracted; the residual attention backbone network module then performs deep feature extraction, spatial attention weighting and channel attention weighting on the image shallow features, fully exploiting the correlations between the spatial domain and the multiple channels of the shallow and deep features, thereby ensuring effective reconstruction of the spatial and channel information in the image shallow features.
And step S3, inputting the high-resolution reconstructed image data and the high-resolution image data into a loss function which is constructed in advance, and training the polarized radar image super-resolution reconstruction network model according to the loss function to obtain the trained polarized radar image super-resolution reconstruction network model.
It can be understood that according to the characteristic that high-resolution reconstructed image data output by the polarized radar image super-resolution reconstruction network model is multi-channel data, the pre-constructed loss function is a multi-channel weighted loss function.
And step S4, inputting the low-resolution polarized radar image to be reconstructed into the trained polarized radar image super-resolution reconstruction network model for super-resolution reconstruction, and obtaining the high-resolution polarized radar image.
The polarimetric radar image super-resolution reconstruction network model provided by the invention can also be extended to other polarimetric radar image fields, such as polarimetric interferometric SAR images, dual-polarization SAR images and fully polarimetric SAR images.
According to the polarization radar image super-resolution reconstruction method based on the residual error network, the polarization radar image training data set comprising high-resolution image data and low-resolution image data is obtained by imaging the echo data of an observation target; inputting a polarized radar image training data set into a pre-constructed polarized radar image super-resolution reconstruction network model based on a residual error network for training to obtain a trained polarized radar image super-resolution reconstruction network model, wherein a grouped residual error convolution module in the polarized radar image super-resolution reconstruction network model performs data reconstruction and feature extraction on low-resolution image data according to the correlation between the real part and the imaginary part of the low-resolution image data, so that the error of image shallow feature extraction is effectively reduced; the residual attention backbone network module in the polarization radar image super-resolution reconstruction network model realizes effective reconstruction of spatial information and channel information by performing deep feature extraction, spatial attention weighting and channel attention weighting on shallow features of an image according to correlation between the spatial domain and multiple channels of the shallow features of the image, so that the super-resolution reconstruction precision is effectively improved.
In one embodiment, the method further comprises: establishing a polarized radar observation model, wherein the polarized radar observation model comprises two or more basic scattering structures; the types of the basic scattering structures can be the same or different, and the basic scattering structures are arranged in an asymmetric L shape; and performing data simulation on the observation target, composed of the basic scattering structures in the polarized radar observation model, by means of electromagnetic simulation software, to obtain echo data of the observation target with different bandwidths at different observation angles.
Specifically, taking the polarimetric ISAR system in a polarimetric radar system as an example, polarimetric ISAR observation models are established for cube, cylinder, dihedral, flat plate, top hat, narrow dihedral, sphere and trihedral structures, where each observation model comprises three identical basic scattering structures consistent with the observation target structure;

corresponding targets are observed according to the basic scattering structures in each observation model, and data simulation is performed on the observation targets with electromagnetic simulation software. Echo data are obtained for the eight observation targets (cube, cylinder, dihedral, flat plate, top hat, narrow dihedral, sphere and trihedral) at pitch angles of 45° and 60°, with azimuth angles from −90° to 90° at an angle step of 0.2°.
It can be understood that, by arranging three identical basic scattering structures in the polarized radar observation model, the resolution degradation of the observation target in both the azimuth and range directions can be observed, and arranging the three identical structures in an asymmetric L shape allows more training samples to be obtained through simulation.
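As a quick sanity check, the look-angle grid of the embodiment above (azimuth −90° to 90° in 0.2° steps, at pitch angles 45° and 60°) can be enumerated as follows; the variable names are illustrative.

```python
import numpy as np

# Azimuth grid from the embodiment: 0.2° step over [-90°, 90°]
az = np.linspace(-90.0, 90.0, 901)          # 901 azimuth angles
pitch = [45.0, 60.0]                        # the two pitch angles
# each of the eight targets is simulated at every (pitch, azimuth) pair
n_looks_per_target = len(pitch) * az.size
```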
In one embodiment, the echo data is subjected to imaging processing according to the central imaging angle, the imaging aperture and the bandwidth of the echo data to obtain a plurality of groups of data pairs including high-resolution image data and low-resolution image data, and the plurality of groups of data pairs form a polarized radar image training data set.
In one embodiment, the high resolution imaging aperture required to obtain the echo data is calculated based on the azimuthal resolutionAccording to the central imaging angleAnd high resolution imaging apertureThe angle interval in which the coherent accumulation angle corresponding to the high-resolution image data required for calculating the echo data is located isThe bandwidth value required for obtaining echo data by calculating the distance resolution isTo be aligned atWithin the range ofThe echo data of the bandwidth is imaged to obtain the size ofOf the high resolution image dataWherein,andindicating the number of channels, height and width,irepresenting the first in a polarimetric radar image training datasetiA piece of data;
the low-resolution imaging aperture required for the echo data is likewise calculated from the azimuth resolution; the angle interval containing the coherent accumulation angle corresponding to the low-resolution image data is determined from the central imaging angle and the low-resolution imaging aperture; the bandwidth value required is calculated from the range resolution; a coherent accumulation angle within this interval is selected, and the echo data of this bandwidth are imaged to obtain the low-resolution image data, whose three dimensions are again the number of channels, height and width;
high-resolution image data and low-resolution image data sharing the same central imaging angle form one group of training data pairs; echo data at different central imaging angles are processed in the same way to obtain further training data pairs, and the resulting sets of training data pairs together constitute the polarization radar image training data set.
Specifically, the echo data of the eight observation targets are respectively imaged with a polar-format imaging algorithm: echo data with 4 GHz bandwidth and a 10-degree imaging aperture are selected for imaging to obtain the HR image data, and echo data with 2 GHz bandwidth and a 5-degree imaging aperture are selected for imaging to obtain the LR image data.
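As an illustrative aside (not part of the claimed method), the standard radar resolution relations behind these parameter choices can be sketched in code: range resolution follows c/(2B), and for small apertures the azimuth resolution is approximately wavelength/(2 * aperture angle), so halving both the bandwidth (4 GHz to 2 GHz) and the aperture (10 degrees to 5 degrees) coarsens both resolution cells by a factor of two. The function names and the 3 cm (X-band) wavelength in the example are assumptions.

```python
import numpy as np

C0 = 2.99792458e8  # speed of light in m/s

def range_resolution(bandwidth_hz):
    """Slant-range resolution: delta_r = c / (2 * B)."""
    return C0 / (2.0 * bandwidth_hz)

def azimuth_resolution(wavelength_m, aperture_deg):
    """Small-aperture azimuth resolution: delta_a ~ lambda / (2 * delta_theta)."""
    return wavelength_m / (2.0 * np.deg2rad(aperture_deg))

# Parameters quoted in the embodiment, with an assumed X-band wavelength:
dr_hr = range_resolution(4e9)           # 4 GHz bandwidth, ~0.0375 m
dr_lr = range_resolution(2e9)           # 2 GHz bandwidth, twice as coarse
da_hr = azimuth_resolution(0.03, 10.0)  # 10-degree aperture
da_lr = azimuth_resolution(0.03, 5.0)   # 5-degree aperture, twice as coarse
```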
In one embodiment, the low-resolution image data in the polarization radar image training data set are represented by the polarized coherent matrix; under the reciprocity condition, the polarized coherent matrix is constructed from the Pauli vector as the outer product of the Pauli vector with its conjugate transpose.
Wherein the polarized coherent matrix is a 3x3 matrix; an element of the matrix denotes the entry in a given row and column, and a superscript on an element denotes the conjugate of the data. Since the lower-triangular elements of the polarized coherent matrix are the conjugates of the upper-triangular elements, only six elements of the matrix need to be considered: the three diagonal elements, which are non-negative real data, and the three upper-triangular off-diagonal elements, which are complex data;
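The construction described above matches the standard polarimetric SAR definition, which can be sketched as follows; the function names are illustrative, and the single-look outer product is used (in practice the coherency matrix is usually a multi-look average).

```python
import numpy as np

def pauli_vector(s_hh, s_hv, s_vv):
    """Pauli scattering vector under the reciprocity condition (S_hv = S_vh)."""
    return np.array([s_hh + s_vv, s_hh - s_vv, 2.0 * s_hv]) / np.sqrt(2.0)

def coherency_matrix(k):
    """Single-look polarized coherency matrix T = k k^H (3x3, Hermitian)."""
    return np.outer(k, k.conj())

k = pauli_vector(1 + 2j, 0.5j, 1 - 1j)
T = coherency_matrix(k)
```

Because T is Hermitian, its lower triangle is the conjugate of its upper triangle, so the six elements T11, T22, T33 (non-negative real) and T12, T13, T23 (complex) carry all the information, matching the channel count of 6 used for the network input.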
as shown in FIG. 2, the low-resolution image data with 6 channels is input into the residual-network-based polarization radar image super-resolution reconstruction network model; the data are defined over the complex field, the three dimensions are the number of channels, height and width, and each of the six channels holds one of the six polarized coherent matrix elements;
Firstly, the grouped residual convolution module in the network model splits the real and imaginary parts of the input and performs modulus value processing, yielding total real data with 12 channels, defined over the real field: the three real diagonal elements together with the real parts, imaginary parts and modulus values of the three complex off-diagonal elements.
Then the total real data are grouped into four groups of real data: the group of diagonal elements, and the three groups obtained by taking the imaginary part, the real part and the modulus value of the off-diagonal elements, with a superscript index denoting the corresponding group of real data;
then, according to the maximum and minimum values within the four groups of real data, the four groups are min-max normalized (scaled by the adjustment coefficient) to obtain four groups of normalized real data.
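A minimal sketch of the splitting, grouping and normalization steps above, assuming the 12 real channels are the three real diagonal elements plus the real parts, imaginary parts and modulus values of the three complex off-diagonal elements; the exact channel layout and the small epsilon guarding against a zero value range are assumptions.

```python
import numpy as np

def split_and_group(T6, a=255.0):
    """T6: complex array (6, H, W) holding T11, T22, T33, T12, T13, T23.

    Returns four (3, H, W) real groups (diagonal, Re, Im and modulus of the
    off-diagonal elements), each min-max normalized to [0, a], together with
    the (min, max) pairs needed later for denormalization."""
    diag = T6[:3].real
    off = T6[3:]
    groups = [diag, off.real, off.imag, np.abs(off)]
    normed, stats = [], []
    for g in groups:
        lo, hi = g.min(), g.max()
        normed.append(a * (g - lo) / (hi - lo + 1e-12))  # min-max scaling
        stats.append((lo, hi))
    return normed, stats
```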
The four groups of normalized real data are input into Residual Attention units (RESA) for feature extraction, yielding four groups of shallow features.
The residual attention unit is shown in FIG. 3 and comprises two convolutional layers, an activation layer and an attention module;
finally, the four groups of shallow features are concatenated along the channel dimension to obtain the shallow image features.
Specifically, the network model is trained with the polarization radar image data set at a pitch angle of 45 degrees. In the network model, the RESA modules in the grouped residual convolution module can be replaced by other pre-trained feature extraction modules to extract shallow features, and the number of RESA modules can be increased or decreased; the activation layer in the RESA module uses the Parametric Rectified Linear Unit (PReLU); the attention mechanism can be a non-local attention mechanism or replaced by other attention models; the adjustment coefficient in the normalization and denormalization processes is chosen to be 255.
It is worth noting that, for the input data form of the present invention, besides the polarized coherent matrix, the data can also be characterized with elements of the polarization scattering matrix, or with elements of other statistics derived from the polarization scattering matrix.
In one embodiment, as shown in FIG. 4, the shallow image features are input into the residual attention backbone network module, which is formed by several residual attention units connected in series; the residual attention units sequentially perform deep feature extraction, spatial attention weighting and channel attention weighting on the shallow image features to obtain weighted deep image features, which are then added to the input shallow image features to obtain the weighted features.
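The per-unit processing can be sketched roughly as below. This is a simplified stand-in, with 1x1 convolutions (plain channel-mixing matrices) in place of real convolutional layers and a global-average-pool sigmoid gate in place of the full spatial/channel attention modules, so only the conv, PReLU, attention-weighting and residual-add structure is illustrative.

```python
import numpy as np

def channel_attention(x):
    """Squeeze-and-excitation-style channel weighting: global average pool
    per channel, squash to (0, 1) with a sigmoid, rescale each channel."""
    w = 1.0 / (1.0 + np.exp(-x.mean(axis=(1, 2))))  # shape (C,)
    return x * w[:, None, None]

def residual_attention_unit(x, conv1, conv2, prelu_a=0.25):
    """Sketch of one RESA block: conv -> PReLU -> conv -> attention -> skip add.
    conv1/conv2 are 1x1 convolutions given as (C_out, C_in) matrices."""
    h = np.einsum('oc,chw->ohw', conv1, x)
    h = np.where(h > 0, h, prelu_a * h)   # PReLU activation
    h = np.einsum('oc,chw->ohw', conv2, h)
    h = channel_attention(h)              # attention weighting
    return x + h                          # residual connection
```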
In one embodiment, the weighted features are input into the up-sampling module, which comprises two convolutional layers and a sub-pixel rearrangement layer connected in series. The first convolutional layer expands the number of channels of the input by the square of the super-resolution multiple; the sub-pixel rearrangement layer then reduces the number of channels back by the same factor while enlarging the feature map in each channel by the super-resolution multiple in height and width, realizing the up-sampling of the data; finally, the second convolutional layer reduces the feature dimension to 9, giving the output up-sampling features.
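The sub-pixel rearrangement step is the well-known pixel-shuffle operation; a sketch, assuming the usual channels-first array layout:

```python
import numpy as np

def pixel_shuffle(x, s):
    """Sub-pixel rearrangement: (C*s*s, H, W) -> (C, s*H, s*W)."""
    c2, h, w = x.shape
    assert c2 % (s * s) == 0
    c = c2 // (s * s)
    x = x.reshape(c, s, s, h, w)    # split the channel axis into (c, s, s)
    x = x.transpose(0, 3, 1, 4, 2)  # reorder to (c, h, s, w, s)
    return x.reshape(c, h * s, w * s)
```

On a feature map with 9 * s * s channels this yields the 9-channel output with height and width enlarged s-fold, as described above.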
in one embodiment, the up-sampling features are input into the post-processing module, which performs inverse normalization on the up-sampling features according to the maximum and minimum values of the groups of real data obtained in the grouped residual convolution module, yielding the denormalized data.
Then the denormalized data are recombined: the real-valued channels are reassembled into the six complex polarized coherent matrix channels, reconstructing the high-resolution reconstructed image data.
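A sketch of this inverse normalization and recombination, assuming the nine real channels are ordered as the three diagonal elements followed by the real parts and then the imaginary parts of the three off-diagonal elements, and that the stored (min, max) statistics and adjustment coefficient come from the grouped residual convolution module; both assumptions are labeled, not taken from the source.

```python
import numpy as np

def denorm_and_recombine(y9, stats, a=255.0):
    """y9: real array (9, H, W) = [diagonal(3) | Re off-diag(3) | Im off-diag(3)].
    Undo the min-max scaling with the stored (min, max) pairs, then rebuild
    the six complex coherency-matrix channels."""
    def denorm(g, lo, hi):
        return g * (hi - lo) / a + lo    # inverse of the min-max scaling
    diag = denorm(y9[:3], *stats[0])
    re = denorm(y9[3:6], *stats[1])
    im = denorm(y9[6:9], *stats[2])
    return np.concatenate([diag.astype(complex), re + 1j * im])
```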
in one embodiment, the high-resolution reconstructed image data and the high-resolution image data are input into a pre-constructed loss function.
According to the loss function, the single-channel loss functions over the polarization radar image training data set are computed, and the single-channel loss functions are weighted and summed to obtain the multi-channel weighted loss function,
wherein the weight associated with each channel is determined by the maximum value in that channel of the corresponding training pair, and the summation runs over the total number of channels;
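One plausible reading of this weighting, sketched below with its assumptions made explicit: the single-channel loss is taken as L1, and each channel's loss is divided by that channel's maximum magnitude so that channels of very different dynamic range contribute comparably. The text only fixes that per-channel losses are combined with channel-maximum-dependent weights; the specific form here is an assumption.

```python
import numpy as np

def multichannel_weighted_loss(pred, target):
    """Weighted sum of per-channel L1 losses; weight = 1 / channel maximum."""
    C = pred.shape[0]
    per_channel = np.abs(pred - target).mean(axis=(1, 2))  # single-channel losses
    peaks = np.array([max(np.abs(target[c]).max(), 1e-12) for c in range(C)])
    return float((per_channel / peaks).sum())
```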
according to the multi-channel weighted loss function, the polarization radar image super-resolution reconstruction network model is trained to obtain the trained polarization radar image super-resolution reconstruction network model.
Specifically, the Adaptive Moment Estimation (Adam) optimizer is selected as the network model optimizer, with its parameters set accordingly.
To further verify the beneficial effect of the residual-network-based polarization radar image super-resolution reconstruction method provided by the invention, in a specific embodiment, for an observation target with a trihedral structure, the method of the invention and the bicubic interpolation (Bicubic) method are respectively used to perform super-resolution reconstruction on low-resolution polarization radar images at a pitch angle of 60 degrees; the Peak Signal-to-Noise Ratio (PSNR) index is adopted to quantitatively evaluate the total backscattering power feature SPAN of the reconstruction results, and the average PSNR indexes over the eight observation targets obtained by the different methods are shown in Table 1.
TABLE 1 Average PSNR indexes obtained by different methods
As can be seen from Table 1, compared with the Bicubic method, the average PSNR index obtained by the method provided by the present invention is improved by 4.12 dB; that is, the reconstruction results obtained by the method have a higher peak signal-to-noise ratio and higher reconstruction accuracy, and can provide important technical support for subsequent radar target detection and recognition.
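For reference, the PSNR index used in this comparison follows the standard definition; a sketch (the choice of peak value for the SPAN feature is an assumption, defaulting to the reference maximum):

```python
import numpy as np

def psnr(ref, test, peak=None):
    """Peak signal-to-noise ratio in dB between a reference image and a
    reconstruction: PSNR = 10 * log10(peak^2 / MSE)."""
    mse = np.mean((ref - test) ** 2)
    if mse == 0:
        return float('inf')
    if peak is None:
        peak = ref.max()
    return float(10.0 * np.log10(peak ** 2 / mse))
```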
It should be understood that, although the steps in the flowchart of FIG. 1 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in FIG. 1 may include multiple sub-steps or stages that are not necessarily completed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily executed sequentially; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, a polarized radar image super-resolution reconstruction apparatus based on a residual error network is provided, comprising a training set construction module, a model construction module, a model training module and a test module, wherein:
the training set construction module is used for carrying out imaging processing on echo data of an observation target to obtain a polarimetric radar image training data set; the polarimetric radar image training data set comprises a plurality of groups of data pairs consisting of high-resolution image data and low-resolution image data obtained by imaging, and the high-resolution image data and the low-resolution image data are complex data.
It can be understood that echo data of an observation target with different imaging apertures and different bandwidths, acquired under different observation angles, are subjected to imaging processing to obtain high-resolution and low-resolution image data pairs corresponding to the polarization radar system parameters, and multiple groups of image data pairs are assembled into a training set, providing a trainable and testable super-resolution data set for the polarization radar image super-resolution reconstruction network model. The constructed polarized radar image training data set can additionally be augmented or expanded using data augmentation means such as rotation, cropping and translation.
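A sketch of such pair-consistent augmentation (rotations and a flip); the function name is illustrative, and the key point is that the same transform must be applied to the HR and LR members of a pair so they stay aligned:

```python
import numpy as np

def augment(pair_hr, pair_lr):
    """Expand one HR/LR training pair into five aligned pairs:
    the original, three 90-degree rotations, and a horizontal flip."""
    out = [(pair_hr, pair_lr)]
    for k in (1, 2, 3):  # 90-degree rotations over the spatial axes
        out.append((np.rot90(pair_hr, k, axes=(-2, -1)),
                    np.rot90(pair_lr, k, axes=(-2, -1))))
    out.append((pair_hr[..., ::-1], pair_lr[..., ::-1]))  # horizontal flip
    return out
```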
The model construction module is used for inputting low-resolution image data into a pre-constructed polarization radar image super-resolution reconstruction network model based on a residual error network and outputting high-resolution reconstruction image data; the polarization radar image super-resolution reconstruction network model comprises a grouping residual error convolution module, a residual error attention backbone network module, an up-sampling module and a post-processing module; performing data recombination on a real part and an imaginary part in the low-resolution image data according to a grouping residual convolution module to obtain total real data, and performing feature extraction on the total real data to obtain image shallow layer features; according to the residual attention backbone network module, deep feature extraction, space attention weighting and channel attention weighting are carried out on the image shallow feature to obtain weighted features; according to the up-sampling module, up-sampling is carried out on the weighted features to obtain up-sampling features; and processing the up-sampling characteristics according to a post-processing module to obtain high-resolution reconstructed image data.
It can be understood that performing data recombination on the real part and the imaginary part of the low-resolution image data in the grouped residual convolution module of the polarization radar image super-resolution reconstruction network model makes full use of both parts, so that the image shallow features in the low-resolution image data are effectively extracted; performing deep feature extraction, spatial attention weighting and channel attention weighting on the image shallow features in the residual attention backbone network module makes full use of the correlations across the spatial domain and the multiple channels of the shallow and deep features, ensuring effective reconstruction of the spatial and channel information in the image shallow features.
And the model training module is used for inputting the high-resolution reconstructed image data and the high-resolution image data into a loss function which is constructed in advance, and training the polarized radar image super-resolution reconstruction network model according to the loss function to obtain the trained polarized radar image super-resolution reconstruction network model.
It can be understood that according to the characteristic that high-resolution reconstructed image data output by the polarized radar image super-resolution reconstruction network model is multi-channel data, the pre-constructed loss function is a multi-channel weighted loss function.
And the test module is used for inputting the low-resolution polarization radar image to be reconstructed into the trained polarization radar image super-resolution reconstruction network model for super-resolution reconstruction to obtain the high-resolution polarization radar image.
The polarization radar image super-resolution reconstruction network model provided by the invention can also be extended to other polarization-radar-related image fields, such as polarimetric interferometric SAR images, dual-polarization SAR images and fully polarized SAR images.
The specific limitations of the polarization radar image super-resolution reconstruction device based on the residual error network can refer to the limitations of the polarization radar image super-resolution reconstruction method based on the residual error network, and are not described herein again. All or part of the modules in the polarization radar image super-resolution reconstruction device based on the residual error network can be realized through software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is specific and detailed, but not to be understood as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.
Claims (10)
1. A polarized radar image super-resolution reconstruction method based on a residual error network is characterized by comprising the following steps:
imaging processing is carried out on echo data of an observation target, and a polarimetric radar image training data set is obtained; the polarimetric radar image training data set comprises a plurality of groups of data pairs consisting of high-resolution image data and low-resolution image data obtained by imaging, and the high-resolution image data and the low-resolution image data are complex data;
inputting the low-resolution image data into a pre-constructed polarization radar image super-resolution reconstruction network model based on a residual error network, and outputting high-resolution reconstruction image data; the polarization radar image super-resolution reconstruction network model comprises a grouping residual error convolution module, a residual error attention main network module, an up-sampling module and a post-processing module; performing data recombination on a real part and an imaginary part in the low-resolution image data according to the grouped residual convolution module to obtain total real data, and performing feature extraction on the total real data to obtain image shallow layer features; carrying out deep feature extraction, spatial attention weighting and channel attention weighting on the image shallow feature according to the residual attention backbone network module to obtain weighted features; according to the up-sampling module, up-sampling is carried out on the weighted features to obtain up-sampling features; processing the up-sampling characteristics according to the post-processing module to obtain high-resolution reconstructed image data;
inputting the high-resolution reconstructed image data and the high-resolution image data into a pre-constructed loss function, and training the polarized radar image super-resolution reconstruction network model according to the loss function to obtain a trained polarized radar image super-resolution reconstruction network model;
and inputting the low-resolution polarization radar image to be reconstructed into the trained polarization radar image super-resolution reconstruction network model for super-resolution reconstruction to obtain a high-resolution polarization radar image.
2. The method of claim 1, further comprising:
establishing a polarized radar observation model; wherein the polarized radar observation model comprises two or more basic scattering structures;
and performing data simulation on the observation target with electromagnetic simulation software, according to the basic scattering structures in the polarization radar observation model, to obtain echo data of the observation target at different observation angles and with different bandwidths.
3. The method of claim 1, wherein imaging echo data of an observed target to obtain a polarimetric radar image training dataset comprises:
and imaging the echo data according to the central imaging angle, the imaging aperture and the bandwidth of the echo data to obtain a plurality of groups of data pairs including high-resolution image data and low-resolution image data, and forming the plurality of groups of data pairs into the polarized radar image training data set.
4. The method of claim 3, wherein the image processing of the echo data according to the central imaging angle, imaging aperture and bandwidth of the echo data to obtain a plurality of sets of data pairs including high resolution image data and low resolution image data comprises:
calculating to obtain an imaging aperture required by the echo data according to the azimuth resolution, and determining a coherent accumulation angle required by the echo data according to the imaging aperture and the central imaging angle;
calculating a bandwidth value required by the echo data according to the distance resolution;
imaging the echo data according to the angle interval of the coherent accumulation angle and the bandwidth value to obtain a plurality of groups of data pairs including high-resolution image data and low-resolution image data; the angle interval comprises an angle interval corresponding to the high-resolution image data and an angle interval corresponding to the low-resolution image data, and the central imaging angles of the same group of data pairs formed by the high-resolution image data and the low-resolution image data are the same.
5. The method according to any one of claims 1 to 4, wherein the performing data reorganization on a real part and an imaginary part in the low resolution image data according to the grouped residual convolution module to obtain total real data, and performing feature extraction on the total real data to obtain image shallow features comprises:
splitting a real part and an imaginary part in the low-resolution image data and performing modulus value processing to obtain total real data, and performing grouping processing on the total real data to obtain multiple groups of real data;
normalizing the multiple groups of real number data according to the maximum value and the minimum value in the multiple groups of real number data to obtain multiple groups of normalized real number data;
inputting the multiple groups of normalized real number data into a residual error attention unit for feature extraction to obtain multiple groups of shallow layer features, wherein the residual error attention unit comprises two convolution layers, an activation layer and an attention module;
and splicing the multiple groups of shallow features in the channel dimension to obtain the image shallow features.
6. The method of claim 1, wherein the residual attention backbone network module comprises a plurality of residual attention units;
according to the residual attention main network module, deep feature extraction, space attention weighting and channel attention weighting are carried out on the image shallow feature to obtain weighted features, and the method comprises the following steps:
and sequentially carrying out deep feature extraction, space attention weighting and channel attention weighting on the image shallow feature according to the residual attention units to obtain a weighted image deep feature, and adding the weighted image deep feature and the image shallow feature to obtain the weighted feature.
7. The method of claim 1, wherein the upsampling module comprises two convolutional layers, a sub-pel reordering layer;
according to the up-sampling module, up-sampling the weighted features to obtain up-sampling features, and the up-sampling features comprise:
and sequentially performing channel expansion, channel reduction and characteristic dimension reduction on the weighted characteristic according to the first convolutional layer, the sub-pixel rearrangement layer and the second convolutional layer to obtain the up-sampling characteristic.
8. The method of claim 1, wherein the processing the up-sampling features according to the post-processing module to obtain high-resolution reconstructed image data comprises:
the post-processing module carries out reverse normalization processing on the up-sampling characteristics according to the maximum value and the minimum value in the multiple groups of real number data in the grouped residual convolution module to obtain reverse normalization data;
and performing data recombination on the reverse normalized data to obtain the high-resolution reconstructed image data.
9. The method according to claim 1, wherein the inputting the high resolution reconstructed image data and the high resolution image data into a pre-constructed loss function, and training the polar radar image super-resolution reconstruction network model according to the loss function to obtain a trained polar radar image super-resolution reconstruction network model comprises:
inputting the high-resolution reconstructed image data and the high-resolution image data into a pre-constructed loss function, and calculating a single-channel loss function in the polarized radar image training data set according to the loss function;
carrying out weighted summation on the single-channel loss function to obtain a multi-channel weighted loss function;
and training the polarized radar image super-resolution reconstruction network model according to the multi-channel weighting loss function to obtain the trained polarized radar image super-resolution reconstruction network model.
10. A polarized radar image super-resolution reconstruction device based on a residual error network is characterized by comprising:
the training set construction module is used for performing imaging processing on echo data of an observation target to obtain a polarization radar image training data set; the polarimetric radar image training data set comprises a plurality of groups of data pairs consisting of high-resolution image data and low-resolution image data obtained by imaging, wherein the high-resolution image data and the low-resolution image data are complex data;
the model construction module is used for inputting the low-resolution image data into a pre-constructed polarization radar image super-resolution reconstruction network model based on a residual error network and outputting high-resolution reconstruction image data; the polarization radar image super-resolution reconstruction network model comprises a grouping residual error convolution module, a residual error attention backbone network module, an up-sampling module and a post-processing module; performing data recombination on a real part and an imaginary part in the low-resolution image data according to the grouped residual convolution module to obtain total real data, and performing feature extraction on the total real data to obtain image shallow layer features; carrying out deep feature extraction, spatial attention weighting and channel attention weighting on the image shallow feature according to the residual attention backbone network module to obtain weighted features; according to the up-sampling module, up-sampling is carried out on the weighted features to obtain up-sampling features; processing the up-sampling characteristics according to the post-processing module to obtain high-resolution reconstructed image data;
the model training module is used for inputting the high-resolution reconstructed image data and the high-resolution image data into a pre-constructed loss function, and training the polarized radar image super-resolution reconstruction network model according to the loss function to obtain a trained polarized radar image super-resolution reconstruction network model;
and the test module is used for inputting the low-resolution polarization radar image to be reconstructed into the trained polarization radar image super-resolution reconstruction network model for super-resolution reconstruction to obtain a high-resolution polarization radar image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210900724.6A CN114972041B (en) | 2022-07-28 | 2022-07-28 | Polarization radar image super-resolution reconstruction method and device based on residual error network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114972041A true CN114972041A (en) | 2022-08-30 |
CN114972041B CN114972041B (en) | 2022-10-21 |
Family
ID=82969899
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210900724.6A Active CN114972041B (en) | 2022-07-28 | 2022-07-28 | Polarization radar image super-resolution reconstruction method and device based on residual error network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114972041B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116128727A (en) * | 2023-02-02 | 2023-05-16 | 中国人民解放军国防科技大学 | Super-resolution method, system, equipment and medium for polarized radar image |
WO2024136129A1 (en) * | 2022-12-22 | 2024-06-27 | 오픈엣지테크놀로지 주식회사 | Network parameter correction method for neural network operating in integer type npu, and device therefor |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110221630A1 (en) * | 2010-03-12 | 2011-09-15 | The Boeing Company | Super-resolution imaging radar |
CN104122549A (en) * | 2014-07-21 | 2014-10-29 | 电子科技大学 | Deconvolution based radar angle super-resolution imaging method |
CN112419155A (en) * | 2020-11-26 | 2021-02-26 | 武汉大学 | Super-resolution reconstruction method for fully-polarized synthetic aperture radar image |
US20210208247A1 (en) * | 2020-01-03 | 2021-07-08 | Qualcomm Incorporated | Super-resolution enhancement techniques for radar |
CN113096017A (en) * | 2021-04-14 | 2021-07-09 | 南京林业大学 | Image super-resolution reconstruction method based on depth coordinate attention network model |
WO2022057837A1 (en) * | 2020-09-16 | 2022-03-24 | 广州虎牙科技有限公司 | Image processing method and apparatus, portrait super-resolution reconstruction method and apparatus, and portrait super-resolution reconstruction model training method and apparatus, electronic device, and storage medium |
CN114429422A (en) * | 2021-12-22 | 2022-05-03 | 山东师范大学 | Image super-resolution reconstruction method and system based on residual channel attention network |
Non-Patent Citations (6)
Title |
---|
H SHEN,L LIN,J LI,Q YUAN,L ZHAO: "A residual convolutional neural network for polarimetric SAR image super-resolution", 《ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING》 * |
CUI XINGCHAO, SU YI, CHEN SIWEI: "Polarimetric SAR ship detection combining polarimetric rotation domain features and superpixel technique", Journal of Radars *
YANG YALIN: "Research on image super-resolution reconstruction algorithms based on through-wall radar", China Master's Theses Full-text Database, Information Science and Technology *
WANG YONGJIN: "Research on image super-resolution algorithms based on deep learning", China Master's Theses Full-text Database, Information Science and Technology *
TIAN BIAO et al.: "Review of high-resolution imaging techniques of wideband inverse synthetic aperture radar", Journal of Radars *
BAI FURUI: "Research on super-resolution reconstruction algorithms based on attention-mechanism neural networks", China Master's Theses Full-text Database, Information Science and Technology *
Also Published As
Publication number | Publication date |
---|---|
CN114972041B (en) | 2022-10-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114972041B (en) | Polarization radar image super-resolution reconstruction method and device based on residual error network | |
CN111476717B (en) | Face image super-resolution reconstruction method based on self-attention generation countermeasure network | |
CN111369440B (en) | Model training and image super-resolution processing method, device, terminal and storage medium | |
CN111353947A (en) | Magnetic resonance parallel imaging method and related equipment | |
CN111784581B (en) | SAR image super-resolution reconstruction method based on self-normalization generation countermeasure network | |
Qin et al. | Enhancing ISAR resolution by a generative adversarial network | |
Wei | Image super‐resolution reconstruction using the high‐order derivative interpolation associated with fractional filter functions | |
Yeganeh et al. | Objective quality assessment for image super-resolution: A natural scene statistics approach | |
WO2023124971A1 (en) | Magnetic resonance imaging down-sampling and reconstruction method based on cross-domain network | |
CN111754598B (en) | Local space neighborhood parallel magnetic resonance imaging reconstruction method based on transformation learning | |
CN111640067B (en) | Single image super-resolution reconstruction method based on three-channel convolutional neural network | |
Lin et al. | Low-resolution fully polarimetric SAR and high-resolution single-polarization SAR image fusion network | |
CN109991602A (en) | ISAR image resolution enhancement method based on depth residual error network | |
CN111667407A (en) | Image super-resolution method guided by depth information | |
CN114140442A (en) | Deep learning sparse angle CT reconstruction method based on frequency domain and image domain degradation perception | |
Wang et al. | Group shuffle and spectral-spatial fusion for hyperspectral image super-resolution | |
CN105931184A (en) | SAR image super-resolution method based on combined optimization | |
Wang et al. | Local conditional neural fields for versatile and generalizable large-scale reconstructions in computational imaging | |
Chilukuri et al. | Analysing Of Image Quality Computation Models Through Convolutional Neural Network | |
CN114565511B (en) | Lightweight image registration method, system and device based on global homography estimation | |
CN116862765A (en) | Medical image super-resolution reconstruction method and system | |
CN114066749B (en) | Phase correlation anti-noise displacement estimation method, device and storage medium | |
Li et al. | Polarimetric ISAR super-resolution based on group residual attention network | |
Zhang et al. | A novel super-resolution method of PolSAR images based on target decomposition and polarimetric spatial correlation | |
Liu et al. | CNN-Enhanced graph attention network for hyperspectral image super-resolution using non-local self-similarity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||