CN114663303A - Neural network-based remote sensing image cloud layer distinguishing and removing method - Google Patents
Info
- Publication number: CN114663303A
- Application number: CN202210255108.XA
- Authority
- CN
- China
- Prior art keywords
- cloud
- cloud layer
- remote sensing
- image
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T5/73 — Image enhancement or restoration: Deblurring; Sharpening
- G06N3/045 — Neural networks, architecture: Combinations of networks
- G06N3/048 — Neural networks, architecture: Activation functions
- G06N3/08 — Neural networks: Learning methods
- G06T5/77 — Image enhancement or restoration: Retouching; Inpainting; Scratch removal
- G06T2207/10032 — Image acquisition modality: Satellite or aerial image; Remote sensing
- G06T2207/20081 — Special algorithmic details: Training; Learning
Abstract
The invention relates to a neural-network-based method for distinguishing and removing cloud layers in remote sensing images, which comprises the following steps: selecting a plurality of cloud-free remote sensing images and simulating cloud layers to obtain cloud layer thickness images; adding each thickness image to its cloud-free image to obtain a cloud image, and extracting the edges of the thickness image, so as to form a remote sensing image data set; constructing a cloud layer edge estimation unit and a cloud layer thickness estimation unit; building a feature strengthening unit and a cloud layer discrimination unit; constructing a cloud layer removal network that generates a cloud-removed image from a single cloud image; training the cloud layer removal network with a loss function based on the cloud layer discrimination unit; and, after training, importing the model parameters to obtain a cloud-removed remote sensing image from a single cloud remote sensing image.
Description
Technical Field
The invention relates to the technical field of image processing and deep learning, and in particular to a neural-network-based method for distinguishing and removing cloud layers in remote sensing images.
Background
Remote sensing satellite data has characteristics such as high resolution and high information content, and is widely applied in many fields, for example traffic monitoring and weather forecasting. However, optical remote sensing images are often affected by factors such as cloud and fog: part of the electromagnetic waves in the visible band is blocked and only a portion reaches the optical sensor, so ground information in the image is covered and valuable information is lost, which greatly reduces the utilization efficiency of optical remote sensing images and wastes resources. Removing cloud layers from remote sensing images has therefore become an urgent problem.
Researchers have studied cloud removal algorithms for remote sensing images, which are generally divided into transformation methods, hypothesis-prior methods, deep learning methods, and so on. A typical transformation method is homomorphic filtering: because cloud information lies in the low-frequency domain of the image, the influence of the cloud layer can be reduced by enhancing high-frequency information and suppressing low-frequency information. Hypothesis-prior methods derive assumptions from a statistical analysis of cloudy and cloud-free images and use them as priors to invert the cloud-free remote sensing image. Deep learning methods exploit the strong feature learning capability of convolutional neural networks to recover the cloud-free image, and are characterized by strong adaptability and robustness. A remote sensing image cloud removal method based on a deep neural network therefore has practical significance and application value.
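As background only (not part of the patented method), the homomorphic filtering approach mentioned above can be sketched in NumPy; the cutoff and gain parameters here are illustrative assumptions, not values from the patent:

```python
import numpy as np

def homomorphic_filter(img, cutoff=0.1, low_gain=0.5, high_gain=1.5):
    # Classic homomorphic filtering: work in the log domain, suppress low
    # frequencies (cloud/illumination) and boost high frequencies (ground detail).
    log_img = np.log1p(img)
    F = np.fft.fftshift(np.fft.fft2(log_img))
    h, w = img.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    d = np.sqrt((yy / h) ** 2 + (xx / w) ** 2)  # normalized frequency radius
    # Gaussian-shaped weighting rising from low_gain (DC) to high_gain.
    H = low_gain + (high_gain - low_gain) * (1.0 - np.exp(-(d / cutoff) ** 2))
    out = np.fft.ifft2(np.fft.ifftshift(F * H)).real
    return np.expm1(out)

img = np.clip(np.random.default_rng(2).random((64, 64)), 0.0, 1.0)
flat = homomorphic_filter(img)
```

As the patent notes, such global filtering treats the whole image uniformly, which is exactly the limitation the proposed network addresses.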
Disclosure of Invention
The purpose of the invention is as follows: to overcome the defects of the prior art, the invention provides a neural-network-based method for distinguishing and removing cloud layers in remote sensing images, which automatically discriminates the cloud layer thickness of a remote sensing image and adaptively removes the cloud layer without additional prior information.
The technical scheme of the invention is as follows: a method for distinguishing and removing cloud layers of remote sensing images based on a neural network comprises the following steps:
1) selecting a plurality of cloud-free remote sensing images, simulating cloud layers to obtain cloud layer thickness images, adding each cloud layer thickness image to the cloud-free image to obtain a cloud remote sensing image, and extracting the edges of the cloud layer thickness image to obtain a cloud layer edge image; forming a remote sensing image data set from the cloud-free images, the cloud remote sensing images, the cloud layer thickness images and the cloud layer edge images, and dividing the data set into a training set, a validation set and a test set;
2) the cloud layer edge estimation unit is set up and used for estimating edge characteristic information of a cloud layer, and the cloud layer thickness estimation unit is set up and used for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
3) the method comprises the steps of establishing a characteristic strengthening unit for strengthening image cloud layer characteristics, and establishing a cloud layer distinguishing unit for distinguishing whether cloud in an image is thoroughly removed;
4) building a cloud layer removing network based on a cloud layer edge estimation unit, a cloud layer thickness estimation unit, a feature strengthening unit and a cloud layer distinguishing unit, wherein the cloud layer removing network is used for generating a cloud removing picture from a single cloud picture;
5) training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
6) and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
In the step 1), the specific steps of constructing the remote sensing image data set are as follows:
11) selecting m cloud-free remote sensing images; simulating cloud layers to obtain m cloud layer thickness images; adding the thickness images to the cloud-free images to obtain m cloud remote sensing images; extracting edges of the thickness images with a Sobel operator to obtain m cloud layer edge images; and segmenting all images into N×N patches, obtaining n cloud-free remote sensing images H, n cloud remote sensing images K, n cloud layer thickness images G and n cloud layer edge images J, which together form the data set U, wherein m, n and N are positive integers;
12) expanding the data set, specifically: randomly selecting a% of the images in the data set U of step 11) and applying a random rotation to each, the rotation operations comprising 90-degree rotation, 180-degree rotation, 270-degree rotation and mirror flipping; the expanded images so obtained are added to the data set U to form the remote sensing image data set V, wherein a is a positive integer;
13) dividing the remote sensing image data set V of step 12) into a training set, a validation set and a test set according to the ratio z1:z2:z3, wherein z1, z2 and z3 are all positive integers.
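The dataset-construction steps 11)–12) can be sketched as follows. This is an illustrative NumPy stand-in, not the patent's implementation: the patent does not specify how the cloud layer is simulated, so the upsampled-noise field and additive cloud model below are assumptions, and all helper names are ours.

```python
import numpy as np

def sobel_edges(img):
    # Step 11: Sobel gradient magnitude of the cloud thickness map, plain NumPy.
    p = np.pad(img, 1, mode="edge")
    gx = (p[:-2, 2:] + 2 * p[1:-1, 2:] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[1:-1, :-2] - p[2:, :-2])
    gy = (p[2:, :-2] + 2 * p[2:, 1:-1] + p[2:, 2:]
          - p[:-2, :-2] - 2 * p[:-2, 1:-1] - p[:-2, 2:])
    return np.hypot(gx, gy)

def simulate_cloud(shape, rng, block=8):
    # Stand-in cloud-thickness map: low-resolution noise upsampled to a field.
    h, w = shape
    coarse = rng.random((h // block, w // block))
    return np.kron(coarse, np.ones((block, block))) * 0.5

def make_sample(clear, rng):
    # One (H, K, G, J) tuple: cloud-free, cloudy, thickness, edge images.
    G = simulate_cloud(clear.shape, rng)
    K = np.clip(clear + G, 0.0, 1.0)   # additive cloud model (assumed)
    J = sobel_edges(G)
    return clear, K, G, J

def augment(img, rng):
    # Step 12: one random 90/180/270-degree rotation or mirror flip.
    op = rng.integers(0, 4)
    return np.fliplr(img) if op == 3 else np.rot90(img, k=op + 1)

rng = np.random.default_rng(0)
clear = rng.random((256, 256))         # one N x N patch, N = 256
H, K, G, J = make_sample(clear, rng)
```

The cloudy patch K is everywhere at least as bright as H under this additive model, which is the property the thickness-estimation unit later exploits.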
In the step 2), the specific construction process of the cloud layer edge estimation unit is as follows:
A cloud layer edge estimation unit is built from w residual modules and is used to estimate the edge feature information of the cloud layer; the unit's input is a cloud remote sensing image and its output is a cloud layer edge map, wherein w is a positive integer.
The residual module comprises z convolution blocks Block and 1 residual connection; each convolution block consists of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning and 1 feature strengthening unit; the convolution kernel sizes are s × s and the strides are e, wherein z, s and e are positive integers and lr is a decimal in the range [0,1].
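A minimal single-channel NumPy sketch of one such convolution block (2 convolutions, 2 LReLU activations, a feature strengthening gate, and residual learning); kernel sizes, weight scales, and the exact placement of the gate before the residual add are our assumptions, since the patent describes the components but not their wiring in code:

```python
import numpy as np

def conv2d(x, w):
    # 'same'-padded stride-1 convolution of a single-channel map x with kernel w.
    k = w.shape[0]
    xp = np.pad(x, k // 2)
    out = np.zeros_like(x)
    for i in range(k):
        for j in range(k):
            out += w[i, j] * xp[i:i + x.shape[0], j:j + x.shape[1]]
    return out

def lrelu(x, lr=0.2):
    # LReLU with slope lr on the negative side.
    return np.where(x > 0, x, lr * x)

def feu(x, w):
    # Feature strengthening: conv -> LReLU -> Sigmoid gate, multiplied with x.
    gate = 1.0 / (1.0 + np.exp(-lrelu(conv2d(x, w))))
    return gate * x

def conv_block(x, w1, w2, wf, lr=0.2):
    # 2 convolutions + 2 LReLU + feature strengthening, then residual learning.
    y = lrelu(conv2d(x, w1), lr)
    y = lrelu(conv2d(y, w2), lr)
    return x + feu(y, wf)

rng = np.random.default_rng(1)
x = rng.standard_normal((32, 32))
w1, w2, wf = [rng.standard_normal((3, 3)) * 0.1 for _ in range(3)]
y = conv_block(x, w1, w2, wf)
```

With all-zero weights the block reduces to the identity, which is the usual sanity check for a residual connection.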
In the step 2), the specific construction process of the cloud layer thickness estimation unit is as follows:
A cloud layer thickness estimation unit is built from y double residual modules and 1 residual connection; it generates cloud layer thickness information and provides it to the subsequent network to complete cloud removal; the unit's input is a cloud remote sensing image and its output is a cloud layer thickness map, wherein y is a positive integer.
The double residual module consists of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning units and 1 feature strengthening unit; the convolution kernel sizes are d × d and the strides are t, wherein d and t are positive integers and lr is a decimal in the range [0,1].
In the step 3), the specific construction process of the feature enhancing unit is as follows:
A feature strengthening unit is built from u convolutions, u−1 LReLU activation functions with slope lr and 1 Sigmoid activation function, and is used to strengthen image features; the convolution kernel sizes are f × f and the strides are g, wherein u, f and g are positive integers and lr is a decimal in the range [0,1];
the process of enhancing the image features is as follows: the input of the feature strengthening unit is convolved and passed through an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; the feature map β1 is multiplied pixel-by-pixel with the original input β2 to realize feature enhancement at the adaptive element level; wherein lr is a decimal in the range [0,1].
In the step 3), the specific construction process of the cloud layer discrimination unit is as follows:
A cloud layer discrimination unit is built from j convolutions, j−1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, and is used to discriminate whether the cloud in an image has been removed completely; the kernel sizes of the j convolutions are in turn {f_i × f_i | i ∈ (1, 2, …, j)} and the strides are in turn {d_i | i ∈ (1, 2, …, j)}, wherein j, f_i and d_i are all positive integers;
the process of judging whether the cloud in the image is completely removed is as follows: training a cloud layer distinguishing unit by adopting a data set cloud-free remote sensing image and a cloud remote sensing image, wherein the input of the cloud layer distinguishing unit is an image, the output of the cloud layer distinguishing unit is probability x, and x belongs to [0,1 ]; if the cloud layer of the input image is completely removed, the output x of the cloud layer judging unit is close to 1, and if the cloud layer is not completely removed, the output x is close to 0; wherein x is a decimal number in the range of [0,1 ].
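The discrimination unit's forward pass can be sketched as below. Since Tanh outputs lie in [−1, 1] while the unit's probability x must lie in [0, 1], the final rescaling `(score + 1) / 2` is our assumption; kernel sizes, strides, and the single-channel simplification are likewise illustrative:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def strided_conv(x, w, stride):
    # Valid convolution with stride, single channel (illustrative only).
    k = w.shape[0]
    oh = (x.shape[0] - k) // stride + 1
    ow = (x.shape[1] - k) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = x[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.sum(patch * w)
    return out

def discriminate(img, weights, strides):
    # j convolutions with ReLU between them, Tanh on the last, then mean pooling.
    y = img
    for w, s in zip(weights[:-1], strides[:-1]):
        y = relu(strided_conv(y, w, s))
    y = np.tanh(strided_conv(y, weights[-1], strides[-1]))
    score = y.mean()                 # mean pooling down to a scalar
    return (score + 1.0) / 2.0       # map [-1, 1] -> [0, 1] (assumed)

rng = np.random.default_rng(3)
img = rng.random((32, 32))
weights = [rng.standard_normal((3, 3)) * 0.1 for _ in range(3)]
x_prob = discriminate(img, weights, [2, 2, 2])
```

Training then pushes x_prob toward 1 on cloud-free images H and toward 0 on cloudy images K, as described above.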
In the step 4), the specific process of constructing the cloud layer removing network is as follows:
training a cloud layer removal network by adopting a cloud remote sensing image, a non-cloud remote sensing image, a cloud layer thickness image and a cloud layer edge image in a data set;
the output α1 of the cloud layer edge estimation unit, the output α2 of the cloud layer thickness estimation unit and the thin-cloud remote sensing image α3 are concatenated along the channel dimension, and the feature map α4 is obtained after the feature strengthening unit; the cloud layer discrimination unit then judges the result: if its output is greater than the threshold k, the cloud layer is considered thoroughly removed and the cloud-removed remote sensing image is output; if the output is less than or equal to k, the cloud layer is considered not thoroughly removed and the above process is repeated to continue removing the cloud layer, wherein k is a decimal in the range [0,1].
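The judge-and-repeat control flow above amounts to a simple loop. In this schematic, `remove_once` and `judge` are toy stand-ins for the removal network and the discrimination unit (the iteration cap is our addition to guarantee termination):

```python
def remove_clouds(image, remove_once, judge, k=0.9, max_iters=10):
    # Repeatedly apply the removal pass until the discrimination output
    # exceeds the threshold k, or an iteration cap is reached.
    out = image
    for _ in range(max_iters):
        out = remove_once(out)
        if judge(out) > k:
            break
    return out

# Toy stand-ins: each pass halves the remaining 'cloud' amount of a scalar,
# and the judge reports how cloud-free the result is.
result = remove_clouds(1.0, lambda v: v / 2.0, lambda v: 1.0 - v, k=0.9)
```

With these stand-ins the loop runs four passes (1.0 → 0.5 → 0.25 → 0.125 → 0.0625) before the judged score 0.9375 exceeds k = 0.9.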
In the step 5), the cloud-removed image feature loss function is specifically:

$$L_{feat}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|\sigma_l(H)_{c,v,b}-\sigma_l(\hat{H})_{c,v,b}\right|$$

wherein c, v and b denote the indices of the length, width and channel of the feature map, W, H and C denote the length, width and channel of the feature map, σ_l(·) denotes the output feature map of layer l of the VGG16 network, H denotes the cloud-free remote sensing image and Ĥ denotes the cloud-removed remote sensing image, with c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C); c, v, b, W, H, C and l are all positive integers.
In the step 5), the cloud image reconstruction loss function is specifically:

$$L_{rec}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|H_{c,v,b}-\hat{H}_{c,v,b}\right|$$
in the step 5), the cloud layer thickness loss function is specifically:
wherein G represents a thickness map of the cloud layer of the label,a predicted cloud thickness map is shown.
In the step 5), the cloud layer edge loss function is specifically:

$$L_{edge}=\frac{1}{WH}\sum_{c=1}^{W}\sum_{v=1}^{H}\left|J_{c,v}-\hat{J}_{c,v}\right|$$

wherein J denotes the labeled cloud layer edge map and Ĵ denotes the predicted cloud layer edge map.
In the step 5), the cloud layer discrimination unit loss function is specifically:

$$L_{D}=-\frac{1}{p}\sum_{i=1}^{p}\left[\log D(H_i)+\log\left(1-D(K_i)\right)\right]$$

in the formula, H denotes a cloud-free remote sensing image, K denotes a cloud remote sensing image, p denotes the number of images in the training set and D(·) denotes the output of the cloud layer discrimination unit, wherein p is a positive integer.
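The loss terms of step 5) can be sketched numerically as follows. Because the original formula images are not reproduced in the text, the L1 and binary-cross-entropy forms below are reconstructions consistent with the variable definitions, not verbatim patent formulas:

```python
import numpy as np

def l1_loss(a, b):
    # Mean absolute error, as used for the reconstruction, thickness,
    # and edge losses (reconstructed form).
    return np.mean(np.abs(a - b))

def feature_loss(feat_h, feat_hat):
    # Perceptual-style loss between VGG16 layer-l feature maps of the
    # cloud-free image H and the cloud-removed image H-hat.
    return l1_loss(feat_h, feat_hat)

def discriminator_loss(d_clear, d_cloudy, eps=1e-8):
    # Binary cross-entropy form: D should output ~1 on cloud-free images H
    # and ~0 on cloudy images K, averaged over p training images.
    d_clear = np.asarray(d_clear)
    d_cloudy = np.asarray(d_cloudy)
    return -np.mean(np.log(d_clear + eps) + np.log(1.0 - d_cloudy + eps))

rng = np.random.default_rng(4)
H = rng.random((8, 8))
zero = l1_loss(H, H)   # identical inputs give zero loss
```

All four image-space losses vanish when the prediction equals the label, and the discriminator loss approaches zero when D is perfectly confident on both classes.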
The technical scheme provided by the invention has the beneficial effects that:
1. the traditional hypothesis-prior algorithm deduces a cloud-removed image from a physical model based on assumed information; however, such assumptions have a limited range of applicability and can fail in some cases. The present method discriminates and removes cloud layers according to cloud layer features, has a wide range of application, and can process cloud remote sensing images in many scenes;
2. the traditional filtering method generally processes the whole image, whereas the present method adaptively processes the cloud layer regions of the image, so removing the cloud layer does not cause color distortion or similar problems in cloud-free regions, and the cloud-removed image is natural and realistic;
3. the method has the advantages of wide application range, strong robustness, simplicity, convenience and practicability, and high cloud layer removing efficiency.
Drawings
FIG. 1 is a flow chart of a method for discriminating and removing cloud layers of a remote sensing image based on a neural network;
FIG. 2 is a schematic diagram of a cloud edge estimation unit;
FIG. 3 is a schematic diagram of a cloud layer thickness estimation unit;
FIG. 4 is a schematic diagram of a feature enhancement unit structure;
FIG. 5 is a schematic diagram of a cloud layer discriminating unit;
fig. 6 is a schematic diagram of a cloud layer removal network structure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention are described in further detail below.
Example 1
The embodiment of the invention provides a method for distinguishing and removing cloud layers of remote sensing images based on a neural network, and the method is described in detail in the following description with reference to fig. 1:
101: selecting a plurality of cloud-free remote sensing images, simulating cloud layers to obtain cloud layer thickness images, adding each cloud layer thickness image to the cloud-free image to obtain a cloud remote sensing image, and extracting the edges of the cloud layer thickness image to obtain a cloud layer edge image; forming a remote sensing image data set from the cloud-free images, the cloud remote sensing images, the cloud layer thickness images and the cloud layer edge images, and dividing the data set into a training set, a validation set and a test set;
102: constructing a cloud layer edge estimation unit for estimating edge characteristic information of a cloud layer, and constructing a cloud layer thickness estimation unit for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
103: the method comprises the steps of establishing a characteristic strengthening unit for strengthening image cloud layer characteristics, and establishing a cloud layer distinguishing unit for distinguishing whether cloud in an image is thoroughly removed;
104: constructing a cloud layer removing network based on the cloud layer edge estimation unit and the cloud layer thickness estimation unit in the step 102 and the feature strengthening unit and the cloud layer distinguishing unit in the step 103, and generating a cloud layer removing image from a single cloud image;
105: training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
106: and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
The specific steps in step 101 are as follows:
1) selecting m cloud-free remote sensing images; simulating cloud layers to obtain m cloud layer thickness images; adding the thickness images to the cloud-free images to obtain m cloud remote sensing images; extracting edges of the thickness images with a Sobel operator to obtain m cloud layer edge images; and segmenting all images into N×N patches, obtaining n cloud-free remote sensing images H, n cloud remote sensing images K, n cloud layer thickness images G and n cloud layer edge images J, which together form the data set U, wherein m, n and N are positive integers;
2) expanding the data set, specifically: randomly selecting a% of the images in the data set U of step 1), applying to each one of 90-degree rotation, 180-degree rotation, 270-degree rotation and mirror flipping to obtain an expanded image, and adding the expanded images to the data set U to form the remote sensing image data set V, wherein a is a positive integer;
3) dividing the remote sensing image data set V of step 2) into a training set, a validation set and a test set according to the ratio z1:z2:z3, wherein z1, z2 and z3 are all positive integers.
The specific steps in step 102 are as follows:
1) as shown in fig. 2, the residual module includes z convolution blocks Block and 1 residual connection; each convolution block consists of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning and 1 feature strengthening unit; the convolution kernel sizes are s × s and the strides are e, wherein z, s and e are positive integers and lr is a decimal in the range [0,1];
2) as shown in fig. 2, a Cloud Edge Estimation Unit (CEEU) is constructed based on w residual modules, and is used for estimating Edge feature information of a Cloud layer, wherein the Unit inputs a Cloud remote sensing image and outputs a Cloud Edge map, and w is a positive integer;
3) as shown in fig. 3, the double residual module consists of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning units and 1 feature enhancement unit; the convolution kernel sizes are d × d and the strides are t, wherein d and t are positive integers and lr is a decimal in the range [0,1];
4) as shown in fig. 3, a Cloud layer Thickness Estimation Unit (CTEU) is constructed based on y double residual modules and 1 residual connection, and is used for generating Cloud layer Thickness information and providing the Cloud layer Thickness information to a subsequent network to complete Cloud layer removal, the CTEU Unit inputs a Cloud remote sensing image and outputs a Cloud layer Thickness map, wherein y is a positive integer.
Wherein, the specific steps in step 103 are as follows:
1) as shown in fig. 4, a Feature Enhancement Unit (FEU) is constructed based on u convolutions, u−1 LReLU activation functions (with slope lr) and 1 Sigmoid activation function, and is used for enhancing image features; the convolution kernel sizes are f × f and the strides are g, wherein u, f and g are positive integers and lr is a decimal in the range [0,1];
2) the input of the feature enhancement unit FEU is passed through a convolution, an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; the feature map β1 is multiplied pixel-by-pixel with the original input β2 to achieve feature enhancement at the adaptive element level, wherein lr is a decimal in the range [0,1];
3) as shown in fig. 5, a Cloud layer Discrimination Unit (CDU) is constructed based on j convolutions, j−1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, and is used for discriminating whether the cloud in an image has been removed completely; the kernel sizes of the j convolutions are in turn {f_i × f_i | i ∈ (1, 2, …, j)} and the strides are in turn {d_i | i ∈ (1, 2, …, j)}, wherein j, f_i and d_i are all positive integers;
4) the cloud layer discrimination unit CDU is trained with the cloud-free and cloud remote sensing images of the data set; the CDU's input is an image and its output is a probability x, x ∈ [0,1]. If the input is a cloud-free remote sensing image, the output x is close to 1; if the input is a cloud remote sensing image, the output x is close to 0. Therefore, if the cloud layer of the input image has been completely removed, the cloud layer discrimination unit outputs x close to 1, and if the cloud layer has not been completely removed, it outputs x close to 0.
Wherein, the specific steps in step 104 are as follows:
1) as shown in fig. 6, a cloud layer removal network is built based on the cloud layer edge estimation unit CEEU and the cloud layer thickness estimation unit CTEU in step 102, and the feature enhancing unit FEU and the cloud layer discrimination unit CDU in step 103, and is used for generating a cloud-removed picture from a single cloud-containing picture, and training the cloud layer removal network by adopting a data set including a cloud remote sensing picture, a cloud-free remote sensing picture, a cloud layer thickness picture and a cloud layer edge picture;
2) the output α1 of the cloud layer edge estimation unit CEEU, the output α2 of the cloud layer thickness estimation unit CTEU and the thin-cloud remote sensing image α3 are concatenated along the channel dimension, and the feature map α4 is obtained after the feature enhancement unit FEU; the cloud layer discrimination unit CDU then judges the result: if its output is greater than the threshold k, the cloud layer is considered completely removed and the cloud-removed remote sensing image is output; if the output is less than or equal to k, the cloud layer is considered not completely removed and the process is repeated to remove the cloud layer, wherein k is a decimal in the range [0,1].
Wherein, the specific steps in step 105 are as follows:
1) training a cloud layer removing network based on a cloud layer characteristic loss function, a cloud layer reconstruction loss function, a cloud layer thickness loss function and a cloud layer edge loss function, wherein the specific functions are 2), 3), 4) and 5);
2) the cloud-removed image feature loss function is specifically:

$$L_{feat}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|\sigma_l(H)_{c,v,b}-\sigma_l(\hat{H})_{c,v,b}\right|$$

wherein c, v and b denote the indices of the length, width and channel of the feature map, W, H and C denote the length, width and channel of the feature map, σ_l(·) denotes the output feature map of layer l of the VGG16 network, H denotes the cloud-free remote sensing image and Ĥ denotes the cloud-removed remote sensing image, with c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C); c, v, b, W, H, C and l are all positive integers;
3) the cloud image reconstruction loss function is specifically:

$$L_{rec}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|H_{c,v,b}-\hat{H}_{c,v,b}\right|$$
4) the cloud layer thickness loss function is specifically:

$$L_{thick}=\frac{1}{WH}\sum_{c=1}^{W}\sum_{v=1}^{H}\left|G_{c,v}-\hat{G}_{c,v}\right|$$

wherein G denotes the labeled cloud layer thickness map and Ĝ denotes the predicted cloud layer thickness map;
5) the cloud layer edge loss function is specifically:

$$L_{edge}=\frac{1}{WH}\sum_{c=1}^{W}\sum_{v=1}^{H}\left|J_{c,v}-\hat{J}_{c,v}\right|$$

wherein J denotes the labeled cloud layer edge map and Ĵ denotes the predicted cloud layer edge map;
6) the cloud layer discrimination unit is trained based on its loss function, which is specifically:

$$L_{D}=-\frac{1}{p}\sum_{i=1}^{p}\left[\log D(H_i)+\log\left(1-D(K_i)\right)\right]$$

in the formula, H denotes a cloud-free remote sensing image, K denotes a cloud remote sensing image, p denotes the number of images in the training set and D(·) denotes the output of the cloud layer discrimination unit, wherein p is a positive integer.
Wherein, the specific steps in step 106 are as follows: and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
Example 2
The embodiment of the invention provides a method for distinguishing and removing cloud layers of remote sensing images based on a neural network, and the method is described in detail in the following description with reference to fig. 1:
201: selecting a plurality of cloud-free remote sensing images, simulating cloud layers to obtain cloud layer thickness images, adding each cloud layer thickness image to the cloud-free image to obtain a cloud remote sensing image, and extracting the edges of the cloud layer thickness image to obtain a cloud layer edge image; forming a remote sensing image data set from the cloud-free images, the cloud remote sensing images, the cloud layer thickness images and the cloud layer edge images, and dividing the data set into a training set, a validation set and a test set;
202: constructing a cloud layer edge estimation unit for estimating edge characteristic information of a cloud layer, and constructing a cloud layer thickness estimation unit for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
203: the method comprises the steps of establishing a characteristic strengthening unit for strengthening image cloud layer characteristics, and establishing a cloud layer distinguishing unit for distinguishing whether cloud in an image is thoroughly removed;
204: constructing a cloud layer removing network based on the cloud layer edge estimation unit and the cloud layer thickness estimation unit of step 202 and the feature strengthening unit and the cloud layer discrimination unit of step 203, for generating a cloud-removed image from a single cloud image;
205: training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
206: and importing model parameters after training is finished, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
Wherein, the specific steps in step 201 are as follows:
1) selecting m cloud-free remote sensing images; generating cloud layers through simulation to obtain m cloud layer thickness maps; adding the thickness maps to the cloud-free images to obtain m cloudy remote sensing images; extracting the edges of the thickness maps with a Sobel operator to obtain m cloud layer edge maps; and segmenting all images into N×N patches to obtain n cloud-free remote sensing images H, n cloudy remote sensing images K, n cloud layer thickness maps G and n cloud layer edge maps J, which together form the data set U, where m = 500, n = 5000 and N = 256;
2) expanding the data set, specifically: randomly selecting a% of the images in the data set U from step 1), randomly applying to each one of the operations 90° rotation, 180° rotation, 270° rotation or mirror flipping to obtain an expanded image, and adding the expanded images to the data set U to form the remote sensing image data set V, where a = 50;
3) dividing the remote sensing image data set V from step 2) into a training set, a validation set and a test set in the ratio z1:z2:z3, where z1 = 7, z2 = 1 and z3 = 2.
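The data-set construction of steps 1)–3) can be sketched in NumPy. The function names (`sobel_edges`, `make_sample`, `tile`) are illustrative and not from the patent, the additive cloud model `cloudy = clip(clear + thickness)` is only one plausible reading of "adding the cloud layer thickness image", and the naive loop-based Sobel stands in for an optimized implementation:

```python
import numpy as np

def sobel_edges(t):
    # Sobel gradient magnitude of a cloud layer thickness map t of shape (H, W)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(t, 1, mode="edge")
    gx = np.zeros_like(t, dtype=float)
    gy = np.zeros_like(t, dtype=float)
    for i in range(t.shape[0]):
        for j in range(t.shape[1]):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)

def make_sample(clear, thickness):
    # One data-set quadruple: cloud-free image H, cloudy image K,
    # thickness map G, edge map J (additive cloud model, clipped to [0, 1])
    cloudy = np.clip(clear + thickness, 0.0, 1.0)
    edges = sobel_edges(thickness)
    return clear, cloudy, thickness, edges

def tile(img, n=256):
    # Segment an image into non-overlapping n x n patches (N = 256 in the patent)
    h, w = img.shape[:2]
    return [img[i:i + n, j:j + n]
            for i in range(0, h - n + 1, n)
            for j in range(0, w - n + 1, n)]
```

The resulting patches would then be shuffled, augmented by rotation/flipping, and split 7:1:2 into training, validation and test sets.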
Wherein, the specific steps in step 202 are as follows:
1) as shown in fig. 2, the residual module includes z convolution blocks Block and 1 residual connection; each convolution block is composed of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning connection and 1 feature strengthening unit; the convolution kernel sizes are s × s and the strides are e, where z = 5, lr = 0.2, s = 4 and e = 1;
2) as shown in fig. 2, a Cloud Edge Estimation Unit (CEEU) is constructed based on w residual modules, and is used for estimating Edge feature information of a Cloud layer, wherein the Unit inputs a Cloud remote sensing image and outputs a Cloud Edge map, and w is 5;
3) as shown in fig. 3, the double residual module is composed of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning connections and 1 feature strengthening unit; the convolution kernel sizes are d × d and the strides are t, where lr = 0.2, d = 4 and t = 1;
4) as shown in fig. 3, a Cloud layer Thickness Estimation Unit (CTEU) is constructed based on y double residual modules and 1 residual connection, and is used for generating Cloud layer Thickness information and providing the Cloud layer Thickness information to a subsequent network to complete Cloud layer removal, the CTEU Unit inputs a Cloud remote sensing image and outputs a Cloud layer Thickness map, wherein y is 3.
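A toy sketch of the residual wiring in steps 1)–4): the convolution blocks and the outer residual connection are illustrated with hypothetical 1×1 "convolutions" implemented as channel-axis matrix multiplies (not the 4×4 convolutions of the patent), and the per-block feature strengthening unit is omitted; `conv_block` and `residual_module` are illustrative names:

```python
import numpy as np

def lrelu(x, slope=0.2):
    # LReLU activation with slope lr = 0.2, as used throughout the patent
    return np.where(x > 0.0, x, slope * x)

def conv_block(x, w1, w2, slope=0.2):
    # One convolution block: 2 "convolutions" (here 1x1, i.e. channel-axis
    # matrix multiplies), 2 LReLU activations and 1 residual learning
    # connection; the feature strengthening unit is omitted for brevity.
    y = lrelu(x @ w1, slope)
    y = lrelu(y @ w2, slope)
    return x + y

def residual_module(x, block_weights, slope=0.2):
    # z convolution blocks plus 1 outer residual connection (z = 5 in the patent)
    y = x
    for w1, w2 in block_weights:
        y = conv_block(y, w1, w2, slope)
    return x + y
```

With all weights zero, each block reduces to the identity and the module returns 2x, a quick sanity check on the two levels of residual wiring.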
Wherein, the specific steps in step 203 are:
1) as shown in fig. 4, a Feature Enhancement Unit (FEU) is constructed from u convolutions, u−1 LReLU activation functions with slope lr and 1 Sigmoid activation function, and is used for enhancing image features; the convolution kernel sizes are f × f and the strides are g, where u = 7, lr = 0.2, f = 4 and g = 1;
2) the input of the feature enhancing unit FEU passes through a convolution, an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; the feature map β1 is then multiplied pixel-wise with the original input β2, implementing adaptive element-level feature enhancement, where lr = 0.2;
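The FEU's gating in step 2) reduces to "sigmoid mask times input". A minimal sketch, again standing in for the convolution stack with a single hypothetical 1×1 convolution `w` (an assumption, not the patent's 7-convolution stack):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lrelu(x, slope=0.2):
    return np.where(x > 0.0, x, slope * x)

def feature_enhance(beta2, w, slope=0.2):
    # beta2: original input feature map of shape (H, W, C);
    # w: weights of a hypothetical 1x1 convolution standing in for the stack.
    beta1 = sigmoid(lrelu(beta2 @ w, slope))  # conv -> LReLU -> Sigmoid
    return beta1 * beta2  # pixel-wise product: adaptive element-level gating
```

Because β1 lies in (0, 1), the unit can only attenuate or pass features, never amplify them, which is the usual behavior of this kind of attention-style gate.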
3) as shown in fig. 5, a Cloud layer Discrimination Unit (CDU) is constructed from j convolutions, j−1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, and is used for judging whether the cloud in an image has been removed completely; the convolution kernel sizes of the j convolutions are {3 × 3, 5 × 5, 7 × 7, 9 × 9, 11 × 11} in sequence, with strides {2, 2, 2, 1, 1}, where j = 5;
4) training the cloud layer discrimination unit CDU with the cloud-free and cloudy remote sensing images of the data set; the input of the CDU is an image and the output is a probability x, x ∈ [0,1]. If the input image is cloud-free, the output x is close to 1; if it contains cloud, x is close to 0. Therefore, if the cloud layer of the input image has been completely removed, the cloud layer discrimination unit outputs x close to 1, and if it has not, x close to 0, where x is a decimal in the range [0,1].
Wherein, the specific steps in step 204 are as follows:
1) as shown in fig. 6, a cloud layer removal network is built based on the cloud layer edge estimation unit CEEU and the cloud layer thickness estimation unit CTEU from step 202 and the feature enhancing unit FEU and the cloud layer discrimination unit CDU from step 203, and is used for generating a cloud-removed map from a single cloudy map; the cloud layer removal network is trained with the cloudy remote sensing maps, cloud-free remote sensing maps, cloud layer thickness maps and cloud layer edge maps of the data set;
2) the output α1 of the cloud layer edge estimation unit CEEU, the output α2 of the cloud layer thickness estimation unit CTEU and the thin-cloud remote sensing image α3 are concatenated along the channel dimension; after the feature enhancing unit FEU, a feature map α4 is obtained and judged by the cloud layer discrimination unit CDU; if the output is greater than the threshold k, the cloud layer is considered thoroughly removed and the cloud-removed remote sensing image is output; if the output is less than or equal to k, the cloud layer is considered not thoroughly removed and the process is repeated, where k = 0.6.
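The judge-and-repeat control flow of step 2) can be sketched independently of any particular network. Here `removal_step` and `discriminator` are stand-ins for the trained cloud removal network and CDU, and `max_iters` is a safety cap that the patent does not specify:

```python
def remove_cloud(image, removal_step, discriminator, k=0.6, max_iters=10):
    # Repeatedly apply the removal network until the discriminator's output
    # probability exceeds the threshold k (k = 0.6 in the patent).
    out = image
    for _ in range(max_iters):  # safety cap, assumed, not in the patent
        out = removal_step(out)
        if discriminator(out) > k:
            break  # cloud layer judged thoroughly removed
    return out
```

For example, with a stub `removal_step` that counts applications and a stub `discriminator` returning 0.25 per application, the loop runs exactly three times before 0.75 > 0.6 ends it.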
Wherein, the specific steps in step 205 are:
1) training the cloud layer removal network based on a cloud image feature loss function, a cloud image reconstruction loss function, a cloud layer thickness loss function and a cloud layer edge loss function, which are given in 2), 3), 4) and 5) below;
2) the cloud image removing characteristic loss function is specifically as follows:
in the formula, c, v and b denote the indices along the length, width and channel dimensions of the feature map; W, H and C denote the length, width and number of channels of the feature map; σ(·) denotes the output feature map of the l-th layer of the VGG16 network; H denotes the cloud-free remote sensing image and Ĥ the cloud-removed remote sensing image, where c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C); c, v, b, W, H and C are positive integers and l = 20.
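The formula image itself is not reproduced in this text. From the definitions given (indices c, v, b over a W×H×C feature map, σ(·) the l-th-layer VGG16 output, H and Ĥ the cloud-free and cloud-removed images), a plausible reconstruction — assuming an L1 perceptual loss, which the patent text does not confirm — is:

```latex
\mathcal{L}_{\mathrm{feat}}
  = \frac{1}{W\,H\,C}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}
    \bigl|\,\sigma_l(H)_{c,v,b} - \sigma_l(\hat{H})_{c,v,b}\,\bigr|
```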
3) The cloud image reconstruction loss function is specifically as follows:
4) the cloud layer thickness loss function is specifically:
wherein G denotes the label cloud layer thickness map and Ĝ denotes the predicted cloud layer thickness map;
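The thickness-loss formula image is likewise missing. Given only the label map G and the prediction Ĝ, a plausible form — assuming an L1 distance, unconfirmed by the patent — would be:

```latex
\mathcal{L}_{\mathrm{thick}} = \bigl\lVert G - \hat{G} \bigr\rVert_{1}
```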
5) the cloud layer edge loss function is specifically:
6) training a cloud layer discrimination unit based on a cloud layer discrimination unit loss function, wherein the cloud layer discrimination unit loss function is specifically as follows:
in the formula, H denotes a cloud-free remote sensing image, K denotes a cloudy remote sensing image, p denotes the number of images in the training set, and D(·) denotes the output of the cloud layer discrimination unit, where p is a positive integer.
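The discrimination-unit loss image is not reproduced here either. Since the unit is trained so that D(H) → 1 on cloud-free images and D(K) → 0 on cloudy images, a plausible reconstruction — assuming the standard binary cross-entropy (GAN-discriminator) form, which the patent does not confirm — is:

```latex
\mathcal{L}_{D} = -\frac{1}{p}\sum_{i=1}^{p}
  \Bigl[\log D(H_i) + \log\bigl(1 - D(K_i)\bigr)\Bigr]
```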
Wherein, the specific steps in step 206 are: and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
Those skilled in the art will appreciate that the drawings are only schematic illustrations of preferred embodiments, and the above-mentioned serial numbers of the embodiments of the present invention are only for description and do not represent the merits of the embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (14)
1. A method for distinguishing and removing cloud layers of remote sensing images based on a neural network is characterized by comprising the following steps:
1) selecting a plurality of cloud-free remote sensing images; simulating cloud layers to obtain cloud layer thickness maps; adding the thickness maps to the cloud-free images to obtain cloudy remote sensing images; extracting the edges of the thickness maps to obtain cloud layer edge maps; forming a remote sensing image data set from the cloud-free images, cloudy images, cloud layer thickness maps and cloud layer edge maps; and dividing the remote sensing image data set into a training set, a validation set and a test set;
2) constructing a cloud layer edge estimation unit for estimating edge characteristic information of a cloud layer, and constructing a cloud layer thickness estimation unit for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
3) constructing a feature strengthening unit for strengthening image cloud layer features, and constructing a cloud layer distinguishing unit for judging whether the cloud in an image has been thoroughly removed;
4) building a cloud layer removing network based on a cloud layer edge estimation unit, a cloud layer thickness estimation unit, a feature strengthening unit and a cloud layer distinguishing unit, wherein the cloud layer removing network is used for generating a cloud removing picture from a single cloud picture;
5) training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
6) importing the trained model parameters, and obtaining a cloud-removed remote sensing image from a single cloudy remote sensing image.
2. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 1), the specific steps for constructing the data set of the remote sensing image are as follows:
11) selecting m cloud-free remote sensing images; generating cloud layers through simulation to obtain m cloud layer thickness maps; adding the m thickness maps to the cloud-free images to obtain m cloudy remote sensing images; extracting the edges of the thickness maps with a Sobel operator to obtain m cloud layer edge maps; and segmenting all images into N×N patches to obtain n cloud-free remote sensing images H, n cloudy remote sensing images K, n cloud layer thickness maps G and n cloud layer edge maps J, which together form the data set U, wherein m, n and N are positive integers;
12) expanding the data set, specifically: randomly selecting a% of the images in the data set U from step 11) and randomly applying to each one of the following operations: 90° rotation, 180° rotation, 270° rotation or mirror flipping; the expanded images so obtained are added to the data set U to form the remote sensing image data set V, wherein a is a positive integer;
13) dividing the remote sensing image data set V from step 12) into a training set, a validation set and a test set in the ratio z1:z2:z3, wherein z1, z2 and z3 are all positive integers.
3. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 2), the specific construction process of the cloud layer edge estimation unit is as follows:
and constructing a cloud layer edge estimation unit based on w residual modules, wherein the cloud layer edge estimation unit is used for estimating edge characteristic information of a cloud layer, the unit input is a cloud remote sensing image, and the output is a cloud layer edge image, and w is a positive integer.
4. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 3, wherein the residual module comprises z convolution blocks Block and 1 residual connection; each convolution block is composed of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning connection and 1 feature strengthening unit; the convolution kernel sizes are s × s and the strides are e, wherein z, s and e are positive integers, and lr is a decimal in the range [0,1].
5. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 2), the specific construction process of the cloud layer thickness estimation unit is as follows:
the method comprises the steps that a cloud layer thickness estimation unit is established based on y double residual modules and 1 residual connection, cloud layer thickness information is generated and provided for a subsequent network to complete cloud layer removal, a cloud remote sensing image is input into the cloud layer thickness estimation unit, a cloud layer thickness map is output, and y is a positive integer.
6. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 5, wherein the double residual module is composed of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning connections and 1 feature strengthening unit; the convolution kernel sizes are d × d and the strides are t, wherein d and t are positive integers, and lr is a decimal in the range [0,1].
7. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 3), the specific construction process of the feature enhancing unit is as follows:
building a feature strengthening unit based on u convolutions, u−1 LReLU activation functions with slope lr and 1 Sigmoid activation function, the feature strengthening unit being used for strengthening image features; the convolution kernel sizes are f × f and the strides are g, wherein u, f and g are positive integers, and lr is a decimal in the range [0,1];
the process of enhancing the image features is as follows: the input of the feature strengthening unit passes through a convolution, an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; the feature map β1 is multiplied pixel-wise with the original input β2, realizing feature enhancement at the adaptive element level, wherein lr is a decimal in the range [0,1].
8. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network according to claim 1, wherein in the step 3), the specific construction process of the cloud layer distinguishing unit is as follows:
constructing a cloud layer discrimination unit based on j convolutions, j−1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, the cloud layer discrimination unit being used for judging whether the cloud in the image has been removed completely; the convolution kernel sizes of the j convolutions are {f_i × f_i, i ∈ (1, 2, …, j)} in sequence, with strides {d_i, i ∈ (1, 2, …, j)}, wherein j, f_i and d_i are all positive integers;
the process of judging whether the cloud in the image is completely removed is as follows: training a cloud layer distinguishing unit by adopting a data set cloud-free remote sensing image and a cloud remote sensing image, wherein the input of the cloud layer distinguishing unit is an image, the output is probability x, and x belongs to [0,1 ]; if the cloud layer of the input image is completely removed, the output x of the cloud layer judging unit is close to 1, and if the cloud layer is not completely removed, the output x is close to 0; wherein x is a decimal number in the range of [0,1 ].
9. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network according to claim 1, wherein in the step 4), the specific process of constructing the cloud layer removing network is as follows:
training a cloud layer removal network by adopting a cloud remote sensing image, a non-cloud remote sensing image, a cloud layer thickness image and a cloud layer edge image in a data set;
the output α1 of the cloud layer edge estimation unit, the output α2 of the cloud layer thickness estimation unit and the thin-cloud remote sensing image α3 are concatenated along the channel dimension; after the feature strengthening unit, a feature map α4 is obtained and judged by the cloud layer discrimination unit; if the output is greater than the threshold k, the cloud layer is considered thoroughly removed and the cloud-removed remote sensing image is output; if the output is less than or equal to k, the cloud layer is considered not thoroughly removed and the process is repeated, wherein k is a decimal in the range [0,1].
10. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 5), the cloud layer removing characteristic loss function is specifically as follows:
in the formula, c, v and b denote the indices along the length, width and channel dimensions of the feature map; W, H and C denote the length, width and number of channels of the feature map; σ(·) denotes the output feature map of the l-th layer of the VGG16 network; H denotes the cloud-free remote sensing image and Ĥ the cloud-removed remote sensing image; c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C), and c, v, b, W, H, C and l are all positive integers.
12. the method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud layer thickness loss function is specifically:
13. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud layer edge loss function is specifically:
14. The method for discriminating and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud layer discrimination unit loss function is specifically:
in the formula, H denotes a cloud-free remote sensing image, K denotes a cloudy remote sensing image, p denotes the number of images in the training set, and D(·) denotes the output of the cloud layer discrimination unit, wherein p is a positive integer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210255108.XA CN114663303A (en) | 2022-03-15 | 2022-03-15 | Neural network-based remote sensing image cloud layer distinguishing and removing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210255108.XA CN114663303A (en) | 2022-03-15 | 2022-03-15 | Neural network-based remote sensing image cloud layer distinguishing and removing method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114663303A true CN114663303A (en) | 2022-06-24 |
Family
ID=82030182
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210255108.XA Pending CN114663303A (en) | 2022-03-15 | 2022-03-15 | Neural network-based remote sensing image cloud layer distinguishing and removing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114663303A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116645448A (en) * | 2023-04-25 | 2023-08-25 | 北京卫星信息工程研究所 | Quantitative cloud automatic adding method and device for optical remote sensing image |
CN116645448B (en) * | 2023-04-25 | 2023-12-22 | 北京卫星信息工程研究所 | Quantitative cloud automatic adding method and device for optical remote sensing image |
CN117876817A (en) * | 2023-12-25 | 2024-04-12 | 北京化工大学 | Method for generating countermeasure sample |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111915530B (en) | End-to-end-based haze concentration self-adaptive neural network image defogging method | |
CN114663303A (en) | Neural network-based remote sensing image cloud layer distinguishing and removing method | |
CN109215034B (en) | Weak supervision image semantic segmentation method based on spatial pyramid covering pooling | |
CN110503613B (en) | Single image-oriented rain removing method based on cascade cavity convolution neural network | |
CN109741340B (en) | Ice cover radar image ice layer refined segmentation method based on FCN-ASPP network | |
CN109102475B (en) | Image rain removing method and device | |
CN110097522B (en) | Single outdoor image defogging method based on multi-scale convolution neural network | |
CN111738954B (en) | Single-frame turbulence degradation image distortion removal method based on double-layer cavity U-Net model | |
CN111179196B (en) | Multi-resolution depth network image highlight removing method based on divide-and-conquer | |
CN111127354A (en) | Single-image rain removing method based on multi-scale dictionary learning | |
CN110796623A (en) | Infrared image rain removing method and device based on progressive residual error network | |
CN114663309A (en) | Image defogging method and system based on multi-scale information selection attention mechanism | |
CN112991199A (en) | Image high-low frequency decomposition noise removing method based on residual error dense network | |
CN114140346A (en) | Image processing method and device | |
CN112767280A (en) | Single image raindrop removing method based on loop iteration mechanism | |
CN112164010A (en) | Multi-scale fusion convolution neural network image defogging method | |
CN113962889A (en) | Thin cloud removing method, device, equipment and medium for remote sensing image | |
Hussain et al. | Image denoising to enhance character recognition using deep learning | |
CN111160282B (en) | Traffic light detection method based on binary Yolov3 network | |
CN110349119B (en) | Pavement disease detection method and device based on edge detection neural network | |
CN111738939A (en) | Complex scene image defogging method based on semi-training generator | |
CN114926348B (en) | Device and method for removing low-illumination video noise | |
CN115358952A (en) | Image enhancement method, system, equipment and storage medium based on meta-learning | |
CN116823627A (en) | Image complexity evaluation-based oversized image rapid denoising method | |
Li et al. | Distribution-transformed network for impulse noise removal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||