CN114663303A - Neural network-based remote sensing image cloud layer distinguishing and removing method - Google Patents

Neural network-based remote sensing image cloud layer distinguishing and removing method

Info

Publication number
CN114663303A
CN114663303A
Authority
CN
China
Prior art keywords
cloud
cloud layer
remote sensing
image
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210255108.XA
Other languages
Chinese (zh)
Inventor
王晓宇
刘宇航
梁友鉴
佘玉成
王丹丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Dongfanghong Satellite Co Ltd
Original Assignee
Aerospace Dongfanghong Satellite Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Dongfanghong Satellite Co Ltd filed Critical Aerospace Dongfanghong Satellite Co Ltd
Priority to CN202210255108.XA priority Critical patent/CN114663303A/en
Publication of CN114663303A publication Critical patent/CN114663303A/en

Classifications

    • G06T 5/73 Image enhancement or restoration: Deblurring; Sharpening
    • G06N 3/045 Neural networks: Combinations of networks
    • G06N 3/048 Neural networks: Activation functions
    • G06N 3/08 Neural networks: Learning methods
    • G06T 5/77 Image enhancement or restoration: Retouching; Inpainting; Scratch removal
    • G06T 2207/10032 Image acquisition modality: Satellite or aerial image; Remote sensing
    • G06T 2207/20081 Special algorithmic details: Training; Learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for distinguishing and removing cloud layers of remote sensing images based on a neural network, which comprises the following steps: selecting a plurality of non-cloud remote sensing images, simulating to generate cloud layers to obtain a cloud layer thickness image, adding the cloud layer thickness image to the non-cloud image to obtain a cloud image, and extracting edges of the cloud layer thickness image to form a remote sensing image data set; constructing a cloud layer edge estimation unit and a cloud layer thickness estimation unit; building a characteristic strengthening unit and a cloud layer distinguishing unit; constructing a cloud layer removal network for generating a cloud removal image from a single cloud image; training a cloud layer removing network based on a cloud layer distinguishing unit loss function; and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.

Description

Neural network-based remote sensing image cloud layer distinguishing and removing method
Technical Field
The invention relates to the technical field of image processing technology and deep learning, in particular to a method for distinguishing and removing cloud layers of remote sensing images based on a neural network.
Background
Remote sensing satellite data has characteristics such as high resolution and high information content, and is widely applied in many fields, for example traffic monitoring and weather forecasting. However, optical remote sensing images are often affected by cloud and fog during imaging: part of the electromagnetic waves in the visible band is blocked and only a portion reaches the optical sensor, so ground information in the image is obscured and valuable information is lost, which greatly reduces the utilization efficiency of optical remote sensing images and wastes resources. Removing cloud layers from remote sensing images has therefore become an urgent problem to be solved.
Researchers have studied cloud removal algorithms for remote sensing images, which are generally divided into transformation methods, hypothesis-prior methods, deep learning methods, and so on. A typical transformation method is homomorphic filtering: because information such as cloud lies in the low-frequency domain of the image, the influence of a cloud layer can be removed by enhancing high-frequency information and suppressing low-frequency information. The hypothesis-prior method obtains assumed information through statistical analysis of cloudy and cloud-free images and uses it as a prior to invert the cloud-free remote sensing image. The deep learning method recovers the cloud-free image using the strong feature learning capability of convolutional neural networks and has strong adaptability and robustness. A remote sensing image cloud removal method based on a deep neural network therefore has practical significance and application value.
Disclosure of Invention
The technical scheme of the invention is as follows: the method overcomes the defects of the prior art, and provides a method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network, so that the automatic distinguishing of the thickness of the cloud layer of the remote sensing image is realized, the cloud layer in the remote sensing image is adaptively removed, and additional prior information is not needed.
The technical scheme of the invention is as follows: a method for distinguishing and removing cloud layers of remote sensing images based on a neural network comprises the following steps:
1) selecting a plurality of cloud-free remote sensing images; simulating cloud layers to obtain cloud layer thickness images; adding the cloud layer thickness images to the cloud-free images to obtain cloudy images; extracting the edges of the cloud layer thickness images to obtain cloud layer edge images; forming a remote sensing image data set from the cloud-free images, the cloudy images, the cloud layer thickness images and the cloud layer edge images; and dividing the remote sensing image data set into a training set, a validation set and a test set;
2) the cloud layer edge estimation unit is set up and used for estimating edge characteristic information of a cloud layer, and the cloud layer thickness estimation unit is set up and used for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
3) the method comprises the steps of establishing a characteristic strengthening unit for strengthening image cloud layer characteristics, and establishing a cloud layer distinguishing unit for distinguishing whether cloud in an image is thoroughly removed;
4) building a cloud layer removing network based on a cloud layer edge estimation unit, a cloud layer thickness estimation unit, a feature strengthening unit and a cloud layer distinguishing unit, wherein the cloud layer removing network is used for generating a cloud removing picture from a single cloud picture;
5) training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
6) and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
In the step 1), the specific steps of constructing the remote sensing image data set are as follows:
11) selecting m cloud-free remote sensing images; generating cloud layers by simulation to obtain m cloud layer thickness images; adding them to the cloud-free remote sensing images to obtain m cloudy remote sensing images; extracting the edges of the cloud layer thickness images with the Sobel operator to obtain m cloud layer edge images; and segmenting all images into N×N patches to obtain n cloud-free remote sensing images H, n cloudy remote sensing images K, n cloud layer thickness images G and n cloud layer edge images J, which form a data set U, wherein m, n and N are positive integers;
12) expanding the data set, specifically: randomly selecting a% of images from the data set U in the step 11), and randomly rotating the images, wherein the rotation operations comprise 90-degree rotation, 180-degree rotation, 270-degree rotation and mirror surface overturning; obtaining an expanded image after rotation operation, and adding the expanded image into a data set U to form a remote sensing image data set V, wherein a is a positive integer;
13) the remote sensing image data set V in the step 12) is divided into a training set, a validation set and a test set according to the ratio z1 : z2 : z3, wherein z1, z2 and z3 are all positive integers.
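The data-set construction in steps 11)-13) can be sketched as follows. The cloud-simulation model (smoothed random noise), the image sizes and the clipping are illustrative assumptions; the patent does not fix the simulation method.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_cloud_thickness(h, w, blur=8):
    """Smoothed random noise as a stand-in cloud layer thickness map G."""
    g = rng.random((h, w))
    pad = np.pad(g, blur // 2, mode="edge")
    out = np.empty_like(g)
    for i in range(h):
        for j in range(w):
            out[i, j] = pad[i:i + blur, j:j + blur].mean()  # box blur
    return out

def sobel_edges(img):
    """Cloud layer edge map J: Sobel gradient magnitude of the thickness map."""
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[1:-1, 1:-1] = (img[:-2, 2:] + 2 * img[1:-1, 2:] + img[2:, 2:]
                      - img[:-2, :-2] - 2 * img[1:-1, :-2] - img[2:, :-2])
    gy[1:-1, 1:-1] = (img[2:, :-2] + 2 * img[2:, 1:-1] + img[2:, 2:]
                      - img[:-2, :-2] - 2 * img[:-2, 1:-1] - img[:-2, 2:])
    return np.hypot(gx, gy)

H = rng.random((64, 64))               # cloud-free remote sensing image H
G = simulate_cloud_thickness(64, 64)   # cloud layer thickness image G
K = np.clip(H + G, 0.0, 1.0)           # cloudy image K = H + G (clipped)
J = sobel_edges(G)                     # cloud layer edge image J

N = 32                                 # patch size N x N
patches = [K[i:i + N, j:j + N]
           for i in range(0, K.shape[0], N)
           for j in range(0, K.shape[1], N)]
```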
In the step 2), the specific construction process of the cloud layer edge estimation unit is as follows:
A cloud layer edge estimation unit is constructed based on w residual modules and is used for estimating the edge feature information of the cloud layer; the unit's input is a cloudy remote sensing image and its output is a cloud layer edge map, wherein w is a positive integer.
The residual module comprises z convolution blocks (Block) and 1 residual connection; each Block consists of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning and 1 feature strengthening unit. The convolution kernel sizes are s × s and the strides are e, wherein z, s and e are positive integers and lr is a decimal in the range [0,1].
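The Block / residual-module / edge-unit nesting described above can be sketched as follows. As assumptions for brevity, the convolutions are reduced to 1×1 channel-mixing matrix products and the per-Block feature strengthening unit is omitted; this is a structural sketch, not the patented network.

```python
import numpy as np

def lrelu(x, lr=0.2):
    """LReLU activation with slope lr on the negative part."""
    return np.where(x > 0, x, lr * x)

def block(x, w1, w2, lr=0.2):
    """One Block: 2 convolutions (1x1 stand-ins), 2 LReLU activations and
    1 residual learning; the feature strengthening unit is omitted here."""
    y = lrelu(x @ w1, lr)
    y = lrelu(y @ w2, lr)
    return x + y                       # residual learning

def residual_module(x, block_weights, lr=0.2):
    """z Blocks followed by 1 residual connection around the whole module."""
    y = x
    for w1, w2 in block_weights:
        y = block(y, w1, w2, lr)
    return x + y

def cloud_edge_unit(x, module_weights, lr=0.2):
    """Edge-unit sketch: w stacked residual modules."""
    for block_weights in module_weights:
        x = residual_module(x, block_weights, lr)
    return x

rng = np.random.default_rng(0)
feats = rng.standard_normal((16, 8))            # 16 "pixels", 8 channels
mods = [[(0.1 * rng.standard_normal((8, 8)),
          0.1 * rng.standard_normal((8, 8))) for _ in range(2)]  # z = 2 Blocks
        for _ in range(3)]                      # w = 3 residual modules
edges = cloud_edge_unit(feats, mods)
```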
In the step 2), the specific construction process of the cloud layer thickness estimation unit is as follows:
A cloud layer thickness estimation unit is built from y double residual modules and 1 residual connection; it generates cloud layer thickness information and provides it to the subsequent network to complete cloud removal. The unit's input is a cloudy remote sensing image and its output is a cloud layer thickness map, wherein y is a positive integer.
The double residual module consists of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learnings and 1 feature strengthening unit; the convolution kernel sizes are d × d and the strides are t, wherein d and t are positive integers and lr is a decimal in the range [0,1].
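The double residual module and thickness unit can be sketched in the same 1×1-convolution style; the exact wiring of the 3 convolutions and 2 residual learnings below is one plausible reading of the description, and the internal feature strengthening unit is again omitted.

```python
import numpy as np

def lrelu(x, lr=0.2):
    return np.where(x > 0, x, lr * x)

def double_residual_module(x, w1, w2, w3, lr=0.2):
    """3 convolutions (1x1 stand-ins), 3 LReLU activations and 2 residual
    learnings (assumed ordering)."""
    y = x + lrelu(x @ w1, lr)          # first residual learning
    y = y + lrelu(y @ w2, lr)          # second residual learning
    return lrelu(y @ w3, lr)

def thickness_unit(x, module_weights, lr=0.2):
    """Thickness-unit sketch: y double residual modules + 1 residual connection."""
    out = x
    for w1, w2, w3 in module_weights:
        out = double_residual_module(out, w1, w2, w3, lr)
    return x + out                     # the unit-level residual connection

rng = np.random.default_rng(0)
img = rng.standard_normal((16, 8))
mods = [tuple(0.1 * rng.standard_normal((8, 8)) for _ in range(3))
        for _ in range(3)]            # y = 3 double residual modules
thickness = thickness_unit(img, mods)
```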
In the step 3), the specific construction process of the feature enhancing unit is as follows:
building a feature strengthening unit based on u convolutions, u-1 LReLU activation functions with slope lr and 1 Sigmoid activation function, wherein the feature strengthening unit is used for strengthening image features; the convolution kernel sizes are f × f and the strides are g, wherein u, f and g are positive integers and lr is a decimal in the range [0,1];
the process of enhancing the image features is as follows: the input of the feature strengthening unit is convolved and passed through an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; β1 is then multiplied element-wise (pixel by pixel) with the original input β2 to realize adaptive element-level feature enhancement, wherein lr is a decimal in the range [0,1].
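The adaptive gating just described can be sketched in a few lines. As an assumption for brevity, the u stacked convolutions are collapsed to one 1×1 convolution (a matrix product); the Sigmoid makes β1 a gate in (0, 1) that scales each element of the input.

```python
import numpy as np

def lrelu(x, lr=0.2):
    return np.where(x > 0, x, lr * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def feature_enhance(beta2, w, lr=0.2):
    """beta1 = Sigmoid(LReLU(conv(beta2))) is a gate in (0, 1); the output
    is the element-wise product beta1 * beta2."""
    beta1 = sigmoid(lrelu(beta2 @ w, lr))
    return beta1 * beta2               # adaptive element-level enhancement

rng = np.random.default_rng(0)
feat_in = rng.standard_normal((16, 8))
feat_out = feature_enhance(feat_in, rng.standard_normal((8, 8)))
```

Because the gate lies strictly inside (0, 1), the enhanced map never exceeds the input in magnitude; the unit learns where to attenuate rather than amplify.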
In the step 3), the specific construction process of the cloud layer discrimination unit is as follows:
constructing a cloud layer discrimination unit based on j convolutions, j-1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, wherein the cloud layer discrimination unit is used for discriminating whether the clouds in the image have been removed completely; the convolution kernel sizes of the j convolutions are, in turn, {f_i × f_i, i ∈ (1, 2, …, j)}, and the strides are, in turn, {d_i, i ∈ (1, 2, …, j)}, wherein j, f_i and d_i are all positive integers;
the process of judging whether the cloud in the image is completely removed is as follows: training a cloud layer distinguishing unit by adopting a data set cloud-free remote sensing image and a cloud remote sensing image, wherein the input of the cloud layer distinguishing unit is an image, the output of the cloud layer distinguishing unit is probability x, and x belongs to [0,1 ]; if the cloud layer of the input image is completely removed, the output x of the cloud layer judging unit is close to 1, and if the cloud layer is not completely removed, the output x is close to 0; wherein x is a decimal number in the range of [0,1 ].
In the step 4), the specific process of constructing the cloud layer removing network is as follows:
training a cloud layer removal network by adopting a cloud remote sensing image, a non-cloud remote sensing image, a cloud layer thickness image and a cloud layer edge image in a data set;
the output α1 of the cloud layer edge estimation unit, the output α2 of the cloud layer thickness estimation unit and the thin-cloud remote sensing image α3 are concatenated along the channel dimension and passed through a feature strengthening unit to obtain a feature map α4, which is judged by the cloud layer discrimination unit: if the output is greater than a threshold k, the cloud layer is considered thoroughly removed and the cloud-removed remote sensing image is output; if the output is less than or equal to k, the cloud layer is considered not thoroughly removed and the process is repeated to remove it, wherein k is a decimal in the range [0,1].
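The threshold-controlled loop of step 4) can be sketched as a driver function; every unit below is a toy stand-in (lambdas with hypothetical behavior) just to exercise the control flow, not the patented networks.

```python
import numpy as np

def remove_clouds(K, edge_unit, thick_unit, feu, net, cdu, k=0.5, max_iter=5):
    """Repeat estimation + removal until the discrimination unit's score
    exceeds the threshold k (or max_iter is reached)."""
    out = K
    for _ in range(max_iter):
        a1 = edge_unit(out)                              # alpha1: edge map
        a2 = thick_unit(out)                             # alpha2: thickness map
        a3 = out                                         # alpha3: cloudy image
        a4 = feu(np.concatenate([a1, a2, a3], axis=-1))  # channel concat + FEU
        out = net(a4)                                    # candidate result
        if cdu(out) > k:                                 # clouds judged removed
            break
    return out

# toy stand-ins, just to exercise the control flow of the sketch
cloudy = np.full((8, 8, 3), 0.8)
result = remove_clouds(
    cloudy,
    edge_unit=lambda x: np.zeros_like(x),
    thick_unit=lambda x: np.full_like(x, 0.3),
    feu=lambda x: x,
    net=lambda x: np.clip(x[..., 6:9] - x[..., 3:6], 0.0, 1.0),
    cdu=lambda x: 1.0 - float(x.mean()),
)
```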
In the step 5), the cloud image removing characteristic loss function is specifically:
$$L_{feat}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|\sigma_{l}(H)_{c,v,b}-\sigma_{l}(\hat{H})_{c,v,b}\right|$$

wherein c, v and b denote the indices along the length, width and channel of the feature map, W, H and C denote the length, width and channel of the feature map, σ_l(·) denotes the output feature map of layer l of the VGG16 network, H denotes the cloud-free remote sensing image and $\hat{H}$ denotes the cloud-removed remote sensing image, with c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C), and c, v, b, W, H, C and l all positive integers.
In the step 5), the cloud image reconstruction loss function is specifically:
$$L_{rec}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|H_{c,v,b}-\hat{H}_{c,v,b}\right|$$
in the step 5), the cloud layer thickness loss function is specifically:
$$L_{thick}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|G_{c,v,b}-\hat{G}_{c,v,b}\right|$$

wherein G denotes the label cloud layer thickness map and $\hat{G}$ denotes the predicted cloud layer thickness map.
In the step 5), the cloud layer edge loss function is specifically:
$$L_{edge}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|J_{c,v,b}-\hat{J}_{c,v,b}\right|$$

wherein J denotes the label cloud layer edge map and $\hat{J}$ denotes the predicted cloud layer edge map.
In the step 5), the cloud layer discrimination unit loss function is specifically:
$$L_{D}=-\frac{1}{p}\sum_{i=1}^{p}\left[\log D(H_{i})+\log\left(1-D(K_{i})\right)\right]$$

in the formula, H denotes the cloud-free remote sensing image, K denotes the cloudy remote sensing image, p denotes the number of images in the training set, and D(·) denotes the output of the cloud layer discrimination unit, wherein p is a positive integer.
The technical scheme provided by the invention has the beneficial effects that:
1. The traditional hypothesis-prior algorithm deduces a cloud-removed image from a physical model based on assumed information; however, such assumptions have a limited range of applicability and can fail in some cases. The present method discriminates and removes cloud layers according to cloud layer features, has a wide adaptation range, and can process cloudy remote sensing images in many scenes;
2. The traditional filtering method generally processes the whole image and cannot adaptively handle only the cloud-covered regions. The present method processes the cloud layer regions in the image adaptively, so removing the cloud layer does not cause color distortion or similar problems in the cloud-free regions, and the cloud-removed image looks natural and real;
3. the method has the advantages of wide application range, strong robustness, simplicity, convenience and practicability, and high cloud layer removing efficiency.
Drawings
FIG. 1 is a flow chart of a method for discriminating and removing cloud layers of a remote sensing image based on a neural network;
FIG. 2 is a schematic diagram of a cloud edge estimation unit;
FIG. 3 is a schematic diagram of a cloud layer thickness estimation unit;
FIG. 4 is a schematic diagram of a feature enhancement unit structure;
FIG. 5 is a schematic diagram of a cloud layer discriminating unit;
fig. 6 is a schematic diagram of a cloud layer removal network structure.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention are described in further detail below.
Example 1
The embodiment of the invention provides a method for distinguishing and removing cloud layers of remote sensing images based on a neural network, and the method is described in detail in the following description with reference to fig. 1:
101: selecting a plurality of cloud-free remote sensing images; simulating cloud layers to obtain cloud layer thickness images; adding the cloud layer thickness images to the cloud-free images to obtain cloudy images; extracting the edges of the cloud layer thickness images to obtain cloud layer edge images; forming a remote sensing image data set from the cloud-free images, the cloudy images, the cloud layer thickness images and the cloud layer edge images; and dividing the remote sensing image data set into a training set, a validation set and a test set;
102: constructing a cloud layer edge estimation unit for estimating edge characteristic information of a cloud layer, and constructing a cloud layer thickness estimation unit for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
103: the method comprises the steps of establishing a characteristic strengthening unit for strengthening image cloud layer characteristics, and establishing a cloud layer distinguishing unit for distinguishing whether cloud in an image is thoroughly removed;
104: constructing a cloud layer removing network based on the cloud layer edge estimation unit and the cloud layer thickness estimation unit in the step 102 and the feature strengthening unit and the cloud layer distinguishing unit in the step 103, and generating a cloud layer removing image from a single cloud image;
105: training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
106: and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
The specific steps in step 101 are as follows:
1) selecting m cloud-free remote sensing images; generating cloud layers by simulation to obtain m cloud layer thickness images; adding them to the cloud-free remote sensing images to obtain m cloudy remote sensing images; extracting the edges of the cloud layer thickness images with the Sobel operator to obtain m cloud layer edge images; and segmenting all images into N×N patches to obtain n cloud-free remote sensing images H, n cloudy remote sensing images K, n cloud layer thickness images G and n cloud layer edge images J, which form a data set U, wherein m, n and N are positive integers;
2) Expanding the data set, specifically: randomly selecting a% of images from the data set U in the step 1), randomly performing one of 90-degree rotation, 180-degree rotation, 270-degree rotation and mirror surface turning on the images to obtain an expanded image, and adding the expanded image into the data set U to form a remote sensing image data set V, wherein a is a positive integer
3) The remote sensing image data set V in the step 2) is divided into a training set, a validation set and a test set according to the ratio z1 : z2 : z3, wherein z1, z2 and z3 are all positive integers.
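The a% expansion in step 2) can be sketched as below; the data-set size and a = 50 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_expand(img):
    """One randomly chosen operation: rot90 / rot180 / rot270 / mirror flip."""
    op = int(rng.integers(0, 4))
    return np.rot90(img, k=op + 1) if op < 3 else np.fliplr(img)

dataset_U = [rng.random((16, 16)) for _ in range(100)]
a = 50                                              # expand a% of the data set
picked = rng.choice(len(dataset_U), size=len(dataset_U) * a // 100,
                    replace=False)
dataset_V = dataset_U + [random_expand(dataset_U[int(i)]) for i in picked]
```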
The specific steps in step 102 are as follows:
1) as shown in fig. 2, the residual module includes z convolution blocks (Block) and 1 residual connection; each Block is composed of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning and 1 feature strengthening unit. The convolution kernel sizes are s × s and the strides are e, wherein z, s and e are positive integers and lr is a decimal in the range [0,1];
2) as shown in fig. 2, a Cloud Edge Estimation Unit (CEEU) is constructed based on w residual modules, and is used for estimating Edge feature information of a Cloud layer, wherein the Unit inputs a Cloud remote sensing image and outputs a Cloud Edge map, and w is a positive integer;
3) as shown in fig. 3, the double residual unit is composed of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning units and 1 feature enhancement unit; the convolution kernel sizes are d × d and the strides are t, wherein d and t are positive integers and lr is a decimal in the range [0,1];
4) as shown in fig. 3, a Cloud layer Thickness Estimation Unit (CTEU) is constructed based on y double residual modules and 1 residual connection, and is used for generating Cloud layer Thickness information and providing the Cloud layer Thickness information to a subsequent network to complete Cloud layer removal, the CTEU Unit inputs a Cloud remote sensing image and outputs a Cloud layer Thickness map, wherein y is a positive integer.
Wherein, the specific steps in step 103 are as follows:
1) as shown in fig. 4, a Feature Enhancement Unit (FEU) is constructed based on u convolutions, u-1 LReLU activation functions (with slope lr) and 1 Sigmoid activation function, and is used for enhancing image features; the convolution kernel sizes are f × f and the strides are g, wherein u, f and g are positive integers and lr is a decimal in the range [0,1];
2) the input of the feature enhancement unit FEU passes through a convolution, an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; β1 is multiplied element-wise with the original input β2 to achieve adaptive element-level feature enhancement, wherein lr is a decimal in the range [0,1];
3) as shown in fig. 5, a Cloud layer Discrimination Unit (CDU) is constructed based on j convolutions, j-1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, and is used for discriminating whether the clouds in an image have been removed completely; the convolution kernel sizes of the j convolutions are, in turn, {f_i × f_i, i ∈ (1, 2, …, j)}, and the strides are, in turn, {d_i, i ∈ (1, 2, …, j)}, wherein j, f_i and d_i are all positive integers;
4) the cloud layer discrimination unit CDU is trained with the cloud-free and cloudy remote sensing images of the data set; the input of the CDU is an image and the output is a probability x, x ∈ [0,1]. If the input is a cloud-free remote sensing image, the output x is close to 1; if the input is a cloudy remote sensing image, x is close to 0. Therefore, if the cloud layer of an input image has been completely removed, the CDU outputs x close to 1, and if it has not, x is close to 0, wherein x is a decimal in the range [0,1].
Wherein, the specific steps in step 104 are as follows:
1) as shown in fig. 6, a cloud layer removal network is built based on the cloud layer edge estimation unit CEEU and the cloud layer thickness estimation unit CTEU in step 102, and the feature enhancing unit FEU and the cloud layer discrimination unit CDU in step 103, and is used for generating a cloud-removed picture from a single cloud-containing picture, and training the cloud layer removal network by adopting a data set including a cloud remote sensing picture, a cloud-free remote sensing picture, a cloud layer thickness picture and a cloud layer edge picture;
2) the output α1 of the cloud layer edge estimation unit CEEU, the output α2 of the cloud layer thickness estimation unit CTEU and the thin-cloud remote sensing image α3 are concatenated along the channel dimension and passed through the feature enhancement unit FEU to obtain a feature map α4, which is judged by the cloud layer discrimination unit CDU: if the output is greater than a threshold k, the cloud layer is considered completely removed and the cloud-removed remote sensing image is output; if the output is less than or equal to k, the cloud layer is considered not completely removed and the process is repeated to remove it, wherein k is a decimal in the range [0,1].
Wherein, the specific steps in step 105 are as follows:
1) training the cloud layer removal network based on the cloud-removed image feature loss function, the cloud image reconstruction loss function, the cloud layer thickness loss function and the cloud layer edge loss function, whose specific forms are given in 2), 3), 4) and 5) below;
2) the cloud image removing characteristic loss function is specifically as follows:
$$L_{feat}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|\sigma_{l}(H)_{c,v,b}-\sigma_{l}(\hat{H})_{c,v,b}\right|$$

wherein c, v and b denote the indices along the length, width and channel of the feature map, W, H and C denote the length, width and channel of the feature map, σ_l(·) denotes the output feature map of layer l of the VGG16 network, H denotes the cloud-free remote sensing image and $\hat{H}$ denotes the cloud-removed remote sensing image, with c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C), and c, v, b, W, H, C and l all positive integers;
3) the cloud image reconstruction loss function is specifically as follows:
$$L_{rec}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|H_{c,v,b}-\hat{H}_{c,v,b}\right|$$
4) the cloud layer thickness loss function is specifically:
$$L_{thick}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|G_{c,v,b}-\hat{G}_{c,v,b}\right|$$

wherein G denotes the label cloud layer thickness map and $\hat{G}$ denotes the predicted cloud layer thickness map;
5) the cloud edge loss function is specifically:
$$L_{edge}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|J_{c,v,b}-\hat{J}_{c,v,b}\right|$$

wherein J denotes the label cloud layer edge map and $\hat{J}$ denotes the predicted cloud layer edge map;
6) training a cloud layer discrimination unit based on a cloud layer discrimination unit loss function, wherein the cloud layer discrimination unit loss function is specifically as follows:
$$L_{D}=-\frac{1}{p}\sum_{i=1}^{p}\left[\log D(H_{i})+\log\left(1-D(K_{i})\right)\right]$$

in the formula, H denotes the cloud-free remote sensing image, K denotes the cloudy remote sensing image, p denotes the number of training set images, and D(·) denotes the output of the cloud layer discrimination unit, wherein p is a positive integer.
Wherein, the specific steps in step 106 are as follows: and importing model parameters after training, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
Example 2
The embodiment of the invention provides a method for distinguishing and removing cloud layers of remote sensing images based on a neural network, and the method is described in detail in the following description with reference to fig. 1:
201: selecting a plurality of cloud-free remote sensing images; simulating cloud layers to obtain cloud layer thickness images; adding the cloud layer thickness images to the cloud-free images to obtain cloudy images; extracting the edges of the cloud layer thickness images to obtain cloud layer edge images; forming a remote sensing image data set from the cloud-free images, the cloudy images, the cloud layer thickness images and the cloud layer edge images; and dividing the remote sensing image data set into a training set, a validation set and a test set;
202: constructing a cloud layer edge estimation unit for estimating edge characteristic information of a cloud layer, and constructing a cloud layer thickness estimation unit for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
203: the method comprises the steps of establishing a characteristic strengthening unit for strengthening image cloud layer characteristics, and establishing a cloud layer distinguishing unit for distinguishing whether cloud in an image is thoroughly removed;
204: constructing a cloud layer removing network based on the cloud layer edge estimation unit and the cloud layer thickness estimation unit in the step 202 and the feature strengthening unit and the cloud layer distinguishing unit in the step 203, and generating a cloud-removed image from a single cloudy image;
205: training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
206: and importing model parameters after training is finished, and obtaining a cloud-removed remote sensing image from a single cloud remote sensing image.
Wherein, the specific steps in step 201 are as follows:
1) selecting m cloud-free remote sensing images; generating cloud layers by simulation to obtain m cloud layer thickness images; adding them to the cloud-free remote sensing images to obtain m cloudy remote sensing images; extracting the edges of the cloud layer thickness images with the Sobel operator to obtain m cloud layer edge images; and segmenting all images into N×N patches to obtain n cloud-free remote sensing images H, n cloudy remote sensing images K, n cloud layer thickness images G and n cloud layer edge images J, which form a data set U, wherein m = 500, n = 5000 and N = 256;
2) expanding the data set, specifically: randomly selecting a% of the images from the data set U of step 1), applying to each one randomly chosen operation among 90° rotation, 180° rotation, 270° rotation and mirror flipping to obtain expanded images, and adding the expanded images to the data set U to form the remote sensing image data set V, where a = 50;
3) dividing the remote sensing image data set V of step 2) into a training set, a validation set and a test set in the ratio z_1 : z_2 : z_3, where z_1 = 7, z_2 = 1, z_3 = 2.
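As a concrete illustration of step 201, the sketch below builds one synthetic training tuple (cloud-free patch H, cloudy patch K, thickness map G, edge map J) plus the step-2 augmentations. The `simulate_cloud` helper is a hypothetical placeholder (the patent does not specify its simulation method), and the additive model K = H + G is an assumption; only the Sobel edge extraction and the rotation/flip expansion follow the text.

```python
import numpy as np

def simulate_cloud(n, rng, upscale=8):
    # Toy cloud-thickness map: low-resolution noise upsampled to n x n.
    # Placeholder only; the patent's simulation method is unspecified.
    coarse = rng.random((n // upscale, n // upscale))
    cloud = np.kron(coarse, np.ones((upscale, upscale)))  # blocky upsample
    return cloud / cloud.max()                            # thickness in [0, 1]

def sobel_edges(img):
    # Cloud edge map from the thickness image via the Sobel operator (step 1).
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = sum(kx[i, j] * p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    gy = sum(ky[i, j] * p[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return np.hypot(gx, gy)

rng = np.random.default_rng(0)
N = 256
H = rng.random((N, N))               # cloud-free patch
G = simulate_cloud(N, rng)           # cloud layer thickness map
K = np.clip(H + G, 0.0, 1.0)         # cloudy patch = clear patch + thickness
J = sobel_edges(G)                   # cloud layer edge map

# Step-2 expansion: rotations and a mirror flip of a sampled patch.
augmented = [np.rot90(K, t) for t in (1, 2, 3)] + [np.fliplr(K)]
```

The 7:1:2 split of step 3) would then be applied to the list of such tuples.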
Wherein, the specific steps in step 202 are as follows:
1) as shown in fig. 2, the residual module comprises z convolution blocks (Block) and 1 residual connection; each convolution block consists of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning step and 1 feature enhancement unit; the convolution kernel sizes are s × s and the strides are e, where z = 5, lr = 0.2, s = 4 and e = 1;
2) as shown in fig. 2, a Cloud Edge Estimation Unit (CEEU) is constructed based on w residual modules, and is used for estimating Edge feature information of a Cloud layer, wherein the Unit inputs a Cloud remote sensing image and outputs a Cloud Edge map, and w is 5;
3) as shown in fig. 3, the double residual module consists of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning steps and 1 feature enhancement unit; the convolution kernel sizes are d × d and the strides are t, where lr = 0.2, d = 4 and t = 1;
4) as shown in fig. 3, a Cloud layer Thickness Estimation Unit (CTEU) is constructed based on y double residual modules and 1 residual connection, and is used for generating Cloud layer Thickness information and providing the Cloud layer Thickness information to a subsequent network to complete Cloud layer removal, the CTEU Unit inputs a Cloud remote sensing image and outputs a Cloud layer Thickness map, wherein y is 3.
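The residual module of step 202 can be sketched in NumPy on a single-channel map as follows. The 3×3 'same' convolution is a simplification (the patent specifies 4×4 kernels with stride 1, plus a per-block feature enhancement unit, omitted here for brevity); only the block structure — 2 convolutions, 2 LReLU activations, residual learning, and a module-level residual connection over z blocks — follows the description.

```python
import numpy as np

def conv2d(x, w):
    # 'Same' 2-D convolution with stride 1 on a single-channel map.
    kh, kw = w.shape
    p = np.pad(x, ((kh // 2, kh - 1 - kh // 2),
                   (kw // 2, kw - 1 - kw // 2)), mode="edge")
    h, ww = x.shape
    return sum(w[i, j] * p[i:i + h, j:j + ww]
               for i in range(kh) for j in range(kw))

def lrelu(x, lr=0.2):
    # LReLU activation with slope lr on negative inputs.
    return np.where(x > 0.0, x, lr * x)

def conv_block(x, w1, w2, lr=0.2):
    # One Block: 2 convolutions, 2 LReLU activations, 1 residual learning step.
    y = lrelu(conv2d(x, w1), lr)
    y = lrelu(conv2d(y, w2), lr)
    return x + y                     # residual learning

def residual_module(x, block_weights, lr=0.2):
    # z convolution blocks plus 1 module-level residual connection (z = 5).
    y = x
    for w1, w2 in block_weights:
        y = conv_block(y, w1, w2, lr)
    return x + y                     # the residual connection
```

The CEEU of step 2) stacks w = 5 such modules; the CTEU's double residual module adds a third convolution and a second residual path in the same style.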
Wherein, the specific steps in step 203 are:
1) as shown in fig. 4, a Feature Enhancement Unit (FEU) is constructed from u convolutions, u−1 LReLU activation functions with slope lr and 1 Sigmoid activation function, and is used for enhancing image features; the convolution kernel sizes are f × f and the strides are g, where u = 7, lr = 0.2, f = 4 and g = 1;
2) the input of the feature enhancement unit FEU is passed through a convolution, an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; the feature map β1 is multiplied pixel-wise with the original input β2, implementing adaptive element-level feature enhancement, where lr = 0.2;
3) as shown in fig. 5, a Cloud Discrimination Unit (CDU) is constructed from j convolutions, j−1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, and is used for judging whether the cloud in an image has been removed thoroughly; the convolution kernel sizes of the j convolutions are in turn {3 × 3, 5 × 5, 7 × 7, 9 × 9, 11 × 11} and the strides are in turn {2, 2, 2, 1, 1}, where j = 5;
4) training the cloud layer discrimination unit CDU with the cloud-free and cloud-containing remote sensing images of the data set; the input of the CDU is an image and the output is a probability x, x ∈ [0, 1]. If the input image is cloud-free, the output x is close to 1; if the input image contains cloud, the output x is close to 0. Therefore, if the cloud layer of the input image has been thoroughly removed, the cloud layer discrimination unit outputs x close to 1, and if the cloud layer has not been thoroughly removed, it outputs x close to 0.
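A minimal sketch of the feature enhancement unit of step 203, assuming single-channel maps and illustrative 3×3 kernels: u convolutions with u−1 LReLU activations and a final Sigmoid produce the attention map β1, which gates the original input β2 by pixel-wise product. The `cdu` stub is not the trained five-convolution discrimination network; its (tanh + 1)/2 mapping to [0, 1] is an assumption, not taken from the patent.

```python
import numpy as np

def conv2d(x, w):
    # 'Same' convolution, stride 1 (kernel size here is illustrative).
    kh, kw = w.shape
    p = np.pad(x, ((kh // 2, kh - 1 - kh // 2),
                   (kw // 2, kw - 1 - kw // 2)), mode="edge")
    h, ww = x.shape
    return sum(w[i, j] * p[i:i + h, j:j + ww]
               for i in range(kh) for j in range(kw))

def feu(beta2, conv_weights, lr=0.2):
    # u convolutions: LReLU after the first u-1, Sigmoid after the last,
    # yielding an attention map beta1 that gates the input element-wise.
    y = beta2
    for idx, w in enumerate(conv_weights):
        y = conv2d(y, w)
        if idx == len(conv_weights) - 1:
            y = 1.0 / (1.0 + np.exp(-y))      # Sigmoid on the last convolution
        else:
            y = np.where(y > 0.0, y, lr * y)  # LReLU elsewhere
    beta1 = y
    return beta1 * beta2                      # pixel-wise product with the input

def cdu(img, score_fn=np.mean):
    # Toy stand-in for the discrimination unit: pool a score and squash it
    # with Tanh, mapped to [0, 1] via (tanh + 1) / 2 (an assumption).
    return 0.5 * (np.tanh(score_fn(img)) + 1.0)
```

With all-zero weights the final Sigmoid outputs 0.5 everywhere, so `feu` halves the input — a quick way to see the gating behaviour.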
Wherein, the specific steps in step 204 are as follows:
1) as shown in fig. 6, a cloud layer removal network is built from the cloud layer edge estimation unit CEEU and the cloud layer thickness estimation unit CTEU of step 202 and the feature enhancement unit FEU and the cloud layer discrimination unit CDU of step 203, and is used for generating a cloud-removed image from a single cloud-containing image; the cloud layer removal network is trained with the cloud-containing remote sensing images, cloud-free remote sensing images, cloud layer thickness maps and cloud layer edge maps of the data set;
2) the output α1 of the cloud layer edge estimation unit CEEU, the output α2 of the cloud layer thickness estimation unit CTEU and the thin-cloud remote sensing image α3 are concatenated along the channel dimension and passed through a feature enhancement unit FEU to obtain a feature map α4, which is judged by the cloud layer discrimination unit CDU: if the output is greater than the threshold k, the cloud layer is considered thoroughly removed and the output cloud-removed remote sensing image is obtained; if the output is less than or equal to k, the cloud layer is considered not thoroughly removed and the process is repeated to continue removing it, where k = 0.6.
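The judge-and-repeat loop of step 204 can be sketched as follows. Here `remove_once` and `cdu_score` are toy stand-ins for the trained removal network and discrimination unit (their halving/scoring rules are purely illustrative); only the threshold logic with k = 0.6 follows the text.

```python
import numpy as np

def remove_once(cloudy):
    # Placeholder for one forward pass of the removal network (CEEU + CTEU
    # outputs concatenated with the cloudy image, then FEU). Toy version:
    # the residual cloud simply decays by half each pass.
    return cloudy * 0.5

def cdu_score(img):
    # Toy discriminator: the less residual cloud, the closer to 1. Mean
    # intensity stands in for cloudiness; the real CDU is a trained network.
    return 1.0 - float(np.clip(img.mean(), 0.0, 1.0))

def iterative_decloud(cloudy, k=0.6, max_iters=10):
    # Repeat removal until the discrimination unit judges the cloud
    # thoroughly removed (score > k) or the iteration budget runs out.
    out = cloudy
    for _ in range(max_iters):
        if cdu_score(out) > k:
            break
        out = remove_once(out)
    return out
```

The `max_iters` cap is an added safeguard not stated in the patent, which describes the repetition unconditionally.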
Wherein, the specific steps in step 205 are:
1) training a cloud layer removing network based on a cloud layer characteristic loss function, a cloud layer reconstruction loss function, a cloud layer thickness loss function and a cloud layer edge loss function, wherein the specific functions are 2), 3), 4) and 5);
2) the cloud removal feature loss function is specifically:

$$L_{feat}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|\sigma_l(H)_{c,v,b}-\sigma_l(\hat{H})_{c,v,b}\right|$$

wherein c, v and b are the indices along the length, width and channel of the feature map, W, H and C are the length, width and channel sizes of the feature map, $\sigma_l(\cdot)$ denotes the output feature map of the l-th layer of the VGG16 network, H denotes the cloud-free remote sensing image and $\hat{H}$ denotes the cloud-removed remote sensing image; c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C), c, v, b, W, H, C and l are positive integers, and l = 20.
3) the cloud image reconstruction loss function is specifically:

$$L_{rec}=\lVert H-\hat{H}\rVert_{1}$$
4) the cloud layer thickness loss function is specifically:

$$L_{thick}=\lVert G-\hat{G}\rVert_{1}$$

wherein G denotes the label cloud layer thickness map and $\hat{G}$ denotes the predicted cloud layer thickness map;
5) the cloud layer edge loss function is specifically:

$$L_{edge}=\lVert J-\hat{J}\rVert_{1}$$

wherein J denotes the label cloud layer edge map and $\hat{J}$ denotes the predicted cloud layer edge map;
6) training the cloud layer discrimination unit based on the cloud layer discrimination unit loss function, which is specifically:

$$L_{D}=-\frac{1}{p}\sum_{i=1}^{p}\left[\log D(H_i)+\log\left(1-D(K_i)\right)\right]$$

in the formula, H denotes a cloud-free remote sensing image, K denotes a cloud-containing remote sensing image, p denotes the number of images in the training set, and D(·) denotes the output of the cloud layer discrimination unit, where p is a positive integer.
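The training losses of step 205 might be sketched as below. The L1 form of the reconstruction, thickness and edge terms and the binary-cross-entropy form of the discriminator term are assumptions consistent with the surrounding definitions; the patent's exact formulas may differ, and the `D` callable is a hypothetical stand-in for the trained CDU.

```python
import numpy as np

def l1(a, b):
    # Mean absolute error, used for the pixel-level losses below.
    return float(np.abs(a - b).mean())

def cloud_losses(H, H_hat, G, G_hat, J, J_hat):
    # Reconstruction, thickness and edge losses as per-pixel L1 terms
    # between the labels (H, G, J) and the predictions (H_hat, G_hat, J_hat).
    return {"rec": l1(H, H_hat), "thick": l1(G, G_hat), "edge": l1(J, J_hat)}

def discriminator_loss(D, clear_imgs, cloudy_imgs, eps=1e-12):
    # Binary cross-entropy over p training pairs, pushing D(cloud-free) -> 1
    # and D(cloudy) -> 0, matching the behaviour described for the CDU.
    total = 0.0
    for Hc, Kc in zip(clear_imgs, cloudy_imgs):
        total += -np.log(D(Hc) + eps) - np.log(1.0 - D(Kc) + eps)
    return total / len(clear_imgs)
```

A perfect discriminator (1 on cloud-free, 0 on cloudy) drives `discriminator_loss` to essentially zero, which is the training target described in step 6).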
Wherein, the specific steps in step 206 are: importing the model parameters after training and obtaining a cloud-removed remote sensing image from a single cloud-containing remote sensing image.
Those skilled in the art will appreciate that the drawings are only schematic illustrations of preferred embodiments, and the above-mentioned serial numbers of the embodiments of the present invention are only for description and do not represent the merits of the embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (14)

1. A method for distinguishing and removing cloud layers of remote sensing images based on a neural network is characterized by comprising the following steps:
1) selecting a plurality of cloud-free remote sensing images, simulating cloud layers to obtain cloud layer thickness images, adding the cloud layer thickness images to the cloud-free images to obtain cloud-containing remote sensing images, extracting the edges of the cloud layer thickness images to obtain cloud layer edge images, forming a remote sensing image data set from the cloud-free images, the cloud-containing images, the cloud layer thickness images and the cloud layer edge images, and dividing the remote sensing image data set into a training set, a validation set and a test set;
2) constructing a cloud layer edge estimation unit for estimating edge characteristic information of a cloud layer, and constructing a cloud layer thickness estimation unit for generating cloud layer thickness information and providing the cloud layer thickness information for a subsequent network to finish cloud layer removal;
3) constructing a feature enhancement unit for enhancing image cloud layer features, and constructing a cloud layer discrimination unit for judging whether the cloud in an image has been thoroughly removed;
4) building a cloud layer removing network based on a cloud layer edge estimation unit, a cloud layer thickness estimation unit, a feature strengthening unit and a cloud layer distinguishing unit, wherein the cloud layer removing network is used for generating a cloud removing picture from a single cloud picture;
5) training a cloud layer removing network based on a cloud layer distinguishing unit loss function;
6) importing the model parameters after training and obtaining a cloud-removed remote sensing image from a single cloud-containing remote sensing image.
2. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 1), the specific steps for constructing the data set of the remote sensing image are as follows:
11) selecting m cloud-free remote sensing images, generating cloud layers through simulation to obtain m cloud layer thickness images, adding the m cloud layer thickness images to the cloud-free remote sensing images to obtain m cloud-containing remote sensing images, extracting edges of the cloud layer thickness images with the Sobel operator to obtain m cloud layer edge images, and segmenting all the images into N×N patches to obtain n cloud-free remote sensing images H, n cloud-containing remote sensing images K, n cloud layer thickness images G and n cloud layer edge images J, which together form a data set U, where m, n and N are positive integers;
12) expanding the data set, specifically: randomly selecting a% of the images from the data set U of step 11) and applying to each one randomly chosen operation among 90° rotation, 180° rotation, 270° rotation and mirror flipping; the expanded images thus obtained are added to the data set U to form the remote sensing image data set V, where a is a positive integer;
13) dividing the remote sensing image data set V of step 12) into a training set, a validation set and a test set in the ratio z_1 : z_2 : z_3, where z_1, z_2 and z_3 are all positive integers.
3. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 2), the specific construction process of the cloud layer edge estimation unit is as follows:
and constructing a cloud layer edge estimation unit based on w residual modules, wherein the cloud layer edge estimation unit is used for estimating edge characteristic information of a cloud layer, the unit input is a cloud remote sensing image, and the output is a cloud layer edge image, and w is a positive integer.
4. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 3, wherein the residual module comprises z convolution blocks Block and 1 residual connection, each convolution block consists of 2 convolutions, 2 LReLU activation functions with slope lr, 1 residual learning step and 1 feature enhancement unit, the convolution kernel sizes are s × s and the strides are e, where z, s and e are positive integers and lr is a decimal in the range [0,1].
5. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 2), the specific construction process of the cloud layer thickness estimation unit is as follows:
the method comprises the steps that a cloud layer thickness estimation unit is established based on y double residual modules and 1 residual connection, cloud layer thickness information is generated and provided for a subsequent network to complete cloud layer removal, a cloud remote sensing image is input into the cloud layer thickness estimation unit, a cloud layer thickness map is output, and y is a positive integer.
6. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 5, wherein the double residual module consists of 3 convolutions, 3 LReLU activation functions with slope lr, 2 residual learning steps and 1 feature enhancement unit, the convolution kernel sizes are d × d and the strides are t, where d and t are positive integers and lr is a decimal in the range [0,1].
7. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 3), the specific construction process of the feature enhancing unit is as follows:
building a feature enhancement unit based on u convolutions, u−1 LReLU activation functions with slope lr and 1 Sigmoid activation function, the feature enhancement unit being used for enhancing image features; the convolution kernel sizes are f × f and the strides are g, where u, f and g are positive integers and lr is a decimal in the range [0,1];
the process of enhancing image features is as follows: the input of the feature enhancement unit is passed through a convolution, an LReLU activation function with slope lr and a Sigmoid activation function to obtain a feature map β1; the feature map β1 is multiplied pixel-wise with the original input β2, realizing adaptive element-level feature enhancement, where lr is a decimal in the range [0,1].
8. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network according to claim 1, wherein in the step 3), the specific construction process of the cloud layer distinguishing unit is as follows:
constructing a cloud layer discrimination unit based on j convolutions, j−1 ReLU activation functions, 1 Tanh activation function and 1 mean pooling, the cloud layer discrimination unit being used for judging whether the cloud in the image has been removed thoroughly; the convolution kernel sizes of the j convolutions are in turn {f_i × f_i, i ∈ (1, 2, …, j)} and the strides are in turn {d_i, i ∈ (1, 2, …, j)}, where j, f_i and d_i are all positive integers;
the process of judging whether the cloud in the image is completely removed is as follows: training a cloud layer distinguishing unit by adopting a data set cloud-free remote sensing image and a cloud remote sensing image, wherein the input of the cloud layer distinguishing unit is an image, the output is probability x, and x belongs to [0,1 ]; if the cloud layer of the input image is completely removed, the output x of the cloud layer judging unit is close to 1, and if the cloud layer is not completely removed, the output x is close to 0; wherein x is a decimal number in the range of [0,1 ].
9. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network according to claim 1, wherein in the step 4), the specific process of constructing the cloud layer removing network is as follows:
training a cloud layer removal network by adopting a cloud remote sensing image, a non-cloud remote sensing image, a cloud layer thickness image and a cloud layer edge image in a data set;
the output α1 of the cloud layer edge estimation unit, the output α2 of the cloud layer thickness estimation unit and the thin-cloud remote sensing image α3 are concatenated along the channel dimension and passed through the feature enhancement unit to obtain a feature map α4, which is judged by the cloud layer discrimination unit: if the output is greater than a threshold k, the cloud layer is considered thoroughly removed and the output cloud-removed remote sensing image is obtained; if the output is less than or equal to k, the cloud layer is considered not thoroughly removed and the process is repeated to continue removing it, where k is a decimal in the range [0,1].
10. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 1, wherein in the step 5), the cloud removal feature loss function is specifically:

$$L_{feat}=\frac{1}{WHC}\sum_{c=1}^{W}\sum_{v=1}^{H}\sum_{b=1}^{C}\left|\sigma_l(H)_{c,v,b}-\sigma_l(\hat{H})_{c,v,b}\right|$$

in the formula, c, v and b are the indices along the length, width and channel of the feature map, W, H and C are the length, width and channel sizes of the feature map, $\sigma_l(\cdot)$ denotes the output feature map of the l-th layer of the VGG16 network, H denotes the cloud-free remote sensing image and $\hat{H}$ denotes the cloud-removed remote sensing image; c ∈ (1, …, W), v ∈ (1, …, H), b ∈ (1, …, C), and c, v, b, W, H, C and l are all positive integers.
11. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud image reconstruction loss function is specifically:

$$L_{rec}=\lVert H-\hat{H}\rVert_{1}$$
12. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud layer thickness loss function is specifically:

$$L_{thick}=\lVert G-\hat{G}\rVert_{1}$$

in the formula, G denotes the label cloud layer thickness map and $\hat{G}$ denotes the predicted cloud layer thickness map.
13. The method for distinguishing and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud layer edge loss function is specifically:

$$L_{edge}=\lVert J-\hat{J}\rVert_{1}$$

in the formula, J denotes the label cloud layer edge map and $\hat{J}$ denotes the predicted cloud layer edge map.
14. The method for discriminating and removing the cloud layer of the remote sensing image based on the neural network as claimed in claim 10, wherein in the step 5), the cloud layer discrimination unit loss function is specifically:

$$L_{D}=-\frac{1}{p}\sum_{i=1}^{p}\left[\log D(H_i)+\log\left(1-D(K_i)\right)\right]$$

in the formula, H denotes a cloud-free remote sensing image, K denotes a cloud-containing remote sensing image, p denotes the number of images in the training set, and D(·) denotes the output of the cloud layer discrimination unit, where p is a positive integer.
CN202210255108.XA 2022-03-15 2022-03-15 Neural network-based remote sensing image cloud layer distinguishing and removing method Pending CN114663303A (en)

Publications (1)

Publication Number Publication Date
CN114663303A true CN114663303A (en) 2022-06-24

Family

ID=82030182



Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116645448A (en) * 2023-04-25 2023-08-25 北京卫星信息工程研究所 Quantitative cloud automatic adding method and device for optical remote sensing image
CN116645448B (en) * 2023-04-25 2023-12-22 北京卫星信息工程研究所 Quantitative cloud automatic adding method and device for optical remote sensing image
CN117876817A (en) * 2023-12-25 2024-04-12 北京化工大学 Method for generating countermeasure sample


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination