CN113361414A - Remote sensing image cloud amount calculation method based on composite neural network - Google Patents


Info

Publication number
CN113361414A
CN113361414A
Authority
CN
China
Prior art keywords
cloud
neural network
remote sensing
thumbnail
grade
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110635804.9A
Other languages
Chinese (zh)
Other versions
CN113361414B (en)
Inventor
路志英 (Lu Zhiying)
王港 (Wang Gang)
曹鑫磊 (Cao Xinlei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University
Priority to CN202110635804.9A
Publication of CN113361414A
Application granted
Publication of CN113361414B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/047 Probabilistic or stochastic networks
    • G06N 3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of remote sensing satellite image processing and transmission, and provides a remote sensing image cloud amount calculation method based on a composite neural network, which offers high efficiency, good optimization performance, and high accuracy, and is suitable for fields such as remote sensing cloud amount calculation and cloud detection. The method comprises: (1) establishing a cloud amount calculation sample library in a cloud grading mode; (2) constructing a composite neural network; (3) training the composite neural network, adjusting the weights of the micro neural network and the small neural network separately, to obtain the network models for cloud amount calculation; (4) calculating the cloud amount of a remote sensing image with a thumbnail-first strategy to obtain the cloud amount value. The invention is mainly applied to remote sensing satellite image processing and transmission.

Description

Remote sensing image cloud amount calculation method based on composite neural network
Technical Field
The invention relates to the technical field of remote sensing satellite image processing and transmission, in particular to a remote sensing image cloud amount calculation method based on a composite neural network, which can be used in application scenarios such as remote sensing image cloud amount calculation, remote sensing satellite imaging quality evaluation, and remote sensing satellite data transmission.
Background
The following methods have been used for cloud amount evaluation in the field of remote sensing image cloud amount calculation, but all have certain shortcomings in accuracy or computational efficiency:
(1) The brightness threshold method is the oldest and simplest cloud region extraction and cloud amount calculation algorithm: based on the difference in brightness between clouds and ground objects, it extracts the cloud region from the imagery of a given sensor type with a threshold. For different images, a fixed empirical threshold is often unreliable, and a dynamic threshold that varies with geographical location and season can improve detection accuracy to some extent. In addition, the automatic threshold proposed by the Japanese scholar Otsu, based on the maximum between-class variance criterion (hereinafter the Otsu threshold), is also commonly used to extract cloud regions. However, these methods are only effective for scenes with high cloud content and cannot automatically identify cloud-free scenes.
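The Otsu threshold mentioned above can be sketched in a few lines. This is a generic illustration of the maximum between-class variance criterion, not code from the patent; the synthetic two-mode "image" (dark ground near gray level 40, bright cloud near 220) is purely illustrative:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's automatic threshold: pick the gray-level cut that
    maximizes the between-class variance of a 256-bin histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()      # gray-level probabilities
    bins = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()    # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (bins[:t] * p[:t]).sum() / w0  # class means
        mu1 = (bins[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic scene: 50% dark "ground" pixels, 50% bright "cloud" pixels.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(40, 10, 5000), rng.normal(220, 10, 5000)])
img = np.clip(img, 0, 255)
t = otsu_threshold(img)
cloud_fraction = (img >= t).mean()
```

On such a cleanly bimodal scene the threshold falls between the two modes and the cloud fraction is recovered; as the text notes, on a cloud-free scene the same criterion would still split the histogram somewhere and misreport cloud.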
(2) When only brightness is used for the analysis, high-brightness ground objects such as snow, sand, and rock are inevitably misclassified as cloud. Extracting chromaticity information from visible-light multispectral imagery as an additional criterion can reduce such misclassification to some extent. Using short-wave infrared and thermal infrared information reduces it more effectively, but this requires the sensor to have a sufficiently wide spectral range and is mainly applicable to hyperspectral and infrared multispectral camera imagery. For ordinary optical remote sensing sensors whose spectral range does not cover those bands, methods relying on short-wave and thermal infrared information cannot be applied.
(3) By analyzing differences between the texture features of clouds and of ground objects in the image, suitable features or feature combinations, such as the fractal dimension and the gray-level co-occurrence matrix, can be extracted, or extended in a multi-scale space, to distinguish clouds from ground objects. However, clouds in satellite imagery come in many types, and the features of different cloud types are not concentrated in any single feature space; the same is true of ground-object features, so the accuracy of cloud detection based on texture features is limited.
(4) Cloud detection and cloud amount calculation based on two or more images of the same region with similar acquisition times is also a common approach. Here the cloud is treated as a change target in the image, and change detection techniques are used to detect it. The multiple images may come from the same sensor, from different sensors, or even from different classes of sensors. Multi-image cloud detection is often combined with single-image cloud detection and can effectively improve detection precision, but it places high demands on the data: the image regions must coincide, the acquisition times must be similar, and good geometric and color consistency between the images is required. The computation is also very heavy, making engineering implementation difficult.
(5) Comprehensive classification methods use the radiometric, textural, temporal, and other characteristics of the image simultaneously to detect clouds, or classify the image in one pass into categories such as cloud, water, forest, and bare land. K-means clustering, Markov Random Field models, and Convolutional Neural Network (CNN) models are commonly used techniques, with the Support Vector Machine (SVM) algorithm, based on statistical learning theory and the structural risk minimization principle, the most widely applied. Some recent studies introduce multilevel semantic segmentation and object-oriented ideas into the classification model and, taking image context into account, refine the coarse classification result through repeated judgments to obtain a higher-precision cloud detection result. These methods improve detection accuracy to a certain extent, but they generally require a large number of manually interpreted cloud-containing images of different types as training samples, and manual interpretation of cloud-containing images is extremely time-consuming and labor-intensive. Moreover, most of these methods are inefficient, demand large amounts of memory, and often cannot process massive volumes of remote sensing imagery.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide a remote sensing image cloud amount calculation method based on a composite neural network, which has the characteristics of high efficiency, good optimization performance, high accuracy and the like and is suitable for the fields of remote sensing cloud amount calculation, detection and the like. Therefore, the technical scheme adopted by the invention is that the remote sensing image cloud amount calculation method based on the composite neural network comprises the following steps:
(1) establishing a cloud amount calculation sample library in a cloud grading mode, wherein the sample library comprises the thumbnail image and the browse image of each scene;
(2) constructing a composite neural network, comprising a micro neural network that calculates the cloud amount of the thumbnail image and a small neural network that calculates the cloud amount of the browse image;
(3) training the composite neural network, adjusting the weights of the micro neural network and the small neural network separately, to obtain the network models for cloud amount calculation;
(4) calculating the cloud amount of the remote sensing image with a thumbnail-first strategy to obtain the cloud amount value.
The step (1) comprises the following detailed steps:
(101) collating standard remote sensing image product data, which comprises the metadata, the thumbnail image, the browse image, and the original image;
(102) reading the cloud amount value (range 0 to 100) from the metadata and visually checking the cloud cover in the original image; if the observed cloud cover is close to the metadata value, keeping the image as a sample, and if they differ significantly, passing the image to manual interpretation;
(103) dividing cloud amount into 11 grade levels, 0 to 10: cloud amount 0 is level 0, 1 to 10 is level 1, 11 to 20 is level 2, 21 to 30 is level 3, 31 to 40 is level 4, 41 to 50 is level 5, 51 to 60 is level 6, 61 to 70 is level 7, 71 to 80 is level 8, 81 to 90 is level 9, and 91 to 100 is level 10; manually interpreted data is assigned directly to one of these 11 levels by visual inspection.
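The grading rule of step (103) amounts to a ceiling division of the percentage by 10, with 0 as its own level. A minimal sketch (the function name `cloud_grade` is my own, not from the patent):

```python
import math

def cloud_grade(cloud_percent):
    """Map a cloud-cover percentage (0-100) to the 11 grade levels:
    0 -> 0, 1-10 -> 1, 11-20 -> 2, ..., 91-100 -> 10."""
    if not 0 <= cloud_percent <= 100:
        raise ValueError("cloud cover must be in [0, 100]")
    if cloud_percent == 0:
        return 0
    return math.ceil(cloud_percent / 10)   # ceiling division into decades
```

For example, `cloud_grade(11)` falls in the 11-20 band and yields level 2, while `cloud_grade(10)` stays in level 1.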
The step (2) comprises the following detailed steps:
(201) constructing the micro neural network for calculating the thumbnail cloud amount: the input is a 256 × 256 × 3 convolutional layer, followed by a Batch Normalization layer and a 4 × 4 max-pooling (Maxpooling) layer, forming the first network module; the second, third, and fourth network modules have the same structure as the first, but their convolutional layers are 64 × 64 × 12, 16 × 16 × 48, and 4 × 4 × 192, respectively; the final module comprises three fully connected layers and a Softmax layer, where the fully connected input layer has 3072 units, the hidden layers have 1536 and 1536 units, and the final output has 11 classes;
(202) constructing the small neural network for calculating the browse image cloud amount: the input is a 1024 × 1024 × 3 convolutional layer, followed by a Batch Normalization layer and a 4 × 4 Maxpooling layer, forming the first network module; the second, third, fourth, and fifth network modules have the same structure as the first, but their convolutional layers are 256 × 256 × 12, 64 × 64 × 48, 16 × 16 × 192, and 4 × 4 × 768, respectively; the final module comprises four fully connected layers and a Softmax layer, where the fully connected input layer has 12288 units, the hidden layers have 3072 and 3072 units, and the final output has 11 classes;
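As a consistency check on the stated dimensions: each 4 × 4 pooling divides the spatial size by 4, and flattening the last listed feature map gives exactly the fully connected input sizes named in the text (3072 and 12288). A bookkeeping sketch, assuming the final 4 × 4 feature map is flattened directly into the fully connected stage:

```python
def conv_stack_shapes(input_size, channels_per_stage, pool=4):
    """Track the (H, W, C) shape of the feature map through successive
    conv + 4x4 max-pool modules; each pool divides H and W by `pool`."""
    size, shapes = input_size, []
    for ch in channels_per_stage:
        shapes.append((size, size, ch))
        size //= pool
    return shapes

# Micro network (thumbnail, 256x256x3 input), channel widths from the text.
micro = conv_stack_shapes(256, [3, 12, 48, 192])
micro_fc_in = micro[-1][0] * micro[-1][1] * micro[-1][2]   # flatten 4x4x192

# Small network (browse image, 1024x1024x3 input).
small = conv_stack_shapes(1024, [3, 12, 48, 192, 768])
small_fc_in = small[-1][0] * small[-1][1] * small[-1][2]   # flatten 4x4x768
```

The resulting stage shapes reproduce the 64 × 64 × 12, 16 × 16 × 48, 4 × 4 × 192 (and, for the small network, 256 × 256 × 12 through 4 × 4 × 768) sequence given in the text.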
the step (3) comprises the following detailed steps:
(301) defining the loss function of the remote sensing image cloud amount calculation neural networks; both the micro neural network for the thumbnail cloud amount and the small neural network for the browse image cloud amount use the cross-entropy loss

L(x) = -Σ ŷ(x) log y(x), summed over the 11 output classes,

where ŷ is the expected output probability, y is the actual output probability, and x is the input;
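Assuming the loss is the standard cross-entropy between the expected and actual output probability distributions (the formula image is not reproduced in this text), a minimal sketch over the 11 grade classes:

```python
import math

def cross_entropy(expected, actual, eps=1e-12):
    """L = -sum_i y_hat_i * log(y_i): expected (label) distribution
    y_hat against the network's softmax output y; eps avoids log(0)."""
    return -sum(p * math.log(q + eps) for p, q in zip(expected, actual))

# True grade 3 as a one-hot label over the 11 classes; the network
# puts 90% of its mass on the correct class.
y_hat = [0.0] * 11
y_hat[3] = 1.0
y = [0.01] * 11
y[3] = 0.90
loss = cross_entropy(y_hat, y)   # = -log(0.90), about 0.105
```

With a one-hot label only the true class contributes, so the loss reduces to the negative log-probability the network assigns to the correct grade.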
(302) defining the calculation of the final cloud amount as the probability-weighted grade

YL = Σ_{i=0..10} i · p_i

where i is the cloud grade level and p_i is the probability that the cloud grade is i;
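The final cloud amount of step (302) is thus the expected grade under the softmax output. A minimal sketch:

```python
def cloud_amount(probs):
    """YL = sum_i i * p_i over the 11 grade levels (0..10): the
    probability-weighted (expected) cloud grade from the softmax output."""
    assert len(probs) == 11 and abs(sum(probs) - 1.0) < 1e-6
    return sum(i * p for i, p in enumerate(probs))

# Mass split evenly between grades 4 and 5 -> expected grade 4.5.
probs = [0.0] * 11
probs[4], probs[5] = 0.5, 0.5
yl = cloud_amount(probs)
```

Unlike taking the argmax class, this weighted form yields a continuous cloud amount even when the network is uncertain between adjacent grades.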
(303) training the designed micro neural network and small neural network with thumbnail and browse image samples of remote sensing images labeled with grades 0 to 10, to obtain the network models for remote sensing image cloud amount calculation.
The step (4) comprises the following detailed steps:
(401) acquiring the remote sensing image data whose cloud amount is to be calculated, and reading its thumbnail image and browse image;
(402) calculating the cloud amount of the thumbnail image with the trained micro neural network, and producing the cloud amount result with the formula of step (302);
(403) if that cloud amount is 0 or 10, skipping the browse image calculation and directly outputting the thumbnail cloud amount as the final result;
(404) if the cloud amount is greater than 0 and less than 10, additionally calculating the cloud amount of the browse image with the trained small neural network;
(405) combining the two results by weighted summation to obtain the final cloud amount, YL_final = α · YL_thumb + β · YL_browse, where α and β are weights, YL_thumb is the cloud amount calculated from the thumbnail image, and YL_browse is the cloud amount calculated from the browse image.
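The thumbnail-first strategy of steps (401)-(405) can be sketched as a single function. The weights α = β = 0.5 and the `browse_model` stand-in are illustrative assumptions, not values from the patent:

```python
def fused_cloud_amount(yl_thumb, browse_model, alpha=0.5, beta=0.5):
    """Thumbnail-priority strategy: the cheap thumbnail result is final
    when it is an unambiguous 0 or 10; otherwise the browse image is
    also evaluated and the two results are blended by weighted sum."""
    if yl_thumb in (0, 10):          # clear-sky or fully clouded: done
        return yl_thumb
    yl_browse = browse_model()       # only now run the small network
    return alpha * yl_thumb + beta * yl_browse

# Stand-ins for the small network's prediction on the browse image.
clear_case = fused_cloud_amount(0, browse_model=lambda: 3)
mixed_case = fused_cloud_amount(4, browse_model=lambda: 6)
```

The point of the strategy is cost: the expensive 1024 × 1024 browse-image network only runs for ambiguous scenes, so `browse_model` is passed as a callable rather than evaluated up front.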
The invention has the characteristics and beneficial effects that:
1. The invention provides a remote sensing image cloud amount calculation method based on a composite neural network. It overcomes the shortcomings of existing methods, namely that cloud amount cannot be calculated automatically, that the calculation is inaccurate, and that it consumes excessive computing resources, and it improves the performance of remote sensing image cloud amount calculation.
2. The method is practical, supports lightweight and rapid deployment, and can meet the application requirements of remote sensing image quality evaluation and cloud amount calculation.
Description of the drawings:
FIG. 1 is a schematic flow diagram of the present invention.
FIG. 2 is a schematic diagram of a remote sensing image thumbnail according to the present invention.
FIG. 3 is a schematic diagram of a remote sensing image browse image according to the present invention.
FIG. 4 is a schematic diagram of the micro neural network structure according to the present invention.
FIG. 5 is a schematic diagram of the small neural network structure according to the present invention.
Detailed Description
The invention avoids the defects of the prior art and provides a remote sensing image cloud amount calculation method based on a composite neural network, which offers high efficiency, good optimization performance, and high accuracy and is suitable for fields such as remote sensing cloud amount calculation and detection; the detailed steps are shown in FIG. 1.
The purpose of the invention is realized as follows:
(1) establishing a cloud amount calculation sample library in a cloud grading mode, wherein the sample library comprises the thumbnail image and the browse image of each scene;
(2) constructing a composite neural network, comprising a micro neural network that calculates the cloud amount of the thumbnail image and a small neural network that calculates the cloud amount of the browse image;
(3) training the composite neural network, adjusting the weights of the micro neural network and the small neural network separately, to obtain the network models for cloud amount calculation;
(4) calculating the cloud amount of the remote sensing image with a thumbnail-first strategy to obtain the cloud amount value.
The step (1) comprises the following detailed steps:
(101) collating standard remote sensing image product data, which comprises the metadata, the thumbnail image, the browse image, and the original image; the thumbnail image is shown in FIG. 2 and the browse image in FIG. 3;
(102) reading the cloud amount value (range 0 to 100) from the metadata and visually checking the cloud cover in the original image; if the observed cloud cover is close to the metadata value, keeping the image as a sample, and if they differ significantly, passing the image to manual interpretation;
(103) dividing cloud amount into 11 grade levels, 0 to 10: cloud amount 0 is level 0, 1 to 10 is level 1, 11 to 20 is level 2, 21 to 30 is level 3, 31 to 40 is level 4, 41 to 50 is level 5, 51 to 60 is level 6, 61 to 70 is level 7, 71 to 80 is level 8, 81 to 90 is level 9, and 91 to 100 is level 10; manually interpreted data is assigned directly to one of these 11 levels by visual inspection.
The step (2) comprises the following detailed steps:
(201) constructing the micro neural network for calculating the thumbnail cloud amount: the input is a 256 × 256 × 3 convolutional layer, followed by a Batch Normalization layer and a 4 × 4 Maxpooling layer, forming the first network module; the second, third, and fourth network modules have the same structure as the first, but their convolutional layers are 64 × 64 × 12, 16 × 16 × 48, and 4 × 4 × 192, respectively; the final module comprises three fully connected layers and a Softmax layer, where the fully connected input layer has 3072 units, the hidden layers have 1536 and 1536 units, and the final output has 11 classes; the network structure is shown schematically in FIG. 4;
(202) constructing the small neural network for calculating the browse image cloud amount: the input is a 1024 × 1024 × 3 convolutional layer, followed by a Batch Normalization layer and a 4 × 4 Maxpooling layer, forming the first network module; the second, third, fourth, and fifth network modules have the same structure as the first, but their convolutional layers are 256 × 256 × 12, 64 × 64 × 48, 16 × 16 × 192, and 4 × 4 × 768, respectively; the final module comprises four fully connected layers and a Softmax layer, where the fully connected input layer has 12288 units, the hidden layers have 3072 and 3072 units, and the final output has 11 classes; the network structure is shown schematically in FIG. 5;
the step (3) comprises the following detailed steps:
(301) defining the loss function of the remote sensing image cloud amount calculation neural networks; both the micro neural network for the thumbnail cloud amount and the small neural network for the browse image cloud amount use the cross-entropy loss

L(x) = -Σ ŷ(x) log y(x), summed over the 11 output classes,

where ŷ is the expected output probability, y is the actual output probability, and x is the input;
(302) defining the calculation of the final cloud amount as the probability-weighted grade

YL = Σ_{i=0..10} i · p_i

where i is the cloud grade level and p_i is the probability that the cloud grade is i;
(303) training the designed micro neural network and small neural network with thumbnail and browse image samples of remote sensing images labeled with grades 0 to 10, to obtain the network models for remote sensing image cloud amount calculation.
The step (4) comprises the following detailed steps:
(401) acquiring the remote sensing image data whose cloud amount is to be calculated, and reading its thumbnail image and browse image;
(402) calculating the cloud amount of the thumbnail image with the trained micro neural network, and producing the cloud amount result with the formula of step (302);
(403) if that cloud amount is 0 or 10, skipping the browse image calculation and directly outputting the thumbnail cloud amount as the final result;
(404) if the cloud amount is greater than 0 and less than 10, additionally calculating the cloud amount of the browse image with the trained small neural network;
(405) combining the two results by weighted summation to obtain the final cloud amount, YL_final = α · YL_thumb + β · YL_browse, where α and β are weights, YL_thumb is the cloud amount calculated from the thumbnail image, and YL_browse is the cloud amount calculated from the browse image.
The present invention will be described in further detail with reference to the accompanying drawings and specific examples.
In the implementation tests, cloud amount calculation experiments were carried out on domestic high-resolution series and resource series satellite images and on foreign satellite remote sensing images such as those from MODIS, SENTINEL-2, and SPOT, and the method was compared with several mainstream cloud amount detection algorithms; the table below shows the comparison. As can be seen from it, the method of the invention achieves good cloud amount calculation accuracy without needing the original image or multispectral data, takes less time, and is well suited to applications such as image quality evaluation.
TABLE 1 comparison table of cloud amount calculation results of remote sensing images
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (3)

1. A remote sensing image cloud amount calculation method based on a composite neural network is characterized by comprising the following steps:
(1) establishing a cloud amount calculation sample library in a cloud grading mode, wherein the sample library comprises the thumbnail image and the browse image of each scene;
(2) constructing a composite neural network, comprising a micro neural network that calculates the cloud amount of the thumbnail image and a small neural network that calculates the cloud amount of the browse image;
(3) training the composite neural network, adjusting the weights of the micro neural network and the small neural network separately, to obtain the network models for cloud amount calculation;
(4) calculating the cloud amount of the remote sensing image with a thumbnail-first strategy to obtain the cloud amount value.
2. The method for calculating the cloud amount of the remote sensing image based on the composite neural network as claimed in claim 1, wherein the step (1) comprises the following detailed steps:
(101) collating standard remote sensing image product data, which comprises the metadata, the thumbnail image, the browse image, and the original image;
(102) reading the cloud amount value (range 0 to 100) from the metadata and visually checking the cloud cover in the original image; if the observed cloud cover is close to the metadata value, keeping the image as a sample, and if they differ significantly, passing the image to manual interpretation;
(103) dividing cloud amount into 11 grade levels, 0 to 10: cloud amount 0 is level 0, 1 to 10 is level 1, 11 to 20 is level 2, 21 to 30 is level 3, 31 to 40 is level 4, 41 to 50 is level 5, 51 to 60 is level 6, 61 to 70 is level 7, 71 to 80 is level 8, 81 to 90 is level 9, and 91 to 100 is level 10; manually interpreted data is assigned directly to one of these 11 levels by visual inspection.
3. The method for calculating the cloud amount of the remote sensing image based on the composite neural network as claimed in claim 1, wherein the step (2) comprises the following detailed steps:
(201) constructing a miniature neural network for calculating the cloudiness of the thumb image, wherein the input convolution layer is a convolution layer of 256 x 3, and then a Batch Normalization layer and a maximum pooling layer are accessed, wherein the Maxpooling layer is 4 x 4, so as to form a first group of network modules; the second, third and fourth groups of network modules are consistent with the first group of modules in structure, but the convolutional layers are respectively 64 × 64 × 12, 16 × 16 × 48 and 4 × 4 × 192; the last network module comprises three full-connection layers and a last Softmax layer, wherein the input layer of the full-connection layer is 3072, the hidden layers are 1536 and 1536, and the final output is 11 types;
(202) constructing a small neural network for calculating the browse image cloud amount: the input is a 1024 × 1024 × 3 convolutional layer, followed by a Batch Normalization layer and a 4 × 4 Maxpooling layer, forming the first network module; the second, third, fourth and fifth modules have the same structure as the first, but with convolutional layers of 256 × 256 × 12, 64 × 64 × 48, 16 × 16 × 192 and 4 × 4 × 768 respectively; the final module comprises four fully connected layers and a Softmax layer, with a fully connected input of 12288, hidden layers of 3072 and 3072, and a final output of 11 classes;
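A quick consistency check on the sizes listed in steps (201) and (202): each 4 × 4 max-pooling divides the feature-map side by 4, and flattening the last module's output must equal the stated fully connected input size. A sketch with our own helper name (`check_backbone` is not from the patent):

```python
def check_backbone(input_side, channels, pool=4):
    """Trace (side, side, channels) through the modules of steps (201)/(202):
    each module holds a convolution at the current side, then a pool x pool
    max-pooling divides the side by `pool` before the next module.
    Returns the per-module shapes and the flattened size of the last one."""
    side, shapes = input_side, []
    for ch in channels:
        shapes.append((side, side, ch))
        side //= pool
    h, w, c = shapes[-1]
    return shapes, h * w * c

# Micro network (thumbnail): sides 256 -> 64 -> 16 -> 4; flatten 4*4*192 = 3072.
micro_shapes, micro_fc = check_backbone(256, [3, 12, 48, 192])
# Small network (browse image): sides 1024 -> ... -> 4; flatten 4*4*768 = 12288.
small_shapes, small_fc = check_backbone(1024, [3, 12, 48, 192, 768])
```

Both flattened sizes match the fully connected inputs of 3072 and 12288 stated in the claim.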
the step (3) comprises the following detailed steps:
(301) defining the loss function of the remote sensing image cloud amount calculation neural networks; for both the micro neural network (thumbnail cloud amount) and the small neural network (browse image cloud amount) it is the cross-entropy

loss = −Σ_x ŷ(x) · ln y(x)

where ŷ is the expected output probability, y is the actual output probability, and x is the input quantity;
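A minimal numeric sketch of this cross-entropy loss in plain Python (the function name is ours):

```python
import math

def cross_entropy(expected, actual, eps=1e-12):
    """Cross-entropy loss of step (301): minus the sum over classes of the
    expected probability times the log of the actual output probability.
    `eps` guards against log(0)."""
    return -sum(p * math.log(q + eps) for p, q in zip(expected, actual))
```

For a one-hot expected distribution, the loss reduces to the negative log of the probability the network assigns to the true grade.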
(302) defining the final cloud amount as

YL = Σ_{i=0}^{10} i · p_i

where i is the cloud amount grade and p_i is the probability that the grade is i.
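Step (302) is simply the expected value of the grade under the 11-way softmax distribution; a one-line sketch (hypothetical function name):

```python
def final_cloud_amount(probs):
    """Step (302): YL = sum over i from 0 to 10 of i * p_i, where probs
    holds the 11 grade probabilities produced by the Softmax layer."""
    if len(probs) != 11:
        raise ValueError("expected 11 grade probabilities")
    return sum(i * p for i, p in enumerate(probs))
```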
(303) training the designed micro neural network and small neural network with thumbnail and browse image samples of remote sensing images labelled with grades 0 to 10, obtaining the network models for remote sensing image cloud amount calculation.
The step (4) comprises the following detailed steps:
(401) acquiring the remote sensing image data whose cloud amount is to be calculated, and reading its thumbnail image and browse image;
(402) calculating the thumbnail cloud amount with the trained micro neural network and outputting the result using the formula of step (302);
(403) if the cloud amount is 0 or 10, the browse image cloud amount calculation is skipped and the thumbnail result is output directly as the final result;
(404) if the cloud amount is greater than 0 and less than 10, calculating the browse image cloud amount with the trained small neural network;
(405) combining the two calculated cloud amounts by weighted summation to obtain the final cloud amount, YL_final = α · YL_thumb + β · YL_browse, where α and β are weights, YL_thumb is the cloud amount calculated from the thumbnail image, and YL_browse is the cloud amount calculated from the browse image.
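The decision flow of steps (403) to (405) can be sketched as follows (function names are ours; the claim does not fix α and β, so equal weights are assumed here):

```python
def fuse_cloud_amounts(yl_thumb, browse_net, alpha=0.5, beta=0.5):
    """Steps (403)-(405): if the thumbnail result is exactly 0 or 10 the
    browse-image network is skipped; otherwise it is evaluated and the two
    results are combined as alpha * YL_thumb + beta * YL_browse."""
    if yl_thumb in (0, 10):
        return yl_thumb
    return alpha * yl_thumb + beta * browse_net()
```

Passing the browse-image network as a callable keeps the expensive 1024 × 1024 forward pass lazy, matching the shortcut of step (403).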
CN202110635804.9A 2021-06-08 2021-06-08 Remote sensing image cloud amount calculation method based on composite neural network Active CN113361414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110635804.9A CN113361414B (en) 2021-06-08 2021-06-08 Remote sensing image cloud amount calculation method based on composite neural network

Publications (2)

Publication Number Publication Date
CN113361414A true CN113361414A (en) 2021-09-07
CN113361414B CN113361414B (en) 2022-09-02

Family

ID=77533087

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110635804.9A Active CN113361414B (en) 2021-06-08 2021-06-08 Remote sensing image cloud amount calculation method based on composite neural network

Country Status (1)

Country Link
CN (1) CN113361414B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012005461A2 (en) * 2010-07-09 2012-01-12 Republic of Korea (Korea Meteorological Administration) Method for automatically calculating information on clouds
CN104504389A (en) * 2014-12-18 2015-04-08 南京信息工程大学 Satellite cloud amount computing method based on convolution neural network
CN108846474A (en) * 2018-05-18 2018-11-20 南京信息工程大学 The satellite cloud picture cloud amount calculation method of convolutional neural networks is intensively connected based on multidimensional
CN109871269A (en) * 2019-01-15 2019-06-11 中国人民解放军63921部队 A kind of Remote Sensing Data Processing method, system, electronic equipment and medium
CN110533063A (en) * 2019-07-17 2019-12-03 赛德雷特(珠海)航天科技有限公司 A kind of cloud amount calculation method and device based on satellite image and GMDH neural network
CN112419401A (en) * 2020-11-23 2021-02-26 上海交通大学 Aircraft surface defect detection system based on cloud edge cooperation and deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FENGYING XIE ET AL.: "Multi-level Cloud Detection in Remote Sensing Images Based on Deep Learning", IEEE *
XIA Min et al.: "Satellite cloud image cloud amount calculation based on convolutional neural networks", Journal of System Simulation *


Similar Documents

Publication Publication Date Title
CN111753828B (en) Natural scene horizontal character detection method based on deep convolutional neural network
CN107358260B (en) Multispectral image classification method based on surface wave CNN
CN109063754B (en) Remote sensing image multi-feature joint classification method based on OpenStreetMap
CN109684922B (en) Multi-model finished dish identification method based on convolutional neural network
CN108898065B (en) Deep network ship target detection method with candidate area rapid screening and scale self-adaption
CN110276386A (en) A kind of apple grading method and system based on machine vision
CN110633708A (en) Deep network significance detection method based on global model and local optimization
CN110309780A (en) High resolution image houseclearing based on BFD-IGA-SVM model quickly supervises identification
CN109726649B (en) Remote sensing image cloud detection method and system and electronic equipment
CN111968171A (en) Aircraft oil quantity measuring method and system based on artificial intelligence
CN103366373B (en) Multi-time-phase remote-sensing image change detection method based on fuzzy compatible chart
CN113435254A (en) Sentinel second image-based farmland deep learning extraction method
Hasanlou et al. A sub-pixel multiple change detection approach for hyperspectral imagery
CN107529647B (en) Cloud picture cloud amount calculation method based on multilayer unsupervised sparse learning network
CN111222545A (en) Image classification method based on linear programming incremental learning
Yao et al. Optical remote sensing cloud detection based on random forest only using the visible light and near-infrared image bands
Wang et al. A PM2.5 concentration estimation method based on multi-feature combination of image patches
CN111428752B (en) Visibility detection method based on infrared image
CN113361414B (en) Remote sensing image cloud amount calculation method based on composite neural network
Yin et al. Cloud detection of high-resolution remote sensing image based on improved U-Net
CN116452872A (en) Forest scene tree classification method based on improved deep pavv3+
CN115311566A (en) Cloud mask image generation method and pre-training network training method
Wu et al. Ground-based vision cloud image classification based on extreme learning machine
CN115424006A (en) Multi-source multi-level data fusion method applied to crop phenotypic parameter inversion
CN115393631A (en) Hyperspectral image classification method based on Bayesian layer graph convolution neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant