CN109886193B - Volume cloud detection method and device and computer readable storage medium - Google Patents

Volume cloud detection method and device and computer readable storage medium Download PDF

Info

Publication number
CN109886193B
Authority
CN
China
Prior art keywords
color channel
primary color
scale
gray
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910129005.7A
Other languages
Chinese (zh)
Other versions
CN109886193A (en)
Inventor
彭真明
刘雨菡
曹思颖
吕昱霄
彭凌冰
杨春平
赵学功
何艳敏
蒲恬
王光慧
曹兆洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201910129005.7A priority Critical patent/CN109886193B/en
Publication of CN109886193A publication Critical patent/CN109886193A/en
Application granted granted Critical
Publication of CN109886193B publication Critical patent/CN109886193B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention discloses a cirrus cloud detection method, which comprises the following steps: inputting a remote sensing image, and acquiring a primary color channel gray-scale image of the remote sensing image; acquiring a fractal dimension characteristic diagram of the primary color channel gray-scale image; acquiring a weight gray-scale image based on the fractal dimension characteristic diagram and the primary color channel gray-scale image; carrying out clustering calculation on the pixel points of the primary color channel gray-scale image, obtaining a clustering result of the pixel points, and determining a pre-assigned label for each pixel point of the primary color channel gray-scale image based on the clustering result; establishing a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels; and performing minimum cut calculation on the graph cut model, and outputting a detection result of the remote sensing image. The invention also discloses a cirrus cloud detection device and a computer readable storage medium.

Description

Volume cloud detection method and device and computer readable storage medium
Technical Field
The invention relates to the technical field of remote sensing image processing, and in particular to a cirrus cloud detection method and device, and a computer readable storage medium.
Background
Cloud detection in remote sensing images has extremely wide applications, such as weather forecasting and geographical monitoring. In particular, in remote sensing imaging cirrus clouds occlude other areas, so how to extract cirrus clouds of various shapes from remote sensing images has become a research hotspot in recent years. The shape of cirrus clouds is highly variable, their brightness is difficult to distinguish from that of high-radiance objects such as snow and white buildings, and some cirrus clouds are semitransparent and thin; these conditions greatly increase the difficulty of cirrus cloud detection.
In the prior art, cirrus cloud detection in remote sensing images is mainly based on single-frame images, using either threshold segmentation methods or machine learning methods. For example, Kang et al. proposed in 2017 to detect cirrus clouds by training a multi-feature fusion model with a Support Vector Machine (SVM); Yuan et al. proposed in 2015 to separate cirrus clouds from other objects by combining a bag-of-words (BoW) model with an SVM; and Zhan et al. proposed in 2017 to detect cirrus clouds by using a Convolutional Neural Network (CNN) for feature extraction. However, threshold segmentation methods cannot extract the texture structure features of cirrus clouds well and depend excessively on the contrast between cirrus clouds and other objects in the remote sensing image, so they can hardly distinguish cirrus clouds from other high-radiance objects; machine learning methods require a large amount of sample data for training, so they cannot detect cirrus clouds effectively when samples are limited.
On the other hand, graph cut is a popular image segmentation method that is widely applied in many fields. It converts the image segmentation problem into a model optimization problem, which reduces the dependence on features in image detection and gives it good applicability when samples are limited.
Disclosure of Invention
The invention aims to provide a cirrus cloud detection method that detects cirrus clouds with a fractal and graph cut algorithm, so as to solve the problem that, in existing cirrus cloud detection, threshold segmentation methods and machine learning methods perform poorly in the presence of high-radiance interfering objects and limited sample data.
To achieve this aim, the invention specifically adopts the following technical scheme:
in a first aspect, the invention discloses a cirrus cloud detection method, comprising the following steps:
step 1, inputting a remote sensing image, and acquiring a primary color channel gray-scale image of the remote sensing image;
step 2, obtaining a fractal dimension characteristic diagram of the primary color channel gray-scale image;
step 3, obtaining a weight gray-scale image based on the fractal dimension characteristic diagram and the primary color channel gray-scale image;
step 4, performing clustering calculation on the pixel points of the primary color channel gray-scale image to obtain a clustering result of the pixel points, and determining a pre-assigned label for each pixel point of the primary color channel gray-scale image based on the clustering result;
step 5, establishing a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels;
and 6, performing minimum cut calculation on the graph cut model, and outputting a detection result of the remote sensing image.
Further, step 2 specifically includes:
step 2.1, calculating the fractal curved surface area S (n) of each pixel point in the primary color channel gray-scale image, wherein n represents a unit area scale;
step 2.2, calculating the fractal dimension d of each pixel point based on the fractal curved surface area S(n) and a fractal surface formula, wherein the fractal surface formula is S(n) = n^(2-d);
And 2.3, acquiring a fractal dimension characteristic diagram of the primary color channel gray-scale diagram based on the fractal dimension d of each pixel point.
Further, step 3 specifically includes:
and taking the fractal dimension characteristic graph as a weight to multiply and weight the primary color channel gray-scale graph, and acquiring a weight gray-scale graph based on a weighting result.
Further, step 4 specifically includes:
step 4.1, randomly selecting K pixel points in the primary color channel gray-scale image as initial values of a clustering center, and calculating Euclidean distances between other pixel points except the clustering center in the primary color channel gray-scale image and the clustering center;
step 4.2, clustering all pixel points based on the Euclidean distances, acquiring the pixel point which is closest to the mean value of the Euclidean distances in each cluster as a new cluster center for iterative computation, and acquiring a final clustering result when the cluster center is not changed any more;
and 4.3, determining the pre-distribution label of each pixel point in the primary color channel gray-scale image based on the final clustering result.
Further, step 5 specifically includes:
step 5.1, determining a criterion function of the graph cut model;
step 5.2, determining an area item of the criterion function based on the primary color channel gray-scale map; determining a boundary item of the criterion function based on the weight gray-scale map; determining a calculated initial value of the criterion function based on the pre-assigned tag;
and 5.3, establishing the graph cut model based on the criterion function.
Further, the criterion function of the graph cut model specifically includes:
E(L) = a·R(L) + B(L); wherein L is the calculation initial value of the criterion function, R(L) is the region term of the criterion function, B(L) is the boundary term of the criterion function, a is a preset balance parameter, and E(L) is the graph cut value.
Further, performing minimal cut calculation on the graph cut model specifically includes:
the minimum cut value of the graph cut model was calculated using the Ford-Fulkerson labeling algorithm.
Further, the primary color channel gray scale map is a blue color channel gray scale map.
In a second aspect, the present invention discloses a cirrus cloud detection device, including:
a processor, a memory, and a communication bus;
the communication bus is used for realizing communication connection between the processor and the memory;
the memory is used for storing a cirrus cloud detection program capable of running on the processor;
the processor is configured to:
inputting a remote sensing image, and acquiring a primary color channel gray scale image of the remote sensing image;
acquiring a fractal dimension characteristic diagram of a primary color channel gray scale diagram;
acquiring a weight gray-scale image based on the fractal dimension characteristic image and the primary color channel gray-scale image;
clustering calculation is carried out on pixel points of the primary color channel gray-scale image, a clustering result of the pixel points is obtained, and pre-assigned labels of the pixel points are determined based on the clustering result;
determining a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels;
and calculating the graph cut model, and outputting a detection result of the remote sensing image based on the calculation result.
In a third aspect, the present invention discloses a computer-readable storage medium in which one or more programs are stored, the one or more programs being executable by one or more processors to implement the steps of the cirrus cloud detection method according to any one of the first aspect.
After the scheme is adopted, the invention has the following beneficial effects:
the method avoids the limitation that the traditional threshold method cannot distinguish the cirrus cloud from other high-radiation objects and the dependence of a machine learning method on the number of samples and the extraction characteristics, realizes effective detection on the cirrus cloud by adopting a method for optimizing a graph cutting algorithm by extracting the fractal dimension characteristics of the cirrus cloud, solves the cirrus cloud detection problem under the condition of limited number of samples, and simultaneously improves the accuracy and recall rate of the detection result.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments will be briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a cirrus cloud detection method according to embodiment 1 of the present invention;
Fig. 2 is a schematic flow chart of a cirrus cloud detection method according to embodiment 2 of the present invention;
Fig. 3 is a remote sensing image in embodiment 2 of the present invention;
Fig. 4 is a primary color channel gray-scale image in embodiment 2 of the present invention;
Fig. 5 is a fractal dimension characteristic diagram in embodiment 2 of the present invention;
Fig. 6 is a weight gray-scale image in embodiment 2 of the present invention;
Fig. 7 is a detection result of the remote sensing image in embodiment 2 of the present invention;
Fig. 8 is a schematic diagram of the hardware structure of a cirrus cloud detection device according to embodiment 3 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Example 1
Referring to fig. 1, an embodiment of the present invention provides a cirrus cloud detection method, including the following steps:
S101, inputting a remote sensing image, and acquiring a primary color channel gray-scale image of the remote sensing image;
S102, acquiring a fractal dimension characteristic diagram of the primary color channel gray-scale image;
S103, acquiring a weight gray-scale image based on the fractal dimension characteristic diagram and the primary color channel gray-scale image;
S104, carrying out clustering calculation on the pixel points of the primary color channel gray-scale image to obtain a clustering result of the pixel points, and determining a pre-assigned label for each pixel point of the primary color channel gray-scale image based on the clustering result;
S105, establishing a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels;
and S106, performing minimum cut calculation on the graph cut model, and outputting a detection result of the remote sensing image.
Because cirrus cloud detection mainly works on remote sensing images transmitted by satellites or spacecraft, the prior art generally detects cirrus by extracting spectral features from the remote sensing image. The embodiment of the invention instead adopts a fractal algorithm, such as the covering (blanket) method, to capture the irregular characteristics of cirrus clouds, and effectively distinguishes cirrus clouds from other high-radiance objects by acquiring their fractal dimension features. Furthermore, a graph cut model for cirrus cloud detection is established based on the surface fractal features of the cirrus clouds and on the clustering of the pixels in the gray-scale image; the detection result highlighting the cirrus clouds in the remote sensing image is then obtained through graph cut calculation. The calculation does not require a large number of samples, which solves the problem that machine learning methods cannot be applied when samples are limited, and it is more efficient and accurate than traditional threshold methods, which cannot effectively capture cirrus cloud features.
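For orientation only, the following Python sketch strings steps S101-S106 together; the helper names (fractal_dimension_map, kmeans_prelabel, build_graph_cut_model, minimum_cut) are hypothetical placeholders for the operations detailed in embodiment 2 below, not functions defined by the patent.

```python
# Hypothetical end-to-end pipeline mirroring steps S101-S106.
# All helper functions are illustrative placeholders; concrete sketches of each
# step are given alongside embodiment 2 below.
import cv2

def detect_cirrus(image_path):
    bgr = cv2.imread(image_path)                         # S101: input remote sensing image
    gray = bgr[:, :, 0].astype(float)                    # S101: blue-channel gray-scale image
    frac_dim = fractal_dimension_map(gray)               # S102: fractal dimension characteristic diagram
    weight = frac_dim * gray                             # S103: weight gray-scale image
    labels = kmeans_prelabel(gray, k=2)                  # S104: pre-assigned labels via clustering
    model = build_graph_cut_model(gray, weight, labels)  # S105: graph cut model
    return minimum_cut(model)                            # S106: minimum cut -> cirrus mask
```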
Example 2
Referring to fig. 2 to 7, an embodiment of the present invention is further optimized based on embodiment 1, and provides a cirrus cloud detection method, including the following steps:
s201, inputting the remote sensing image, and obtaining a primary color channel gray scale image of the remote sensing image.
As shown in fig. 2, the remote sensing image in the embodiment of the present invention is a color image. Because a color image contains a large amount of information, is slower to process and is not convenient for digital processing, the embodiment of the invention converts the color remote sensing image into a gray-scale image of one of the Red, Green and Blue (RGB) primary color channels.
Preferably, as shown in fig. 3, in the embodiment of the present invention the remote sensing image is converted into a blue channel gray-scale image for further processing. Because cirrus clouds appear white, the blue channel gray-scale image shows the cirrus cloud boundary better and separates the pixels more clearly than a green channel or red channel gray-scale image would.
Specifically, software or function libraries such as OpenCV, Mathematica and Matlab may be used to process the remote sensing image and extract the primary color channel gray-scale image; the specific usage of these packages is not described here again.
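As one possible realization of this step, the short OpenCV/NumPy sketch below loads a color remote sensing image and keeps its blue channel as a floating-point gray-scale image; the file name is only an illustrative assumption.

```python
import cv2
import numpy as np

# Load the color remote sensing image (OpenCV stores channels in B, G, R order).
bgr = cv2.imread("remote_sensing.png")     # illustrative file name
if bgr is None:
    raise FileNotFoundError("remote_sensing.png not found")

# Keep only the blue primary color channel as the gray-scale image f(i, j).
gray = bgr[:, :, 0].astype(np.float64)
```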
Optionally, the remote sensing image may also be converted into a grayscale image of a red channel or a green channel, and although the same display effect as that of a blue channel cannot be achieved, the same technical effect may also be achieved based on the improvement of a subsequent algorithm, and details of the method for acquiring the grayscale images of the red channel and the green channel are not repeated here.
Converting the remote sensing image into a primary color channel gray-scale image therefore facilitates the subsequent processing of the remote sensing image and benefits the identification of cirrus clouds.
S202, calculating the fractal curved surface area S (n) of each pixel point in the primary color channel gray-scale image by using a covering method, wherein n represents a unit area scale.
It will be appreciated that a function f(i,j) can be set based on the primary color channel gray-scale image, where f represents the gray value and (i,j) represents the pixel position. In the covering (blanket) method, the gray-scale image is regarded as a fractal surface in three-dimensional space, and all points within a distance n of the surface are assumed to be covered by a blanket of thickness n on each side. Let the upper surface of the covering blanket be U_n(i,j) and the lower surface be D_n(i,j), with initial values U_0(i,j) = D_0(i,j) = f(i,j). The upper and lower surfaces of the blanket are then calculated as follows:

U_n(i,j) = max{ U_(n-1)(i,j) + 1, max_{|(p,q)-(i,j)|≤1} U_(n-1)(p,q) } ……(1)

D_n(i,j) = min{ D_(n-1)(i,j) - 1, min_{|(p,q)-(i,j)|≤1} D_(n-1)(p,q) } ……(2)

where (p,q) represents a pixel position at a distance of no more than 1 from the pixel (i,j), max represents taking the maximum value, and min represents taking the minimum value.

From equations (1) and (2), the blanket volume v_n is calculated as:

v_n = Σ_(i,j) [ U_n(i,j) - D_n(i,j) ] ……(3)

From equation (3), the blanket surface area is calculated as:

S(n) = ( v_n - v_(n-1) ) / 2 ……(4)

S203, calculating the fractal dimension d of each pixel point based on the fractal curved surface area S(n) and the fractal surface formula, wherein the fractal surface formula is S(n) = n^(2-d).

The image surface fractal dimension d can then be obtained from equation (4) together with the fractal surface formula; the calculation formula of the image surface fractal dimension d is:

d = 2 - ln S(n) / ln n ……(5)

where ln denotes taking the natural logarithm.
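A minimal sketch of equations (1)-(5), assuming SciPy is available. Evaluating the blanket at scale n = 2 and aggregating the volume over a local window (so that S(n), and hence d, become per-pixel quantities) are choices made for this illustration; the patent does not fix the scale or the aggregation window.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion, uniform_filter

def fractal_dimension_map(gray, n_max=2, win=9):
    """Per-pixel fractal dimension via the covering (blanket) method, eqs. (1)-(5).

    n_max (the unit area scale) and win (the local window used to turn the
    blanket volume into a per-pixel quantity) are assumptions of this sketch.
    """
    u = gray.astype(np.float64).copy()    # U_0 = f
    d = gray.astype(np.float64).copy()    # D_0 = f
    volumes = []
    for n in range(1, n_max + 1):
        # Eq. (1): U_n = max( U_{n-1} + 1, max of U_{n-1} over the 3x3 neighbourhood )
        u = np.maximum(u + 1, grey_dilation(u, size=(3, 3)))
        # Eq. (2): D_n = min( D_{n-1} - 1, min of D_{n-1} over the 3x3 neighbourhood )
        d = np.minimum(d - 1, grey_erosion(d, size=(3, 3)))
        # Eq. (3), restricted to a win x win window around each pixel.
        volumes.append(uniform_filter(u - d, size=win) * win * win)
    # Eq. (4): S(n) = (v_n - v_{n-1}) / 2, evaluated at n = n_max.
    area = np.clip((volumes[-1] - volumes[-2]) / 2.0, 1e-6, None)
    # Eq. (5): d = 2 - ln S(n) / ln n.
    return 2.0 - np.log(area) / np.log(n_max)
```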
Optionally, the image surface fractal dimension d of each pixel in the primary color channel gray-scale map may also be obtained through other algorithms, and other specific algorithms may include: a size code method, a small island method, a box-counting dimension method, a structure function method, a half-variance method, a transformation method and the like. For the specific process of calculating the fractal dimension d of the image surface by adopting other methods, details are not repeated here.
And S204, acquiring a fractal dimension characteristic diagram of the primary color channel gray-scale diagram based on the fractal dimension d of each pixel point.
It is understood that the fractal dimension reflects how effectively a complex shape fills the space it occupies and is a measure of the irregularity of that shape. Specifically, in the embodiment of the present invention, the fractal dimension reflects the occupancy and filling rate, in image space, of the cirrus cloud pixels in the primary color channel gray-scale image.
Further, the specific output effect can be seen in the fractal dimension characteristic diagram shown in fig. 5. Because cirrus clouds have irregular, multi-scale structural characteristics, the fractal dimension of the pixels corresponding to cirrus clouds is high, while the fractal dimension of the background pixels is low. Therefore, cirrus clouds and background objects can be distinguished through the fractal dimension characteristic diagram.
S205, multiplying the primary color channel gray-scale image element-wise by the fractal dimension characteristic diagram, which serves as the weight, and obtaining the weight gray-scale image from the weighted result.
It can be understood that, in this step, the fractal dimension feature of each pixel is used as a weight to perform a weighted operation on the primary color channel gray-scale map, and the calculation formula is as follows:
B(i,j) = d(i,j) × f(i,j) ……(6)
wherein f is a gray value, d is a fractal dimension characteristic, and B is a weight gray value obtained through calculation.
Understandably, because the structure of cirrus clouds is complex, the fractal dimension of the corresponding pixel points in the image is high; after the primary color channel gray-scale image is weighted by the fractal dimension, the pixels corresponding to cirrus clouds are therefore effectively highlighted. Specifically, the output effect can be seen in the weight gray-scale map shown in fig. 6.
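Equation (6) is a single element-wise product; assuming `gray` and `frac_dim` are NumPy arrays holding the primary color channel gray-scale image and the fractal dimension map from the sketches above, it can be computed as:

```python
# Eq. (6): weight gray-scale image B(i, j) = d(i, j) x f(i, j).
weight = frac_dim * gray

# Optional rescaling to [0, 255] purely for display; this is not required by the patent.
weight_vis = 255.0 * (weight - weight.min()) / (np.ptp(weight) + 1e-12)
```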
S206, randomly selecting K pixel points from the primary color channel gray-scale image as initial values of the clustering centers, and calculating Euclidean distances between other pixel points except the clustering centers and the clustering centers in the primary color channel gray-scale image.
It can be understood that, in the embodiment of the present invention, the K-means algorithm is used to cluster the pixels of the primary color channel gray-scale image. K-means is a partitional clustering algorithm in which each cluster is represented by a cluster center; the initial centers are selected randomly during iteration (and are not necessarily points of their clusters), and the final cluster centers are determined by continued iteration. The algorithm is mainly used for numerical data, and each iteration of the clustering mainly computes the Euclidean distance between the remaining pixels and the cluster centers. Specifically, the Euclidean distance is calculated as:

J_k = || x_(i,j) - c_k || ……(7)

where J_k denotes the distance between an arbitrary pixel and the k-th cluster center, x_(i,j) denotes the two-dimensional coordinates of the pixel position, and c_k is the center of the k-th cluster. After the Euclidean distance between each pixel and every cluster center is obtained, the pixels are classified according to the nearest-distance criterion, i.e., each pixel is assigned to the class of the cluster center nearest to it.
S207, clustering all pixel points based on Euclidean distances, acquiring the pixel point closest to the mean value of the Euclidean distances in each cluster as a new clustering center for iterative computation, and acquiring a final clustering result when the clustering center is not changed any more;
it will be appreciated that the initial center of the cluster center, since it was randomly chosen, may not exist in its current corresponding cluster. Obviously, the risk cannot be eliminated by performing clustering once, and the clustering center is reselected from each clustering result to perform iterative operation until the clustering center is not changed any more, so that the pixels corresponding to the cloud or background can be classified into the corresponding classes.
And S208, determining the pre-assigned label of each pixel point in the primary color channel gray-scale image based on the final clustering result.
It can be understood that the K-means algorithm, although a machine learning method, is not used in the embodiment of the present invention for sample training and detection of cirrus clouds as in the prior art; it is used to pre-classify the pixels of the gray-scale image so as to assign a preset label value to each pixel. Specifically, all pixels in the image are divided into two classes, cirrus cloud and background object, and corresponding label values are set, e.g., label value 1 for pixels corresponding to cirrus cloud and label value 0 for pixels corresponding to background objects. This clustering converts pixels into label values, which facilitates subsequent processing and improves detection efficiency.
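A minimal sketch of this pre-labeling step using OpenCV's kmeans. Clustering on gray values with K = 2 and treating the brighter cluster as the presumed cirrus class are simplifying assumptions for illustration; the patent itself describes clustering on pixel coordinates and leaves K open.

```python
import cv2
import numpy as np

def kmeans_prelabel(gray, k=2):
    """Pre-assign label 1 (presumed cirrus) or 0 (background) to every pixel.

    Uses OpenCV's kmeans on gray values; the choice of feature and the
    bright-cluster-equals-cirrus rule are assumptions of this sketch.
    """
    samples = gray.reshape(-1, 1).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 100, 0.2)
    _, labels, centers = cv2.kmeans(samples, k, None, criteria, 5,
                                    cv2.KMEANS_RANDOM_CENTERS)
    labels = labels.reshape(gray.shape)
    cirrus_cluster = int(np.argmax(centers))     # brighter cluster -> cirrus
    return (labels == cirrus_cluster).astype(np.uint8)
```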
Optionally, other clustering algorithms, such as a mean shift clustering algorithm, a density clustering algorithm, and the like, may also be used in the embodiment of the present invention to cluster the pixels in the gray scale map. For the specific implementation process using other types of clustering algorithms, details are not repeated here.
S209, determining a criterion function of the graph cut model, and determining the parameters of the criterion function based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels.
It can be understood that the embodiment of the present invention is expected to realize the detection of the effective pixels in the image, i.e., the pixels corresponding to the cirrus clouds, by establishing the graph cut model.
Specifically, the graph cut model is established based on a graph cut algorithm (graph-cut) criterion function, which is as follows:
E(L) = a·R(L) + B(L) ……(8)

where L is the calculation initial value of the graph cut model, R(L) is the region term of the graph cut model, B(L) is the boundary term of the graph cut model, a is a preset balance parameter, and E(L) is the graph cut value.
It is understood that, in the criterion function, the region term is a prior penalty term and the boundary term is a penalty on the similarity between regions; the balance parameter balances the influence of the region term and the boundary term on the graph cut value (for example, if a is 0, only the boundary term is considered and the region term is ignored). The graph cut algorithm aims to find the minimum cut between the source point and the sink point, i.e., to minimize the graph cut value E(L), which is equivalent to finding the maximum flow.
Specifically, in the embodiment of the present invention, the region term of the criterion function is determined based on the primary color channel gray-scale map; the boundary term of the criterion function is determined based on the weight gray-scale map; and the calculation initial value of the criterion function is determined based on the pre-assigned labels.
It can be understood that the primary color channel gray-scale image allows a preliminary division of the regions occupied by cirrus clouds and background objects, so it is suitable as the region term parameter of the criterion function; in the weight gray-scale image, the pixels corresponding to cirrus clouds and background objects have clearly different weights, which delineates the boundary between them, so it is suitable as the boundary term parameter of the criterion function; and the clustering algorithm assigns different pre-assigned labels to the pixels corresponding to cirrus clouds and background, so these labels are suitable as the calculation initial value of the criterion function.
Specifically, in the embodiment of the present invention, since the region term has a direct influence on the graph cut result, a is taken as a positive number greater than 0; experimental tests show that a preferred value of a is 100.
It will be appreciated that after the parameters of the graph cut model criterion function are determined, the graph cut model can be built and processed further.
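One way to realize equation (8) in code is with the third-party PyMaxflow library, as sketched below. The concrete forms chosen here for the region term (terminal capacities derived from the pre-assigned labels, scaled by a) and the boundary term (a Gaussian of the weight-map gradient on a 4-connected grid) are assumptions for illustration; the patent fixes only the overall structure E(L) = a·R(L) + B(L) and the preferred value a = 100.

```python
import numpy as np
import maxflow   # third-party PyMaxflow library (an assumption of this sketch)

def graph_cut_segment(gray, weight, labels, a=100.0, sigma=10.0):
    """Min-cut segmentation with E(L) = a*R(L) + B(L); see the lead-in for assumptions."""
    g = maxflow.Graph[float]()
    nodeids = g.add_grid_nodes(gray.shape)

    # Boundary term B(L): neighbouring pixels with similar weight values are expensive
    # to separate; the cut prefers places where the weight map changes sharply.
    gx = np.abs(np.gradient(weight, axis=1))
    gy = np.abs(np.gradient(weight, axis=0))
    smoothness = np.exp(-((gx + gy) ** 2) / (2.0 * sigma ** 2))
    g.add_grid_edges(nodeids, weights=smoothness, symmetric=True)  # default 4-connectivity

    # Region term R(L), scaled by the balance parameter a: the pre-assigned labels pull
    # cirrus pixels towards the source and background pixels towards the sink.
    lab = labels.astype(np.float64)
    g.add_grid_tedges(nodeids, a * lab, a * (1.0 - lab))

    g.maxflow()                                   # minimum cut / maximum flow
    return ~g.get_grid_segments(nodeids)          # True where a pixel stays on the cirrus (source) side
```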
And S210, performing minimum cut calculation on the graph cut model, and outputting the calculation result as the detection result of the remote sensing image.
It will be appreciated that after the graph cut model is built, graph cut results may be obtained using a graph cut algorithm.
Preferably, the embodiment of the present invention calculates the minimum cut value of the graph cut model using the Ford-Fulkerson labeling algorithm.
It is understood that the purpose of the graph cut algorithm is to obtain the maximum flow of the directed graph model G = <V, E>.
Specifically, the directed graph has a unique source point S as the departure point, i.e., with in-degree 0, and a unique sink point T as the end point, i.e., with out-degree 0. At the start of the algorithm, let f(u,v) = 0 for all vertices u, v ∈ V, i.e., the initial flow value is 0; the flow value is then increased by finding augmenting paths in the model. An augmenting path can be regarded as a path from the source S to the sink T along which the flow of the network can still be increased. Through iteration, when no more augmenting paths can be found in the model, the calculation stops and the minimum cut (equivalently, the maximum flow) of the model is obtained as the detection result.
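For readers who want the augmenting-path idea without a library, the sketch below implements Ford-Fulkerson with breadth-first search for the augmenting paths (the Edmonds-Karp variant) on a small capacity matrix; it illustrates only the max-flow/min-cut computation, not the patent's full pixel graph.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Ford-Fulkerson with BFS augmenting paths (Edmonds-Karp) on a capacity matrix."""
    n = len(capacity)
    residual = [row[:] for row in capacity]       # residual capacities
    total = 0
    while True:
        # Breadth-first search for an augmenting path from source to sink.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and residual[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:                    # no augmenting path remains: done
            break
        # Bottleneck capacity along the path found.
        bottleneck, v = float("inf"), sink
        while v != source:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        # Push the bottleneck flow and update the residual graph.
        v = sink
        while v != source:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        total += bottleneck
    return total                                  # max-flow value = min-cut value

# Tiny usage example on a 4-node graph (node 0 = source S, node 3 = sink T).
caps = [[0, 3, 2, 0],
        [0, 0, 1, 2],
        [0, 0, 0, 3],
        [0, 0, 0, 0]]
print(max_flow(caps, 0, 3))    # prints 5
```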
It can be understood that, by obtaining the minimum cut of the graph cut model, the pixels corresponding to cirrus clouds are separated out, and the cirrus cloud detection result is output. Specifically, the output effect can be seen in the detection result of the remote sensing image shown in fig. 7.
Alternatively, other max-flow/min-cut algorithms can be used to obtain the detection result, including the Goldberg-Tarjan (push-relabel) algorithm and the Edmonds-Karp algorithm, which are not described here again.
It can be understood that the graph cut algorithm in the embodiment of the present invention may be implemented with tools such as Python, OpenCV and Matlab, which are not described here again.
Experiments prove that the accuracy rate of the cirrus cloud detection can be improved to more than 95% by adopting the cirrus cloud detection method, and the recall rate is improved to 90%.
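For reference, pixel-wise precision and recall of a binary cirrus mask can be computed as below; this is a generic evaluation sketch (the patent does not specify its exact metric definitions), with `pred` and `truth` assumed to be boolean NumPy arrays of the same shape.

```python
import numpy as np

def precision_recall(pred, truth):
    """Precision and recall of a predicted cirrus mask against a ground-truth mask."""
    tp = np.logical_and(pred, truth).sum()        # cirrus pixels detected correctly
    fp = np.logical_and(pred, ~truth).sum()       # background wrongly flagged as cirrus
    fn = np.logical_and(~pred, truth).sum()       # cirrus pixels that were missed
    precision = tp / (tp + fp + 1e-12)
    recall = tp / (tp + fn + 1e-12)
    return precision, recall
```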
Example 3
Referring to fig. 8, which shows a specific hardware structure of a cirrus cloud detection device according to embodiment 3 of the present invention, the cirrus cloud detection device 8 may include: a memory 82 and a processor 83; the various components are coupled together by a communication bus 81. It will be appreciated that the communication bus 81 is used to enable communications among these components. The communication bus 81 includes a power bus, a control bus and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled in fig. 8 as the communication bus 81.
A memory 82 for storing a cirrus cloud detection program executable on the processor 83;
a processor 83, configured to execute the following steps when running the cirrus cloud detection program:
step 1, inputting a remote sensing image, and acquiring a primary color channel gray-scale image of the remote sensing image;
step 2, obtaining a fractal dimension characteristic diagram of the primary color channel gray-scale image;
step 3, obtaining a weight gray-scale image based on the fractal dimension characteristic diagram and the primary color channel gray-scale image;
step 4, performing clustering calculation on the pixel points of the primary color channel gray-scale image to obtain a clustering result of the pixel points, and determining a pre-assigned label for each pixel point of the primary color channel gray-scale image based on the clustering result;
step 5, establishing a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels;
and 6, performing minimum cut calculation on the graph cut model, and outputting a detection result of the remote sensing image.
Further, step 2 specifically includes:
step 2.1, calculating the fractal curved surface area S (n) of each pixel point in the primary color channel gray-scale image, wherein n represents a unit area scale;
step 2.2, calculating the fractal dimension d of each pixel point based on the fractal curved surface area S(n) and a fractal surface formula, wherein the fractal surface formula is S(n) = n^(2-d);
And 2.3, acquiring a fractal dimension characteristic diagram of the primary color channel gray-scale diagram based on the fractal dimension d of each pixel point.
Further, step 3 specifically includes:
and taking the fractal dimension characteristic graph as a weight to multiply and weight the primary color channel gray-scale graph, and acquiring a weight gray-scale graph based on a weighting result.
Further, step 4 specifically includes:
step 4.1, randomly selecting K pixel points in the primary color channel gray-scale image as initial values of a clustering center, and calculating Euclidean distances between other pixel points except the clustering center in the primary color channel gray-scale image and the clustering center;
step 4.2, clustering all pixel points based on the Euclidean distances, acquiring the pixel point which is closest to the mean value of the Euclidean distances in each cluster as a new cluster center for iterative computation, and acquiring a final clustering result when the cluster center is not changed any more;
and 4.3, determining the pre-distribution label of each pixel point in the primary color channel gray-scale image based on the final clustering result.
Further, step 5 specifically includes:
step 5.1, determining a criterion function of the graph cut model;
step 5.2, determining an area item of the criterion function based on the primary color channel gray-scale map; determining a boundary item of the criterion function based on the weight gray-scale map; determining a calculated initial value of the criterion function based on the pre-assigned tag;
and 5.3, establishing the graph cut model based on the criterion function.
Further, the criterion function of the graph cut model specifically includes:
E(L) = a·R(L) + B(L); wherein L is the calculation initial value of the criterion function, R(L) is the region term of the criterion function, B(L) is the boundary term of the criterion function, a is a preset balance parameter, and E(L) is the graph cut value.
Further, performing minimal cut calculation on the graph cut model specifically includes:
the minimum cut value of the graph cut model was calculated using the Ford-Fulkerson labeling algorithm.
Further, the primary color channel gray scale map is a blue color channel gray scale map.
It will be appreciated that the memory 82 in embodiments of the invention may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. Volatile memory can be Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), SyncLink DRAM (SLDRAM) and Direct Rambus RAM (DRRAM). The memory 82 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
The processor 83 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits in hardware or by software instructions in the processor 83. The processor 83 may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The various methods, steps and logic blocks disclosed in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or registers. The storage medium is located in the memory 82, and the processor 83 reads the information in the memory 82 and performs the steps of the above method in combination with its hardware.
Based on the foregoing embodiments, an embodiment of the present invention provides a computer-readable medium storing a cirrus cloud detection program; when the cirrus cloud detection program is executed by at least one processor, the steps of the cirrus cloud detection method in any of the above embodiments are implemented.
It is understood that the method steps in the above embodiments may be stored in a computer-readable storage medium, and based on such understanding, part of the technical solutions of the embodiments of the present invention that essentially or contributes to the prior art, or all or part of the technical solutions may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to execute all or part of the steps of the method of the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions of the present Application, or a combination thereof.
For a software implementation, the techniques herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Specifically, when the processor 83 in the cirrus cloud detection device is further configured to run the computer program, the method steps in the foregoing embodiments are executed, which are not described here again.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that: the technical schemes described in the embodiments of the present invention can be combined arbitrarily without conflict.
The above embodiments are merely preferred embodiments of the present invention, which are not intended to limit the scope of the present invention, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A cirrus cloud detection method is characterized by comprising the following steps:
step 1, inputting a remote sensing image, and acquiring a primary color channel gray-scale image of the remote sensing image;
step 2, obtaining a fractal dimension characteristic diagram of the primary color channel gray-scale image;
step 3, obtaining a weight gray-scale image based on the fractal dimension characteristic diagram and the primary color channel gray-scale image;
step 4, performing clustering calculation on the pixel points of the primary color channel gray-scale image to obtain a clustering result of the pixel points, and determining a pre-assigned label for each pixel point of the primary color channel gray-scale image based on the clustering result;
step 5, establishing a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels;
and 6, performing minimum cut calculation on the graph cut model, and outputting a detection result of the remote sensing image.
2. The cirrus cloud detection method of claim 1, wherein the step 2 specifically comprises:
step 2.1, calculating the fractal curved surface area S (n) of each pixel point in the primary color channel gray-scale image, wherein n represents a unit area scale;
step 2.2, calculating the fractal dimension d of each pixel point based on the fractal curved surface area S(n) and a fractal surface formula, wherein the fractal surface formula is S(n) = n^(2-d);
And 2.3, acquiring a fractal dimension characteristic diagram of the primary color channel gray-scale diagram based on the fractal dimension d of each pixel point.
3. The cirrus cloud detection method of claim 1, wherein the step 3 specifically comprises:
and taking the fractal dimension characteristic graph as a weight to multiply and weight the primary color channel gray-scale graph, and acquiring a weight gray-scale graph based on a weighting result.
4. The cirrus cloud detection method of claim 1, wherein the step 4 specifically comprises:
step 4.1, randomly selecting K pixel points in the primary color channel gray-scale image as initial values of a clustering center, and calculating Euclidean distances between other pixel points except the clustering center in the primary color channel gray-scale image and the clustering center;
step 4.2, clustering all pixel points based on the Euclidean distances, acquiring the pixel point which is closest to the mean value of the Euclidean distances in each cluster as a new cluster center for iterative computation, and acquiring a final clustering result when the cluster center is not changed any more;
and 4.3, determining the pre-distribution label of each pixel point in the primary color channel gray-scale image based on the final clustering result.
5. The cirrus cloud detection method of claim 1, wherein the step 5 specifically comprises:
step 5.1, determining a criterion function of the graph cut model;
step 5.2, determining an area item of the criterion function based on the primary color channel gray-scale map; determining a boundary item of the criterion function based on the weight gray-scale map; determining a calculated initial value of the criterion function based on the pre-assigned tag;
and 5.3, establishing the graph cut model based on the criterion function.
6. The cirrus cloud detection method of claim 5, wherein the criterion function of the graph cut model specifically comprises:
E(L) = a·R(L) + B(L); wherein L is the calculation initial value of the criterion function, R(L) is the region term of the criterion function, B(L) is the boundary term of the criterion function, a is a preset balance parameter, and E(L) is the graph cut value.
7. The cirrus cloud detection method according to claim 1, wherein performing minimum cut calculation on the graph cut model specifically includes:
calculating the minimum cut value of the graph cut model by using the Ford-Fulkerson labeling algorithm.
8. The cirrus cloud detection method of any of claims 1-7, wherein the primary color channel gray scale map is a blue color channel gray scale map.
9. A cirrus cloud detection device, comprising:
a processor, a memory, and a communication bus;
the communication bus is used for realizing communication connection between the processor and the memory;
the memory is used for storing a cirrus cloud detection program capable of running on the processor;
the processor is configured to:
inputting a remote sensing image, and acquiring a primary color channel gray scale image of the remote sensing image;
acquiring a fractal dimension characteristic diagram of a primary color channel gray scale diagram;
acquiring a weight gray-scale image based on the fractal dimension characteristic image and the primary color channel gray-scale image;
clustering calculation is carried out on pixel points of the primary color channel gray-scale image, a clustering result of the pixel points is obtained, and pre-assigned labels of the pixel points are determined based on the clustering result;
determining a graph cut model based on the primary color channel gray-scale image, the weight gray-scale image and the pre-assigned labels;
and calculating the graph cut model, and outputting a detection result of the remote sensing image based on the calculation result.
10. A computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the cirrus cloud detection method according to any one of claims 1 to 8.
CN201910129005.7A 2019-02-21 2019-02-21 Volume cloud detection method and device and computer readable storage medium Active CN109886193B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910129005.7A CN109886193B (en) 2019-02-21 2019-02-21 Volume cloud detection method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910129005.7A CN109886193B (en) 2019-02-21 2019-02-21 Volume cloud detection method and device and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN109886193A CN109886193A (en) 2019-06-14
CN109886193B true CN109886193B (en) 2020-11-20

Family

ID=66928743

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910129005.7A Active CN109886193B (en) 2019-02-21 2019-02-21 Volume cloud detection method and device and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN109886193B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114187320B (en) * 2021-12-14 2022-11-08 北京柏惠维康科技股份有限公司 Spine CT image segmentation method and spine imaging identification method and device
CN114022790B (en) * 2022-01-10 2022-04-26 成都国星宇航科技有限公司 Cloud layer detection and image compression method and device in remote sensing image and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7747625B2 (en) * 2003-07-31 2010-06-29 Hewlett-Packard Development Company, L.P. Organizing a collection of objects
US8885925B2 (en) * 2013-03-12 2014-11-11 Harris Corporation Method for 3D object identification and pose detection using phase congruency and fractal analysis
CN105631903A (en) * 2015-12-24 2016-06-01 河海大学 Remote sensing image water extraction method and device based on RGBW characteristic space diagram cutting algorithm
CN106228553A (en) * 2016-07-20 2016-12-14 湖南大学 High-resolution remote sensing image shadow Detection apparatus and method
CN108647658A (en) * 2018-05-16 2018-10-12 电子科技大学 A kind of infrared imaging detection method of high-altitude cirrus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831607B (en) * 2012-08-08 2015-04-22 深圳市迈科龙生物技术有限公司 Method for segmenting cervix uteri liquid base cell image
CN103886614B (en) * 2014-04-14 2017-05-03 重庆威堪科技有限公司 Image edge detection method based on network node fractal dimensions
CN108021890B (en) * 2017-12-05 2020-03-10 武汉大学 High-resolution remote sensing image port detection method based on PLSA and BOW

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7747625B2 (en) * 2003-07-31 2010-06-29 Hewlett-Packard Development Company, L.P. Organizing a collection of objects
US8885925B2 (en) * 2013-03-12 2014-11-11 Harris Corporation Method for 3D object identification and pose detection using phase congruency and fractal analysis
CN105631903A (en) * 2015-12-24 2016-06-01 河海大学 Remote sensing image water extraction method and device based on RGBW characteristic space diagram cutting algorithm
CN106228553A (en) * 2016-07-20 2016-12-14 湖南大学 High-resolution remote sensing image shadow Detection apparatus and method
CN108647658A (en) * 2018-05-16 2018-10-12 电子科技大学 A kind of infrared imaging detection method of high-altitude cirrus

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHEN Jun et al.; "Automatic target segmentation by improved Grabcut based on fractal"; Computer Engineering and Applications; vol. 53, no. 1; pp. 163-167; 2015-09-02 *

Also Published As

Publication number Publication date
CN109886193A (en) 2019-06-14

Similar Documents

Publication Publication Date Title
CN112560876B (en) Single-stage small sample target detection method for decoupling measurement
EP3101594A1 (en) Saliency information acquisition device and saliency information acquisition method
Tong et al. Saliency detection with multi-scale superpixels
CN105184763B (en) Image processing method and device
US20130223740A1 (en) Salient Object Segmentation
CN110781756A (en) Urban road extraction method and device based on remote sensing image
CN111523414A (en) Face recognition method and device, computer equipment and storage medium
CN106157330B (en) Visual tracking method based on target joint appearance model
US20170178341A1 (en) Single Parameter Segmentation of Images
CN112036455B (en) Image identification method, intelligent terminal and storage medium
CN109886193B (en) Volume cloud detection method and device and computer readable storage medium
CN108629783A (en) Image partition method, system and medium based on the search of characteristics of image density peaks
WO2014004271A2 (en) Method and system for use of intrinsic images in an automotive driver-vehicle-assistance device
CN106407978B (en) Method for detecting salient object in unconstrained video by combining similarity degree
EP3039645A1 (en) A semi automatic target initialization method based on visual saliency
CN112991238A (en) Texture and color mixing type food image segmentation method, system, medium and terminal
CN114037640A (en) Image generation method and device
CN111898659A (en) Target detection method and system
Zhang et al. Salient region detection for complex background images using integrated features
CN112560845A (en) Character recognition method and device, intelligent meal taking cabinet, electronic equipment and storage medium
Song et al. Depth-aware saliency detection using discriminative saliency fusion
CN111079807B (en) Ground object classification method and device
CN112785595B (en) Target attribute detection, neural network training and intelligent driving method and device
Dornaika et al. A comparative study of image segmentation algorithms and descriptors for building detection
CN111199228A (en) License plate positioning method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant