CN111985492A - Cloud identification method - Google Patents

Cloud identification method

Info

Publication number
CN111985492A
CN111985492A (application CN201910439308.9A)
Authority
CN
China
Prior art keywords: image, value, cloud, recognized, determined
Prior art date
Legal status: Granted
Application number
CN201910439308.9A
Other languages
Chinese (zh)
Other versions
CN111985492B (en)
Inventor
姚志豪
Current Assignee: Zhejiang Nengmei New Energy Technology Co ltd
Original Assignee
Zhejiang Nengmei New Energy Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Nengmei New Energy Technology Co ltd filed Critical Zhejiang Nengmei New Energy Technology Co ltd
Priority to CN201910439308.9A priority Critical patent/CN111985492B/en
Publication of CN111985492A publication Critical patent/CN111985492A/en
Application granted granted Critical
Publication of CN111985492B publication Critical patent/CN111985492B/en
Current legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a cloud identification method comprising the following steps: converting an image to be recognized into HSV space and performing a light-removal operation; performing edge detection on the image to find the "determined cloud" regions; performing sky detection on the image to find the "determined day" (clear-sky) regions; classifying the image according to the results of these two searches, and then processing it further according to its class so as to separate cloud from sky throughout the image.

Description

Cloud identification method
Technical Field
The invention relates to the field of image recognition, in particular to a cloud recognition method.
Background
Automatic identification of clouds in images by machine is difficult, yet cloud-image recognition has wide application in meteorology, for example extracting the position, shape and thickness of cloud layers from cloud images for weather forecasting.
With the development of technology, some new industries also need short-term weather prediction. In the field of solar power generation (both photovoltaic and solar-thermal), for example, the solar resource is the key factor in utilization efficiency, and cloud shading affects that resource severely, so cloud shading must be anticipated; machine identification of clouds in real-time sky images is the basis of such anticipation. These applications do not require long-range forecasts and are cost-constrained, while traditional weather-forecasting equipment is expensive and poorly suited to them.
In 2017 the applicant filed Chinese patent application 201710820547.X, entitled "a method for identifying a cloud contained in an image", in which a pixel point is classified as cloud or not based on its RGB value. That method is simple to implement, consumes few resources and is cheap to deploy. In practice, however, the applicant found its recognition accuracy to be low, particularly for clouds without clear boundaries and for fog-like clouds. The applicant is not aware of any prior-art method that identifies such boundary-less clouds and fog-like clouds well.
Disclosure of Invention
The invention aims to overcome the inability of existing cloud identification methods to recognize clouds without clear boundaries or fog-like clouds, and thereby to provide a method that identifies clouds comprehensively and accurately.
In order to achieve the above object, the present invention provides a cloud identification method, including:
step 1), converting an image to be identified into an HSV space;
step 2), performing a light removing operation on the image to be identified after the image to be identified is converted into the HSV space;
step 3), performing edge detection on the image to be recognized, and searching for a 'determined cloud' in the image to be recognized; performing sky detection on the image to be identified, and searching for a 'determined day' in the image to be identified;
Step 4), classifying the images to be recognized according to the search results of the determined cloud and the determined day in the images to be recognized, and then further processing the images to be recognized according to different types: if the determined cloud is not found in the image to be recognized and the determined day is not found, executing the next step; if only 'determined cloud' is found in the image to be recognized and 'determined day' is not found, executing step 7); if only the determined day is found in the image to be recognized, and the determined cloud is not found, executing the step 9); if the image to be recognized has the determined cloud and the determined day at the same time, executing the step 12);
step 5), calculating the saturation mean of the image to be recognized and comparing it with a first threshold; if the saturation mean is smaller than the first threshold, the sky in the image to be recognized is judged to be completely covered by thick cloud, the recognition result is output, and the image recognition process ends; otherwise, executing the next step;
step 6), identifying the current image processing mode, if the current image processing mode is a fog mode, determining that the image to be identified is cloud-free, and outputting an identification result and then finishing the image identification process; otherwise, switching the current image processing mode to a 'fog' mode, and then starting to execute from the step 3) again;
Step 7), identifying the current image processing mode, if the current image processing mode is a fog mode, determining that the image to be identified is currently 'paved with clouds', and then ending the image identification process; otherwise, judging whether the current image processing mode is a 'cloudy' mode, if so, judging that the image to be recognized is 'full of clouds', then ending the image recognition process, and if not, executing the next step;
step 8), switching the current image processing mode to a 'cloudy' mode, and then starting to execute from the step 3) again;
step 9), if the current image processing mode is the fog mode, executing the next step, and if not, executing step 11);
step 10), taking pixel points with saturation values equal to the minimum value S _ min of the image saturation in the image to be recognized as 'determined cloud', and then executing step 12);
step 11), continuing to judge whether the current image processing mode is the "cloud-lacking" mode; if so, executing step 10); otherwise, switching the current image processing mode to the "cloud-lacking" mode and then starting execution again from step 3);
step 12), performing fuzzy recognition on the remaining parts of the image to be recognized, other than the recognized light, the "determined day" and the "determined cloud", to further distinguish which of them are cloud and which are sky.
In the above technical solution, before the step 1), the method further includes: and performing filtering operation on the image to be identified.
In the above technical solution, the step 2) further includes:
step 2-1), taking the position of the sun in the image as the center and dividing the image along the circumferential direction at certain angular intervals, thereby dividing it into a plurality of channels;
step 2-2), respectively identifying pixel points within each channel and finding the pixel points regarded as light; wherein:
if the position of the sun in the image to be recognized is covered by a shadow, a near-sun region is set in the image, and the distance range from this region to the sun boundary is denoted the LightCircle; a pixel point inside the LightCircle whose brightness is greater than 252 is identified as light; a pixel point is also identified as light if the gradient of the brightness variance of the pixel and its two adjacent pixels on the channel is not more than 6 and the brightness of both the pixel and the previous pixel in the channel is not less than 252;
if the position of the sun in the image to be recognized is not covered by a shadow, the brightness value of the pixel at the sun's center point is obtained directly, and each pixel point is traversed along the channel from the image center towards the edge; a pixel point is identified as light if the gradient of the brightness variance of the pixel and its two adjacent pixels on the channel is not more than 6 and the brightness of both the pixel and the previous pixel in the channel is not less than 252;
Step 2-3), removing pixel points which are considered as light from the image to be identified.
In the above technical solution, in the step 3), performing edge detection on the image to be recognized, and searching for a "determined cloud" in the image to be recognized includes:
step 3-1-1), calculating gradients for pixel points remained in the image to be identified;
the gradient is calculated by the following formula:
CDS = sqrt(dx^2 + dy^2)
wherein:
dx = CD(f(i+1,j+1), f(i+1,j-1)) + 2*CD(f(i,j+1), f(i,j-1)) + CD(f(i-1,j+1), f(i-1,j-1));
dy = CD(f(i+1,j+1), f(i-1,j+1)) + 2*CD(f(i+1,j), f(i-1,j)) + CD(f(i+1,j-1), f(i-1,j-1));
wherein CDS represents the gradient and CD(A, B) represents the color difference between pixel point A and pixel point B, calculated as:
CD(A, B) = sqrt(weighting_H*(H_A - H_B)^2 + weighting_S*(S_A - S_B)^2 + weighting_V*(V_A - V_B)^2)
wherein H_A, S_A and V_A represent the hue, saturation and brightness values of pixel point A; H_B, S_B and V_B represent those of pixel point B; weighting_H, weighting_S and weighting_V represent the H, S and V channel weighting coefficients in the Sobel operator; f(i-1,j-1), f(i-1,j), f(i-1,j+1), f(i,j-1), f(i,j), f(i,j+1), f(i+1,j-1), f(i+1,j) and f(i+1,j+1) denote a pixel point and its surrounding neighboring pixel points;
step 3-1-2), comparing the gradient of the pixel point with the boundary identification sensitivity value, if the gradient value of one pixel point is greater than the boundary identification sensitivity value, the pixel point is a boundary point, otherwise, the pixel point is not the boundary point;
And 3-1-3) integrating the boundary points found in the image to be recognized to obtain the determined cloud in the image to be recognized.
In the above technical solution, in the step 3), performing sky detection on the image to be recognized, and searching for a "determined day" in the image to be recognized includes:
step 3-2-1), computing the difference between the saturation maximum S_max and the saturation minimum S_min among the pixel points of the image to be recognized, and comparing the difference with the parameter differential_S; if the difference is greater than differential_S, executing the next step; otherwise, reducing the value of differential_S and then re-executing this step; wherein:
the parameter differential_S reflects the saturation difference among the pixel points in the picture;
step 3-2-2), traversing each pixel point in the image to be recognized and determining whether its S value lies in the range (S_max - differential_S, S_max); if so, the pixel point is regarded as belonging to the sky;
and 3-2-3) integrating all pixel points which are considered to belong to the sky in the image to be identified to obtain the determined day in the image to be identified.
In the above technical solution, in step 3-2-1), the iterative process of reducing the value of the parameter differential_S and then re-executing step 3-2-1) is performed 1-2 times.
In the above technical solution, the image processing modes are divided into four types: a "sunny" mode, a "fog" mode, a "cloudy" mode and a "cloud-lacking" mode; wherein:
the "sunny" mode represents that the image to be recognized is in a clear state; in this mode the parameter differential_S, which reflects the saturation difference among the pixel points in the image, has a higher value, and the parameter CDS_Value, which represents the boundary recognition sensitivity value, also has a higher value;
the "fog" mode represents that the image to be recognized is in a foggy state; in this mode the value of differential_S is lower than in the "sunny" mode, and the value of CDS_Value is lower than in the "sunny" mode;
the "cloudy" mode represents that the image to be recognized is in a cloudy state; in this mode the value of differential_S is lower than in the "sunny" mode, while the value of CDS_Value is unchanged compared with the "sunny" mode;
the "cloud-lacking" mode represents that the image to be recognized is in a state with little cloud; in this mode the value of differential_S is unchanged compared with the "sunny" mode, while the value of CDS_Value is lower than in the "sunny" mode.
In the above technical solution, the step 12) includes:
Step 12-1), calculating the hue mean, saturation mean and brightness mean of the known "determined day" in the image to be recognized, as:
hue mean of the "determined day" = (sum of the hue values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
saturation mean of the "determined day" = (sum of the saturation values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
brightness mean of the "determined day" = (sum of the brightness values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
step 12-2), calculating the hue mean, saturation mean and brightness mean of the known "determined cloud" in the image to be recognized, as:
hue mean of the "determined cloud" = (sum of the hue values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
saturation mean of the "determined cloud" = (sum of the saturation values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
brightness mean of the "determined cloud" = (sum of the brightness values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
Step 12-3), calculating the difference between any pixel point in the unknown region in the image to be recognized and the determined day and the determined cloud respectively;
the corresponding calculation formula is as follows:
a = sqrt(weighting_H_2*(H - H_cloud)^2 + weighting_S_2*(S - S_cloud)^2 + weighting_V_2*(V - V_cloud)^2);
b = sqrt(weighting_H_2*(H - H_day)^2 + weighting_S_2*(S - S_day)^2 + weighting_V_2*(V - V_day)^2);
wherein a represents the difference between a pixel point in an unknown region of the image to be recognized and the "determined cloud", and b represents its difference from the "determined day"; H, S and V are the hue, saturation and brightness values of that pixel point; H_cloud, S_cloud and V_cloud are the hue mean, saturation mean and brightness mean of the "determined cloud"; H_day, S_day and V_day are the hue mean, saturation mean and brightness mean of the "determined day"; weighting_H_2, weighting_S_2 and weighting_V_2 are the H, S and V channel weighting coefficients used in fuzzy recognition;
step 12-4), judging whether the b value of the pixel point in the unknown region is greater than (a + acceptance) according to the calculation result of the step 12-3), if so, considering the pixel point as 'cloud', otherwise, considering the pixel point as 'sky'; the parameter acceptance is used for defining the acceptance degree of the fuzzy state, and the smaller the acceptance value is, the more the fuzzy part is identified as cloud;
Step 12-5), executing step 12-3) and step 12-4) on all pixel points in the unknown region in the image to be recognized, and thus recognizing the fog cloud.
The invention has the advantages that:
Compared with existing cloud identification methods, the method of the invention can identify clouds without clear boundaries as well as fog-like clouds, which greatly improves the accuracy of cloud identification and makes the recognition result more comprehensive and accurate.
Drawings
FIG. 1 is a flow chart of a cloud identification method of the present invention;
fig. 2 is a schematic view of a "near-sun region" set when a light removal operation is performed on an image.
Detailed Description
The cloud identification method is realized based on HSV space, and before the specific implementation details of the method are explained in detail, HSV is described first.
HSV (Hue, Saturation, Value) is a color space created by A. R. Smith in 1978 based on the intuitive characteristics of color, also known as the hexagonal cone model (Hexcone Model). The parameters of a color in this model are: hue (H), saturation (S) and brightness (V).
The hue (H) is measured by angle, and ranges from 0 ° to 360 °, starting from red and counting in the counterclockwise direction, with red being 0 °, green being 120 °, and blue being 240 °. Their complementary colors are: yellow is 60 °, cyan is 180 °, and magenta is 300 °.
The value range of the saturation (S) is 0.0-1.0, and the larger the value is, the more saturated the color is.
The luminance (V) ranges from 0 (black) to 255 (white).
The invention will now be further described with reference to the accompanying drawings.
Referring to fig. 1, the method of the present invention comprises the steps of:
step 1), collecting an image to be identified.
In this step, the acquired image to be recognized is generally taken from the ground towards the sky. In solar energy utilization scenarios, the main concern is whether the sun is shaded by cloud, so the images are generally daytime images and usually contain the sun.
As a preferred implementation, a filtering operation is performed on the image after acquisition. Filtering removes noise from the image, including objects such as birds and aircraft that commonly appear in sky images, which facilitates cloud identification.
And 2) converting the image acquired in the step 1) into an HSV space.
The initially acquired image is typically in RGB space and is converted to HSV space in this step. Converting an image from RGB space to HSV space is prior art, and its implementation is not described in detail in this application.
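For illustration, a minimal sketch of these two preprocessing steps follows, assuming OpenCV is used; the library choice and the kernel size are assumptions made here, not part of the patent.

```python
import cv2

# Minimal preprocessing sketch (an assumption, not the patent's code):
# a median filter removes small objects such as birds or aircraft, then the
# image is converted from BGR to HSV. Note that OpenCV scales H to 0..179
# rather than the 0-360 degree convention described above.
def prepare_image(image_bgr, ksize=5):  # ksize = 5 is an assumed value
    filtered = cv2.medianBlur(image_bgr, ksize)
    return cv2.cvtColor(filtered, cv2.COLOR_BGR2HSV)
```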
And 3) performing a light removing operation on the image converted into the HSV space obtained in the step 2).
When cloud recognition is performed on an image, light (mainly sunlight) is a strong interference factor, so that a light removing operation needs to be performed on the image.
The method further comprises the following steps:
step 3-1), taking the position of the sun in the image as the center and dividing the image along the circumferential direction at certain angular intervals, thereby dividing it into a plurality of channels;
and 3-2), identifying the pixel points within each channel and finding those regarded as light.
When a pixel point that is considered to be light is found, there are different operations according to different characteristics of the image.
Generally, when an image containing the sun is captured, the camera lens carries a shading patch to protect the sensor from the intensity of direct sunlight, so the position of the sun in the captured image is replaced by a shadow. As shown in fig. 2, since the sun shadow interferes with collecting light pixels, a "near-sun region" is set for such images; the distance range from this region to the sun boundary is denoted the LightCircle. A pixel point inside the LightCircle whose brightness is greater than 252 is identified as light. For a pixel point outside the LightCircle, if the gradient of the brightness variance between the pixel and its two adjacent pixels on the channel is not more than 6, and the brightness of both the pixel and the previous pixel in the channel (the neighboring pixel closer to the image center) is not less than 252, the pixel is identified as light. The size of the LightCircle can be set according to actual conditions; in one embodiment it is set to 50 pixels.
In other cases, the camera lens carries no shading patch, and the sun in the captured image is not replaced by a shadow. For such an image, the brightness of the pixel at the sun center (generally the image center) is obtained directly, and each pixel point is then traversed along the channel from the image center towards the edge; if the gradient of the brightness variance between a pixel and its two adjacent pixels on the channel is not more than 6, and the brightness of both the pixel and the previous pixel in the channel is not less than 252, the pixel is identified as light.
Step 3-3), removing pixel points which are regarded as light from the image to be identified.
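A hedged sketch of this light-removal scan follows, with the V (brightness) channel as input. The thresholds 252 and 6 come from the text; the number of channels, the one-pixel sampling step along each ray, and the simplification of the brightness-variance gradient to a consecutive-sample difference are assumptions made for illustration.

```python
import numpy as np

def light_mask(v, sun_xy, n_channels=72, light_circle=50):
    """Mark pixels regarded as light by scanning rays ("channels") from the sun."""
    h, w = v.shape
    mask = np.zeros((h, w), dtype=bool)
    max_r = int(np.hypot(h, w))
    for k in range(n_channels):
        theta = 2.0 * np.pi * k / n_channels
        prev = None  # previous pixel on this ray (closer to the image center)
        for r in range(max_r):
            x = int(round(sun_xy[0] + r * np.cos(theta)))
            y = int(round(sun_xy[1] + r * np.sin(theta)))
            if not (0 <= x < w and 0 <= y < h):
                break
            val = int(v[y, x])
            if r <= light_circle and val > 252:
                mask[y, x] = True        # bright pixel inside the LightCircle
            elif prev is not None and abs(val - prev) <= 6 \
                    and val >= 252 and prev >= 252:
                mask[y, x] = True        # smooth and very bright: light
            prev = val
    return mask
```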
And 4) carrying out edge detection on the image to be recognized obtained in the previous step, and searching for a 'determined cloud' in the image to be recognized.
The "cloud determination" in this step refers to a cloud with clear and definite boundaries. Such clouds can be identified by means of edge detection. The step may further comprise:
step 4-1), calculating gradients for pixel points remained in the image to be identified;
The gradient may be computed with a Sobel operator. Compared with the traditional Sobel operator, this application introduces an improvement: weighting coefficients weighting_H, weighting_S and weighting_V weight the three HSV channels when the gradient CDS is computed, where weighting_H, weighting_S and weighting_V are the H, S and V channel weighting coefficients in the Sobel operator. In one embodiment, weighting_H = 2, weighting_S = 3 and weighting_V = 1.
The specific calculation procedure for gradient calculation is as follows:
CDS = sqrt(dx^2 + dy^2)
wherein:
dx = CD(f(i+1,j+1), f(i+1,j-1)) + 2*CD(f(i,j+1), f(i,j-1)) + CD(f(i-1,j+1), f(i-1,j-1));
dy = CD(f(i+1,j+1), f(i-1,j+1)) + 2*CD(f(i+1,j), f(i-1,j)) + CD(f(i+1,j-1), f(i-1,j-1)).
wherein CD(A, B) represents the color difference between pixel point A and pixel point B, calculated as:
CD(A, B) = sqrt(weighting_H*(H_A - H_B)^2 + weighting_S*(S_A - S_B)^2 + weighting_V*(V_A - V_B)^2)
wherein H_A, S_A and V_A represent the hue, saturation and brightness values of pixel point A, and H_B, S_B and V_B those of pixel point B.
In the above formulas, f(i,j) denotes a pixel point and the other f(.,.) terms its surrounding neighbors; their positional relationship is shown in table 1:
f(i-1,j-1)  f(i-1,j)  f(i-1,j+1)
f(i,j-1)    f(i,j)    f(i,j+1)
f(i+1,j-1)  f(i+1,j)  f(i+1,j+1)
TABLE 1
And 4-2) comparing the gradient of the pixel points with the boundary identification sensitivity value, and determining whether the corresponding pixel points are the boundary points of the cloud or not according to the comparison result.
In this application, CDS_Value denotes the boundary recognition sensitivity value; the smaller CDS_Value is, the higher the sensitivity. The specific value of CDS_Value can be adjusted according to the sensitivity requirement, but generally should not fall below a minimum value. If the CDS value of a pixel point is greater than CDS_Value, the pixel point is a boundary point; otherwise it is not.
And 4-3) integrating the boundary points searched in the image to be recognized to obtain the determined cloud in the image to be recognized.
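For illustration, a sketch of the weighted-HSV Sobel computation of steps 4-1) and 4-2) follows, under the assumption that CD(A, B) is the weighted Euclidean distance reconstructed above; hue wrap-around is ignored for brevity.

```python
import numpy as np

W_H, W_S, W_V = 2.0, 3.0, 1.0  # weighting_H/S/V from the cited embodiment

def cd(a, b):
    """Color difference CD(A, B) between two HSV pixels (hue wrap ignored)."""
    return np.sqrt(W_H * (a[0] - b[0]) ** 2 +
                   W_S * (a[1] - b[1]) ** 2 +
                   W_V * (a[2] - b[2]) ** 2)

def cds(hsv, i, j):
    """Gradient CDS at pixel (i, j); hsv has shape (rows, cols, 3)."""
    f = lambda di, dj: hsv[i + di, j + dj].astype(float)
    dx = cd(f(1, 1), f(1, -1)) + 2 * cd(f(0, 1), f(0, -1)) + cd(f(-1, 1), f(-1, -1))
    dy = cd(f(1, 1), f(-1, 1)) + 2 * cd(f(1, 0), f(-1, 0)) + cd(f(1, -1), f(-1, -1))
    return np.sqrt(dx ** 2 + dy ** 2)

def boundary_points(hsv, cds_value=42.0):  # 42 is the "sunny"-mode embodiment
    """Mark boundary points whose gradient exceeds CDS_Value (step 4-2))."""
    rows, cols = hsv.shape[:2]
    mask = np.zeros((rows, cols), dtype=bool)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            mask[i, j] = cds(hsv, i, j) > cds_value
    return mask
```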
And 5) performing sky detection on the image to be recognized obtained in the previous step to obtain a 'determined day' in the image to be recognized.
The "determined day" in this step refers to a portion of the image to be recognized, which has a clear background and can be determined as the sky. The step may further comprise:
Step 5-1), computing the difference between the saturation maximum S_max and the saturation minimum S_min among the pixel points of the image to be recognized, and comparing it with the parameter differential_S; if the difference is greater than differential_S, executing step 5-2); otherwise, reducing the value of differential_S and then re-executing this step.
The parameter differential_S reflects the saturation difference among the pixel points in the picture. Those skilled in the art will appreciate that blue and white pixels differ considerably in saturation, so differential_S measures the blue-white clarity of a sky picture. If the difference between S_max and S_min is greater than differential_S, the blue sky and white cloud of the image are clearly separated, which lays the foundation for finding the "determined day"; otherwise, the distinction between blue sky and white cloud in the image is not clear (as on an overcast day).
The value of differential_S reflects the required degree of saturation contrast among the pixel points. It can initially be set to a relatively high value; if the difference between S_max and S_min computed in step 5-1) is smaller than differential_S, the value of differential_S can be reduced appropriately and step 5-1) executed again. This iteration is typically performed 1-2 times; beyond that it loses meaning.
In one embodiment, the parameter differential_S is set to 30.
And 5-2), traversing each pixel point in the image to be recognized and determining whether its S value lies in the range (S_max - differential_S, S_max); if so, the pixel point is regarded as belonging to the sky.
And 5-3) synthesizing all pixel points which are considered to belong to the sky in the image to be identified to obtain the determined day in the image to be identified.
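A sketch of this "determined day" search follows, assuming the saturation channel as input; the halving of differential_S on each retry is an illustrative assumption, since the text only says the value is reduced.

```python
import numpy as np

def determined_day(s_channel, differential_s=30.0, retries=2):
    """Return a boolean mask of pixels regarded as belonging to the sky."""
    s = s_channel.astype(float)
    s_max, s_min = s.max(), s.min()
    for _ in range(retries + 1):          # the 1-2 retries described above
        if s_max - s_min > differential_s:
            # pixels whose S value lies in (s_max - differential_s, s_max)
            return s > (s_max - differential_s)
        differential_s *= 0.5             # assumed reduction step
    return np.zeros(s.shape, dtype=bool)  # blue-white contrast too weak
```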
It should be noted that there is no strict logic sequence between the foregoing step 4) and step 5), and the two steps are numbered for convenience of description. In actual execution, step 4) may be executed before step 5), step 5) may be executed before step 4), and step 4) may also be executed simultaneously with step 5).
Step 6), according to the results obtained in step 4) and step 5), the type of the image to be recognized is judged further. If neither a "determined cloud" nor a "determined day" is found in the image, the image is either cloud-free, or fully covered by thick cloud, or fully covered by thin cloud, or a cloudy sky in which cloud and sky are hard to distinguish; step 7) is executed. If only a "determined cloud" is found and no "determined day", the image is in a state of thick cloud with thin cloud spread over the sky; step 9) is executed. If only a "determined day" is found and no "determined cloud", there is thin cloud in the sky but the sky is not fully covered by it; step 11) is executed. If the image contains both a "determined cloud" and a "determined day", the image has clearly separated cloud and sky; step 14) is executed.
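The four-way branch of step 6) can be sketched as follows, where found_cloud and found_day are the boolean outcomes of steps 4) and 5):

```python
def classify(found_cloud, found_day):
    """Map the search results of steps 4) and 5) to the next step to execute."""
    if not found_cloud and not found_day:
        return "step 7"   # no clear cloud and no clear sky found
    if found_cloud and not found_day:
        return "step 9"   # clear cloud only: sky largely covered
    if found_day and not found_cloud:
        return "step 11"  # clear sky only: at most thin cloud present
    return "step 14"      # both found: clearly separated cloud and sky
```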
Step 7), computing the saturation mean of the light-removed image to be recognized and comparing it with a first threshold. If the saturation mean is smaller than the first threshold, the sky in the image may be completely covered by thick cloud, i.e. no sky is exposed; the recognition result is output and the image recognition process ends. Otherwise, the next step is executed.
The first threshold is determined by a person skilled in the art; in one embodiment its value is 200, and in other embodiments it may be adjusted according to the actual situation.
Step 8) handles the case where the sky in the image is fully covered by thin cloud or is a cloudy sky hard to distinguish: the current image processing mode is identified; if it is already the "fog" mode, the image to be recognized is judged to be cloud-free, and the image recognition process ends; otherwise, the current image processing mode is switched to the "fog" mode and execution restarts from step 4).
The image processing mode referred to in this step reflects the processing strategy applied to the image to be recognized. In this application there are four image processing modes: a "sunny" mode, a "fog" mode, a "cloudy" mode and a "cloud-lacking" mode. Wherein:
the "clear" mode represents that the image to be recognized is in a clear state, and in the "clear" mode, the parameter differential _ S discussed earlier for reflecting the saturation difference of each pixel point in the image has a higher value, and in one embodiment, the value of the parameter differential _ S is 30; the parameter CDS Value, which is used to represent the boundary recognition sensitivity Value, has a high Value, and in one embodiment, has a Value of 42. The specific values of the parameter differential _ S and the parameter CDS _ Value may vary according to actual needs.
The fog mode represents that the image to be recognized is in a fog state, and the value of the parameter difference _ S is lower than that in a clear mode in the fog mode, so that the characteristic of fuzzy boundary in the cloud sky can be reflected; the Value of the parameter CDS _ Value is lower than that in a 'clear' mode, and the reduction of the Value of the parameter CDS _ Value is also beneficial to improving the edge identification sensitivity.
The 'cloudy' mode represents that the image to be recognized is in a cloudy state, and in the 'cloudy' mode, the value of the parameter differential _ S is lower than that in the 'sunny' mode, so that the characteristic of fuzzy boundary of the cloud sky can be reflected; the Value of the parameter CDS _ Value is unchanged compared with the Value in the "clear" mode.
The 'cloud lacking' mode represents that the value of the parameter differential _ S is unchanged in comparison with the 'fine' mode when the image to be recognized is in a cloud lacking state; the Value of the parameter CDS _ Value is lower than that in a 'clear' mode, and the reduction of the Value of the parameter CDS _ Value is beneficial to improving the edge identification sensitivity.
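These modes can be summarized as parameter presets, as in the sketch below; only the "sunny" values (differential_S = 30, CDS_Value = 42) come from the embodiments, while the lowered values used by the other modes are illustrative placeholders.

```python
# Mode presets; only the "sunny" values are from the text, the rest are
# placeholders that merely respect the lower/unchanged relations above.
MODES = {
    "sunny":         {"differential_S": 30.0, "CDS_Value": 42.0},
    "fog":           {"differential_S": 15.0, "CDS_Value": 30.0},  # both lowered
    "cloudy":        {"differential_S": 15.0, "CDS_Value": 42.0},  # differential_S lowered
    "cloud-lacking": {"differential_S": 30.0, "CDS_Value": 30.0},  # CDS_Value lowered
}
```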
As can be seen from the above description, if the current image processing mode in step 8) is not the "fog" mode, it needs to be switched to the "fog" mode; during the switch, depending on the original mode, the value of differential_S, the value of CDS_Value, or both are reduced.
In this step, the iterative process of "switching the current image processing mode to the" fog "mode and then starting execution from step 4 again" may be performed 1-2 times.
Step 9), identifying the current image processing mode, if the current image processing mode is the fog mode, determining that the image to be identified is the current image which is paved with clouds, and then ending the image identification process; otherwise, judging whether the current image processing mode is a 'cloudy' mode, if so, judging that the image to be recognized is 'full of clouds' at present, then ending the image recognition process, and if not, executing the next step;
step 10), switching the current image processing mode to a 'cloudy' mode, and then starting from step 4) again.
The iterative process of "re-executing from step 4)" in this step may be executed 1-2 times.
Step 11), identifying the current image processing mode, if the current image processing mode is the fog mode, executing step 12), and if not, executing step 13);
step 12), taking pixel points with saturation values equal to the minimum value S _ min of the image saturation in the image to be recognized as 'determined cloud', and then executing step 14);
Step 13), continuing to judge whether the current image processing mode is the "cloud-lacking" mode; if so, executing step 12); otherwise, switching the current image processing mode to the "cloud-lacking" mode and then starting execution again from step 4).
The iterative process of "re-executing from step 4)" in this step may be executed 1-2 times.
Step 14), performing fuzzy recognition on the remaining parts of the image to be recognized, other than the recognized light, the "determined day" and the "determined cloud", to further distinguish which of them are cloud and which are sky.
In the remaining parts of the image the boundary between sky and cloud is blurred, or thin cloud closely resembles the sky; such cloud is commonly called fog-like cloud. The fuzzy recognition process uses a parameter acceptance, which defines how readily the fuzzy state is accepted as cloud. The smaller the acceptance value (it may be negative), the more of the blurred region is recognized as cloud, i.e. the more weight is given to fog-like cloud. This value needs to be determined from field-measured DNI (direct normal irradiance).
The steps further include:
Step 14-1), calculating the hue mean, saturation mean and brightness mean of the known "determined day" in the image to be recognized, as:
hue mean of the "determined day" = (sum of the hue values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
saturation mean of the "determined day" = (sum of the saturation values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
brightness mean of the "determined day" = (sum of the brightness values of all pixel points in the "determined day") / (number of pixel points in the "determined day").
Step 14-2), calculating the hue mean, saturation mean and brightness mean of the known "determined cloud" in the image to be recognized, as:
hue mean of the "determined cloud" = (sum of the hue values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
saturation mean of the "determined cloud" = (sum of the saturation values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
brightness mean of the "determined cloud" = (sum of the brightness values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud").
Step 14-3), calculating the difference between any pixel point in the unknown region in the image to be recognized and the determined day and the determined cloud respectively.
The corresponding calculation formula is as follows:
a = sqrt(weighting_H_2*(H - H_cloud)^2 + weighting_S_2*(S - S_cloud)^2 + weighting_V_2*(V - V_cloud)^2);
b = sqrt(weighting_H_2*(H - H_day)^2 + weighting_S_2*(S - S_day)^2 + weighting_V_2*(V - V_day)^2);
wherein a represents the difference between a pixel point in an unknown region of the image to be recognized and the "determined cloud", and b represents its difference from the "determined day"; H, S and V are the hue, saturation and brightness values of that pixel point; H_cloud, S_cloud and V_cloud are the hue mean, saturation mean and brightness mean of the "determined cloud" from step 14-2); H_day, S_day and V_day are the hue mean, saturation mean and brightness mean of the "determined day" from step 14-1); weighting_H_2, weighting_S_2 and weighting_V_2 are the H, S and V channel weighting coefficients used in fuzzy recognition. The values of these weighting coefficients can be adjusted according to the field situation; in one embodiment, weighting_H_2 = 2, weighting_S_2 = 3 and weighting_V_2 = 1.
Step 14-4), judging whether the b value of the pixel point in the unknown region is greater than (a + acceptance) according to the calculation result of the step 14-3), if so, considering the pixel point as 'cloud', otherwise, considering the pixel point as 'sky'.
Step 14-5), executing step 14-3) and step 14-4) on all pixel points in the unknown region in the image to be recognized, and thus recognizing the fog cloud.
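A sketch of this fuzzy classification follows, assuming the distance formulas reconstructed in step 14-3); px, cloud_mean and day_mean are length-3 arrays of (H, S, V) values, and the default acceptance = 0 is a placeholder, since the patent determines it from field-measured DNI.

```python
import numpy as np

W2 = np.array([2.0, 3.0, 1.0])  # weighting_H_2/S_2/V_2 from the cited embodiment

def hsv_distance(px, mean):
    """Weighted HSV distance, as in the reconstructed formulas for a and b."""
    return float(np.sqrt(np.sum(W2 * (px - mean) ** 2)))

def classify_unknown(px, cloud_mean, day_mean, acceptance=0.0):
    """Label one unknown pixel: "cloud" if b > a + acceptance, else "sky"."""
    a = hsv_distance(px, cloud_mean)  # difference from the "determined cloud"
    b = hsv_distance(px, day_mean)    # difference from the "determined day"
    return "cloud" if b > a + acceptance else "sky"
```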
The above is a detailed description of the implementation process of the cloud identification method of the present invention. As can be seen from the above description, the cloud identification method of the present invention can identify not only clear clouds but also undefined clouds, such as foggy clouds, in the sky. Compared with the traditional method, the method has the advantages that the comprehensiveness and accuracy of the identification are improved to a great extent, and the actual requirements of cloud identification are met.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention and are not limiting. Although the present invention has been described in detail with reference to the embodiments, those skilled in the art will understand that various changes may be made and equivalents substituted without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A cloud identification method, comprising:
step 1), converting an image to be identified into an HSV space;
step 2), performing a light removing operation on the image to be identified after the image to be identified is converted into the HSV space;
Step 3), performing edge detection on the image to be recognized, and searching for a 'determined cloud' in the image to be recognized; performing sky detection on the image to be identified, and searching for a 'determined day' in the image to be identified;
step 4), classifying the images to be recognized according to the search results of the determined cloud and the determined day in the images to be recognized, and then further processing the images to be recognized according to different types: if the determined cloud is not found in the image to be recognized and the determined day is not found, executing the next step; if only 'determined cloud' is found in the image to be recognized and 'determined day' is not found, executing step 7); if only the determined day is found in the image to be recognized, and the determined cloud is not found, executing the step 9); if the image to be recognized has the determined cloud and the determined day at the same time, executing the step 12);
step 5), calculating the saturation mean of the image to be recognized and comparing it with a first threshold; if the saturation mean is smaller than the first threshold, the sky in the image to be recognized is judged to be completely covered by thick cloud, the recognition result is output, and the image recognition process ends; otherwise, executing the next step;
step 6), identifying the current image processing mode, if the current image processing mode is a fog mode, determining that the image to be identified is cloud-free, and outputting an identification result and then finishing the image identification process; otherwise, switching the current image processing mode to a 'fog' mode, and then starting to execute from the step 3) again;
Step 7), identifying the current image processing mode, if the current image processing mode is a fog mode, determining that the image to be identified is currently 'paved with clouds', and then ending the image identification process; otherwise, judging whether the current image processing mode is a 'cloudy' mode, if so, judging that the image to be recognized is 'full of clouds', then ending the image recognition process, and if not, executing the next step;
step 8), switching the current image processing mode to a 'cloudy' mode, and then starting to execute from the step 3) again;
step 9), if the current image processing mode is the fog mode, executing the next step, and if not, executing step 11);
step 10), taking pixel points with saturation values equal to the minimum value S _ min of the image saturation in the image to be recognized as 'determined cloud', and then executing step 12);
step 11), continuing to judge whether the current image processing mode is the "cloud-lacking" mode; if so, executing step 10); otherwise, switching the current image processing mode to the "cloud-lacking" mode and then starting execution again from step 3);
step 12), performing fuzzy recognition on the remaining parts of the image to be recognized, other than the recognized light, the "determined day" and the "determined cloud", to further distinguish which of them are cloud and which are sky.
2. The cloud identification method according to claim 1, wherein before the step 1), further comprising: and performing filtering operation on the image to be identified.
3. The cloud identification method according to claim 1 or 2, wherein the step 2) further comprises:
step 2-1), taking the position of the sun in the image as the center and dividing the image along the circumferential direction at certain angular intervals, thereby dividing it into a plurality of channels;
step 2-2), respectively identifying pixel points within each channel and finding the pixel points regarded as light; wherein:
if the position of the sun in the image to be recognized is covered by a shadow, a near-sun region is set in the image, and the distance range from this region to the sun boundary is denoted the LightCircle; a pixel point inside the LightCircle whose brightness is greater than 252 is identified as light; a pixel point is also identified as light if the gradient of the brightness variance of the pixel and its two adjacent pixels on the channel is not more than 6 and the brightness of both the pixel and the previous pixel in the channel is not less than 252;
if the position of the sun in the image to be recognized is not covered by a shadow, the brightness value of the pixel at the sun's center point is obtained directly, and each pixel point is traversed along the channel from the image center towards the edge; a pixel point is identified as light if the gradient of the brightness variance of the pixel and its two adjacent pixels on the channel is not more than 6 and the brightness of both the pixel and the previous pixel in the channel is not less than 252;
Step 2-3), removing pixel points which are considered as light from the image to be identified.
4. The cloud identification method according to claim 1 or 2, wherein in the step 3), performing edge detection on the image to be identified, and searching for the "determined cloud" in the image to be identified comprises:
step 3-1-1), calculating gradients for pixel points remained in the image to be identified;
the gradient is calculated by the following formula:
CDS = sqrt(dx^2 + dy^2)
wherein:
dx = CD(f(i+1,j+1), f(i+1,j-1)) + 2*CD(f(i,j+1), f(i,j-1)) + CD(f(i-1,j+1), f(i-1,j-1));
dy = CD(f(i+1,j+1), f(i-1,j+1)) + 2*CD(f(i+1,j), f(i-1,j)) + CD(f(i+1,j-1), f(i-1,j-1));
wherein CDS represents the gradient and CD(A, B) represents the color difference between pixel point A and pixel point B, calculated as:
CD(A, B) = sqrt(weighting_H*(H_A - H_B)^2 + weighting_S*(S_A - S_B)^2 + weighting_V*(V_A - V_B)^2)
wherein H_A, S_A and V_A represent the hue, saturation and brightness values of pixel point A; H_B, S_B and V_B represent those of pixel point B; weighting_H, weighting_S and weighting_V represent the H, S and V channel weighting coefficients in the Sobel operator; f(i-1,j-1), f(i-1,j), f(i-1,j+1), f(i,j-1), f(i,j), f(i,j+1), f(i+1,j-1), f(i+1,j) and f(i+1,j+1) denote a pixel point and its surrounding neighboring pixel points;
step 3-1-2), comparing the gradient of the pixel point with the boundary identification sensitivity value, if the gradient value of one pixel point is greater than the boundary identification sensitivity value, the pixel point is a boundary point, otherwise, the pixel point is not the boundary point;
And 3-1-3) integrating the boundary points found in the image to be recognized to obtain the determined cloud in the image to be recognized.
5. The cloud identification method according to claim 1 or 2, wherein in the step 3), sky detection is performed on the image to be identified, and searching for "determined day" in the image to be identified comprises:
step 3-2-1), computing the difference between the saturation maximum S_max and the saturation minimum S_min among the pixel points of the image to be recognized, and comparing the difference with the parameter differential_S; if the difference is greater than differential_S, executing the next step; otherwise, reducing the value of differential_S and then re-executing this step; wherein:
the parameter differential_S reflects the saturation difference among the pixel points in the picture;
step 3-2-2), traversing each pixel point in the image to be recognized and determining whether its S value lies in the range (S_max - differential_S, S_max); if so, the pixel point is regarded as belonging to the sky;
and 3-2-3) integrating all pixel points which are considered to belong to the sky in the image to be identified to obtain the determined day in the image to be identified.
6. The cloud identification method according to claim 5, wherein in step 3-2-1), the iterative process of reducing the value of the parameter differential_S and then re-executing step 3-2-1) is performed 1-2 times.
7. The cloud recognition method according to claim 1 or 2, wherein the image processing modes are divided into four types: a "sunny" mode, a "fog" mode, a "cloudy" mode and a "cloud-lacking" mode; wherein:
the "sunny" mode represents that the image to be recognized is in a clear state; in this mode the parameter differential_S, which reflects the saturation difference among the pixel points in the image, has a higher value, and the parameter CDS_Value, which represents the boundary recognition sensitivity value, also has a higher value;
the "fog" mode represents that the image to be recognized is in a foggy state; in this mode the value of differential_S is lower than in the "sunny" mode, and the value of CDS_Value is lower than in the "sunny" mode;
the "cloudy" mode represents that the image to be recognized is in a cloudy state; in this mode the value of differential_S is lower than in the "sunny" mode, while the value of CDS_Value is unchanged compared with the "sunny" mode;
the "cloud-lacking" mode represents that the image to be recognized is in a state with little cloud; in this mode the value of differential_S is unchanged compared with the "sunny" mode, while the value of CDS_Value is lower than in the "sunny" mode.
8. The cloud identification method according to claim 1 or 2, wherein the step 12) comprises:
step 12-1), calculating the hue mean, saturation mean and brightness mean of the known "determined day" in the image to be recognized, as:
hue mean of the "determined day" = (sum of the hue values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
saturation mean of the "determined day" = (sum of the saturation values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
brightness mean of the "determined day" = (sum of the brightness values of all pixel points in the "determined day") / (number of pixel points in the "determined day");
step 12-2), calculating the hue mean, saturation mean and brightness mean of the known "determined cloud" in the image to be recognized, as:
hue mean of the "determined cloud" = (sum of the hue values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
saturation mean of the "determined cloud" = (sum of the saturation values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
brightness mean of the "determined cloud" = (sum of the brightness values of all pixel points in the "determined cloud") / (number of pixel points in the "determined cloud");
Step 12-3), calculating the difference between any pixel point in the unknown region in the image to be recognized and the determined day and the determined cloud respectively;
the corresponding calculation formula is as follows:
a = sqrt(weighting_H_2*(H - H_cloud)^2 + weighting_S_2*(S - S_cloud)^2 + weighting_V_2*(V - V_cloud)^2);
b = sqrt(weighting_H_2*(H - H_day)^2 + weighting_S_2*(S - S_day)^2 + weighting_V_2*(V - V_day)^2);
wherein a represents the difference between a pixel point in an unknown region of the image to be recognized and the "determined cloud", and b represents its difference from the "determined day"; H, S and V are the hue, saturation and brightness values of that pixel point; H_cloud, S_cloud and V_cloud are the hue mean, saturation mean and brightness mean of the "determined cloud"; H_day, S_day and V_day are the hue mean, saturation mean and brightness mean of the "determined day"; weighting_H_2, weighting_S_2 and weighting_V_2 are the H, S and V channel weighting coefficients used in fuzzy recognition;
step 12-4), judging, from the calculation result of step 12-3), whether the b value of the pixel point in the unknown region is greater than (a + acceptance); if so, the pixel point is regarded as 'cloud', otherwise it is regarded as 'sky'; the parameter acceptance defines the degree of tolerance of the fuzzy state: the smaller the acceptance value, the more of the fuzzy region is identified as cloud;
step 12-5), executing step 12-3) and step 12-4) for all pixel points in the unknown region of the image to be recognized, thereby completing the recognition of the fog-like (fuzzy) clouds.
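The two sketches below illustrate steps 12-1) through 12-5) in Python with NumPy; they are minimal illustrations under stated assumptions, not the patented implementation. The first sketch computes the per-region HSV means of steps 12-1) and 12-2), assuming the image has already been converted to HSV and that boolean masks for the 'determined sky' and 'determined cloud' regions were obtained in the earlier detection steps; the function name region_hsv_means is ours.

    import numpy as np

    def region_hsv_means(hsv, mask):
        """Mean hue, saturation and brightness over a boolean region mask.

        hsv  -- height x width x 3 array in HSV space
        mask -- height x width boolean array marking 'determined sky'
                or 'determined cloud'
        """
        region = hsv[mask].astype(float)   # N x 3 array of the region's pixels
        # Each mean is: sum of channel values over the region / pixel count,
        # exactly as in the formulas of steps 12-1) and 12-2).
        h_mean, s_mean, v_mean = region.mean(axis=0)
        return h_mean, s_mean, v_mean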
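The second sketch covers steps 12-3) to 12-5). Because the formula images are not reproduced in this text, the weighted absolute-difference form below is one plausible reading of the claimed a and b; the default weight and acceptance values are placeholders, not values disclosed by the patent.

    import numpy as np

    def classify_unknown_pixels(hsv, unknown_mask, cloud_means, sky_means,
                                weights=(1.0, 1.0, 1.0), acceptance=0.0):
        """Label every pixel of the unknown region as cloud or sky.

        cloud_means / sky_means -- (H, S, V) mean tuples from region_hsv_means()
        weights    -- (weighting_H_2, weighting_S_2, weighting_V_2); placeholders
        acceptance -- tolerance of the fuzzy state; smaller values push more
                      of the fuzzy region toward 'cloud' (step 12-4)
        Returns a boolean map that is True where a pixel is judged to be cloud.
        """
        w = np.asarray(weights, dtype=float)
        pixels = hsv[unknown_mask].astype(float)              # N x 3
        # Assumed weighted HSV distances to the two reference regions (step 12-3).
        a = np.abs(pixels - np.asarray(cloud_means, dtype=float)).dot(w)
        b = np.abs(pixels - np.asarray(sky_means, dtype=float)).dot(w)
        is_cloud = np.zeros(hsv.shape[:2], dtype=bool)
        # Step 12-4, applied to every unknown pixel at once (step 12-5):
        is_cloud[unknown_mask] = b > (a + acceptance)
        return is_cloud

In this reading, a pixel whose difference from the 'determined sky' exceeds its difference from the 'determined cloud' by more than acceptance is labeled cloud, matching the decision rule of step 12-4).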
CN201910439308.9A 2019-05-24 2019-05-24 Cloud identification method Active CN111985492B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910439308.9A CN111985492B (en) 2019-05-24 2019-05-24 Cloud identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910439308.9A CN111985492B (en) 2019-05-24 2019-05-24 Cloud identification method

Publications (2)

Publication Number Publication Date
CN111985492A true CN111985492A (en) 2020-11-24
CN111985492B CN111985492B (en) 2024-03-26

Family

ID=73436922

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910439308.9A Active CN111985492B (en) 2019-05-24 2019-05-24 Cloud identification method

Country Status (1)

Country Link
CN (1) CN111985492B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005094452A (en) * 2003-09-18 2005-04-07 Nec Corp Method, system, and program for processing image
CN101286233A (en) * 2008-05-19 2008-10-15 重庆邮电大学 Fuzzy edge detection method based on object cloud
CN102750701A (en) * 2012-06-15 2012-10-24 西安电子科技大学 Method for detecting spissatus and spissatus shadow based on Landsat thematic mapper (TM) images and Landsat enhanced thematic mapper (ETM) images
US20160283774A1 (en) * 2012-11-12 2016-09-29 Bae Systems Plc Cloud feature detection
US20150301226A1 (en) * 2014-04-17 2015-10-22 Siemens Aktiengesellschaft Short term cloud coverage prediction using ground-based all sky imaging
CN104463196A (en) * 2014-11-11 2015-03-25 中国人民解放军理工大学 Video-based weather phenomenon recognition method
CN107437241A (en) * 2017-08-09 2017-12-05 哈尔滨工业大学 A kind of dark channel image defogging method of jointing edge detection
CN107563340A (en) * 2017-09-13 2018-01-09 首航节能光热技术股份有限公司 The machine identification method of contained cloud in a kind of image
CN108596849A (en) * 2018-04-23 2018-09-28 南京邮电大学 A kind of single image to the fog method based on sky areas segmentation
CN109191432A (en) * 2018-07-27 2019-01-11 西安电子科技大学 The remote sensing images cloud detection method of optic of filtering multi-resolution decomposition is converted based on domain

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
RADOVAN, A. et al.: "Prediction of HSV color model parameter values of cloud movement picture based on artificial neural networks", 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), no. 41, pp. 1110-1114, XP033368237, DOI: 10.23919/MIPRO.2018.8400202 *
ZHU Guihai; HUANG Qinglun; MENG Yin: "Research on automatic cloud detection methods for multi-source high-resolution remote sensing images", Jiangxi Surveying and Mapping, no. 02, pp. 28-30 *
LI Wei; LI Deren: "Research on a MODIS cloud detection algorithm based on the HSV color space", Journal of Image and Graphics, no. 09, pp. 142-147 *
LI Qian; FAN Yin; ZHANG; LI Baoqiang: "Weather phenomenon recognition method based on outdoor images", Journal of Computer Applications, no. 06, pp. 178-181 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113449668A (en) * 2021-07-08 2021-09-28 杭州迅蚁网络科技有限公司 Target angle identification method and device of flight device

Also Published As

Publication number Publication date
CN111985492B (en) 2024-03-26

Similar Documents

Publication Publication Date Title
CN107895376A (en) Based on the solar panel recognition methods for improving Canny operators and contour area threshold value
CN105956557B (en) A kind of sequential remote sensing image cloud covered areas domain automatic testing method of object-oriented
CN102385753B (en) Illumination-classification-based adaptive image segmentation method
CN109389163B (en) Unmanned aerial vehicle image classification system and method based on topographic map
CN106934418B (en) Insulator infrared diagnosis method based on convolution recursive network
CN107992856B (en) High-resolution remote sensing building shadow detection method under urban scene
CN113537211B (en) Asymmetric IOU-based deep learning license plate frame positioning method
CN102637301B (en) Method for automatically evaluating color quality of image during aerial photography in real time
CN106157323A (en) The insulator division and extracting method that a kind of dynamic division threshold value and block search combine
CN102855627B (en) City remote sensing image shadow detection method based on spectral characteristic and topological relation
CN111339948A (en) Automatic identification method for newly-added buildings of high-resolution remote sensing images
CN109754440A (en) A kind of shadow region detection method based on full convolutional network and average drifting
CN104966291A (en) Cloud cluster automatic detection method based on foundation cloud atlas
CN108711160B (en) Target segmentation method based on HSI (high speed input/output) enhanced model
CN110852207A (en) Blue roof building extraction method based on object-oriented image classification technology
CN110674884A (en) Image identification method based on feature fusion
CN106815602B (en) runway FOD image detection method and device based on multi-level feature description
CN109741337B (en) Region merging watershed color remote sensing image segmentation method based on Lab color space
CN111985492B (en) Cloud identification method
CN110223253B (en) Defogging method based on image enhancement
CN108898080B (en) Ridge line neighborhood evaluation model-based crack connection method
CN116189136A (en) Deep learning-based traffic signal lamp detection method in rainy and snowy weather
CN102982512B (en) Image shadow detection method for Baidu satellite map
CN116109662B (en) Super-pixel segmentation method of infrared image
Du et al. Shadow detection in high-resolution remote sensing image based on improved K-means

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant