CN101876993B - Method for extracting and retrieving textural features from ground digital nephograms - Google Patents

Method for extracting and retrieving textural features from ground digital nephograms

Info

Publication number
CN101876993B
Authority
CN
China
Prior art keywords
digitized
cloud
ground
pixel
cloud map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2009102385224A
Other languages
Chinese (zh)
Other versions
CN101876993A (en)
Inventor
吕伟涛
李清勇
杨俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jiaotong University
Chinese Academy of Meteorological Sciences CAMS
Original Assignee
Beijing Jiaotong University
Chinese Academy of Meteorological Sciences CAMS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jiaotong University, Chinese Academy of Meteorological Sciences CAMS filed Critical Beijing Jiaotong University
Priority to CN2009102385224A priority Critical patent/CN101876993B/en
Publication of CN101876993A publication Critical patent/CN101876993A/en
Application granted granted Critical
Publication of CN101876993B publication Critical patent/CN101876993B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a method for extracting and retrieving textural features from ground digital nephograms, which comprises an extraction method and a retrieval method. The extraction method comprises the following steps: converting an RGB three-channel ground digital nephogram into a single-channel pixel class map; analyzing the pixel class map and establishing co-occurrence matrices to obtain histogram vectors of the co-occurrence matrices; merging the histogram vectors of the pixel class co-occurrence matrices to construct the texture feature vector of the ground digital nephogram; and storing the texture feature vector in a nephogram database. The retrieval method comprises the following steps: extracting the textural features of a sample nephogram according to the feature extraction method; computing in turn the similarity between the textural features of the sample nephogram and those of every nephogram in the nephogram database; and displaying the most similar nephograms as the retrieval results. The invention can automatically and objectively analyze and extract effective textural features of ground digital nephograms, and automatically retrieve nephograms similar to the sample nephogram from the nephogram database.

Description

Method for extracting and retrieving textural features of ground-based digital cloud images (nephograms)
Technical field
The present invention relates to atmospheric sounding, digital image processing, image retrieval, and pattern recognition, and in particular to a method for analyzing and retrieving textural features of ground-based digital cloud images.
Background art
In the energy budget of the atmosphere, clouds play a highly significant regulating role and are an important factor in climate change. At the same time, the formation, development, and evolution of clouds not only reflect the stability, motion, and moisture conditions of the atmosphere at the time, but are also important indicators for forecasting future weather changes. Cloud observation is therefore of great importance. At present, cloud observation comprises space-based observation (i.e. satellite remote sensing) and ground-based observation. Considerable progress has been made in the automated analysis and processing of satellite cloud images; however, ground-based cloud observation and analysis have long relied on the visual judgment of meteorological observers, which has become a bottleneck in the automation of weather services.
At present, several ground-based all-sky cloud imagers have been developed at home and abroad, such as the Total Sky Imager (TSI) developed by Yankee Environmental Systems, Inc. (USA), the Whole Sky Imager (WSI) developed by the University of California (USA), the All Sky Imager (ASI) developed by the Institute of Atmospheric Physics, Chinese Academy of Sciences, and the Ground-based Total-sky Cloud Imager (TCI) developed by the Chinese Academy of Meteorological Sciences. These devices can automatically capture images of the whole sky and generate color ground-based digital cloud images one by one. Although ground-based digital cloud images can be obtained automatically and cloud cover can largely be computed automatically, the analysis of cloud type in ground-based digital cloud images still depends mainly on manual analysis by experienced observers. Manual analysis clearly has many drawbacks: first, the observer must be thoroughly familiar with complicated observation standards and able to apply them skillfully; second, the observation results are affected by the observer's physiology, psychology, and sense of responsibility, so that even the same observer may produce different descriptions of the same ground cloud image at different times; in addition, staff turnover and lack of continuity in observation also affect accuracy, and different observers often produce different descriptions of the same ground cloud image. Automatic and objective analysis and extraction of effective features of ground-based digital cloud images is therefore of great significance for the automation and intelligence of ground cloud observation. Texture is an important feature of a ground-based digital cloud image and can objectively describe characteristics of the clouds and their sky background. By applying digital image processing and artificial intelligence techniques, a ground-based digital cloud image can be represented by a digital, effective texture feature vector.
In addition, with the development of digital camera technology and its application in ground cloud observation, more and more ground-based digital cloud images are being acquired; over time they accumulate into very large image libraries, often containing tens of thousands of images. In practice, meteorological observers or researchers frequently need to retrieve specific cloud images from a ground cloud image database. Traditionally, there are two ways to do this. The most primitive is to browse the database manually to find the desired image. The other is keyword-based retrieval, in which the database administrator must describe each cloud image in the database with text (keywords) in advance, associate the keywords with the image, and store them in the database; at retrieval time, the user enters keywords and the retrieval system matches them. Obviously, once the database reaches a certain scale, both methods become impractical. Manually browsing a large cloud image database is time-consuming, laborious, and inefficient. Keyword-based retrieval is more efficient, but it requires that every cloud image in the database be associated with correct keywords, and with current technology a machine cannot add correct keywords automatically; they must be added manually by skilled observers with professional knowledge. Adding keywords manually is likewise time-consuming and laborious, the keywords are highly subjective, and inconsistencies are common. Therefore, extracting digital, effective textural features, and retrieving cloud images based on these features, can to a certain extent solve the problems of manual cloud image analysis and retrieval.
Summary of the invention
(1) Object of the invention
The object of the invention is to provide a method for extracting and retrieving textural features of ground-based digital cloud images, which automatically and objectively analyzes and extracts effective textural features of a ground-based digital cloud image, thereby solving the above-mentioned problems of manual cloud image analysis and retrieval.
(2) Summary of the invention
A method for extracting textural features of a ground-based digital cloud image comprises the following steps:
S101: convert the color three-channel RGB ground-based digital cloud image into a single-channel pixel class map using the following formula:
C(x, y) = \begin{cases} 0, & \text{if } I_B(x, y)/I_R(x, y) \ge \alpha_1 \\ 1, & \text{if } \alpha_1 > I_B(x, y)/I_R(x, y) \ge \alpha_2 \\ 2, & \text{if } \alpha_2 > I_B(x, y)/I_R(x, y) \text{ and } I_V(x, y) < \beta \\ 3, & \text{if } \alpha_2 > I_B(x, y)/I_R(x, y) \text{ and } I_V(x, y) \ge \beta \end{cases}
where I_B(x, y) and I_R(x, y) denote respectively the blue (Blue) component value and the red (Red) component value of the pixel at coordinate (x, y) in the input color ground-based digital cloud image, read directly from the color cloud image file; I_V(x, y) denotes the brightness value of the pixel at coordinate (x, y) in the input color ground-based digital cloud image; C(x, y) denotes the class label of the pixel at coordinate (x, y); α_1 and α_2 are threshold parameters on the blue-to-red band ratio; and β is a brightness threshold parameter.
S102: analyze the pixel class map and establish co-occurrence matrices to obtain the histogram vectors of the co-occurrence matrices;
S103: merge the histogram vectors of the multiple pixel class co-occurrence matrices to construct the texture feature vector of the ground-based digital cloud image;
S104: save the texture feature vector constructed in S103 into the cloud image database.
Wherein, the value of I_V(x, y) in step S101 is computed from the blue (Blue), red (Red), and green (Green) component values of the pixel at coordinate (x, y) in the input color ground-based digital cloud image by the formula I_V(x, y) = 100 × Max(I_R(x, y), I_G(x, y), I_B(x, y)) / 255, where I_G(x, y) is the green (Green) component value of the pixel at coordinate (x, y).
Wherein, α_1 takes the value 1.5, α_2 takes the value 1.3, and β takes the value 80.
Wherein, step S102 comprises the sub-steps:
S1021: analyze the co-occurrence between any two pixel classes in the pixel class map and construct the co-occurrence matrix CCM:

CCM(i, j) = \sum_{x=1}^{w} \sum_{y=1}^{h} \begin{cases} 1, & \text{if } C(x, y) = i \text{ and } C(x + \Delta x, y + \Delta y) = j \\ 0, & \text{otherwise} \end{cases}

where i and j denote pixel classes taking values in {0, 1, 2, 3}; (Δx, Δy) is the offset; w and h denote respectively the width and height of the ground-based digital cloud image; and CCM(i, j) is the frequency with which the pixel class at (x, y) in the pixel class map is i while the pixel class at position (x + Δx, y + Δy) is simultaneously j. Computing each CCM(i, j) yields a 4 × 4 co-occurrence matrix;
S1022: normalize the co-occurrence matrix CCM(i, j) by the following normalization formula to obtain the normalized co-occurrence matrix CCM_N:

CCM_N(i, j) = CCM(i, j) / (w·h);
S1023: concatenate the elements of the normalized co-occurrence matrix CCM_N row by row according to the following formula to obtain a 16-dimensional histogram vector F_S:

F_S = (CCM_N(0,0), CCM_N(0,1), ..., CCM_N(3,3)).
Wherein, step S103 comprises the sub-steps:
S1031: there are L position offsets for the co-occurrence of two pixel classes, namely (Δx_1, Δy_1), (Δx_2, Δy_2), ..., (Δx_{L-1}, Δy_{L-1}), (Δx_L, Δy_L); step S102 yields L different co-occurrence matrices and L corresponding histogram vectors F_S^1, F_S^2, ..., F_S^L;
S1032: linearly superpose and average the L histogram vectors F_S^1, ..., F_S^L according to the formula

F = \sum_{k=1}^{L} F_S^k / L
to obtain the 16-dimensional texture feature vector of the ground-based digital cloud image.
A method for retrieving ground-based digital cloud images by textural features comprises the following steps:
S201: compute the texture feature vector of the sample digital cloud image by steps S101, S102, and S103 above;
S202: compute the similarity between the texture feature vector of the sample digital cloud image and the texture feature vector of each cloud image in the cloud image database according to the following formula:

D(F^1, F^2) = \sum_{k=1}^{16} \frac{\min(F_k^1, F_k^2)}{F_k^1}

where F^1 and F^2 denote the feature vectors of two ground-based digital cloud images; the larger the value of D, the more similar the two ground cloud images, and conversely, the smaller the value of D, the more different they are;
S203: sort by the value of D in descending order and select the M images with the largest similarity as the retrieval result, where M is an arbitrary integer greater than 0 and less than the total number of cloud images in the cloud image database;
S204: display the retrieved cloud images.
Wherein, a step of selecting a sample digital cloud image is included before step S201.
(3) Beneficial effects
The method for extracting and retrieving textural features of ground-based digital cloud images proposed by the present invention has the following beneficial effects:
(1) In the method of the invention, the features of a ground-based digital cloud image are described by an objective 16-dimensional numeric vector rather than by a subjective textual description, and the feature extraction can be carried out automatically by a computer, which greatly improves the efficiency of ground cloud image analysis and management. The texture feature vector implicitly captures the color, texture, and structural characteristics of the cloud image, and provides a quantitative basis for tasks such as automatic analysis, recognition, and retrieval of cloud images.
(2) In the method of the invention, the back-end cloud image database management for retrieval is fully automated and requires no manual work. The database administrator only needs to specify the set of cloud images, or the directory containing them, to be imported; the invention analyzes the textural features of the cloud images automatically and saves them to the database for the front-end retrieval to call. In addition, the invention provides a novel query-by-example retrieval mode: the user only needs to specify a sample cloud image, and the invention automatically retrieves similar cloud images from the cloud image database. Finally, the texture-based retrieval method of the invention performs well and achieves high retrieval accuracy.
(3) The pixel class map of the method classifies pixels according to the blue-to-red band ratio and the pixel brightness; the classification is simple and efficient, and the number of classes is small.
(4) The co-occurrence matrix of the method is computed on the pixel class map. Compared with the traditional gray-level-based approach, the co-occurrence of pixel pairs carries richer and more direct meaning: it characterizes the co-occurrence, under different positional relations, of cloud pixels with cloud pixels, sky pixels with sky pixels, and cloud pixels with sky pixels. Moreover, the co-occurrence matrix of the invention has a very low dimension (4 × 4), which significantly reduces storage overhead.
(5) The co-occurrence matrix features of the method are expressed as a histogram, which avoids the computation of high-order statistics required by classical methods. The conversion is simple, and the meaning of each element of the histogram vector is explicit: it represents the frequency of occurrence of a specific pixel-class pair in the ground cloud image, and this frequency feature discriminates well in image retrieval.
(6) When merging multiple histogram vectors, the method adopts an averaging merge strategy; this strategy is efficient to execute, avoids losing any single histogram, and keeps the dimension of the final texture feature vector small.
Description of drawings
Fig. 1 is a flow block diagram of the method for extracting and retrieving textural features of ground-based digital cloud images;
Fig. 2 is a schematic diagram of the conversion of a color ground-based digital cloud image into a pixel class map;
Fig. 3 shows two ground-based digital cloud images and their corresponding texture feature vectors;
Fig. 4 is the interface of a texture-based cloud image retrieval demonstration system.
Embodiment
The method for extracting and retrieving textural features of ground-based digital cloud images proposed by the present invention is described below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, the method comprises an import process (i.e. the textural feature extraction process) and a retrieval process.
The steps of texture feature extraction are as follows:
First, a digital cloud image is acquired. The state of the sky is usually captured automatically by an all-sky imager, which generates the digital cloud image; digital cloud images can also be collected by electronic devices such as digital cameras or digital video cameras.
The digital cloud image acquired at this point is an RGB three-channel image. A digital cloud image usually contains two parts, cloud and sky, and the brightness of the clouds differs considerably under different weather conditions. The task of step S101 is to classify the pixels of the digital cloud image, replace the RGB values of each pixel of the original image with a class label value, and finally obtain the pixel class map, i.e. to convert the color three-channel ground-based digital cloud image into a single-channel pixel class map. The pixel class map is a simplification: it discards some of the less important information of the original RGB three-channel image while keeping its essential content.
For the pixel at position (x, y) of the original cloud image I, the color ground-based digital cloud image file is first read to obtain I_B(x, y) and I_R(x, y), and the ratio between the blue (B) channel value and the red (R) channel value, I_B(x, y)/I_R(x, y), is computed. If the ratio is greater than or equal to 1.5, the pixel is a sky pixel and its class label is C(x, y) = 0; if the ratio lies between 1.3 and 1.5, the pixel is a transition pixel between cloud and sky and C(x, y) = 1; if the ratio is less than 1.3, the pixel is a cloud pixel, and cloud pixels are further divided into dark cloud and bright cloud according to the brightness of the original pixel: the brightness value I_V(x, y) is computed from the blue (Blue), red (Red), and green (Green) component values of the pixel at coordinate (x, y) in the input color ground-based digital cloud image by the formula I_V(x, y) = 100 × Max(I_R(x, y), I_G(x, y), I_B(x, y)) / 255; if I_V(x, y) < 80, the pixel is a dark cloud pixel and C(x, y) = 2, otherwise it is a bright cloud pixel and C(x, y) = 3. That is, α_1 = 1.5, α_2 = 1.3, and β = 80. The choice of these three thresholds is related to the acquisition parameters used when the color ground-based digital cloud image was taken: for cloud images with appropriate exposure and correct color reproduction, the above thresholds can be used directly; for cloud images with abnormal color or with too long or too short an exposure time (images that are too bright or too dark), the cloud image should first be pre-processed to correct its brightness and color before the above thresholds are applied; if no image correction is carried out, the three thresholds should be adjusted accordingly when performing this step so that the image features can be extracted accurately and reasonably. The specific computing formula is as follows:
C(x, y) = \begin{cases} 0, & \text{if } I_B(x, y)/I_R(x, y) \ge 1.5 \\ 1, & \text{if } 1.5 > I_B(x, y)/I_R(x, y) \ge 1.3 \\ 2, & \text{if } 1.3 > I_B(x, y)/I_R(x, y) \text{ and } I_V(x, y) < 80 \\ 3, & \text{if } 1.3 > I_B(x, y)/I_R(x, y) \text{ and } I_V(x, y) \ge 80 \end{cases}
According to the above formula, the class of each pixel of the cloud image I is computed in a loop from left to right and from top to bottom, and the class label values are saved in the pixel class map C. Fig. 2(a) shows an original ground-based digital cloud image, and Fig. 2(b) shows the pixel class map generated by the above steps. For convenience of display, in Fig. 2(b) dark blue represents class 0, red represents class 1, white represents class 2, and gray represents class 3.
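The per-pixel loop described above can also be vectorized. The following is a minimal Python/NumPy sketch of step S101 (not part of the patent text), assuming the image is an H × W × 3 array in R, G, B channel order with 8-bit values and using the thresholds α_1 = 1.5, α_2 = 1.3, β = 80 given above; the function and variable names are illustrative only.

```python
import numpy as np

def classify_pixels(rgb, alpha1=1.5, alpha2=1.3, beta=80.0):
    """Convert an H x W x 3 uint8 RGB cloud image into the single-channel
    pixel class map C of step S101: 0 = sky, 1 = cloud/sky transition,
    2 = dark cloud, 3 = bright cloud."""
    img = rgb.astype(np.float64)
    I_R, I_G, I_B = img[..., 0], img[..., 1], img[..., 2]

    # Blue-to-red band ratio; guard against division by zero.
    ratio = I_B / np.maximum(I_R, 1e-6)

    # Brightness I_V = 100 * Max(R, G, B) / 255.
    I_V = 100.0 * img.max(axis=-1) / 255.0

    C = np.empty(ratio.shape, dtype=np.uint8)
    C[ratio >= alpha1] = 0                          # sky
    C[(ratio < alpha1) & (ratio >= alpha2)] = 1     # cloud/sky transition
    C[(ratio < alpha2) & (I_V < beta)] = 2          # dark cloud
    C[(ratio < alpha2) & (I_V >= beta)] = 3         # bright cloud
    return C
```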
Step S102 analyzes the pixel class map and establishes co-occurrence matrices to obtain the histogram vectors of the co-occurrence matrices. Obviously, pixels of different classes arranged in different spatial positions yield different cloud images: for example, a large number of adjacent sky pixels form a clear-sky region, while a large number of adjacent cloud pixels may form a patch of cloud. The class co-occurrence matrix is the result of counting, for pixel pairs satisfying a particular spatial position relation (referred to as position pairs), the pixel classes to which the two pixels respectively belong. The class co-occurrence matrix analysis comprises three sub-steps: class co-occurrence matrix generation, co-occurrence matrix normalization, and class co-occurrence matrix feature representation.
The co-occurrence between any two pixel classes in the pixel class map is analyzed and the co-occurrence matrix CCM is constructed:

CCM(i, j) = \sum_{x=1}^{w} \sum_{y=1}^{h} \begin{cases} 1, & \text{if } C(x, y) = i \text{ and } C(x + \Delta x, y + \Delta y) = j \\ 0, & \text{otherwise} \end{cases}

where i and j denote pixel classes taking values in {0, 1, 2, 3}; (Δx, Δy) is the offset; w and h denote respectively the width and height of the ground-based digital cloud image; and CCM(i, j) is the frequency with which the pixel class at (x, y) in the pixel class map is i while the pixel class at position (x + Δx, y + Δy) is simultaneously j. Computing each CCM(i, j) yields a 4 × 4 co-occurrence matrix.
The co-occurrence matrix CCM(i, j) is normalized by the following formula to obtain the normalized co-occurrence matrix CCM_N:

CCM_N(i, j) = CCM(i, j) / (w·h).

The elements of the normalized co-occurrence matrix CCM_N are then concatenated row by row according to the following formula to obtain a 16-dimensional histogram vector F_S:

F_S = (CCM_N(0,0), CCM_N(0,1), ..., CCM_N(3,3)).

The histogram vector F_S then serves as the feature representation of the class co-occurrence matrix for the position relation (Δx, Δy).
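As an illustration of sub-steps S1021–S1023 (a sketch under the same assumptions as the previous snippet, not the patent's own implementation), the class co-occurrence matrix for a single offset (Δx, Δy) can be computed and flattened into the 16-dimensional histogram F_S as follows; pixel pairs whose offset neighbour falls outside the image are simply skipped:

```python
import numpy as np

def class_cooccurrence_histogram(C, dx, dy, n_classes=4):
    """Build the 4 x 4 class co-occurrence matrix CCM for offset (dx, dy)
    (S1021), normalize it by w*h (S1022), and flatten it row by row into
    the 16-dimensional histogram vector F_S (S1023)."""
    h, w = C.shape                      # C is indexed as C[y, x]
    ccm = np.zeros((n_classes, n_classes), dtype=np.float64)

    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                ccm[C[y, x], C[y2, x2]] += 1   # count the class pair (i, j)

    ccm_n = ccm / (w * h)               # normalized co-occurrence matrix CCM_N
    return ccm_n.reshape(-1)            # row-by-row concatenation -> F_S
```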
Step S103 merges the histogram vectors of multiple pixel class co-occurrence matrices to construct the texture feature vector of the ground-based digital cloud image. A single position relation is usually not sufficient to describe the textural features of a ground-based digital cloud image; therefore, several position relations are constructed, and the corresponding class co-occurrence matrices and their feature representations are generated. Four position relations are used here, namely {(1,0), (0,1), (-1,0), (0,-1)}. For each position relation, step S102 is run once, yielding 4 different co-occurrence matrices and 4 corresponding histogram vectors F_S^1, F_S^2, F_S^3, F_S^4. These four histogram vectors are then linearly superposed and averaged, and finally merged into a single 16-dimensional vector according to the formula:

F = \sum_{k=1}^{4} F_S^k / 4.
The merged 16-dimensional histogram vector F then constitutes the texture feature vector of the ground-based digital cloud image. Fig. 3(a) shows a fractocumulus digital cloud image and Fig. 3(b) its corresponding feature vector (displayed as a histogram); Fig. 3(c) shows a cumulus humilis digital cloud image and Fig. 3(d) its corresponding feature vector (displayed as a histogram).
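Continuing the same sketch (and reusing the class_cooccurrence_histogram helper above), step S103 with the four offsets {(1,0), (0,1), (-1,0), (0,-1)} reduces to averaging four 16-dimensional histograms:

```python
import numpy as np

def texture_feature_vector(C, offsets=((1, 0), (0, 1), (-1, 0), (0, -1))):
    """Merge the per-offset histograms into the final 16-dim texture
    feature vector F = (1/L) * sum_k F_S^k (sub-steps S1031-S1032)."""
    hists = [class_cooccurrence_histogram(C, dx, dy) for dx, dy in offsets]
    return np.mean(hists, axis=0)
```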
Step S104 saves the texture feature vector constructed in step S103 into the cloud image database for use in retrieval. For example, the feature vector can be written into the database through the ADO interface of SQL Server 2000; the implementation can use a mainstream database management system (such as SQL Server or Oracle) and a mainstream programming language (C++, Java).
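Purely as an illustration of step S104 (the patent itself names SQL Server/Oracle via the ADO interface with C++ or Java), a feature vector could be stored and reloaded using Python's built-in sqlite3 module as a stand-in database; the table name and schema below are our own assumptions:

```python
import sqlite3
import numpy as np

def save_feature(db_path, image_path, feature):
    """Store one 16-dim texture feature vector for a cloud image (step S104),
    serialized here as a comma-separated string for simplicity."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS cloud_features "
                 "(image_path TEXT PRIMARY KEY, feature TEXT)")
    conn.execute("INSERT OR REPLACE INTO cloud_features VALUES (?, ?)",
                 (image_path, ",".join(f"{v:.8f}" for v in np.asarray(feature))))
    conn.commit()
    conn.close()

def load_features(db_path):
    """Load all (image_path, feature_vector) pairs for use in retrieval."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute("SELECT image_path, feature FROM cloud_features").fetchall()
    conn.close()
    return [(p, np.array([float(v) for v in f.split(",")])) for p, f in rows]
```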
If multiple ground-based digital cloud images are to be imported, steps S101–S104 are repeated until all the digital cloud images have been processed.
The steps for retrieving a cloud image from the cloud image database are as follows:
First, the sample digital cloud image to be retrieved is selected. After the sample image is selected, its texture feature vector is computed (step S201); this can be done by steps S101, S102, and S103 of the texture feature extraction method.
Step S202 is the similarity measurement between the sample digital cloud image to be retrieved and the cloud images in the database, i.e. computing the similarity between the texture feature vector of the sample image and the texture feature vector of each cloud image in the database. The similarity distance between the feature vector F^e of the sample cloud image and the feature vector F^i of each cloud image in the database, i = 1, ..., N, where N is the number of cloud images in the database, is computed in turn. The similarity distance between feature vectors is expressed by the cross distance, computed as follows:
D(F^e, F^i) = \sum_{k=1}^{16} \frac{\min(F_k^e, F_k^i)}{F_k^e}
Step S203 sorts the cloud images in the database in descending order of the similarity distance D(F^e, F^i). The first M images with the largest similarity are then selected as the retrieval result, where M can be specified by the user as any positive integer greater than 0 and less than the total number N of cloud images in the database.
Step S204 presents the M retrieved cloud images to the user through a graphical user interface; Fig. 4 shows an example of the display interface of a cloud image retrieval system.
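A minimal sketch of the retrieval steps S202–S203 (the names and the in-memory feature list are our own; in practice the features would come from the cloud image database populated in step S104): compute the cross distance against every stored vector and return the M most similar images.

```python
import numpy as np

def cross_distance(f_query, f_db, eps=1e-12):
    """Cross distance D(F^e, F^i) = sum_k min(F_k^e, F_k^i) / F_k^e.
    Larger values mean more similar cloud images."""
    f_query = np.asarray(f_query, dtype=np.float64)
    f_db = np.asarray(f_db, dtype=np.float64)
    return float(np.sum(np.minimum(f_query, f_db) / (f_query + eps)))

def retrieve_similar(f_query, database, M=10):
    """database: list of (image_id, feature_vector) pairs.  Returns the M
    image ids most similar to the query feature vector (steps S202-S203)."""
    scored = [(img_id, cross_distance(f_query, f)) for img_id, f in database]
    scored.sort(key=lambda item: item[1], reverse=True)   # descending similarity
    return scored[:M]
```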
Application of the invention to cloud image management and retrieval on a single computer:
The invention is particularly suitable for managing large collections of cloud images and for providing a convenient retrieval interface. Suppose a meteorologist, Professor Zhang, has accumulated tens of thousands of ground-based digital cloud images in long-term meteorological observation and research, stored in one or several directories on the hard disk of his PC. When he obtains a new cloud image, he needs to find similar images from his collection of typical cloud images for comparative study, but faced with tens of thousands of digital cloud images on the computer he cannot find similar images quickly and effectively. Using the invention, he can quickly extract the textural features of the tens of thousands of digital cloud images and import them into a database, a process that only needs to be done once; when he obtains a new cloud image, he can retrieve the most similar cloud images from the database within seconds using the retrieval method of the invention.
Application of the invention to an online cloud image retrieval service:
A meteorological administrative authority (such as a meteorological bureau) or research institution (such as a meteorological research institute) builds a large-scale ground cloud image database, extracts the textural features of the cloud images using the invention, and imports them into the database; at the same time it provides an Internet-based cloud image retrieval service and publishes the web address for accessing it. When an observer has taken a cloud image and wants to retrieve similar images from the database for comparative study, he can log on to the website, upload his own cloud image, and perform the retrieval online; the system returns the similar cloud images in the database within seconds.
Application of the invention in an integrated ground-based digital cloud image management system for automatic weather stations:
The all-sky cloud imager of a ground automatic weather station captures the state of the sky and generates digital cloud images. The digital cloud images are then sent to a remote host over a transmission channel. The host runs the texture feature extraction algorithm of the invention, extracts the textural features of each cloud image, and imports the cloud image and its feature vector into the database. The host can also publish a cloud image retrieval service interface, so that users can retrieve cloud images from the weather station at any time. In this way, the collection, analysis, management, and retrieval of ground-based digital cloud images can be carried out automatically without human intervention, forming an intelligent, automated, integrated management system for ground-based digital cloud images.
The above embodiments are only intended to illustrate the present invention and not to limit it. Those of ordinary skill in the relevant art can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, all equivalent technical solutions also fall within the scope of the present invention, and the scope of patent protection of the present invention shall be defined by the claims.

Claims (7)

1. A method for extracting textural features of a ground-based digital cloud image, characterized in that it comprises the following steps:
S101: converting the color three-channel RGB ground-based digital cloud image into a single-channel pixel class map using the following formula:
C(x, y) = \begin{cases} 0, & \text{if } I_B(x, y)/I_R(x, y) \ge \alpha_1 \\ 1, & \text{if } \alpha_1 > I_B(x, y)/I_R(x, y) \ge \alpha_2 \\ 2, & \text{if } \alpha_2 > I_B(x, y)/I_R(x, y) \text{ and } I_V(x, y) < \beta \\ 3, & \text{if } \alpha_2 > I_B(x, y)/I_R(x, y) \text{ and } I_V(x, y) \ge \beta \end{cases}
where I_B(x, y) and I_R(x, y) denote respectively the blue component value and the red component value of the pixel at coordinate (x, y) in the input color ground-based digital cloud image, read directly from the color cloud image file; I_V(x, y) denotes the brightness value of the pixel at coordinate (x, y) in the input color ground-based digital cloud image; C(x, y) denotes the class label of the pixel at coordinate (x, y); α_1 and α_2 are threshold parameters on the blue-to-red band ratio; and β is a brightness threshold parameter;
S102: analyzing the pixel class map and establishing co-occurrence matrices to obtain the histogram vectors of the co-occurrence matrices;
S103: merging the histogram vectors of the multiple pixel class co-occurrence matrices to construct the texture feature vector of the ground-based digital cloud image;
S104: saving the texture feature vector constructed in S103 into the cloud image database.
2. The method for extracting textural features of a ground-based digital cloud image according to claim 1, characterized in that the value of I_V(x, y) in step S101 is computed from the blue component value, red component value, and green component value of the pixel at coordinate (x, y) in the input color ground-based digital cloud image by the formula I_V(x, y) = 100 × Max(I_R(x, y), I_G(x, y), I_B(x, y)) / 255, where I_G(x, y) is the green component value of the pixel at coordinate (x, y).
3. The method for extracting textural features of a ground-based digital cloud image according to claim 1 or 2, characterized in that α_1 takes the value 1.5, α_2 takes the value 1.3, and β takes the value 80.
4. The method for extracting textural features of a ground-based digital cloud image according to claim 1, characterized in that step S102 comprises the sub-steps:
S1021: analyzing the co-occurrence between any two pixel classes in the pixel class map and constructing the co-occurrence matrix CCM:

CCM(i, j) = \sum_{x=1}^{w} \sum_{y=1}^{h} \begin{cases} 1, & \text{if } C(x, y) = i \text{ and } C(x + \Delta x, y + \Delta y) = j \\ 0, & \text{otherwise} \end{cases}

where i and j denote pixel classes taking values in {0, 1, 2, 3}; (Δx, Δy) is the offset; w and h denote respectively the width and height of the ground-based digital cloud image; and CCM(i, j) is the frequency with which the pixel class at (x, y) in the pixel class map is i while the pixel class at position (x + Δx, y + Δy) is simultaneously j; computing each CCM(i, j) yields a 4 × 4 co-occurrence matrix;
S1022: normalizing the co-occurrence matrix CCM(i, j) by the following normalization formula to obtain the normalized co-occurrence matrix CCM_N:

CCM_N(i, j) = CCM(i, j) / (w·h);
S1023: concatenating the elements of the normalized co-occurrence matrix CCM_N row by row according to the following formula to obtain a 16-dimensional histogram vector F_S:

F_S = (CCM_N(0,0), CCM_N(0,1), ..., CCM_N(3,3)).
5. The method for extracting textural features of a ground-based digital cloud image according to claim 1, characterized in that step S103 comprises the sub-steps:
S1031: there being L position offsets for the co-occurrence of two pixel classes, namely (Δx_1, Δy_1), (Δx_2, Δy_2), ..., (Δx_{L-1}, Δy_{L-1}), (Δx_L, Δy_L), obtaining by step S102 L different co-occurrence matrices and L corresponding histogram vectors F_S^1, F_S^2, ..., F_S^L;
S1032: linearly superposing and averaging the L histogram vectors F_S^1, ..., F_S^L according to the formula

F = \sum_{k=1}^{L} F_S^k / L
to obtain the 16-dimensional texture feature vector of the ground-based digital cloud image.
6. A method for retrieving ground-based digital cloud images by textural features, characterized in that it comprises the following steps:
S201: computing the texture feature vector of the sample digital cloud image by steps S101, S102, and S103 of claim 1;
S202: computing the similarity between the texture feature vector of the sample digital cloud image and the texture feature vector of each cloud image in the cloud image database according to the following formula:

D(F^1, F^2) = \sum_{k=1}^{16} \frac{\min(F_k^1, F_k^2)}{F_k^1}

where F^1 and F^2 denote the feature vectors of two ground-based digital cloud images; the larger the value of D, the more similar the two ground cloud images, and conversely, the smaller the value of D, the more different they are;
S203: sorting by the value of D in descending order and selecting the M images with the largest similarity as the retrieval result, where M is an arbitrary integer greater than 0 and less than the total number of cloud images in the cloud image database;
S204: displaying the retrieved cloud images.
7. The method for retrieving ground-based digital cloud images by textural features according to claim 6, characterized in that a step of selecting a sample digital cloud image is included before step S201.
CN2009102385224A 2009-11-26 2009-11-26 Method for extracting and retrieving textural features from ground digital nephograms Expired - Fee Related CN101876993B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009102385224A CN101876993B (en) 2009-11-26 2009-11-26 Method for extracting and retrieving textural features from ground digital nephograms

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009102385224A CN101876993B (en) 2009-11-26 2009-11-26 Method for extracting and retrieving textural features from ground digital nephograms

Publications (2)

Publication Number Publication Date
CN101876993A CN101876993A (en) 2010-11-03
CN101876993B true CN101876993B (en) 2011-12-14

Family

ID=43019551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009102385224A Expired - Fee Related CN101876993B (en) 2009-11-26 2009-11-26 Method for extracting and retrieving textural features from ground digital nephograms

Country Status (1)

Country Link
CN (1) CN101876993B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102339388B (en) * 2011-06-27 2012-12-19 华中科技大学 Method for identifying classification of image-based ground state
CN103413148B (en) * 2013-08-30 2017-05-24 中国科学院自动化研究所 Ground-based cloud image classifying method based on random self-adaptive symbol sparse codes
CN105783861B (en) * 2014-12-22 2018-08-28 国家电网公司 Cloud cluster height measurement method based on double ground cloud atlas
CN110806582A (en) * 2019-11-06 2020-02-18 上海眼控科技股份有限公司 Method, device and equipment for evaluating accuracy of cloud image prediction and storage medium
CN111046911A (en) * 2019-11-13 2020-04-21 泰康保险集团股份有限公司 Image processing method and device


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4897881A (en) * 1988-03-23 1990-01-30 Centre De Recherche Industrielle Du Quebec Optimum fast textural feature extractor
CN1945353A (en) * 2006-10-26 2007-04-11 国家卫星气象中心 Method for processing meteorological satellite remote sensing cloud chart

Also Published As

Publication number Publication date
CN101876993A (en) 2010-11-03


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111214

Termination date: 20151126