CN102902956A - Ground-based visible cloud image recognition processing method - Google Patents

Ground-based visible cloud image recognition processing method

Info

Publication number
CN102902956A
CN102902956A (application CN201210333246A; granted publication CN102902956B)
Authority
CN
China
Prior art keywords
image
cloud
varieties
gray
processing method
Prior art date
Legal status
Granted
Application number
CN2012103332461A
Other languages
Chinese (zh)
Other versions
CN102902956B (en)
Inventor
王敏
周树道
陈晓颖
黄峰
Current Assignee
METEOROLOGICAL COLLEGE UNIV OF TECHNOLOGY PLA
Original Assignee
METEOROLOGICAL COLLEGE UNIV OF TECHNOLOGY PLA
Priority date
Filing date
Publication date
Application filed by METEOROLOGICAL COLLEGE UNIV OF TECHNOLOGY PLA filed Critical METEOROLOGICAL COLLEGE UNIV OF TECHNOLOGY PLA
Priority to CN201210333246.1A priority Critical patent/CN102902956B/en
Publication of CN102902956A publication Critical patent/CN102902956A/en
Application granted granted Critical
Publication of CN102902956B publication Critical patent/CN102902956B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a ground-based visible cloud image recognition processing method which comprises the following steps: acquiring a cloud image, preprocessing the input color image by denoising, enhancement and the like, and separating the sky background from the cloud foreground using transparency based on a perceptual color space; then extracting the transparency values of the cloud foreground image while additionally taking texture features as distinguishing features among different cloud types; and finally classifying the cloud type with a neural network in conjunction with a cloud-type feature database, and storing and displaying the corresponding features and results. The method is suitable for ground-based all-sky visible cloud recognition, overcomes the limitations of manual visual observation, and offers a degree of automatic discrimination. It is easy to implement, simple in structure and low in cost, and identifies and classifies common clouds well; in particular, when the background sky is not required to be especially pure, the method achieves better cloud-sky separation than the common threshold-discrimination method.

Description

Ground-based visible cloud image recognition processing method
Technical field
The present invention relates to the field of digital image processing, and in particular to a ground-based visible cloud image recognition processing method suitable for meteorological analysis at weather stations.
Background technology
Clouds are an important link in the Earth's hydrological cycle; their interaction with terrestrial radiation jointly affects the energy balance on local and global scales. The radiative characteristics and distribution of different cloud types matter greatly for the accuracy of weather forecasts, for the study of global climate change, and for flight support. Because clouds change constantly, ground-based cloud-type observation at home and abroad still relies mainly on human visual inspection, and automatic observation remains at the exploratory stage. Among automatic approaches, cloud-type recognition based on digital image processing is the main route: the widely used blue-to-red gray-scale ratio (or radiance ratio) serves as the basis for segmentation, and the different textures of different clouds serve as the basis for classification. The final discrimination results, however, are not satisfactory, mainly for the following reasons:
First, under low visibility the increase in aerosols weakens the blue component of the sky, so an aerosol-laden sky turns grayish white; if the blue-to-red gray-scale ratio is used to separate cloud from sky, the aerosol haze can be mistaken for cloud.
Second, cloud types are numerous, their forms differ yet can closely resemble one another, and they change continuously and in complex ways, whereas a single image feature captures only one aspect of their appearance; discrimination based on one simple feature alone therefore cannot reach a high recognition rate.
Summary of the invention
To address the low cloud recognition rate of the prior art, the present invention provides a ground-based visible cloud image recognition processing method that is effective and easy to implement and manage. Since clouds are translucent objects and different classes of cloud have different transparency values, the present invention combines traditional texture features with transparency so as to characterize different clouds more fully, thereby significantly improving the recognition rate.
To achieve the above object, the technical solution adopted by the present invention is a ground-based visible cloud image recognition processing method comprising the following steps:
1) acquiring an original image containing the ground-based visible cloud image;
2) transforming the acquired color original image from RGB space to gray space to obtain a gray-level image corresponding to the original image; then filtering the gray-level image to obtain a denoised image, applying a nonlinear gray-scale transform to the denoised image to obtain an enhanced image, and transforming the enhanced image from gray space back to RGB space to obtain a color enhanced image;
3) separating the sky background and the cloud foreground from the color enhanced image using a natural image matting method based on a perceptual color space;
4) extracting a feature set from the separated cloud foreground;
5) performing, according to the data in the extracted feature set, cloud-type discriminant classification with a trained neural network classifier to obtain the cloud-type discrimination result.
To make the recognition result easy to consult and to allow future historical checks, the present invention further comprises step 6): displaying and storing the cloud-type discrimination result and the cloud feature set.
Further, in step 2) of the present invention, the gray-level image is filtered and denoised with an adaptive Wiener filtering method, an immune genetic algorithm is then used to determine the enhancement parameters of the nonlinear gray-scale transform, and the nonlinear gray-scale transform is then applied to enhance the image.
Preferably, in step 4) of the present invention, the feature set comprises the mean, maximum and minimum transparency of the three R, G and B channels in the cloud-foreground color image, together with the second moment, contrast, correlation and entropy of the cloud-foreground gray-scale image computed in the four directions 0°, 45°, 90° and 135° and averaged over these directions. These characteristics serve as the elementary features for cloud-type recognition; combining multiple factors in this way improves the recognition accuracy.
Further, the present invention also comprises collecting the features corresponding to different cloud types as a cloud-type feature database, and using the cloud-type feature database to train the neural network classifier. The feature types in the cloud-type feature database correspond to those in the feature set extracted in step 4); a neural network classifier trained on such a database can recognize the cloud type accurately and rapidly when an unknown cloud image is presented.
Beneficial effect
The present invention separates the cloud image with a transparency matting algorithm and, on the basis of this separation, extracts the characteristic values of the cloud image for cloud-type recognition, which greatly improves the recognition accuracy. The invention is moreover simple in structure and requires only an existing image acquisition device and a computer, providing technical support for the accurate prediction of hazardous weather at weather stations; its storage and display function also facilitates the management of meteorological information.
Description of drawings
Figure 1 is the flow chart of the method according to a specific embodiment of the invention;
Figure 2 is the flow chart of the nonlinear gray-scale transform image enhancement method of the present invention;
Figure 3 is the flow chart of the immune genetic algorithm used in the image enhancement method of Fig. 2;
Figure 4 is a schematic diagram of transparency statistics and segmentation for the matting method based on a perceptual color space;
Figure 5 is a schematic diagram of the structural model of the BP neural network for cloud-type recognition.
Embodiment
To make the content of the present invention clearer, it is further described below with reference to the drawings and specific embodiments.
As shown in Figure 1, the method of the specific embodiment of the invention comprises the following steps:
1) Acquire an image and transform the original image from RGB space to gray space to obtain the gray-level image corresponding to the original image; the conversion between RGB space and gray space can be implemented with existing techniques.
2) Apply adaptive Wiener filtering to the gray-level image obtained in step 1) to obtain a denoised image, and apply a nonlinear gray-scale transform to the denoised image to obtain an enhanced image. Adaptive Wiener filtering is prior art: it adjusts its parameters according to the local variation of the image, smoothing lightly where the local variation is large and strongly where the local variation is small.
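As an illustration, the denoising step can be sketched with SciPy's adaptive Wiener filter; the window size used here (5×5) is an assumption, not a value taken from the patent.

```python
import numpy as np
from scipy.signal import wiener


def denoise_gray(gray: np.ndarray, window: int = 5) -> np.ndarray:
    """Adaptive Wiener filtering of a gray-level image.

    scipy.signal.wiener estimates the local mean and variance in a window
    around each pixel and smooths more where the local variance is small,
    matching the behaviour described above.
    """
    gray = gray.astype(np.float64)
    return wiener(gray, mysize=(window, window))
```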
As shown in Figure 2, the detailed flow of the image enhancement process in the above step is:
2.1) Normalize the denoised image f(i, j) according to formula (1):
$$ g(i,j) = \frac{f(i,j) - \min\big(f(i,j)\big)}{\max\big(f(i,j)\big) - \min\big(f(i,j)\big)} \qquad (1) $$
In formula (1), i and j denote the row and column of the image pixel, f(i, j) is the gray value of the original image at coordinate (i, j), and g(i, j) is its gray value after normalization.
2.2) For the normalized image, use the existing basic immune genetic algorithm shown in Figure 3 to find the optimal transform parameters α and β (0 < α, β < 10). For different images, the immune genetic algorithm automatically searches out different optimal values of α and β. The fitness function of the immune genetic algorithm involves four performance indicators closely related to image quality: the image variance F_ac, the information entropy E, the pixel difference F_br and the noise change amount I_nc. These four indicators drive the immune genetic algorithm toward the optimal parameters, i.e. they are the components of its fitness function; the larger the fitness value, the better the image quality. The fitness function is:
$$ \text{Fitness} = E \cdot I_{nc} \cdot F_{ac} + F_{br} \qquad (2) $$
where the image variance is
$$ F_{ac} = \frac{1}{n} \sum_{x,y} \left(i_{xy} - \bar{i}\right)^2 $$
in which i_xy is the value of pixel (x, y), \bar{i} is the mean pixel value, and n (= M × N) is the total number of image pixels. The larger F_ac is, the better the global contrast of the image.
The information entropy is
$$ E = -\sum_i p_i \log_2 p_i $$
where p_i is the probability of occurrence of the i-th gray level; when p_i = 0, p_i \log_2 p_i is defined to be 0.
The pixel difference F_br is the sum, over the whole image, of the differences between each pixel and the corresponding pixel two pixel-distances away; an image with better contrast has a larger F_br value. The noise change amount I_nc counts, after enhancement, the pixels whose gray-level change h exceeds a given threshold t.
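A sketch of the fitness evaluation of formula (2) is given below, assuming standard definitions for the variance and entropy terms; the exact pixel-difference and noise-change formulas are only described verbally in the text, so the versions used here (difference to the pixel two columns away, and a count of large gray-level changes against a threshold) are assumptions.

```python
import numpy as np


def fitness(original: np.ndarray, enhanced: np.ndarray, t: float = 10.0) -> float:
    """Fitness = E * I_nc * F_ac + F_br, formula (2)."""
    img = enhanced.astype(np.float64)

    # F_ac: image variance (global contrast)
    f_ac = img.var()

    # E: information entropy of the gray-level histogram
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    e = -np.sum(p * np.log2(p))

    # F_br: sum of differences between each pixel and the pixel two
    # pixel-distances away (horizontal offset assumed)
    f_br = np.abs(img[:, :-2] - img[:, 2:]).sum()

    # I_nc: number of pixels whose gray-level change after enhancement
    # exceeds the threshold t (interpretation assumed)
    i_nc = float(np.count_nonzero(np.abs(img - original.astype(np.float64)) > t))

    return e * i_nc * f_ac + f_br
```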
2.3) Using the optimal transform parameters α and β obtained in step 2.2), construct the nonlinear transform function F(u), 0 ≤ u ≤ 1:
$$ F(u) = B^{-1}(\alpha,\beta) \times \int_0^u t^{\alpha-1}(1-t)^{\beta-1}\,dt, \qquad 0 < \alpha, \beta < 10 \qquad (3) $$
where B(α, β) is the Beta function:
$$ B(\alpha,\beta) = \int_0^1 t^{\alpha-1}(1-t)^{\beta-1}\,dt $$
The image after this incomplete Beta-function gray-scale transform is then
$$ g'(x,y) = F\big(g(x,y)\big), \qquad 0 \le g'(x,y) \le 1 \qquad (4) $$
2.4) De-normalize the transformed image to obtain the gray-scale spatial-domain enhanced image:
$$ f'(x,y) = (L_{\max} - L_{\min})\, g'(x,y) + L_{\min} \qquad (5) $$
where f'(x, y) is the enhanced gray value at coordinate (x, y), and L_max and L_min are the maximum and minimum gray values used in the normalization of step 2.1), so that the enhanced image is restored to the original gray-scale range.
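A minimal sketch of steps 2.1)–2.4) follows, assuming the optimal α and β have already been found (e.g. by the immune genetic algorithm); it uses SciPy's regularized incomplete Beta function, which equals F(u) of formula (3). Using the original image's gray range for de-normalization is an assumption about the intent of formula (5).

```python
import numpy as np
from scipy.special import betainc  # regularized incomplete Beta function I_u(a, b)


def beta_enhance(denoised: np.ndarray, alpha: float, beta: float) -> np.ndarray:
    """Nonlinear gray-scale enhancement with the incomplete Beta transform.

    denoised     : gray-level image after Wiener filtering
    alpha, beta  : transform parameters in (0, 10), assumed to come from the
                   immune genetic algorithm search
    """
    f = denoised.astype(np.float64)
    lmin, lmax = f.min(), f.max()

    # 2.1) normalize to [0, 1], formula (1)
    g = (f - lmin) / (lmax - lmin)

    # 2.3) incomplete Beta-function transform, formulas (3)-(4):
    # betainc(a, b, u) = (1 / B(a, b)) * integral_0^u t^(a-1) (1-t)^(b-1) dt
    g_prime = betainc(alpha, beta, g)

    # 2.4) de-normalize back to the original gray range, formula (5)
    return (lmax - lmin) * g_prime + lmin
```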
3) Because a cloud is a translucent object, each pixel inside the target contains a mixture of the foreground and background colors; a transparency matting algorithm can therefore effectively separate foreground from background from the colors contained in individual pixels. The present invention adopts a natural matting method based on transparency in a perceptual color space to separate the single sky background from the cloud foreground, as shown in Figure 4. The concrete steps of the algorithm are:
3.1) Suppose foreground/background separation is to be performed for a pixel P of the image. First find the point f_1 on the foreground contour line nearest to P, and denote the shortest distance by l_F. Then take f_1 as the center and γ_1·l_F as the radius, where γ_1 is a radius coefficient with 1 < γ_1 < 10, to define a circle F. The foreground color F of point P can be computed as the weighted average of the color values of all foreground points of the known region inside circle F, using a spatial Gaussian distribution as the weights so as to strengthen the influence of nearby points. For a point i inside the circle,
$$ \omega_F^i = \frac{1}{2\pi}\, e^{-\frac{\left(\xi_F^i\right)^2}{2\,l_F^2}} \qquad (6) $$
where ξ_F^i denotes the two-dimensional image-space distance between foreground point i in circle F and the point f_1. The foreground color is then
$$ F \approx \frac{1}{\omega} \sum_{i \in N} \omega_F^i F_i \qquad (7) $$
where ω = Σ_{i∈N} ω_F^i, F_i is the color value of foreground point i in circle F, and N is the set of foreground pixels in circle F.
The background color B of point P is computed in the same way.
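The Gaussian-weighted color estimate of formulas (6)–(7) can be sketched as follows; the contour extraction and the trimap (known foreground/background masks) are assumed to be available from earlier processing and are not part of the patent text reproduced here.

```python
import numpy as np


def weighted_color(p, contour_pts, known_mask, colors, gamma1=2.0):
    """Estimate the foreground (or background) color at pixel p.

    p           : (row, col) of the pixel being separated
    contour_pts : array of (row, col) points on the foreground contour
    known_mask  : boolean image, True where the region (foreground or
                  background) is already known
    colors      : H x W x 3 color image
    gamma1      : radius coefficient, 1 < gamma1 < 10 (value assumed)
    Assumes the circle around f1 contains at least one known pixel.
    """
    p = np.asarray(p, dtype=np.float64)

    # nearest contour point f1 and shortest distance l, as in step 3.1)
    d = np.linalg.norm(contour_pts - p, axis=1)
    f1 = contour_pts[np.argmin(d)].astype(np.float64)
    l = max(d.min(), 1e-6)

    # known pixels inside the circle of radius gamma1 * l around f1
    ys, xs = np.nonzero(known_mask)
    pts = np.stack([ys, xs], axis=1).astype(np.float64)
    xi = np.linalg.norm(pts - f1, axis=1)
    inside = xi <= gamma1 * l
    xi, pts = xi[inside], pts[inside]

    # spatial Gaussian weights, formula (6), and weighted mean, formula (7)
    w = np.exp(-(xi ** 2) / (2.0 * l ** 2)) / (2.0 * np.pi)
    c = colors[pts[:, 0].astype(int), pts[:, 1].astype(int)]
    return (w[:, None] * c).sum(axis=0) / w.sum()
```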
3.2) Next compute the chromaticity component α_CH and the luminance component α_IN of point P, and then the transparency value α of the point:
$$ \alpha_{CH} = \frac{(C' - B') \cdot (F' - B')}{\lVert F' - B' \rVert^2} \qquad (8) $$
$$ \alpha_{IN} = \frac{L_C - L_B}{L_F - L_B} \qquad (9) $$
where C', F' and B' are the projections of C, F and B onto the plane determined by (1,0,0), (0,1,0) and (0,0,1) in the RGB color space, and L_C, L_B and L_F are respectively the luminances of the composite color at P, of the background color and of the foreground color. Define
$$ \rho = \min(L_F, L_B) / \max(L_F, L_B) \qquad (10) $$
$$ d = |F'B'| \qquad (11) $$
where d is the distance between F' and B' on the unit plane. The weights of α_CH and α_IN are then respectively
$$ W_{CH} = s\,d^3 + t\,\rho^3, \qquad W_{IN} = u\,d^3 + v\,\rho^3 \qquad (12) $$
where u, v, s and t are constants which, in the perceptual color space, can be taken empirically as u = 1/8000, v = t = 3.0 and s = 8000. The transparency value α of point P is then computed as
$$ \alpha = \frac{W_{CH}\,\alpha_{CH} + W_{IN}\,\alpha_{IN}}{W_{CH} + W_{IN}} \qquad (13) $$
Cloud-image matting is then carried out according to the transparency value α, thereby separating the cloud foreground from the sky background.
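A sketch of the transparency computation of formulas (8)–(13) for one pixel is given below; the constants follow the values quoted above, while the luminance definition and the helper projection are illustrative assumptions only.

```python
import numpy as np

# empirical constants quoted in the text for the perceptual color space
U, V, S, T = 1.0 / 8000.0, 3.0, 8000.0, 3.0


def project_unit_plane(rgb):
    """Map an RGB color onto the plane through (1,0,0), (0,1,0), (0,0,1)
    (central projection; the exact projection is an assumption)."""
    rgb = np.asarray(rgb, dtype=np.float64)
    return rgb / max(rgb.sum(), 1e-6)


def luminance(rgb):
    """Simple luminance; the exact definition is an assumption."""
    return float(np.mean(rgb))


def alpha_at_pixel(c, f, b):
    """Transparency of one pixel from composite c, foreground f, background b."""
    cp, fp, bp = (project_unit_plane(x) for x in (c, f, b))

    # chromaticity alpha, formula (8)
    fb = fp - bp
    a_ch = float(np.dot(cp - bp, fb) / max(np.dot(fb, fb), 1e-12))

    # luminance alpha, formula (9)
    lc, lf, lb = luminance(c), luminance(f), luminance(b)
    a_in = (lc - lb) / (lf - lb + 1e-12)

    # weights, formulas (10)-(12)
    rho = min(lf, lb) / max(max(lf, lb), 1e-12)
    d = float(np.linalg.norm(fb))
    w_ch = S * d ** 3 + T * rho ** 3
    w_in = U * d ** 3 + V * rho ** 3

    # combined transparency, formula (13)
    return (w_ch * a_ch + w_in * a_in) / (w_ch + w_in)
```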
4) Obtain the feature set corresponding to the cloud foreground. First, from all the R, G and B transparency values α_i of the cloud foreground obtained with formula (13), compute the corresponding transparency mean, maximum and minimum:
$$ \alpha_1 = \frac{1}{n}\sum_{i=1}^{n} \alpha_i \qquad (14) $$
$$ \alpha_2 = \max(\alpha_i) \qquad (15) $$
$$ \alpha_3 = \min(\alpha_i) \qquad (16) $$
Then convert the cloud foreground to gray scale and compute, from the resulting gray-scale cloud image, the mean value over the four directions 0°, 45°, 90° and 135° of each of the four features derived from the gray-level co-occurrence matrices in those directions:
Gray-level co-occurrence matrix: starting from a pixel with gray level x at position (i, j), count the probability that a pixel with gray level y occurs simultaneously at distance d in direction θ, i.e. at position (i + d cos θ, j + d sin θ):
$$ P_\theta(x, y) = P\big(f(i,j) = x,\; f(i + d\cos\theta,\, j + d\sin\theta) = y\big) \qquad (17) $$
Second moment, a measure of the non-uniformity of the image distribution:
$$ \sum_x \sum_y P_\theta(x,y)^2 $$
Contrast, which reflects the sharpness of the image:
$$ \sum_x \sum_y (x - y)^2\, P_\theta(x,y) $$
Correlation, which measures the similarity of the elements of the gray-level co-occurrence matrix along the row direction and the column direction:
$$ \frac{\sum_x \sum_y x\,y\,P_\theta(x,y) - \mu_x \mu_y}{\sigma_x \sigma_y} $$
where
$$ \mu_x = \sum_x x \sum_y P_\theta(x,y), \qquad \mu_y = \sum_y y \sum_x P_\theta(x,y), $$
$$ \sigma_x^2 = \sum_x (x-\mu_x)^2 \sum_y P_\theta(x,y), \qquad \sigma_y^2 = \sum_y (y-\mu_y)^2 \sum_x P_\theta(x,y). $$
Entropy, a measure of the amount of information in the image:
$$ -\sum_x \sum_y P_\theta(x,y)\, \log P_\theta(x,y) $$
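The texture part of the feature set can be sketched with scikit-image's gray-level co-occurrence matrix utilities; the gray-level quantization (64 levels) and the distance d = 1 are assumptions, and entropy is computed directly from the normalized matrix because graycoprops does not provide it. Combined with the transparency mean, maximum and minimum of formulas (14)–(16), these values form the feature vector fed to the classifier.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops


def glcm_features(gray_cloud: np.ndarray, levels: int = 64, d: int = 1) -> np.ndarray:
    """Second moment, contrast, correlation and entropy of the cloud
    foreground, averaged over the four directions 0, 45, 90 and 135 degrees."""
    # quantize the gray-scale cloud image to the chosen number of levels
    q = (gray_cloud.astype(np.float64) / 256.0 * levels).astype(np.uint8)

    angles = [0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    p = graycomatrix(q, distances=[d], angles=angles, levels=levels,
                     symmetric=True, normed=True)

    asm = graycoprops(p, "ASM").mean()                # second moment
    contrast = graycoprops(p, "contrast").mean()      # contrast
    correlation = graycoprops(p, "correlation").mean()

    # entropy per direction, then averaged over the four directions
    eps = 1e-12
    entropy = -np.sum(p * np.log2(p + eps), axis=(0, 1)).mean()

    return np.array([asm, contrast, correlation, entropy])
```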
5) Use the trained BP neural network classifier to perform discriminant classification on the newly obtained features:
The neural network model adopts a standard three-layer BP neural network, whose structural model is shown in Figure 5. The number of input-layer nodes of the network is the dimensionality of the selected sample features; the output layer is the recognition/classification layer. The number of hidden nodes is chosen as the geometric mean of the number of input nodes M and the number of output nodes N. The network is trained by adjusting the connection weights between nodes with the error back-propagation method.
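A minimal stand-in for the three-layer BP classifier, using scikit-learn's MLPClassifier as the back-propagation network; the cloud-type feature database (X, y) is assumed to have been collected beforehand, and the hidden-layer size follows the geometric-mean rule stated above. For a new image, clf.predict(features.reshape(1, -1)) then returns the cloud-type label.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier


def train_cloud_classifier(X: np.ndarray, y: np.ndarray) -> MLPClassifier:
    """Train a three-layer back-propagation network on the cloud-type
    feature database.

    X : feature vectors (transparency statistics + GLCM texture features)
    y : cloud-type labels
    """
    m = X.shape[1]                 # input nodes = feature dimensionality
    n = len(np.unique(y))          # output nodes = number of cloud types
    hidden = max(1, int(round(np.sqrt(m * n))))  # geometric-mean rule

    clf = MLPClassifier(hidden_layer_sizes=(hidden,),
                        activation="logistic",
                        solver="sgd",          # plain gradient descent, closest to classic BP
                        learning_rate_init=0.1,
                        max_iter=5000)
    clf.fit(X, y)
    return clf
```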
The present invention also comprises collecting the features corresponding to different cloud types as a cloud-type feature database, and using the cloud-type feature database to train the neural network classifier. The feature types in the cloud-type feature database correspond to those in the feature set extracted in step 4).
With reference to Figure 5, when the BP network is trained, if the input feature vector is extracted from a cloud image of the m-th class, the desired output of the output layer of the BP network is that the m-th neuron outputs 1 and all other neurons output 0, so the output-layer vector can be expressed as
$$ \text{out} = [\,0, 0, \ldots, 1_{(m)}, 0, \ldots, 0\,]^T $$
(the m-th element of the vector is 1).
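The desired output vector can be built as a one-hot encoding; this small snippet only illustrates the training target, and the number of cloud classes is an assumed parameter.

```python
import numpy as np


def target_vector(m: int, num_classes: int) -> np.ndarray:
    """Desired BP-network output for a sample of the m-th cloud class
    (1-based): the m-th element is 1, all others are 0."""
    out = np.zeros(num_classes)
    out[m - 1] = 1.0
    return out
```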
After the recognition result is obtained, the present invention can use the display function of the computer to present the result on a visual interface, so that the staff can learn of it in time and analyze the meteorological situation according to the recognition result in order to forecast the weather accurately. At the same time, the recognition result and the corresponding cloud-type characteristic parameters are stored for later reference.
The above embodiments are intended only to illustrate the technical solution of the present invention and not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions claimed by the present invention.

Claims (5)

1. A ground-based visible cloud image recognition processing method, characterized by comprising the following steps:
1) acquiring an original image containing the ground-based visible cloud image;
2) transforming the acquired color original image from RGB space to gray space to obtain a gray-level image corresponding to the original image; then filtering the gray-level image to obtain a denoised image, applying a nonlinear gray-scale transform to the denoised image to obtain an enhanced image, and transforming the enhanced image from gray space to RGB space to obtain a color enhanced image;
3) separating the sky background and the cloud foreground from the color enhanced image using a natural image matting method based on a perceptual color space;
4) extracting a feature set from the separated cloud foreground;
5) performing, according to the data in the extracted feature set, cloud-type discriminant classification with a trained neural network classifier to obtain the cloud-type discrimination result.
2. The ground-based visible cloud image recognition processing method according to claim 1, characterized by further comprising the step of:
6) displaying and storing the cloud-type discrimination result and the cloud feature set.
3. The ground-based visible cloud image recognition processing method according to claim 1, characterized in that in step 2) the gray-level image is filtered and denoised by an adaptive Wiener filtering method, an immune genetic algorithm is then used to determine the enhancement parameters of the nonlinear gray-scale transform, and the nonlinear gray-scale transform is then applied to the image to obtain the enhanced image.
4. The ground-based visible cloud image recognition processing method according to claim 1, characterized in that in step 4) the feature set comprises the mean, maximum and minimum transparency of the three R, G and B channels in the cloud-foreground color image, together with the second moment, contrast, correlation and entropy of the cloud-foreground gray-scale image computed in the four directions 0°, 45°, 90° and 135° and averaged over these directions.
5. The ground-based visible cloud image recognition processing method according to claim 1, characterized by further comprising collecting the features corresponding to different cloud types as a cloud-type feature database, and using the cloud-type feature database to train the neural network classifier.
CN201210333246.1A 2012-09-10 2012-09-10 Ground-based visible cloud image recognition processing method Active CN102902956B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210333246.1A CN102902956B (en) 2012-09-10 2012-09-10 Ground-based visible cloud image recognition processing method


Publications (2)

Publication Number Publication Date
CN102902956A 2013-01-30
CN102902956B CN102902956B (en) 2016-04-13

Family

ID=47575178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210333246.1A Active CN102902956B (en) 2012-09-10 2012-09-10 Ground-based visible cloud image recognition processing method

Country Status (1)

Country Link
CN (1) CN102902956B (en)



Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727657A (en) * 2008-10-31 2010-06-09 李德毅 Image segmentation method based on cloud model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘瑜: "强对流天气下卫星云图特征体系构建方法的研究", 《中国优秀硕士学位论文全文数据库信息科技辑》, 15 October 2007 (2007-10-15), pages 1 - 49 *
林生佑等: "基于感知颜色空间的自然图像抠图", 《计算机辅助设计与图形学学报》, vol. 17, no. 5, 31 May 2005 (2005-05-31), pages 915 - 919 *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103390274A (en) * 2013-07-19 2013-11-13 电子科技大学 Image segmentation quality evaluation method based on region-related information entropies
CN103699902A (en) * 2013-12-24 2014-04-02 南京信息工程大学 Sorting method of ground-based visible light cloud picture
CN104008402A (en) * 2014-05-12 2014-08-27 南京信息工程大学 Foundation cloud picture recognition method based on improved SOM algorithm
CN104182977A (en) * 2014-08-13 2014-12-03 中国人民解放军理工大学 Wave cloud arranging information extraction method based on cloud block main body framework analysis
CN104182977B (en) * 2014-08-13 2017-02-15 中国人民解放军理工大学 Wave cloud arranging information extraction method based on cloud block main body framework analysis
WO2016062259A1 (en) * 2014-10-22 2016-04-28 华为技术有限公司 Transparency-based matting method and device
CN105590307A (en) * 2014-10-22 2016-05-18 华为技术有限公司 Transparency-based matting method and apparatus
CN105405120A (en) * 2015-10-22 2016-03-16 华北电力大学(保定) Method extracting cloud graph from sky image
CN108369651B (en) * 2015-12-01 2022-08-09 天津瞰天科技有限责任公司 Method, system and non-transitory computer-readable storage medium for extracting sky area
CN108369651A (en) * 2015-12-01 2018-08-03 天青公司 Information extraction is carried out using image data
CN106934426A (en) * 2015-12-29 2017-07-07 三星电子株式会社 The method and apparatus of the neutral net based on picture signal treatment
CN106127725B (en) * 2016-05-16 2019-01-22 北京工业大学 A kind of millimetre-wave radar cloud atlas dividing method based on multiresolution CNN
CN106127725A (en) * 2016-05-16 2016-11-16 北京工业大学 A kind of millimetre-wave radar cloud atlas dividing method based on multiresolution CNN
CN106228522A (en) * 2016-07-27 2016-12-14 合肥高晶光电科技有限公司 A kind of color concentration treatment of CCD color selector
CN108280810A (en) * 2018-01-09 2018-07-13 北方工业大学 Automatic processing method for repairing cloud coverage area of single-time phase optical remote sensing image
CN108280810B (en) * 2018-01-09 2020-08-14 北方工业大学 Automatic processing method for repairing cloud coverage area of single-time phase optical remote sensing image
CN110111342A (en) * 2019-04-30 2019-08-09 贵州民族大学 A kind of optimum option method and device of stingy nomography
CN110363171A (en) * 2019-07-22 2019-10-22 北京百度网讯科技有限公司 The method of the training method and identification sky areas of sky areas prediction model
CN112801857A (en) * 2020-11-30 2021-05-14 泰康保险集团股份有限公司 Image data processing method and device
CN117314741A (en) * 2023-12-01 2023-12-29 成都华栖云科技有限公司 Green screen background matting method, device and equipment and readable storage medium
CN117314741B (en) * 2023-12-01 2024-03-26 成都华栖云科技有限公司 Green screen background matting method, device and equipment and readable storage medium

Also Published As

Publication number Publication date
CN102902956B (en) 2016-04-13

Similar Documents

Publication Publication Date Title
CN102902956B (en) Ground-based visible cloud image recognition processing method
CN108573276B (en) Change detection method based on high-resolution remote sensing image
CN103218831B (en) A kind of video frequency motion target classifying identification method based on profile constraint
CN103049763B (en) Context-constraint-based target identification method
CN106651886A (en) Cloud image segmentation method based on superpixel clustering optimization CNN
CN108647602B (en) A kind of aerial remote sensing images scene classification method determined based on image complexity
CN102646200A (en) Image classifying method and system for self-adaption weight fusion of multiple classifiers
CN107392130A (en) Classification of Multispectral Images method based on threshold adaptive and convolutional neural networks
CN110264484A (en) A kind of improvement island water front segmenting system and dividing method towards remotely-sensed data
CN106296695A (en) Adaptive threshold natural target image based on significance segmentation extraction algorithm
CN109558806A (en) The detection method and system of high score Remote Sensing Imagery Change
CN103246894B (en) A kind of ground cloud atlas recognition methods solving illumination-insensitive problem
CN103984953A (en) Cityscape image semantic segmentation method based on multi-feature fusion and Boosting decision forest
CN110120041A (en) Pavement crack image detecting method
CN102830404B (en) Method for identifying laser imaging radar ground target based on range profile
CN113160062B (en) Infrared image target detection method, device, equipment and storage medium
CN105427309A (en) Multiscale hierarchical processing method for extracting object-oriented high-spatial resolution remote sensing information
CN110309781A (en) Damage remote sensing recognition method in house based on the fusion of multi-scale spectrum texture self-adaption
CN109684922A (en) A kind of recognition methods based on the multi-model of convolutional neural networks to finished product dish
CN110390255A (en) High-speed rail environmental change monitoring method based on various dimensions feature extraction
CN102254174A (en) Method for automatically extracting information of bare area in slumped mass
CN111914611A (en) Urban green space high-resolution remote sensing monitoring method and system
CN106529484A (en) Combined spectrum and laser radar data classification method based on class-fixed multinucleated learning
CN109034184A (en) A kind of grading ring detection recognition method based on deep learning
CN112950780B (en) Intelligent network map generation method and system based on remote sensing image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant