CN101587189A - Texture primitive feature extraction method for synthetic aperture radar images - Google Patents

Texture primitive feature extraction method for synthetic aperture radar images

Info

Publication number
CN101587189A
Authority
CN
China
Prior art keywords
image
pixel
texture
estimate vector
texture primitive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2009100626546A
Other languages
Chinese (zh)
Other versions
CN101587189B (en)
Inventor
何楚
魏喜燕
陶珮
孙洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN2009100626546A priority Critical patent/CN101587189B/en
Publication of CN101587189A publication Critical patent/CN101587189A/en
Application granted granted Critical
Publication of CN101587189B publication Critical patent/CN101587189B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention relates to a texture primitive feature extraction method for synthetic aperture radar images, comprising: obtaining the estimate vector corresponding to each pixel of every image in a training set and combining the estimate vectors of all pixels of all training images; clustering these estimate vectors to obtain the texture primitive codebook, i.e. the texture primitives; for an image whose features are to be extracted, obtaining the estimate vector corresponding to each pixel, combining the estimate vectors of all pixels in the image, and labeling them with the texture primitive codebook; and counting the image labels to obtain the label histogram, which is the statistical texture primitive feature of the image. The invention fuses the distribution, spatial and scale characteristics of the image to improve the classification accuracy of synthetic aperture radar images, and thereby improves the accuracy of image processing applications, such as classification and segmentation, based on texture primitive features.

Description

Texture primitive feature extraction method for synthetic aperture radar images
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a texture primitive feature extraction method for synthetic aperture radar (SAR) images.
Background technology
In computer vision and image processing, variations in image brightness form certain specific repeating patterns, referred to as texture. Texture is a ubiquitous visual phenomenon: people can perceive texture, yet it is difficult to form a unified, precise definition of it, and no widely accepted definition exists at present. It is generally held that texture is a spatial variation or repetition of image gray level or color. The gray-level distribution of a texture image usually exhibits some regularity, and for random textures it also exhibits features in a statistical sense. The following points about texture are currently agreed upon:
A) texture exhibits local sequentiality that repeats constantly over a larger region;
B) texture has basic constituent units that give rise to visual perception, i.e. texture primitives;
C) texture cannot be treated as a point process; it manifests as a regional characteristic;
D) the parts of a textured region are roughly uniform entities of roughly the same size;
E) texture has attributes such as intensity, density, direction and roughness.
Based on this consensus, texture is considered to have two key elements: (1) texture has elementary units that give rise to visual perception, i.e. texture primitives, which take various forms and appear as certain color or gray-level patterns; (2) texture primitives follow certain arrangement rules, which may be regular or random [Liu Xiaoming, A survey of texture research. Application Research of Computers, 2008, Vol. 25, No. 8].
Texture analysis is one of the main topics of texture research and an important field in machine vision, with a broad range of applications, including remotely-sensed image analysis, medical image analysis, industrial surface inspection, document processing and image retrieval.
Remote sensing image analysis is a mature application of texture analysis. Areas with different terrain and landforms produce different textures in remote sensing images, so rivers, lakes, fields and urban areas can be distinguished by texture. Haralick et al. used spatial gray-level dependence (co-occurrence) matrix texture features to analyze remote sensing images; Rignot and Kwok analyzed SAR images with texture features; Du used filters derived from Gabor filters for texture segmentation and successfully segmented SAR images into water, new ice, first-year ice and multi-year ice. In 2003, Christodoulou et al. developed a multi-feature texture analysis system that automatically analyzes satellite cloud images and classifies different cloud types, providing a useful tool for weather analysis.
A key problem in texture analysis is texture description, known in the pattern recognition field as texture feature extraction. Many texture feature extraction methods now exist. Tuceryan and Jain roughly classify them into four categories: structural methods, statistical methods, model-based methods and signal processing methods. Statistical and signal processing methods play a particularly important role in texture analysis.
Domestic texture analysis research mainly concerns the concrete application of particular methods. Among statistical methods, the co-occurrence matrix is commonly used. Among model-based methods, fractal approaches are widely used, mostly employing the fractional Brownian motion model, and improvements to fractal methods have also been proposed; Markov random fields (MRF) are also applied, the main difficulty being the determination of parameters. Among transform-based methods, wavelet-based approaches are common. Research abroad mainly combines the texture features extracted by several texture analysis methods with general classification methods and compares classification results on different images.
Early texture analysis used statistical or structural methods to extract features. More recent progress uses multi-resolution (e.g. the Gabor transform) and multi-channel (analyzing texture features over several bands jointly) texture descriptions. Earlier texture analysis lacked an effective way to analyze texture at different scales, where "different scales" means performing texture analysis of the same image at several scales. Doing so yields different texture features for the same region, increases the amount of information, and ultimately improves classification accuracy.
Filtering-based texture description falls into two approaches: concatenating the filter responses horizontally into a histogram, as in the GIST method [Aude Oliva, "Gist of the Scene", Neurobiology of Attention, 2005]; or concatenating the filter responses vertically into a vector and then computing statistics, known as the texture primitive (texton) method [Manik Varma, Andrew Zisserman, "A Statistical Approach to Texture Classification from Single Images", Kluwer Academic Publishers, Printed in the Netherlands, 2004]. A texture primitive is a basic micro-structure in natural images, the fundamental element of the early (pre-attentive) stage of visual perception. Research on texture primitives uses the idea of sparse coding to learn an over-complete set of image-based texture primitives from natural images and then expresses texture images in terms of these primitives.
A synthetic aperture radar (SAR) image is also a kind of texture image. Its processing is generally based on statistical estimation, whereas the traditional texture primitive analysis is based on filtering; it is therefore necessary to combine statistical estimation with texture primitives.
Summary of the invention
The object of the invention is to address the deficiencies of the prior art by proposing a texture primitive feature extraction method for synthetic aperture radar images, which can efficiently extract features of texture images with distribution characteristics.
The technical scheme of the present invention comprises the following steps:
Step 1: for every image in the training set, compute the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image, and combine them to obtain the estimate vectors {V_{x,y}}_all of all pixels of all images in the training set;
Step 2: cluster the estimate vectors {V_{x,y}}_all of all pixels of all images in the training set; the resulting texture primitive codebook C constitutes the texture primitives;
Step 3: for the image whose features are to be extracted, compute the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image, combine them to obtain the estimate vectors {V_{x,y}}_one of all pixels in the image, and label these estimate vectors with the texture primitive codebook C generated in step 2, the labels being denoted {M_{x,y}}_one;
Step 4: count the image labels {M_{x,y}}_one to obtain the label histogram, denoted H_mark; the label histogram H_mark is the statistical texture primitive feature of the image.
In steps 1 and 3, the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in an image is computed as follows.
First, pass the gray-level image T obtained after quantizing the image through the Gabor filter bank {F_1, F_2, ..., F_n} to obtain the filtered image set {I_1, I_2, ..., I_n}, denoted I.
Then, for each image I_p in the image set I, 1 ≤ p ≤ n, take m windows of different sizes centered on each pixel P_{x,y} to obtain the window group {G_{p,1}, G_{p,2}, ..., G_{p,m}}, denoted G_p. Divide each window G_{p,i}, 1 ≤ i ≤ m, into four regions, upper-left, lower-left, upper-right and lower-right, R_{p,i,1}, R_{p,i,2}, R_{p,i,3}, R_{p,i,4}, and combine these regions to obtain the region group {R_{p,i,j}}, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, denoted R_p. Assuming that the image data of every region follow a statistical distribution E with q parameters, perform parameter estimation on each region R_{p,i,j} in R_p to obtain the estimated parameters E_{p,i,j} = {E_{p,i,j,1}, E_{p,i,j,2}, ..., E_{p,i,j,q}}, and combine them to obtain the estimated parameter group {E_{p,i,j,k}}, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, denoted E_p.
Finally, concatenate the E_{p,i,j,k}, 1 ≤ p ≤ n, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, corresponding to pixel P_{x,y} into one estimate vector; this is the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image.
Moreover, when concatenating the E_{p,i,j,k} corresponding to pixel P_{x,y} into one estimate vector, they are serially concatenated in the order of increasing k, j, i, p.
Moreover, the parameter estimation on each region R_{p,i,j} in R_p is realized by maximum likelihood estimation.
By replacing the pixel value with a regional estimate, the present invention describes the distribution characteristics of a pixel and its surrounding pixels and thus describes the pixel's features effectively; by adding the distribution parameters of surrounding regions in different directions, the feature incorporates the spatial relationships of the image and describes how the texture varies in different directions; and by introducing distribution parameters at different scales, it adds a description of texture at different scales. The proposed texture primitive feature extraction method for synthetic aperture radar images fuses the distribution, spatial and scale characteristics of the image, improving the classification accuracy of SAR images and thereby the accuracy of image processing applications, such as classification and segmentation, based on texture primitive features.
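As an illustration only (not part of the claimed method), the four steps can be sketched as pipeline glue in Python; the helper names compute_estimate_vectors, cluster and label are hypothetical placeholders for the per-pixel estimation, clustering and labeling operations described above.

```python
import numpy as np

def statistical_texton_feature(train_images, query_image,
                               compute_estimate_vectors, cluster, label):
    """Sketch of steps 1-4.  compute_estimate_vectors(img) returns one estimate
    vector per pixel (one row each), cluster(vectors) returns the texture
    primitive codebook C, and label(vectors, C) returns one codebook index per
    pixel."""
    # Step 1: estimate vectors of all pixels of all training images
    V_all = np.vstack([compute_estimate_vectors(img) for img in train_images])
    # Step 2: cluster them; the result is the texture primitive codebook C
    C = cluster(V_all)
    # Step 3: estimate vectors of the query image, labeled with the codebook
    M = label(compute_estimate_vectors(query_image), C)
    # Step 4: the label histogram H_mark is the statistical texture primitive feature
    H_mark, _ = np.histogram(M, bins=np.arange(len(C) + 1), density=True)
    return H_mark
```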
Description of drawings
Fig. 1 Flow chart of the present invention;
Fig. 2 Embodiment of the spatial characteristics of the present invention;
Fig. 3 Embodiment of the multi-scale distribution characteristics of the present invention;
Fig. 4 Generation process of the texture primitive codebook;
Fig. 5 Computing the statistical texture primitive feature of an image using the texture primitive codebook;
Fig. 6 Classifying images using the statistical texture primitive features in an embodiment of the invention;
Embodiment
Referring to Fig. 1, the technical scheme of the present invention comprises the following steps:
Step 1: for every image in the training set, compute the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image, and combine them to obtain the estimate vectors {V_{x,y}}_all of all pixels of all images in the training set;
Step 2: cluster the estimate vectors {V_{x,y}}_all of all pixels of all images in the training set; the resulting texture primitive codebook C constitutes the texture primitives;
Step 3: for the image whose features are to be extracted, compute the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image, combine them to obtain the estimate vectors {V_{x,y}}_one of all pixels in the image, and label these estimate vectors with the texture primitive codebook C generated in step 2, the labels being denoted {M_{x,y}}_one;
Step 4: count the image labels {M_{x,y}}_one to obtain the label histogram, denoted H_mark; the label histogram H_mark is the statistical texture primitive feature of the image.
In steps 1 and 3, the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in an image is computed as follows.
First, pass the gray-level image T obtained after quantizing the image through the Gabor filter bank {F_1, F_2, ..., F_n} to obtain the filtered image set {I_1, I_2, ..., I_n}, denoted I.
Then, for each image I_p in the image set I, 1 ≤ p ≤ n, take m windows of different sizes centered on each pixel P_{x,y} to obtain the window group {G_{p,1}, G_{p,2}, ..., G_{p,m}}, denoted G_p. Divide each window G_{p,i}, 1 ≤ i ≤ m, into four regions, upper-left, lower-left, upper-right and lower-right, R_{p,i,1}, R_{p,i,2}, R_{p,i,3}, R_{p,i,4}, and combine these regions to obtain the region group {R_{p,i,j}}, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, denoted R_p. Assuming that the image data of every region follow a statistical distribution E with q parameters, perform parameter estimation on each region R_{p,i,j} in R_p to obtain the estimated parameters E_{p,i,j} = {E_{p,i,j,1}, E_{p,i,j,2}, ..., E_{p,i,j,q}}, and combine them to obtain the estimated parameter group {E_{p,i,j,k}}, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, denoted E_p.
Finally, concatenate the E_{p,i,j,k}, 1 ≤ p ≤ n, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, corresponding to pixel P_{x,y} into one estimate vector; this is the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image.
In specific implementation, the image whose features are to be extracted is determined by the needs of the application: it may be a test image, an image in the training set, or a part of an image to be segmented. For segmentation, the region to be segmented and the images in the training set serve as the images whose features are to be extracted, and steps 3 and 4 are carried out; for classification, the test images or the images in the training set serve as the images whose features are to be extracted, and steps 3 and 4 are carried out. When the image whose features are to be extracted belongs to the training set, step 1 has already computed the estimate vector V_{x,y} of each pixel P_{x,y} for every training image, so the computation in step 3 may be omitted and the result of step 1 reused directly. The subscripts x and y are the horizontal and vertical coordinates of a pixel; on each image, estimation is carried out with the pixel P_{x,y} of the same coordinates as the object.
Before performing parameter estimation on each region R_{p,i,j} in R_p, the image data of the regions in R_p may first be analyzed to determine the statistical distribution E they follow, for example a Gamma distribution. Parameter estimation may then be realized with existing techniques such as maximum likelihood estimation, which the present invention does not elaborate. Obviously, when concatenating the E_{p,i,j,k} corresponding to each pixel P_{x,y} into an estimate vector, the concatenation order must be consistent; the present invention suggests concatenating in the order of increasing k, j, i, p.
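For illustration, assuming as in the example above that each region follows a Gamma distribution (q = 2 parameters), a minimal maximum-likelihood fit per region could look as follows; the function name is hypothetical and SciPy's generic fitting routine stands in for whatever estimator an implementation actually uses.

```python
import numpy as np
from scipy import stats

def estimate_region_parameters(region_pixels):
    """Maximum-likelihood Gamma fit for one region R_{p,i,j}; returns the
    q = 2 estimated parameters (shape, scale).  Fixing the location at 0 is
    an assumption, common for non-negative SAR amplitude data."""
    data = np.asarray(region_pixels, dtype=float).ravel()
    data = data[data > 0]                    # Gamma support is strictly positive
    shape, _, scale = stats.gamma.fit(data, floc=0)
    return np.array([shape, scale])
```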
The present invention extracts texture primitive features for synthetic aperture radar images, so the test images, the training-set images and the parts of images to be segmented mentioned above are all synthetic aperture radar images. The gray-level image T obtained after quantizing a synthetic aperture radar image is simply fed into the Gabor filter bank {F_1, F_2, ..., F_n}.
The principles of the technical scheme of the present invention are explained below:
(1) Selection of the filter bank
Texture features can be composed of different texture primitives or of texture primitives combined in different ways, so a group of filters is used to assist in generating the texture primitives. The criterion for selecting filters is to extract texture information better, which requires the extracted texture information to cover multiple scales and multiple directions. Many filters are in common use, such as the Gabor filter [Xue Yuli. Feature extraction based on the Gabor transform and its applications [D]. Master's thesis, Shandong University, April 2007]. In the signal processing field, the Gabor filter is one of the best methods for image recognition: it can be regarded as a detector of edges and lines (stripes) whose direction and scale can both vary, and statistics of these micro-features within a given region can represent basic texture information, so the present invention selects the Gabor filter. The two-dimensional Gabor function can be expressed as:
g(x, y) = [1 / (2π σ_x σ_y)] · exp[ -(1/2)(x²/σ_x² + y²/σ_y²) + 2πjWx ]
where W is the frequency of the complex modulation applied to the Gaussian, x and y are the coordinates of the two dimensions, σ_x and σ_y are the bandwidths of the Gaussian in the spatial domain, j is the imaginary unit, and exp denotes exponentiation.
In specific implementation, n filters F_1, F_2, ..., F_n constitute the Gabor filter bank {F_1, F_2, ..., F_n}. It is suggested to choose 3 scales, each with Gabor filters in 8 directions, giving a suggested value of n = 24.
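A minimal sketch of building such a bank directly from the two-dimensional Gabor function above, with orientation obtained by rotating the (x, y) coordinates; the kernel size, scales and frequencies below are illustrative assumptions, not values prescribed by the patent.

```python
import numpy as np

def gabor_kernel(size, sigma_x, sigma_y, W, theta):
    """2-D Gabor kernel g(x, y) from the formula above, with the coordinate
    axes rotated by theta to obtain different orientations."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-0.5 * (xr**2 / sigma_x**2 + yr**2 / sigma_y**2))
    carrier = np.exp(2j * np.pi * W * xr)           # complex modulation at frequency W
    return envelope * carrier / (2 * np.pi * sigma_x * sigma_y)

# Filter bank: 3 scales x 8 directions = n = 24 filters (illustrative parameters)
bank = [gabor_kernel(size=31, sigma_x=s, sigma_y=s, W=0.5 / s, theta=t)
        for s in (2.0, 4.0, 8.0)
        for t in np.arange(8) * np.pi / 8]
```

Convolving the quantized gray-level image T with each kernel and, for example, taking the response magnitude would give the filtered image set {I_1, I_2, ..., I_n}.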
(2) Spatial distribution characteristics of pixels
Existing texture primitive feature extraction does not capture the spatial characteristics of pixels; the present invention takes them into account when extracting texture primitive features. Specifically, the texture primitive of a pixel is computed not only from the pixel's neighborhood but also from neighborhoods in 4 different directions, so that the resulting texture primitive contains distribution characteristics at different spatial positions and constrains the distribution characteristics of the pixel and its surrounding regions. Therefore, dividing each window G_{p,i}, 1 ≤ p ≤ n, 1 ≤ i ≤ m, into upper-left, lower-left, upper-right and lower-right regions and combining these regions yields the spatial characteristics of the center pixel. The suggested range of m is 1 to 3, and the window size of G_{p,i} is generally suggested to be an odd number such as 9, 11 or 13.
Referring to Fig. 2, taking a pixel of a gray-level image I as an example: for the images I_1, I_2, ..., I_n obtained after filtering I, with the number of windows m = 1 and the window size of G_{p,1} set to 9, the window centered on this pixel is divided into upper-left, lower-left, upper-right and lower-right regions, which yields a matrix with n × 4 rows and 1 column, expressed as [·]_(n×4, 1).
(3) Multi-scale distribution characteristics of pixels
Texture exists over a certain region: some texture units cover a large area and some cover a small area. The original texture primitive extraction methods obtain texture primitives at different scales by selecting different filter banks, whereas the present invention obtains multi-scale characteristics directly from neighborhoods of different sizes around a pixel. To avoid regions too small to constitute texture, estimation is performed over regions of different sizes so as to obtain multi-scale distribution characteristics. Taking windows G_{p,i}, 1 ≤ i ≤ m, of different sizes centered on each pixel P_{x,y}, i.e. choosing different estimation sizes, yields the multi-scale distribution characteristics of the center pixel.
Referring to Fig. 3, taking a pixel of a gray-level image I as an example: 3 windows of different sizes are centered on this pixel, i.e. m = 3, with the window size of G_{p,1} set to 5, that of G_{p,2} set to 9, and that of G_{p,3} set to 13, which yields a matrix with n × 3 × 4 rows and 1 column, expressed as [·]_(n×3×4, 1).
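As a sketch of how the m multi-scale windows and their four directional sub-regions around one pixel might be gathered for a single filtered image I_p (window sizes 5, 9, 13 as in the Fig. 3 example; the padding mode and the shared center row/column are assumptions of this sketch):

```python
import numpy as np

def region_group_for_pixel(image_p, x, y, window_sizes=(5, 9, 13)):
    """Collect the upper-left, lower-left, upper-right and lower-right regions
    R_{p,i,1..4} of the m windows G_{p,i} centered on pixel P_{x,y}."""
    pad = max(window_sizes) // 2
    padded = np.pad(image_p, pad, mode='reflect')   # so border pixels also get full windows
    cy, cx = y + pad, x + pad                       # pixel position in the padded image
    regions = []
    for size in window_sizes:                       # i = 1..m (multi-scale windows)
        h = size // 2
        win = padded[cy - h:cy + h + 1, cx - h:cx + h + 1]
        regions.extend([win[:h + 1, :h + 1],        # upper-left  R_{p,i,1}
                        win[h:, :h + 1],            # lower-left  R_{p,i,2}
                        win[:h + 1, h:],            # upper-right R_{p,i,3}
                        win[h:, h:]])               # lower-right R_{p,i,4}
    return regions                                  # m * 4 regions of image I_p
```

Applying the region-wise estimator sketched earlier to each of these m × 4 regions for every filtered image I_p and concatenating the q parameters would give the estimate vector V_{x,y}.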
(4) Selection of the texture primitive codebook
A clustering method is applied to all texture primitives in the training set, and the cluster centers are taken as the typical texture primitives. That is, the vectors of all images in the training set are clustered, and the cluster centers serve as representatives of the texture primitives. This greatly reduces the number of texture primitives and hence the amount of computation.
Referring to Fig. 4, taking a training set containing N images as an example, the generation of the texture primitive codebook is described as follows. The gray-level images T1, T2, ..., TN obtained after quantizing the N images are each fed into the Gabor filter bank {F_1, F_2, ..., F_n}, and filtering yields the image sets {I1_1, I1_2, ..., I1_n}, {I2_1, I2_2, ..., I2_n}, ..., {IN_1, IN_2, ..., IN_n}. Estimation on these image sets then yields the estimated parameter groups {E1_1, E1_2, ..., E1_n}, {E2_1, E2_2, ..., E2_n}, ..., {EN_1, EN_2, ..., EN_n}, each involving m windows, sub-blocks in 4 directions and q estimated parameters. Taking image IN as an example, the EN_{p,i,j,k}, 1 ≤ p ≤ n, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, corresponding to pixel P_{x,y} are concatenated, with k varying fastest, then j, then i, then p, into the estimate vector VN_{x,y}, i.e. the sequence EN_{1,1,1,1}, EN_{1,1,1,2}, ..., EN_{1,1,1,q}, EN_{1,1,2,1}, ..., EN_{1,m,4,q}, EN_{2,1,1,1}, ..., EN_{n,m,4,q}.
Each of the images I1_1, I1_2, ..., I1_n, I2_1, I2_2, ..., I2_n, ..., IN_1, IN_2, ..., IN_n contributes the estimate vectors of all pixels P_{x,y} in the corresponding image; the set of estimate vectors over the images can be denoted V = {V_{x,y} | P_{x,y} ∈ (I1, I2, ..., IN)}.
Finally, the estimate vectors in V are clustered into k classes to generate the texture primitive codebook.
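A minimal codebook-generation sketch, assuming k-means as the clustering method (the embodiment below actually selects random-forest clustering, and the patent leaves the choice open); k = 64 is an arbitrary illustrative value.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_texton_codebook(V, k=64):
    """Cluster the estimate vectors of all training pixels into k classes;
    the k cluster centers serve as the texture primitive codebook."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.asarray(V))
    return km.cluster_centers_                      # codebook C, shape (k, vector_dim)
```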
(5) Label histogram of an image
First, the texture primitives in the texture primitive codebook are numbered consecutively. The texture primitives of all pixels of an image are then labeled with the numbers of their codebook entries, and the labels are counted by label value to obtain the label histogram, which serves as the feature of the image.
Referring to Fig. 5, the gray-level image I corresponding to an image whose features are to be extracted is passed through the Gabor filter bank {F_1, F_2, ..., F_n} to obtain the filtered image set I_1, I_2, ..., I_n; estimation yields E_1, E_2, ..., E_n, which are concatenated into the estimate vector V_{x,y} corresponding to each pixel P_{x,y}. Each estimate vector V_{x,y} is labeled with the texture primitive codebook C, and the labels are counted by label value to obtain the label histogram.
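A sketch of the labeling and histogram step, assuming each pixel is assigned the index of its nearest codebook entry under Euclidean distance (the distance measure is an assumption; the patent only requires a consistent labeling rule):

```python
import numpy as np

def label_histogram(estimate_vectors, codebook):
    """Label each pixel's estimate vector with its nearest texture primitive
    and count the labels into the histogram H_mark."""
    V = np.asarray(estimate_vectors)                # shape (num_pixels, dim)
    C = np.asarray(codebook)                        # shape (k, dim)
    d2 = ((V[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)   # squared distances
    M = d2.argmin(axis=1)                           # label M_{x,y} of each pixel
    H_mark, _ = np.histogram(M, bins=np.arange(len(C) + 1), density=True)
    return H_mark
```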
The invention provides the following embodiment, which realizes a SAR image classification method based on statistical texture primitives.
A synthetic aperture radar (SAR) system has the all-day, all-weather ability to image the earth's surface at high resolution. Like other coherent measurement systems, the imaging process of synthetic aperture radar is affected by noise-like speckle. Speckle is an inherent characteristic of radar images and also causes SAR images to exhibit distinct texture, so classifying and segmenting SAR images with texture features has become a focus of current research. The steps of the SAR image classification method based on statistical texture primitives are as follows:
First, the N training sample images in the training set are processed to obtain the corresponding vector set V = {V_{x,y} | P_{x,y} ∈ (I1, I2, ..., IN)};
According to the statistical texture primitive generation method of the present invention, a clustering method is selected to cluster V, yielding a group of typical texture primitives as the texture primitive codebook. The embodiment of the invention selects random forest as the clustering method; note that, whichever clustering method is selected in a specific implementation, it falls within the protection scope of the present invention. This process is shown in Fig. 4;
Then the V_{x,y} of all images in the training set and the test set are labeled and counted with the texture primitive codebook, yielding the label histogram of each image as the statistical texture primitive feature of the training-set and test-set images, as shown in Fig. 5;
Finally, the KNN method is used for classification. The KNN (K-nearest-neighbor) method was first proposed by Cover and Hart in 1967 and is theoretically mature. Its idea is simple and intuitive: if most of the Y samples most similar to a given sample in feature space (i.e. its nearest neighbors in feature space) belong to a certain class, then the sample also belongs to that class. The method decides the class of a sample to be classified only according to the classes of its one or several nearest samples. In the embodiment the value of Y is set to 1, so the KNN method degenerates to the nearest-neighbor method: the test image is judged to belong to whichever class its most similar training sample in feature space belongs to. As shown in Fig. 6, after the label histogram of a test-set image in the embodiment is extracted, because the statistical texture primitive feature sample of its nearest training-set image is in class K, the test image is assigned to class K.
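The final classification step with Y = 1 can be sketched with scikit-learn's nearest-neighbor classifier operating on the label histograms; the variable names are illustrative.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def classify_by_texton_histogram(train_hists, train_labels, test_hists, Y=1):
    """KNN on statistical texture primitive features; Y = 1 reduces it to the
    nearest-neighbor rule used in the embodiment."""
    knn = KNeighborsClassifier(n_neighbors=Y)
    knn.fit(np.asarray(train_hists), np.asarray(train_labels))
    return knn.predict(np.asarray(test_hists))
```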

Claims (3)

1. A texture primitive feature extraction method for synthetic aperture radar images, characterized by comprising the following steps:
Step 1: for every image in the training set, compute the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image, and combine them to obtain the estimate vectors {V_{x,y}}_all of all pixels of all images in the training set;
Step 2: cluster the estimate vectors {V_{x,y}}_all of all pixels of all images in the training set; the resulting texture primitive codebook C constitutes the texture primitives;
Step 3: for the image whose features are to be extracted, compute the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image, combine them to obtain the estimate vectors {V_{x,y}}_one of all pixels in the image, and label these estimate vectors with the texture primitive codebook C generated in step 2, the labels being denoted {M_{x,y}}_one;
Step 4: count the image labels {M_{x,y}}_one to obtain the label histogram, denoted H_mark; the label histogram H_mark is the statistical texture primitive feature of the image;
in steps 1 and 3, the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in an image is computed as follows:
first, pass the gray-level image T obtained after quantizing the image through the Gabor filter bank {F_1, F_2, ..., F_n} to obtain the filtered image set {I_1, I_2, ..., I_n}, denoted I;
then, for each image I_p in the image set I, 1 ≤ p ≤ n, take m windows of different sizes centered on each pixel P_{x,y} to obtain the window group {G_{p,1}, G_{p,2}, ..., G_{p,m}}, denoted G_p; divide each window G_{p,i}, 1 ≤ i ≤ m, into four regions, upper-left, lower-left, upper-right and lower-right, R_{p,i,1}, R_{p,i,2}, R_{p,i,3}, R_{p,i,4}, and combine these regions to obtain the region group {R_{p,i,j}}, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, denoted R_p; assuming that the image data of every region follow a statistical distribution E with q parameters, perform parameter estimation on each region R_{p,i,j} in R_p to obtain the estimated parameters E_{p,i,j} = {E_{p,i,j,1}, E_{p,i,j,2}, ..., E_{p,i,j,q}}, and combine them to obtain the estimated parameter group {E_{p,i,j,k}}, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, denoted E_p;
finally, concatenate the E_{p,i,j,k}, 1 ≤ p ≤ n, 1 ≤ i ≤ m, 1 ≤ j ≤ 4, 1 ≤ k ≤ q, corresponding to pixel P_{x,y} into one estimate vector, which is the estimate vector V_{x,y} corresponding to each pixel P_{x,y} in the image.
2. The texture primitive feature extraction method according to claim 1, characterized in that: when concatenating the E_{p,i,j,k} corresponding to pixel P_{x,y} into one estimate vector, they are serially concatenated in the order of increasing k, j, i, p.
3. The texture primitive feature extraction method according to claim 1 or 2, characterized in that: the parameter estimation on each region R_{p,i,j} in R_p is realized by maximum likelihood estimation.
CN2009100626546A 2009-06-10 2009-06-10 Texture elementary feature extraction method for synthetizing aperture radar images Expired - Fee Related CN101587189B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100626546A CN101587189B (en) 2009-06-10 2009-06-10 Texture elementary feature extraction method for synthetizing aperture radar images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009100626546A CN101587189B (en) 2009-06-10 2009-06-10 Texture elementary feature extraction method for synthetizing aperture radar images

Publications (2)

Publication Number Publication Date
CN101587189A true CN101587189A (en) 2009-11-25
CN101587189B CN101587189B (en) 2011-09-14

Family

ID=41371515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100626546A Expired - Fee Related CN101587189B (en) 2009-06-10 2009-06-10 Texture elementary feature extraction method for synthetizing aperture radar images

Country Status (1)

Country Link
CN (1) CN101587189B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102129064B (en) * 2010-01-15 2013-01-02 中国科学院电子学研究所 Universal satellite borne synthetic aperture radar (SAR) original data vector quantized codebook design algorithm
CN101853386B (en) * 2010-05-14 2012-06-13 武汉大学 Topological tree based extraction method of image texture element features of local shape mode
CN101894261A (en) * 2010-06-29 2010-11-24 武汉大学 Extraction method of histogram texture descriptor in muti-contrast mode
CN106408502B (en) * 2010-07-07 2019-12-31 快图有限公司 Real-time video frame preprocessing hardware
CN106408502A (en) * 2010-07-07 2017-02-15 快图有限公司 Real-time video frame pre-processing hardware
CN103426188A (en) * 2013-08-08 2013-12-04 华南理工大学 Texture description method
CN103679195A (en) * 2013-12-02 2014-03-26 北京工商大学 Method and system for classifying texture images on basis of local edge pattern
CN103679195B (en) * 2013-12-02 2016-08-17 北京工商大学 Texture image classification method based on local edge pattern and system
CN108805186A (en) * 2018-05-29 2018-11-13 北京师范大学 A kind of SAR image circle oil house detection method based on multidimensional notable feature cluster
CN108805186B (en) * 2018-05-29 2020-11-17 北京师范大学 SAR image circular oil depot detection method based on multi-dimensional significant feature clustering
CN109765554A (en) * 2018-11-14 2019-05-17 北京遥感设备研究所 A kind of radar foresight imaging system and method
CN114972348A (en) * 2022-08-01 2022-08-30 山东尚雅建材有限公司 Seam beautifying effect detection method based on image processing
CN114972348B (en) * 2022-08-01 2022-09-30 山东尚雅建材有限公司 Seam beautifying effect detection method based on image processing

Also Published As

Publication number Publication date
CN101587189B (en) 2011-09-14

Similar Documents

Publication Publication Date Title
CN101587189B (en) Texture elementary feature extraction method for synthetizing aperture radar images
Dong et al. High quality multi-spectral and panchromatic image fusion technologies based on curvelet transform
CN107239751B (en) High-resolution SAR image classification method based on non-subsampled contourlet full convolution network
CN104881677B (en) Method is determined for the optimum segmentation yardstick of remote sensing image ground mulching classification
CN101901343B (en) Remote sensing image road extracting method based on stereo constraint
CN101510309B (en) Segmentation method for improving water parting SAR image based on compound wavelet veins region merge
CN107067405B (en) Remote sensing image segmentation method based on scale optimization
CN101833753B (en) SAR image de-speckling method based on improved Bayes non-local mean filter
CN103955913B (en) It is a kind of based on line segment co-occurrence matrix feature and the SAR image segmentation method of administrative division map
CN110570440A (en) Image automatic segmentation method and device based on deep learning edge detection
CN104299232B (en) SAR image segmentation method based on self-adaptive window directionlet domain and improved FCM
CN109635733B (en) Parking lot and vehicle target detection method based on visual saliency and queue correction
CN109977968B (en) SAR change detection method based on deep learning classification comparison
CN108829711B (en) Image retrieval method based on multi-feature fusion
Hou et al. SAR image ship detection based on visual attention model
Chen et al. A new process for the segmentation of high resolution remote sensing imagery
CN110070545B (en) Method for automatically extracting urban built-up area by urban texture feature density
CN103903275A (en) Method for improving image segmentation effects by using wavelet fusion algorithm
CN106295498A (en) Remote sensing image target area detection apparatus and method
Hwang et al. A practical algorithm for the retrieval of floe size distribution of Arctic sea ice from high-resolution satellite Synthetic Aperture Radar imagery
CN101853386A (en) Topological tree based extraction method of image texture element features of local shape mode
CN102073867A (en) Sorting method and device for remote sensing images
CN103152569A (en) Video ROI (region of interest) compression method based on depth information
CN107665347A (en) Vision significance object detection method based on filtering optimization
CN104637060A (en) Image partition method based on neighbor-hood PCA (Principal Component Analysis)-Laplace

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110914

Termination date: 20120610