CN104732238A - Gray-level image texture feature extraction method based on orientation selectivity


Info
Publication number: CN104732238A (application CN201510155433.9A); granted as CN104732238B
Authority: CN (China)
Prior art keywords: pixel, space structure, texture, structure distribution
Legal status: Active, granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Inventors: 吴金建, 万文菲, 张亚中, 石光明
Assignee (original and current): Xidian University
Application filed by Xidian University


Abstract

The invention discloses a texture feature extraction method based on orientation selectivity, mainly addressing the poor performance of the prior-art LBP operator when classifying the textures of noisy images. The method comprises the steps of: 1. simulating the spatial-structure distribution of the image pixels according to the orientation-selectivity principle of the optic nerve; 2. determining the spatial-structure distribution of each pixel by comparing the azimuth differences between pixels with a set threshold; 3. reducing the spatial-structure distributions of all pixels to several orientation-selective modes; 4. computing the gray-level change value of each pixel; and 5. counting the number of spatial-structure distributions falling in each mode, drawing a texture histogram, and combining it with the gray-level change values to draw a weighted texture histogram. By simulating the human optic nerve's selective sensitivity to orientation, the method reduces the interference of noise in image texture classification, and can be used in image processing and computer vision tasks such as image classification and image understanding.

Description

Gray-level image texture feature extraction method based on orientation selectivity
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a texture feature extraction method for gray-level images, which can be used for image classification.
Technical background
With the rapid development of network and multimedia technology, a large number of images of different types have emerged on the Internet. Image data has characteristics not found in conventional data, such as inconsistent formats, rich and varied information content, and spatio-temporal characteristics. How to classify images well has therefore become a popular research topic.
Texture is an important image feature: texture is ubiquitous in nature and is an intrinsic property of all object surfaces. Independent of brightness or colour change, texture is a visual feature that reveals the homogeneity phenomena in an image and characterises the gray-level spatial distribution of a pixel's neighbourhood, so it has great theoretical and practical value.
After analysing and comparing texture operators such as those based on Laws texture energy, the gray-level co-occurrence matrix, gray-level statistics and centre-symmetric covariance, Professor Timo Ojala of the University of Oulu, Finland, proposed the local binary pattern (LBP) operator in 1996 to describe image texture. Because the LBP operator is simple to compute and has outstanding texture-discrimination ability, it has been widely studied and used ever since its appearance; in particular, the article "Multiresolution Gray-Scale and Rotation Invariant Texture Classification with Local Binary Patterns", published by Timo Ojala in IEEE TPAMI in 2002, made the operator even more widely known. After the LBP operator was proposed, Professor Ojala continued to put forward improvements and optimisations for the various shortcomings of the original operator, such as the circular-region LBP with P sampling points on a circle of radius R, the rotation-invariant LBP pattern, and the uniform LBP pattern. However, LBP has a major drawback that limits its application: it is sensitive to noise, so the extracted texture features perform poorly when classifying noisy images, making it unsuitable for classification problems involving noisy images.
Summary of the invention
The object of the invention is to address the above defect of the prior art by proposing a texture extraction method based on orientation selectivity, so as to reduce the interference of noise in image classification and improve the classification of noisy images.
The technical solution of the invention is realised as follows:
The idea of the invention is this: the primary visual cortex of humans exhibits significant orientation-selective characteristics when extracting visual information. When classifying image textures, the method therefore considers two factors of each local region of the image: its spatial correlation and its gray-level variation. The implementation is as follows:
Technical scheme one:
A texture feature extraction method based on orientation selectivity, comprising the following steps:
(1) Input the image to be processed, of size N×N. According to the orientation-selectivity principle of the optic nerve, model the spatial-structure distribution characteristic of an arbitrary pixel x, in which the surrounding set consists of n pixels chosen on a circular region around x, x_i denotes the i-th of these pixels, the distribution characteristic is an arrangement of responses, and it records the set of interaction types between pixel x and the pixels of its surrounding region;
(2) Determine the spatial-structure distribution of pixel x:
(2a) Compute the azimuth of pixel x:

θ(x) = arctan( G_v(x) / G_h(x) )

where G_v(x) and G_h(x) are respectively the vertical and horizontal gradient magnitudes of the image after Prewitt edge detection, obtained as G_v = f_v * I and G_h = f_h * I, with

f_v = (1/3) [ 1 1 1 ; 0 0 0 ; −1 −1 −1 ], the vertical Prewitt operator,
f_h = (1/3) [ 1 0 −1 ; 1 0 −1 ; 1 0 −1 ], the horizontal Prewitt operator,

and '*' denoting the convolution operation;
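Step (2a) can be sketched in Python. This is an illustrative sketch, not the patented implementation: the kernels follow the text, the use of arctan2 to handle G_h = 0 is our own choice, and valid-mode correlation is used (flipping the kernels for true convolution would only negate both gradients).

```python
import numpy as np

# 3x3 Prewitt kernels as given in the text (scaled by 1/3)
F_V = np.array([[1, 1, 1], [0, 0, 0], [-1, -1, -1]], dtype=float) / 3.0
F_H = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=float) / 3.0

def prewitt_gradients(img):
    """Return vertical and horizontal gradient responses G_v, G_h via
    valid-mode 2-D correlation with the Prewitt kernels."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    gv = np.zeros((h - 2, w - 2))
    gh = np.zeros((h - 2, w - 2))
    for r in range(h - 2):
        for c in range(w - 2):
            patch = img[r:r + 3, c:c + 3]
            gv[r, c] = np.sum(patch * F_V)
            gh[r, c] = np.sum(patch * F_H)
    return gv, gh

def azimuth(gv, gh):
    """theta(x) = arctan(G_v / G_h) in degrees; arctan2 avoids division
    by zero when G_h == 0 (an implementation choice, not from the patent)."""
    return np.degrees(np.arctan2(gv, gh))
```

On a vertical ramp image the horizontal gradient vanishes and every azimuth collapses to ±90°, which makes a convenient sanity check.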
(2b) Set an azimuth discrimination threshold: compare the absolute value |θ(x) − θ(x_i)| of the azimuth difference between pixel x and each pixel x_i of its surrounding set with the threshold:
If the absolute azimuth difference of the two pixels is less than the threshold, the interaction relationship between them is judged to be excitatory and denoted '1'; otherwise, it is judged to be inhibitory and denoted '0';
The distribution of the resulting n '1'/'0' values determines the interaction types between pixel x and its n surrounding pixels and yields the spatial-structure distribution of pixel x;
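The excitatory/inhibitory comparison of step (2b) can be sketched as follows; the function name and the 6-degree default threshold (taken from the worked example in the description) are our assumptions, since the patent leaves the exact threshold as a design parameter.

```python
import numpy as np

def interaction_pattern(theta_center, theta_neighbors, threshold_deg=6.0):
    """Compare the azimuth of the centre pixel with each of its n circular
    neighbours: an absolute difference below the threshold is marked
    excitatory ('1'), otherwise inhibitory ('0')."""
    diffs = np.abs(np.asarray(theta_neighbors, dtype=float) - theta_center)
    return (diffs < threshold_deg).astype(int)
```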
(3) Reduce the spatial-structure distribution of pixel x to one of n orientation-selective modes:
(3a) According to the number n of pixels in the surrounding set, divide the 360-degree circular local region into n equal classes, the angle corresponding to class j being 360j/n, j = 0, 1, …, n−1; these n classes are defined as the n orientation-selective modes;
(3b) Among all regions enclosed by excitatory '1' values, choose the largest region, the one outside which no excitatory '1' remains; match the angle corresponding to this largest region against the angles of the n orientation-selective modes classified in (3a), and assign each successfully matched distribution to the corresponding mode;
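One plausible reading of step (3b) is that the "largest region enclosed by excitation" is the longest circular run of '1' values, whose angular extent (run length × 360/n) is then matched to a mode angle. This mapping is our interpretation of the worked example with n = 8 (a run of 4 of 8 neighbours spans 180°), not an exact transcription of the patent figures.

```python
def longest_circular_run(bits):
    """Length of the longest contiguous run of 1s in a circular arrangement
    of excitatory/inhibitory flags (the 'largest region surrounded by
    excitation')."""
    n = len(bits)
    if all(b == 1 for b in bits):
        return n
    doubled = list(bits) + list(bits)   # unroll the circle once
    best = run = 0
    for b in doubled:
        run = run + 1 if b == 1 else 0
        best = max(best, run)
    return min(best, n)

def mode_angle(bits):
    """Angular extent of the largest excitatory region, matched against the
    mode angles 360*j/n of step (3a)."""
    n = len(bits)
    return longest_circular_run(bits) * 360.0 / n
```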
(4) Considering the gray-level variation of the image pixels, compute the gray-level change value of pixel x from the horizontal gradient magnitude G_h(x) and the vertical gradient magnitude G_v(x) computed in step (2a);
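The gray-level change formula itself is an image missing from this copy of the text; the Euclidean gradient magnitude sqrt(G_h² + G_v²) is the standard choice and is assumed in this sketch.

```python
import numpy as np

def grey_change(gh, gv):
    """Gray-level change value of a pixel from its horizontal and vertical
    gradient responses.  The Euclidean magnitude is an assumption: the
    patent only says the value is computed from G_h(x) and G_v(x)."""
    return np.hypot(np.asarray(gh, dtype=float), np.asarray(gv, dtype=float))
```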
(5) Directly count the number of spatial-structure distributions of each mode and draw the texture histogram:
(5a) According to the classification result of step (3b), directly count the number H(k) of spatial-structure distributions in the image that fall in the k-th of the n orientation-selective modes, where N denotes the size of the input image and k ∈ {1, …, n};
(5b) Plot H(k), as a percentage of the total number of spatial-structure distributions, as a texture histogram with a tool such as MATLAB; this texture histogram is the texture feature extraction result of the image.
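Step (5) amounts to a normalised histogram over the n modes. A minimal sketch (plotting omitted; any bar-chart tool can replace the MATLAB step):

```python
import numpy as np

def texture_histogram(modes, n):
    """H(k): fraction of pixels whose spatial-structure distribution falls
    in the k-th orientation-selective mode, k = 1..n, expressed as a
    percentage of all pixels."""
    modes = np.asarray(modes)
    counts = np.array([(modes == k).sum() for k in range(1, n + 1)], dtype=float)
    return 100.0 * counts / counts.sum()
```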
Technical scheme two:
A texture feature extraction method based on orientation selectivity, comprising the following steps:
1) Input the image to be processed, of size N×N. According to the orientation-selectivity principle of the optic nerve, model the spatial-structure distribution characteristic of an arbitrary pixel x, in which the surrounding set consists of n pixels chosen on a circular region around x, x_i denotes the i-th of these pixels, the distribution characteristic is an arrangement of responses, and it records the set of interaction types between pixel x and the pixels of its surrounding region;
2) Determine the spatial-structure distribution of pixel x:
2a) Compute the azimuth of pixel x:

θ(x) = arctan( G_v(x) / G_h(x) )

where G_v(x) and G_h(x) are respectively the vertical and horizontal gradient magnitudes of the image after Prewitt edge detection, obtained as G_v = f_v * I and G_h = f_h * I, with

f_v = (1/3) [ 1 1 1 ; 0 0 0 ; −1 −1 −1 ], the vertical Prewitt operator,
f_h = (1/3) [ 1 0 −1 ; 1 0 −1 ; 1 0 −1 ], the horizontal Prewitt operator,

and '*' denoting the convolution operation;
2b) Set an azimuth discrimination threshold: compare the absolute value |θ(x) − θ(x_i)| of the azimuth difference between pixel x and each pixel x_i of its surrounding set with the threshold:
If the absolute azimuth difference of the two pixels is less than the threshold, the interaction relationship between them is judged to be excitatory and denoted '1'; otherwise, it is judged to be inhibitory and denoted '0';
The distribution of the resulting n '1'/'0' values determines the interaction types between pixel x and its n surrounding pixels and yields the spatial-structure distribution of pixel x;
3) Reduce the spatial-structure distribution of pixel x to one of n orientation-selective modes:
3a) According to the number n of pixels in the surrounding set, divide the 360-degree circular local region into n equal classes, the angle corresponding to class j being 360j/n, j = 0, 1, …, n−1; these n classes are defined as the n orientation-selective modes;
3b) Among all regions enclosed by excitatory '1' values, choose the largest region, the one outside which no excitatory '1' remains; match the angle corresponding to this largest region against the angles of the n orientation-selective modes classified in 3a), and assign each successfully matched distribution to the corresponding mode;
4) Compute the gray-level change value of pixel x from the gradient magnitudes G_h(x) and G_v(x) computed in step 2a);
5) Combine the spatial-structure mode of pixel x with the gray-level change value and draw the texture histogram:
First, add the gray-level change value as a weight and count the weighted number H_w(k) of spatial-structure distributions in the image that fall in the k-th of the n orientation-selective modes,
where w(x) is the weight set according to the gray-level change value of pixel x (to simplify computation it is taken directly as that value), N denotes the size of the input image, and k ∈ {1, …, n};
Then plot H_w(k), as a percentage of the total of the weighted structure distributions, as a weighted texture histogram with a tool such as MATLAB. The texture histogram finally obtained is the texture feature extraction result of the image.
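The weighted histogram of technical scheme two differs from the plain histogram only in that each pixel contributes its gray-level change value w(x) rather than a unit count. A minimal sketch, assuming w(x) is the gradient magnitude as the text suggests:

```python
import numpy as np

def weighted_texture_histogram(modes, weights, n):
    """H_w(k): same binning as the plain histogram, but each pixel
    contributes its gray-level change value w(x) instead of 1.  Taking
    w(x) = gradient magnitude is our assumption; the patent only says
    w(x) is set from the gray-level change value."""
    modes = np.asarray(modes)
    weights = np.asarray(weights, dtype=float)
    hw = np.array([weights[modes == k].sum() for k in range(1, n + 1)])
    return 100.0 * hw / hw.sum()
```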
Compared with the prior art, the present invention has the following advantages:
1. By simulating the processing mechanism of the human visual system and taking into account the sensitivity of visual neurons to orientation selection, the invention adopts, with the support of experimental results, a new pattern to describe the spatial-structure distribution of an image. This avoids the noise sensitivity of models built directly from pixel gray values; and because the orientation-selective pattern is combined with the gray-level variation, the classification of noisy images is significantly improved.
2. Because the procedure for deciding the spatial-structure distribution of a pixel is simple and the number of summarised classes is small, the invention shares the computational simplicity of the local binary pattern (LBP), and experiments show that it is also rotation invariant; when classifying the textures of noisy images, however, its performance is clearly better than that of LBP, i.e. it is robust to noise.
Brief description of the drawings
Fig. 1 is the overall flowchart of the invention;
Fig. 2 is the sub-flowchart of counting the orientation-selectivity-based spatial-structure distributions of the pixels in the invention;
Fig. 3 is a schematic diagram of judging the interaction types between a single pixel and its surrounding region in the invention;
Fig. 4 is a schematic diagram of the n orientation-selective modes adopted in the invention;
Fig. 5 shows how the texture classification performance of simulation experiment 1 of the invention varies as the noise amplitude increases;
Fig. 6 shows how the texture classification performance of simulation experiment 2 of the invention varies as the noise amplitude increases.
Embodiments
With reference to Fig. 1, the invention provides the following two embodiments.
Embodiment 1: weighted texture feature extraction based on orientation selectivity
The steps of this embodiment are as follows:
Step 1: simulate the spatial-structure distribution characteristic of an arbitrary pixel x of the image according to the orientation-selectivity principle of the optic nerve, and derive the formula for computing the spatial-structure distribution.
(1a) Input the image to be processed, of size N×N, and take an arbitrary pixel x from it; according to the orientation-selectivity principle of the optic nerve, model the spatial-structure distribution characteristic of the pixel,
where the distribution characteristic is an arrangement of responses representing the interaction relationship between pixel x and the circular region around it, the surrounding set consists of the n pixels chosen on the circular region around x, and x_i denotes the i-th of these pixels;
(1b) Since pixels do not have the property of independence, to simplify the computation of the interaction between pixel x and its surrounding region, the theory proposed by Hubel and Wiesel in their neural feedback model is adopted here, and only the interaction between two elements is considered,
i.e. the interaction relationship between the two pixels x and x_i.
Step 2: determine the spatial-structure distribution of pixel x according to the orientation-selectivity principle.
With reference to Fig. 2, this step is realised as follows:
(2a) Compute the vertical gradient magnitude G_v and horizontal gradient magnitude G_h of the image as G_v = f_v * I and G_h = f_h * I,
where f_v = (1/3) [ 1 1 1 ; 0 0 0 ; −1 −1 −1 ] is the vertical Prewitt operator, f_h = (1/3) [ 1 0 −1 ; 1 0 −1 ; 1 0 −1 ] is the horizontal Prewitt operator, and '*' denotes the convolution operation;
(2b) Compute the azimuth θ(x) of pixel x from the gradient magnitudes of (2a):

θ(x) = arctan( G_v(x) / G_h(x) )

(2c) Set the azimuth discrimination threshold according to the ability of the human eye to distinguish orientation differences; then compare the absolute value |θ(x) − θ(x_i)| of the azimuth difference between pixel x and the i-th pixel x_i of its surrounding set with the threshold:
If the absolute azimuth difference of the two pixels is less than the threshold, the interaction relationship between them is judged to be excitatory and denoted '1'; otherwise, it is judged to be inhibitory and denoted '0'.
(2d) Judge the interaction types between pixel x and the n pixels of its surrounding region by the method of (2c); the arranged distribution of these n '1'/'0' values determines the interaction types and yields the spatial-structure distribution of pixel x, as shown in Fig. 3.
The arrow in the middle of Fig. 3 represents the orientation of pixel x, and the 8 surrounding arrows represent the orientations of the 8 pixels of the surrounding region. Comparing the azimuth differences between the central arrow and the 8 surrounding arrows with the threshold, 3 of the differences are clearly less than 6°, so they are denoted '1' and the rest '0', finally producing the arrangement of 8 '1'/'0' values shown on the right of Fig. 3; this arrangement is the spatial-structure distribution of pixel x.
Step 3: reduce the spatial-structure distribution of pixel x to one of n orientation-selective modes.
(3a) According to the number n of pixels in the surrounding set, divide the 360-degree circular local region into n equal classes, the angle corresponding to class j being 360j/n, j = 0, 1, …, n−1; these n classes are defined as the n orientation-selective modes.
(3b) From the spatial-structure distribution of pixel x, take all regions enclosed by excitatory '1' values and choose the largest among them; match the angle corresponding to this region against the angles of the n orientation-selective modes of (3a), and assign each successfully matched distribution to the corresponding mode.
As shown in Fig. 4, taking n = 8, the 360-degree circular region is divided into 8 classes whose corresponding angles are 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315°, correspondingly producing 8 orientation-selective modes. The different spatial-structure distributions are then assigned to their classes by the angle corresponding to the largest region enclosed by excitatory '1' values; for example, the orientation-selective mode corresponding to the 180° angle in the figure clearly collects all distributions whose largest '1'-enclosed region has an angle of 180°.
Step 4: considering the gray-level variation of the image pixels, compute the gray-level change value of pixel x.
The gray-level change value of pixel x is computed from the horizontal gradient magnitude G_h(x) and the vertical gradient magnitude G_v(x) obtained in step (2a).
Step 5: combine the spatial-structure mode of pixel x with the gray-level change value and draw the weighted texture histogram.
(5a) Add the gray-level change value as a weight, then count the weighted number H_w(k) of spatial-structure distributions in the image that fall in the k-th of the n orientation-selective modes,
where w(x) is the weight set according to the gray-level change value of pixel x (to simplify computation it is taken directly as that value), N denotes the size of the input image, and k ∈ {1, …, n};
(5b) Plot H_w(k), as a percentage of the total of the weighted structure distributions, as a weighted texture histogram with a tool such as MATLAB; the weighted texture histogram finally obtained is the texture feature extraction result of the image.
Embodiment 2: texture feature extraction based on orientation selectivity
The steps of this embodiment are as follows:
Step 1 is identical to step 1 of embodiment 1.
Step 2 is identical to step 2 of embodiment 1.
Step 3 is identical to step 3 of embodiment 1.
Step 4 is identical to step 4 of embodiment 1.
Step 5: directly count the number of spatial-structure distributions of each mode and draw the texture histogram.
(5.1) Directly count the number H(k) of spatial-structure distributions in the image that fall in the k-th of the n orientation-selective modes, where N denotes the size of the input image and k ∈ {1, …, n};
(5.2) Plot H(k), as a percentage of the total number of structure distributions, as a texture histogram with a tool such as MATLAB; this texture histogram is the texture feature extraction result of the image.
The effect of the invention can be further illustrated by the following simulation experiments:
1. Principle of the simulation experiments
Texture histograms are extracted from the experimental texture images, and the difference D(H_1, H_2) between two texture histograms H_1 and H_2 is measured with the chi-square distance function; D(H_1, H_2) is then used to compare the classification performance obtained with H_1 and H_2. The chi-square distance, a weighted L2 norm, is expressed as:

D(H_1, H_2) = Σ_{k=1}^{n} (H_1(k) − H_2(k))² / (H_1(k) + H_2(k)).
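The chi-square distance above can be sketched directly; the small epsilon guarding empty bins is an implementation detail not present in the formula.

```python
import numpy as np

def chi_square_distance(h1, h2, eps=1e-12):
    """Chi-square distance between two texture histograms:
    D(H1, H2) = sum_k (H1(k) - H2(k))^2 / (H1(k) + H2(k))."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    return float(np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))
```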
2. Simulation database
The experiments use Outex, a public texture database designed specifically for texture classification. It contains 24 texture classes, each captured under three illuminations, 'horizon', 'inca' and 't184', and at nine angles: 0°, 5°, 10°, 15°, 30°, 45°, 60°, 75° and 90°. Two Outex series are used, Outex_TC10 and Outex_TC12; the data for training and testing are chosen as follows:
Outex_TC10 series: the 480 texture images in the database with illumination 'inca' and angle 0° are chosen as the texture classification training samples; the remaining 480 × 8 images serve as test images;
Outex_TC12 series: the training samples are chosen in the same way as in the Outex_TC10 series, and the test set consists of the 480 × 2 × 9 texture images in the database with illuminations 'horizon' and 't184' at all nine angles.
3. Simulation content
Simulation 1: texture histograms of the training samples of database Outex_TC10 are extracted by the existing LBP method, the improved LBPV method, and the two implementations of the present invention. White Gaussian noise with peak signal-to-noise ratios (PSNR) of 30 dB, 27 dB and 23 dB is then added to the training samples, the texture histograms of the noise-polluted samples are extracted again by the above four methods, and the difference between the histograms extracted before and after adding noise is measured with the chi-square distance function. The results are plotted against the intensity of the added noise as a texture classification performance curve, shown in Fig. 5. The abscissa of Fig. 5 is the PSNR of the noisy texture images, reflecting the intensity of the added noise; the ordinate is the classification rate, reflecting the performance of the four methods.
As can be seen from Fig. 5, for the training samples of database TC10, the classification performance of LBP and LBPV drops sharply as the added noise intensity increases, whereas the drops of the invention's orientation-selective texture feature extraction method OST and of its weighted variant WOST are much smaller. When classifying the textures of the noisy images of database TC10, the invention therefore outperforms the existing LBP method.
Simulation 2: following the same steps, the texture classification performance curves of the four methods on the training samples of database Outex_TC12 are obtained as a function of the added noise intensity, shown in Fig. 6. The abscissa of Fig. 6 is the PSNR of the noisy texture images, reflecting the intensity of the added noise; the ordinate is the classification rate, reflecting the performance of the four methods.
As can be seen from Fig. 6, for the training samples of database TC12, the classification performance of LBP and LBPV again drops sharply as the added noise intensity increases, while the drops of the invention's OST and WOST methods are again much smaller. When classifying the textures of the noisy images of database TC12, the invention therefore also clearly outperforms the existing LBP method.
4. Analysis of the simulation results
The results of simulations 1 and 2 are compiled into texture classification statistics, shown in Table 1.
Table 1: comparison of the classification performance of the four methods on noisy texture images
As can be seen from Table 1, the classification performance of the existing LBP method declines sharply as the interfering noise strengthens, and the improved LBPV declines markedly as well. The performance of the invention also declines, but by a smaller margin: on the noisy TC10 database the orientation-selective method OST improves the classification rate over LBP by 40%, on the noisy TC12 database it improves over LBP by 25%, and on both noisy databases the weighted method WOST improves over LBPV by 15%. The invention is therefore remarkably effective in the texture classification of noisy images and has clear advantages over the prior art.

Claims (3)

1. A texture feature extraction method based on orientation selectivity, comprising the following steps:
(1) Input the image to be processed, of size N×N. According to the orientation-selectivity principle of the optic nerve, model the spatial-structure distribution characteristic of an arbitrary pixel x, in which the surrounding set consists of n pixels chosen on a circular region around x, x_i denotes the i-th of these pixels, the distribution characteristic is an arrangement of responses, and it records the set of interaction types between pixel x and the pixels of its surrounding region;
(2) Determine the spatial-structure distribution of pixel x:
(2a) Compute the azimuth of pixel x:

θ(x) = arctan( G_v(x) / G_h(x) )

where G_v(x) and G_h(x) are respectively the vertical and horizontal gradient magnitudes of the image after Prewitt edge detection, obtained as G_v = f_v * I and G_h = f_h * I, with

f_v = (1/3) [ 1 1 1 ; 0 0 0 ; −1 −1 −1 ], the vertical Prewitt operator,
f_h = (1/3) [ 1 0 −1 ; 1 0 −1 ; 1 0 −1 ], the horizontal Prewitt operator,

and '*' denoting the convolution operation;
(2b) Set an azimuth discrimination threshold: compare the absolute value |θ(x) − θ(x_i)| of the azimuth difference between pixel x and each pixel x_i of its surrounding set with the threshold:
If the absolute azimuth difference of the two pixels is less than the threshold, the interaction relationship between them is judged to be excitatory and denoted '1'; otherwise, it is judged to be inhibitory and denoted '0';
The distribution of the resulting n '1'/'0' values determines the interaction types between pixel x and its n surrounding pixels and yields the spatial-structure distribution of pixel x;
(3) Reduce the spatial-structure distribution of pixel x to one of n orientation-selective modes:
(3a) According to the number n of pixels in the surrounding set, divide the 360-degree circular local region into n equal classes, the angle corresponding to class j being 360j/n, j = 0, 1, …, n−1; these n classes are defined as the n orientation-selective modes;
(3b) Among all regions enclosed by excitatory '1' values, choose the largest region, the one outside which no excitatory '1' remains; match the angle corresponding to this largest region against the angles of the n orientation-selective modes classified in (3a), and assign each successfully matched distribution to the corresponding mode;
(4) Compute the gray-level change value of pixel x from the gradient magnitudes G_h(x) and G_v(x) computed in step (2a);
(5) Directly count the number of spatial-structure distributions of each mode and draw the texture histogram:
(5a) According to the classification result of step (3b), directly count the number H(k) of spatial-structure distributions in the image that fall in the k-th of the n orientation-selective modes, where N denotes the size of the input image and k ∈ {1, …, n};
(5b) Plot H(k), as a percentage of the total number of structure distributions, as a texture histogram with a tool such as MATLAB; this texture histogram is the texture feature extraction result of the image.
2. The method according to claim 1, wherein matching the angle corresponding to the largest region against the angles of the n orientation-selective modes classified in (3a), as described in step (3b), comprises: first listing all regions enclosed end to end by '1' values and finding among them the region of largest area; then comparing the angle corresponding to this region one by one with the angles of the n classified orientation-selective modes, identical angles being matched together, i.e. the corresponding spatial-structure distribution belongs to the matched orientation-selective mode.
3. A texture feature extraction method based on orientation selectivity, comprising the steps:
1) input a pending image of size N×N and, according to the orientation-selection principle of the optic nerve, simulate the spatial structure distribution characteristic of an arbitrary pixel x, defined on the set of n pixels chosen on a circular region around pixel x, where xi denotes the i-th such pixel; the characteristic is an arrangement of responses, namely the set of interaction types between pixel x and the pixels of its surrounding region;
2) determine the spatial structure distribution of pixel x:
2a) calculate the azimuth of pixel x:
θ(x) = arctan( Gv(x) / Gh(x) )
where Gv(x) and Gh(x) denote the vertical- and horizontal-direction gradient magnitudes of the image after Prewitt edge detection, Gv(x) = fv * I and Gh(x) = fh * I, in which
fv = (1/3) [ 1 1 1 ; 0 0 0 ; -1 -1 -1 ] is the Prewitt operator of the vertical direction,
fh = (1/3) [ 1 0 -1 ; 1 0 -1 ; 1 0 -1 ] is the Prewitt operator of the horizontal direction,
and '*' denotes the convolution operation;
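Step 2a) can be sketched in Python with NumPy; this is an illustrative reading of the claim, not the patented implementation. The helper names `conv2_same` and `azimuth_map` are hypothetical, and `arctan2` is used in place of the claim's arctan quotient so the azimuth stays defined where Gh(x) = 0:

```python
import numpy as np

# Prewitt operators as given in step 2a), with the 1/3 normalisation.
F_V = np.array([[1, 1, 1], [0, 0, 0], [-1, -1, -1]]) / 3.0
F_H = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]]) / 3.0

def conv2_same(img, kernel):
    """Plain 2-D convolution of a 3x3 kernel, zero padding, 'same' size."""
    k = np.flipud(np.fliplr(kernel))        # convolution flips the kernel
    p = np.pad(img, 1, mode="constant")
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(p[i:i + 3, j:j + 3] * k)
    return out

def azimuth_map(img):
    """Per-pixel azimuth theta(x) = arctan(Gv(x)/Gh(x)) via arctan2."""
    g_v = conv2_same(img, F_V)
    g_h = conv2_same(img, F_H)
    return np.arctan2(g_v, g_h), g_h, g_v
```

On a horizontal intensity ramp the interior vertical gradient is zero and the azimuth is zero, as expected for a purely horizontal gray-level change.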
2b) set an azimuth discrimination threshold T and compare the absolute value |θ(x) − θ(xi)| of the difference between the azimuth θ(x) of pixel x and the azimuth θ(xi) of each pixel xi in its surrounding-region set with T:
if the absolute difference of the two pixels' azimuths is less than the threshold T, the interaction relationship between them is determined to be of excitation type, denoted '1'; otherwise it is determined to be of inhibition type, denoted '0';
the distribution of the resulting n values of '1' and '0' determines the types of action between pixel x and its n surrounding pixels, yielding the spatial structure distribution of pixel x;
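The excitation/inhibition decision of step 2b) amounts to thresholding the azimuth differences. A minimal sketch, with an illustrative function name (the threshold symbol T is also a label of convenience):

```python
import numpy as np

def interaction_types(theta_x, theta_neighbors, threshold):
    """Step 2b): '1' (excitation) where |theta(x) - theta(x_i)| < threshold,
    '0' (inhibition) otherwise -- the spatial structure distribution of x."""
    diff = np.abs(theta_x - np.asarray(theta_neighbors, dtype=float))
    return (diff < threshold).astype(int).tolist()
```

For example, `interaction_types(0.0, [0.1, 1.0, -0.05, 0.5], 0.2)` returns `[1, 0, 1, 0]`.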
3) summarize the spatial structure distribution of pixel x into one of the n orientation-selectivity-based patterns:
3a) according to the number n of pixels in the surrounding-region set, divide the 360-degree circular local region into n classes by angular size, the angle corresponding to the j-th class being θj = j·360°/n, j = 0, 1, ..., n−1; these n classes are defined as the n orientation-selectivity-based patterns;
3b) take all regions enclosed by excitation values '1' and choose the largest among them; this largest region must be maximal, i.e., the positions immediately outside it contain no excitation '1'; match the angle corresponding to the largest region against the angles of the n orientation-selectivity-based patterns classified in 3a); on a successful match, the spatial structure distribution is assigned to the corresponding pattern;
4) from the gradient magnitudes Gh(x) and Gv(x) calculated in step 2a), compute the gray-level change value of pixel x as the gradient magnitude C(x) = √(Gh(x)² + Gv(x)²);
5) combine the spatial structure pattern of pixel x with the gray-level change value and draw the texture histogram:
first, add the gray-level change value as a weight and compute, over all pixels in the image whose spatial structure distribution belongs to the k-th of the n orientation-selectivity-based patterns, the weighted quantity Hw(k) = Σ w(x), where w(x) is a weight set according to the gray-level change value of pixel x and, to simplify computation, is taken directly as the gray-level change value; N denotes the size of the N×N input image and k ∈ {1, ..., n};
then, using MATLAB tools, plot the percentage of Hw(k) in the total weighted quantity of spatial structure distributions as the weighted texture histogram. The weighted texture histogram finally obtained is the texture feature extraction result of the image.
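The weighted histogram of step 5) differs from the unweighted one of claim 1 only in summing the gray-level change weights w(x) instead of counting pixels. A minimal sketch with illustrative names:

```python
import numpy as np

def weighted_texture_histogram(pattern_labels, gray_change, n):
    """Step 5): Hw(k) sums w(x) over pixels of pattern k, with w(x) taken
    directly as the gray-level change value; returned as percentages."""
    labels = np.asarray(pattern_labels).ravel()
    w = np.asarray(gray_change, dtype=float).ravel()
    hw = np.array([w[labels == k].sum() for k in range(1, n + 1)])
    total = hw.sum()
    return (100.0 * hw / total).tolist() if total > 0 else hw.tolist()
```

With labels `[1, 2, 2]` and gray-change weights `[2.0, 1.0, 1.0]`, both patterns receive weight 2.0, so the weighted histogram is `[50.0, 50.0]`.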
CN201510155433.9A 2015-04-02 2015-04-02 The method of gray level image texture feature extraction based on orientation selectivity Active CN104732238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510155433.9A CN104732238B (en) 2015-04-02 2015-04-02 The method of gray level image texture feature extraction based on orientation selectivity


Publications (2)

Publication Number Publication Date
CN104732238A true CN104732238A (en) 2015-06-24
CN104732238B CN104732238B (en) 2018-03-06

Family

ID=53456112


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107862709A (en) * 2017-09-28 2018-03-30 北京华航无线电测量研究所 A kind of method for describing texture of image of multi-direction pattern concatenate rule
CN109272017A (en) * 2018-08-08 2019-01-25 太原理工大学 The vibration signal mode identification method and system of distributed fiberoptic sensor
CN115330774A (en) * 2022-10-12 2022-11-11 南通林多光学仪器有限公司 Welding image molten pool edge detection method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100329517A1 (en) * 2009-06-26 2010-12-30 Microsoft Corporation Boosted face verification
CN102831427A (en) * 2012-09-06 2012-12-19 湖南致尚科技有限公司 Texture feature extraction method fused with visual significance and gray level co-occurrence matrix (GLCM)


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PRIEBE N J et al.: "Inhibition, spike threshold, and stimulus selectivity in primary visual cortex", ScienceDirect *
BI Yuhui: "Research on Forest Texture Feature Extraction Methods for Color Aerial Images", China Doctoral Dissertations Full-text Database, Agricultural Science and Technology *




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant