CN104732238B - The method of gray level image texture feature extraction based on orientation selectivity - Google Patents


Publication number
CN104732238B
Authority
CN
China
Prior art keywords
pixel
texture
space structure
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510155433.9A
Other languages
Chinese (zh)
Other versions
CN104732238A (en)
Inventor
吴金建
万文菲
张亚中
石光明
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201510155433.9A priority Critical patent/CN104732238B/en
Publication of CN104732238A publication Critical patent/CN104732238A/en
Application granted granted Critical
Publication of CN104732238B publication Critical patent/CN104732238B/en


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a texture feature extraction method based on orientation selectivity, mainly solving the problem that the existing LBP method performs poorly when classifying the textures of noisy images. The implementation steps are: 1. simulate the spatial structure distribution of image pixels according to the orientation selectivity principle of the optic nerve; 2. determine the spatial structure distribution of each pixel by comparing the azimuth differences between pixels against a given threshold; 3. summarize the spatial structure distributions of all pixels into several orientation-selectivity-based modes; 4. calculate the grey-scale change value of each pixel; 5. count the number of spatial structure distributions belonging to each mode in the image to obtain a texture histogram, and combine it with the grey-scale change values to obtain a weighted texture histogram. By simulating the selective sensitivity of the human optic nerve to orientation, the present invention reduces the interference of noise on image texture classification, and can be used for image processing and computer vision tasks such as image classification and image understanding.

Description

The method of gray level image texture feature extraction based on orientation selectivity
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a texture feature extraction method for grey-level images, which can be used for image classification.
Technical background
With the rapid development of network and multimedia technology, a large amount of image data of different types has emerged on the Internet. Image data has characteristics that conventional data lacks, such as inconsistent formats, rich and diverse information content, and temporal and spatial characteristics. How to classify images well has therefore become a popular research topic.
Texture is an important characteristic of images: textures are widespread in nature, and texture is an intrinsic property of object surfaces. Independent of brightness or colour changes, texture is a visual feature that can express homogeneity phenomena in an image and characterize the regularity of the grey-level spatial distribution of a pixel's neighbourhood, so texture has great theoretical and practical value.
Professor Timo Ojala of the University of Oulu, Finland, after analysing and comparing texture operators such as Laws texture energies, the grey-level co-occurrence matrix, grey-level and grey-matrix operators, and central-symmetric covariance, proposed the local binary pattern (LBP) feature operator in 1996 for describing image texture. Because the LBP operator is simple to compute and has outstanding texture discrimination ability, it has been widely studied and used ever since it appeared. The operator became especially well known after the article "Multiresolution Gray-Scale and Rotation Invariant Texture Classification with Local Binary Patterns" that Timo Ojala published in IEEE TPAMI in 2002. After LBP was proposed, Professor Ojala continued to address the shortcomings of the original operator with various improvements and optimizations, obtaining, for example, the LBP operator with P sampling points in a circular region of radius R, rotation-invariant LBP patterns, and uniform LBP patterns. However, LBP has a major defect that limits its application: it is sensitive to noise, the texture features it extracts perform poorly when classifying noisy images, and it is therefore unsuitable for the classification of noisy images.
Summary of the invention
The object of the invention is to address the above defects of the prior art by proposing a texture feature extraction method based on orientation selectivity, so as to reduce the interference of noise on image classification and improve the classification of noisy images.
The technical solution of the invention is realized as follows:
The idea of the present invention is this: the extraction of visual information by the human primary visual cortex exhibits a marked orientation-selectivity characteristic; when classifying image textures, two factors are therefore considered, the spatial correlation and the grey-scale change of local regions in the image. The implementation is as follows:
Technical solution 1:
A texture feature extraction method based on orientation selectivity, comprising the following steps:
(1) Input a pending image I of size N × N. According to the orientation selectivity principle of the optic nerve, model the spatial structure distribution characteristic S(x) of any pixel x, where X = {x1, …, xn} is the set of n pixels chosen from a circular region around pixel x, xi denotes the i-th pixel, S(x) denotes an arrangement of the responses in the brackets, and the types of interaction between pixel x and its surrounding pixel set X constitute S(x);
(2) Determine the spatial structure distribution S(x) of pixel x:
(2a) Calculate the azimuth of pixel x:
θ(x) = arctan(Gv(x)/Gh(x))
where Gv(x) and Gh(x) are the vertical and horizontal gradient magnitudes of the image after Prewitt edge detection, Gv = fv * I and Gh = fh * I, in which fv is the Prewitt operator in the vertical direction, fh is the Prewitt operator in the horizontal direction, and '*' denotes convolution;
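The azimuth computation of step (2a) can be sketched in Python. The exact Prewitt kernels are images in the source, so the 3×3 kernels below are an assumption (a common form of the operator), and `arctan2` is used in place of the patent's arctan(Gv/Gh) to avoid division by zero:

```python
import numpy as np

# Assumed 3x3 Prewitt kernels (the patent's exact kernels are images in the source)
F_V = np.array([[ 1,  1,  1],
                [ 0,  0,  0],
                [-1, -1, -1]], dtype=float)   # vertical-direction operator fv
F_H = F_V.T                                    # horizontal-direction operator fh

def convolve2d_same(img, k):
    """Minimal 'same'-size 2-D convolution with zero padding."""
    img = np.pad(img, 1)
    out = np.zeros((img.shape[0] - 2, img.shape[1] - 2))
    kf = k[::-1, ::-1]  # flip the kernel: true convolution, not correlation
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(img[r:r+3, c:c+3] * kf)
    return out

def azimuth(img):
    """Per-pixel azimuth theta(x) = arctan(Gv/Gh) in degrees, plus both gradients."""
    gv = convolve2d_same(img, F_V)
    gh = convolve2d_same(img, F_H)
    return np.degrees(np.arctan2(gv, gh)), gv, gh
```

On an image that is a pure left-to-right ramp, interior pixels have Gv = 0 and a positive Gh, so the azimuth is 0°, as expected for a horizontal grey-level gradient.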
(2b) Set the azimuth discrimination threshold T, and compare the absolute azimuth difference |θ(x) − θ(xi)| between pixel x and each pixel xi in its surrounding set X against T:
If the absolute azimuth difference of the two pixels is less than the threshold T, the interaction between them is determined to be excitatory and is denoted by '1'; otherwise, the interaction between them is determined to be inhibitory and is denoted by '0';
The interaction types between pixel x and its n surrounding pixels are thus determined by the arrangement of the n '1'/'0' values, which gives the spatial structure distribution S(x) of pixel x;
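The excitatory/inhibitory comparison of step (2b) can be sketched as follows. The default threshold of 6 degrees follows the worked example given later in the description; angular wrap-around is ignored for simplicity:

```python
def interaction_pattern(theta_x, theta_neighbors, t=6.0):
    """Binary pattern of step (2b): '1' (excitatory) where
    |theta(x) - theta(x_i)| < t, else '0' (inhibitory).
    t = 6 degrees follows the worked example in the description."""
    return [1 if abs(theta_x - th) < t else 0 for th in theta_neighbors]
```

For example, `interaction_pattern(10.0, [12.0, 30.0, 8.0])` yields `[1, 0, 1]`: the first and third neighbours are within 6° of the centre pixel's azimuth.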
(3) Summarize the spatial structure distribution S(x) of each pixel into one of n orientation-selectivity-based modes P:
(3a) According to the number n of pixels in the surrounding set X, divide the 360-degree circular local region into n equal classes; the angle corresponding to class j is j·(360/n) degrees, j = 0, 1, …, n−1, and these n classes are defined as the n orientation-selectivity-based modes P;
(3b) Take all regions enclosed by excitatory '1' values and choose the largest such region, which must not contain any excitatory '1' outside it; match the angle corresponding to this largest region against the angles of the n modes classified in (3a), and assign each successfully matched S(x) to the corresponding mode P;
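One reading of step (3b) is that the largest region enclosed by excitatory '1' values is the longest circular run of 1s in the binary pattern, and its angular extent (run length × 360/n degrees) selects the matching mode. A hypothetical sketch under that assumption:

```python
def dominant_mode(pattern):
    """Length of the longest circular run of 1s in the binary pattern.
    With n neighbours each covering 360/n degrees, a run of length L
    corresponds to the mode whose angle is L * 360/n degrees (an
    interpretation of the patent's angle-matching step)."""
    n = len(pattern)
    if all(v == 1 for v in pattern):
        return n
    best = cur = 0
    for v in pattern + pattern:  # doubled list handles wrap-around runs
        cur = cur + 1 if v == 1 else 0
        best = max(best, cur)
    return min(best, n)
```

With n = 8, a pattern whose longest '1' run has length 4 maps to the 4 × 45° = 180° mode, consistent with the worked example in the detailed description.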
(4) Consider the grey-scale change of the image pixels and calculate the grey-scale change value of pixel x.
From the horizontal gradient magnitude Gh(x) and the vertical gradient magnitude Gv(x) calculated in step (2a), compute the grey-scale change value of pixel x as v(x) = sqrt(Gh(x)² + Gv(x)²);
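A minimal sketch of the grey-scale change value of step (4); the formula itself is an image in the source, so the Euclidean gradient magnitude below is an assumption consistent with the surrounding text:

```python
import math

def grey_change(gh, gv):
    """Grey-scale change value v(x) of a pixel, taken here as the
    Euclidean magnitude of the two Prewitt gradients (an assumption:
    the exact formula is an image in the source)."""
    return math.sqrt(gh * gh + gv * gv)
```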
(5) Directly count the spatial structure modes S(x) of the pixels and draw the texture histogram:
(5a) According to the classification result of step (3b), directly count in image I the number H(k) of spatial structure distributions that fall into the k-th of the n orientation-selectivity-based modes P:
H(k) = Σx f(S(x), Pk), where f(S(x), Pk) = 1 if S(x) belongs to mode Pk and 0 otherwise, N denotes the size of the input image, and k ∈ [1, n];
(5b) Plot the percentage of H(k) in the total number of structure distributions as a texture histogram using MATLAB; this texture histogram is the result of the texture feature extraction of the image.
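The counting of step (5a)–(5b) can be sketched as a normalized histogram over per-pixel mode indices (the MATLAB plotting step is replaced here by simply returning the percentages; index 0 is reserved for all-'0' patterns, an assumption of this sketch):

```python
def texture_histogram(modes, n=8):
    """H(k) as fractions: share of pixels whose dominant mode index is k,
    k = 1..n (index 0 collects patterns with no excitatory '1' at all,
    an assumption of this sketch)."""
    counts = [0] * (n + 1)
    for k in modes:
        counts[k] += 1
    total = max(len(modes), 1)
    return [c / total for c in counts]
```

For example, the per-pixel mode list `[1, 1, 2, 4]` with n = 4 yields the histogram `[0, 0.5, 0.25, 0, 0.25]`.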
Technical solution 2:
A texture feature extraction method based on orientation selectivity, comprising the following steps:
1) Input a pending image I of size N × N. According to the orientation selectivity principle of the optic nerve, model the spatial structure distribution characteristic S(x) of any pixel x, where X = {x1, …, xn} is the set of n pixels chosen from a circular region around pixel x, xi denotes the i-th pixel, S(x) denotes an arrangement of the responses in the brackets, and the types of interaction between pixel x and its surrounding pixel set X constitute S(x);
2) Determine the spatial structure distribution S(x) of pixel x:
2a) Calculate the azimuth of pixel x:
θ(x) = arctan(Gv(x)/Gh(x))
where Gv(x) and Gh(x) are the vertical and horizontal gradient magnitudes of the image after Prewitt edge detection, Gv = fv * I and Gh = fh * I, in which fv is the Prewitt operator in the vertical direction, fh is the Prewitt operator in the horizontal direction, and '*' denotes convolution;
2b) Set the azimuth discrimination threshold T, and compare the absolute azimuth difference |θ(x) − θ(xi)| between pixel x and each pixel xi in its surrounding set X against T:
If the absolute azimuth difference of the two pixels is less than the threshold T, the interaction between them is determined to be excitatory and is denoted by '1'; otherwise, the interaction between them is determined to be inhibitory and is denoted by '0';
The interaction types between pixel x and its n surrounding pixels are thus determined by the arrangement of the n '1'/'0' values, which gives the spatial structure distribution S(x) of pixel x;
3) Summarize the spatial structure distribution S(x) of each pixel into one of n orientation-selectivity-based modes P:
3a) According to the number n of pixels in the surrounding set X, divide the 360-degree circular local region into n equal classes; the angle corresponding to class j is j·(360/n) degrees, j = 0, 1, …, n−1, and these n classes are defined as the n orientation-selectivity-based modes P;
3b) Take all regions enclosed by excitatory '1' values and choose the largest such region, which must not contain any excitatory '1' outside it; match the angle corresponding to this largest region against the angles of the n modes classified in 3a), and assign each successfully matched S(x) to the corresponding mode P;
4) From the gradient magnitudes Gh(x) and Gv(x) calculated in step 2a), compute the grey-scale change value of pixel x as v(x) = sqrt(Gh(x)² + Gv(x)²);
5) Combine the spatial structure mode S(x) of each pixel x with its grey-scale change value v(x) and draw the weighted texture histogram:
First, add the grey-scale change value v(x) and count in image I the weighted number Hw(k) of spatial structure distributions that fall into the k-th of the n orientation-selectivity-based modes P:
Hw(k) = Σx w(x)·f(S(x), Pk), where w(x) is a weight set according to the grey-scale change value of pixel x and, to simplify the calculation, is taken directly as w(x) = v(x); N denotes the size of the input image and k ∈ [1, n];
Then, plot the percentage of Hw(k) in the total of the weighted structure distributions as a weighted texture histogram using MATLAB. The resulting texture histogram is the result of the texture feature extraction of the image.
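The weighted counting of step 5) can be sketched analogously, with each pixel contributing its grey-scale change value w(x) = v(x) instead of 1:

```python
def weighted_texture_histogram(modes, weights, n=8):
    """Hw(k) as fractions: each pixel contributes its grey-scale change
    value w(x) = v(x) rather than a unit count, following the weighting
    of technical solution 2 (index 0 again collects all-'0' patterns)."""
    sums = [0.0] * (n + 1)
    for k, w in zip(modes, weights):
        sums[k] += w
    total = sum(sums) or 1.0
    return [s / total for s in sums]
```

For example, modes `[1, 2, 1]` with weights `[2.0, 1.0, 1.0]` and n = 2 give the weighted histogram `[0, 0.75, 0.25]`.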
Compared with the prior art, the present invention has the following advantages:
1. Because the present invention simulates the perception mechanism of the human visual nervous system and considers the selective sensitivity of visual neurons to orientation, it combines experimental results to describe the spatial structure distribution of the image with a new kind of pattern, thereby avoiding the noise interference suffered by other methods that build models on pixel grey values; moreover, because this orientation-selection-based pattern is combined with the grey-scale change, the classification of noisy images can be greatly improved.
2. Because the discrimination process of the pixel spatial structure distribution is simple and the number of summarized classes is small, the present invention shares the advantage of simple computation with the local binary pattern LBP, and experiments prove that it is also rotation invariant; yet when classifying the textures of noisy images its performance is clearly better than that of LBP, showing robustness to noise.
Brief description of the drawings
Fig. 1 is the overall flowchart of the implementation of the present invention;
Fig. 2 is the sub-flowchart of counting the orientation-selectivity-based pixel spatial structure distributions S(x) in the present invention;
Fig. 3 is a schematic diagram of judging the interaction types between a single pixel and its surrounding region in the present invention;
Fig. 4 is a schematic diagram of the n orientation-selectivity-based modes P used in the present invention;
Fig. 5 is a graph of how the texture classification performance of simulation experiment 1 in the present invention varies as the noise amplitude increases;
Fig. 6 is a graph of how the texture classification performance of simulation experiment 2 in the present invention varies as the noise amplitude increases.
Embodiment
Referring to Fig. 1, the present invention provides the following two embodiments.
Embodiment 1: weighted texture feature extraction based on orientation selectivity
The implementation steps of this example are as follows:
Step 1. According to the orientation selectivity principle of the optic nerve, model the spatial structure distribution characteristic of any image pixel x and derive the calculation formula of its spatial structure distribution S(x).
(1a) Input a pending image I of size N × N and take any pixel x from it. According to the orientation selectivity principle of the optic nerve, model the spatial structure distribution characteristic S(x) of the pixel, where S(x) denotes the spatial structure distribution characteristic of pixel x, the brackets denote an arrangement of responses, and the interactions are those between pixel x and the surrounding circular region X = {x1, …, xn}, the set of n pixels chosen from the circular region around x, with xi the i-th pixel;
(1b) Because the pixels in X are not independent, the correlation between pixel x and its surrounding region is simplified using the theory proposed by Hubel and Wiesel in their neuron feedback model: only the interactions between pairs of elements are considered, i.e. S(x) = {r(x, x1), …, r(x, xn)}, where r(x, xi) denotes the interaction between the two pixels x and xi.
Step 2. Determine the spatial structure distribution S(x) of pixel x according to the orientation selectivity principle.
Referring to Fig. 2, this step is implemented as follows:
(2a) Calculate the vertical gradient magnitude Gv and the horizontal gradient magnitude Gh of image I: Gv = fv * I and Gh = fh * I, where fv is the Prewitt operator in the vertical direction, fh is the Prewitt operator in the horizontal direction, and '*' denotes convolution;
(2b) Calculate the azimuth θ(x) of pixel x from the gradient magnitudes of (2a): θ(x) = arctan(Gv(x)/Gh(x));
(2c) Set the azimuth discrimination threshold T according to the human eye's ability to discriminate orientation differences; then compare the absolute azimuth difference |θ(x) − θ(xi)| between pixel x and the i-th pixel xi in its surrounding set X against T:
If the absolute azimuth difference of the two pixels is less than the threshold T, the interaction between them is determined to be excitatory and is denoted by '1'; otherwise, the interaction between them is determined to be inhibitory and is denoted by '0'.
(2d) Judge the interaction types between pixel x and the n pixels of the surrounding region X by the method of (2c); the arrangement of these n '1'/'0' values determines the interaction types between pixel x and the n pixels of X and gives the spatial structure distribution S(x) of pixel x, as shown in Fig. 3.
The arrow in the middle of Fig. 3 represents the orientation of pixel x, and the 8 surrounding arrows represent the orientations of the 8 pixels of the surrounding region X. The azimuths of the 8 surrounding arrows are compared with the azimuth of the middle arrow against the threshold: 3 of the differences are clearly less than 6°, so they are denoted by '1' and the rest by '0', finally producing the arrangement of 8 '1'/'0' values shown on the right of Fig. 3, which is the spatial structure distribution S(x) of pixel x.
Step 3. Summarize the spatial structure distribution S(x) of pixel x into one of n orientation-selectivity-based modes P.
(3a) According to the number n of pixels in the surrounding set X, divide the 360-degree circular local region into n equal classes; the angle corresponding to class j is j·(360/n) degrees, j = 0, 1, …, n−1, and these n classes are defined as the n orientation-selectivity-based modes P;
(3b) Take all regions in the spatial structure distribution S(x) of pixel x that are enclosed by excitatory '1' values, choose the largest such region, and match its corresponding angle against the angles of the n orientation-selectivity-based modes of (3a); each successfully matched S(x) belongs to the corresponding mode P.
As shown in Fig. 4, taking n = 8, the 360-degree circular region is divided into 8 classes whose corresponding angles are 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315°, generating the corresponding 8 orientation-selectivity-based modes P. The different spatial structure distributions S(x) are then assigned to the corresponding classes according to the angle of their largest region enclosed by excitatory '1' values; for example, for the mode corresponding to the 180° angle in the figure, it is evident that every S(x) in that class has a largest '1'-enclosed region whose angle is 180°.
Step 4. Consider the grey-scale change of the image pixels and calculate the grey-scale change value of pixel x.
From the horizontal gradient magnitude Gh(x) and the vertical gradient magnitude Gv(x) calculated in step (2a), compute the grey-scale change value of pixel x as v(x) = sqrt(Gh(x)² + Gv(x)²).
Step 5. Combine the spatial structure mode S(x) of pixel x with the grey-scale change value v(x) and draw the weighted texture histogram.
(5a) Add the grey-scale change value v(x), then count in image I the weighted number Hw(k) of spatial structure distributions that fall into the k-th of the n orientation-selectivity-based modes P: Hw(k) = Σx w(x)·f(S(x), Pk), where w(x) is a weight set according to the grey-scale change value of pixel x and, to simplify the calculation, is taken directly as w(x) = v(x); N denotes the size of the input image and k ∈ [1, n];
(5b) Plot the percentage of Hw(k) in the total of the weighted structure distributions as a weighted texture histogram using MATLAB; the resulting weighted texture histogram is the result of the texture feature extraction of the image.
Embodiment 2: texture feature extraction based on orientation selectivity
The implementation steps of this example are as follows:
Step 1 is identical to step 1 of embodiment 1.
Step 2 is identical to step 2 of embodiment 1.
Step 3 is identical to step 3 of embodiment 1.
Step 4 is identical to step 4 of embodiment 1.
Step 5. Directly count the spatial structure modes S(x) of the pixels and draw the texture histogram.
(5.1) Directly count in image I the number H(k) of spatial structure distributions that fall into the k-th of the n orientation-selectivity-based modes P: H(k) = Σx f(S(x), Pk), where f(S(x), Pk) = 1 if S(x) belongs to mode Pk and 0 otherwise, N denotes the size of the input image, and k ∈ [1, n];
(5.2) Plot the percentage of H(k) in the total number of structure distributions as a texture histogram using MATLAB; this texture histogram is the result of the texture feature extraction of the image.
The effect of the present invention can be further illustrated by the following simulation experiments:
1. Simulation principle
Texture histograms are extracted from the experimental texture images, and the difference D(H1, H2) between two texture histograms H1 and H2 is measured with the chi-square distance, which is then used to compare the classification performance of H1 and H2. The chi-square distance here is expressed as a weighted L2 norm: D(H1, H2) = Σk (H1(k) − H2(k))² / (H1(k) + H2(k)).
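The chi-square (weighted L2) distance between two texture histograms can be sketched as:

```python
def chi_square_distance(h1, h2):
    """Chi-square (weighted L2) distance between two histograms:
    sum of (a - b)^2 / (a + b) over bins, skipping empty bin pairs."""
    return sum((a - b) ** 2 / (a + b) for a, b in zip(h1, h2) if a + b > 0)
```

Identical histograms have distance 0, and fully disjoint unit histograms such as `[1, 0]` and `[0, 1]` have distance 2.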
2. Simulation database
The experiment uses the published texture database Outex, which is dedicated to texture classification. The Outex database contains 24 texture classes, each with three illuminations, 'horizon', 'inca' and 't184', and nine angles: 0°, 5°, 10°, 15°, 30°, 45°, 60°, 75° and 90°. The experiment uses two Outex series, Outex_TC10 and Outex_TC12, whose training and test data are selected as follows:
Outex_TC10 series: the 480 texture images in the database with illumination 'inca' and angle 0° are chosen as the training samples for texture classification, and the remaining 480 × 8 images are the test images;
Outex_TC12 series: the training samples are chosen in the same way as in the Outex_TC10 series, and the test images are the 480 × 2 × 9 texture images in the database with illuminations 'horizon' and 't184' at all nine angles.
3. Simulation content
Simulation 1: texture histograms of the training samples of database Outex_TC10 are extracted with the existing LBP technique, the improved LBPV technique, and the two implementations of the present invention. White Gaussian noise with peak signal-to-noise ratios (PSNR) of 30 dB, 27 dB and 23 dB is then added to the training samples. The texture histograms of the noise-polluted training samples are extracted again with the above four methods, and the difference between the histograms extracted before and after is measured with the chi-square distance. The simulation results are then plotted as texture classification performance curves against the strength of the added noise, as shown in Fig. 5. The abscissa of Fig. 5 is the signal-to-noise ratio PSNR of the noisy texture images, reflecting the strength of the added noise; the ordinate is the classification performance, reflecting the classification effectiveness of the four methods.
As can be seen from Fig. 5, for the training samples of database TC10, as the strength of the added noise increases, the classification performance of LBP and LBPV drops sharply, while the drop of the orientation-selectivity-based texture feature extraction method OST of the present invention and of the weighted orientation-selectivity-based texture feature extraction method WOST is much smaller. Therefore, when classifying the textures of the noisy images of database TC10, the present invention clearly outperforms the existing LBP technique.
Simulation 2: applying the same steps to the training samples of database Outex_TC12 yields the texture classification performance curves of these four methods against the strength of the added noise, as shown in Fig. 6. The abscissa of Fig. 6 is the signal-to-noise ratio PSNR of the noisy texture images, reflecting the strength of the added noise; the ordinate is the classification performance, reflecting the classification effectiveness of the four methods.
As can be seen from Fig. 6, for the training samples of database TC12, as the strength of the added noise increases, the classification performance of LBP and LBPV also drops sharply, while the drop of the orientation-selectivity-based method OST and the weighted orientation-selectivity-based method WOST of the present invention is again much smaller. Therefore, when classifying the textures of the noisy images of database TC12, the present invention is also clearly better than the existing LBP technique.
4. Analysis of simulation results
The results of simulations 1 and 2 are compiled into the texture classification statistics of Table 1.
Table 1: Comparison of the classification performance of the four methods on noisy texture images
As can be seen from Table 1, the classification performance of the existing LBP method drops sharply as the interfering noise strengthens, and the improved variant LBPV also drops markedly. Although the classification performance of the present invention also declines, the decline is smaller. It can also be seen that the orientation-selectivity-based texture feature extraction method OST improves the classification performance on the noisy TC10 database by 40% over LBP and on the noisy TC12 database by 25% over LBP, while the weighted orientation-selectivity-based texture feature extraction method WOST improves the classification performance by 15% over LBPV on both noisy databases TC10 and TC12. The present invention therefore excels at the texture classification of noisy images and has clear advantages over the prior art.

Claims (3)

1. A texture feature extraction method based on orientation selectivity, comprising the following steps:
(1) Input a pending image I of size N × N. According to the orientation selectivity principle of the optic nerve, model the spatial structure distribution characteristic S(x) of any pixel x, where X = {x1, …, xn} is the set of n pixels chosen from a circular region around pixel x, xi denotes the i-th pixel, S(x) denotes an arrangement of the responses in the brackets, and the types of interaction between pixel x and its surrounding pixel set X constitute S(x);
(2) Determine the spatial structure distribution S(x) of pixel x:
(2a) Calculate the azimuth of pixel x:
θ(x) = arctan(Gv(x)/Gh(x))
where Gv(x) and Gh(x) are the vertical and horizontal gradient magnitudes of the image after Prewitt edge detection, Gv = fv * I and Gh = fh * I, in which fv is the Prewitt operator in the vertical direction, fh is the Prewitt operator in the horizontal direction, and '*' denotes convolution;
(2b) Set the azimuth discrimination threshold T, and compare the absolute azimuth difference |θ(x) − θ(xi)| between pixel x and each pixel xi in its surrounding set X against T:
If the absolute azimuth difference of the two pixels is less than the threshold T, the interaction between them is determined to be excitatory and is denoted by '1'; otherwise, the interaction between them is determined to be inhibitory and is denoted by '0';
The interaction types between pixel x and its n surrounding pixels are thus determined by the arrangement of the n '1'/'0' values, giving the spatial structure distribution S(x) of pixel x;
(3) Summarize the spatial structure distribution S(x) of pixel x into one of n orientation-selectivity-based modes P:
(3a) According to the number n of pixels in the surrounding set X, divide the 360-degree circular local region into n equal classes; the angle corresponding to class j is j·(360/n) degrees, j = 0, 1, …, n−1, and these n classes are defined as the n orientation-selectivity-based modes P;
(3b) Take all regions enclosed by excitatory '1' values and choose the largest such region, which must not contain any excitatory '1' outside it; match the angle corresponding to this largest region against the angles of the n modes classified in (3a), and assign each successfully matched S(x) to the corresponding mode P;
(4) From the gradient magnitudes Gh(x) and Gv(x) calculated in step (2a), compute the grey-scale change value of pixel x;
(5) Directly count the spatial structure modes S(x) of the pixels and draw the texture histogram:
(5a) According to the classification result of step (3b), directly count in image I the number H(k) of spatial structure distributions that fall into the k-th of the n orientation-selectivity-based modes P:
H(k) = Σx f(S(x), Pk), where f(S(x), Pk) = 1 if S(x) belongs to mode Pk and 0 otherwise, N denotes the size of the input image, and k ∈ [1, n];
(5b) Plot the percentage of H(k) in the total number of structure distributions as a texture histogram using MATLAB; this texture histogram is the result of the texture feature extraction of the image.
2. The method according to claim 1, wherein matching the angle corresponding to the largest region against the angles of the n classified orientation-selectivity-based modes P in step (3b) comprises: first listing all regions enclosed head to tail by '1' values and finding among them the region of largest area; then comparing the angle corresponding to that region one by one against the angles of the n classified orientation-selectivity-based modes P; where the angles are identical, the match succeeds, i.e. the corresponding spatial structure distribution S(x) belongs to the matched orientation-selectivity mode P.
3. a kind of texture characteristic extracting method based on set direction, comprises the following steps:
1) the pending image that size is N × N is inputtedOrientation choosing principles according to optic nerve simulate any pixel point x,Space structure distribution character:Wherein It is the set for the n pixel chosen from border circular areas around pixel x, xiIth pixel point is represented,Representative includes A kind of arranged mode responded in number,Represent pixel x and its peripheral region pixel setBetween it is mutual Type of action;
2) Determine the space structure distribution of pixel x:
2a) Calculate the azimuth of pixel x:
θ(x) = arctan(G_v(x) / G_h(x))
where G_v(x) and G_h(x) denote, respectively, the vertical-direction and horizontal-direction gradient magnitudes of the image after Prewitt-operator edge detection, obtained by convolving the image with the vertically oriented and horizontally oriented Prewitt operators, and '*' denotes the convolution operation;
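Step 2a) can be sketched in pure NumPy. The 3×3 Prewitt coefficients and the edge-replication padding are assumptions of this sketch (the claim names the operators but their coefficients are not reproduced in this text), and arctan2 is used in place of arctan so the azimuth stays defined where G_h(x) = 0:

```python
import numpy as np

# Assumed Prewitt kernels; the claim does not reproduce the coefficients.
F_V = np.array([[ 1,  1,  1],
                [ 0,  0,  0],
                [-1, -1, -1]], dtype=float)
F_H = F_V.T

def correlate3(img, k):
    """3x3 correlation with edge padding (true convolution would flip the
    kernel; for these antisymmetric kernels that only changes the sign)."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + h, j:j + w]
    return out

def azimuth_map(img):
    """Step 2a: theta(x) = arctan(G_v(x) / G_h(x)) at every pixel."""
    g_v = correlate3(img.astype(float), F_V)
    g_h = correlate3(img.astype(float), F_H)
    return np.arctan2(g_v, g_h), g_h, g_v
```

On a purely horizontal intensity ramp this yields zero vertical gradient everywhere, as expected of the vertically oriented operator.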
2b) Set an azimuth discrimination threshold: compare the absolute azimuth difference |θ(x) − θ(x_i)| between pixel x and each pixel x_i in its surrounding pixel set against the threshold value:
If the absolute azimuth difference of the two pixels is less than the threshold, the interaction between them is determined to be of the excitation type, denoted '1'; otherwise, the interaction between them is determined to be of the inhibition type, denoted '0';
The interaction type between pixel x and its n surrounding pixels is determined by the arrangement of the n '1'/'0' values, which yields the space structure distribution of pixel x;
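Step 2b) can be sketched as follows, given an azimuth map from step 2a). The circle radius, the rounding of sample coordinates, and the default threshold π/n are illustrative assumptions; the claim only requires a fixed threshold on the azimuth difference:

```python
import numpy as np

def binary_pattern(theta, y, x, n=8, radius=1.5, t=None):
    """Step 2b: compare the azimuth of pixel (y, x) with those of n
    neighbours sampled on a surrounding circle.  Returns a list of n
    values: 1 = excitation (|difference| < threshold), 0 = inhibition."""
    if t is None:
        t = np.pi / n                      # assumed threshold
    bits = []
    for i in range(n):
        a = 2.0 * np.pi * i / n
        yi = int(round(y + radius * np.sin(a)))
        xi = int(round(x + radius * np.cos(a)))
        diff = abs(theta[y, x] - theta[yi, xi])
        bits.append(1 if diff < t else 0)
    return bits
```

The resulting n-bit arrangement is the pixel's space structure distribution.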
3) Summarize the space structure distribution of pixel x into one of n direction-selective patterns:
3a) According to the number n of pixels in the surrounding pixel set, divide the 360-degree circular local area into n classes of equal angular size, with one angle corresponding to each class j, j = 0, 1, ..., n−1; these n classes are defined as the n direction-selective patterns;
3b) Take all regions enclosed by excitation values '1' and choose the largest among them; the largest region must contain no excitation '1' outside it. Match the angle corresponding to this largest region against the angles of the n direction-selective patterns classified in 3a); a successfully matched space structure distribution is assigned to the corresponding direction-selective pattern;
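Step 3b) can be sketched as finding the longest circular run of '1's in the n-bit pattern. Mapping the run's centre position to the nearest class index, and assigning class 0 to an all-excitation pattern, are assumptions of this sketch; the claim only constrains the matching to compare angles:

```python
def classify_pattern(bits):
    """Step 3b: locate the largest circular region enclosed by '1's and
    return the index k of the matching direction-selective pattern, or
    None if there is no excitation at all."""
    n = len(bits)
    total = sum(bits)
    if total == 0:
        return None            # no excitation anywhere
    if total == n:
        return 0               # fully excited: assign class 0 (assumption)
    best_len, best_start = 0, 0
    for s in range(n):
        # a maximal run starts where a '1' follows a '0' (circularly)
        if bits[s] == 1 and bits[(s - 1) % n] == 0:
            length = 0
            while bits[(s + length) % n] == 1:
                length += 1
            if length > best_len:
                best_len, best_start = length, s
    # snap the run's centre to the nearest of the n class angles
    centre = (best_start + (best_len - 1) / 2.0) % n
    return int(round(centre)) % n
```

Each pixel's space structure distribution thus collapses to a single pattern index in [0, n−1], or None.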
4) Calculate the grey-scale change value of pixel x from the gradient magnitudes G_h(x) and G_v(x) computed in step 2a):
5) Combine the spatial structure pattern of pixel x with its grey-scale change value and plot the weighted texture histogram:
First, with the grey-scale change value incorporated, count over the image the number H_w(k) of space structure distributions matching the k-th of the n classified direction-selective patterns:
where w(x) is the weight set according to the grey-scale change value of pixel x, taken directly to simplify calculation; N represents the size of the input image and k ∈ [1, n];
Then, with MATLAB tools, plot the percentage that each H_w(k) occupies of the total number of space structure distributions as the weighted texture histogram; the weighted texture histogram finally obtained is the result of the texture feature extraction of the image.
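Steps 4)-5) can be sketched as a weight accumulation per pattern bin. Passing in precomputed per-pixel weights is an assumption of this sketch: the claim's exact w(x) formula is not reproduced in this text, so any grey-scale change value (e.g. a gradient magnitude derived from G_h and G_v) would be supplied by the caller:

```python
import numpy as np

def weighted_texture_histogram(patterns, weights, n):
    """Step 5: accumulate the per-pixel weight w(x) into the bin of the
    pixel's direction-selective pattern, then normalise to percentages.
    `patterns` holds one class index (or None) per pixel; `weights`
    holds the corresponding grey-scale change values w(x)."""
    h_w = np.zeros(n)
    for k, w in zip(patterns, weights):
        if k is not None:       # skip pixels that matched no pattern
            h_w[k] += w
    total = h_w.sum()
    return 100.0 * h_w / total if total > 0 else h_w
```

With all weights equal this reduces to the unweighted histogram of claim 1; unequal weights let strong grey-scale changes dominate the descriptor, which is the stated purpose of step 5).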
CN201510155433.9A 2015-04-02 2015-04-02 The method of gray level image texture feature extraction based on orientation selectivity Active CN104732238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510155433.9A CN104732238B (en) 2015-04-02 2015-04-02 The method of gray level image texture feature extraction based on orientation selectivity

Publications (2)

Publication Number Publication Date
CN104732238A CN104732238A (en) 2015-06-24
CN104732238B true CN104732238B (en) 2018-03-06

Family

ID=53456112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510155433.9A Active CN104732238B (en) 2015-04-02 2015-04-02 The method of gray level image texture feature extraction based on orientation selectivity

Country Status (1)

Country Link
CN (1) CN104732238B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107862709B (en) * 2017-09-28 2020-03-27 北京华航无线电测量研究所 Image texture description method of multidirectional mode connection rule
CN109272017B (en) * 2018-08-08 2022-07-12 太原理工大学 Vibration signal mode identification method and system of distributed optical fiber sensor
CN115330774B (en) * 2022-10-12 2023-05-19 南通林多光学仪器有限公司 Welding image molten pool edge detection method

Citations (1)

Publication number Priority date Publication date Assignee Title
CN102831427A (en) * 2012-09-06 2012-12-19 湖南致尚科技有限公司 Texture feature extraction method fused with visual significance and gray level co-occurrence matrix (GLCM)

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US8406483B2 (en) * 2009-06-26 2013-03-26 Microsoft Corporation Boosted face verification


Non-Patent Citations (2)

Title
Inhibition, spike threshold, and stimulus selectivity in primary visual cortex; Priebe N J et al.; ScienceDirect; 2008-02-28; Vol. 57, No. 4; full text *
Research on Forest Texture Feature Extraction Methods for Color Aerial Images; Bi Yuhui; China Doctoral Dissertations Full-text Database, Agricultural Science and Technology; 2007-08-15 (No. 2); full text *

Also Published As

Publication number Publication date
CN104732238A (en) 2015-06-24


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant