CN103700119A - Local texture description method based on local grouping comparison pattern histogram - Google Patents

Local texture description method based on local grouping comparison pattern histogram

Info

Publication number
CN103700119A
Authority
CN
China
Prior art keywords
local
pixel
grouping comparison
comparison pattern
circle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310612650.7A
Other languages
Chinese (zh)
Inventor
董效杰
杨杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN201310612650.7A priority Critical patent/CN103700119A/en
Publication of CN103700119A publication Critical patent/CN103700119A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a local texture description method based on a local grouping comparison pattern histogram. The method comprises the following steps: step 1, selecting q support regions based on a region of interest; step 2, normalizing each support region into a circular region; step 3, partitioning the circular image into p parts using a clustering strategy based on mean-value sorting; step 4, computing the local grouping comparison pattern of each pixel coordinate point in the circular image; step 5, accumulating a local grouping comparison pattern histogram over the local feature region according to the index of the local grouping comparison pattern, thereby forming the local texture description of a single support region; step 6, concatenating the local texture descriptions of the support regions to obtain the local grouping comparison pattern histogram of the feature region, which forms the local texture descriptor. In effect, the method accumulates a histogram of the gray-level variation of pixel neighborhoods that share the same gray-value order relationship. The descriptor formed in this way is highly discriminative and has excellent robustness to illumination changes and geometric transformations.

Description

Local texture description method based on local grouping comparison pattern histogram
Technical field
The present invention relates to a method for describing local image texture features in the field of computer vision, and more particularly to a local texture description method based on a local grouping comparison pattern histogram.
Background technology
Because image local features are highly discriminative, highly repeatable, robust, and stable under geometric and illumination changes, they are widely used in fields such as image and video retrieval, image registration, target tracking, target recognition, target classification, texture classification, robot localization, and wide-baseline matching.
Research on image local features covers three aspects: feature extraction, feature description, and feature matching. Research on local feature extraction is relatively mature. The current focus is on local feature description; every year, relevant high-quality papers are published at the top conferences in the vision field, ICCV (IEEE International Conference on Computer Vision), CVPR (IEEE Conference on Computer Vision and Pattern Recognition), and ECCV (European Conference on Computer Vision). Local texture description has received more and more attention from researchers and has been successfully applied in computer vision, pattern recognition, and related fields.
The core issues of a local image feature descriptor are its invariance and its discriminability, yet the discriminative power of a descriptor often conflicts with its invariance. An outstanding descriptor should nevertheless possess both strong invariance and strong discriminability.
Among the many local image feature descriptors, SIFT (Scale Invariant Feature Transform) is the most widely used, and can be regarded as a landmark work in the field of local feature description. Because SIFT has good invariance to image transformations such as scale change, rotation, viewpoint change, and illumination change, together with strong discriminability, it has been widely applied to object recognition, wide-baseline image matching, three-dimensional reconstruction, image retrieval, and other fields soon after it was proposed. Local image feature descriptors have since attracted broad attention in the computer vision community.
Local image feature descriptors can be roughly divided into the following classes:
1. Filter-based descriptors, with representatives such as steerable filters, Gabor filters, and complex filters;
2. Distribution-based descriptors, with representatives such as SIFT, SURF, shape context, DAISY, PCA-SIFT, spin image, RIFT, and GLOH;
3. Gradient-based descriptors, with representatives such as local gray-value invariants;
4. Others: moment-based descriptors, phase-based local features, and color-based descriptors.
A search of the prior art found the Chinese invention patent application "Image local feature description method based on pixel location arrangement histograms" (publication No. CN103295014A), a recently disclosed local image description method. That method statistically accumulates the positional arrangement of a pixel's neighbors weighted by the gray-level variation; the positional arrangement is obtained by choosing fixed neighbor coordinates and sorting their gray levels. The fixed neighbor coordinates guarantee that the accumulated positional arrangements are consistent, and each positional arrangement is then computed by sorting the gray-level intensities. Although that method achieves good discriminability and robustness, it has the following problems: the number of full permutations of the neighbor positions depends heavily on the number of neighbors chosen, being the factorial of that number. If many neighbors are chosen, the number of permutations becomes very large; for example, 5 neighbors give 120 permutations and 6 neighbors give 720. Although quantization alleviates the histogram sparsity that a large number of permutations may cause, the computation is heavy, and quantization further increases it. In contrast, the descriptor proposed by the present invention only needs to compare the gray-level order relationships of the elements within each group, omits the sorting step used to compute positional arrangements, and requires no further quantization.
Summary of the invention
In view of the defects of the prior art, the object of the present invention is to provide a local texture description method based on a local grouping comparison pattern histogram; the local image descriptor constructed from this texture feature has strong discriminability and robustness while greatly reducing the amount of computation.
To address the insufficient discriminability of existing descriptors, the present invention proposes an image local texture description method based on local grouping comparison patterns. The local texture descriptor obtained by this method is highly discriminative and robust; in normal application environments its recognition rate is higher than that of the classical SIFT descriptor and the recently proposed MROGH and PPH descriptors.
It is assumed that the local features of the image, i.e., the regions of interest, are extracted by the Hessian-affine detector. The present invention is realized through the following technical solution:
Step 1, selecting q support regions based on the region of interest;
Different values of q affect the result: the larger the value, the more discriminative the local image feature descriptor, but the dimension of the descriptor increases correspondingly;
Step 2, normalizing each support region into a circular region;
Step 3, partitioning the circular image into p parts using a clustering strategy based on mean-value sorting;
Different values of p affect the result: the larger the value, the more discriminative the local image feature descriptor, but the dimension of the descriptor increases correspondingly; if p is too large, the histogram becomes sparse, which instead reduces the robustness of the descriptor;
Step 4, computing the local grouping comparison pattern of each pixel coordinate point in the circular image;
Step 5, accumulating the local grouping comparison pattern histogram of the local feature region according to the index of the local grouping comparison pattern, forming the local texture description of a single support region;
Step 6, concatenating the local texture descriptions of the multiple support regions into the local grouping comparison pattern histogram of the multi-support-region feature area, forming the local texture descriptor.
Preferably, in step 1 the region of interest is scaled up proportionally so that the lengths of the axes of the local image region increase, specifically as follows:
Suppose the parameter matrix of the region of interest is $A \in R^{2 \times 2}$. The elliptic parameter matrices $A_i$ of the selected multiple support regions are defined by
$A_i = \frac{1}{r_i^2} A, \quad i = 1, 2, \cdots, q$
$r_i = 1 + 0.5 \times (i - 1)$
where A is the elliptic parameter matrix of the region of interest and $A_i$ is the elliptic parameter matrix of the i-th selected support region. From the above formulas, $A_1$ is the original region of interest, q is the number of selected support regions, and q takes the value 4.
Preferably, in step 2 the support region is normalized into a circular region of radius r, specifically as follows:
$X = \frac{1}{r} A_i^{-\frac{1}{2}} X' = T^{-1} X'$
where X' is a pixel coordinate in the circular image, r is the radius of the circular image and takes the value 20.5, and X is the corresponding pixel coordinate in the elliptic region of interest. Because the coordinates in the elliptic region mapped from the pixel coordinates of the circular image are generally not integers, i.e., they do not fall exactly on pixels, an interpolation technique is needed to compute the gray-level intensity at each pixel coordinate of the circular image; the present invention preferably adopts bilinear interpolation.
Preferably, step 3 is specifically implemented as follows:
3.1, smoothing the circular image with a mean filter;
3.2, regarding all pixels in the circular image as one set and sorting the gray-level intensities of the pixels in the set in non-decreasing order;
3.3, dividing the sorted gray-level sequence into p equal parts and forming the p corresponding subsets from the corresponding pixel coordinates, p taking the value 6.
Preferably, step 4 is specifically implemented as follows:
4.1, for each pixel coordinate point in the circular image, choosing 3d (d ≥ 2) pixels at equal intervals on a circle of fixed radius centered at that point;
4.2, dividing the 3d chosen pixels into d groups, each group containing three pixels, the pixels within a group being evenly distributed on the circle;
4.3, comparing the gray-level intensities of the pixels within each group and encoding the combined comparison results of all groups as the local grouping comparison code;
4.4, computing the gray-level standard deviation of the 3d chosen pixels and combining it with the local grouping comparison code to form the local grouping comparison pattern describing this pixel.
For the selection of the 3d (d ≥ 2) pixels, different starting points yield somewhat different positions and gray values of the selected pixel coordinate points. One may take, in the local Cartesian coordinate system whose origin is the circle center, the intersection of the circle with the positive x-axis as the starting point and choose 3d pixels at equal intervals on the circle; one may also take the intersection of the circle with the positive y-axis as the starting point; or one may take the intersection of the circle with the positive direction of the line connecting the center of the circular image and the pixel coordinate point as the starting point. Clearly, many concrete selection schemes are possible. In the present invention, the starting point is the intersection of the circle with the positive direction of the line connecting the center of the circular image and the pixel coordinate point, and 3d pixels are chosen at equal intervals on the circle.
Preferably, step 5 is specifically implemented as follows:
5.1, for each subset, separately accumulating the local grouping comparison patterns that share the same local grouping comparison code;
5.2, to eliminate the influence of linear illumination, normalizing the local grouping comparison pattern histogram generated for each subset;
5.3, concatenating the local grouping comparison pattern histograms of the p subsets to form the local grouping comparison pattern histogram of the circular image, which constitutes the local texture description of a single support region.
Preferably, step 6 is specifically implemented as follows:
Step 6.1, computing the feature vector descriptors of all support regions according to steps 2 to 5;
Step 6.2, concatenating the computed feature vector descriptors of the multiple local images to form the local image feature descriptor HLGCP proposed by the present invention:
$D = (D(S_1), D(S_2), \ldots, D(S_q))$
where $D(S_i)$, $i = 1, 2, \ldots, q$, is the feature vector descriptor of the i-th support region, and q takes the value 4.
Compared with the prior art, the present invention has the following beneficial effects:
The descriptor proposed by the present invention only needs to compare the gray-level order relationships of the elements within each group, omits the sorting step used to compute positional arrangements, and needs no further quantization. This reduces the complexity of computing the descriptor and saves processing time. What the present invention actually accumulates is the gray-level variation of pixel neighborhoods that share the same gray-value order relationship, used as the descriptive feature of the pixel. The descriptor formed in this way is highly discriminative and robust to illumination and geometric transformations; in comparison with existing descriptors, it performs best. Relative to other descriptors under the same conditions, the proposed descriptor obtains more corresponding feature points, thereby improving the performance indices of applications based on corresponding feature points, such as the matching precision of image matching.
Accompanying drawing explanation
Other features, objects, and advantages of the present invention will become more apparent by reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings:
Fig. 1 is the basic flow chart of the image local feature description method;
Fig. 2 shows the 3 × 3 mean filter mask;
Fig. 3 is a schematic diagram of neighbor selection for a pixel coordinate point $X_i$;
Fig. 4 shows the performance evaluation curves of descriptors under geometric transformations;
Fig. 5 shows the performance evaluation curves of descriptors under illumination changes.
Embodiment
The present invention is described in detail below in conjunction with specific embodiments. The following embodiments will help those skilled in the art to further understand the present invention, but do not limit the present invention in any form. It should be pointed out that those skilled in the art can make several variations and improvements without departing from the concept of the present invention, all of which belong to the protection scope of the present invention.
The detection and the description of image local feature regions are two independent parts; different local feature detectors can be chosen, and the features they detect can be used to assess the performance of the proposed descriptor. No matter which detector is adopted, the final assessment of descriptor performance is consistent, i.e., the ranking of the descriptor performance curves does not change with the detection method. The present invention adopts the Hessian-affine detector, which is also one of the detectors most frequently used when evaluating descriptors. The Hessian-affine detector finds blob structures in the image; such structures rarely appear at positions where the depth changes, so the regions detected by Hessian-affine satisfy the local-planarity and local-smoothness assumptions well. Although the Hessian-affine detector shows very good robustness under illumination and geometric transformations, some of its feature points may appear at depth discontinuities.
Fig. 1 shows the basic flow chart of the embodiment of the present invention; the steps are:
1. Scale up the region of interest proportionally so that the lengths of the axes of the local image region increase, described as follows:
Suppose the parameter matrix of the region of interest is $A \in R^{2 \times 2}$. The elliptic parameter matrices $A_i$ of the selected multiple support regions are defined by
$A_i = \frac{1}{r_i^2} A, \quad i = 1, 2, \cdots, q$
$r_i = 1 + 0.5 \times (i - 1)$
where A is the elliptic parameter matrix of the region of interest and $A_i$ is the elliptic parameter matrix of the i-th selected support region. From the above formulas, $A_1$ is the original region of interest, q is the number of selected support regions, and q takes the value 4.
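For illustration, the following Python sketch computes the q scaled elliptic parameter matrices $A_i$ from a given region-of-interest matrix A according to the two formulas above; the helper name and the example matrix are our own and not taken from the patent.

```python
import numpy as np

def support_region_matrices(A, q=4):
    """Scale the elliptic parameter matrix A of the region of interest into
    q support-region matrices A_i = A / r_i^2 with r_i = 1 + 0.5*(i-1)."""
    matrices = []
    for i in range(1, q + 1):
        r_i = 1.0 + 0.5 * (i - 1)
        matrices.append(A / (r_i ** 2))   # A_1 equals A, the original region of interest
    return matrices

# Example: an arbitrary symmetric positive-definite parameter matrix
A = np.array([[0.02, 0.005],
              [0.005, 0.03]])
for i, Ai in enumerate(support_region_matrices(A), start=1):
    print(f"A_{i} =\n{Ai}")
```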
2. Normalize each support region into a circular region of radius r, described as follows:
$X = \frac{1}{r} A_i^{-\frac{1}{2}} X' = T^{-1} X'$
where X' is a pixel coordinate in the circular image, r is the radius of the circular image and takes the value 20.5, and X is the corresponding pixel coordinate in the elliptic region of interest. Because the coordinates in the elliptic region mapped from the pixel coordinates of the circular image are generally not integers, i.e., they do not fall exactly on pixels, an interpolation technique is needed to compute the gray-level intensity at each pixel coordinate of the circular image; this embodiment adopts bilinear interpolation.
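This normalization can be sketched as follows; the function name, the coordinate conventions, and the example inputs are illustrative assumptions, with `scipy.ndimage.map_coordinates(order=1)` standing in for the bilinear interpolation mentioned in the text.

```python
import numpy as np
from scipy.ndimage import map_coordinates  # order=1 gives bilinear interpolation

def normalize_to_circle(image, A_i, center_rc, r=20.5):
    """Resample the elliptic support region described by the 2x2 matrix A_i
    (centered at center_rc = (row, col)) into a circular patch of radius r,
    using X = (1/r) * A_i^(-1/2) * X' and bilinear sampling."""
    # A_i is symmetric positive definite, so its inverse square root follows
    # from its eigendecomposition.
    w, V = np.linalg.eigh(A_i)
    T_inv = (V @ np.diag(1.0 / np.sqrt(w)) @ V.T) / r      # maps X' -> X

    size = int(2 * np.ceil(r)) + 1
    grid = np.arange(size) - np.ceil(r)
    yy, xx = np.meshgrid(grid, grid, indexing='ij')        # circle-centred X'
    Xp = np.stack([xx.ravel(), yy.ravel()])                # 2 x M, (x, y) order
    X = T_inv @ Xp                                         # ellipse coordinates
    rows = X[1] + center_rc[0]
    cols = X[0] + center_rc[1]
    # The mapped coordinates are generally non-integer, hence bilinear sampling.
    patch = map_coordinates(image.astype(float), [rows, cols],
                            order=1, mode='nearest').reshape(size, size)
    mask = (xx ** 2 + yy ** 2) <= r ** 2                   # pixels inside the circle
    return patch, mask

# Example with a random image and an illustrative region of interest
img = np.random.rand(200, 200)
A = np.array([[0.002, 0.0], [0.0, 0.003]])
patch, mask = normalize_to_circle(img, A, center_rc=(100, 100), r=20.5)
print(patch.shape, mask.sum())
```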
3. Filter the circular image with a mean filter. This embodiment adopts the 3 × 3 mean filter mask shown in Fig. 2.
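A minimal sketch of this smoothing step; the patch variable is a placeholder, and `uniform_filter` with size 3 is equivalent to convolving with a mask whose nine weights are all 1/9.

```python
import numpy as np
from scipy.ndimage import uniform_filter

circular_patch = np.random.rand(43, 43)   # placeholder for the normalized circular patch
smoothed = uniform_filter(circular_patch, size=3, mode='nearest')  # 3 x 3 mean filter
```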
4. Regard all pixels of the filtered circular image as one set, sort the gray-level intensities of the elements of the set in non-decreasing order, divide the sorted gray-level sequence into p parts, and form the p corresponding subsets from the corresponding pixel coordinates. Specifically:
Let $R = \{X_1, X_2, \ldots, X_N\}$ denote the set of pixel coordinates contained in the circular image and $I(X_i)$ denote the gray-level intensity of pixel $X_i$, $i = 1, 2, \ldots, N$. Sorting by gray-level intensity in non-decreasing order gives
$\{X_{f(1)}, X_{f(2)}, \ldots, X_{f(N)} : I(X_{f(1)}) \le I(X_{f(2)}) \le \cdots \le I(X_{f(N)})\}$
where the subscripts $f(1), f(2), \ldots, f(N)$ are the result of ranking the pixel indices $1, 2, \ldots, N$ of the coordinate set by non-decreasing gray value. The sorted pixel coordinates are then divided into p subsets; p = 6 in this embodiment. The mathematical expression of the i-th subset $R_i$ is
$R_i = \{X_j \in R : t_{i-1} \le I(X_j) \le t_i\}, \quad i = 1, 2, \ldots, p$
where
$t_i = I(X_{f(s_i)}) : t_0 \le t_1 \le \cdots \le t_p$
and $s_i$ is the rank value of the i-th division point when the sequence $\{1, 2, \ldots, N\}$ is divided into p equal parts.
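The partition of this step can be sketched as follows; the function name is our own, and splitting the sorted coordinates into equal-size groups with `numpy.array_split` is our reading of "divided into p equal parts".

```python
import numpy as np

def partition_by_intensity(patch, mask, p=6):
    """Split the in-circle pixels into p subsets R_1..R_p of (nearly) equal
    size by sorting their gray levels in non-decreasing order."""
    coords = np.argwhere(mask)                       # pixel coordinates inside the circle
    intensities = patch[mask]
    order = np.argsort(intensities, kind='stable')   # non-decreasing gray order f(1)..f(N)
    subsets = np.array_split(coords[order], p)       # p groups of sorted coordinates
    return subsets

# Example with a random patch and a full mask
patch = np.random.rand(43, 43)
mask = np.ones_like(patch, dtype=bool)
subsets = partition_by_intensity(patch, mask, p=6)
print([len(s) for s in subsets])
```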
5. Take each pixel coordinate point in the circular image as a center and draw a circle of radius r; r = 3 in this embodiment. Take the intersection of this circle with the positive direction of the line connecting the center of the circular image and the pixel coordinate point as the starting point, and choose 3d pixels at equal intervals on the circle, denoted $\{X_i^1, X_i^2, \ldots, X_i^{3d}\}$.
In this embodiment d = 2, i.e., the 6 neighbors of a pixel coordinate point $X_i$ are selected as shown in Fig. 3, where O is the center coordinate of the circular image and $X_i$ is an arbitrary pixel in the circular image. Connect $\overrightarrow{OX_i}$; starting from its intersection with the circle, choose 6 pixels at equal intervals on the circle as the neighbors of $X_i$, denoted $\{X_i^1, X_i^2, \ldots, X_i^6\}$.
6. The neighbor coordinate set of the pixel coordinate point $X_i$ is $\{X_i^1, X_i^2, \ldots, X_i^{3d}\}$ and the corresponding gray-level intensity set is $\{I(X_i^1), I(X_i^2), \ldots, I(X_i^{3d})\}$. Group the coordinate set according to
$G_g = \{X_i^{g+1}, X_i^{g+1+d}, X_i^{g+1+2d}\}, \quad g = 0, 1, \cdots, d-1$
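The neighbor sampling and grouping of steps 5 and 6 can be sketched as follows; the function name, the (x, y) coordinate convention, and the example call are illustrative assumptions.

```python
import numpy as np

def neighbor_groups(center_xy, point_xy, d=2, radius=3.0):
    """Sample 3d points at equal angular intervals on a circle of `radius`
    around pixel X_i, starting from the intersection of the ray O -> X_i with
    that circle, and split them into d groups G_0..G_{d-1}."""
    O = np.asarray(center_xy, dtype=float)
    Xi = np.asarray(point_xy, dtype=float)
    start = np.arctan2(Xi[1] - O[1], Xi[0] - O[0])   # direction of the line O X_i
    n = 3 * d
    angles = start + 2.0 * np.pi * np.arange(n) / n
    pts = np.stack([Xi[0] + radius * np.cos(angles),
                    Xi[1] + radius * np.sin(angles)], axis=1)   # X_i^1 .. X_i^{3d}
    # G_g = {X_i^{g+1}, X_i^{g+1+d}, X_i^{g+1+2d}}, g = 0..d-1 (0-based indices here)
    groups = [pts[[g, g + d, g + 2 * d]] for g in range(d)]
    return pts, groups

pts, groups = neighbor_groups(center_xy=(20.5, 20.5), point_xy=(25.0, 18.0), d=2, radius=3.0)
print(pts.round(2))
```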
7. Compute the local grouping comparison code of pixel $X_i$:
$\mathrm{LGC}(X_i) = \sum_{g=0}^{d-1} 3^{g} \times f\big(I(X_i^{g+1}), I(X_i^{g+1+d}), I(X_i^{g+1+2d})\big)$
where
$f(x, y, z) = \begin{cases} 0, & x \ge y, x \ge z \\ 1, & y \ge x, y \ge z \\ 2, & z \ge x, z \ge y \end{cases}$
$I(X_i^j)$ denotes the gray value of the corresponding neighbor pixel, and d is the number of groups into which the neighbor coordinate set is divided; d takes the value 2.
8. Compute the standard deviation of the gray values of all neighbor points:
$\sigma = \sqrt{\frac{1}{3d-1} \sum_{j=1}^{3d} \big(I(X_i^j) - E(X_i)\big)^2}$
where
$E(X_i) = \frac{1}{3d} \sum_{j=1}^{3d} I(X_i^j)$
9. The local feature of pixel $X_i$, i.e., its local grouping comparison pattern, is
$\mathrm{LGCP}_{\mathrm{LGC}(X_i)} = \sigma$
where $\sigma$ is the standard deviation, measuring the gray-level variation, and $\mathrm{LGC}(X_i)$ is the local grouping comparison code, describing the order relationships of the gray values of the pixels at the neighbor coordinates. Fig. 3 shows an example of computing the local grouping comparison pattern of a pixel coordinate point $X_i$ with 6 neighbors, as used in this embodiment.
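Steps 7 to 9 can be condensed into the following sketch; the function names are our own, and `numpy.std(..., ddof=1)` reproduces the 1/(3d−1) normalization of the standard deviation above.

```python
import numpy as np

def f(x, y, z):
    """Return 0, 1 or 2 depending on which of the three gray values is largest,
    with ties resolved in the order x, then y, then z (as in the formula above)."""
    if x >= y and x >= z:
        return 0
    if y >= x and y >= z:
        return 1
    return 2

def local_grouping_comparison_pattern(gray, d=2):
    """Compute LGC(X_i) and the gray standard deviation sigma for the 3d
    neighbor gray values `gray`, where gray[j] corresponds to X_i^{j+1}."""
    lgc = 0
    for g in range(d):
        lgc += (3 ** g) * f(gray[g], gray[g + d], gray[g + 2 * d])
    sigma = np.std(gray, ddof=1)      # 1/(3d-1) normalization, as in the embodiment
    return lgc, sigma                 # the pair (LGC code, sigma) is the LGCP of X_i

# Example: six neighbor gray values (d = 2), so the code lies in 0..8
lgc, sigma = local_grouping_comparison_pattern(np.array([10., 40., 25., 30., 5., 60.]), d=2)
print(lgc, sigma)
```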
10. According to the index of the local grouping comparison pattern, accumulate the local grouping comparison patterns of all elements of each subset obtained in step 4 of this embodiment, forming the $3^d$-dimensional feature vector of the corresponding subset. The feature vector of subset $R_i$ is
$H(R_i) = (H(0), H(1), \ldots, H(3^d - 1))$
where
$H(\mathrm{LGC}(X_i)) = \sum_{X_i \in R_i} \mathrm{LGCP}_{\mathrm{LGC}(X_i)}, \quad \mathrm{LGC}(X_i) = 0, 1, \ldots, 3^d - 1$
Here $\mathrm{LGC}(X_i)$ is the local grouping comparison code of point $X_i$ and $\mathrm{LGCP}_{\mathrm{LGC}(X_i)}$ is the local grouping comparison pattern of point $X_i$.
Since d = 2 in this embodiment, the feature vector of each subset has 9 dimensions.
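The sigma-weighted accumulation of this step can be sketched as follows; the helper name and the example codes and sigmas are illustrative.

```python
import numpy as np

def subset_histogram(codes, sigmas, d=2):
    """Accumulate the 3^d-dimensional feature vector H(R_i) of one subset:
    bin k collects the sum of sigma over the pixels whose LGC code equals k."""
    hist = np.zeros(3 ** d)
    for code, sigma in zip(codes, sigmas):
        hist[code] += sigma
    return hist

# Example: LGC codes and sigmas of the pixels of one subset (d = 2 -> 9 bins)
codes = [0, 3, 3, 8, 1]
sigmas = [2.1, 0.7, 1.4, 3.0, 0.2]
print(subset_histogram(codes, sigmas))
```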
11. Normalize the feature vector of each subset.
12. Concatenate the feature vectors of the p subsets to form the $3^d \times p$ dimensional feature vector descriptor D(S):
$D(S) = (H(R_1), H(R_2), \ldots, H(R_p))$
Since d = 2 and p = 6 in this embodiment, the feature vector of the circular region has 54 dimensions.
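A sketch of steps 11 and 12 follows; note that the text does not fix the normalization scheme, so L2 normalization is assumed here purely for illustration.

```python
import numpy as np

def circular_region_descriptor(subset_histograms):
    """Normalize each subset histogram H(R_i) (L2 norm assumed) and concatenate
    the p of them into the 3^d * p dimensional descriptor D(S) of one support region."""
    normalized = []
    for h in subset_histograms:
        norm = np.linalg.norm(h)
        normalized.append(h / norm if norm > 0 else h)
    return np.concatenate(normalized)

# p = 6 subsets, each with a 9-bin histogram (d = 2) -> 54-dimensional D(S)
hists = [np.random.rand(9) for _ in range(6)]
print(circular_region_descriptor(hists).shape)   # (54,)
```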
13. Repeat steps 2 to 12 to compute the description vectors of all support regions, and finally concatenate the description vectors of the support regions to form the final local image descriptor HLGCP (Histogram of the Local Group Comparison Pattern):
$D = (D(S_1), D(S_2), \ldots, D(S_q))$
where $D(S_i)$, $i = 1, 2, \ldots, q$, is the feature vector descriptor of the i-th support region, and q takes the value 4.
With the parameter values of this embodiment, the dimension of the HLGCP descriptor is $3^d \times p \times q = 9 \times 6 \times 4 = 216$.
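The final cascade is a plain concatenation; the placeholders below only illustrate the resulting 216-dimensional shape.

```python
import numpy as np

# Concatenating the q = 4 support-region descriptors D(S_1)..D(S_q), each of
# dimension 3^d * p = 9 * 6 = 54, yields the 216-dimensional HLGCP descriptor.
support_descriptors = [np.random.rand(54) for _ in range(4)]   # placeholders for D(S_i)
hlgcp = np.concatenate(support_descriptors)
print(hlgcp.shape)   # (216,)
```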
14. Based on PR curves, HLGCP is compared with PPH, MROGH, GLOH, SIFT, PCA-SIFT, and spin image (denoted SPIN in the test curves) to verify the performance of the descriptor HLGCP formed by the method of the invention. Fig. 4 shows two groups of test images and the corresponding test results: the left group is an image pair with a viewpoint change, and the right group exhibits an obvious rotation. Fig. 5 shows a group of test images with illumination changes and the corresponding results. The test results present the performance comparison curves of the descriptors in terms of the relation between precision and recall, computed as follows:
$\mathrm{recall} = \frac{\#\,\text{correct matches}}{\#\,\text{correspondences}}$
$1 - \mathrm{precision} = \frac{\#\,\text{false matches}}{\#\,\text{correct matches} + \#\,\text{false matches}}$
As can be seen from the test results, the precision and recall of the HLGCP descriptor built by the method of the invention are higher than those of the other descriptors in the comparison; HLGCP therefore has better discriminability and robustness.
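A worked example of the two evaluation formulas with illustrative match counts; the numbers are not from the patent's experiments.

```python
# Recall and 1 - precision from the match counts used in the evaluation,
# following the formulas above (illustrative numbers only).
correct_matches = 180
false_matches = 45
correspondences = 300    # ground-truth correspondences between the image pair

recall = correct_matches / correspondences
one_minus_precision = false_matches / (correct_matches + false_matches)
print(f"recall = {recall:.3f}, 1 - precision = {one_minus_precision:.3f}")
```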
Specific embodiments of the invention have been described above. It should be understood that the invention is not limited to the above specific embodiments; those skilled in the art can make various variations or modifications within the scope of the claims, and this does not affect the substance of the invention.

Claims (10)

1. A local texture description method based on a local grouping comparison pattern histogram, characterized in that the method describes a local image by computing the local grouping comparison pattern histogram of all pixels of the normalized local image, and specifically comprises the following steps:
Step 1, selecting q support regions based on the region of interest;
Step 2, normalizing each support region into a circular region;
Step 3, partitioning the circular image into p parts using a clustering strategy based on mean-value sorting;
Step 4, computing the local grouping comparison pattern of each pixel coordinate point in the circular image;
Step 5, accumulating the local grouping comparison pattern histogram of the local feature region according to the index of the local grouping comparison pattern, forming the local texture description of a single support region;
Step 6, concatenating the local texture descriptions of the multiple support regions into the local grouping comparison pattern histogram of the multi-support-region feature area, forming the local texture descriptor.
2. The local texture description method based on a local grouping comparison pattern histogram according to claim 1, characterized in that in step 1 the region of interest is scaled up proportionally so that the lengths of the axes of the local image region increase, specifically:
Suppose the parameter matrix of the region of interest is $A \in R^{2 \times 2}$; the elliptic parameter matrices $A_i$ of the selected multiple support regions are defined by
$A_i = \frac{1}{r_i^2} A, \quad i = 1, 2, \cdots, q$
$r_i = 1 + 0.5 \times (i - 1)$
where A is the elliptic parameter matrix of the region of interest and $A_i$ is the elliptic parameter matrix of the i-th selected support region; from the above formulas, $A_1$ is the original region of interest, q is the number of selected support regions, and q takes the value 4.
3. The local texture description method based on a local grouping comparison pattern histogram according to claim 1, characterized in that in step 2 the support region is normalized into a circular region of radius r, specifically as follows:
$X = \frac{1}{r} A_i^{-\frac{1}{2}} X' = T^{-1} X'$
where X' is a pixel coordinate in the circular image, r is the radius of the circular image and takes the value 20.5, X is the corresponding pixel coordinate in the elliptic region of interest, and an interpolation technique is used to compute the gray-level intensity at the coordinate point X.
4. The local texture description method based on a local grouping comparison pattern histogram according to claim 1, characterized in that step 3 is specifically implemented as follows:
3.1, smoothing the circular image with a mean filter;
3.2, regarding all pixels in the circular image as one set and sorting the gray-level intensities of the pixels in the set in non-decreasing order;
3.3, dividing the sorted gray-level sequence into p equal parts and forming the p corresponding subsets from the corresponding pixel coordinates, p taking the value 6.
5. The local texture description method based on a local grouping comparison pattern histogram according to claim 1, characterized in that step 4 is specifically implemented as follows:
4.1, for each pixel coordinate point in the circular image, choosing 3d (d ≥ 2) pixels at equal intervals on a circle of fixed radius centered at that point;
4.2, dividing the 3d chosen pixels into d groups, each group containing three pixels, the pixels within a group being evenly distributed on the circle;
4.3, comparing the gray-level intensities of the pixels within each group and encoding the combined comparison results of all groups as the local grouping comparison code;
4.4, computing the gray-level standard deviation of the 3d chosen pixels and combining it with the local grouping comparison code to form the local grouping comparison pattern describing this pixel.
6. The local texture description method based on a local grouping comparison pattern histogram according to claim 5, characterized in that the 3d (d ≥ 2) pixels are chosen by one of the following schemes:
taking, in the local Cartesian coordinate system whose origin is the circle center, the intersection of the circle with the positive x-axis as the starting point and choosing 3d pixels at equal intervals on the circle;
taking, in the local Cartesian coordinate system whose origin is the circle center, the intersection of the circle with the positive y-axis as the starting point and choosing 3d pixels at equal intervals on the circle;
taking the intersection of the circle with the positive direction of the line connecting the center of the circular image and the pixel coordinate point as the starting point and choosing 3d pixels at equal intervals on the circle.
7. The local texture description method based on a local grouping comparison pattern histogram according to claim 6, characterized in that taking the intersection of the circle with the positive direction of the line connecting the center of the circular image and the pixel coordinate point as the starting point and choosing 3d pixels at equal intervals on the circle is implemented as follows:
Step 4.11, with the pixel coordinate point $X_i$ of the local image as the center, draw a circle of radius 3 pixels; taking the intersection of this circle with the positive direction of the line connecting the center of the local image and the coordinate point as the starting point, choose 3d pixels at equal intervals on the circle to form the neighbor coordinate set of the pixel coordinate point $X_i$, $\{X_i^1, X_i^2, \ldots, X_i^{3d}\}$, and divide the coordinate set into d groups of three elements each, the grouping result being
$G_g = \{X_i^{g+1}, X_i^{g+1+d}, X_i^{g+1+2d}\}, \quad g = 0, 1, \cdots, d-1$
the gray-value set corresponding to the elements of the neighbor coordinate set being $\{I(X_i^1), I(X_i^2), \ldots, I(X_i^{3d})\}$;
Step 4.12, compare the gray values of the elements of each group and compute the local grouping comparison code
$\mathrm{LGC}(X_i) = \sum_{g=0}^{d-1} 3^{g} \times f\big(I(X_i^{g+1}), I(X_i^{g+1+d}), I(X_i^{g+1+2d})\big)$
where
$f(x, y, z) = \begin{cases} 0, & x \ge y, x \ge z \\ 1, & y \ge x, y \ge z \\ 2, & z \ge x, z \ge y \end{cases}$
$I(X_i^j)$ denotes the gray value of the corresponding pixel point, d is the number of groups into which the neighbor coordinate set is divided, and d takes the value 2;
Step 4.13, compute the variation of the gray values of the neighbor coordinate elements, i.e., the standard deviation
$\sigma = \sqrt{\frac{1}{3d-1} \sum_{j=1}^{3d} \big(I(X_i^j) - E(X_i)\big)^2}$
where
$E(X_i) = \frac{1}{3d} \sum_{j=1}^{3d} I(X_i^j)$
d is the number of groups into which the neighbor coordinate set is divided and takes the value 2, and $I(X_i^j)$ denotes the gray value of the corresponding pixel point;
Step 4.14, combine the standard deviation with the local grouping comparison code to compute the local grouping comparison pattern
$\mathrm{LGCP}_{\mathrm{LGC}(X_i)} = \sigma$
where $\sigma$ is the standard deviation, measuring the gray-level variation, and $\mathrm{LGC}(X_i)$ is the local grouping comparison code, describing the order relationships of the gray values of the pixels at the neighbor coordinates.
8. The local texture description method based on a local grouping comparison pattern histogram according to any one of claims 1 to 7, characterized in that step 5 is specifically implemented as follows:
5.1, for each subset, separately accumulating the local grouping comparison patterns that share the same grouping comparison code;
5.2, normalizing the local grouping comparison pattern histogram generated for each subset;
5.3, concatenating the local grouping comparison pattern histograms of the p subsets, p taking the value 6, to form the local grouping comparison pattern histogram of the circular image, which constitutes the local texture description of a single support region.
9. The local texture description method based on a local grouping comparison pattern histogram according to claim 8, characterized in that in step 5 the local grouping comparison pattern histogram of each sub-region, i.e., the sub-region feature description vector, is computed according to the index of the local grouping comparison pattern:
$H(R_i) = (H(0), H(1), \ldots, H(3^d - 1))$
where
$H(\mathrm{LGC}(X_i)) = \sum_{X_i \in R_i} \mathrm{LGCP}_{\mathrm{LGC}(X_i)}, \quad \mathrm{LGC}(X_i) = 0, 1, \ldots, 3^d - 1$
Here $\mathrm{LGC}(X_i)$ is the local grouping comparison code of point $X_i$ and $\mathrm{LGCP}_{\mathrm{LGC}(X_i)}$ is the local grouping comparison pattern of point $X_i$.
10. The local texture description method based on a local grouping comparison pattern histogram according to any one of claims 1 to 7, characterized in that step 6 is specifically implemented as follows:
Step 6.1, computing the feature vector descriptors of all support regions according to steps 2 to 5 of claim 1;
Step 6.2, concatenating the computed feature vector descriptors of the multiple local images to form the local image feature descriptor HLGCP proposed by the invention:
$D = (D(S_1), D(S_2), \ldots, D(S_q))$
where $D(S_i)$, $i = 1, 2, \ldots, q$, is the feature vector descriptor of the i-th support region, and q takes the value 4.
CN201310612650.7A 2013-11-26 2013-11-26 Local texture description method based on local grouping comparison pattern histogram Pending CN103700119A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310612650.7A CN103700119A (en) 2013-11-26 2013-11-26 Local texture description method based on local grouping comparison pattern histogram

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310612650.7A CN103700119A (en) 2013-11-26 2013-11-26 Local texture description method based on local grouping comparison pattern histogram

Publications (1)

Publication Number Publication Date
CN103700119A true CN103700119A (en) 2014-04-02

Family

ID=50361637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310612650.7A Pending CN103700119A (en) 2013-11-26 2013-11-26 Local texture description method based on local grouping comparison pattern histogram

Country Status (1)

Country Link
CN (1) CN103700119A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120070041A1 (en) * 2010-09-16 2012-03-22 Jie Wang System And Method For Face Verification Using Video Sequence
CN102314691A (en) * 2011-06-30 2012-01-11 北京平安视讯科技有限公司 Background model based on multiple information integration
CN102779273A (en) * 2012-06-29 2012-11-14 重庆邮电大学 Human-face identification method based on local contrast pattern
CN103295014A (en) * 2013-05-21 2013-09-11 上海交通大学 Image local feature description method based on pixel location arrangement column diagrams

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MARKO HEIKKILA et al.: "Description of interest regions with local binary patterns", Pattern Recognition, vol. 42, no. 3, 31 March 2009 (2009-03-31), pages 425-436 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108780509A (en) * 2016-03-15 2018-11-09 视觉科技(以色列)有限公司 Image comparison system and method
CN107862709A (en) * 2017-09-28 2018-03-30 北京华航无线电测量研究所 A kind of method for describing texture of image of multi-direction pattern concatenate rule
CN107862709B (en) * 2017-09-28 2020-03-27 北京华航无线电测量研究所 Image texture description method of multidirectional mode connection rule
CN108876832A (en) * 2018-05-30 2018-11-23 重庆邮电大学 Based on grouping-order modes robust texture features extracting method
CN108876832B (en) * 2018-05-30 2022-04-26 重庆邮电大学 Robust texture feature extraction method based on grouping-order mode
CN109410258A (en) * 2018-09-26 2019-03-01 重庆邮电大学 Texture image feature extracting method based on non local binary pattern
CN109410258B (en) * 2018-09-26 2021-12-10 重庆邮电大学 Texture image feature extraction method based on non-local binary pattern

Similar Documents

Publication Publication Date Title
David et al. Object recognition in high clutter images using line features
CN106960451B (en) Method for increasing number of feature points of image weak texture area
Yao et al. A new pedestrian detection method based on combined HOG and LSS features
US9619733B2 (en) Method for generating a hierarchical structured pattern based descriptor and method and device for recognizing object using the same
CN104809731B (en) A kind of rotation Scale invariant scene matching method based on gradient binaryzation
JP5703312B2 (en) Efficient scale space extraction and description of feature points
CN101833765B (en) Characteristic matching method based on bilateral matching and trilateral restraining
CN102592281B (en) Image matching method
CN103679702A (en) Matching method based on image edge vectors
CN103426186A (en) Improved SURF fast matching method
CN101650784B (en) Method for matching images by utilizing structural context characteristics
CN106355577A (en) Method and system for quickly matching images on basis of feature states and global consistency
JP6435048B2 (en) Image collation apparatus, image collation method, and program
Zhang et al. Line matching using appearance similarities and geometric constraints
CN103295014B (en) Image local feature description method based on pixel location arrangement column diagrams
CN101493891A (en) Characteristic extracting and describing method with mirror plate overturning invariability based on SIFT
CN103400384A (en) Large viewing angle image matching method capable of combining region matching and point matching
CN105335952B (en) Matching power flow computational methods and device and parallax value calculating method and equipment
CN102446356A (en) Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points
CN103700119A (en) Local texture description method based on local grouping comparison mode column diagram
CN113962967B (en) Object shot image ellipse detection algorithm based on Markuling theorem constraint
CN111709426A (en) Diatom identification method based on contour and texture
CN103336964A (en) SIFT image matching method based on module value difference mirror image invariant property
CN105631860A (en) Local sorted orientation histogram descriptor-based image correspondence point extraction method
Carneiro et al. Pruning local feature correspondences using shape context

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20140402

RJ01 Rejection of invention patent application after publication