CN103258202B - A kind of texture characteristic extracting method of robust - Google Patents


Publication number
CN103258202B
Authority
CN
China
Prior art keywords
pixel
lbp
input image
label
feature set
Prior art date
Legal status
Active
Application number
CN201310158760.0A
Other languages
Chinese (zh)
Other versions
CN103258202A (en)
Inventor
李宏亮
宋铁成
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201310158760.0A priority Critical patent/CN103258202B/en
Publication of CN103258202A publication Critical patent/CN103258202A/en
Application granted granted Critical
Publication of CN103258202B publication Critical patent/CN103258202B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a robust texture feature extraction method, belonging to the technical field of image processing. The steps of the invention are: preprocess the input image to generate a feature set F; binarize F against a per-feature threshold and binary-encode the result to produce a specific pixel label for each pixel; in parallel, apply rotation-invariant uniform LBP coding to the input image to generate an LBP label for each pixel; construct a 2-D co-occurrence histogram from the specific pixel labels and LBP labels of all pixels, and vectorize this histogram as the texture expression. Application of the invention reduces the binary quantization loss of the existing LBP mode while maintaining the robustness of the extracted features to illumination, rotation, scale and viewing-angle changes.

Description

Robust texture feature extraction method
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a robust texture feature extraction method.
Background
Texture features play an important role in visual recognition, and have been widely researched and applied in the fields of texture classification, retrieval, synthesis, segmentation, and the like. In general, texture images not only exhibit a wide variety of geometric and lighting changes, but are often accompanied by drastic intra-class and inter-class variations. Texture classification is a difficult task when a priori knowledge is not available. Therefore, extracting robust texture features is a core problem to solve these tasks.
Over the past several decades, many methods have been proposed to extract texture features. Early research focused on statistical, model-based and signal-processing features, such as co-occurrence matrices, Markov random fields and filter-bank methods. Later, methods based on primitives (textons) and Local Binary Patterns (LBP) were proposed. The former requires a learning process: a primitive dictionary is built by clustering local features of the training images, and a histogram expression is then built by counting primitive frequencies for a given texture. The latter requires no training process; it directly encodes binary-quantized local gray differences, thereby extracting the local microstructure information of the texture. For primitive-based methods see M. Varma and A. Zisserman, "A statistical approach to material classification using image patch exemplars," IEEE Trans. Pattern Anal. Mach. Intell., vol. 31, no. 11, pp. 2032-2047, Nov. 2009; for LBP see T. Ojala, M. Pietikäinen, and T. Mäenpää, "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns," IEEE Trans. Pattern Anal. Mach. Intell., vol. 24, no. 7, pp. 971-987, Jul. 2002.
LBP describes local texture with a binary sequence: for each pixel of the image, the pixel values of the neighboring sampling points around it are subtracted from the pixel value of the pixel itself; taking the sign of each difference yields a 0/1 binary sequence, which is then converted to a decimal number that serves as the texture identifier of the pixel, i.e. its LBP code.
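As a concrete illustration of this coding, here is a minimal NumPy sketch of the basic P = 8, R = 1 LBP over a square 3×3 neighborhood (function and variable names are illustrative; this is the classical operator described above, not the rotation-invariant variant the invention uses later):

```python
import numpy as np

def lbp_basic(img):
    """Basic P=8, R=1 LBP: threshold the 8 neighbors of each interior
    pixel against the center and pack the sign bits into a decimal code."""
    img = np.asarray(img, dtype=np.int64)
    h, w = img.shape
    # neighbor offsets, enumerated counter-clockwise around the center
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:h - 1, 1:w - 1]
    codes = np.zeros((h - 2, w - 2), dtype=np.int64)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        # sign bit s(g_p - g_c): 1 if neighbor >= center, else 0
        codes += (neighbor >= center).astype(np.int64) << bit
    return codes
```

A flat patch yields the all-ones code 255, while an isolated bright center yields 0; the code is sensitive to rotation, which motivates the rotation-invariant uniform variant used below.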
LBP is well known for its simplicity and efficiency, and has been widely used in texture classification, face recognition and object detection. In recent years, many improved algorithms based on LBP have been proposed, most of which fall into the following categories: selection of local patterns (e.g., ring or disk geometry), sampled features (e.g., high-order differential features, Gabor features, differential amplitude, block-based gray-scale means), quantization approaches (e.g., three-level quantization and adaptive quantization), coding rules (e.g., split ternary coding, statistical sign counts), and extensions to higher dimensions (e.g., stereo LBP, uniform spherical region description, Gabor stereo LBP, and color LBP).
LBP trades robustness to noise for the efficiency with which its binary quantization extracts local structure information. To effectively reduce this quantization loss while maintaining the robustness of the extracted features, the current LBP method needs to be improved.
Disclosure of Invention
The invention aims to: a robust texture feature extraction method is provided to reduce binary quantization loss in the existing LBP mode, and meanwhile, robustness of extracted features to illumination, rotation, scale and view angle changes is maintained.
The invention relates to a robust texture feature extraction method, which comprises the following steps:
step 1: generating a specific pixel label L(x) of the input image I, wherein x represents a pixel point of the input image I;
101: generating an n-dimensional feature set F = {f_i(x) | i = 1, 2, ..., n} for the input image I;
102: binarizing the feature set F to obtain a binary feature set B = {b_i(x) | i = 1, 2, ..., n; x ∈ I}:
if f_i(x) is greater than or equal to the threshold thr_i of that feature, then b_i(x) = 1; otherwise b_i(x) = 0;
103: encoding the binary feature set B to generate the specific pixel label L(x) of each pixel:
L(x) = Σ_{i=1}^{n} b_i(x) · 2^{i−1};
step 2, carrying out rotation-invariant uniform LBP coding on the input image I to generate an LBP label Z(x) of each pixel point;
and step 3, constructing a 2-dimensional co-occurrence histogram based on the specific pixel label L(x) and the LBP label Z(x) of each pixel point, and vectorizing and outputting the 2-dimensional co-occurrence histogram.
The original LBP method extracts texture features only from the difference information of a small neighborhood of the original image and is sensitive to noise. The invention provides a robust texture extraction method that extracts texture features from high-order gradient-domain information over a larger support region; the extracted features are more robust, expressive and discriminative than those of the conventional LBP mode. Compared with primitive-based texture feature extraction, the method omits the training and clustering processes that occupy a large amount of system resources, and directly binarizes and encodes the generated feature set of the input image, so it is simpler and more efficient to implement than primitive-based methods.
In order to obtain a relatively stable quantization threshold while maintaining the efficiency of the invention in extracting texture features, the threshold thr_i in step 102 is: the mean, over the whole image I, of the feature f_i corresponding to each pixel of the input image I.
In conclusion, the method is simple and convenient to implement, reduces the binary quantization loss of the conventional LBP mode, maintains the robustness of the extracted features to illumination, rotation, scale and viewing-angle changes, and yields texture features with high discriminability.
Drawings
The invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a flow chart of an embodiment of the present invention.
Detailed Description
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.
In the texture feature extraction process of the invention, in order to reduce the binary quantization loss and obtain more discriminative features, useful information is extracted from a larger local support region while the robustness of the extracted features to illumination, rotation, scale and viewing-angle changes is maintained. Referring to FIG. 1, the specific implementation steps are:
Step S100: generate the feature set F of the input image I. This is the preprocessing step of the invention and can be implemented in any existing mature manner; for example, in this embodiment the following steps (1) to (4) are adopted to generate the feature set F:
(1): normalize the input image I to remove the influence of illumination; the normalization can be any existing mature method, such as histogram equalization, and normalization using the mean and standard deviation of the image I is preferred;
(2): obtain rotation-invariant filter responses at each scale based on multi-scale, multi-directional edge filtering (first-order Gaussian partial derivatives) and bar filtering (second-order Gaussian partial derivatives): on each scale, convolve the normalized image I with edge and bar filters in m directions, and record the directional filter response with the maximum amplitude at that scale; each pixel x of the image I thus obtains n − 1 stable filter responses (n denotes the dimension of the feature set F);
the number of different scales can be set according to actual conditions, preferably, 3 scales are considered, each scale adopts 8-direction filtering, and the sizes of the 3 scales in the horizontal and vertical directions of the image I can be sequentially selected as follows: (1,3) (2,6), (3,12), and other values can be taken according to actual requirements and applications.
When based on 3 scales, each pixel point x will obtain 6 stable filter responses.
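A rough NumPy sketch of this filtering stage follows, under stated simplifications that are not from the patent: fixed 13×13 kernels regardless of scale (coarser scales would normally use larger supports), zero-mean oriented Gaussian-derivative kernels, and 'same'-size correlation with reflected borders; all function names are illustrative:

```python
import numpy as np

def oriented_kernel(sx, sy, theta, order, size=13):
    """Anisotropic Gaussian-derivative kernel at orientation theta.
    order=1 ~ edge filter (1st derivative), order=2 ~ bar filter (2nd)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-0.5 * ((xr / sx) ** 2 + (yr / sy) ** 2))
    k = (-yr / sy ** 2) * g if order == 1 else (yr ** 2 / sy ** 4 - 1 / sy ** 2) * g
    return k - k.mean()                          # zero DC response

def correlate_same(img, k):
    """'Same'-size correlation with reflected borders (pure NumPy)."""
    kh, kw = k.shape
    p = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode='reflect')
    win = np.lib.stride_tricks.sliding_window_view(p, (kh, kw))
    return np.einsum('ijkl,kl->ij', win, k)

def rot_invariant_responses(img, scales=((1, 3), (2, 6), (3, 12)), m=8):
    """For each scale and each filter type (edge, bar), keep per pixel the
    orientation whose response has the largest magnitude -> 6 maps."""
    img = np.asarray(img, dtype=np.float64)
    out, thetas = [], [np.pi * j / m for j in range(m)]
    for sx, sy in scales:
        for order in (1, 2):
            resp = np.stack([correlate_same(img, oriented_kernel(sx, sy, t, order))
                             for t in thetas])
            idx = np.abs(resp).argmax(axis=0)    # best orientation per pixel
            out.append(np.take_along_axis(resp, idx[None], axis=0)[0])
    return out
```

With the 3 preferred scales this returns the 6 per-pixel rotation-invariant response maps described above; taking the maximum-magnitude orientation is what makes each map invariant to image rotation.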
(3): normalize the n − 1 filter responses obtained for each pixel; any existing normalization method may be used, and Weber's-law normalization is preferred by the invention, namely
R(x) ← R(x)·log(1 + M(x)/0.03)/M(x)
where R(x) denotes a filter response, M(x) = ||R(x)||_2 denotes the magnitude of the filter response, and ||·||_2 denotes the 2-norm.
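This normalization step can be sketched in a few lines of NumPy (an illustration, not the patent's implementation; `weber_normalize` is an invented name, and stacking the n − 1 responses along the first axis is an assumption):

```python
import numpy as np

def weber_normalize(R, c=0.03):
    """Weber's-law normalization of a stack R of filter responses,
    shape (n-1, H, W); M(x) = ||R(x)||_2 taken across the responses."""
    R = np.asarray(R, dtype=np.float64)
    M = np.sqrt((R ** 2).sum(axis=0))      # response magnitude per pixel
    M = np.where(M > 0, M, 1.0)            # guard against division by zero
    return R * np.log1p(M / c) / M         # R <- R * log(1 + M/c) / M
```

The log compresses large responses, so the normalized features vary more gently with local contrast, in the spirit of Weber's law.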
(4): construct an n-dimensional (n-D) feature set F from the normalized image I and the n − 1 filter responses, expressed as:
F = {f_i(x) | i = 1, 2, ..., n; x ∈ I}
step S200: carrying out binarization operation on the feature set F to obtain a binary feature set B ═ Bi(x)|i=1,2,...,n;x∈I}:
Wherein, thriRepresenting a feature fiThreshold value of, any feature fi(x) Threshold value thr ofiThe threshold value thr may be preset, preferably, based on an empirical valueiComprises the following steps: characteristic f corresponding to each pixel point on input image IiMean over the entire image I, i.e.Wherein X represents the number of pixel points X of the input image I.
Step S300: for the binary feature set B ═ Bi(x) I1, 2, n, x ∈ I, to generate a specific pixel label l (x), i.e. I
L ( x ) = Σ i = 1 n b i ( x ) 2 i - 1
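Steps S200 and S300 together reduce to a few lines of NumPy (a sketch with illustrative names; the mean threshold is the preferred choice stated above):

```python
import numpy as np

def pixel_labels(F):
    """Binarize each feature map f_i against its mean over the image
    (thr_i) and pack the n bits into L(x) = sum_i b_i(x) * 2**(i-1)."""
    F = np.asarray(F, dtype=np.float64)        # shape (n, H, W)
    thr = F.mean(axis=(1, 2), keepdims=True)   # thr_i: image-wide mean of f_i
    B = (F >= thr).astype(np.int64)            # b_i(x) in {0, 1}
    weights = (2 ** np.arange(F.shape[0])).reshape(-1, 1, 1)
    return (B * weights).sum(axis=0)           # L(x) in [0, 2**n - 1]
```

Each pixel thus receives an integer label in [0, 2^n − 1] encoding which of its n features exceed their image-wide means.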
Step S400: carrying out rotation invariant uniform LBP coding on an input image I to generate a neighbor information coding label Z (x) of each pixel, namely
Wherein,indicating a rotationally invariant uniform pattern; central pixel value gcNamely the pixel value of the current pixel point x; gpThe pixel value of the p-th sampling pixel point with the sampling radius of R around the central pixel is obtained; u (LBP)P,R) Representing the number of 0-1 transitions of a uniform measure, i.e. a circumferential bit string consisting of the sign of the difference between the pixel values of the sampling neighbours and the central pixel; s (t) is a sign function, i.e., t is a negative number, and the function value is 0, otherwise it is 1.
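A straightforward (unoptimized) NumPy sketch of this riu2 coding, with bilinear interpolation for the circular sampling points; names are illustrative and border pixels are simply left at 0:

```python
import numpy as np

def lbp_riu2(img, P=8, R=1.0):
    """Rotation-invariant uniform LBP (riu2) labels for interior pixels:
    if the circular 0/1 string has at most 2 transitions (uniform), the
    label is the number of neighbors >= center; otherwise it is P + 1."""
    img = np.asarray(img, dtype=np.float64)
    h, w = img.shape
    pad = int(np.ceil(R))
    labels = np.zeros((h, w), dtype=np.int32)
    for y in range(pad, h - pad):
        for x in range(pad, w - pad):
            gc = img[y, x]
            bits = []
            for p in range(P):
                ang = 2.0 * np.pi * p / P
                sy, sx = y - R * np.sin(ang), x + R * np.cos(ang)
                y0 = min(int(np.floor(sy)), h - 2)  # clamp exact integers
                x0 = min(int(np.floor(sx)), w - 2)
                dy, dx = sy - y0, sx - x0
                gp = (img[y0, x0] * (1 - dy) * (1 - dx)          # bilinear
                      + img[y0, x0 + 1] * (1 - dy) * dx
                      + img[y0 + 1, x0] * dy * (1 - dx)
                      + img[y0 + 1, x0 + 1] * dy * dx)
                bits.append(1 if gp >= gc else 0)                # s(g_p - g_c)
            U = sum(bits[p] != bits[(p + 1) % P] for p in range(P))
            labels[y, x] = sum(bits) if U <= 2 else P + 1
    return labels
```

Because a rotation of the image only shifts the circular bit string, the count of 1-bits (for uniform strings) is unchanged, which is what makes the label rotation-invariant.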
Step S500: histogram expression. Constructing a 2-D symbiotic histogram based on specific pixel labels L (x) and LBP labels Z (x) of all pixel points, and then vectorizing the symbiotic histogram to obtain a 2-D symbiotic histogram with dimension of 2n× (P +2), which is the final texture expression, the 2-D symbiotic histogram is calculated as follows:
H ( l , p ) = Σ x δ ( ( L ( x ) , Z ( x ) ) = = ( l , p ) )
where H (l, p) represents a 2-D histogram with index (l, p), l ∈ [0,2 ]n-1],p∈[0,P+1],And (y) is used for accumulating the times of pixel coding as (l, p) in the image. For example, for pixel x0If L (x)0)=1,Z(x0) When the index (l, p) in H (l, p) is equal to (1,2), 1 is added.
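The co-occurrence histogram of step S500 is a few lines of NumPy (a sketch; `np.add.at` performs the δ accumulation):

```python
import numpy as np

def cooccurrence_histogram(L, Z, n, P):
    """Joint 2-D histogram H(l, p) over labels L(x) in [0, 2^n - 1] and
    Z(x) in [0, P + 1], returned vectorized with length 2^n * (P + 2)."""
    H = np.zeros((2 ** n, P + 2), dtype=np.int64)
    np.add.at(H, (np.ravel(L), np.ravel(Z)), 1)  # count pixels per (l, p)
    return H.ravel()
```

For example, a single pixel with L(x_0) = 1 and Z(x_0) = 2 increments only the bin (1, 2), which sits at flat index 1·(P + 2) + 2 of the vectorized histogram.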
In the present invention, the processing for generating the specific pixel label L(x) and the LBP label Z(x) may be executed in parallel or in series, selected according to the actual application requirements.
Using edge and bar filtering in 8 directions at the 3 scales (1,3), (2,6) and (3,12), the texture feature extraction method of the invention was used to classify three texture databases with illumination, rotation, viewing-angle and scale changes: Outex, CUReT and UIUC. Compared with LBP-based and learning-based (e.g., primitive-based) classification algorithms, the classification performance is obviously improved; moreover, the resulting feature expression has dimension 1280, which is lower than that of primitive-based methods (a typical dimension is 2440).
The invention is not limited to the foregoing embodiments. The invention extends to any novel feature or any novel combination of features disclosed in this specification and any novel method or process steps or any novel combination of features disclosed.

Claims (8)

1. A robust texture feature extraction method is characterized by comprising the following steps:
step 1: generating a specific pixel label L(x) of the input image I, wherein x represents a pixel point of the input image I;
101: generating an n-dimensional feature set F = {f_i(x) | i = 1, 2, ..., n} for the input image I:
Carrying out normalization processing on the input image I;
performing multi-scale, multi-directional filtering on the normalized image I: at each scale, filtering in m directions is adopted; based on edge filtering and bar filtering respectively, the directional filter response with the maximum amplitude at each scale is taken as the rotation-invariant filter response at that scale, and the filter responses are normalized;
forming the n-dimensional feature set F = {f_i(x) | i = 1, 2, ..., n; x ∈ I} from the normalized image I and the filter responses;
102: binarizing the feature set F to obtain a binary feature set B = {b_i(x) | i = 1, 2, ..., n; x ∈ I}:
if f_i(x) is greater than or equal to the threshold thr_i of that feature, then b_i(x) = 1; otherwise b_i(x) = 0;
103: encoding the binary feature set B to generate the specific pixel label L(x) of each pixel: L(x) = Σ_{i=1}^{n} b_i(x) · 2^{i−1};
step 2, carrying out rotation-invariant uniform LBP coding on the input image I to generate an LBP label Z(x) of each pixel point;
and step 3, constructing a 2-dimensional co-occurrence histogram based on the specific pixel label L(x) and the LBP label Z(x) of each pixel point, and vectorizing and outputting the 2-dimensional co-occurrence histogram.
2. The method of claim 1, wherein the threshold thr_i in step 102 is: the mean, over the whole image I, of the feature f_i corresponding to each pixel of the input image I.
3. The method according to claim 1 or 2, wherein the normalization processing of the input image I is normalization that removes illumination changes.
4. The method of claim 3, wherein the normalization that removes illumination changes is: normalization using the pixel-value mean and standard deviation of the pixels of the input image I.
5. The method according to claim 1 or 2, wherein the filter responses are normalized by: R(x) ← R(x)·log(1 + M(x)/0.03)/M(x), where R(x) denotes a filter response and M(x) = ||R(x)||_2 denotes the magnitude of the filter response.
6. The method of claim 1 or 2, wherein m of the m directional filters is 8.
7. The method according to claim 1 or 2, wherein the multiscale is in particular 3 scales.
8. The method of claim 7, wherein the sizes of the 3 scales in the horizontal and vertical directions of the image I are, in order: (1,3), (2,6), (3,12).
CN201310158760.0A 2013-05-02 2013-05-02 A kind of texture characteristic extracting method of robust Active CN103258202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310158760.0A CN103258202B (en) 2013-05-02 2013-05-02 A kind of texture characteristic extracting method of robust


Publications (2)

Publication Number Publication Date
CN103258202A CN103258202A (en) 2013-08-21
CN103258202B true CN103258202B (en) 2016-06-29

Family

ID=48962106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310158760.0A Active CN103258202B (en) 2013-05-02 2013-05-02 A kind of texture characteristic extracting method of robust

Country Status (1)

Country Link
CN (1) CN103258202B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103761507B (en) * 2014-01-03 2017-02-08 东南大学 Local multi-value pattern face recognition method based on Weber law
CN105046262B (en) * 2015-06-29 2018-08-17 中国人民解放军国防科学技术大学 A kind of robust extension local binary patterns texture characteristic extracting method
CN106788722B (en) * 2016-11-30 2019-10-15 东南大学 A kind of inter-symbol interference cancellation method of pixel modulation visible light communication system
CN108629262B (en) * 2017-03-18 2021-08-20 上海荆虹电子科技有限公司 Iris identification method and corresponding device
CN107403451B (en) * 2017-06-16 2020-11-10 西安电子科技大学 Self-adaptive binary characteristic monocular vision odometer method, computer and robot
CN108876832B (en) * 2018-05-30 2022-04-26 重庆邮电大学 Robust texture feature extraction method based on grouping-order mode
CN109271997B (en) * 2018-08-28 2022-01-28 河南科技大学 Image texture classification method based on skip subdivision local mode
CN109410258B (en) * 2018-09-26 2021-12-10 重庆邮电大学 Texture image feature extraction method based on non-local binary pattern

Citations (4)

Publication number Priority date Publication date Assignee Title
US6647132B1 (en) * 1999-08-06 2003-11-11 Cognex Technology And Investment Corporation Methods and apparatuses for identifying regions of similar texture in an image
CN102542571A (en) * 2010-12-17 2012-07-04 中国移动通信集团广东有限公司 Moving target detecting method and device
CN102663436A (en) * 2012-05-03 2012-09-12 武汉大学 Self-adapting characteristic extracting method for optical texture images and synthetic aperture radar (SAR) images
CN103035013A (en) * 2013-01-08 2013-04-10 东北师范大学 Accurate moving shadow detection method based on multi-feature fusion

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
IT1404810B1 (en) * 2011-01-26 2013-11-29 St Microelectronics Srl RECOGNITION OF TEXTURES IN IMAGE PROCESSING


Non-Patent Citations (2)

Title
A Completed Modeling of Local Binary Pattern Operator for Texture Classification; Zhenhua Guo et al.; IEEE Transactions on Image Processing; Jun. 2010; vol. 19, no. 6; pp. 1657-1663 *
A statistical approach to texture classification from single images; Manik Varma et al.; Int. J. Comput. Vision; Apr. 2005; vol. 62, no. 1-2; pp. 61-68 *

Also Published As

Publication number Publication date
CN103258202A (en) 2013-08-21

Similar Documents

Publication Publication Date Title
CN103258202B (en) A kind of texture characteristic extracting method of robust
Lukic et al. Leaf recognition algorithm using support vector machine with Hu moments and local binary patterns
CN110334762B (en) Feature matching method based on quad tree combined with ORB and SIFT
CN104778457B (en) Video face identification method based on multi-instance learning
CN105321176A (en) Image segmentation method based on hierarchical higher order conditional random field
CN104392463A (en) Image salient region detection method based on joint sparse multi-scale fusion
CN103778434A (en) Face recognition method based on multi-resolution multi-threshold local binary pattern
Shamrat et al. Bangla numerical sign language recognition using convolutional neural networks
Khmag et al. Recognition system for leaf images based on its leaf contour and centroid
Wen et al. Virus image classification using multi-scale completed local binary pattern features extracted from filtered images by multi-scale principal component analysis
Simon et al. Review of texture descriptors for texture classification
CN103390170A (en) Surface feature type texture classification method based on multispectral remote sensing image texture elements
CN104346628A (en) License plate Chinese character recognition method based on multi-scale and multidirectional Gabor characteristic
Chitaliya et al. An efficient method for face feature extraction and recognition based on contourlet transform and principal component analysis using neural network
CN102136074A (en) Man-machine interface (MMI) based wood image texture analyzing and identifying method
CN113744241A (en) Cell image segmentation method based on improved SLIC algorithm
Iamsiri et al. A new shape descriptor and segmentation algorithm for automated classifying of multiple-morphological filamentous algae
CN110490210B (en) Color texture classification method based on t sampling difference between compact channels
CN111401485A (en) Practical texture classification method
CN103902965A (en) Spatial co-occurrence image representing method and application thereof in image classification and recognition
Hammouche et al. A clustering method based on multidimensional texture analysis
Liang et al. Multi-resolution local binary patterns for image classification
Park et al. Image retrieval technique using rearranged freeman chain code
CN110489587B (en) Tire trace image feature extraction method in local gradient direction three-value mode
Nair et al. A survey on feature descriptors for texture image classification

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant