CN113963164A - Texture feature extraction method based on grouping neighborhood intensity difference coding - Google Patents


Info

Publication number
CN113963164A
CN113963164A (application CN202111354116.1A)
Authority
CN
China
Prior art keywords
texture
image
domain
group
texture feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111354116.1A
Other languages
Chinese (zh)
Inventor
林亚平
张怡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan University
Original Assignee
Hunan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan University filed Critical Hunan University
Priority to CN202111354116.1A priority Critical patent/CN113963164A/en
Publication of CN113963164A publication Critical patent/CN113963164A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/2113 — Selection of the most significant subset of features by ranking or filtering the set of features, e.g. using a measure of variance or of feature cross-correlation
    • G06F18/213 — Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/253 — Fusion techniques of extracted features

Abstract

The invention relates to a texture feature extraction method based on grouping neighborhood intensity difference coding. The method mainly comprises the following steps: the intensity differences between the pixels of each group are encoded as supplementary information; a cross-image-domain texture extraction method is proposed, in which the original image is converted to the gradient domain and texture features are extracted there; finally, texture information from different scales and different image domains is fused, and Top-N dimensionality reduction yields the final texture feature expression. Compared with other texture image classification methods, the proposed method offers high classification performance and better robustness under Gaussian noise, scale change and illumination change, and has practical application value in biomedical image analysis, satellite remote sensing, target recognition and face recognition.

Description

Texture feature extraction method based on grouping neighborhood intensity difference coding
Technical Field
The invention relates to the field of pattern recognition and image recognition, in particular to a texture feature extraction method based on grouping neighborhood intensity difference coding.
Background
The most common features of an image include its color, shape and texture. Among these, texture is a key visual cue by which humans perceive different objects. Texture is a basic primitive of visual patterns and a natural attribute of object surfaces, and texture recognition plays an important role in computer vision. Because images contain abundant texture information, texture recognition is widely applied in many fields, including biomedical image analysis, satellite remote sensing, target recognition, image retrieval and face recognition.
Research on texture recognition focuses mainly on the extraction of texture features. Among texture extraction methods, the Local Binary Pattern (LBP) is one of the most important, and its simplicity and efficiency have earned it sustained attention from researchers. LBP is a non-parametric operator describing local image characteristics; it is simple, effective, gray-scale invariant and requires no training or learning. The original LBP method nevertheless has the following limitations: the resulting histogram vector is too long; rotation invariance is weak; only very local texture is captured; single-scale local binary differences have limited discriminative power; and the operator is very sensitive to noise. Moreover, it is difficult to describe the intrinsic patterns of complicated textures with a general model, and different imaging conditions (such as illumination, rotation, scale and deformation) affect a model's interpretive capability.
To better utilize the difference information between adjacent sampling points, the document "Pixel-to-patch sampling structure and local neighboring intensity relationship patterns for texture classification. IEEE Signal Processing Letters, 2013" designs the Local Neighboring Intensity Relationship Pattern (LNIRP), which describes both a micro pattern and a macro pattern. Song et al. proposed two complementary texture descriptors, the Local Grouping Order Pattern (LGOP) and the Non-Local Binary Pattern (NLBP), in "Robust Texture Description Using Local Grouped Order Pattern and Non-Local Binary Pattern. IEEE TCSVT, 2020". The LGOP descriptor captures the grouped order relationship between different neighbors. The design of NLBP is consistent with the LBP coding scheme, i.e. it encodes pixel intensity difference information between non-local pixels and adjacent sampling points.
Although this method performs well, it still has shortcomings. First, it uses only pixel-level information to explore local patterns in the original image domain, ignoring the correlations between different image domains. Second, since the intensity differences between adjacent sampling points within a group are also a non-negligible distinguishing feature, this characteristic is lost when only the order relationship between sampling points is considered. These two limitations make the generated texture representation less robust, and its classification capability in noisy or scale-varying environments remains to be improved.
Based on the above, this patent proposes a grouping neighborhood intensity difference coding scheme that provides auxiliary neighborhood information, fuses texture information from different scales and different image domains, and uses Top-N dimensionality reduction to obtain the final texture feature expression, achieving superior classification performance under Gaussian noise, scale variation and illumination variation.
Disclosure of Invention
The invention provides a grouping neighborhood intensity difference coding scheme and a cross-image-domain texture feature extraction method, which are robust to Gaussian noise and to changes in the imaging environment (such as image rotation, illumination and scaling) and achieve good texture classification performance. The method mainly comprises the following steps:
(1) A method for coding intra-group neighborhood difference information based on the local grouping order pattern is proposed, capturing complementary information to further resist the interference of external noise.
(2) A cross-image-domain method is adopted so that the original texture descriptor is more robust to Gaussian noise, scale change and illumination change. Top-N is used to reduce the dimensionality of the generated feature histogram.
The specific contents are as follows:
(1) A texture feature extraction method for coding intra-group neighborhood difference information based on the Local Grouping Order Pattern (LGOP) is proposed; the model structure is shown in Fig. 3. First, a dominant direction of the neighborhood sampling points is calculated for each sampling region, and the local sampling points are rotated according to it; D is the index of the sampling point whose intensity differs most from the central pixel. Next, the rotated sampling sequence is divided into m = P/4 groups uniformly distributed on the circle. The calculation rule of D is:

D = \arg\max_{p \in \{0, \dots, P-1\}} \left| \bar{g}_p - \bar{g}_c \right|

The resulting vector of neighborhood samples is:

V = \left[ \bar{g}_{\mathrm{mod}(D, P)}, \bar{g}_{\mathrm{mod}(D+1, P)}, \dots, \bar{g}_{\mathrm{mod}(D+P-1, P)} \right]

where P denotes the number of sampling points, \bar{g}_c is the average gray value of the local image block containing the central pixel, \bar{g}_p is the average gray value of the local image block around the p-th sampling point, and V is the rotated version of (\bar{g}_0, \dots, \bar{g}_{P-1}). V_k denotes the vector whose components are the gray values of the k-th group of sampling points. Finally, the gray-order relationship of the local sampling points around the central pixel is encoded by the ordering function γ(·) and the mapping function f(·). The mapping function maps each group of adjacent sampling points to an integer between 0 and 23, called the LGOP code, as shown in Fig. 2. The calculation rule of the LGOP code is:

LGOP_{r,P,k} = f(\gamma(V_k))
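As a concrete illustration of the γ(·)/f(·) encoding above, the sketch below maps one group of four neighborhood sample values to an integer in 0–23 by ranking the values and looking up the rank permutation. The tie-breaking convention (by position) and all names here are our own assumptions; the patent does not specify them.

```python
from itertools import permutations

# Lookup table: each of the 4! = 24 orderings of a group of four
# sample values maps to a unique integer in [0, 23].
_PERM_INDEX = {p: i for i, p in enumerate(permutations(range(4)))}

def lgop_code(group):
    """Map one group of 4 neighborhood sample values to an LGOP code.

    gamma(.) is realized as the rank order of the values; f(.) as a
    permutation-to-integer lookup. Ties are broken by position, which
    is one possible convention (not fixed by the patent text).
    """
    # order[i] = index of the i-th smallest element of the group
    order = tuple(sorted(range(4), key=lambda i: (group[i], i)))
    return _PERM_INDEX[order]
```

For example, a strictly increasing group maps to code 0 and a strictly decreasing one to code 23, the two extreme permutations.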
the intensity difference between each group of pixels is considered as supplementary information. Based on the local grouping order mode, we propose a first order differential coding scheme based on intra-group pixel-to-symbol differences. The difference in gray scale symbols of the first two adjacent pixels and the last two adjacent pixels in each group is first calculated, and then the first order difference between the two results is calculated. And if the difference result is greater than or equal to 0, the value is 1, otherwise, the value is 0. The sum of the pixel sign differences within a group during one sampling is denoted as S.
Figure BDA0003353615790000032
Where s () is a sign function. The process of calculating the difference between adjacent sample points of a packet is shown in fig. 1, where (P, r) — (12, 3), which is called an extended partial packet pattern (ELGOP).
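The intra-group sign-difference rule just described can be sketched as follows. This is a minimal illustration; the variable names and the convention that s(·) returns 1 for non-negative arguments are assumptions consistent with LBP-style coding.

```python
def group_sign_difference_sum(groups):
    """Sum S of intra-group first-order sign differences over one sampling.

    Each group is a sequence of 4 neighboring sample values. Following the
    description: d1 = s(v1 - v2) for the first two pixels, d2 = s(v3 - v4)
    for the last two, and the group contributes 1 when d1 - d2 >= 0.
    """
    s = lambda x: 1 if x >= 0 else 0  # sign function, LBP convention
    total = 0
    for v1, v2, v3, v4 in groups:
        d1, d2 = s(v1 - v2), s(v3 - v4)
        total += 1 if (d1 - d2) >= 0 else 0
    return total
```

With m groups per sampling, S takes values in {0, …, m}, matching the stated range of the LGOP-M code.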
(2) A cross-image-domain method is adopted to extract richer texture information. For the input texture image, the first derivatives of a Gaussian filter are used to obtain the gradient response, transforming the original image to the gradient domain. Given an image I, its gradient representation is calculated as:

I_{\mathrm{grad}} = \sqrt{ (I \otimes G_x)^2 + (I \otimes G_y)^2 }

where G_x and G_y are the first derivatives of the two-dimensional Gaussian function

G(x, y) = \frac{1}{2\pi\sigma^2} \exp\!\left( -\frac{x^2 + y^2}{2\sigma^2} \right)

⊗ represents the convolution operation, and σ is a scale parameter determined by experimental conditions.
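A minimal NumPy sketch of this gradient-domain transform, building the first-derivative Gaussian kernels directly. The kernel radius and the zero-padding boundary handling are our own choices; the patent only fixes σ experimentally.

```python
import numpy as np

def gradient_domain(image, sigma=1.0, radius=3):
    """Transform an image to the gradient domain using first-order
    Gaussian derivative filters (magnitude of the two responses)."""
    ax = np.arange(-radius, radius + 1, dtype=float)
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    gx = -xx / sigma**2 * g          # dG/dx
    gy = -yy / sigma**2 * g          # dG/dy

    def conv2(img, k):
        # 'same'-size true 2-D convolution with zero padding
        r = k.shape[0] // 2
        padded = np.pad(img, r)
        out = np.zeros_like(img, dtype=float)
        kf = k[::-1, ::-1]           # flip kernel for convolution
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                out[i, j] = np.sum(padded[i:i + k.shape[0],
                                          j:j + k.shape[1]] * kf)
        return out

    ix, iy = conv2(image, gx), conv2(image, gy)
    return np.sqrt(ix**2 + iy**2)    # gradient magnitude response
```

On a flat region the response is zero (the derivative kernels sum to zero), while intensity edges produce large responses, which is the texture-relevant structure carried into the gradient domain.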
The ELGOP descriptor and the NLBP descriptor from "Robust Texture Description Using Local Grouped Order Pattern and Non-Local Binary Pattern. IEEE TCSVT, 2020" are extracted in the gradient domain to obtain the gradient texture representation. Since multi-scale extraction in the gradient domain does not noticeably improve classification accuracy while increasing computational cost and memory consumption, only a single scale is used to extract texture features there. Finally, richer texture information is extracted by combining the texture features of the original image with those of the gradient domain. To limit the feature dimensionality added by this process, the bin counts of the resulting feature histogram are sorted in descending order and the patterns corresponding to the first N bins are selected, with N calculated as:

N = \min\left\{ n \;:\; \sum_{i=1}^{n} h_{(i)} \Big/ \sum_{i=1}^{M} h_i \ge \theta \right\}

where h_{(i)} are the bin counts sorted in descending order and M is the total number of bins in the histogram; classification accuracy approaches saturation at θ = 0.9.
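The Top-N selection can be sketched as below: bins are sorted by count in descending order and the smallest N reaching coverage θ is kept. Function and variable names are illustrative, not from the patent.

```python
import numpy as np

def top_n_select(hist, theta=0.9):
    """Return the indices of the smallest set of bins whose
    (descending-sorted) counts cover a fraction theta of the
    histogram mass. theta = 0.9 is the value at which the patent
    reports classification accuracy saturating."""
    order = np.argsort(hist)[::-1]            # bins by count, descending
    cum = np.cumsum(hist[order]) / np.sum(hist)
    n = int(np.searchsorted(cum, theta) + 1)  # smallest N with coverage >= theta
    return order[:n]
```

Only the selected pattern bins are retained in the final feature expression, reducing its dimensionality.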
Compared with the prior art, the technical scheme has at least the following notable effects:
1. The invention provides a texture feature extraction method that encodes intra-group neighborhood difference information based on the local grouping order pattern.
2. The invention provides a cross-image-domain texture extraction method: the original image is converted to the gradient domain, texture features are extracted there, and texture information from different scales and different image domains is fused to obtain the final texture feature expression. The final classification result is more robust to Gaussian noise, scale change and illumination change.
Drawings
FIG. 1 is a schematic diagram of an implementation flow of extracting multi-scale texture features in an original image domain and extracting single-scale texture features in a gradient domain and connecting the extracted multi-scale texture features and the single-scale texture features into a histogram;
FIG. 2 is a diagram illustrating mapping of an f-function in the prior art;
FIG. 3 is a diagram of an ELGOP texture descriptor model according to the present invention;
Detailed Description
The invention discloses a texture feature extraction method based on grouping neighborhood intensity difference coding. For convenience of illustration, this embodiment takes the Outex TC10 texture dataset as an example, but those skilled in the art will appreciate that the technical solution does not limit the kind of texture image dataset. These embodiments merely explain the technical principles of the invention and are not intended to limit its scope.
The method can be implemented according to the following steps:
Step one: preprocessing.
Samples in TC10 with a rotation angle of 0° are used as the training set, and the samples at the remaining angles as the test set. Each image is normalized using its mean and standard deviation, then convolved with Gaussian derivative filters to obtain the gradient image.
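The per-image normalization in this preprocessing step amounts to standard zero-mean, unit-variance rescaling, as sketched below; the small epsilon guarding against constant images is our own addition.

```python
import numpy as np

def normalize(image):
    """Normalize an image with its own mean and standard deviation,
    as in the preprocessing step."""
    image = image.astype(float)
    return (image - image.mean()) / (image.std() + 1e-12)
```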
Step two: extract texture features of the original image.
In the original image domain, with sampling radii r = 1, 3, 5 and P = 24 neighborhood sampling points, the extended local grouping order pattern is calculated as:

ELGOP\text{-}C_{r,P} = LGOP\text{-}C_{r,P,k} + LGOP\text{-}M_{r,P}

where LGOP\text{-}C_{r,P,k} \in \{0, \dots, 47\} is the joint encoding of the central pixel and the LGOP descriptor:

LGOP\text{-}C_{r,P,k} = LGOP_{r,P,k} + C_{r,P} \times 24, \qquad C_{r,P} = u(\bar{g}_c - c_I)

where \bar{g}_c is the average gray value of the local image block containing the central pixel x, and c_I is the mean of \bar{g}_c over the whole image I.
In this embodiment, the 24 neighborhood sampling points are divided into 6 groups of 4. LGOP-M_{r,P} ∈ {0, …, 6} encodes the local pixel difference information within each group: first the gray-level sign differences of the first two and the last two adjacent pixels in each neighborhood group are computed, then the first-order difference between the two results; if this difference is greater than or equal to 0 the value is 1, otherwise 0. The specific flow is shown in Fig. 1 (with P = 12 and number of groups m = 3).
For the different sampling radii in the original image domain, the non-local binary pattern is calculated:

NLBP\text{-}C_{r,P,j} = NLBP_{r,P,j} + C_{r,P} \times (P + 6)

The computation of NLBP comes from the prior art; it aims to capture the interaction between the neighborhood pixels and distant pixels. First, several non-local central pixels g_{A_j} (j = 1, …, J), called anchor points, are computed from global image statistics, J being the number of anchor points. The sign difference information between the neighborhood samples and these anchor points is then encoded.
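To make the anchor idea concrete, here is a heavily hedged sketch of non-local encoding. Taking the anchors as quantile means of the whole image is purely our assumption for illustration; the exact anchor statistic is defined in the cited NLBP paper, not in this patent text.

```python
import numpy as np

def nlbp_code(samples, image, n_anchors=2):
    """Sketch of the non-local binary pattern idea: encode the sign
    difference between each neighborhood sample and global anchor
    points. Anchors here are the midpoints of equal-probability
    intensity quantile bins (an assumption, not the patent's rule)."""
    qs = np.quantile(image, np.linspace(0, 1, n_anchors + 1))
    anchors = [(qs[j] + qs[j + 1]) / 2 for j in range(n_anchors)]
    code, bit = 0, 0
    for a in anchors:
        for v in samples:
            code |= (1 if v >= a else 0) << bit  # sign bit per (sample, anchor)
            bit += 1
    return code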
The extended local grouping order patterns and non-local binary patterns obtained at the different scales are vectorized to obtain the feature expression of the original image domain.
Step three: extract texture features of the gradient image.
Single-scale ELGOP-C and NLBP-C are computed for each pixel in the gradient image domain, and the features obtained by the two texture descriptors are fused to obtain the feature expression of the gradient domain.
Step four: cross-image domain fusion and dimension reduction.
The texture feature histograms of the different image domains obtained in steps two and three are concatenated. The bin counts of the resulting feature histogram are sorted in descending order, N is calculated with θ = 0.9, and the patterns corresponding to the first N bins are selected, yielding the final reduced-dimension texture feature histogram.
Step five: texture image classification.
The chi-square distance between the texture feature histogram of each test image and those of all training images is calculated, and a nearest-neighbor classifier assigns to the test image the label of the closest training image. The final texture classification accuracy is then computed.
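This classification step corresponds to a 1-nearest-neighbor search under the chi-square histogram distance, sketched below; the epsilon in the denominator is our numerical guard against empty bins.

```python
import numpy as np

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between two feature histograms."""
    return np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def nearest_neighbor_label(test_hist, train_hists, train_labels):
    """Assign the label of the training histogram with the smallest
    chi-square distance (1-NN, as in the classification step)."""
    d = [chi_square(test_hist, h) for h in train_hists]
    return train_labels[int(np.argmin(d))]
```

Accuracy is then simply the fraction of test images whose assigned label matches the ground truth.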
In summary, addressing the insufficient robustness of existing texture classification methods to noise, scale change and illumination change, the invention provides a texture image classification method based on grouping neighborhood intensity difference coding. Richer texture information is extracted by combining the texture features of the original image and of the gradient image, yielding excellent texture classification performance with practical application value in biomedical image analysis, satellite remote sensing, target recognition and face recognition.
The foregoing shows and describes the general principles, essential features, and advantages of the invention. It will be appreciated by persons skilled in the art that the scope of the present invention is not limited to the specific embodiments described. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and it is noted that the technical solutions after the changes or substitutions will fall within the protection scope of the invention.

Claims (2)

1. A method for intra-group neighborhood difference coding based on the local grouping order pattern, the method comprising:
taking the intensity differences between the pixels within each group of neighborhood sampling points as supplementary information and applying a first-order differential coding scheme based on intra-group pixel sign differences, thereby further resisting the influence of external noise on classification accuracy;
The method as claimed in claim 1, wherein the texture feature extraction based on intra-group neighborhood difference information coding of the local grouping order pattern specifically comprises:
preprocessing the input texture image, generating the extended local grouping order pattern and the non-local binary pattern, and completing the concatenation of the texture feature vectors and the generation of a histogram; wherein the texture image preprocessing stage standardizes each image; the local grouping order pattern is calculated first, then the gray-level sign difference of the first two adjacent pixels and that of the last two adjacent pixels in each group, and the first-order difference between the two results is formed; the difference information of all groups in one sampling is summed to form the extended local grouping order pattern, while the non-local binary pattern is calculated in parallel; the texture feature vectors of the 3 different sampling radii in the original image domain are concatenated to obtain the corresponding texture feature histogram.
2. A method for cross-image-domain texture feature extraction, the method comprising:
transforming the original image to the gradient domain using the first derivatives of a Gaussian filter to obtain the gradient response of the input image; by combining the texture features of the original image with those of the gradient domain, richer texture information is extracted, so that the resulting texture descriptor is more robust to Gaussian noise, scale change and illumination change;
The method as claimed in claim 2, wherein the cross-image-domain texture feature extraction scheme specifically comprises:
calculating the first-order Gaussian derivative response of the input texture image to obtain its gradient response; extracting the extended local grouping order pattern and the non-local binary pattern at a single scale to obtain the texture feature expression in the gradient domain; concatenating the texture feature vectors obtained in the original image domain and the gradient domain; and, to reduce the computational overhead of the increased feature dimension, applying Top-N dimension reduction to the generated feature histogram.
CN202111354116.1A 2021-11-15 2021-11-15 Texture feature extraction method based on grouping neighborhood intensity difference coding Pending CN113963164A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111354116.1A CN113963164A (en) 2021-11-15 2021-11-15 Texture feature extraction method based on grouping neighborhood intensity difference coding


Publications (1)

Publication Number Publication Date
CN113963164A 2022-01-21

Family

ID=79470700

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111354116.1A Pending CN113963164A (en) 2021-11-15 2021-11-15 Texture feature extraction method based on grouping neighborhood intensity difference coding

Country Status (1)

Country Link
CN (1) CN113963164A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115272873A (en) * 2022-09-27 2022-11-01 山东大学 Hyperspectral image nonlinear feature preprocessing system and method based on gradient domain
CN115272873B (en) * 2022-09-27 2023-02-24 山东大学 Hyperspectral image nonlinear feature preprocessing system and method based on gradient domain


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination