CN114037847A - Anti-noise local color texture feature extraction method - Google Patents

Anti-noise local color texture feature extraction method

Info

Publication number
CN114037847A
Authority
CN
China
Prior art keywords
color
channel
channels
features
sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111391326.8A
Other languages
Chinese (zh)
Other versions
CN114037847B (en)
Inventor
束鑫
宋志刚
程科
於跃成
严熙
范燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu University of Science and Technology
Original Assignee
Jiangsu University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu University of Science and Technology
Priority to CN202111391326.8A
Publication of CN114037847A
Application granted
Publication of CN114037847B
Legal status: Active
Anticipated expiration

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Abstract

The invention discloses an anti-noise local color texture feature extraction method, which comprises the following specific steps: first, a fourth color vector channel C is generated from the three R-G-B color channels of a color texture image of size M×N, and the four channels are stacked to form a cube of size M×N×4; then, local grouping sequence pattern features are extracted on each of the 4 color channels; next, longitudinal difference binary pattern features are extracted across the 4 color channels; the above features are then extracted at different scales, normalized and concatenated to construct a joint vector H; finally, classification is performed with the chi-square distance and a nearest-neighbor classifier to obtain the classification result. The method is computationally simple, extracts both the color information in the texture image and the correlation information among the color channels, and is robust to noise.

Description

Anti-noise local color texture feature extraction method
Technical Field
The invention relates to the technical field of image processing and pattern recognition, in particular to an anti-noise local color texture feature extraction method.
Background
Texture is a basic perceptible attribute of object surfaces in nature, and texture classification is an important research topic in computer vision that plays a role in many applications. How to effectively extract discriminative texture features is therefore a key issue for image analysis and understanding.
At present, grayscale texture analysis techniques are increasingly mature, and many grayscale texture descriptors have been developed and successfully applied in many fields of image classification. However, these descriptors classify texture only on grayscale images; for color images, the color information, which is an important cue for visual perception, is discarded. How to make full use of color information while extracting the texture features of a color image therefore has important research value and significance. In 2002, Ojala et al. proposed the Local Binary Pattern (LBP), which is widely used in many application fields such as face recognition, face fraud detection, defect detection and medical image detection, owing to its easy implementation, low computational complexity, strong discriminative ability and invariance to monotonic illumination variation. Many LBP extensions and optimizations were subsequently proposed; however, most of these algorithms target only grayscale texture images and cannot effectively handle color texture images. Although feature extraction from color texture images has made great progress, many open problems remain, especially the correlation between different color channels and the sensitivity to noise. Therefore, it is important to design a descriptor that effectively exploits the color information in texture images and is robust to noise.
Disclosure of Invention
The invention aims to provide an anti-noise local color texture feature extraction method, to address the problems that the traditional Local Binary Pattern (LBP) and its extended and optimized variants cannot extract color information, cannot effectively use the correlation information among color channels, and are sensitive to noise.
The technical solution adopted by the invention to solve these problems is as follows:
an anti-noise local color texture feature extraction method comprises the following steps:
Step one: generating a fourth color vector channel C from the three R-G-B color channels of a color texture image of size M×N, and stacking the four channels into a cube of size M×N×4;
Step two: extracting local grouping sequence pattern features on each of the 4 color channels;
Step three: extracting longitudinal difference binary pattern features across the 4 color channels;
Step four: extracting the above features at different scales, normalizing them and concatenating them to construct a joint vector H;
Step five: classifying with the chi-square distance and a nearest-neighbor classifier to obtain the classification result.
In the first step of the present invention, a fourth color vector channel C is generated from the three R-G-B color channels of a color texture image of size M×N, and the four channels are placed in a cube of size M×N×4. That is, a fourth channel C is computed from the M×N color texture RGB image according to the color vector formula, giving 4 single-channel images in total: the R channel (red), the G channel (green), the B channel (blue) and the C channel (color vector). They are stacked in the order R-G-B-C to obtain an M×N×4 cube. The color vector channel C is generated as follows:
(Color vector formula defining channel C; given as an equation image in the original.)
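For illustration only, the following Python sketch builds the M×N×4 cube described above. Because the color vector formula is available only as an image, the C channel here is assumed to be the Euclidean norm of the (R, G, B) vector at each pixel; this choice and the function name build_rgbc_cube are assumptions, not part of the patent.

```python
import numpy as np

def build_rgbc_cube(rgb):
    """Stack R, G, B and an assumed color-vector channel C into an M x N x 4 cube.

    `rgb` is an M x N x 3 array. The C channel below is the Euclidean norm of the
    (R, G, B) vector at each pixel, an assumption made because the patent's color
    vector formula is only given as an equation image.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    c = np.sqrt(r ** 2 + g ** 2 + b ** 2)      # assumed color vector channel
    return np.stack([r, g, b, c], axis=-1)      # order R-G-B-C, shape M x N x 4
```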
in the second step of the present invention, local grouping sequence pattern features are extracted on 4 color channels, that is, local grouping sequence pattern features are extracted on each channel, and the extraction process is as follows: first, a dominant direction D needs to be designed. The dominant direction is defined as the index of the nearest pixel that differs most from the central pixel value, and is expressed as:
$D = \arg\max_{p \in \{0,1,\ldots,P-1\}} \left| g_{r,p} - g_{c} \right|$

where $g_{r,p}$ denotes the value of the $p$-th of the $P$ neighboring pixels at radius $r$ and $g_{c}$ denotes the central pixel value.
the maximum differential response is then used to improve the discrimination of the features and the robustness to noise. After the dominant direction D is obtained, the sequence of neighboring pixel values is circularly rotated until the pixel value with index D is located at the first position in the sequence, denoted as:
$(g_{r,0}, g_{r,1}, \ldots, g_{r,P-1}) := (g_{r,D}, \ldots, g_{r,P-1}, g_{r,0}, \ldots, g_{r,D-1})$
In the formula, the symbol ":=" denotes an element-by-element assignment operation.
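The sketch below illustrates this step for a single pixel: it finds the dominant direction over the neighbor values and rotates the sequence so that the neighbor with index D comes first. The neighbor values are assumed to be supplied as a 1-D array sampled in the usual LBP fashion; that sampling itself is not shown here.

```python
import numpy as np

def dominant_direction(neighbors, center):
    """Index of the neighbor that differs most from the central pixel value."""
    return int(np.argmax(np.abs(neighbors - center)))

def rotate_to_dominant(neighbors, center):
    """Circularly rotate the neighbor sequence so the dominant neighbor comes first."""
    d = dominant_direction(neighbors, center)
    return np.roll(neighbors, -d)

# Example with P = 8 neighbors of one pixel (values are illustrative only)
neighbors = np.array([52, 60, 75, 48, 90, 51, 49, 50], dtype=np.float64)
center = 50.0
rotated = rotate_to_dominant(neighbors, center)   # starts with 90, the largest difference
```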
Then, the rotated neighbor pixel value sequence is uniformly divided into groups. To keep the dimensionality of the texture features low, the number of neighboring pixels in each group is limited to 4, so the sequence is divided into n = P/4 groups:
(Grouping formula that partitions the rotated neighbor sequence into n groups of 4 consecutive pixel values; given as an equation image in the original.)
In the formula, $g'_i$ denotes the pixel values of the $i$-th group of neighboring pixels.
Finally, the order relationship between the neighboring pixels in each group is encoded:
$\mathrm{LCOP}_{r,P,i} = f(\gamma(g'_i))$
where γ(·) is a sorting function that sorts the input elements in non-descending order and returns their relative positions (ranks), and f(·) is a mapping function that maps an input rank sequence to a corresponding code value. Since a group of 4 neighboring pixels has 4! = 24 possible orderings, the neighboring pixels in each group are mapped, as shown in the table below, to an integer value (the LCOP code value) in the range {0, 1, …, 23}.
(Table mapping the 24 rank permutations of a 4-pixel group to the LCOP code values; given as an image in the original.)
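A minimal sketch of the LCOP encoding follows. Since the mapping table is only available as an image, the concrete assignment of the 24 rank permutations to the codes 0 to 23 (lexicographic order here) is an assumption, as are the helper names.

```python
import numpy as np
from itertools import permutations

# Assumed mapping: the rank permutations of a 4-element group, listed in
# lexicographic order, receive the code values 0..23.
_PERM_TO_CODE = {perm: code for code, perm in enumerate(permutations(range(4)))}

def lcop_code(group):
    """Map a group of 4 neighbor pixel values to an LCOP code in {0, ..., 23}."""
    order = np.argsort(group, kind="stable")        # indices that sort the group
    ranks = tuple(int(r) for r in np.argsort(order))  # rank of each element
    return _PERM_TO_CODE[ranks]

def lcop_codes(rotated_neighbors):
    """Encode each consecutive group of 4 rotated neighbors (n = P / 4 groups)."""
    p = len(rotated_neighbors)
    return [lcop_code(rotated_neighbors[i:i + 4]) for i in range(0, p, 4)]
```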
In the third step of the invention, longitudinal difference binary pattern features are extracted across the 4 color channels; that is, the differences between the channels are computed in turn and binarized according to the following formulas:
$V_1 = V_{R-G} = \delta(R - G)$
$V_2 = V_{G-B} = \delta(G - B)$
$V_3 = V_{B-R} = \delta(B - R)$
$V_4 = V_{R-C} = \delta(R - C)$
$V_5 = V_{G-C} = \delta(G - C)$
$V_6 = V_{B-C} = \delta(B - C)$
(Definition of the binarization function δ(·); given as an equation image in the original.)
Also in the third step, the binarized differences between the channels obtained above are encoded in a fixed order to form the longitudinal difference binary pattern, according to the following formula:
(Encoding formula combining the six binary values V1 to V6 into the longitudinal difference binary pattern code; given as an equation image in the original.)
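To illustrate this cross-channel step, the sketch below binarizes the six inter-channel differences per pixel and packs them into one code. Both the step function used for δ(·) and the binary weights are assumptions, since the corresponding formulas are given only as images.

```python
import numpy as np

def vdbp_codes(cube):
    """Longitudinal difference binary pattern over an M x N x 4 cube (R, G, B, C).

    The binarization delta(x) = 1 if x >= 0 else 0 and the weighting 2**i of the
    (i+1)-th binary value are assumptions; the original formulas are only images.
    """
    r, g, b, c = (cube[..., k].astype(np.float64) for k in range(4))
    diffs = [r - g, g - b, b - r, r - c, g - c, b - c]   # V1 .. V6 before binarization
    bits = [(d >= 0).astype(np.uint8) for d in diffs]    # assumed delta(.)
    code = np.zeros(cube.shape[:2], dtype=np.uint8)
    for i, bit in enumerate(bits):
        code += bit << i                                 # assumed binary weights
    return code                                          # values in {0, ..., 63}
```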
In the fourth step of the invention, the above features are extracted at different scales, normalized and concatenated to construct a joint vector H; that is, the local grouping sequence pattern features extracted in the 4 channels and the longitudinal difference binary pattern features are each normalized and then concatenated. The normalization process is as follows:
(Normalization formulas for the five feature histograms; given as equation images in the original.)
then, 5 features are cascaded to obtain a final color image feature histogram:
(Concatenation formula for the joint feature vector H; given as an equation image in the original.)
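A sketch of this step follows, assuming L1 normalization (each histogram scaled to sum to 1) before concatenation; the patent's actual normalization is given only as equation images, so the choice of L1 scaling is an assumption.

```python
import numpy as np

def l1_normalize(hist, eps=1e-12):
    """Assumed normalization: scale a histogram so that its entries sum to 1."""
    hist = np.asarray(hist, dtype=np.float64)
    return hist / (hist.sum() + eps)

def joint_vector(lcop_hists, vdbp_hist):
    """Concatenate the 4 per-channel LCOP histograms and the VDBP histogram into H."""
    parts = [l1_normalize(h) for h in lcop_hists] + [l1_normalize(vdbp_hist)]
    return np.concatenate(parts)
```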
the invention has the beneficial effects that:
(1) a new color channel of a color vector is added, so that texture details are enriched on the basis of the original three channels;
(2) the color correlation information among different channels is encoded, and the color texture information of each channel is also included;
(3) intra-channel and inter-channel features are jointly encoded for each pixel in the color image: local features are extracted from the four channels at once and include the correlation information among the different channels;
(4) the extracted features are robust to noise.
Drawings
FIG. 1 is a general flow chart of a method for noise immune local color texture feature extraction of the present invention;
FIG. 2 illustrates step one of the present invention: generating a fourth color vector channel C from the R-G-B color channels of a color texture image of size M×N and stacking the channels into a cube of size M×N×4;
FIG. 3 illustrates step two of the present invention: extracting local grouping sequence pattern features on the 4 color channels;
FIG. 4 illustrates step three of the present invention: extracting longitudinal difference binary pattern features across the 4 color channels;
FIG. 5 is a block diagram illustrating the construction of a joint vector H in step four according to the present invention.
Detailed Description
In order to make the purpose and technical solution of the present invention clearer, the technical solution of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention without any inventive step, are within the scope of protection of the invention.
As shown in FIG. 1, the anti-noise local color texture feature extraction method of the present invention first generates a fourth color vector channel C from the three R-G-B color channels of a color texture image of size M×N and stacks the four channels into a cube of size M×N×4; local grouping sequence pattern features are then extracted on each of the 4 color channels; next, longitudinal difference binary pattern features are extracted across the 4 color channels; the above features are then extracted at different scales, normalized and concatenated to construct a joint vector H; finally, classification is performed with the chi-square distance and a nearest-neighbor classifier to obtain the classification result. The method specifically comprises the following steps:
Step one: generating a fourth color vector channel C from the three R-G-B color channels of a color texture image of size M×N, and stacking the four channels into a cube of size M×N×4;
Step two: extracting local grouping sequence pattern features on each of the 4 color channels;
Step three: extracting longitudinal difference binary pattern features across the 4 color channels;
Step four: extracting the above features at different scales, normalizing them and concatenating them to construct a joint vector H;
Step five: classifying with the chi-square distance and a nearest-neighbor classifier to obtain the classification result.
As shown in FIG. 2, in the first step of the anti-noise local color texture feature extraction method of the present invention, a fourth color vector channel C is generated from the three R-G-B color channels of a color texture image of size M×N, and the four channels are arranged in a cube of size M×N×4. That is, a fourth channel C is computed from the M×N color texture RGB image according to the color vector formula, giving 4 single-channel images in total: the R channel (red), the G channel (green), the B channel (blue) and the C channel (color vector). They are stacked in the order R-G-B-C to obtain an M×N×4 cube. The color vector channel C is generated as follows:
(Color vector formula defining channel C; given as an equation image in the original.)
As shown in FIG. 3, in the second step of the present invention, local grouping sequence pattern features are extracted on each of the 4 color channels. The extraction process is as follows: first, a dominant direction D is determined. The dominant direction is defined as the index of the neighboring pixel that differs most from the central pixel value, and is expressed as:
$D = \arg\max_{p \in \{0,1,\ldots,P-1\}} \left| g_{r,p} - g_{c} \right|$

where $g_{r,p}$ denotes the value of the $p$-th of the $P$ neighboring pixels at radius $r$ and $g_{c}$ denotes the central pixel value.
the maximum differential response is then used to improve the discrimination of the features and the robustness to noise. After the dominant direction D is obtained, the sequence of neighboring pixel values is circularly rotated until the pixel value with index D is located at the first position in the sequence, denoted as:
$(g_{r,0}, g_{r,1}, \ldots, g_{r,P-1}) := (g_{r,D}, \ldots, g_{r,P-1}, g_{r,0}, \ldots, g_{r,D-1})$
In the formula, the symbol ":=" denotes an element-by-element assignment operation.
Then, the rotated neighbor pixel value sequence is uniformly divided into groups. To keep the dimensionality of the texture features low, the number of neighboring pixels in each group is limited to 4, so the sequence is divided into n = P/4 groups:
(Grouping formula that partitions the rotated neighbor sequence into n groups of 4 consecutive pixel values; given as an equation image in the original.)
In the formula, $g'_i$ denotes the pixel values of the $i$-th group of neighboring pixels.
Finally, the order relationship between the neighboring pixels in each group is encoded:
$\mathrm{LCOP}_{r,P,i} = f(\gamma(g'_i))$
where γ(·) is a sorting function that sorts the input elements in non-descending order and returns their relative positions (ranks), and f(·) is a mapping function that maps an input rank sequence to a corresponding code value. As shown in the table below, the neighboring pixels in each group are thereby mapped to an integer value (the LCOP code value) in the range {0, 1, …, 23};
(Table mapping the 24 rank permutations of a 4-pixel group to the LCOP code values; given as an image in the original.)
As shown in FIG. 4, in the third step of the present invention, longitudinal difference binary pattern features are extracted across the 4 color channels; that is, the differences between the channels are computed in turn and binarized according to the following formulas:
$V_1 = V_{R-G} = \delta(R - G)$
$V_2 = V_{G-B} = \delta(G - B)$
$V_3 = V_{B-R} = \delta(B - R)$
$V_4 = V_{R-C} = \delta(R - C)$
$V_5 = V_{G-C} = \delta(G - C)$
$V_6 = V_{B-C} = \delta(B - C)$
(Definition of the binarization function δ(·); given as an equation image in the original.)
As shown in FIG. 4, also in the third step, the binarized differences between the channels obtained above are encoded in a fixed order to form the longitudinal difference binary pattern, according to the following formula:
(Encoding formula combining the six binary values V1 to V6 into the longitudinal difference binary pattern code; given as an equation image in the original.)
As shown in FIG. 5, in the fourth step of the present invention, the above features are extracted at different scales, normalized and concatenated to construct a joint vector H; that is, the local grouping sequence pattern features extracted in the 4 channels and the longitudinal difference binary pattern features are each normalized and then concatenated. The normalization process is as follows:
(Normalization formulas for the five feature histograms; given as equation images in the original.)
then, 5 features are cascaded to obtain a final color image feature histogram:
(Concatenation formula for the joint feature vector H; given as an equation image in the original.)
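For step five, the sketch below computes the chi-square distance between feature histograms and assigns the label of the nearest training sample. The exact form used here (the 1/2 factor and the small epsilon guarding against division by zero) is an assumption, as the patent text does not spell it out.

```python
import numpy as np

def chi_square_distance(h1, h2, eps=1e-12):
    """Chi-square distance between two (normalized) feature histograms."""
    h1 = np.asarray(h1, dtype=np.float64)
    h2 = np.asarray(h2, dtype=np.float64)
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def nearest_neighbor_classify(test_hist, train_hists, train_labels):
    """Assign the label of the training histogram closest in chi-square distance."""
    dists = [chi_square_distance(test_hist, h) for h in train_hists]
    return train_labels[int(np.argmin(dists))]
```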
To verify the effectiveness and stability of the anti-noise local color texture feature extraction method, a specific embodiment is described by classifying the texture features finally extracted in FIG. 5 on the standard texture library KTH-TIPS:
(1) Single-scale and multi-scale performance analysis: the anti-noise performance of the method is analyzed at single and multiple scales on the KTH-TIPS database. The classification results are shown in Table 1. As can be seen from Table 1, the proposed method achieves satisfactory and stable classification accuracy at both single and multiple scales.
Table 1:
(Table 1: classification results on KTH-TIPS at single and multiple scales; given as an image in the original.)
(2) Comparison of the method with 10 other texture feature extraction methods; the results are shown in Table 2.
The comparison verifies that the method provided by the invention has clear advantages over the 10 other methods: classification accuracy is effectively improved; color correlation information among different channels is effectively exploited while the color texture information of each channel is retained; intra-channel and inter-channel features are efficiently encoded jointly for each pixel in the color image; and anti-noise performance is effectively improved.
Table 2:
(Table 2: comparison of classification results with 10 other texture feature extraction methods; given as an image in the original.)
the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (6)

1. An anti-noise local color texture feature extraction method, characterized by comprising the following steps:
Step one: generating a fourth color vector channel C from the three R-G-B color channels of a color texture image of size M×N, and stacking the four channels into a cube of size M×N×4;
Step two: extracting local grouping sequence pattern features on each of the 4 color channels;
Step three: extracting longitudinal difference binary pattern features across the 4 color channels;
Step four: extracting the above features at different scales, normalizing them and concatenating them to construct a joint vector H;
Step five: classifying with the chi-square distance and a nearest-neighbor classifier to obtain the classification result.
2. The anti-noise local color texture feature extraction method according to claim 1, characterized in that: in the first step, a fourth color vector channel C is generated from the three R-G-B color channels of the color texture image of size M×N and the four channels are arranged in a cube of size M×N×4; that is, a fourth channel C is computed from the M×N color texture RGB image according to the color vector formula; there are 4 single-channel images in total: the R channel (red), the G channel (green), the B channel (blue) and the C channel (color vector); they are placed in the order R-G-B-C to obtain an M×N×4 cube; the color vector channel C is generated as follows:
(Color vector formula defining channel C; given as an equation image in the original.)
3. The anti-noise local color texture feature extraction method according to claim 1, characterized in that: in the second step, local grouping sequence pattern features are extracted on each of the 4 color channels; the extraction process is as follows: first, a dominant direction D is determined; the dominant direction is defined as the index of the neighboring pixel that differs most from the central pixel value, and is expressed as:
$D = \arg\max_{p \in \{0,1,\ldots,P-1\}} \left| g_{r,p} - g_{c} \right|$

where $g_{r,p}$ denotes the value of the $p$-th of the $P$ neighboring pixels at radius $r$ and $g_{c}$ denotes the central pixel value;
then, the maximum differential response is adopted to improve the discriminative power of the features and the robustness to noise; after the dominant direction D is obtained, the sequence of neighboring pixel values is circularly rotated until the pixel value with index D is located at the first position in the sequence, denoted as:
$(g_{r,0}, g_{r,1}, \ldots, g_{r,P-1}) := (g_{r,D}, \ldots, g_{r,P-1}, g_{r,0}, \ldots, g_{r,D-1})$
in the formula, the symbol ":=" denotes an element-by-element assignment operation;
then, the rotated neighbor pixel value sequence is uniformly divided into groups; to keep the dimensionality of the texture features low, the number of neighboring pixels in each group is limited to 4, so the sequence is divided into n = P/4 groups:
(Grouping formula that partitions the rotated neighbor sequence into n groups of 4 consecutive pixel values; given as an equation image in the original.)
In the formula, $g'_i$ denotes the pixel values of the $i$-th group of neighboring pixels;
finally, the order relationship between the neighboring pixels in each group is encoded:
$\mathrm{LCOP}_{r,P,i} = f(\gamma(g'_i))$
where γ(·) is a sorting function that sorts the input elements in non-descending order and returns their relative positions (ranks), and f(·) is a mapping function that maps an input rank sequence to a corresponding code value; as shown in the table below, the neighboring pixels in each group are thereby mapped to an integer value (the LCOP code value) in the range {0, 1, …, 23};
(Table mapping the 24 rank permutations of a 4-pixel group to the LCOP code values; given as an image in the original.)
4. The anti-noise local color texture feature extraction method according to claim 1, characterized in that: in the third step, longitudinal difference binary pattern features are extracted across the 4 color channels; that is, the differences between the channels are computed in turn and binarized according to the following formulas:
$V_1 = V_{R-G} = \delta(R - G)$
$V_2 = V_{G-B} = \delta(G - B)$
$V_3 = V_{B-R} = \delta(B - R)$
$V_4 = V_{R-C} = \delta(R - C)$
$V_5 = V_{G-C} = \delta(G - C)$
$V_6 = V_{B-C} = \delta(B - C)$
(Definition of the binarization function δ(·); given as an equation image in the original.)
5. The anti-noise local color texture feature extraction method according to claim 4, characterized in that: in the third step, the binarized differences between the channels obtained above are encoded in a fixed order to form the longitudinal difference binary pattern, according to the following formula:
(Encoding formula combining the six binary values V1 to V6 into the longitudinal difference binary pattern code; given as an equation image in the original.)
6. The anti-noise local color texture feature extraction method according to claim 3, 4 or 5, characterized in that: in the fourth step, the above features are extracted at different scales, normalized and concatenated to construct a joint vector H; that is, the local grouping sequence pattern features extracted in the 4 channels and the longitudinal difference binary pattern features are each normalized and then concatenated; the normalization process is as follows:
(Normalization formulas for the five feature histograms; given as equation images in the original.)
then, 5 features are cascaded to obtain a final color image feature histogram:
(Concatenation formula for the joint feature vector H; given as an equation image in the original.)
CN202111391326.8A 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method Active CN114037847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111391326.8A CN114037847B (en) 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111391326.8A CN114037847B (en) 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method

Publications (2)

Publication Number Publication Date
CN114037847A (en) 2022-02-11
CN114037847B (en) 2023-04-18

Family

ID=80145135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111391326.8A Active CN114037847B (en) 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method

Country Status (1)

Country Link
CN (1) CN114037847B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109740572A (en) * 2019-01-23 2019-05-10 浙江理工大学 A kind of human face in-vivo detection method based on partial color textural characteristics
CN111696080A (en) * 2020-05-18 2020-09-22 江苏科技大学 Face fraud detection method, system and storage medium based on static texture
CN112508038A (en) * 2020-12-03 2021-03-16 江苏科技大学 Cross-channel local binary pattern color texture classification method

Also Published As

Publication number Publication date
CN114037847B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
Kviatkovsky et al. Color invariants for person reidentification
Mäenpää et al. Classification with color and texture: jointly or separately?
Chatzichristofis et al. Fcth: Fuzzy color and texture histogram-a low level feature for accurate image retrieval
Banerji et al. Novel color LBP descriptors for scene and image texture classification
Ramaiah et al. De-duplication of photograph images using histogram refinement
CN112766291B (en) Matching method for specific target object in scene image
CN111126240B (en) Three-channel feature fusion face recognition method
CN102819582A (en) Quick searching method for mass images
Buza et al. Skin detection based on image color segmentation with histogram and k-means clustering
CN109934272B (en) Image matching method based on full convolution network
CN109325507A (en) A kind of image classification algorithms and system of combination super-pixel significant characteristics and HOG feature
CN109388727A (en) BGP face rapid retrieval method based on clustering
Ahmed et al. Deep image sensing and retrieval using suppression, scale spacing and division, interpolation and spatial color coordinates with bag of words for large and complex datasets
Reta et al. Color uniformity descriptor: An efficient contextual color representation for image indexing and retrieval
CN110287847A (en) Vehicle grading search method based on Alexnet-CLbpSurf multiple features fusion
Jayaswal et al. A hybrid approach for image retrieval using visual descriptors
CN113505856A (en) Hyperspectral image unsupervised self-adaptive classification method
CN114037847B (en) Anti-noise local color texture feature extraction method
CN110674334B (en) Near-repetitive image retrieval method based on consistency region deep learning features
CN109753912B (en) Multispectral palm print matching method based on tensor
CN110490210B (en) Color texture classification method based on t sampling difference between compact channels
CN112508038B (en) Cross-channel local binary pattern color texture classification method
Zhu et al. Color orthogonal local binary patterns combination for image region description
Hiwale et al. Quick interactive image search in huge databases using Content-Based image retrieval
Banerji et al. Scene image classification: Some novel descriptors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant