CN114037847B - Anti-noise local color texture feature extraction method - Google Patents


Info

Publication number
CN114037847B
CN114037847B
Authority
CN
China
Prior art keywords
color
channel
channels
features
multiplied
Prior art date
Legal status
Active
Application number
CN202111391326.8A
Other languages
Chinese (zh)
Other versions
CN114037847A (en)
Inventor
束鑫
宋志刚
程科
於跃成
严熙
范燕
Current Assignee
Jiangsu University of Science and Technology
Original Assignee
Jiangsu University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Jiangsu University of Science and Technology
Priority to CN202111391326.8A
Publication of CN114037847A
Application granted
Publication of CN114037847B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches

Abstract

The invention discloses an anti-noise local color texture feature extraction method comprising the following steps. First, for a color texture image of size M × N, a fourth color-vector channel C is generated from the three R-G-B color channels according to a specific formula, and the four channels are stacked into an M × N × 4 cube. Local grouped order pattern features are then extracted on each of the 4 color channels. Next, longitudinal difference binary pattern features are extracted across the 4 color channels. The above features are extracted at different scales, normalized and concatenated to construct a joint vector H. Finally, classification is performed with the chi-square distance and a nearest-neighbor classifier to obtain the classification result. The method is computationally simple, extracts both the color information in the texture image and the correlation information between color channels, and is robust to noise.

Description

Anti-noise local color texture feature extraction method
Technical Field
The invention relates to the technical field of image processing and pattern recognition, in particular to an anti-noise local color texture feature extraction method.
Background
Texture is a basic perceptual property of object surfaces in nature, and texture classification is an important research topic in computer vision that plays a role in many applications. How to effectively extract discriminative texture features is therefore key to image analysis and understanding.
Grayscale texture analysis techniques have become increasingly mature, and many grayscale texture descriptors have been developed and successfully applied in many areas of image classification. However, these descriptors classify only grayscale textures, so for color images the color information, which is an important cue for visual perception, is discarded. How to make full use of color information while extracting the texture features of a color image therefore has important research value and significance. In 2002, Ojala et al. proposed the local binary pattern (LBP), which has been widely applied in fields such as face recognition, face spoofing detection, defect detection and medical image analysis thanks to its easy implementation, low computational complexity, strong discriminative power and invariance to monotonic illumination changes. Many LBP extensions and optimizations were subsequently proposed, but most of them target only grayscale texture images and cannot handle color texture images effectively. Although great progress has been made in feature extraction for color texture images, many open problems remain, in particular the correlation between different color channels and the sensitivity to noise. Designing a descriptor that makes efficient use of the color information in texture images while remaining robust to noise is therefore of great importance.
Disclosure of Invention
The invention aims to solve the problems that the traditional local binary pattern (LBP) and its extended and optimized variants cannot extract color information, cannot effectively exploit the correlation information between color channels, and are sensitive to noise.
The technical solution adopted by the invention to solve these problems is as follows:
An anti-noise local color texture feature extraction method comprises the following steps:
Step one: for a color texture image of size M × N, generate a fourth color-vector channel C from the three R-G-B color channels according to a specific formula, and stack the four channels into an M × N × 4 cube;
Step two: extract local grouped order pattern features on each of the 4 color channels;
Step three: extract longitudinal difference binary pattern features across the 4 color channels;
Step four: extract the above features at different scales, normalize them and concatenate them to construct a joint vector H;
Step five: classify with the chi-square distance and a nearest-neighbor classifier to obtain the classification result.
In step one of the invention, the fourth color-vector channel C is generated from the three R-G-B color channels of the color texture image of size M × N, and the four channels are arranged into an M × N × 4 cube. That is, a fourth channel is computed from the M × N color texture RGB image according to a color vector formula, giving 4 single-channel images in total: the R channel (red), the G channel (green), the B channel (blue) and the C channel (color vector). They are placed in the order R-G-B-C to obtain an M × N × 4 cube. The color vector channel C is generated as follows:
[The color-vector formula defining channel C from R, G and B is given only as an image in the original publication and is not reproduced here.]
In step two of the invention, local grouped order pattern (LCOP) features are extracted on the 4 color channels, i.e. LCOP features are extracted on each channel, and the extraction proceeds as follows. First, a dominant direction D is determined. The dominant direction is defined as the index of the neighboring pixel whose value differs most from the central pixel value, expressed as:
D = arg max_{p ∈ {0, 1, …, P−1}} |g_{r,p} − g_c|
where g_c is the central pixel value and g_{r,p} is the value of the p-th of the P neighboring pixels sampled at radius r.
The maximum difference response is thus used to improve the discriminability of the features and their robustness to noise. After the dominant direction D is obtained, the sequence of neighboring pixel values is circularly rotated until the pixel value with index D is at the first position of the sequence, denoted as:
(g′_{r,0}, g′_{r,1}, …, g′_{r,P−1}) := (g_{r,D}, …, g_{r,P−1}, g_{r,0}, …, g_{r,D−1})
where the symbol ":=" denotes the element-wise assignment operation.
Then the rotated sequence of neighboring pixel values is uniformly divided into several groups. To keep the texture features low-dimensional, the number of neighboring pixels in each group is limited to 4, so the sequence is divided into n = P/4 groups:
g′_i = (g′_{r,4(i−1)}, g′_{r,4(i−1)+1}, g′_{r,4(i−1)+2}, g′_{r,4(i−1)+3}),  i = 1, 2, …, n
where g′_i denotes the pixel values of the i-th group of neighboring pixels.
Finally, the order relation between the neighboring pixels in each group is encoded:
LCOP_{r,P,i} = f(γ(g′_i))
where γ(·) is a sorting function that sorts the input elements in non-descending order and returns their relative positions (ranks), and f(·) is a mapping function that maps an input rank sequence to the corresponding code value. As shown in the table below, the neighboring pixels in each group are thereby mapped to an integer (the LCOP code value) in the range {0, 1, …, 23}.
γ(g′_i)    f(γ(g′_i))    γ(g′_i)    f(γ(g′_i))
1,2,3,4    0             3,1,2,4    12
1,2,4,3    1             3,1,4,2    13
1,3,2,4    2             3,2,1,4    14
1,3,4,2    3             3,2,4,1    15
1,4,2,3    4             3,4,1,2    16
1,4,3,2    5             3,4,2,1    17
2,1,3,4    6             4,1,2,3    18
2,1,4,3    7             4,1,3,2    19
2,3,1,4    8             4,2,1,3    20
2,3,4,1    9             4,2,3,1    21
2,4,1,3    10            4,3,1,2    22
2,4,3,1    11            4,3,2,1    23
In step three of the invention, longitudinal difference binary pattern features are extracted across the 4 color channels, i.e. the differences between the channels are computed in turn over the 4 channels and binarized according to the following formulas:
V_1 = V_{R−G} = δ(R − G)
V_2 = V_{G−B} = δ(G − B)
V_3 = V_{B−R} = δ(B − R)
V_4 = V_{R−C} = δ(R − C)
V_5 = V_{G−C} = δ(G − C)
V_6 = V_{B−C} = δ(B − C)
where δ(·) is the binarization function, δ(x) = 1 for x ≥ 0 and δ(x) = 0 otherwise.
Also in step three, the binarized differences between the channels are then encoded in a fixed order, according to the following formula:
[The encoding formula, which combines the six binary values V_1–V_6 into a single pattern code, is given only as an image in the original publication.]
In step four of the invention, the above features are extracted at different scales, normalized and concatenated to construct a joint vector H; that is, the LCOP features extracted on the 4 channels and the longitudinal difference binary pattern feature are normalized and then concatenated. The normalization process is as follows:
Ĥ_R = H_R / Σ_j H_R(j)
Ĥ_G = H_G / Σ_j H_G(j)
Ĥ_B = H_B / Σ_j H_B(j)
Ĥ_C = H_C / Σ_j H_C(j)
Ĥ_V = H_V / Σ_j H_V(j)
where H_R, H_G, H_B and H_C denote the LCOP histograms of the R, G, B and C channels and H_V denotes the longitudinal difference binary pattern histogram.
Then the 5 normalized features are concatenated to obtain the final color image feature histogram:
H = [Ĥ_R, Ĥ_G, Ĥ_B, Ĥ_C, Ĥ_V]
The beneficial effects of the invention are:
(1) A new color-vector channel is added, enriching the texture details on the basis of the original three channels;
(2) Not only the color correlation information between different channels is encoded, but the color texture information of each individual channel is also included;
(3) The intra-channel and inter-channel features of each pixel in the color image are jointly encoded: local features are extracted from the four channels at once and contain the correlation information between different channels;
(4) The features are robust to noise.
Drawings
FIG. 1 is the overall flowchart of the anti-noise local color texture feature extraction method of the present invention;
FIG. 2 illustrates step one: generating the fourth color-vector channel C from a color texture image of size M × N according to the R-G-B color channels and stacking the channels into an M × N × 4 cube;
FIG. 3 illustrates step two: extracting local grouped order pattern features on the 4 color channels;
FIG. 4 illustrates step three: extracting longitudinal difference binary pattern features across the 4 color channels;
FIG. 5 illustrates step four: constructing the joint vector H.
Detailed Description
In order to make the purpose and technical solution of the present invention clearer, the technical solution of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. It should be apparent that the described embodiments are only some of the embodiments of the present invention, and not all of them. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention without inventive step, are within the scope of protection of the invention.
As shown in FIG. 1, the anti-noise local color texture feature extraction method of the present invention first generates, for a color texture image of size M × N, a fourth color-vector channel C from the three R-G-B color channels according to a specific formula and stacks the four channels into an M × N × 4 cube; then extracts local grouped order pattern features on each of the 4 color channels; next extracts longitudinal difference binary pattern features across the 4 color channels; then extracts the above features at different scales, normalizes them and concatenates them to construct a joint vector H; and finally classifies with the chi-square distance and a nearest-neighbor classifier to obtain the classification result. The method specifically comprises the following steps:
Step one: for a color texture image of size M × N, generate a fourth color-vector channel C from the three R-G-B color channels according to a specific formula, and stack the four channels into an M × N × 4 cube;
Step two: extract local grouped order pattern features on each of the 4 color channels;
Step three: extract longitudinal difference binary pattern features across the 4 color channels;
Step four: extract the above features at different scales, normalize them and concatenate them to construct a joint vector H;
Step five: classify with the chi-square distance and a nearest-neighbor classifier to obtain the classification result.
As shown in FIG. 2, in step one of the anti-noise local color texture feature extraction method the fourth color-vector channel C is generated from the three R-G-B color channels of the color texture image of size M × N, and the four channels are arranged into an M × N × 4 cube. That is, a fourth channel is computed from the M × N color texture RGB image according to a color vector formula, giving 4 single-channel images in total: the R channel (red), the G channel (green), the B channel (blue) and the C channel (color vector). They are placed in the order R-G-B-C to obtain an M × N × 4 cube. The color vector channel C is generated as follows:
[The color-vector formula defining channel C from R, G and B is given only as an image in the original publication and is not reproduced here.]
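A minimal sketch of the step-one cube construction is given below. Since the explicit color-vector formula for C appears only as an image in the original publication, the normalized Euclidean magnitude of the RGB vector is used here purely as a stand-in assumption; only the R-G-B-C stacking into an M × N × 4 cube follows directly from the description above.

```python
import numpy as np

def build_color_cube(rgb):
    """Stack R, G, B and a color-vector channel C into an M x N x 4 cube.

    rgb: float array of shape (M, N, 3) holding the R, G, B channels.
    NOTE: the patent defines C by a specific color-vector formula shown only
    as an image; the normalized RGB magnitude below is an assumed stand-in.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    c = np.sqrt(r ** 2 + g ** 2 + b ** 2) / np.sqrt(3.0)  # assumed color-vector channel
    return np.stack([r, g, b, c], axis=-1)                # M x N x 4, order R-G-B-C
```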
As shown in FIG. 3, in step two of the invention local grouped order pattern (LCOP) features are extracted on the 4 color channels, i.e. LCOP features are extracted on each channel, and the extraction proceeds as follows. First, a dominant direction D is determined. The dominant direction is defined as the index of the neighboring pixel whose value differs most from the central pixel value, expressed as:
D = arg max_{p ∈ {0, 1, …, P−1}} |g_{r,p} − g_c|
The maximum difference response is thus used to improve the discriminability of the features and their robustness to noise. After the dominant direction D is obtained, the sequence of neighboring pixel values is circularly rotated until the pixel value with index D is at the first position of the sequence, denoted as:
(g′_{r,0}, g′_{r,1}, …, g′_{r,P−1}) := (g_{r,D}, …, g_{r,P−1}, g_{r,0}, …, g_{r,D−1})
where the symbol ":=" denotes the element-wise assignment operation.
Then the rotated sequence of neighboring pixel values is uniformly divided into several groups, and to keep the texture features low-dimensional the number of neighboring pixels in each group is limited to 4, so the sequence is divided into n = P/4 groups:
g′_i = (g′_{r,4(i−1)}, g′_{r,4(i−1)+1}, g′_{r,4(i−1)+2}, g′_{r,4(i−1)+3}),  i = 1, 2, …, n
where g′_i denotes the pixel values of the i-th group of neighboring pixels.
Finally, the order relationship between the neighboring pixels in each group is encoded:
LCOP_{r,P,i} = f(γ(g′_i))
where γ(·) is a sorting function that sorts the input elements in non-descending order and returns their relative positions (ranks), and f(·) is a mapping function that maps an input rank sequence to the corresponding code value. As shown in the table below, the neighboring pixels in each group are thereby mapped to an integer (the LCOP code value) in the range {0, 1, …, 23}:
γ(g′_i)    f(γ(g′_i))    γ(g′_i)    f(γ(g′_i))
1,2,3,4    0             3,1,2,4    12
1,2,4,3    1             3,1,4,2    13
1,3,2,4    2             3,2,1,4    14
1,3,4,2    3             3,2,4,1    15
1,4,2,3    4             3,4,1,2    16
1,4,3,2    5             3,4,2,1    17
2,1,3,4    6             4,1,2,3    18
2,1,4,3    7             4,1,3,2    19
2,3,1,4    8             4,2,1,3    20
2,3,4,1    9             4,2,3,1    21
2,4,1,3    10            4,3,1,2    22
2,4,3,1    11            4,3,2,1    23
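The per-channel LCOP extraction described above can be sketched as follows. The neighborhood sampling (the P = 8 pixels of the 3 × 3 ring, i.e. r = 1), the handling of ties in the ranking, and the histogram layout (24 bins per group, concatenated) are assumptions not fixed by the text; the rank-to-code mapping, however, is exactly the lexicographic ordering shown in the table above. As a worked example, a group with values (37, 52, 49, 41) has ranks γ = (1, 4, 3, 2) and is therefore mapped to code 5.

```python
import numpy as np
from itertools import permutations

# Lexicographic rank of a 4-element rank tuple -> LCOP code in {0, ..., 23}
# (matches the table above, e.g. (1, 2, 3, 4) -> 0 and (4, 3, 2, 1) -> 23).
PERM_CODE = {p: k for k, p in enumerate(permutations((1, 2, 3, 4)))}

def lcop_histogram(channel, P=8):
    """LCOP histogram of one channel from its P nearest neighbors (r = 1 assumed)."""
    n_groups = P // 4
    hist = np.zeros(24 * n_groups)
    # Offsets of the 8 neighbors on the 3x3 ring, in circular order (assumed sampling).
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    M, N = channel.shape
    for y in range(1, M - 1):
        for x in range(1, N - 1):
            gc = channel[y, x]
            g = np.array([channel[y + dy, x + dx] for dy, dx in offs], dtype=float)
            D = int(np.argmax(np.abs(g - gc)))   # dominant direction
            g = np.roll(g, -D)                   # rotate so index D comes first
            for i in range(n_groups):            # groups of 4 consecutive neighbors
                grp = g[4 * i:4 * i + 4]
                ranks = tuple(int(v) for v in np.argsort(np.argsort(grp, kind="stable")) + 1)
                hist[24 * i + PERM_CODE[ranks]] += 1
    return hist
```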
As shown in FIG. 4, in step three of the invention longitudinal difference binary pattern features are extracted across the 4 color channels, i.e. the differences between the channels are computed in turn over the 4 channels and binarized according to the following formulas:
V_1 = V_{R−G} = δ(R − G)
V_2 = V_{G−B} = δ(G − B)
V_3 = V_{B−R} = δ(B − R)
V_4 = V_{R−C} = δ(R − C)
V_5 = V_{G−C} = δ(G − C)
V_6 = V_{B−C} = δ(B − C)
where δ(·) is the binarization function, δ(x) = 1 for x ≥ 0 and δ(x) = 0 otherwise.
As shown in FIG. 4, also in step three, the binarized differences between the channels are then encoded in a fixed order, according to the following formula:
[The encoding formula, which combines the six binary values V_1–V_6 into a single pattern code, is given only as an image in the original publication.]
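The cross-channel computation of step three can be sketched as follows. Because the definition of δ and the encoding formula appear only as images in the original, the zero-threshold step function and the bit weights 2^(k−1) used to pack V_1–V_6 into a 6-bit code (64 possible patterns) are assumptions.

```python
import numpy as np

def ldbp_histogram(cube):
    """Longitudinal (cross-channel) difference binary pattern histogram.

    cube: M x N x 4 array with channels in R-G-B-C order.
    The step function delta() and the bit ordering below are assumed conventions.
    """
    R, G, B, C = (cube[..., k].astype(float) for k in range(4))
    delta = lambda x: (x >= 0).astype(int)              # assumed binarization delta(x)
    V = [delta(R - G), delta(G - B), delta(B - R),      # V1, V2, V3
         delta(R - C), delta(G - C), delta(B - C)]      # V4, V5, V6
    code = sum(v << k for k, v in enumerate(V))         # pack into a 6-bit code, 0..63
    return np.bincount(code.ravel(), minlength=64).astype(float)
```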
As shown in FIG. 5, in step four of the invention the above features are extracted at different scales, normalized and concatenated to construct a joint vector H; that is, the LCOP features extracted on the 4 channels and the longitudinal difference binary pattern feature are normalized and then concatenated. The normalization process is as follows:
Ĥ_R = H_R / Σ_j H_R(j)
Ĥ_G = H_G / Σ_j H_G(j)
Ĥ_B = H_B / Σ_j H_B(j)
Ĥ_C = H_C / Σ_j H_C(j)
Ĥ_V = H_V / Σ_j H_V(j)
Then the 5 normalized features are concatenated to obtain the final color image feature histogram:
H = [Ĥ_R, Ĥ_G, Ĥ_B, Ĥ_C, Ĥ_V]
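The construction of the joint vector H in step four might then look like the sketch below, reusing the build_color_cube, lcop_histogram and ldbp_histogram sketches above; L1 normalization of each histogram and the restriction to a single scale are assumptions (the original also combines features from several scales).

```python
import numpy as np

def joint_feature(rgb):
    """Joint vector H: normalized LCOP histograms of R, G, B, C plus the LDBP histogram."""
    def l1(h):
        return h / max(h.sum(), 1e-12)                            # assumed L1 normalization
    cube = build_color_cube(rgb)
    feats = [l1(lcop_histogram(cube[..., k])) for k in range(4)]  # R, G, B, C channels
    feats.append(l1(ldbp_histogram(cube)))                        # cross-channel feature
    return np.concatenate(feats)                                  # joint vector H
```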
To verify the effectiveness and stability of the anti-noise local color texture feature extraction method, its implementation is illustrated by classifying the texture features finally extracted in FIG. 5 on the standard texture library KTH-TIPS:
(1) Single-scale and multi-scale performance analysis: the anti-noise performance of the method is analyzed at a single scale and at multiple scales on the KTH-TIPS database. The classification results are shown in Table 1. As can be seen from Table 1, the proposed method achieves satisfactory and stable classification accuracy at both single and multiple scales.
Table 1:
[Table 1 (classification accuracies of the method at single and multiple scales on KTH-TIPS) is given only as an image in the original publication.]
(2) The method is compared with 10 other texture feature extraction methods; the results are shown in Table 2.
The comparison verifies that the proposed method has clear advantages over the other 10 methods: it effectively improves classification accuracy; it effectively exploits the color correlation information between different channels while also including the color texture information of each channel; it efficiently and jointly encodes the intra-channel and inter-channel features of each pixel in the color image; and it effectively improves robustness to noise.
Table 2:
[Table 2 (comparison of classification results with 10 other texture feature extraction methods) is given only as an image in the original publication.]
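For step five, classification with the chi-square distance and a nearest-neighbor classifier, as used in the experiments above, a minimal sketch is given below; the exact chi-square variant and the tie-breaking rule are assumptions.

```python
import numpy as np

def chi_square(h1, h2, eps=1e-12):
    """Chi-square distance between two histograms (one common variant)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def nearest_neighbor_classify(test_feats, train_feats, train_labels):
    """Assign each test sample the label of its chi-square nearest training sample."""
    preds = []
    for h in test_feats:
        dists = [chi_square(h, t) for t in train_feats]
        preds.append(train_labels[int(np.argmin(dists))])
    return preds
```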
although the present invention has been described in detail with reference to the above embodiments, it should be understood by those skilled in the art that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (5)

1. An anti-noise local color texture feature extraction method, characterized by comprising the following steps:
step one: for a color texture image of size M × N, generate a fourth color-vector channel C from the three R-G-B color channels according to a specific formula, and stack the four channels into an M × N × 4 cube;
step two: extract local grouped order pattern features on each of the 4 color channels;
step three: extract longitudinal difference binary pattern features across the 4 color channels;
step four: extract the above features at different scales, normalize them and concatenate them to construct a joint vector H;
step five: classify with the chi-square distance and a nearest-neighbor classifier to obtain the classification result;
in step two, local grouped order pattern (LCOP) features are extracted on each of the 4 color channels as follows: first, a dominant direction D is determined; the dominant direction is defined as the index of the neighboring pixel whose value differs most from the central pixel value, expressed as:
D = arg max_{p ∈ {0, 1, …, P−1}} |g_{r,p} − g_c|
then, the maximum difference response is adopted to improve the discriminability of the features and the robustness to noise; after the dominant direction D is obtained, the sequence of neighboring pixel values is circularly rotated until the pixel value with index D is at the first position of the sequence, denoted as:
(g′_{r,0}, g′_{r,1}, …, g′_{r,P−1}) := (g_{r,D}, …, g_{r,P−1}, g_{r,0}, …, g_{r,D−1})
where the symbol ":=" denotes the element-wise assignment operation;
then, the rotated sequence of neighboring pixel values is uniformly divided into groups, with the number of neighboring pixels in each group limited to 4, so that the sequence is divided into n = P/4 groups:
g′_i = (g′_{r,4(i−1)}, g′_{r,4(i−1)+1}, g′_{r,4(i−1)+2}, g′_{r,4(i−1)+3}),  i = 1, 2, …, n
where g′_i denotes the pixel values of the i-th group of neighboring pixels;
finally, the order relationship between the neighboring pixels in each group is encoded:
LCOP_{r,P,i} = f(γ(g′_i))
where γ(·) is a sorting function that sorts the input elements in non-descending order and returns their relative positions, and f(·) is a mapping function that maps an input rank sequence to the corresponding code value; as shown in the following table, the neighboring pixels in each group are mapped to an integer, i.e. the LCOP code value, in the range {0, 1, …, 23}:
γ(g′_i)    f(γ(g′_i))    γ(g′_i)    f(γ(g′_i))
1,2,3,4    0             3,1,2,4    12
1,2,4,3    1             3,1,4,2    13
1,3,2,4    2             3,2,1,4    14
1,3,4,2    3             3,2,4,1    15
1,4,2,3    4             3,4,1,2    16
1,4,3,2    5             3,4,2,1    17
2,1,3,4    6             4,1,2,3    18
2,1,4,3    7             4,1,3,2    19
2,3,1,4    8             4,2,1,3    20
2,3,4,1    9             4,2,3,1    21
2,4,1,3    10            4,3,1,2    22
2,4,3,1    11            4,3,2,1    23
2. The anti-noise local color texture feature extraction method according to claim 1, characterized in that: in step one, the fourth color-vector channel C is generated from the three R-G-B color channels of the color texture image of size M × N and the four channels are arranged into an M × N × 4 cube; that is, a fourth channel C is computed from the M × N color texture RGB image according to a color vector formula, giving 4 single-channel images in total, namely the R channel, the G channel, the B channel and the C channel; they are placed in the order R-G-B-C to obtain an M × N × 4 cube; the color vector channel C is generated as follows:
[The color-vector formula defining channel C from R, G and B is given only as an image in the original publication and is not reproduced here.]
3. The anti-noise local color texture feature extraction method according to claim 1, characterized in that: in step three, longitudinal difference binary pattern features are extracted across the 4 color channels, i.e. the differences between the channels are computed in turn over the 4 channels and binarized according to the following formulas:
V_1 = V_{R−G} = δ(R − G)
V_2 = V_{G−B} = δ(G − B)
V_3 = V_{B−R} = δ(B − R)
V_4 = V_{R−C} = δ(R − C)
V_5 = V_{G−C} = δ(G − C)
V_6 = V_{B−C} = δ(B − C)
where δ(·) is the binarization function, δ(x) = 1 for x ≥ 0 and δ(x) = 0 otherwise;
4. The anti-noise local color texture feature extraction method according to claim 3, characterized in that: in step three, the binarized differences between the channels are encoded in a fixed order, according to the following formula:
[The encoding formula, which combines the six binary values V_1–V_6 into a single pattern code, is given only as an image in the original publication.]
5. The anti-noise local color texture feature extraction method according to claim 3 or 4, characterized in that: in step four, the above features are extracted at different scales, normalized and concatenated to construct a joint vector H; that is, the LCOP features extracted on the 4 channels and the longitudinal difference binary pattern feature are normalized and then concatenated, the normalization process being as follows:
Ĥ_R = H_R / Σ_j H_R(j)
Ĥ_G = H_G / Σ_j H_G(j)
Ĥ_B = H_B / Σ_j H_B(j)
Ĥ_C = H_C / Σ_j H_C(j)
Ĥ_V = H_V / Σ_j H_V(j)
then the 5 normalized features are concatenated to obtain the final color image feature histogram:
H = [Ĥ_R, Ĥ_G, Ĥ_B, Ĥ_C, Ĥ_V]
CN202111391326.8A 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method Active CN114037847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111391326.8A CN114037847B (en) 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111391326.8A CN114037847B (en) 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method

Publications (2)

Publication Number Publication Date
CN114037847A CN114037847A (en) 2022-02-11
CN114037847B true CN114037847B (en) 2023-04-18

Family

ID=80145135

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111391326.8A Active CN114037847B (en) 2021-11-23 2021-11-23 Anti-noise local color texture feature extraction method

Country Status (1)

Country Link
CN (1) CN114037847B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109740572B (en) * 2019-01-23 2020-09-29 浙江理工大学 Human face living body detection method based on local color texture features
CN111696080B (en) * 2020-05-18 2022-12-30 江苏科技大学 Face fraud detection method, system and storage medium based on static texture
CN112508038B (en) * 2020-12-03 2022-11-08 江苏科技大学 Cross-channel local binary pattern color texture classification method

Also Published As

Publication number Publication date
CN114037847A (en) 2022-02-11

Similar Documents

Publication Publication Date Title
Kviatkovsky et al. Color invariants for person reidentification
Mäenpää et al. Classification with color and texture: jointly or separately?
Zhu et al. Image region description using orthogonal combination of local binary patterns enhanced with color information
Chatzichristofis et al. Fcth: Fuzzy color and texture histogram-a low level feature for accurate image retrieval
Yi et al. Text extraction from scene images by character appearance and structure modeling
Saha et al. Cbir using perception based texture and colour measures
Buza et al. Skin detection based on image color segmentation with histogram and k-means clustering
Mohamed et al. An improved LBP algorithm for avatar face recognition
Ahmed et al. Deep image sensing and retrieval using suppression, scale spacing and division, interpolation and spatial color coordinates with bag of words for large and complex datasets
CN110287847A (en) Vehicle grading search method based on Alexnet-CLbpSurf multiple features fusion
Reta et al. Color uniformity descriptor: An efficient contextual color representation for image indexing and retrieval
Jayaswal et al. A hybrid approach for image retrieval using visual descriptors
Sadique et al. Content-based image retrieval using color layout descriptor, gray-level co-occurrence matrix and k-nearest neighbors
CN109271997B (en) Image texture classification method based on skip subdivision local mode
CN114037847B (en) Anti-noise local color texture feature extraction method
CN109753912B (en) Multispectral palm print matching method based on tensor
CN109544614B (en) Method for identifying matched image pair based on image low-frequency information similarity
Das et al. Enhancing face matching in a suitable binary environment
CN110674334A (en) Near-repetitive image retrieval method based on consistency region deep learning features
CN112508038B (en) Cross-channel local binary pattern color texture classification method
Zhu et al. Color orthogonal local binary patterns combination for image region description
CN115830637A (en) Method for re-identifying shielded pedestrian based on attitude estimation and background suppression
Banerji et al. Scene image classification: Some novel descriptors
Parnak et al. A Novel Image Splicing Detection Algorithm Based on Generalized and Traditional Benford’s Law
Tang et al. Scene text detection via edge cue and multi-features

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant