CN109657083B - Method and device for establishing textile picture feature library - Google Patents

Method and device for establishing textile picture feature library Download PDF

Info

Publication number
CN109657083B
CN109657083B CN201811610700.7A CN201811610700A CN109657083B CN 109657083 B CN109657083 B CN 109657083B CN 201811610700 A CN201811610700 A CN 201811610700A CN 109657083 B CN109657083 B CN 109657083B
Authority
CN
China
Prior art keywords
value
sample image
color
feature vector
textile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811610700.7A
Other languages
Chinese (zh)
Other versions
CN109657083A (en
Inventor
吴家鹏
冯立庚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD
Original Assignee
GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD filed Critical GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD
Priority to CN201811610700.7A priority Critical patent/CN109657083B/en
Publication of CN109657083A publication Critical patent/CN109657083A/en
Application granted granted Critical
Publication of CN109657083B publication Critical patent/CN109657083B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The application relates to a method and a device for establishing a textile picture feature library, computer equipment and a storage medium, wherein the method comprises the following steps: obtaining sample images with preset classification numbers; acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image; calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value; clustering and dividing the sample images into groups of cluster centroid feature vectors of a first preset number according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image; and generating a textile picture feature library according to the cluster centroid feature vectors and the sample images in the groups of the cluster centroid feature vectors. The textile picture feature library established by the method can ensure matching accuracy and reduce calculation amount when the textile pictures are matched, and matching efficiency is improved.

Description

Method and device for establishing textile picture feature library
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for establishing a textile image feature library, a computer device, and a storage medium.
Background
The automatic identification technology of the textiles has wide application in the fields of textile purchasing, textile storage and transportation, textile electronic commerce and the like. Automatic identification of textiles belongs to the application of pattern recognition field, and depends on picture processing and analysis. Taking a large-scale textile wholesale market as an example, numerous merchants and various commodities exist in the wholesale market, customers and merchants mainly rely on manual searching, inquiring and identifying during purchasing, the manual searching mode is low in efficiency, and if the automatic searching from a sample to a commodity can be realized by using a picture matching technology, the purchasing efficiency can be greatly improved.
Conventional textile image matching methods include: global color matching, color matching with spatial information, shape feature matching, texture matching and the like, wherein the global color matching adopts a color histogram of the whole picture as a feature for matching, and the detection precision is low. Compared with global color matching, shape feature matching and texture matching with spatial information adopt more detailed image features, and can analyze and match textile images more accurately, but the defects of large operation amount and low matching efficiency exist.
Disclosure of Invention
In view of the above, it is necessary to provide a method, an apparatus, a computer device and a storage medium for establishing a textile picture feature library, which can accurately and efficiently match a textile picture.
A method for establishing a textile picture feature library comprises the following steps:
obtaining sample images with preset classification numbers; acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image; calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value; clustering and dividing the sample images into groups of cluster centroid feature vectors of a first preset number according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image; and generating a textile picture feature library according to the cluster centroid feature vectors and the sample images in the groups of the cluster centroid feature vectors.
An apparatus for establishing a textile picture feature library, the apparatus comprising:
the sample image acquisition module is used for acquiring sample images with preset classification numbers;
the system comprises a sample pixel value acquisition module, a pixel value acquisition module and a pixel value acquisition module, wherein the sample pixel value acquisition module is used for acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image;
the sample characteristic vector calculation module is used for calculating a color global characteristic vector, a color local characteristic vector and a gray texture local characteristic vector of the sample image according to the gray value, the HSV color space value and the coordinate value;
the sample image clustering and dividing module is used for clustering and dividing the sample images into groups of cluster centroid feature vectors with a first preset number according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images;
and the textile picture feature library generating module is used for generating a textile picture feature library according to the cluster-like centroid feature vectors and the sample images in the groups of the cluster-like centroid feature vectors.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
obtaining sample images with preset classification numbers; acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image; calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value; clustering and dividing the sample images into groups of cluster centroid feature vectors of a first preset number according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image; and generating a textile picture feature library according to the cluster centroid feature vectors and the sample images in the groups of the cluster centroid feature vectors.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
obtaining sample images with preset classification numbers; acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image; calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value; clustering and dividing the sample images into groups of cluster centroid feature vectors of a first preset number according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image; and generating a textile picture feature library according to the cluster centroid feature vectors and the sample images in the groups of the cluster centroid feature vectors.
According to the building method and device of the textile picture feature library, the computer equipment and the storage medium, the color global feature vector, the color local feature vector and the gray texture local feature vector of the sample image are collected respectively, and the similar textile pictures can be identified to the greatest extent through feature extraction combining the global features and the local features, so that the success rate of textile picture matching is improved. The dimensions of the global features and the local features are reduced, reducing the amount of computation. Similar sample images are clustered and grouped through clustering and grouping operations of the sample images in the textile image feature library, so that the number of samples participating in matching can be reduced when the textile images are matched, the operation amount is reduced on the premise of ensuring the matching accuracy, and the matching efficiency of the textile images is improved.
Drawings
FIG. 1 is a diagram of an application environment of a method for creating a textile picture feature library according to an embodiment;
FIG. 2 is a schematic flow chart illustrating a method for creating a textile picture feature library according to an embodiment;
FIG. 3 is a schematic diagram of an embodiment of an image unit for segmenting a textile sample;
FIG. 4 is a schematic flow chart of the sample image cluster partitioning step in one embodiment;
FIG. 5 is a schematic representation of a library of textile picture features in one embodiment;
FIG. 6 is a flowchart illustrating a method for creating a textile picture feature library according to an embodiment;
FIG. 7 is a flowchart illustrating a method for creating a textile image feature library according to an embodiment;
fig. 8 is a block diagram of a device for creating a textile picture feature library according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The method for establishing the textile picture feature library can be applied to the application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The server 104 executes the method for establishing the textile picture feature library according to any embodiment of the present application, establishes the textile picture feature library, and stores the textile picture feature library in the server 104. The terminal 102 sends a matching request to the server 104, the matching request carries information of the textile picture to be matched, the server 104 obtains the textile picture, matches the obtained textile picture with a sample image in a textile picture feature library established in the server 104, obtains a matching result of the textile picture, and returns the matching result to the terminal 102. The terminal 102 may be, but not limited to, various personal computers, notebook computers, smart phones, tablet computers, and portable wearable devices, and the server 104 may be implemented by an independent server or a server cluster formed by a plurality of servers.
In one embodiment, as shown in fig. 2, a method for establishing a textile picture feature library is provided, which is described by taking the method as an example for being applied to the terminal in fig. 1, and includes the following steps:
s202, obtaining sample images with preset classification numbers;
wherein the sample image is a textile sample image representing a textile type; in practical application, a sample image to be included in a textile picture feature library may be determined according to different textile models, different merchants, and the like, for example, for a textile of one model of a certain merchant, one or more sample images may be set, an initial textile sample image obtained by sampling the textile of the merchant and the model may be obtained, and then the initial textile sample image is subjected to certain preprocessing to obtain the sample image.
The initial textile sample image can be acquired in a manual shooting mode, and then the acquired initial textile sample image is recorded in a server for recording. In the step, the server preprocesses the collected initial textile sample image to obtain a sample image with a preset classification number.
S204, acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the sample image;
the sample image may be an image with a preset resolution, the image is composed of a pixel matrix with the preset resolution, each pixel in the pixel matrix includes a plurality of channel values of a certain color space, and the color space may be, for example, an RGB color space (including three channel values of red R, green G, and blue B), a YUV color space (including three channel values of luminance Y, chrominance U, and chrominance V, which collectively represent saturation and chrominance), an HSV color space (including three channel values of hue H, saturation S, and luminance V), and the like.
The gray value can be calculated according to the channel value of the color space.
The HSV color space value may be obtained in different manners. If the sample image is represented by an HSV color space, an HSV color space value may be directly read from the sample image, and if the sample image is represented by another color space, an HSV color space value may be obtained through color space conversion.
The coordinate values represent information of relative positions of pixels in the image, and may have different specific expressions, for example, coordinates of a certain pixel point p in the sample image may be recorded as (x, y), where x is a column number and y is a row number.
In this step, the server may process the acquired sample image, and calculate and acquire a gray value, an HSV color space value, and a coordinate value of each pixel in the sample image according to a channel value of a color space of the sample image.
S206, calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value;
in this step, the color global feature vector of the sample image may be calculated according to the HSV color space value of the pixel in the sample image, the color local feature vector of the sample image may be calculated according to the HSV color space value and the coordinate value of the pixel in the sample image, and the gray texture local feature vector of the sample image may be calculated according to the gray value of the pixel in the sample image.
S208, clustering and dividing the sample images into groups of the first preset number of cluster centroid feature vectors according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of the sample images;
the first preset number of the cluster-like centroid feature vectors can be set according to actual conditions, and under the condition that the total number of the sample images is determined, the smaller the first preset number is set, the smaller the number of groups is, the more sample images participating in matching in a single group are in the matching of the textile images, and the later-stage matching computation amount is increased; on the contrary, the more the first preset number sets the more the grouping number is, when the textile pictures are matched, the more operation time is needed to be spent on determining the grouping to which the textile pictures belong, and the less the sample images participating in matching in a single grouping are, the higher the probability of failure in matching is, so that the grouping number of the cluster-like centroid feature vectors needs to be set appropriately.
In this step, the server may determine a first preset number of groups of suitable cluster-like centroid feature vectors according to the total number of sample images. And then clustering and dividing the sample images into groups of the cluster centroid feature vectors of the first preset number according to the Euclidean distance between the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image and the cluster centroid feature vectors of each cluster.
And S210, generating a textile picture feature library according to the cluster-like centroid feature vectors and the sample images in the groups of the cluster-like centroid feature vectors.
In the case that the sample image cluster has been divided into the first preset number of groups of cluster centroid feature vectors in step S208, in this step, the server may generate a textile picture feature library, and record the cluster centroid feature vectors and the related information of the sample images in the groups of the cluster centroid feature vectors in the generated textile picture feature library.
According to the method for establishing the textile picture feature library, the color global feature vector, the color local feature vector and the gray texture local feature vector of the sample image are respectively collected, and similar textile pictures can be identified to the greatest extent through feature extraction combining the global feature and the local feature, so that the success rate of textile picture matching is improved. The dimensions of the global features and the local features are reduced, reducing the amount of computation. Similar sample images are clustered and grouped through clustering and grouping operations of the sample images in the textile image feature library, so that the number of samples participating in matching can be reduced when the textile images are matched, the operation amount is reduced on the premise of ensuring the matching accuracy, and the matching efficiency of the textile images is improved.
In one embodiment, the step S202 of obtaining sample images with preset classification numbers includes: acquiring an initial textile sample image with a preset classification number; segmenting a textile sample image unit with a preset resolution from each initial textile sample image, and adjusting the contrast of the textile sample image unit to a preset contrast value to obtain an adjusted textile sample image unit; the adjusted textile sample image element is taken as the sample image.
As an example, after receiving the initial textile sample picture sent by the terminal, the server may process the initial textile sample picture into an original textile sample image with a preset resolution according to matching requirements.
For example, as shown in fig. 3, taking the preset resolution of 512 × as an example, the initial textile sample image acquired by the terminal needs to be not less than 512 × pixels, and the server uniformly adjusts the initial textile sample image to 512 × pixels after receiving the initial textile sample image acquired by the terminal.
The server may then adjust the brightness and contrast of the textile sample image elements such that the image parameters of the adjusted textile sample image elements are consistent for use in matching the textile image.
Wherein, contrast transformation is also called contrast enhancement or histogram transformation, or contrast enhancement (stretching), which is an image processing method for improving image quality by changing the contrast of an image element by changing the brightness value of the image element, taking the RGB color space as an example of a textile sample image unit, and the range of the values of R, G, B three channels is [0, 255], then the color value of each pixel of the adjusted textile sample image unit can be calculated by using the following formula:
r’=r+(r-127)×2,g’=g+(g-127)×2,b’=b+(b-127)×2;
wherein r ', g ' and b ' are adjusted r, g and b values. For any one of r ', g' and b ', if the result calculated by the above formula is out of the range of [0, 255], if the result calculated by the r', g 'and b' is larger than 255, 255 is taken; and if the calculation results of r ', g ' and b ' are less than 0, taking 0.
After adjusting the color values of the pixels of the textile sample image unit, the server may take the adjusted textile sample image unit as a sample image.
In one embodiment, calculating the color global feature vector of the sample image in S206 includes: respectively calculating the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value in the HSV color space values of the pixels in the sample image; and generating a color global feature vector of the sample image according to the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value.
Illustratively, the mean μ of the H-channel valueshMean value of S channel valuessV mean value of channel values muvVariance σ of H channel valueshVariance σ of S channel valuesV variance of channel values σvThird order moment of H channel value tauhThird order moment of S channel value tausAnd three of the V channel valueOrder moment tauvEach can be calculated by the following formula:
Figure BDA0001924669410000081
Figure BDA0001924669410000082
Figure BDA0001924669410000083
wherein h isi、si、viRespectively representing the H channel value, the S channel value and the V channel value of the ith pixel in the sample image, wherein n represents the total number of pixels of the sample image, if the resolution of the sample image is 512 × 512, n is 262144, and after the calculation is finished, 9 global characteristic values mu of the image are obtainedh、μs、μv、σh、σs、σv、τh、τs、τv. Generating a color global feature vector global containing 9 feature values:
global=(μh,μs,μv,σh,σs,σv,τh,τs,τv)。
in this embodiment, the mean, the variance, and the third moment of each channel value of the sample image in the HSV color space value are respectively extracted to generate a color global feature vector, which may reflect the global color feature of the entire sample image.
In one embodiment, calculating the color local feature vector of the sample image in S206 includes: reducing the dimensions of an H channel value, an S channel value and a V channel value in the HSV color space value of each pixel in the sample image to generate a single-dimensional color value; the one-dimensional color values are quantized into color level values of a third preset number of color levels; quantizing the pixel distance value between each pixel and other pixels in the sample image into a pixel distance grade value of a fourth preset number of distance grades according to the coordinate value of each pixel in the sample image; generating a two-dimensional matrix containing elements of which the third preset number is multiplied by the fourth preset number according to the color grade value and the pixel distance grade value of each pixel in the sample image; the value of each element in the two-dimensional matrix represents the number of color grade values and pixel distance grade values under one color grade and distance grade; and expanding the two-dimensional matrix to obtain the color local characteristic vector of the sample image.
As an example, the HSV color space value may be quantized into a plurality of levels first to reduce the size of the feature space, and the number of levels quantized by each channel value may be determined according to actual requirements. Taking the quantization of the H channel value into 8 levels, the S channel value into 3 levels, and the V channel value into 4 levels as an example, the HSV color space value may be quantized as follows:
Figure BDA0001924669410000091
Figure BDA0001924669410000092
Figure BDA0001924669410000093
wherein H represents an H channel value before quantization, S represents an S channel value before quantization, V represents a V channel value before quantization, HnewRepresenting the quantized H-channel value, snewRepresenting the quantized S-channel value, vnewRepresenting the quantized V-channel value.
Dimension reduction and synthesis are carried out on the quantized H, S, V channel values to obtain a single-dimension color value hsv of one dimension, which is shown as the following formula:
hsv=12hnew+4snew+vnew
of these, hsv ∈ [0,1, …,95 ].
The distance between the quantized pixel points is 4 levels [0,1, 2, 3 ]]Generating a feature vector L ocal1,Local1The size is 4 × 96, i.e. contains 384 features, and the calculation method is as follows:
taking the sample image resolution of 512 × 512 as an example, the image pixel coordinates of 512 × 512 pixels can be quantized into 4 levels, which facilitates calculation of distance, for example, the coordinates of a certain pixel point p are (x, y) x, y ∈ [0,511], and then the new coordinates (x ', y') are:
Figure BDA0001924669410000094
after quantizing the image HSV color space to a single channel of 96 levels and the pixel coordinates to 4 levels, the calculation of the color local feature vector local1 can be implemented, for example, by the following pseudo-code:
the size of the two-dimensional array L ocal1 is 4 × 96, and all array elements are initialized to 0;
Figure BDA0001924669410000101
after the calculation, the calculated L ocal1 array is expanded to form a color local feature vector local1 containing 384 feature values, wherein the 384 feature values actually represent the color autocorrelation of the sample image.
In one embodiment, the calculating the gray texture local feature vector of the sample image in S206 includes: acquiring a fifth preset number of different convolution template matrixes; performing convolution calculation on each convolution template matrix and a gray value matrix formed by the gray values of all pixels in the sample image to obtain a convolution value of each pixel; generating a convolution image corresponding to each convolution template matrix according to the convolution value of each pixel in the sample image calculated by each convolution template matrix; respectively calculating the mean value and the variance of the gray values of the pixels in the convolution image aiming at each convolution image; and generating the gray texture local feature vector of the sample image according to the mean value and the variance of the gray values of the convolution images of the fifth preset number.
As an example, the gray texture local feature vector may be obtained by calculating a gray value of the sample image, and specifically includes the following steps:
firstly, performing convolution operation on an image by adopting preset 8 convolution template matrixes, wherein the 8 convolution template matrixes are as follows:
Figure BDA0001924669410000111
Figure BDA0001924669410000112
for each convolution template matrix, performing convolution operation on the gray value of the sample image and the convolution template matrix to generate a convolution image, wherein the convolution operation can be regarded as a weighted summation process, each pixel in the image area of the sample image is multiplied by each element of a convolution kernel (weight matrix), and the sum of all products is used as a new value of the area center pixel.
For each convolution image, use
Figure BDA0001924669410000113
Calculating the mean and variance of the convolution image as a whole, wherein n represents the total number of pixels in the convolution image, grayiRepresenting the gray value of the ith pixel in the convolved image. The final 8 convolution images for each sample image generate 16 feature values μ g1, σ g1, μ g2, σ g2, μ g3, σ g3, μ g4, σ g4, μ g5, σ g5, μ g6, σ g6, μ g7, σ g7, μ g8, σ g8, and generate a gray texture local feature vector local2 for the sample image:
local2=(μg1,σg1,μg2,σg2,μg3,σg3,μg4,σg4,μg5,σg5,μg6,σg6,μg7,σg7,μg8,σg8)。
through the calculation methods of the three embodiments, a sample image feature library L including z sample images can be obtained { L ═1、L2、…、Li、…、LzAny one of the sample images LiWith three feature vectors global, local1 and local2, the sample image LiCan be noted as
Figure BDA0001924669410000114
Wherein the content of the first and second substances,
Figure BDA0001924669410000115
representative sample image LiFor example, global includes 9 eigenvalues, local1 includes 384 eigenvalues, and local2 includes 16 eigenvalues, then m is 409.
In one embodiment, as shown in fig. 4, the step S208 of clustering the sample images into a first predetermined number of clusters of centroid feature vectors according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image includes:
s401, selecting total feature vectors of sample images with a first preset number from sample images with a preset number of classes as initial cluster-like mass center feature vectors; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
before this step S401, basic parameters of cluster division, such as the first preset number k of cluster-like centroid feature vectors, need to be set first, after step S206 is executed, the sample image feature library L ═ L can be obtained1、L2、…、Li、…、Lz},LiRepresenting the ith sample image in the sample image feature library, wherein the total feature vector of each sample image
Figure BDA0001924669410000121
m is the size of the total feature vector and is 409; the first preset number k of the cluster-like centroid feature vectors can be determined according to the total number of the sample images, and generally, each cluster-like centroid feature vector is divided into 409 because the feature numbers (global, local1, and local2 feature number total) reach 409The number of sample images included in a group is generally not less than 409, and a value of k (sum (L) may be set to k (sum (L i)/1000 as an examplei) Representing the total number of sample images in the sample image feature library).
As an example, in this step, the total feature vector S of the k sample images may be randomly selectedj(j ═ 1, 2, … k) as the initial cluster-like centroid feature vector.
S402, calculating Euclidean distances from the total feature vectors of all sample images in the sample images with preset classification numbers to the centroid feature vectors of the initial cluster;
as an example, in this step, the distances from all sample images to the k cluster-like centroid feature vectors are calculated respectively, and the distances are calculated by using euclidean distances:
Figure BDA0001924669410000122
in the above formula, LiRepresenting the ith sample image, SjRepresenting the feature vector of the centroid of the jth cluster class,
Figure BDA0001924669410000123
the t-th feature value representing the i-th sample image,
Figure BDA0001924669410000124
and the t-th characteristic value of the j-th cluster-like centroid characteristic vector is represented.
S403, dividing each sample image into groups of initial cluster centroid feature vectors with minimum Euclidean distances from the sample image;
illustratively, in this step, if the sample image is off-cluster centroid feature vector SiRecently, then this sample image belongs to SiPoint group, which can be divided into centroid feature vector S of the clusteriIn the group of (1); if the distances to the plurality of cluster-like centroid feature vectors are equal, the cluster-like centroid feature vectors can be divided into groups of any cluster-like centroid feature vectors:
label(Li)=arg min1≤j≤k{dist(Li,Sj)};
s404, calculating the total feature vector mean value of the sample images in the grouping of the centroid feature vectors of each initial cluster;
as an example, in this step, after all the sample images are grouped by distance, the mean of the total feature vectors of the sample images in each group is calculated (a simpler method is to average the feature values of each dimension of the sample images) as a new cluster-like centroid feature vector:
Figure BDA0001924669410000131
in the above formula, the first and second groups of the formula,
Figure BDA0001924669410000132
representing the jth new cluster-like centroid feature vector, NjNumber of sample images, l, representing centroid feature vectors of the jth clusteri∈C(Sj) Representing the sample image belonging to the jth cluster-like centroid feature vector.
S405, judging whether the Euclidean distances between the total feature vector mean value of the sample images in the grouping of all the initial cluster centroid feature vectors and the initial cluster centroid feature vectors are smaller than a distance threshold value;
the distance threshold value can be set according to actual conditions, and the smaller the value, the larger the cluster calculation amount.
If the determination result of S405 is yes, go to step S407;
if the judgment result in the step S405 is negative, executing the step S406, and respectively taking the total feature vector mean values in the grouping of the initial cluster centroid feature vectors as the updated initial cluster centroid feature vectors; after step S406 is executed, the process returns to step S402;
s407, obtaining the current initial cluster centroid feature vector and the sample image in the grouping of the initial cluster centroid feature vector as the cluster centroid feature vector of the cluster division and the sample image in the grouping of the cluster centroid feature vector.
In this embodiment, a first preset number of initial cluster centroid feature vectors are set, and through repeated iterative operations, the total feature vector mean value in each iterative group is respectively used as an updated initial cluster centroid feature vector as an initial cluster centroid feature vector of the next iterative operation until the euclidean distances between the total feature vector mean values of the sample images in all the initial cluster centroid feature vectors and the initial cluster centroid feature vectors are smaller than the distance threshold, so that the sample images can be rapidly and accurately clustered and divided into the first preset number of cluster centroid feature vectors.
As shown in fig. 5, by the technical solutions of the above embodiments, a sample feature library of a textile picture with a total number of sample images of z can be generated, and further divided into k groups according to clustering calculation, where each group has a certain number of sample images below, and a total feature vector of each sample image includes three feature vectors, namely global feature vector, local1, and local2, where global represents a color global feature vector, and includes 9 feature values; local1 represents a color local feature vector, containing 384 feature values; local2 represents local features of gray texture, including 16 feature values; the total feature vector composed of the 409 feature values forms the features of a sample image, and can be used for matching of subsequent textile pictures.
In the foregoing embodiments, the server establishes the textile image feature library, and after the textile image feature library is established in the server, the server may match the textile image obtained according to the matching instruction with the sample image in the established textile image feature library after receiving the matching instruction sent by the terminal. In an embodiment, as shown in fig. 6, the method for establishing a textile picture feature library further includes a step of matching a textile picture, which specifically includes:
s602, obtaining a textile picture to be matched;
in the step, the server acquires a textile picture to be matched according to a matching request sent by the terminal; specifically, the terminal can send an original textile picture to be matched to the server, and the server processes the received original textile picture to obtain a textile picture for matching. The original textile picture can be obtained by shooting through a terminal camera or directly reading from a terminal memory such as an album.
In this step, when the server processes the received original textile picture, the resolution, contrast, and the like of the textile picture to be matched, which is obtained by the processing, need to be consistent with the resolution, contrast, and the like of the sample image in the textile picture feature library.
S604, acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the textile picture;
in this step, the server may process the obtained textile picture by using the same method as in S204, and calculate and obtain the gray value, HSV color space value, and coordinate value of each pixel in the textile picture according to the channel value of the color space of the textile picture.
S606, calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the textile picture according to the gray value, the HSV color space value and the coordinate value;
in this step, the server may calculate a color global feature vector, a color local feature vector, and a gray texture local feature vector of the textile picture according to the gray value, the HSV color space value, and the coordinate value of the pixel in the textile picture by the same method as in S206.
S608, determining the grouping of the cluster centroid feature vectors to which the textile pictures belong according to the first Euclidean distance between the color global feature vectors of the textile pictures and the color global feature vectors of the cluster centroid feature vectors in the textile picture feature library;
the server is pre-stored with an established textile picture feature library, and the textile picture feature library is pre-stored with a plurality of cluster-like centroid feature vectors and sample images classified into groups of the cluster-like centroid feature vectors. The Euclidean distance between the total feature vector of each sample image under a group and the cluster-like centroid feature vector of the group is smaller than a distance threshold value. Each cluster-like centroid feature vector comprises a group of color global feature vectors, color local feature vectors and gray texture local feature vectors.
In this step, the server may calculate a first euclidean distance between the currently acquired color global feature vector of the textile picture and the color global feature vector of each cluster centroid feature vector in the textile picture feature library, and select a cluster-like centroid feature vector group having the smallest first euclidean distance from the color global feature vector of the textile picture as a cluster-like centroid feature vector group to which the textile picture belongs.
S610, determining sample images matched with the textile images in the grouping of the cluster-like centroid feature vectors according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors.
Each sample image in the textile picture feature library is associated and recorded with a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image.
In this step, in the case that the grouping of the cluster-like centroid feature vector to which the textile picture belongs has been determined in step S608, the server may analyze and determine the sample image of the textile picture matching in the grouping of the cluster-like centroid feature vector directly according to the color global feature vector, the color local feature vector, and the gray texture local feature vector of the textile picture, and the color global feature vector, the color local feature vector, and the gray texture local feature vector of each sample image in the grouping.
In the embodiment, when the textile pictures are matched, the color global feature vector, the color local feature vector and the gray texture local feature vector of the textile pictures are respectively calculated according to the gray value, the HSV color space value and the coordinate value of the pixel in the textile pictures to be matched, when the textile pictures are matched, the groups of the textile pictures in the textile picture feature library are determined according to the color global feature vector, and then the textile pictures are matched with the sample images in the groups to obtain the sample images matched with the textile pictures, so that the number of the sample images participating in matching can be effectively reduced, the operation amount is reduced, and the accuracy and the efficiency of textile picture matching are improved.
In practical application, due to the rapid update iteration of merchants and textile goods, the situation that the textile picture feature library cannot effectively identify and match part of textile pictures as time goes on exists, so that the success rate of matching the textile pictures is reduced. In order to improve the recognition success rate of the textile picture feature library, the textile picture feature library can be updated. In an embodiment, as shown in fig. 7, the method for establishing a textile image feature library further includes a step of updating the textile image feature library, which specifically includes:
s702, acquiring a textile picture failed in matching;
in the matching process of the textile pictures, a sample image which cannot be matched with a certain textile picture in the textile picture feature library may occur, and the textile picture is a textile picture which fails to be matched.
In practical application, each time the textile picture received by the server fails to be matched, the textile picture can be included in the textile picture matching failure record table. In this step, the server may read one or more textile pictures that have failed to match from the textile picture matching failure record table stored in the server, and then acquire the number of the textile pictures that have failed to match and are recorded in the textile picture matching failure record table.
S704, if the number of the textile pictures which fail to be matched exceeds the updating number threshold, mixing each textile picture which fails to be matched as a sample image with the sample image in the current textile picture feature library to obtain a new sample image set;
the updating number threshold can be set according to actual needs, and the setting of the appropriate updating number threshold can limit the updating frequency of the textile picture feature library, so that occupation and consumption of server operation resources caused by too frequent updating of the textile picture feature library are avoided.
In this step, if the number of the textile pictures with failed matching read by the server in step S702 exceeds the update number threshold, the server mixes all the textile pictures with failed matching recorded in the current textile picture matching failure record table as sample images to be updated with the existing sample images in the current textile picture feature library to form a new sample image set for subsequent reconstruction to generate a new textile picture feature library, thereby implementing an update operation on the textile picture feature library.
In addition, after each textile picture failed in matching is taken as a sample image in the step and is mixed with the sample image in the current textile picture feature library to obtain a new sample image set, the server can empty the textile picture failed in matching recorded in the textile picture matching record table, or can mark the textile picture failed in matching which is updated, so that resource waste caused by repeated updating is avoided.
S706, clustering and dividing the sample images in the new sample image set into groups of a second preset number of new cluster centroid feature vectors according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images in the new sample image set;
in the new sample image set, a part of sample images are sample images in a textile picture feature library which is not updated currently, and the total feature vectors of the sample images are already calculated in the process of establishing the textile picture feature library or updating the textile picture feature library last time; and the other part of the sample image is a textile picture with failed matching, and the total feature vector of the part of the sample image is also calculated in the matching process. Therefore, each sample image in the new sample image set has been recorded with a corresponding total feature vector in an associated manner.
In this step, the server may adopt the same method as in step S708 to determine a second preset number of suitable groups of new cluster-like centroid feature vectors according to the total number of sample images in the new sample image set. And then clustering and dividing the sample images into groups of the new cluster-like centroid feature vectors with the second preset number according to the Euclidean distance between the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image and the cluster-like centroid feature vectors.
And S708, generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in the group of each new cluster-like centroid feature vector.
In the case that the sample image clusters have been divided into a second preset number of groups of new cluster-like centroid feature vectors in step S706, in this step, the server may generate an updated textile picture feature library, and record each new cluster-like centroid feature vector and the related information of the sample images in each new cluster-like centroid feature vector group in the generated updated textile picture feature library.
In the embodiment, the textile pictures which are failed to be matched in the textile picture matching process are used as sample images to be updated, when the recorded number of the textile pictures which are failed to be matched exceeds the update number threshold value, the textile pictures which are failed to be matched are supplemented into the original textile picture feature library to be mixed, the textile picture feature library is completely reconstructed once, the textile picture feature library is ensured to be matched with the textile pictures successfully, the times of update operation of the textile picture feature library are reduced as much as possible, and the update efficiency is improved.
It should be understood that although the steps in the flowcharts of fig. 2, 5, 6, and 7 are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in fig. 2, 5, 6, and 7 may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of performing the sub-steps or stages is not necessarily sequential, but may be performed alternately or alternatingly with other steps or at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided an apparatus 800 for creating a textile picture feature library, including: a sample image obtaining module 802, a sample pixel value obtaining module 804, a sample feature vector calculating module 806, a sample image clustering and partitioning module 808, and a textile picture feature library generating module 810, wherein:
a sample image obtaining module 802, configured to obtain sample images with preset classification numbers;
a sample pixel value obtaining module 804, configured to obtain a gray value, an HSV color space value, and a coordinate value of a pixel in a sample image;
a sample feature vector calculation module 806, configured to calculate a color global feature vector, a color local feature vector, and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value, and the coordinate value;
the sample image clustering and dividing module 808 is used for clustering and dividing the sample images into groups of cluster centroid feature vectors of a first preset number according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images;
and the textile picture feature library generating module 810 is configured to generate a textile picture feature library according to each cluster-like centroid feature vector and the sample images in the group of each cluster-like centroid feature vector.
In one embodiment, the sample image acquiring module 802 is configured to acquire an initial textile sample image with a preset classification number; segmenting a textile sample image unit with a preset resolution from each initial textile sample image, and adjusting the contrast of the textile sample image unit to a preset contrast value to obtain an adjusted textile sample image unit; the adjusted textile sample image element is taken as the sample image.
In one embodiment, the sample eigenvector calculation module 806 is configured to calculate a mean value of H-channel values, a mean value of S-channel values, a mean value of V-channel values, a variance of H-channel values, a variance of S-channel values, a variance of V-channel values, a third moment of H-channel values, a third moment of S-channel values, and a third moment of V-channel values, respectively, in HSV color space values of pixels in the sample image; and generating a color global feature vector of the sample image according to the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value.
In one embodiment, the sample feature vector calculation module 806 is configured to reduce the dimensions of the H channel value, the S channel value, and the V channel value in the HSV color space values of the pixels in the sample image to generate a one-dimensional color value; the one-dimensional color values are quantized into color level values of a third preset number of color levels; quantizing the pixel distance value between each pixel and other pixels in the sample image into a pixel distance grade value of a fourth preset number of distance grades according to the coordinate value of each pixel in the sample image; generating a two-dimensional matrix containing elements of which the third preset number is multiplied by the fourth preset number according to the color grade value and the pixel distance grade value of each pixel in the sample image; the value of each element in the two-dimensional matrix represents the number of color grade values and pixel distance grade values under one color grade and distance grade; and expanding the two-dimensional matrix to obtain the color local characteristic vector of the sample image.
In one embodiment, the sample feature vector calculation module 806 is configured to obtain a fifth preset number of different convolution template matrices; performing convolution calculation on each convolution template matrix and a gray value matrix formed by the gray values of all pixels in the sample image to obtain a convolution value of each pixel; generating a convolution image corresponding to each convolution template matrix according to the convolution value of each pixel in the sample image calculated by each convolution template matrix; respectively calculating the mean value and the variance of the gray values of the pixels in the convolution image aiming at each convolution image; and generating the gray texture local feature vector of the sample image according to the mean value and the variance of the gray values of the convolution images of the fifth preset number.
In one embodiment, the sample image cluster partitioning module 808 is configured to:
s1, selecting total feature vectors of sample images of a first preset number from the sample images of the preset number of classes as initial cluster-like centroid feature vectors; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
s2, calculating Euclidean distances between the total feature vectors of all sample images in the sample images with preset classification numbers and the centroid feature vectors of the initial cluster;
s3, dividing each sample image into groups of initial cluster centroid feature vectors with the minimum Euclidean distance to the sample image;
s4, calculating the average value of the total feature vectors of the sample images in the grouping of the centroid feature vectors of each initial cluster;
s5, if the Euclidean distance between the total feature vector mean value of the sample images in the grouping of all the initial cluster centroid feature vectors and the initial cluster centroid feature vector is smaller than a distance threshold, executing a step S7;
s6, if the Euclidean distance between the mean value of the total feature vectors in the groups of any initial cluster centroid feature vectors and the initial cluster centroid feature vectors is not smaller than a distance threshold, respectively taking the mean value of the total feature vectors in the groups of each initial cluster centroid feature vector as an updated initial cluster centroid feature vector, repeating the steps S2, S3 and S4 until the Euclidean distance between the mean value of the total feature vectors of the sample images in the groups of all initial cluster centroid feature vectors and the initial cluster centroid feature vectors is smaller than the distance threshold, and executing the step S7;
s7, obtaining the current initial cluster centroid feature vector and the sample image in the grouping of the initial cluster centroid feature vector, and using the sample image as the cluster centroid feature vector of the cluster division and the sample image in the grouping of the cluster centroid feature vector.
In one embodiment, the creating apparatus 800 for the textile picture feature library further includes: the textile picture matching module is used for acquiring a textile picture to be matched; acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a textile picture; calculating a color global characteristic vector, a color local characteristic vector and a gray texture local characteristic vector of the textile picture according to the gray value, the HSV color space value and the coordinate value; determining the grouping of the cluster centroid feature vectors to which the textile pictures belong according to the first Euclidean distance between the color global feature vectors of the textile pictures and the color global feature vectors of the cluster centroid feature vectors in the textile picture feature library; and determining a sample image matched with the textile picture in the grouping of the cluster-like centroid feature vectors according to the color global feature vector, the color local feature vector and the gray texture local feature vector.
In one embodiment, the creating apparatus 800 for the textile picture feature library further includes: the updating module of the textile picture feature library is used for acquiring textile pictures which fail to be matched; if the number of the textile pictures failed to be matched exceeds the update number threshold value, taking each textile picture failed to be matched as a sample image to be mixed with the sample images in the current textile picture feature library to obtain a new sample image set; clustering and dividing the sample images in the new sample image set into groups of a second preset number of new cluster centroid feature vectors according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image in the new sample image set; and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in the group of each new cluster-like centroid feature vector.
For specific limitations of the building device of the textile image feature library, reference may be made to the above limitations of the building method of the textile image feature library, and details are not described here. All or part of each module in the building device of the textile picture feature library can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In an embodiment, there is provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method for establishing a textile picture feature library as any of the above embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which when executed by a processor implements the steps of the method for establishing a textile picture feature library as any of the above embodiments.
It will be understood by those of ordinary skill in the art that all or a portion of the processes of the methods of the embodiments described above may be implemented by a computer program that may be stored on a non-volatile computer-readable storage medium, which when executed, may include the processes of the embodiments of the methods described above, wherein any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. A method for establishing a textile picture feature library, the method comprising:
obtaining sample images with preset classification numbers;
acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the sample image;
calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value;
clustering and dividing the sample images into groups of cluster centroid feature vectors with a first preset number according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images;
generating a textile picture feature library according to each cluster centroid feature vector and the sample images in the groups of each cluster centroid feature vector;
wherein the calculating the color local feature vector of the sample image comprises:
reducing the dimensions of an H channel value, an S channel value and a V channel value in the HSV color space value of each pixel in the sample image to generate a single-dimensional color value; the one-dimensional color values are quantized to color level values of a third preset number of color levels;
quantizing the pixel distance value between each pixel and other pixels in the sample image into a pixel distance grade value of a fourth preset number of distance grades according to the coordinate value of each pixel in the sample image;
generating a two-dimensional matrix containing elements of which the third preset number is multiplied by the fourth preset number according to the color grade value and the pixel distance grade value of each pixel in the sample image; wherein, the value of each element in the two-dimensional matrix represents the number of color grade values and pixel distance grade values under one color grade and distance grade;
and expanding the two-dimensional matrix to obtain the color local characteristic vector of the sample image.
2. The method of claim 1, wherein obtaining a sample image of a predetermined classification number comprises:
acquiring an initial textile sample image with a preset classification number;
segmenting a textile sample image unit with a preset resolution from each initial textile sample image;
adjusting the contrast of the textile sample image unit to a preset contrast value to obtain an adjusted textile sample image unit;
taking the adjusted textile sample image element as a sample image.
3. The method of claim 1, wherein the computing the color global feature vector for the sample image comprises:
respectively calculating the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value in the HSV color space values of the pixels in the sample image;
and generating a color global feature vector of the sample image according to the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value.
4. The method of claim 1, wherein computing the gray-scale texture local feature vector for the sample image comprises:
acquiring a fifth preset number of different convolution template matrixes;
performing convolution calculation on the convolution template matrixes and a gray value matrix formed by the gray values of the pixels in the sample image respectively to obtain convolution values of the pixels;
generating a convolution image corresponding to each convolution template matrix according to the convolution value of each pixel in the sample image calculated by each convolution template matrix;
respectively calculating the mean value and the variance of the gray values of the pixels in the convolution image aiming at each convolution image;
and generating a gray texture local feature vector of the sample image according to the mean value and the variance of the gray values of the convolution images of the fifth preset number.
5. The method according to any one of claims 1 to 4, wherein the clustering the sample images into a first preset number of clusters centroid feature vectors of clusters according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image comprises:
s1, selecting total feature vectors of sample images of a first preset number from the sample images of the preset number of classes as initial cluster-like centroid feature vectors; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
s2, calculating Euclidean distances between the total feature vectors of all sample images in the sample images with preset classification numbers and the centroid feature vectors of the initial cluster;
s3, dividing each sample image into groups of initial cluster centroid feature vectors with the minimum Euclidean distance to the sample image;
s4, calculating the average value of the total feature vectors of the sample images in the grouping of the centroid feature vectors of each initial cluster;
s5, if the Euclidean distance between the total feature vector mean value of the sample images in the grouping of all the initial cluster centroid feature vectors and the initial cluster centroid feature vector is smaller than a distance threshold, executing a step S7;
s6, if the Euclidean distance between the mean value of the total feature vectors in the groups of any initial cluster centroid feature vectors and the initial cluster centroid feature vectors is not smaller than a distance threshold, respectively taking the mean value of the total feature vectors in the groups of each initial cluster centroid feature vector as an updated initial cluster centroid feature vector, repeating the steps S2, S3 and S4 until the Euclidean distance between the mean value of the total feature vectors of the sample images in the groups of all initial cluster centroid feature vectors and the initial cluster centroid feature vectors is smaller than the distance threshold, and executing the step S7;
s7, obtaining the current initial cluster centroid feature vector and the sample image in the grouping of the initial cluster centroid feature vector, and using the sample image as the cluster centroid feature vector of the cluster division and the sample image in the grouping of the cluster centroid feature vector.
6. The method of any one of claims 1 to 4, further comprising:
acquiring a textile picture to be matched;
acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the textile picture;
calculating a color global characteristic vector, a color local characteristic vector and a gray texture local characteristic vector of the textile picture according to the gray value, the HSV color space value and the coordinate value;
determining the grouping of cluster centroid feature vectors to which the textile pictures belong according to a first Euclidean distance between the color global feature vectors of the textile pictures and the color global feature vectors of the cluster centroid feature vectors in a textile picture feature library;
and determining a sample image matched with the textile picture in the grouping of the cluster-like centroid feature vectors according to the color global feature vector, the color local feature vector and the gray texture local feature vector.
7. The method of claim 6, further comprising:
acquiring a textile picture failed in matching;
if the number of the textile pictures failed in matching exceeds the update number threshold, taking each textile picture failed in matching as a sample image to be mixed with the sample images in the current textile picture feature library to obtain a new sample image set;
clustering and dividing the sample images in the new sample image set into groups of a second preset number of new cluster-like centroid feature vectors according to the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image in the new sample image set;
and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in the group of each new cluster-like centroid feature vector.
8. An apparatus for creating a textile picture feature library, the apparatus comprising:
the sample image acquisition module is used for acquiring sample images with preset classification numbers;
the sample pixel value acquisition module is used for acquiring the gray value, HSV color space value and coordinate value of the pixel in the sample image;
the sample characteristic vector calculation module is used for calculating a color global characteristic vector, a color local characteristic vector and a gray texture local characteristic vector of the sample image according to the gray value, the HSV color space value and the coordinate value; the color local feature vector can be obtained according to a color grade value and a pixel distance grade value of a pixel in a sample image, wherein the color grade value is obtained from the HSV color space value, and the pixel distance grade value is obtained from the coordinate value;
the sample image clustering and dividing module is used for clustering and dividing the sample images into groups of cluster centroid feature vectors with a first preset number according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images;
and the textile picture feature library generating module is used for generating a textile picture feature library according to the cluster-like centroid feature vectors and the sample images in the groups of the cluster-like centroid feature vectors.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN201811610700.7A 2018-12-27 2018-12-27 Method and device for establishing textile picture feature library Active CN109657083B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811610700.7A CN109657083B (en) 2018-12-27 2018-12-27 Method and device for establishing textile picture feature library

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811610700.7A CN109657083B (en) 2018-12-27 2018-12-27 Method and device for establishing textile picture feature library

Publications (2)

Publication Number Publication Date
CN109657083A CN109657083A (en) 2019-04-19
CN109657083B true CN109657083B (en) 2020-07-14

Family

ID=66117615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811610700.7A Active CN109657083B (en) 2018-12-27 2018-12-27 Method and device for establishing textile picture feature library

Country Status (1)

Country Link
CN (1) CN109657083B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112668597B (en) * 2019-10-15 2023-07-28 杭州海康威视数字技术股份有限公司 Feature comparison method, device and equipment
CN110694942B (en) * 2019-10-21 2021-05-14 武汉纺织大学 Color difference sorting method for solar cells

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777059A (en) * 2009-12-16 2010-07-14 中国科学院自动化研究所 Method for extracting landmark scene abstract
CN101853304A (en) * 2010-06-08 2010-10-06 河海大学 Remote sensing image retrieval method based on feature selection and semi-supervised learning
EP2443589A1 (en) * 2009-06-16 2012-04-25 Alibaba Group Holding Limited Method and system for near-duplicate image searching
CN102508909A (en) * 2011-11-11 2012-06-20 苏州大学 Image retrieval method based on multiple intelligent algorithms and image fusion technology
CN103577840A (en) * 2013-10-30 2014-02-12 汕头大学 Item identification method
CN104331447A (en) * 2014-10-29 2015-02-04 邱桃荣 Cloth color card image retrieval method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102012939B (en) * 2010-12-13 2012-11-14 中国人民解放军国防科学技术大学 Method for automatically tagging animation scenes for matching through comprehensively utilizing overall color feature and local invariant features
CN102521838B (en) * 2011-12-19 2013-11-27 国家计算机网络与信息安全管理中心 Image searching/matching method and system for same
CN102629325B (en) * 2012-03-13 2014-06-04 深圳大学 Image characteristic extraction method, device thereof, image copy detection method and system thereof
CN103871044B (en) * 2012-12-14 2018-02-09 阿里巴巴集团控股有限公司 A kind of image signatures generation method and image authentication method and device
CN105608230B (en) * 2016-02-03 2019-05-31 南京云创大数据科技股份有限公司 A kind of Business Information recommender system and method based on image retrieval

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2443589A1 (en) * 2009-06-16 2012-04-25 Alibaba Group Holding Limited Method and system for near-duplicate image searching
CN101777059A (en) * 2009-12-16 2010-07-14 中国科学院自动化研究所 Method for extracting landmark scene abstract
CN101853304A (en) * 2010-06-08 2010-10-06 河海大学 Remote sensing image retrieval method based on feature selection and semi-supervised learning
CN102508909A (en) * 2011-11-11 2012-06-20 苏州大学 Image retrieval method based on multiple intelligent algorithms and image fusion technology
CN103577840A (en) * 2013-10-30 2014-02-12 汕头大学 Item identification method
CN104331447A (en) * 2014-10-29 2015-02-04 邱桃荣 Cloth color card image retrieval method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
基于整体局部颜色特征和纹理特征的图像检索算法研究;黄艳丽;《中国优秀硕士学位论文全文数据库信息科技辑》;20120915(第9期);第2-4节 *
面向图像检索的海量图像自动聚类方法研究;杨杰;《中国优秀硕士学位论文全文数据库信息科技辑》;20150615(第6期);第2-4节 *

Also Published As

Publication number Publication date
CN109657083A (en) 2019-04-19

Similar Documents

Publication Publication Date Title
CN110210560B (en) Incremental training method, classification method and device, equipment and medium of classification network
CN108090511B (en) Image classification method and device, electronic equipment and readable storage medium
CN112651438A (en) Multi-class image classification method and device, terminal equipment and storage medium
CN108710916B (en) Picture classification method and device
US20120201464A1 (en) Computer readable medium, image processing apparatus, and image processing method
CN109741380B (en) Textile picture fast matching method and device
CN111935479A (en) Target image determination method and device, computer equipment and storage medium
CN109657083B (en) Method and device for establishing textile picture feature library
CN111179270A (en) Image co-segmentation method and device based on attention mechanism
Zhang et al. Dual-channel multi-task CNN for no-reference screen content image quality assessment
CN113297420A (en) Video image processing method and device, storage medium and electronic equipment
CN112561976A (en) Image dominant color feature extraction method, image retrieval method, storage medium and device
CN109784379B (en) Updating method and device of textile picture feature library
CN115578590A (en) Image identification method and device based on convolutional neural network model and terminal equipment
CN108388869B (en) Handwritten data classification method and system based on multiple manifold
CN108052918A (en) A kind of person's handwriting Compare System and method
CN113850748A (en) Point cloud quality evaluation system and method
KR101741761B1 (en) A classification method of feature points required for multi-frame based building recognition
CN114445916A (en) Living body detection method, terminal device and storage medium
Frosio et al. Adaptive segmentation based on a learned quality metric.
CN110647898B (en) Image processing method, image processing device, electronic equipment and computer storage medium
CN113095147A (en) Skin area detection method, system, image processing terminal and storage medium
CN111797922A (en) Text image classification method and device
Mary et al. Content based image retrieval using colour, multi-dimensional texture and edge orientation
CN112883959B (en) Identity card integrity detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant