CN109784379B - Updating method and device of textile picture feature library


Publication number
CN109784379B
Authority
CN
China
Prior art keywords
sample image
value
textile
feature vector
new
Prior art date
Legal status
Active
Application number
CN201811609396.4A
Other languages
Chinese (zh)
Other versions
CN109784379A (en)
Inventor
吴家鹏
冯立庚
Current Assignee
GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD
Original Assignee
GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD
Priority date
Filing date
Publication date
Application filed by GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD filed Critical GUANGZHOU HUAXUN NETWORK TECHNOLOGY CO LTD
Priority to CN201811609396.4A priority Critical patent/CN109784379B/en
Publication of CN109784379A publication Critical patent/CN109784379A/en
Application granted granted Critical
Publication of CN109784379B publication Critical patent/CN109784379B/en


Abstract

The application relates to a method and a device for updating a textile picture feature library, a computer device and a storage medium, wherein the method comprises the following steps: acquiring textile pictures that failed to match; if the number of textile pictures that failed to match exceeds an update number threshold, mixing each such picture, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set; acquiring a total feature vector of each sample image in the new sample image set; clustering the sample images in the new sample image set, according to the total feature vectors, into the groups of a second preset number of new cluster-like centroid feature vectors; and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in its group. The method ensures that the textile picture feature library keeps matching textile pictures successfully while improving update efficiency.

Description

Updating method and device of textile picture feature library
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for updating a textile image feature library, a computer device, and a storage medium.
Background
Automatic identification of textiles is widely used in textile purchasing, textile storage and transportation, textile electronic commerce, and other fields. It is an application of pattern recognition and depends on picture processing and analysis. Take a large textile wholesale market as an example: the market has numerous merchants and a wide variety of goods, and customers and merchants mainly rely on manual searching, inquiring, and identifying when purchasing. Manual searching is inefficient; if picture matching technology could automate the search from a sample to a commodity, purchasing efficiency would improve greatly.
In a traditional textile picture matching method, a textile picture to be matched is matched against the sample images stored in an established textile picture feature library to obtain a matching result. In practical application, however, merchants and textile goods update and iterate quickly, so over time the textile picture feature library fails to identify and match some pictures, and the success rate of textile picture matching drops.
Disclosure of Invention
Therefore, it is necessary to provide a method, an apparatus, a computer device, and a storage medium for updating a textile picture feature library that can improve the success rate of matching textile pictures.
A method for updating a textile picture feature library comprises the following steps:
acquiring textile pictures that failed to match;
if the number of textile pictures that failed to match exceeds an update number threshold, mixing each such textile picture, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set;
acquiring a total feature vector of each sample image in the new sample image set; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
clustering the sample images in the new sample image set, according to the total feature vectors, into the groups of a second preset number of new cluster-like centroid feature vectors;
and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in its group.
An apparatus for updating a textile picture feature library, the apparatus comprising:
a to-be-updated picture acquisition module, used for acquiring textile pictures that failed to match;
a new sample image set acquisition module, used for mixing each textile picture that failed to match, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set if the number of textile pictures that failed to match exceeds the update number threshold;
a total feature vector acquisition module, used for acquiring the total feature vector of each sample image in the new sample image set; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
a new sample image cluster dividing module, used for clustering the sample images in the new sample image set, according to the total feature vectors, into the groups of a second preset number of new cluster-like centroid feature vectors;
and a textile picture feature library updating module, used for generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in its group.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring textile pictures that failed to match; if the number of textile pictures that failed to match exceeds the update number threshold, mixing each such textile picture, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set; acquiring a total feature vector of each sample image in the new sample image set, the total feature vector being generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector; clustering the sample images in the new sample image set, according to the total feature vectors, into the groups of a second preset number of new cluster-like centroid feature vectors; and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in its group.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring textile pictures that failed to match; if the number of textile pictures that failed to match exceeds the update number threshold, mixing each such textile picture, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set; acquiring a total feature vector of each sample image in the new sample image set, the total feature vector being generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector; clustering the sample images in the new sample image set, according to the total feature vectors, into the groups of a second preset number of new cluster-like centroid feature vectors; and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in its group.
According to the above method and device for updating the textile picture feature library, the textile pictures that failed to match during textile picture matching are used as the sample images to be updated. When the recorded number of such pictures exceeds the update number threshold, they are supplemented into the original textile picture feature library and mixed with its sample images, and the textile picture feature library is completely reconstructed once. This ensures that the textile picture feature library keeps matching textile pictures successfully while reducing the number of update operations on the library as much as possible, improving update efficiency.
Drawings
FIG. 1 is a diagram of an application environment of a method for updating a feature library of a textile picture according to an embodiment;
FIG. 2 is a flowchart illustrating a method for updating a feature library of a textile picture according to an embodiment;
FIG. 3 is a schematic flow chart of the sample image cluster partitioning step in one embodiment;
FIG. 4 is a schematic flow chart diagram illustrating the steps of creating a library of textile image features in one embodiment;
FIG. 5 is a schematic diagram of an embodiment of an image unit for segmenting a textile sample;
FIG. 6 is a schematic flow chart of the steps of textile image matching in one embodiment;
FIG. 7 is a block diagram of an apparatus for updating a textile picture feature library according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The method for updating the textile picture feature library provided by the application can be applied to the application environment shown in fig. 1, wherein the terminal 102 communicates with the server 104 via a network. The server 104 establishes a textile picture feature library, stores it in the server 104, and further executes the steps of the updating method of the textile picture feature library according to any embodiment of the present application to update the stored library. The terminal 102 sends a matching request carrying information of the textile picture to be matched to the server 104; the server 104 obtains the textile picture to be matched according to the matching request, matches it against the sample images in the textile picture feature library currently stored in the server 104, obtains a matching result, and returns the result to the terminal 102. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smart phone, tablet computer, or portable wearable device, and the server 104 may be implemented by an independent server or a server cluster formed by a plurality of servers.
In one embodiment, as shown in fig. 2, there is provided a method for updating a textile picture feature library, which is described by taking the method as an example applied to the server in fig. 1, and includes the following steps:
s202, acquiring a textile picture failed in matching;
in the embodiment of the application, when the server receives a matching request from a terminal under the condition that a textile picture feature library is established in the server, a textile picture acquired in the matching request is matched with a sample image in the textile picture feature library. In the matching process of the textile pictures, a sample image which cannot be matched with a certain textile picture in the textile picture feature library may occur, and the textile picture is a textile picture which fails to be matched.
In practical application, each time the textile picture received by the server fails to be matched, the textile picture can be included in the textile picture matching failure record table. In this step, the server may read one or more textile pictures that have failed to match from the textile picture matching failure record table stored in the server, and then acquire the number of the textile pictures that have failed to match and are recorded in the textile picture matching failure record table.
S204, if the number of textile pictures that failed to match exceeds the update number threshold, mixing each such textile picture, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set;
the updating number threshold can be set according to actual needs, and the setting of the appropriate updating number threshold can limit the updating frequency of the textile picture feature library, so that occupation and consumption of server operation resources caused by too frequent updating of the textile picture feature library are avoided.
In this step, if the number of textile pictures with failed matching read by the server in step S202 exceeds the update number threshold, the server mixes all textile pictures with failed matching recorded in the current textile picture matching failure record table as sample images to be updated with existing sample images in the current textile picture feature library to form a new sample image set for subsequent reconstruction to generate a new textile picture feature library, thereby implementing an update operation on the textile picture feature library.
In addition, after each textile picture failed in matching is taken as a sample image in the step and is mixed with the sample image in the current textile picture feature library to obtain a new sample image set, the server can empty the textile picture failed in matching recorded in the textile picture matching record table, or can mark the textile picture failed in matching which is updated, so that resource waste caused by repeated updating is avoided.
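As an illustration of this trigger logic, the following is a minimal Python sketch. The names (failed_match_log, UPDATE_THRESHOLD, maybe_build_new_sample_set) and the example threshold value are hypothetical, since this text does not prescribe a concrete interface:

```python
UPDATE_THRESHOLD = 500  # hypothetical update number threshold, set per deployment

def maybe_build_new_sample_set(failed_match_log, current_library_samples):
    """Mix failed pictures into the sample set once the threshold is exceeded (S204)."""
    if len(failed_match_log) <= UPDATE_THRESHOLD:
        return None  # too few failures; leave the feature library untouched
    # Every textile picture that failed to match becomes a sample image in the new set.
    new_sample_set = list(current_library_samples) + list(failed_match_log)
    failed_match_log.clear()  # empty the record table to avoid repeated updating
    return new_sample_set
```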
S206, acquiring a total feature vector of each sample image in the new sample image set; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
in the new sample image set, a part of sample images are sample images in a textile picture feature library which is not updated currently, and the total feature vectors of the sample images are already calculated in the process of establishing the textile picture feature library or updating the textile picture feature library last time; and the other part of the sample image is a textile picture with failed matching, and the total feature vector of the part of the sample image is also calculated in the matching process. Therefore, each sample image in the new sample image set has been recorded with a corresponding total feature vector in an associated manner.
In this step, the server may read the total feature vector of each sample image in the new sample image set directly from the server.
S208, clustering the sample images in the new sample image set into the groups of a second preset number of new cluster-like centroid feature vectors according to the total feature vectors of all the sample images in the new sample image set;
the second preset number of the cluster-like centroid feature vectors can be set according to actual conditions, and under the condition that the total number of the sample images in the new sample image set is determined, the smaller the second preset number is, the smaller the group number is, the more the sample images participating in matching in a single group are during matching of the textile images, so that the later-stage matching computation amount is increased; on the contrary, the more the second preset number sets the grouping number, the more the calculation time is needed to determine the grouping to which the textile pictures belong when the textile pictures are matched, and the less the sample images participating in matching in a single grouping are, the higher the probability of matching failure is, so that the grouping number of the new cluster-like centroid feature vectors needs to be set appropriately.
In this step, the server may determine a second preset number of suitable groups of the new cluster-like centroid feature vectors according to the total number of the sample images in the new sample image set. And then clustering and dividing the sample images into groups of the new cluster-like centroid feature vectors with the second preset number according to the Euclidean distance between the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image and the cluster-like centroid feature vectors.
And S210, generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in the group of each new cluster-like centroid feature vector.
With the sample images cluster-divided in step S208 into the groups of the second preset number of new cluster-like centroid feature vectors, in this step the server may generate an updated textile picture feature library and record in it each new cluster-like centroid feature vector together with the relevant information of the sample images in its group.
According to the above updating method of the textile picture feature library, the textile pictures that failed to match during textile picture matching are used as the sample images to be updated. When the recorded number of such pictures exceeds the update number threshold, they are supplemented into the original textile picture feature library and mixed with its sample images, and the textile picture feature library is completely reconstructed once. During reconstruction, the color global feature vector, the color local feature vector, and the gray texture local feature vector of each sample image are collected; this feature extraction, combining global and local features, identifies similar textile pictures to the greatest extent and improves the success rate of textile picture matching. The method thus ensures the success rate of matching textile pictures against the feature library while reducing the number of update operations on the library as much as possible, improving update efficiency.
In one embodiment, as shown in fig. 3, S208 of clustering the sample images in the new sample image set into the groups of a second preset number of new cluster-like centroid feature vectors according to the total feature vector of each sample image in the new sample image set includes:
S301, selecting the total feature vectors of a second preset number of sample images from the new sample image set as the initial cluster-like centroid feature vectors; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
Before step S301, the basic parameters of cluster partitioning need to be set, such as the second preset number k' of cluster-like centroid feature vectors. The total feature vector of each sample image in the new sample image set has been obtained in S206; the new sample image set may be represented as L' = {L1, L2, …, Li, …, Lz}, where Li represents the ith sample image in the new sample image set, and the total feature vector of each sample image may be written as Li = (li^1, li^2, …, li^m), where m is the size of the total feature vector and equals 409. The second preset number k' of cluster-like centroid feature vectors can be determined according to the total number of sample images: generally, since the number of features (the sum of the global, local1 and local2 features) has reached 409, the number of sample images in each cluster-like centroid feature vector group is generally not less than 409, so the k' value can be set, for example, as k' = sum(Li)/409, where sum(Li) represents the total number of sample images in the new sample image set.
As an example, in this step, the total feature vectors Sj (j = 1, 2, …, k') of k' randomly chosen sample images may be taken as the initial cluster-like centroid feature vectors.
S302, calculating Euclidean distances from the total characteristic vectors of all sample images in the new sample image set to the centroid characteristic vector of the initial cluster;
Illustratively, in this step, the distances from the sample images in the new sample image set to the k' cluster-like centroid feature vectors are respectively calculated, using the Euclidean distance:
dist(Li, Sj) = √( Σt=1..m (li^t − sj^t)² );
in the above formula, Li represents the ith sample image, Sj represents the jth cluster-like centroid feature vector, li^t represents the t-th feature value of the ith sample image, and sj^t represents the t-th feature value of the jth cluster-like centroid feature vector.
S303, dividing each sample image in the new sample image set into a group of initial cluster centroid feature vectors with the minimum Euclidean distance from the sample image;
Illustratively, in this step, if a sample image is closest to the cluster-like centroid feature vector Sj, the sample image belongs to Sj and is divided into the group of the cluster-like centroid feature vector Sj; if its distances to several cluster-like centroid feature vectors are equal, it can be divided into the group of any one of them:
label(Li) = arg min1≤j≤k' {dist(Li, Sj)};
S304, calculating the mean value of the total feature vectors of the sample images in the group of each initial cluster-like centroid feature vector;
As an example, in this step, after all the sample images have been grouped by distance, the mean of the total feature vectors of the sample images in each group is calculated (a simple method is to average the feature values of each dimension over the sample images) as a new cluster-like centroid feature vector:
S′j = (1/Nj) · ΣLi∈C(Sj) Li;
in the above formula, S′j represents the jth new cluster-like centroid feature vector, Nj represents the number of sample images in the group of the jth cluster-like centroid feature vector, and Li ∈ C(Sj) represents the sample images belonging to the group of the jth cluster-like centroid feature vector.
S305, judging whether the Euclidean distances between the total feature vector mean value of the sample images in the grouping of all the initial cluster centroid feature vectors and the initial cluster centroid feature vectors are smaller than a distance threshold value;
the distance threshold δ can be set according to actual conditions, and the smaller the δ value, the larger the cluster calculation amount.
If the determination result in S305 is yes, go to step S307;
if the judgment result in the step S305 is negative, executing a step S306, and taking the total feature vector mean values in the grouping of the initial cluster centroid feature vectors as the updated initial cluster centroid feature vectors respectively; after step S306 is executed, step S302 is returned to;
s307, obtaining the current initial cluster centroid feature vector and the sample image in the grouping of the initial cluster centroid feature vector, and taking the sample image as the new cluster centroid feature vector of the clustering division and the sample image in the grouping of the new cluster centroid feature vector.
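To make steps S301-S307 concrete, here is a minimal Python sketch of the clustering loop. The function name, the fixed random seed, and the max_iter safeguard are illustrative assumptions; the loop structure itself (random initial centroids, Euclidean assignment, mean update, δ-threshold stop) follows the steps above.

```python
import numpy as np

def cluster_samples(features, k, delta, max_iter=100):
    """Cluster total feature vectors as in S301-S307.

    features: (z, m) array of total feature vectors (m = 409).
    k: second preset number of cluster-like centroid feature vectors.
    delta: distance threshold for stopping.
    """
    rng = np.random.default_rng(0)
    centroids = features[rng.choice(len(features), size=k, replace=False)]  # S301
    labels = np.zeros(len(features), dtype=int)
    for _ in range(max_iter):
        # S302: Euclidean distance from every sample to every centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                                       # S303
        # S304: mean total feature vector of each group becomes the new centroid.
        new_centroids = np.stack([
            features[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        # S305: stop once every centroid moved less than the distance threshold.
        if np.all(np.linalg.norm(new_centroids - centroids, axis=1) < delta):
            return new_centroids, labels                                    # S307
        centroids = new_centroids                                           # S306
    return centroids, labels
```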
The technical scheme of the above embodiments updates the textile picture feature library already existing on the server; at the initial stage, the textile picture feature library needs to be established first. In one embodiment, as shown in fig. 4, the method for updating a textile picture feature library further includes a step of establishing the textile picture feature library, which specifically includes:
s402, obtaining sample images with preset classification numbers;
wherein each sample image is a textile sample image representing a textile type. In practical application, the sample images to be included in the textile picture feature library may be determined according to different textile models, different merchants, and so on. For example, for one model of textile from a certain merchant, one or more sample images may be set: an initial textile sample image is obtained by sampling that merchant's textile of that model, and is then preprocessed to obtain the sample image.
The initial textile sample images can be acquired by manual shooting and then recorded in the server. In this step, the server preprocesses the collected initial textile sample images to obtain sample images with a preset classification number.
S404, acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image;
the sample image may be an image with a preset resolution, the image is composed of a pixel matrix with the preset resolution, each pixel in the pixel matrix includes a plurality of channel values of a certain color space, and the color space may be, for example, an RGB color space (including three channel values of red R, green G, and blue B), a YUV color space (including three channel values of luminance Y, chrominance U, and chrominance V, which collectively represent saturation and chrominance), an HSV color space (including three channel values of hue H, saturation S, and luminance V), and the like.
The gray value can be calculated according to the channel value of the color space.
The HSV color space value may be obtained in different manners. If the sample image is represented by an HSV color space, an HSV color space value may be directly read from the sample image, and if the sample image is represented by another color space, an HSV color space value may be obtained through color space conversion.
The coordinate values represent information of relative positions of pixels in the image, and may have different specific expressions, for example, coordinates of a certain pixel point p in the sample image may be recorded as (x, y), where x is a column number and y is a row number.
In this step, the server may process the acquired sample image, and calculate and acquire a gray value, an HSV color space value, and a coordinate value of each pixel in the sample image according to a channel value of a color space of the sample image.
S406, calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value;
in this step, the color global feature vector of the sample image may be calculated according to the HSV color space value of the pixel in the sample image, the color local feature vector of the sample image may be calculated according to the HSV color space value and the coordinate value of the pixel in the sample image, and the gray texture local feature vector of the sample image may be calculated according to the gray value of the pixel in the sample image.
S408, clustering and dividing the sample images into groups of the first preset number of cluster centroid feature vectors according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of the sample images;
the first preset number of the cluster-like centroid feature vectors can be set according to actual conditions, and under the condition that the total number of the sample images is determined, the smaller the first preset number is set, the smaller the number of groups is, the more sample images participating in matching in a single group are in the matching of the textile images, and the later-stage matching computation amount is increased; on the contrary, the more the first preset number sets the more the grouping number is, when the textile pictures are matched, the more operation time is needed to be spent on determining the grouping to which the textile pictures belong, and the less the sample images participating in matching in a single grouping are, the higher the probability of failure in matching is, so that the grouping number of the cluster-like centroid feature vectors needs to be set appropriately.
In this step, the server may determine a first preset number of groups of suitable cluster-like centroid feature vectors according to the total number of sample images. And then clustering and dividing the sample images into groups of the cluster centroid feature vectors of the first preset number according to the Euclidean distance between the color global feature vector, the color local feature vector and the gray texture local feature vector of each sample image and the cluster centroid feature vectors of each cluster.
And S410, generating a textile picture feature library according to the cluster-like centroid feature vectors and the sample images in the groups of the cluster-like centroid feature vectors.
With the sample images cluster-divided in step S408 into the groups of the first preset number of cluster-like centroid feature vectors, in this step the server may generate a textile picture feature library and record in it the cluster-like centroid feature vectors together with the relevant information of the sample images in their groups.
In this embodiment, when the textile picture feature library is established, the color global feature vector, the color local feature vector, and the gray texture local feature vector of each sample image are collected; feature extraction combining global and local features identifies similar textile pictures to the greatest extent and improves the success rate of textile picture matching. Reducing the dimensions of the global and local features reduces the amount of computation. Clustering groups similar sample images together, so the number of samples participating in matching a textile picture can be reduced, which lowers the computation while maintaining matching accuracy and improves the matching efficiency of textile pictures.
In one embodiment, the step S402 of obtaining sample images with a preset classification number includes: acquiring initial textile sample images with a preset classification number; segmenting a textile sample image unit with a preset resolution from each initial textile sample image, and adjusting the contrast of the textile sample image unit to a preset contrast value to obtain an adjusted textile sample image unit; and taking the adjusted textile sample image unit as the sample image.
As an example, after receiving the initial textile sample picture sent by the terminal, the server may process the initial textile sample picture into an original textile sample image with a preset resolution according to matching requirements.
For example, as shown in fig. 5, taking an example of the preset resolution being 512 × 512, the initial textile sample image acquired by the terminal needs to be no less than 512 × 512 pixels, and after receiving the initial textile sample image acquired by the terminal, the server uniformly adjusts the initial textile sample image to 512 × 512 pixels. In particular, if the resolution of the acquired initial textile sample image is larger than 512 × 512, a sub-image of 512 × 512 pixels may be cropped as a segmented textile sample image unit starting from a certain area of the initial textile sample image, e.g. the upper left corner.
The server may then adjust the brightness and contrast of the textile sample image elements such that the image parameters of the adjusted textile sample image elements are consistent for use in matching the textile image.
Contrast transformation, also called contrast enhancement (stretching) or histogram transformation, is an image processing method that improves image quality by changing the brightness values of image elements and thereby the contrast of the image. Taking a textile sample image unit in the RGB color space as an example, where the three channels R, G and B each take values in [0, 255], the color value of each pixel of the adjusted textile sample image unit can be calculated with the following formula:
r' = r + (r − 127) × 2, g' = g + (g − 127) × 2, b' = b + (b − 127) × 2;
wherein r', g' and b' are the adjusted r, g and b values. For any of r', g' and b', if the result of the above formula falls outside the range [0, 255], it is clamped: a result greater than 255 is set to 255, and a result less than 0 is set to 0.
After adjusting the color values of the pixels of the textile sample image unit, the server may take the adjusted textile sample image unit as a sample image.
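As a concrete illustration of the segmentation and contrast adjustment above, here is a short Python sketch. The NumPy representation and the upper-left crop origin follow the example in the text; the function name is illustrative.

```python
import numpy as np

def preprocess_sample(image_rgb):
    """Crop a 512x512 unit and apply the stretch r' = r + (r - 127) * 2 per channel.

    image_rgb: uint8 array of shape (H, W, 3) with H, W >= 512.
    """
    unit = image_rgb[:512, :512].astype(np.int32)   # segment from the upper left corner
    stretched = unit + (unit - 127) * 2             # contrast stretch on r, g, b alike
    return np.clip(stretched, 0, 255).astype(np.uint8)  # clamp results to [0, 255]
```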
In one embodiment, calculating the color global feature vector of the sample image in S406 includes:
respectively calculating, among the HSV color space values of the pixels in the sample image, the mean of the H channel values, the mean of the S channel values, the mean of the V channel values, the variance of the H channel values, the variance of the S channel values, the variance of the V channel values, the third moment of the H channel values, the third moment of the S channel values and the third moment of the V channel values; and generating the color global feature vector of the sample image according to these nine values.
Illustratively, the mean μh of the H channel values, the mean μs of the S channel values, the mean μv of the V channel values, the variance σh of the H channel values, the variance σs of the S channel values, the variance σv of the V channel values, the third moment τh of the H channel values, the third moment τs of the S channel values and the third moment τv of the V channel values can each be calculated by the following formulas (shown for the H channel; the S and V channels are analogous):
μh = (1/n) Σi=1..n hi;
σh = ( (1/n) Σi=1..n (hi − μh)² )^(1/2);
τh = ( (1/n) Σi=1..n (hi − μh)³ )^(1/3);
wherein hi, si and vi respectively represent the H channel value, S channel value and V channel value of the ith pixel in the sample image, and n represents the total number of pixels in the sample image; if the resolution of the sample image is 512 × 512, then n = 262144. After the calculation, the 9 global feature values μh, μs, μv, σh, σs, σv, τh, τs and τv of the image are obtained, and a color global feature vector global containing these 9 feature values is generated:
global = (μh, μs, μv, σh, σs, σv, τh, τs, τv).
in this embodiment, the mean, the variance, and the third moment of each channel value of the sample image in the HSV color space value are respectively extracted to generate a color global feature vector, which may reflect the global color feature of the entire sample image.
In one embodiment, calculating the color local feature vector of the sample image in S406 includes:
reducing the dimensions of an H channel value, an S channel value and a V channel value in the HSV color space value of each pixel in the sample image to generate a single-dimensional color value; the one-dimensional color values are quantized into color level values of a third preset number of color levels; quantizing the pixel distance value between each pixel and other pixels in the sample image into a pixel distance grade value of a fourth preset number of distance grades according to the coordinate value of each pixel in the sample image; generating a two-dimensional matrix containing elements of which the third preset number is multiplied by the fourth preset number according to the color grade value and the pixel distance grade value of each pixel in the sample image; the value of each element in the two-dimensional matrix represents the number of color grade values and pixel distance grade values under one color grade and distance grade; and expanding the two-dimensional matrix to obtain the color local characteristic vector of the sample image.
As an example, the HSV color space values may first be quantized into several levels to reduce the size of the feature space; the number of levels for each channel value can be determined according to actual requirements. Taking quantization of the H channel value into 8 levels, the S channel value into 3 levels, and the V channel value into 4 levels as an example: the H channel value h before quantization is mapped by a piecewise threshold function to a quantized value hnew ∈ {0, 1, …, 7}, the S channel value s before quantization to snew ∈ {0, 1, 2}, and the V channel value v before quantization to vnew ∈ {0, 1, 2, 3}.
The quantized H, S and V channel values are then reduced in dimension and synthesized into a single-dimensional color value hsv, as shown in the following formula:
hsv = 12·hnew + 4·snew + vnew;
wherein hsv ∈ {0, 1, …, 95}.
The distance between quantized pixel points is divided into 4 levels [0, 1, 2, 3], generating a feature vector Local1 of size 4 × 96, i.e. containing 384 features. The calculation method is as follows:
taking a sample image resolution of 512 × 512 as an example, the pixel coordinates of the 512 × 512 pixels can be quantized into 4 levels, which facilitates the distance calculation; for example, if the coordinates of a pixel p are (x, y) with x, y ∈ [0, 511], the new coordinates are (x', y') = (⌊x/128⌋, ⌊y/128⌋).
after quantizing the image HSV color space to a single channel of 96 levels and the pixel coordinates to 4 levels, the calculation of the color local feature vector local1 can be implemented, for example, by the following pseudo-code:
the size of the two-dimensional array Local1 is 4 multiplied by 96, and all array elements are initialized to be 0;
Figure BDA0001924354940000151
after calculation, the calculated Local1 array is expanded to form a color Local feature vector Local1 containing 384 feature values, wherein the 384 feature values actually represent the color autocorrelation of the sample image.
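Because the quantization functions and the counting pseudo-code appear only as images in the original text, the following Python sketch is a reconstruction under stated assumptions: uniform quantization thresholds and a Chebyshev distance between quantized block coordinates are assumed; only the level counts (8/3/4), the formula hsv = 12·hnew + 4·snew + vnew, the 4-level coordinate quantization, and the 4 × 96 shape of Local1 come from the text.

```python
import numpy as np

def color_local_feature(hsv):
    """384-value color autocorrelation feature Local1 (4 distance levels x 96 colors).

    hsv: integer array of shape (512, 512, 3) with channel values in [0, 255].
    """
    # Quantize H to 8, S to 3, V to 4 levels (uniform thresholds assumed here;
    # the patent uses piecewise threshold functions whose cut points are not shown).
    h = (hsv[..., 0].astype(np.int64) * 8) // 256
    s = (hsv[..., 1].astype(np.int64) * 3) // 256
    v = (hsv[..., 2].astype(np.int64) * 4) // 256
    color = 12 * h + 4 * s + v                   # single-dimensional value in [0, 95]

    # Quantize pixel coordinates into 4 levels: x' = x // 128, y' = y // 128,
    # and build one 96-bin color histogram per 128x128 block.
    hist = np.zeros((4, 4, 96), dtype=np.int64)
    for by in range(4):
        for bx in range(4):
            block = color[by * 128:(by + 1) * 128, bx * 128:(bx + 1) * 128]
            hist[by, bx] = np.bincount(block.ravel(), minlength=96)

    # Count same-color pixel pairs per quantized (Chebyshev) block distance d.
    local1 = np.zeros((4, 96), dtype=np.int64)
    for y1 in range(4):
        for x1 in range(4):
            for y2 in range(4):
                for x2 in range(4):
                    d = max(abs(y1 - y2), abs(x1 - x2))   # distance level 0..3
                    local1[d] += hist[y1, x1] * hist[y2, x2]
    return local1.reshape(-1)                    # 384 feature values
```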
In one embodiment, calculating the gray texture local feature vector of the sample image in S406 includes:
acquiring a fifth preset number of different convolution template matrixes; performing convolution calculation on each convolution template matrix and a gray value matrix formed by the gray values of all pixels in the sample image to obtain a convolution value of each pixel; generating a convolution image corresponding to each convolution template matrix according to the convolution value of each pixel in the sample image calculated by each convolution template matrix; respectively calculating the mean value and the variance of the gray values of the pixels in the convolution image aiming at each convolution image; and generating the gray texture local feature vector of the sample image according to the mean value and the variance of the gray values of the convolution images of the fifth preset number.
As an example, the gray texture local feature vector may be obtained by calculating a gray value of the sample image, and specifically includes the following steps:
Firstly, a convolution operation is performed on the image using the 8 preset convolution template matrices (the 8 convolution template matrices are fixed weight matrices preset in the server).
For each convolution template matrix, a convolution operation is performed between the gray values of the sample image and the convolution template matrix to generate a convolution image. The convolution operation can be regarded as a weighted summation process: each pixel in an image region of the sample image is multiplied by the corresponding element of the convolution kernel (weight matrix), and the sum of all the products is used as the new value of the region's center pixel. The size of the generated convolution image corresponds to the resolution of the sample image; e.g., if the sample image resolution is 512 × 512, the resolution of the generated convolution image is also 512 × 512. Each sample image thus generates 8 convolution images AM1, AM2, AM3, AM4, AM5, AM6, AM7, AM8.
For each convolution image, the mean and variance of the convolution image as a whole are calculated:
μg = (1/n) Σi=1..n grayi,  σg = ( (1/n) Σi=1..n (grayi − μg)² )^(1/2);
wherein n represents the total number of pixels in the convolution image and grayi represents the gray value of the ith pixel in the convolution image. Finally, the 8 convolution images of each sample image generate 16 feature values μg1, σg1, μg2, σg2, μg3, σg3, μg4, σg4, μg5, σg5, μg6, σg6, μg7, σg7, μg8, σg8, from which the gray texture local feature vector local2 of the sample image is generated:
local2 = (μg1, σg1, μg2, σg2, μg3, σg3, μg4, σg4, μg5, σg5, μg6, σg6, μg7, σg7, μg8, σg8).
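A short Python sketch of this texture feature follows. Because the 8 template matrices are reproduced only as images here, they are passed in as a parameter; the use of scipy.ndimage.convolve and its border mode are illustrative choices.

```python
import numpy as np
from scipy.ndimage import convolve

def gray_texture_local_feature(gray, templates):
    """16-value feature local2: mean and deviation of each of the 8 convolution images.

    gray: float array of shape (512, 512) holding the gray values.
    templates: sequence of 8 small 2-D weight matrices (preset in the server).
    """
    values = []
    for t in templates:
        conv = convolve(gray, np.asarray(t, dtype=float), mode="nearest")  # one AMk
        values.append(conv.mean())   # mu_gk
        values.append(conv.std())    # sigma_gk
    return np.array(values)          # local2: 16 feature values
```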
by the calculation methods of the three embodiments, a sample image feature library L ═ L { L ═ L including z sample images can be obtained1、L2、…、Li、…、LzH, any one of the sample images LiWith three feature vectors global, local1 and local2, the sample image LiCan be noted as
Figure BDA0001924354940000164
Wherein the content of the first and second substances,
Figure BDA0001924354940000165
representative sample image LiFor example, global includes 9 eigenvalues, local1 includes 384 eigenvalues, and local2 includes 16 eigenvalues, then m is 409.
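Putting the three parts together, a short sketch of assembling the total feature vector (the function name is illustrative):

```python
import numpy as np

def total_feature_vector(global_vec, local1, local2):
    """Concatenate global (9) + local1 (384) + local2 (16) into Li, m = 409."""
    vec = np.concatenate([global_vec, local1, local2]).astype(float)
    assert vec.shape == (409,)
    return vec
```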
In an embodiment, as shown in fig. 6, the method for updating a textile picture feature library of this embodiment further includes a step of matching a textile picture, which specifically includes:
s602, obtaining a textile picture to be matched;
in the step, the server acquires a textile picture to be matched according to a matching request sent by the terminal;
specifically, the terminal can send an original textile picture to be matched to the server, and the server processes the received original textile picture to obtain a textile picture for matching. The original textile picture can be obtained by shooting through a terminal camera or directly reading from a terminal memory such as an album.
In this step, when the server processes the received original textile picture, the resolution, contrast, and the like of the textile picture to be matched, which is obtained by the processing, need to be consistent with the resolution, contrast, and the like of the sample image in the textile picture feature library.
S604, acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the textile picture;
in this step, the server may process the acquired textile picture by using the same method as the method of acquiring the gray value, the HSV color space value, and the coordinate value of the pixel in the sample image in S404, and calculate and acquire the gray value, the HSV color space value, and the coordinate value of each pixel in the textile picture according to the channel value of the color space of the textile picture.
S606, calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the textile picture according to the gray value, the HSV color space value and the coordinate value;
in this step, the server may calculate the color global feature vector, the color local feature vector, and the gray texture local feature vector of the textile picture according to the gray value, the HSV color space value, and the coordinate value of the pixel in the textile picture by the same method as that of calculating the color global feature vector, the color local feature vector, and the gray texture local feature vector of the sample picture according to the gray value, the HSV color space value, and the coordinate value of the pixel in the sample picture in S406.
S608, determining the grouping of the cluster centroid feature vectors to which the textile pictures belong according to the first Euclidean distance between the color global feature vectors of the textile pictures and the color global feature vectors of the cluster centroid feature vectors in the textile picture feature library;
the server is pre-stored with an established textile picture feature library, and the textile picture feature library is pre-stored with a plurality of cluster-like centroid feature vectors and sample images classified into groups of the cluster-like centroid feature vectors. The Euclidean distance between the total feature vector of each sample image under a group and the cluster-like centroid feature vector of the group is smaller than a distance threshold value. Each cluster-like centroid feature vector comprises a group of color global feature vectors, color local feature vectors and gray texture local feature vectors.
In this step, the server may calculate a first euclidean distance between the currently acquired color global feature vector of the textile picture and the color global feature vector of each cluster centroid feature vector in the textile picture feature library, and select a cluster-like centroid feature vector group having the smallest first euclidean distance from the color global feature vector of the textile picture as a cluster-like centroid feature vector group to which the textile picture belongs.
S610, determining sample images matched with the textile images in the grouping of the cluster-like centroid feature vectors according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors.
Each sample image in the textile picture feature library is associated and recorded with a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image.
In this step, with the group of the cluster-like centroid feature vector to which the textile picture belongs determined in step S608, the server may directly match the textile picture against each sample image in that group, and determine the matching sample image by analyzing the color global feature vector, the color local feature vector and the gray texture local feature vector of the textile picture against those of each sample image in the group.
In this embodiment, when a textile picture is matched, its color global feature vector, color local feature vector and gray texture local feature vector are calculated from the gray values, HSV color space values and coordinate values of its pixels. The group of the textile picture in the textile picture feature library is determined from the color global feature vector, and the picture is then matched against the sample images within that group to obtain the matching sample image. This effectively reduces the number of sample images participating in matching, lowers the computation, and improves the accuracy and efficiency of textile picture matching.
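A minimal Python sketch of the two-stage matching in S608-S610 follows. It assumes the library is held as per-group lists of (sample_id, total feature vector) pairs and that the in-group comparison uses the Euclidean distance over the total feature vector; the text only specifies that the three feature vectors are compared, so that distance choice is an assumption.

```python
import numpy as np

def match_picture(pic_global, pic_total, centroids_global, groups):
    """Two-stage matching: nearest centroid by global vector, then nearest sample.

    centroids_global: (k, 9) array of the centroids' color global feature vectors.
    groups: dict mapping group index j to a list of (sample_id, total_vector) pairs.
    """
    # S608: first Euclidean distance to every centroid's color global feature vector.
    j = int(np.linalg.norm(centroids_global - pic_global, axis=1).argmin())
    # S610: compare the textile picture with each sample image in the chosen group.
    best_id, best_dist = None, float("inf")
    for sample_id, sample_total in groups[j]:
        d = float(np.linalg.norm(sample_total - pic_total))
        if d < best_dist:
            best_id, best_dist = sample_id, d
    return best_id, best_dist
```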
It should be understood that although the steps in the flowcharts of fig. 2, 3, 4, and 6 are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in fig. 2, 3, 4, and 6 may include multiple sub-steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of performing the sub-steps or stages is not necessarily sequential, but may be performed alternately or alternatingly with other steps or at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 7, there is provided an updating apparatus 700 for a textile picture feature library, including: a to-be-updated picture acquisition module 702, a new sample image set acquisition module 704, a total feature vector acquisition module 706, a new sample image cluster dividing module 708, and a textile picture feature library updating module 710, wherein:
a to-be-updated picture acquisition module 702, used for acquiring textile pictures that failed to match;
a new sample image set acquisition module 704, used for mixing each textile picture that failed to match, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set if the number of textile pictures that failed to match exceeds the update number threshold;
a total feature vector acquisition module 706, used for acquiring the total feature vector of each sample image in the new sample image set; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
a new sample image cluster dividing module 708, used for clustering the sample images in the new sample image set, according to the total feature vectors, into the groups of a second preset number of new cluster-like centroid feature vectors;
and a textile picture feature library updating module 710, used for generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in its group.
In one embodiment, the new sample image cluster dividing module 708 is configured to:
s1, selecting the total feature vectors of a second preset number of sample images from the new sample image set as initial cluster-like centroid feature vectors;
s2, calculating Euclidean distances between the total characteristic vectors of all sample images in the new sample image set and the initial cluster-like centroid characteristic vectors;
s3, dividing each sample image in the new sample image set into a group of initial cluster centroid feature vectors with the minimum Euclidean distance to the sample image;
s4, calculating the average value of the total feature vectors of the sample images in the grouping of the centroid feature vectors of each initial cluster;
s5, if the Euclidean distance between the total feature vector mean value of the sample images in the grouping of all the initial cluster centroid feature vectors and the initial cluster centroid feature vector is smaller than a distance threshold, executing a step S7;
s6, if the Euclidean distance between the mean value of the total feature vectors in the groups of any initial cluster centroid feature vectors and the initial cluster centroid feature vectors is not smaller than a distance threshold, respectively taking the mean value of the total feature vectors in the groups of each initial cluster centroid feature vector as an updated initial cluster centroid feature vector, repeating the steps S2, S3 and S4 until the Euclidean distance between the mean value of the total feature vectors of the sample images in the groups of all initial cluster centroid feature vectors and the initial cluster centroid feature vectors is smaller than the distance threshold, and executing the step S7;
and S7, obtaining the current initial cluster centroid feature vector and the sample image in the grouping of the initial cluster centroid feature vector, and taking the sample image as the new cluster centroid feature vector of the cluster division and the sample image in the grouping of the new cluster centroid feature vector.
In one embodiment, the apparatus 700 for updating a textile picture feature library further includes a building module of the textile picture feature library, where the building module of the textile picture feature library includes:
the sample image acquisition module is used for acquiring sample images with preset classification numbers;
the system comprises a sample pixel value acquisition module, a pixel value acquisition module and a pixel value acquisition module, wherein the sample pixel value acquisition module is used for acquiring a gray value, an HSV color space value and a coordinate value of a pixel in a sample image;
the sample characteristic vector calculation module is used for calculating a color global characteristic vector, a color local characteristic vector and a gray texture local characteristic vector of the sample image according to the gray value, the HSV color space value and the coordinate value;
the sample image clustering and dividing module is used for clustering and dividing the sample images into groups of cluster centroid feature vectors with a first preset number according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images;
and the textile picture feature library generating module is used for generating a textile picture feature library according to the cluster-like centroid feature vectors and the sample images in the groups of the cluster-like centroid feature vectors.
In one embodiment, the sample feature vector calculation module is to: respectively calculating the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value in the HSV color space values of the pixels in the sample image; and generating a color global feature vector of the sample image according to the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value.
In one embodiment, the sample feature vector calculation module is configured to: reduce the dimensions of the H channel value, the S channel value and the V channel value in the HSV color space value of each pixel in the sample image to generate a single-dimensional color value; quantize the single-dimensional color values into color level values of a third preset number of color levels; quantize, according to the coordinate value of each pixel in the sample image, the pixel distance values between each pixel and the other pixels into pixel distance level values of a fourth preset number of distance levels; generate, from the color level value and pixel distance level values of each pixel, a two-dimensional matrix containing the third preset number multiplied by the fourth preset number of elements, where the value of each element counts the pixel pairs whose color level value and pixel distance level value fall under that color level and distance level; and expand the two-dimensional matrix to obtain the color local feature vector of the sample image.
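This color local feature behaves like a color/distance co-occurrence histogram (correlogram-like). A naive sketch follows; the 4x2x2 quantization used for the dimension reduction, the level counts and the pixel subsampling are all assumptions, since the embodiment leaves them open.

```python
import numpy as np

def color_local_feature(hsv01, n_dist=8, max_pixels=300, seed=0):
    """hsv01: float array (height, width, 3) with H, S, V scaled to [0, 1].
    Counts pixel pairs by (color level of one pixel, quantized pair distance)."""
    hgt, wid, _ = hsv01.shape
    # Assumed dimension reduction: quantize H into 4, S into 2, V into 2 levels,
    # then combine into a single-dimensional color value with 16 color levels.
    hq = np.minimum((hsv01[..., 0] * 4).astype(int), 3)
    sq = np.minimum((hsv01[..., 1] * 2).astype(int), 1)
    vq = np.minimum((hsv01[..., 2] * 2).astype(int), 1)
    color = (hq * 4 + sq * 2 + vq).ravel()
    coords = np.stack(np.mgrid[0:hgt, 0:wid], axis=-1).reshape(-1, 2).astype(float)
    # Subsample pixels: the full pairwise computation is O(n^2)
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(coords), size=min(max_pixels, len(coords)), replace=False)
    coords, color = coords[idx], color[idx]
    dmax = np.hypot(hgt - 1, wid - 1)
    mat = np.zeros((16, n_dist))
    for i in range(len(coords)):
        d = np.linalg.norm(coords - coords[i], axis=1)        # distances to other pixels
        dlvl = np.minimum((d / dmax * n_dist).astype(int), n_dist - 1)
        np.add.at(mat[color[i]], np.delete(dlvl, i), 1)       # exclude the pixel itself
    return mat.ravel()  # expanded two-dimensional matrix = color local feature vector
```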
In one embodiment, the sample feature vector calculation module is configured to: acquire a fifth preset number of different convolution template matrixes; perform convolution between each convolution template matrix and the gray value matrix formed by the gray values of the pixels in the sample image to obtain a convolution value for each pixel; generate, for each convolution template matrix, a convolution image from the convolution values it yields for the pixels of the sample image; calculate, for each convolution image, the mean and variance of its pixel gray values; and generate the gray texture local feature vector of the sample image from the means and variances of the fifth preset number of convolution images.
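A hedged sketch of this texture feature follows; the five 3x3 templates below (edge and Laplacian-style masks) are assumptions, since the embodiment does not fix the convolution template matrixes, and scipy is used for the convolution.

```python
import numpy as np
from scipy.signal import convolve2d

# Assumed fifth preset number (here 5) of convolution template matrixes
TEMPLATES = [
    np.array([[-1, -1, -1], [0, 0, 0], [1, 1, 1]], float),   # horizontal edges
    np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], float),   # vertical edges
    np.array([[0, 1, 1], [-1, 0, 1], [-1, -1, 0]], float),   # diagonal edges /
    np.array([[1, 1, 0], [1, 0, -1], [0, -1, -1]], float),   # diagonal edges \
    np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], float),  # Laplacian-style
]

def gray_texture_feature(gray):
    """Convolve the gray value matrix with each template; the mean and
    variance of every convolution image form a 10-dimensional vector."""
    gray = gray.astype(np.float64)
    feats = []
    for tpl in TEMPLATES:
        conv = convolve2d(gray, tpl, mode="same", boundary="symm")
        feats.extend([conv.mean(), conv.var()])
    return np.array(feats)
```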
In one embodiment, the apparatus 700 for updating a feature library of a textile picture further comprises a textile picture matching module, and the textile picture matching module comprises:
a textile picture acquisition module, configured to acquire a textile picture to be matched;

a picture pixel value acquisition module, configured to acquire the gray value, HSV color space value and coordinate value of each pixel in the textile picture;

a picture feature vector calculation module, configured to calculate the color global feature vector, color local feature vector and gray texture local feature vector of the textile picture according to the gray values, HSV color space values and coordinate values;

a cluster-like centroid grouping determination module, configured to determine the group of the cluster-like centroid feature vector to which the textile picture belongs according to a first Euclidean distance between the color global feature vector of the textile picture and the color global feature vector of each cluster-like centroid feature vector in the textile picture feature library;

and a picture matching module, configured to determine, within the group of that cluster-like centroid feature vector, the sample image matched with the textile picture according to the color global feature vector, the color local feature vector and the gray texture local feature vector.
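A hedged sketch of the two-stage matching follows, reusing the library records from the earlier build sketch. It assumes the color global feature occupies the first global_dim components of every total feature vector (true under the hypothetical concatenation used above) and that the second stage compares full total feature vectors, neither of which the embodiments mandate.

```python
import numpy as np

def match_textile_picture(library, query_total, global_dim=9):
    """Stage 1: pick the group whose centroid is nearest by the first
    Euclidean distance over the color global part. Stage 2: return the
    sample in that group nearest to the full total feature vector."""
    q_global = query_total[:global_dim]
    best = min(library,
               key=lambda rec: np.linalg.norm(rec["centroid"][:global_dim] - q_global))
    i = int(np.argmin(np.linalg.norm(best["vectors"] - query_total, axis=1)))
    return best["samples"][i]
```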
For specific limitations of the updating apparatus for the textile picture feature library, reference may be made to the limitations of the updating method above, and details are not repeated here. The modules in the updating apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in hardware form in, or independent of, a processor of the computer device, or stored in software form in a memory of the computer device, so that the processor can invoke them and execute the operations corresponding to the modules.
In an embodiment, a computer device is provided, comprising a memory and a processor, the memory storing a computer program; when executing the computer program, the processor implements the steps of the method for updating a textile picture feature library of any of the above embodiments.

In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored; the computer program, when executed by a processor, implements the steps of the method for updating a textile picture feature library of any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing related hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above embodiments express only several implementations of the present application, and while their description is specific and detailed, it is not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the scope of protection of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method for updating a textile picture feature library, the method comprising:
acquiring the textile pictures that failed to be matched and the number of the textile pictures that failed to be matched;

if the number of the textile pictures that failed to be matched exceeds the update number threshold, mixing each textile picture that failed to be matched, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set;
determining a second preset number of groups of new cluster-like centroid feature vectors according to the total number of the sample images in the new sample image set, and acquiring the total feature vectors of all the sample images in the new sample image set; the total feature vector is generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;
clustering and dividing the sample images in the new sample image set into groups of new cluster-like centroid feature vectors of a second preset number according to Euclidean distances between the total feature vectors of all the sample images in the new sample image set and all the new cluster-like centroid feature vectors;
and generating an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in the group of each new cluster-like centroid feature vector.
2. The method of claim 1, wherein clustering the sample images in the new sample image set into groups of a second preset number of new cluster-like centroid feature vectors according to the Euclidean distances between the total feature vectors of the sample images in the new sample image set and each new cluster-like centroid feature vector comprises:

S1, selecting the total feature vectors of a second preset number of sample images from the new sample image set as initial cluster-like centroid feature vectors;

S2, calculating the Euclidean distances between the total feature vectors of the sample images in the new sample image set and each initial cluster-like centroid feature vector;

S3, dividing each sample image in the new sample image set into the group of the initial cluster-like centroid feature vector with the minimum Euclidean distance to that sample image;

S4, calculating the mean of the total feature vectors of the sample images in the group of each initial cluster-like centroid feature vector;

S5, if, for every initial cluster-like centroid feature vector, the Euclidean distance between the mean of the total feature vectors of the sample images in its group and that centroid feature vector is smaller than the distance threshold, executing step S7;

S6, if, for any initial cluster-like centroid feature vector, the Euclidean distance between the mean of the total feature vectors in its group and that centroid feature vector is not smaller than the distance threshold, taking the mean of the total feature vectors in the group of each initial cluster-like centroid feature vector as the updated initial cluster-like centroid feature vector, and repeating steps S2, S3 and S4 until, for every initial cluster-like centroid feature vector, the Euclidean distance between the mean of the total feature vectors of the sample images in its group and that centroid feature vector is smaller than the distance threshold, then executing step S7;

and S7, taking the current initial cluster-like centroid feature vectors and the sample images in their groups as the new cluster-like centroid feature vectors of the cluster division and the sample images in the groups of those new cluster-like centroid feature vectors.
3. The method of claim 1, further comprising:
obtaining sample images of a preset number of classifications;
acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the sample image;
calculating a color global feature vector, a color local feature vector and a gray texture local feature vector of the sample image according to the gray value, the HSV color space value and the coordinate value;
clustering and dividing the sample images into groups of cluster centroid feature vectors with a first preset number according to the color global feature vectors, the color local feature vectors and the gray texture local feature vectors of all the sample images;
and generating a textile picture feature library according to the cluster centroid feature vectors and the sample images in the groups of the cluster centroid feature vectors.
4. The method of claim 3, wherein the computing the color global feature vector for the sample image comprises:
respectively calculating the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value in the HSV color space values of the pixels in the sample image;
and generating a color global feature vector of the sample image according to the mean value of the H channel value, the mean value of the S channel value, the mean value of the V channel value, the variance of the H channel value, the variance of the S channel value, the variance of the V channel value, the third moment of the H channel value, the third moment of the S channel value and the third moment of the V channel value.
5. The method of claim 3, wherein the computing the color local feature vector for the sample image comprises:
reducing the dimensions of the H channel value, the S channel value and the V channel value in the HSV color space value of each pixel in the sample image to generate a single-dimensional color value, and quantizing the single-dimensional color values into color level values of a third preset number of color levels;

quantizing, according to the coordinate value of each pixel in the sample image, the pixel distance values between each pixel and the other pixels into pixel distance level values of a fourth preset number of distance levels;

generating, according to the color level value and the pixel distance level values of each pixel in the sample image, a two-dimensional matrix containing the third preset number multiplied by the fourth preset number of elements, wherein the value of each element counts the pixel pairs whose color level value and pixel distance level value fall under that color level and distance level;

and expanding the two-dimensional matrix to obtain the color local feature vector of the sample image.
6. The method of claim 3, wherein computing the gray-scale texture local feature vector for the sample image comprises:
acquiring a fifth preset number of different convolution template matrixes;
performing convolution calculation on the convolution template matrixes and a gray value matrix formed by the gray values of the pixels in the sample image respectively to obtain convolution values of the pixels;
generating a convolution image corresponding to each convolution template matrix according to the convolution value of each pixel in the sample image calculated by each convolution template matrix;
respectively calculating the mean value and the variance of the gray values of the pixels in the convolution image aiming at each convolution image;
and generating a gray texture local feature vector of the sample image according to the mean value and the variance of the gray values of the convolution images of the fifth preset number.
7. The method of any one of claims 1 to 6, further comprising:
acquiring a textile picture to be matched;
acquiring a gray value, an HSV color space value and a coordinate value of a pixel in the textile picture;
calculating the color global feature vector, the color local feature vector and the gray texture local feature vector of the textile picture according to the gray values, the HSV color space values and the coordinate values;
determining the grouping of cluster centroid feature vectors to which the textile pictures belong according to a first Euclidean distance between the color global feature vectors of the textile pictures and the color global feature vectors of the cluster centroid feature vectors in a textile picture feature library;
and determining a sample image matched with the textile picture in the grouping of the cluster-like centroid feature vectors according to the color global feature vector, the color local feature vector and the gray texture local feature vector.
8. An apparatus for updating a library of textile picture features, the apparatus comprising:
a to-be-updated picture acquisition module, configured to acquire the textile pictures that failed to be matched and the number of the textile pictures that failed to be matched;

a new sample image set acquisition module, configured to mix, if the number of the textile pictures that failed to be matched exceeds the update number threshold, each textile picture that failed to be matched, as a sample image, with the sample images in the current textile picture feature library to obtain a new sample image set;

a total feature vector acquisition module, configured to determine a second preset number of groups of new cluster-like centroid feature vectors according to the total number of sample images in the new sample image set, and to acquire the total feature vector of each sample image in the new sample image set, the total feature vector being generated according to the color global feature vector, the color local feature vector and the gray texture local feature vector;

a new sample image cluster division module, configured to cluster the sample images in the new sample image set into groups of the second preset number of new cluster-like centroid feature vectors according to the Euclidean distances between the total feature vectors of the sample images in the new sample image set and each new cluster-like centroid feature vector;

and a textile picture feature library updating module, configured to generate an updated textile picture feature library according to each new cluster-like centroid feature vector and the sample images in the group of each new cluster-like centroid feature vector.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN201811609396.4A 2018-12-27 2018-12-27 Updating method and device of textile picture feature library Active CN109784379B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811609396.4A CN109784379B (en) 2018-12-27 2018-12-27 Updating method and device of textile picture feature library

Publications (2)

Publication Number Publication Date
CN109784379A CN109784379A (en) 2019-05-21
CN109784379B true CN109784379B (en) 2021-03-30

Family

ID=66497802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811609396.4A Active CN109784379B (en) 2018-12-27 2018-12-27 Updating method and device of textile picture feature library

Country Status (1)

Country Link
CN (1) CN109784379B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113065447A (en) * 2021-03-29 2021-07-02 南京掌控网络科技有限公司 Method and equipment for automatically identifying commodities in image set
CN116955669B (en) * 2023-09-19 2023-12-22 江苏洁瑞雅纺织品有限公司 Updating system of textile picture feature library

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101551823A (en) * 2009-04-20 2009-10-07 浙江师范大学 Comprehensive multi-feature image retrieval method
CN106022063A (en) * 2016-05-27 2016-10-12 广东欧珀移动通信有限公司 Unlocking method and mobile terminal
CN103430179B (en) * 2011-02-07 2017-04-05 英特尔公司 Add method, system and the computer-readable recording medium of new images and its relevant information in image data base
CN107766563A (en) * 2017-11-07 2018-03-06 广东欧珀移动通信有限公司 Method, apparatus, storage medium and the electronic equipment updated the data

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6138162B2 (en) * 2012-02-06 2017-05-31 インサイテック・リミテッド Reference-based motion tracking during non-invasive therapy
US9720934B1 (en) * 2014-03-13 2017-08-01 A9.Com, Inc. Object recognition of feature-sparse or texture-limited subject matter
CN108124478A (en) * 2017-12-05 2018-06-05 深圳前海达闼云端智能科技有限公司 Picture searching method and apparatus

Also Published As

Publication number Publication date
CN109784379A (en) 2019-05-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant