CN110942081B - Image processing method, device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN110942081B
CN110942081B (application CN201811120898.0A)
Authority
CN
China
Prior art keywords
image set
feature vector
type
images
vector
Prior art date
Legal status: Active
Application number
CN201811120898.0A
Other languages
Chinese (zh)
Other versions
CN110942081A (en)
Inventor
张修宝
王艳
沈海峰
Current Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Original Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Didi Infinity Technology and Development Co Ltd
Priority to CN201811120898.0A
Publication of CN110942081A
Application granted
Publication of CN110942081B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides an image processing method and device, an electronic device, and a readable storage medium, belonging to the field of image processing. The method comprises the following steps: acquiring a first type image set; extracting a feature vector from each first type image; performing pairwise similarity calculation on all the feature vectors extracted from the first type image set to obtain a deviation value between every two feature vectors; and cleaning the first type image set according to the deviation values between the feature vectors. Because the scheme obtains a feature vector for each first type image and then uses the feature vectors to perform pairwise similarity calculation, yielding a deviation value for every pair of feature vectors, the first type image set can be cleaned according to those deviation values: images that do not meet the similarity requirement are removed from the set, interference from such image data is avoided, and a good image cleaning effect is achieved.

Description

Image processing method, device, electronic equipment and readable storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to an image processing method, an image processing device, an electronic device, and a readable storage medium.
Background
With the development of science and technology, deep learning neural networks have greatly advanced image recognition, giving the technology very wide application. Neural networks, however, depend heavily on data: a data set that is both large enough and trustworthy is a prerequisite for a neural network to be effective. Training a neural network first requires a large amount of sample data, yet most publicly available image data sets are collected from the web and are of uneven quality. Some images are very blurry, with indistinct features; others are mislabeled. Both are harmful to training, lead to unsatisfactory training results, and ultimately cause inaccurate recognition when images are recognized. At present, images are mostly cleaned manually, but with huge data volumes this is clearly time-consuming and labor-intensive, and the cleaning result is still not ideal.
Disclosure of Invention
An embodiment of the invention aims to provide an image processing method, an image processing device, electronic equipment and a readable storage medium.
In a first aspect, an embodiment of the present invention provides an image processing method, including:
acquiring a first type image set, wherein the first type image set comprises a plurality of first type images;
extracting the feature vector of each first type image;
performing pairwise similarity calculation on all the feature vectors extracted from the first type image set to obtain a deviation value between every two feature vectors; and
cleaning the first type image set according to the deviation values between the feature vectors.
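The four steps above can be sketched end to end. This is a minimal illustration rather than the patent's implementation: it assumes Euclidean distance as the deviation value, NumPy arrays for the feature vectors, and illustrative names throughout.

```python
import numpy as np

def clean_image_set(feature_vectors, first_threshold):
    """Sketch of the claimed cleaning pipeline (illustrative names).

    feature_vectors: (n, d) array, one extracted feature vector per image.
    Returns the indices of the images kept after cleaning.
    """
    n = len(feature_vectors)
    # Pairwise similarity calculation: Euclidean distance as deviation value.
    diff = feature_vectors[:, None, :] - feature_vectors[None, :, :]
    deviation = np.sqrt((diff ** 2).sum(axis=-1))          # (n, n) matrix
    # For each vector, count the other vectors within the first threshold.
    within = (deviation <= first_threshold) & ~np.eye(n, dtype=bool)
    counts = within.sum(axis=1)
    reference = counts.argmax()        # first reference feature vector
    # Clean: keep images whose deviation from the reference is small enough.
    return np.where(deviation[reference] <= first_threshold)[0]
```

For example, given four feature vectors of which three cluster tightly and one is an outlier, the outlier's index is dropped and the cluster is kept.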
Further, cleaning the first type image set according to the deviation value between the feature vectors, including:
counting, for each feature vector, the number of feature vectors whose deviation value from it is smaller than or equal to a first threshold;
determining the feature vector with the largest count as a first reference feature vector of the first type image set;
and cleaning the first type image set according to the first reference feature vector.
Further, cleaning the first type image set according to the first reference feature vector includes:
determining, among the feature vectors of all images in the first type image set, those feature vectors whose deviation value from the first reference feature vector is greater than the first threshold, and removing the images corresponding to the determined feature vectors from the first type image set.
Further, cleaning the first type of image set according to the first reference feature vector includes:
determining, among the feature vectors of all images of the first type image set, those feature vectors that are neither directly similar nor indirectly similar to the first reference feature vector, and removing the images corresponding to the determined feature vectors from the first type image set;
wherein "directly similar to the first reference feature vector" means that the deviation value from the first reference feature vector is smaller than or equal to the first threshold, and "indirectly similar to the first reference feature vector" means that the deviation value from some feature vector that is itself directly or indirectly similar to the first reference feature vector is smaller than or equal to the first threshold.
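The direct/indirect similarity relation described above is the transitive closure of "deviation value ≤ first threshold". Under that reading — an illustrative sketch assuming a precomputed symmetric deviation matrix, not the patent's literal implementation — it can be computed with a breadth-first search:

```python
import numpy as np
from collections import deque

def similar_component(deviation, reference, threshold):
    """Indices directly or indirectly similar to the reference vector.

    deviation: (n, n) symmetric matrix of pairwise deviation values.
    Breadth-first search over the "deviation <= threshold" graph,
    starting from the reference index; any vector left unreachable
    would have its image removed from the set.
    """
    n = deviation.shape[0]
    visited = {reference}
    queue = deque([reference])
    while queue:
        i = queue.popleft()
        for j in range(n):
            if j not in visited and deviation[i, j] <= threshold:
                visited.add(j)   # j is directly similar to a kept vector
                queue.append(j)
    return sorted(visited)
```

In the test below, index 2 is only indirectly similar to index 0 (its direct deviation exceeds the threshold) but is still kept, because it is reachable via index 1.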
Further, cleaning the first type image set according to the deviation value between the feature vectors, including:
if the deviation value between two feature vectors is smaller than or equal to a second threshold, removing the image corresponding to one of the two feature vectors from the first type image set, wherein the second threshold is smaller than the first threshold.
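A sketch of this second-threshold rule, again assuming a precomputed deviation matrix (function and variable names are illustrative): when two feature vectors deviate by at most the second threshold, the images are treated as near-duplicates and one of the pair is dropped.

```python
import numpy as np

def remove_near_duplicates(deviation, second_threshold):
    """Return indices of images kept after dropping near-duplicates.

    deviation: (n, n) symmetric matrix of pairwise deviation values.
    For each pair within the second threshold, the lower-indexed
    image is kept and the other is rejected.
    """
    n = deviation.shape[0]
    removed = set()
    for i in range(n):
        if i in removed:
            continue
        for j in range(i + 1, n):
            if j not in removed and deviation[i, j] <= second_threshold:
                removed.add(j)   # j is a near-duplicate of the kept i
    return [k for k in range(n) if k not in removed]
```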
Further, after cleaning the first type image set according to the deviation value between the feature vectors, the method further includes:
performing similarity calculation on the first reference feature vector and a second reference feature vector of a second type image set to obtain a deviation value between the first reference feature vector and the second reference feature vector;
and if the deviation value between the first reference feature vector and the second reference feature vector is smaller than a third threshold, merging the first type image set and the second type image set into the same image set.
Further, after cleaning the first type image set according to the deviation value between the feature vectors, the method further includes:
calculating a first vector average of the feature vectors of the first type image set, a second vector average of the feature vectors of a second type image set, and the deviation value between the first vector average and the second vector average;
and if the deviation value of the first vector average value and the second vector average value is smaller than a fourth threshold value, merging the first type image set and the second type image set into the same image set.
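The two inter-class merge criteria — comparing the sets' reference feature vectors against a third threshold, and comparing their per-set vector averages against a fourth threshold — can be sketched together. Euclidean distance is assumed as the deviation value, and the names are illustrative:

```python
import numpy as np

def merge_decisions(set1_vectors, set2_vectors, ref1, ref2,
                    third_threshold, fourth_threshold):
    """Evaluate both inter-class merge criteria for two class image sets.

    Returns (by_reference, by_mean): whether the sets would be merged
    based on their reference feature vectors, and based on their
    per-set vector averages, respectively.
    """
    by_reference = np.linalg.norm(np.asarray(ref1, float) -
                                  np.asarray(ref2, float)) < third_threshold
    mean1 = np.mean(set1_vectors, axis=0)   # first vector average
    mean2 = np.mean(set2_vectors, axis=0)   # second vector average
    by_mean = np.linalg.norm(mean1 - mean2) < fourth_threshold
    return bool(by_reference), bool(by_mean)
```

Either criterion alone suffices in the text; computing both simply shows that they are independent tests on the same pair of sets.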
Further, performing a pairwise similarity calculation on all the feature vectors extracted from the first type of image set to obtain a deviation value between every two feature vectors, including:
calculating the Euclidean distance or the cosine of the angle between every two feature vectors to obtain the Euclidean distance or angle cosine between every two feature vectors, wherein the Euclidean distance or the angle cosine is the deviation value.
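Both deviation measures can be computed for all pairs at once. A minimal NumPy sketch (the function name and the `measure` switch are illustrative):

```python
import numpy as np

def pairwise_deviation(vectors, measure="euclidean"):
    """(n, n) matrix of deviation values for all feature-vector pairs.

    measure="euclidean": Euclidean distance between every two vectors.
    measure="cosine":    cosine of the angle between every two vectors.
    """
    v = np.asarray(vectors, dtype=float)
    if measure == "euclidean":
        diff = v[:, None, :] - v[None, :, :]
        return np.sqrt((diff ** 2).sum(axis=-1))
    norms = np.linalg.norm(v, axis=1)
    return (v @ v.T) / np.outer(norms, norms)
```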
Further, before acquiring the first type image set (the first type image set including the plurality of first type images), the method further includes:
acquiring a plurality of images belonging to a first type of image set;
selecting the images containing a human face from the plurality of images as a plurality of original images;
and carrying out standardization processing on the face areas in the plurality of original images to obtain the plurality of first-class images.
Further, performing standardization processing on the face regions in the plurality of original images includes:
if an original image includes a plurality of face images, extracting the face with the largest area in that original image and performing standardization processing on the face with the largest area.
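Picking the largest face is a one-line reduction over detector output. The `(x, y, w, h)` bounding-box format is an assumption for illustration; the face detector itself is outside this sketch:

```python
def largest_face(face_boxes):
    """Return the bounding box of the largest detected face.

    face_boxes: list of (x, y, w, h) tuples from any face detector.
    The face with the largest area w * h is the one that gets
    standardized when an original image contains several faces.
    """
    return max(face_boxes, key=lambda box: box[2] * box[3])
```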
In a second aspect, an embodiment of the present invention provides an image processing apparatus, including:
the image acquisition module is used for acquiring a first type image set, wherein the first type image set comprises a plurality of first type images;
the feature vector extraction module is used for extracting feature vectors of each first type of image;
The similarity calculation module is used for performing pairwise similarity calculation on all the feature vectors extracted from the first type image set to obtain a deviation value between every two feature vectors;
and the cleaning module is used for cleaning the first type image set according to the deviation value between the characteristic vectors.
Further, the cleaning module is specifically configured to count, for each feature vector, the number of feature vectors having a deviation value smaller than or equal to a first threshold; determining the feature vector with the largest number as a first reference feature vector of the first type image set; and cleaning the first type image set according to the first reference feature vector.
Further, the cleaning module is further configured to determine a feature vector, of feature vectors of all images in the first type of image set, having a deviation value with the first reference feature vector greater than the first threshold, and reject an image corresponding to the determined feature vector from the first type of image set.
Further, the cleaning module is further configured to determine a feature vector that is neither directly similar to the first reference feature vector nor indirectly similar to the first reference feature vector, from among feature vectors of all images in the first type image set, and reject an image corresponding to the determined feature vector from the first type image set;
Wherein "directly similar to the first reference feature vector" means that the deviation value from the first reference feature vector is smaller than or equal to the first threshold, and "indirectly similar to the first reference feature vector" means that the deviation value from some feature vector that is itself directly or indirectly similar to the first reference feature vector is smaller than or equal to the first threshold.
Further, the cleaning module is further configured to, if the deviation value between two feature vectors is less than or equal to a second threshold, reject the image corresponding to one of the two feature vectors from the first type image set, where the second threshold is less than the first threshold.
Further, the apparatus further comprises:
the first inter-class cleaning module is used for performing similarity calculation on the first reference feature vector and a second reference feature vector of the second type image set to obtain a deviation value between the two; and if the deviation value between the first reference feature vector and the second reference feature vector is smaller than a third threshold, merging the first type image set and the second type image set into the same image set.
Further, the apparatus further comprises:
The second inter-class cleaning module is used for calculating a first vector average of the feature vectors of the first type image set, a second vector average of the feature vectors of the acquired second type image set, and the deviation value between the first vector average and the second vector average; and if the deviation value between the first vector average and the second vector average is smaller than a fourth threshold, merging the first type image set and the second type image set into the same image set.
Further, the similarity calculation module is specifically configured to calculate the Euclidean distance or the angle cosine of every two feature vectors to obtain the Euclidean distance or angle cosine between every two feature vectors, where the Euclidean distance or the angle cosine is the deviation value.
Further, the apparatus further comprises:
the image processing module is used for acquiring a plurality of images belonging to a first type of image set; selecting an image with a human face from the plurality of images as a plurality of original images; and carrying out standardization processing on the face areas in the plurality of original images to obtain the plurality of first-class images.
Further, the image processing module is further configured to extract a face with a largest area in the original image if the original image includes a plurality of face images, and perform standardization processing on the face with the largest area.
In a third aspect, an embodiment of the present invention provides an electronic device comprising a processor and a memory storing computer readable instructions which, when executed by the processor, perform the steps of the method as provided in the first aspect above.
In a fourth aspect, embodiments of the present invention provide a readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method provided in the first aspect above.
The embodiment of the invention provides an image processing method and device, an electronic device, and a readable storage medium. The method first acquires a first type image set comprising a plurality of first type images, then extracts a feature vector from each first type image, and performs pairwise similarity calculation on all the feature vectors extracted from the first type image set to obtain a deviation value between every two feature vectors; the first type image set is then cleaned according to the deviation values between the feature vectors. Because the scheme obtains a feature vector for each first type image and computes a deviation value for every pair of feature vectors, images that do not meet the similarity requirement can be removed from the first type image set according to those deviation values, avoiding interference from such image data and achieving a good image cleaning effect.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the embodiments of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be regarded as limiting the scope; a person skilled in the art may obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
fig. 2 is a flowchart of the substeps of step S130 in an image processing method according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a step of screening an image between two class image sets according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating another step of screening images between two class image sets according to an embodiment of the present invention;
Fig. 5 is a block diagram of an image processing apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The following describes the embodiments of the present invention clearly and completely with reference to the accompanying drawings. The embodiments described are evidently only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention. All other embodiments obtained by a person skilled in the art without inventive effort fall within the scope of the present invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present invention, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention, where the method includes the following steps:
step S110: a first type image set is acquired, the first type image set comprising a plurality of first type images.
Step S120: extracting the feature vector of each first type image.
In the process of image recognition, clear and effective images must first be obtained and then input into a neural network model for training, so that recognition through the trained neural network model is more accurate.
In a specific embodiment, in order to obtain sample data for training the neural network model, a plurality of images belonging to the first type of image set are also required to be obtained first, where the plurality of images may be face images, body contour images, color feature images, and the like.
The feature vector extracted from the image may be a feature vector obtained by extracting features from face data of each first type of image, where the feature vector represents features of a face in the first type of image.
Of course, the feature vector may also be a vector obtained by extracting other features in the image, such as a color feature vector, a texture feature vector, a body contour feature vector, and the like.
The first type image set may be obtained by acquiring a large number of images and then classifying them. For example, the images of one user are grouped into one class image set, i.e., one class of images represents one user; in this way a plurality of class image sets can be obtained, and the plurality of images in the first type image set are images of one user. However, when the images of a certain user are selected, a class image set may be mixed with images of other users; the images of one user may be split across two class image sets; different image classes may be merged into the same class; or other unsatisfactory images, such as unclear ones, may be mixed in. The images therefore also need to be screened.
As an optional implementation, since image recognition is generally performed through face recognition, in order to accurately recognize images later, a plurality of images belonging to the first type image set are first acquired; the images containing a face are selected from them as a plurality of original images; and the face regions in the plurality of original images are then standardized to obtain the plurality of first type images.
That is, non-face images, for example landscape images, are first screened out of the plurality of original images. Because the proportion of the face in the selected original images may be inconsistent — for example, the face occupies a very large area in one original image and a very small area in another — the face regions in the plurality of original images are standardized in order to avoid unsatisfactory training results caused by inconsistent samples. If one original image includes a plurality of face images, the face with the largest area in that original image is extracted and standardized. In other words, the images are normalized so that a plurality of first type images with the same form are obtained. The normalization process may be:
first, the required face size in the image is determined, for example 100 × 100; then a facial feature template is determined according to the face size and the positional relationship of the facial features; and finally the detected facial features of the face are transformed onto the determined facial feature template through a transformation matrix.
In this way, the face region in each original image subjected to the normalization processing has a uniform size, i.e., the face regions in the plurality of first type images are uniform in size.
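The template-alignment step can be sketched as a least-squares estimate of the transformation matrix mapping detected landmarks onto the fixed facial feature template. This is illustrative: it assumes the landmarks are already detected, and a real pipeline would then warp the image with the estimated transform via an image library.

```python
import numpy as np

def estimate_alignment(landmarks, template):
    """Least-squares affine transform mapping landmarks onto the template.

    landmarks, template: (k, 2) arrays of (x, y) facial feature points.
    Returns a 3x2 matrix M such that [x, y, 1] @ M approximates the
    template position of each landmark.
    """
    src = np.asarray(landmarks, dtype=float)
    dst = np.asarray(template, dtype=float)
    X = np.hstack([src, np.ones((src.shape[0], 1))])  # homogeneous coords
    M, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return M
```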
It should be noted that, for image recognition, images having other identifying features, for example images with distinctive colors or body contours, may also be selected from the plurality of images; the embodiment of the present invention is described using images with faces.
Optionally, extracting the feature vector of each of the plurality of first type images may be done as follows: each first type image is input into a face recognition model, which may be any of various existing models, such as a SphereFace model or an ArcFace model. In the face recognition model, feature extraction is performed on the face data of each first type image to obtain a feature vector, and the feature vector represents the features of the face in the first type image.
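Feature extraction can be sketched as a thin wrapper, where `model` stands in for any pretrained face recognition network of the SphereFace/ArcFace style mentioned above; the wrapper and its L2 normalization are illustrative assumptions, not the patent's specification:

```python
import numpy as np

def extract_feature(face_image, model):
    """Map a standardized face image to a fixed-length feature vector.

    model: any callable mapping an image to a raw feature vector
    (a stand-in for a pretrained recognition network).
    The vector is L2-normalized so that distance and cosine
    comparisons behave consistently.
    """
    vec = np.asarray(model(face_image), dtype=float)
    return vec / np.linalg.norm(vec)
```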
Step S130: performing pairwise similarity calculation on all the feature vectors extracted from the first type image set to obtain a deviation value between every two feature vectors.
In this embodiment, after the feature vector of each first type image is obtained, pairwise similarity calculation is performed on the feature vectors to obtain a deviation value between every two feature vectors. Specifically, the Euclidean distance or the cosine of the angle between every two feature vectors is calculated, and the Euclidean distance or the angle cosine is taken as the deviation value.
The similarity calculation by Euclidean distance proceeds as follows. For example, for two first type images with feature vectors A = (1, 2, 3, 4, 5) and B = (2, 3, 5, 6, 7), the Euclidean distance between A and B is d(A, B) = sqrt((1-2)^2 + (2-3)^2 + (3-5)^2 + (4-6)^2 + (5-7)^2) = sqrt(14) ≈ 3.74. The Euclidean distance between any other two feature vectors is calculated in the same way, and the resulting distance is used for the subsequent similarity comparison; the Euclidean distance can serve as the deviation value of the two feature vectors, e.g., the deviation value between feature vector A and feature vector B is 3.74.
The similarity calculation by the cosine of the angle proceeds similarly. For the same two feature vectors A = (1, 2, 3, 4, 5) and B = (2, 3, 5, 6, 7), the cosine of the angle between A and B is cos(A, B) = (1×2 + 2×3 + 3×5 + 4×6 + 5×7) / (sqrt(55) × sqrt(123)) ≈ 82 / 82.25 ≈ 0.997. The cosine of the angle between every two feature vectors can thus be obtained and used for the subsequent similarity comparison; the angle cosine can serve as the deviation value between the two feature vectors, e.g., the deviation value between feature vector A and feature vector B is 0.997.
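The arithmetic of the two worked examples can be reproduced in a few lines of plain Python:

```python
import math

A = [1, 2, 3, 4, 5]
B = [2, 3, 5, 6, 7]

# Euclidean distance: square root of the summed squared differences.
euclidean = math.sqrt(sum((a - b) ** 2 for a, b in zip(A, B)))

# Cosine of the angle: dot product over the product of the norms.
dot = sum(a * b for a, b in zip(A, B))
cosine = dot / (math.sqrt(sum(a * a for a in A)) *
                math.sqrt(sum(b * b for b in B)))

print(round(euclidean, 2), round(cosine, 3))  # 3.74 0.997
```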
A smaller Euclidean distance between two feature vectors, or an angle cosine closer to 1, indicates a higher similarity between the two first type images corresponding to the two feature vectors.
Step S140: cleaning the first type image set according to the deviation values between the feature vectors.
The Euclidean distance or the angle cosine can serve as the deviation value between two feature vectors. A smaller Euclidean distance (or larger angle cosine) between two feature vectors means that the two corresponding first type images are more similar, so the first type image set can be cleaned according to the deviation value between every two feature vectors, for example by removing images with low similarity.
Whether two images are similar may be judged as follows: if the Euclidean distance between the two feature vectors corresponding to the two images is smaller than or equal to a first preset threshold, or the cosine of the angle between the two feature vectors is greater than or equal to a second preset threshold, the two images are similar; if the Euclidean distance is greater than the first preset threshold, or the angle cosine is smaller than the second preset threshold, the two images are dissimilar.
In addition, as an alternative embodiment, referring to fig. 2, the first type image set may be cleaned according to the deviation values between the feature vectors as follows. Step S131: for each feature vector, counting the number of feature vectors whose deviation value from it is smaller than or equal to the first threshold. Step S132: determining the feature vector with the largest count as the first reference feature vector of the first type image set. Step S133: cleaning the first type image set according to the first reference feature vector.
The counting is performed with the already calculated Euclidean distance or angle cosine of every two feature vectors. For example, if similarity is calculated with the Euclidean distance, the number of feature vectors whose Euclidean distance from each feature vector is smaller than or equal to the first threshold is counted. Suppose there are four feature vectors A, B, C, D with Euclidean distances d(A, B) = 2, d(A, C) = 3, d(A, D) = 4, d(B, C) = 4, d(B, D) = 7, and d(C, D) = 8, and the first threshold is 5. Then the number of feature vectors whose Euclidean distance from A is at most 5 is 3 (B, C, and D); counting in the same way, the count for feature vector B is 2 (A and C), the count for C is 2 (A and B), and the count for D is 1 (A only). Feature vector A, having the largest count, is therefore determined as the first reference feature vector.
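The counting step can be sketched directly on a small deviation matrix. The distances below are illustrative and chosen to form a consistent symmetric matrix:

```python
import numpy as np

# Pairwise Euclidean deviation values for feature vectors A, B, C, D.
deviation = np.array([
    [0., 2., 3., 4.],   # A
    [2., 0., 4., 7.],   # B
    [3., 4., 0., 8.],   # C
    [4., 7., 8., 0.],   # D
])

first_threshold = 5
# For each vector, count the OTHER vectors within the first threshold.
within = (deviation <= first_threshold) & ~np.eye(4, dtype=bool)
counts = within.sum(axis=1)
reference = "ABCD"[counts.argmax()]

print(counts.tolist(), reference)  # [3, 2, 2, 1] A
```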
Of course, obtaining the first reference feature vector through similarity calculation with the angle cosine proceeds in the same way; for brevity it is not repeated here.
The value of the first threshold may differ depending on whether the Euclidean distance or the angle cosine is used for the similarity calculation; for example, the first threshold may be set to 5 when using the Euclidean distance and to 0.5 when using the angle cosine. The specific value of the first threshold may be set as required.
As an embodiment, the manner of cleaning the first type of image set according to the first reference feature vector may be: and determining the feature vector, of the feature vectors of all the images in the first type image set, of which the deviation value with the first reference feature vector is larger than the first threshold value, and removing the image corresponding to the determined feature vector from the first type image set.
It can be appreciated that, continuing the example with the first reference feature vector A obtained above, if the Euclidean distance between a feature vector E and the first reference feature vector A is greater than the first threshold, for example a distance of 6 against a first threshold of 5, this indicates that the similarity between the image corresponding to feature vector E and the image corresponding to the first reference feature vector A is low, and the image corresponding to feature vector E may then be removed from the first type image set.
Of course, a feature vector whose deviation value from the first reference feature vector is greater than the first threshold may also be identified using the included-angle cosine value: the cosine value of the angle between a certain feature vector and the first reference feature vector is calculated, and since the cosine value decreases as the two vectors become less similar, a cosine value smaller than the first threshold indicates that the deviation value between the feature vector and the first reference feature vector is greater than the first threshold.
It can be understood that, since the first reference feature vector is the feature vector similar to the largest number of other feature vectors, the image corresponding to it has the greatest similarity to the other images in the first type image set; images with high similarity to that image can therefore be retained, and images with low similarity to it removed.
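Cleaning the set against the chosen reference vector, as described above, might look like this (an illustrative sketch, not the patent's implementation; the image labels and values are hypothetical):

```python
import numpy as np

def clean_by_reference(features, images, ref_idx, first_threshold):
    """Keep only images whose feature vector lies within
    first_threshold (Euclidean distance) of the reference vector."""
    features = np.asarray(features, dtype=float)
    dists = np.linalg.norm(features - features[ref_idx], axis=1)
    return [img for img, d in zip(images, dists) if d <= first_threshold]

# Vector E (distance 6 from the reference A) exceeds the threshold of 5,
# so its image is removed, as in the example above.
kept = clean_by_reference([[0, 0], [6, 0], [3, 0]], ["A", "E", "B"],
                          ref_idx=0, first_threshold=5)
# kept == ["A", "B"]
```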
In one embodiment, the manner of cleaning the first type image set according to the first reference feature vector may further be: determining, among the feature vectors of all the images of the first type image set, the feature vectors which are neither directly similar nor indirectly similar to the first reference feature vector, and removing the images corresponding to the determined feature vectors from the first type image set.
Wherein, directly similar to the first reference feature vector means: the deviation value of the first reference feature vector is smaller than or equal to the first threshold value; indirect similarity to the first reference feature vector means: and the deviation value of the feature vector which is directly similar or indirectly similar to the first reference feature vector is smaller than or equal to the first threshold value.
It will be appreciated that, for example, if the first reference feature vector is A and the feature vectors similar to A are B and D, i.e., the deviation values of B and D from A are smaller than or equal to the first threshold; the feature vector similar to B is C; the feature vector similar to C is B; the feature vector similar to D is B; and the feature vector similar to E is F; then the feature vectors directly similar to A are B and D, and because C is similar to B, which is directly similar to A, feature vector C is indirectly similar to feature vector A, while feature vectors E and F are neither directly nor indirectly similar to A. Therefore, the two images corresponding to feature vectors E and F can be removed from the first type image set.
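The direct/indirect similarity rule above is equivalent to keeping the connected component of the reference vector in a graph whose edges join vectors within the first threshold. A hedged Python sketch (the toy coordinates are hypothetical and chosen so that only the links A-B, A-D, B-C and E-F fall under the threshold):

```python
import numpy as np
from collections import deque

def similar_component(features, ref_idx, first_threshold):
    """Indices of feature vectors directly or indirectly similar to the
    reference vector: a breadth-first search over the graph whose edges
    join vectors with Euclidean distance <= first_threshold."""
    features = np.asarray(features, dtype=float)
    dists = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    adjacent = dists <= first_threshold
    seen, queue = {ref_idx}, deque([ref_idx])
    while queue:
        i = queue.popleft()
        for j in range(len(features)):
            if j not in seen and adjacent[i, j]:
                seen.add(j)
                queue.append(j)
    return seen

# A, B, C, D, E, F: E and F are reachable only from each other,
# so the images corresponding to them would be removed from the set.
feats = [[0, 0], [1, 0], [2, 0], [0, 1], [10, 0], [11, 0]]
comp = similar_component(feats, ref_idx=0, first_threshold=1.5)
# comp == {0, 1, 2, 3}: A, B, C and D stay; E and F are rejected.
```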
In addition, in order to screen out duplicate images, if the deviation value between two feature vectors is smaller than or equal to a second threshold during the pairwise similarity calculation, the image corresponding to one of the two feature vectors is removed from the first type image set.
For example, if the Euclidean distance between feature vector A and feature vector B is 1, i.e., their deviation value is 1, and the second threshold is 2, the deviation value between A and B is smaller than or equal to the second threshold, meaning that the image corresponding to A and the image corresponding to B are likely to be duplicates; either the image corresponding to A or the image corresponding to B may then be removed from the first type image set.
Since two duplicate images are necessarily highly similar, the second threshold may be set smaller than the first threshold. For example, for two duplicate images, the Euclidean distance between their feature vectors is small, say 0.2, which is smaller than or equal to a second threshold of 0.3, so the deviation value of the two feature vectors is within the second threshold; for two merely similar images, the Euclidean distance is larger, say 3, which is smaller than or equal to a first threshold of 5 but greater than the second threshold. In this case the second threshold is smaller than the first threshold.
Of course, it may also be determined whether the deviation value between the two feature vectors is less than or equal to the second threshold by calculating the cosine value of the included angle between the two feature vectors, which is not described herein in detail for brevity.
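The duplicate-screening step with the second threshold can be sketched as follows (illustrative only; a greedy pass that keeps the first image of each near-duplicate pair, with hypothetical labels and values):

```python
import numpy as np

def remove_duplicates(features, images, second_threshold):
    """Drop one image of every pair whose feature vectors lie within
    second_threshold of each other (likely duplicate images)."""
    features = np.asarray(features, dtype=float)
    kept = []          # indices of images retained so far
    for i in range(len(features)):
        dups = [j for j in kept
                if np.linalg.norm(features[i] - features[j]) <= second_threshold]
        if not dups:   # no retained image is a near-duplicate of image i
            kept.append(i)
    return [images[i] for i in kept]

# The second image (distance 0.1 from the first) falls under the
# second threshold of 0.3 and is treated as a duplicate.
survivors = remove_duplicates([[0, 0], [0.1, 0], [5, 0]],
                              ["A", "A_dup", "C"], second_threshold=0.3)
# survivors == ["A", "C"]
```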
As another embodiment, if images of one user are classified into different class image sets, for example into two class image sets, then to screen that user's images it may further be determined whether the images in the two class image sets are similar, i.e., whether they are images of the same user. For the second type image set, a second reference feature vector may be calculated according to the method described above, which is not repeated here.
Referring to fig. 3, the method for screening images between two class image sets may include the following steps. Step S150a: performing similarity calculation on the first reference feature vector and a second reference feature vector of the second type image set to obtain a deviation value between the first reference feature vector and the second reference feature vector. Step S160a: if the deviation value between the first reference feature vector and the second reference feature vector is smaller than a third threshold, merging the first type image set and the second type image set into the same image set.
For example, if the first reference feature vector is A1 and the second reference feature vector is A2, the euclidean distance between A1 and A2 or the cosine value of the included angle may be calculated to perform similarity calculation, if the euclidean distance between A1 and A2 is smaller than the third threshold, it indicates that the images in the two image sets have great similarity, and the two image sets belong to the same user, and then the two image sets are combined.
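Steps S150a and S160a amount to a single comparison of the two reference vectors; a minimal sketch (the set contents, reference vectors and threshold are hypothetical):

```python
import numpy as np

def maybe_merge(first_set, second_set, ref1, ref2, third_threshold):
    """Merge the two class image sets when the deviation value (Euclidean
    distance) between their reference feature vectors is below third_threshold."""
    deviation = np.linalg.norm(np.asarray(ref1, float) - np.asarray(ref2, float))
    if deviation < third_threshold:
        return [first_set + second_set]   # same user: one merged set
    return [first_set, second_set]        # keep the two sets separate

# Reference vectors A1 and A2 one unit apart: merged under a threshold
# of 2, kept separate under a threshold of 0.5.
merged = maybe_merge(["img1"], ["img2"], [0, 0], [1, 0], third_threshold=2)
```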
In addition, referring to fig. 4, as an embodiment, the method for filtering the image between the two class image sets may further include the following steps: step S150b: calculating a first vector average value of the feature vectors of the first type of image set, a second vector average value of the feature vectors of the second type of image set, and a deviation value between the first vector average value and the second vector average value; step S160b: and if the deviation value of the first vector average value and the second vector average value is smaller than a fourth threshold value, merging the first type image set and the second type image set into the same image set.
For example, for the feature vectors A = (1, 2, 3), B = (3, 5, 6) and C = (7, 8, 9) in the first type image set, the first vector average is calculated by the formula (A + B + C)/3 = ((1+3+7)/3, (2+5+8)/3, (3+6+9)/3) ≈ (3.7, 5, 6). For the feature vectors D = (2, 4, 8) and E = (5, 9, 11) in the second type image set, the second vector average is calculated by the same formula to be (3.5, 6.5, 9.5). If the fourth threshold is 4 and the deviation value (e.g., the Euclidean distance) between the first vector average and the second vector average is calculated to be 3.8, the deviation value between the two averages is smaller than the fourth threshold, which indicates that the first type image set is similar to the second type image set, and the two class image sets are merged.
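The worked example above can be checked directly (the vectors and the fourth threshold of 4 are the ones from the text):

```python
import numpy as np

first_type = np.array([[1, 2, 3], [3, 5, 6], [7, 8, 9]], dtype=float)
second_type = np.array([[2, 4, 8], [5, 9, 11]], dtype=float)

mean1 = first_type.mean(axis=0)    # (11/3, 5, 6), i.e. about (3.7, 5, 6)
mean2 = second_type.mean(axis=0)   # (3.5, 6.5, 9.5)

deviation = np.linalg.norm(mean1 - mean2)   # about 3.8
merge = deviation < 4   # fourth threshold: True, so the two sets are merged
```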
As an optional implementation, for cleaning the images, the two class image sets may first each be cleaned within the class, with inter-class cleaning performed afterwards, i.e., it is judged whether the two class image sets are similar, and when the similarity is high the two class image sets are merged. After merging, the merged image set may be cleaned again, including the cleaning operations described above of removing duplicate images within the set and screening out dissimilar images.
As another optional implementation, for cleaning the images, the two class image sets may first be cleaned between classes and then within the class, i.e., when the similarity between the two class image sets is judged to be high, the two class image sets are merged, and the merged image set is then cleaned within the class. For example, two class image sets are first obtained, such as a first class image set and a second class image set; a first vector average of the feature vectors of the first class image set and a second vector average of the feature vectors of the second class image set are then calculated, together with the deviation value between the two averages; if the deviation value between the first vector average and the second vector average is smaller than a preset threshold, the first class image set and the second class image set are merged into the same image set, and the merged image set is then cleaned within the class, including the above-described operations of removing duplicate images within the set and screening out dissimilar images. In this scheme, inter-class cleaning is performed first and intra-class cleaning afterwards, images that do not meet the similarity requirement can be removed, interference from such image data is avoided, and the cleaning workload can be reduced.
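The inter-class-then-intra-class ordering described in this embodiment can be sketched end to end (illustrative only; the intra-class step here uses the set mean as a stand-in for the reference-vector cleaning described earlier, and all thresholds and vectors are hypothetical):

```python
import numpy as np

def clean_inter_then_intra(feats1, feats2, merge_threshold, first_threshold):
    """Inter-class step: merge the two sets when their mean feature
    vectors are close. Intra-class step: keep only feature vectors
    within first_threshold of the merged set's mean."""
    f1 = np.asarray(feats1, dtype=float)
    f2 = np.asarray(feats2, dtype=float)
    if np.linalg.norm(f1.mean(axis=0) - f2.mean(axis=0)) >= merge_threshold:
        return None                        # sets stay separate; nothing merged
    merged = np.vstack([f1, f2])
    centre = merged.mean(axis=0)
    keep = np.linalg.norm(merged - centre, axis=1) <= first_threshold
    return merged[keep]

# The outlier (10, 10) survives the merge but is removed by the
# intra-class pass that follows it.
result = clean_inter_then_intra([[0, 0], [1, 0]], [[0, 1], [10, 10]],
                                merge_threshold=8, first_threshold=5)
```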
Referring to fig. 5, fig. 5 is a block diagram illustrating an image processing apparatus 200 according to an embodiment of the present invention, where the apparatus includes:
an image acquisition module 210, configured to acquire a first type image set, where the first type image set includes a plurality of first type images;
a feature vector extracting module 220, configured to extract a feature vector of each of the first type images;
the similarity calculation module 230 is configured to perform a pairwise similarity calculation on all the feature vectors extracted in the first type of image set, so as to obtain a deviation value between every two feature vectors;
and the cleaning module 240 is configured to clean the first type of image set according to the deviation values between the feature vectors.
Further, the cleaning module 240 is specifically configured to count, for each feature vector, the number of feature vectors having a deviation value smaller than or equal to a first threshold; determining the feature vector with the largest number as a first reference feature vector of the first type image set; and cleaning the first type image set according to the first reference feature vector.
Further, the cleaning module 240 is further configured to determine a feature vector, which has a deviation value from the first reference feature vector greater than the first threshold, of feature vectors of all images in the first type image set, and reject an image corresponding to the determined feature vector from the first type image set.
Further, the cleaning module 240 is further configured to determine a feature vector that is neither directly similar to the first reference feature vector nor indirectly similar to the first reference feature vector, from among feature vectors of all images in the first type image set, and reject an image corresponding to the determined feature vector from the first type image set;
wherein, directly similar to the first reference feature vector means: the deviation value of the first reference feature vector is smaller than or equal to the first threshold value; indirect similarity to the first reference feature vector means: and the deviation value of the feature vector which is directly similar or indirectly similar to the first reference feature vector is smaller than or equal to the first threshold value.
Further, the cleaning module 240 is further configured to, if there is a deviation value between two feature vectors that is less than or equal to a second threshold, reject an image corresponding to one of the two feature vectors from the first type image set, where the second threshold is less than the first threshold.
Further, the apparatus further comprises:
the first inter-class cleaning module is used for carrying out similarity calculation on the first reference feature vector and a second reference feature vector of the second class image set to obtain a deviation value between the first reference feature vector and the second reference feature vector; and if the deviation value of the first reference feature vector and the second reference feature vector is smaller than a third threshold value, merging the first type image set and the second type image set into the same image set.
Further, the apparatus further comprises:
the second inter-class cleaning module is used for calculating a first vector average value of the feature vectors of the first class image set, a second vector average value of the feature vectors of the acquired second class image set and a deviation value between the first vector average value and the second vector average value; and if the deviation value of the first vector average value and the second vector average value is smaller than a fourth threshold value, merging the first type image set and the second type image set into the same image set.
Further, the similarity calculation module 230 is specifically configured to calculate the Euclidean distance or included-angle cosine value of every two feature vectors to obtain the Euclidean distance or included-angle cosine value between every two feature vectors, where the deviation value is the Euclidean distance or the included-angle cosine value.
Further, the apparatus further comprises:
the image processing module is used for acquiring a plurality of images belonging to a first type of image set; selecting an image with a human face from the plurality of images as a plurality of original images; and carrying out standardization processing on the face areas in the plurality of original images to obtain the plurality of first-class images.
Further, the image processing module is further configured to extract a face with a largest area in the original image if the original image includes a plurality of face images, and perform standardization processing on the face with the largest area.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application, where the electronic device may include: at least one processor 110, such as a CPU, at least one communication interface 120, at least one memory 130, and at least one communication bus 140. Wherein the communication bus 140 is used to enable direct connection communication of these components. The communication interface 120 of the device in the embodiment of the present application is used for performing signaling or data communication with other node devices. The memory 130 may be a high-speed RAM memory or a non-volatile memory (non-volatile memory), such as at least one disk memory. Memory 130 may also optionally be at least one storage device located remotely from the aforementioned processor. The memory 130 has stored therein computer readable instructions which, when executed by the processor 110, perform the method process described above in fig. 1.
An embodiment of the present application provides a readable storage medium storing a computer program which, when executed by a processor, performs the method process performed by an electronic device in the method embodiment shown in fig. 1.
It will be clear to those skilled in the art that, for convenience and brevity of description, reference may be made to the corresponding procedure in the foregoing method for the specific working procedure of the apparatus described above, and this will not be repeated here.
In summary, the embodiments of the present application provide an image processing method, an apparatus, an electronic device and a readable storage medium. The method first obtains a first type image set comprising a plurality of first type images, then extracts the feature vector of each first type image, and performs pairwise similarity calculation on all the feature vectors extracted from the first type image set to obtain a deviation value between every two feature vectors; the first type image set is then cleaned according to the deviation values between the feature vectors. In this scheme, the feature vector of each first type image is obtained and used for pairwise similarity calculation, so that the deviation value between every two feature vectors is obtained and the first type image set is cleaned accordingly; images in the first type image set that do not meet the similarity requirement can thereby be removed, interference from such image data is avoided, and a good image cleaning effect is achieved.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, for example, of the flowcharts and block diagrams in the figures that illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present invention may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above description is only of the preferred embodiments of the present invention and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The foregoing is merely illustrative of the present invention, and the present invention is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (22)

1. An image processing method, the method comprising:
acquiring a first type image set and a second type image set, wherein the first type image set comprises a plurality of first type images;
if the similarity of the first type image set and the second type image set meets a preset condition, merging the first type image set and the second type image set into the same image set;
extracting the feature vector of each image in the combined image set;
performing pairwise similarity calculation on all the feature vectors to obtain a deviation value between every two feature vectors; the deviation value is used for representing the similarity between every two feature vectors; and
And cleaning the combined image set according to the deviation values among the feature vectors to remove repeated images in the set and screen dissimilar images.
2. The method of claim 1, wherein the cleaning the combined image set according to the bias value between the feature vectors comprises:
counting the number of the feature vectors with the deviation value smaller than or equal to a first threshold value for each feature vector;
determining the feature vector with the largest number as a reference feature vector of the combined image set;
And cleaning the combined image set according to the reference feature vector.
3. The method of claim 2, wherein cleaning the combined image set based on the reference feature vector comprises:
and determining the feature vector, of the feature vectors of all the images of the combined image set, of which the deviation value with the reference feature vector is larger than the first threshold value, and removing the image corresponding to the determined feature vector from the combined image set.
4. The method of claim 2, wherein cleaning the combined image set based on the reference feature vector comprises:
determining a feature vector which is neither directly similar to the reference feature vector nor indirectly similar to the reference feature vector in the feature vectors of all the images of the combined image set, and removing the image corresponding to the determined feature vector from the combined image set;
wherein, direct similarity to the reference feature vector means: the deviation value of the reference feature vector is smaller than or equal to the first threshold value; indirect similarity to the reference feature vector means: and the deviation value of the feature vector which is directly similar or indirectly similar to the reference feature vector is smaller than or equal to the first threshold value.
5. The method according to any one of claims 2 to 4, wherein the cleaning of the combined image set according to the deviation values between the feature vectors further comprises:
if the deviation value of the two feature vectors is smaller than or equal to a second threshold value, removing the corresponding image of the two feature vectors from the combined image set, wherein the second threshold value is smaller than the first threshold value.
6. The method according to claim 1, wherein merging the first type image set and the second type image set into the same image set if the similarity between the first type image set and the second type image set satisfies a preset condition, comprises:
performing similarity calculation on a first reference feature vector of the first type image set and a second reference feature vector of the second type image set to obtain a deviation value between the first reference feature vector and the second reference feature vector;
and if the deviation value of the first reference feature vector and the second reference feature vector is smaller than a third threshold value, merging the first type image set and the second type image set into the same image set.
7. The method according to claim 1, wherein merging the first type image set and the second type image set into the same image set if the similarity between the first type image set and the second type image set satisfies a preset condition, comprises:
Calculating a first vector average value of the feature vectors of the first type of image set, a second vector average value of the feature vectors of the second type of image set, and a deviation value between the first vector average value and the second vector average value;
and if the deviation value of the first vector average value and the second vector average value is smaller than a fourth threshold value, merging the first type image set and the second type image set into the same image set.
8. The method according to any one of claims 1-4, wherein performing a pairwise similarity calculation on all the feature vectors to obtain a deviation value between every two feature vectors comprises:
and calculating Euclidean distance or included angle cosine values of every two feature vectors to obtain the Euclidean distance or included angle cosine values between every two feature vectors, wherein the deviation value is the Euclidean distance or the included angle cosine values.
9. The method of any of claims 1-4, wherein prior to acquiring the first type of image set, the method further comprises:
acquiring a plurality of images belonging to a first type of image set;
selecting an image with a human face from the plurality of images as a plurality of original images;
And carrying out standardization processing on the face areas in the plurality of original images to obtain the plurality of first-class images.
10. The method of claim 9, wherein normalizing the face regions in the plurality of original images comprises:
and if the original image comprises a plurality of face images, extracting the face with the largest area in the original image, and carrying out standardization processing on the face with the largest area.
11. An image processing apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring a first type image set and a second type image set, wherein the first type image set comprises a plurality of first type images;
the inter-class cleaning module is used for merging the first class image set and the second class image set into the same image set if the similarity of the first class image set and the second class image set meets a preset condition;
the feature vector extraction module is used for extracting the feature vector of each image in the combined image set;
the similarity calculation module is used for carrying out pairwise similarity calculation on all the feature vectors to obtain a deviation value between every two feature vectors; the deviation value is used for representing the similarity between every two feature vectors;
And the cleaning module is used for cleaning the combined image set according to the deviation value between the characteristic vectors so as to remove repeated images in the set and screen dissimilar images.
12. The apparatus according to claim 11, wherein the cleaning module is specifically configured to count, for each feature vector, a number of feature vectors having a deviation value from the feature vector that is less than or equal to a first threshold; determining the feature vector with the largest number as a reference feature vector of the combined image set; and cleaning the combined image set according to the reference feature vector.
13. The apparatus of claim 12, wherein the cleaning module is further configured to determine a feature vector, of feature vectors of all images of the combined image set, having a deviation value from the reference feature vector greater than the first threshold, and reject an image corresponding to the determined feature vector from the combined image set.
14. The apparatus of claim 12, wherein the cleaning module is further configured to determine a feature vector of all images of the combined image set that is neither directly similar to the reference feature vector nor indirectly similar to the reference feature vector, and reject an image corresponding to the determined feature vector from the combined image set;
Wherein, direct similarity to the reference feature vector means: the deviation value of the reference feature vector is smaller than or equal to the first threshold value; indirect similarity to the reference feature vector means: and the deviation value of the feature vector which is directly similar or indirectly similar to the reference feature vector is smaller than or equal to the first threshold value.
15. The apparatus according to any one of claims 12-14, wherein the cleaning module is further configured to reject a corresponding one of the two feature vectors from the combined image set if there is a deviation value of the two feature vectors that is less than or equal to a second threshold, the second threshold being less than the first threshold.
16. The apparatus of claim 11, wherein the inter-class cleaning module is specifically configured to:
perform a similarity calculation on a first reference feature vector of the first-type image set and a second reference feature vector of the second-type image set to obtain the deviation value between the first reference feature vector and the second reference feature vector; and if that deviation value is less than a third threshold, merge the first-type image set and the second-type image set into the same image set.
17. The apparatus of claim 11, wherein the inter-class cleaning module is specifically configured to:
calculate a first vector average of the feature vectors of the first-type image set, a second vector average of the feature vectors of the second-type image set, and the deviation value between the first vector average and the second vector average; and if that deviation value is less than a fourth threshold, merge the first-type image set and the second-type image set into the same image set.
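The inter-class merge of claim 17 (and, with reference vectors in place of averages, claim 16) can be sketched as below. The sets are represented as arrays of feature vectors; names and the choice of Euclidean distance are illustrative assumptions:

```python
import numpy as np

def maybe_merge(first_set, second_set, fourth_threshold):
    """Merge two image sets when the deviation value between their mean
    feature vectors is below the fourth threshold; otherwise keep them apart."""
    mean_a = first_set.mean(axis=0)   # first vector average value
    mean_b = second_set.mean(axis=0)  # second vector average value
    deviation = np.linalg.norm(mean_a - mean_b)
    if deviation < fourth_threshold:
        return np.vstack([first_set, second_set])  # merged into one set
    return None  # the two sets remain separate
```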
18. The apparatus according to any one of claims 11 to 14, wherein the similarity calculation module is specifically configured to calculate the Euclidean distance or the cosine of the included angle between each two feature vectors, the deviation value between two feature vectors being the resulting Euclidean distance or included-angle cosine value.
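The two deviation measures named in claim 18 can be written directly; a small sketch with illustrative names:

```python
import numpy as np

def deviation(u, v, metric="euclidean"):
    """Deviation value between two feature vectors, computed either as the
    Euclidean distance or as the cosine of the included angle."""
    if metric == "euclidean":
        return float(np.linalg.norm(u - v))
    # Cosine of the included angle: 1.0 for identical directions,
    # 0.0 for orthogonal vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))
```

Note the two measures run in opposite directions: a small Euclidean distance but a large cosine value indicates similarity, so the thresholds of the earlier claims would be compared accordingly.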
19. The apparatus according to any one of claims 11-14, wherein the apparatus further comprises:
an image processing module configured to acquire a plurality of images belonging to a first-type image set; select, from the plurality of images, the images containing a human face as a plurality of original images; and perform normalization processing on the face regions in the plurality of original images to obtain the plurality of first-type images.
20. The apparatus of claim 19, wherein the image processing module is further configured to, if an original image contains a plurality of faces, extract the face with the largest area in the original image and perform the normalization processing on that largest face.
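Claim 20's largest-face selection, combined with one simple form of the "normalization processing" of claim 19 (crop and resize to a fixed square; the claims do not fix the method), might look like this. The face detections are assumed to be given as hypothetical (x, y, w, h) boxes from some external detector:

```python
import numpy as np

def normalize_largest_face(image, face_boxes, out_size=64):
    """Crop the face with the largest area from a 2-D image array and
    resize the crop to out_size x out_size by nearest-neighbour sampling."""
    # Pick the detection covering the largest area (w * h).
    x, y, w, h = max(face_boxes, key=lambda b: b[2] * b[3])
    crop = image[y:y + h, x:x + w]
    # Nearest-neighbour resize: sample h rows and w columns onto the grid.
    rows = np.arange(out_size) * h // out_size
    cols = np.arange(out_size) * w // out_size
    return crop[rows][:, cols]
```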
21. An electronic device comprising a processor and a memory storing computer readable instructions which, when executed by the processor, perform the steps of the method of any of claims 1-10.
22. A readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, performs the steps of the method according to any one of claims 1-10.
CN201811120898.0A 2018-09-25 2018-09-25 Image processing method, device, electronic equipment and readable storage medium Active CN110942081B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811120898.0A CN110942081B (en) 2018-09-25 2018-09-25 Image processing method, device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811120898.0A CN110942081B (en) 2018-09-25 2018-09-25 Image processing method, device, electronic equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN110942081A CN110942081A (en) 2020-03-31
CN110942081B true CN110942081B (en) 2023-08-18

Family

ID=69904488

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811120898.0A Active CN110942081B (en) 2018-09-25 2018-09-25 Image processing method, device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN110942081B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112183648A (en) * 2020-09-30 2021-01-05 深兰人工智能(深圳)有限公司 Automatic screening method and device for fine classification training data set

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101408929A (en) * 2007-10-10 2009-04-15 三星电子株式会社 Multiple-formwork human face registering method and apparatus for human face recognition system
CN103345645A (en) * 2013-06-27 2013-10-09 复旦大学 Commodity image category forecasting method based on online shopping platform
CN103546312A (en) * 2013-08-27 2014-01-29 中国航天科工集团第二研究院七〇六所 Massive multi-source isomerism log correlation analyzing method
CN103810663A (en) * 2013-11-18 2014-05-21 北京航天金盾科技有限公司 Demographic data cleaning method based on face recognition
CN104537252A (en) * 2015-01-05 2015-04-22 深圳市腾讯计算机系统有限公司 User state single-classification model training method and device
CN105426485A (en) * 2015-11-20 2016-03-23 小米科技有限责任公司 Image combination method and device, intelligent terminal and server
CN105488527A (en) * 2015-11-27 2016-04-13 小米科技有限责任公司 Image classification method and apparatus
CN106649610A (en) * 2016-11-29 2017-05-10 北京智能管家科技有限公司 Image labeling method and apparatus
CN106709449A (en) * 2016-12-22 2017-05-24 深圳市深网视界科技有限公司 Pedestrian re-recognition method and system based on deep learning and reinforcement learning
CN106776662A (en) * 2015-11-25 2017-05-31 腾讯科技(深圳)有限公司 A kind of taxonomic revision method and apparatus of photo
WO2017162083A1 (en) * 2016-03-25 2017-09-28 阿里巴巴集团控股有限公司 Data cleaning method and apparatus
CN107368812A (en) * 2017-07-21 2017-11-21 成都恒高科技有限公司 Facial recognition data cleaning method based on maximal connected subgraphs
CN107430776A (en) * 2015-04-28 2017-12-01 欧姆龙株式会社 Template construct device and template construct method
CN107463705A (en) * 2017-08-17 2017-12-12 陕西优百信息技术有限公司 A kind of data cleaning method
CN107480203A (en) * 2017-07-23 2017-12-15 北京中科火眼科技有限公司 It is a kind of to be directed to identical and similar pictures duplicate removal view data cleaning method
CN107480685A (en) * 2016-06-08 2017-12-15 国家计算机网络与信息安全管理中心 A kind of distributed power iteration clustering method and device based on GraphX
CN107871107A (en) * 2016-09-26 2018-04-03 北京眼神科技有限公司 Face authentication method and device
CN107944020A (en) * 2017-12-11 2018-04-20 深圳云天励飞技术有限公司 Facial image lookup method and device, computer installation and storage medium
CN108229419A (en) * 2018-01-22 2018-06-29 百度在线网络技术(北京)有限公司 For clustering the method and apparatus of image
CN108319938A (en) * 2017-12-31 2018-07-24 奥瞳系统科技有限公司 High quality training data preparation system for high-performance face identification system
CN108536753A (en) * 2018-03-13 2018-09-14 腾讯科技(深圳)有限公司 The determination method and relevant apparatus of duplicate message

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9229971B2 (en) * 2010-12-21 2016-01-05 Business Objects Software Limited Matching data based on numeric difference
US9911173B2 (en) * 2014-12-30 2018-03-06 Xiaomi Inc. Method and device for displaying images
US10013448B2 (en) * 2015-02-05 2018-07-03 Quantum Corporation Mobile device agent for personal deduplication


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Danping et al. A client-side fuzzy image deduplication method supporting ownership authentication. Chinese Journal of Computers (计算机学报). 2018, Vol. 41, No. 6, pp. 1047-1063. *

Also Published As

Publication number Publication date
CN110942081A (en) 2020-03-31

Similar Documents

Publication Publication Date Title
Manap et al. Non-distortion-specific no-reference image quality assessment: A survey
CN109410184B (en) Live broadcast pornographic image detection method based on dense confrontation network semi-supervised learning
CN107844737B (en) Iris image detection method and device
CN107368812B (en) Face recognition data cleaning method based on maximum connected subgraph
CN113807237B (en) Training of in vivo detection model, in vivo detection method, computer device, and medium
CN110942081B (en) Image processing method, device, electronic equipment and readable storage medium
CN112906696B (en) English image region identification method and device
CN112488137A (en) Sample acquisition method and device, electronic equipment and machine-readable storage medium
CN111652259A (en) Method and system for cleaning data
US20170309040A1 (en) Method and device for positioning human eyes
CN111950556A (en) License plate printing quality detection method based on deep learning
CN110866470A (en) Face anti-counterfeiting detection method based on random image characteristics
CN116052231A (en) Palm vein recognition method, device, equipment and storage medium
CN116071348A (en) Workpiece surface detection method and related device based on visual detection
CN107798282B (en) Method and device for detecting human face of living body
CN113158745B (en) Multi-feature operator-based messy code document picture identification method and system
US20210089886A1 (en) Method for processing data based on neural networks trained by different methods and device applying method
CN110147824B (en) Automatic image classification method and device
CN107122795B (en) Pedestrian re-identification method based on coring characteristics and random subspace integration
CN112329574A (en) Automatic acquisition method and device applied to cat nose line
CN113887458A (en) Training method and device, computer equipment and storage medium
CN106326882A (en) Fingerprint identification system and fingerprint identification method based on image quality assessment technology
CN112861874A (en) Expert field denoising method and system based on multi-filter denoising result
CN114333022B (en) Training method of character feature extraction model, character recognition method and related equipment
CN111126245A (en) Digital image dot matrix positioning method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant