CN111242209A - Uniform identification method - Google Patents
- Publication number
- CN111242209A CN111242209A CN202010018293.1A CN202010018293A CN111242209A CN 111242209 A CN111242209 A CN 111242209A CN 202010018293 A CN202010018293 A CN 202010018293A CN 111242209 A CN111242209 A CN 111242209A
- Authority
- CN
- China
- Prior art keywords
- clothing
- information
- uniform
- model
- classification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
Abstract
The invention discloses a uniform identification method in the field of image processing and pattern recognition, comprising the following steps: training a clothing classification model, a clothing pattern model and an svm model; acquiring a picture of the area where the uniform is located by means of face detection; feeding the picture of the uniform area into the trained clothing classification model and clothing pattern model to obtain clothing classification information and clothing pattern information, and extracting clothing color information, ornament feature information and other feature information of the clothing; and inputting the clothing classification information, clothing pattern information, clothing color information, ornament feature information and other clothing feature information into the svm model, which outputs the classification result. The invention addresses the difficulty of identifying and classifying uniforms.
Description
Technical Field
The invention relates to the field of image processing and pattern recognition, in particular to a uniform recognition method.
Background
In professional settings in certain specific fields, requirements are imposed on dress. Existing clothing detection mainly performs attribute classification on fashion clothing, and the data volume and granularity available for professional clothing are insufficient. Meanwhile, classifying uniforms purely with deep learning has no public uniform data set to draw on, so the sample size is small, a deep network based on ordinary convolution can hardly converge, and overfitting occurs easily.
Disclosure of Invention
The invention aims to provide a uniform identification method, which solves the problem that uniform is difficult to identify and classify.
The technical scheme adopted by the invention for solving the technical problems is as follows: a uniform identification method, comprising the steps of:
s1: training a clothing classification model, a clothing pattern model and an svm model;
s2: acquiring a picture of an area where a uniform is located by a face detection technology;
s3: respectively sending the pictures of the uniform area into a trained clothing classification model and a clothing pattern model to obtain clothing classification information and clothing pattern information, and extracting clothing color information, ornament characteristic information and other characteristic information of the clothing;
s4: and inputting the clothing classification information, the clothing pattern information, the clothing color information, the ornament characteristic information and other characteristic information of the clothing into the svm model and outputting the classification information.
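The four steps above can be sketched as a single pipeline function. This is an illustrative sketch only; every helper name (`classify`, `pattern`, `svm_decide`, `detect_region`, `extract_features`) is a hypothetical stand-in for the models and extractors the patent trains, not an API the patent defines.

```python
from typing import Callable

def identify_uniform(picture,
                     classify: Callable,          # trained clothing classification model (s1)
                     pattern: Callable,           # trained clothing pattern model (s1)
                     svm_decide: Callable,        # trained single-class svm (s1)
                     detect_region: Callable,
                     extract_features: Callable) -> float:
    region = detect_region(picture)                    # s2: locate uniform via face detection
    cls_info = classify(region)                        # s3: clothing classification info
    pat_info = pattern(region)                         # s3: clothing pattern info
    color, ornament, other = extract_features(region)  # s3: hand-crafted feature info
    x = [cls_info, pat_info, color, ornament, other]
    return svm_decide(x)                               # s4: final classification output
```

The orchestration itself carries no model logic; each stage can be swapped out independently, which matches the method's separation of learned models (s1) from hand-crafted feature extraction (s3).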
Further, the clothing classification model is based on an existing NasnetMobile network and is obtained by transfer learning after adding part of the custom uniform data to the network's original "suit" data set.
Further, the clothing pattern model is based on an existing DenseBlock network and is obtained through deep learning on the DeepFashion data set.
Further, the ornament characteristic information includes chest card information.
Further, the other characteristic information of the garment comprises red-zipper center-seam information.
Further, the svm model is a single-classification svm model.
Further, the number of added custom uniform data pictures is 150.
The invention has the following beneficial effects:
1. the uniform identification method uses the single-classification svm model to classify the samples, negative sample data does not need to be collected, and the difficulty of data collection is greatly reduced;
2. the method uses the DenseBlock as a basic module of the classification network, can achieve better convergence and generalization effects on small samples, and realizes higher identification precision;
3. the invention carries out manual characterization decomposition on the sample, decomposes the sample information into clothing classification information, clothing pattern information and clothing color information, extracts ornament characteristic information and other characteristic information, and enhances the interpretability and classification precision of the model.
Drawings
Fig. 1 is a flowchart of the uniform identification method according to embodiment 1 of the present invention.
Detailed Description
Example 1:
as shown in fig. 1, a uniform identification method includes the following steps:
Firstly, training the clothing classification model, the clothing pattern model and the svm model
Firstly, based on the existing NasnetMobile network, adding part of custom uniform data in the original data set suit of the network, and training a clothing classification model by a transfer learning technology.
The number of the added part of customized uniform data pictures is 150.
The transfer learning task can be described by a softmax cross-entropy loss:

L = -(1/m) Σ_{i=1..m} log( exp(W_{y_i}^T x_i + b_{y_i}) / Σ_{j=1..n} exp(W_j^T x_i + b_j) )

where m is the training batch size, W_{y_i} is the decision-layer weight of the current term, x_i is the current-layer network output, and b_{y_i} is the current-term bias; n is the number of ImageNet classes, and W_j and b_j are the weights and biases of all decision-layer units. In a preferred embodiment of the invention, n is 1000.
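The loss above can be sketched in numpy as follows. This is a minimal illustration of the softmax cross-entropy formula only, not the patent's training code; the function name and argument shapes are assumptions.

```python
import numpy as np

def softmax_cross_entropy(X, W, b, y):
    """Mean softmax cross-entropy over a batch, matching the loss above.

    X: (m, d) current-layer outputs x_i; W: (d, n) decision-layer weights W_j;
    b: (n,) biases b_j; y: (m,) integer labels y_i. For the ImageNet head, n = 1000.
    """
    logits = X @ W + b                                  # W_j^T x_i + b_j
    logits -= logits.max(axis=1, keepdims=True)         # subtract max for numerical stability
    p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    m = X.shape[0]
    return -np.log(p[np.arange(m), y]).mean()           # -(1/m) Σ log p_{y_i}
```

For a zero input with two equal logits the softmax gives probability 1/2 to each class, so the loss equals log 2, which is a quick sanity check on the implementation.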
Then, based on the existing DenseBlock network, the DeepFashion data set is trained with deep learning to obtain the clothing pattern model.
The structure of the DenseBlock network is as follows:
1. Input layer: 224 × 224 × 3
2. DenseBlock × 5
3. Flatten layer
4. Fully connected output layer: (8, 1)
The task is a softmax cross-entropy:

L1 = -(1/m) Σ_{i=1..m} log( exp(W_{y_i}^T x_i + b_{y_i}) / Σ_{j=1..n} exp(W_j^T x_i + b_j) )

where m is the batch size, x_i is the network output, y_i is the label information, W_{y_i} is the weight matrix of the current term, W_j is the weight matrix over the eight classes, b is the bias, and n is the number of classes. In a preferred embodiment of the invention, n is 8.
A single classification svm model is then trained.
Secondly, obtaining pictures of regions where uniforms are located through a face detection technology
Thirdly, the pictures of the uniform area are respectively sent into the trained clothing classification model and the clothing pattern model to obtain clothing classification information and clothing pattern information, and clothing color information, ornament characteristic information and other characteristic information of the clothing are extracted
1. Sending the pictures of the uniform area into the trained clothing classification model to obtain clothing classification information
For the compliant uniform in this case, the corresponding classification information should be the suit classification.
2. Sending the pictures of the uniform area into the trained clothing pattern model to obtain clothing pattern information
The picture of the uniform area is fed into the trained pattern model and the value of L1 above is computed.
For a compliant garment in the case, the color of the pattern should be solid.
3. Extracting garment color information
First, a picture of the human-body position is extracted; then, according to the position of the scene camera, the positional relation to the human body is set to obtain the position parameters of the upper-body and lower-body uniform.
The garment color information includes upper body garment color information and lower body garment color information.
Taking the upper-body garment color information of a certain application scene as an example, the upper-body garment position is [left: 0, top: 0.2 × picture height, right: picture width, bottom: 0.6 × picture height]; the picture is cropped to this region and the color threshold is then judged on it:
(1) Convert the RGB pixels into HSV pixels. The conversion formula is:

V = max
S = 0 if max = 0, else (max − min) / max
H = 60 × (G − B)/(max − min) if max = R;
    60 × (B − R)/(max − min) + 120 if max = G;
    60 × (R − G)/(max − min) + 240 if max = B
(add 360 if H < 0)

where max is the maximum of a pixel's R, G, B values and min is the minimum. In the 8-bit convention used for the bounds below, H is halved to [0, 180) and S, V are scaled to [0, 255].
(2) Traversing the number of pixels meeting the condition in the picture, and summing:
taking the blue uniform in the case as an example, the colors meeting the conditions are:
lower bound: h: 105, s: 75, v: 25;
upper bound: h: 130, s: 255, v: 125.
In a compliant uniform picture, the summed count of pixels meeting the condition is not less than 3200.
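The conversion and pixel-counting steps above can be sketched in numpy. This is an illustrative implementation of the stated formula in the OpenCV-style 8-bit convention (H in [0, 180), S and V in [0, 255]); the function names are assumptions, not APIs from the patent.

```python
import numpy as np

def rgb_to_hsv_cv(rgb):
    """RGB image (uint8, H×W×3) -> HSV with H in [0,180), S,V in [0,255].

    Implements the max/min conversion formula described above."""
    r, g, b = [rgb[..., i].astype(float) for i in range(3)]
    mx = np.max(rgb, axis=-1).astype(float)
    mn = np.min(rgb, axis=-1).astype(float)
    d = np.where(mx == mn, 1.0, mx - mn)          # avoid /0; H is 0 when mx == mn
    h = np.select([mx == mn, mx == r, mx == g],
                  [0.0, 60 * (g - b) / d, 60 * (b - r) / d + 120],
                  default=60 * (r - g) / d + 240)
    h = np.where(h < 0, h + 360, h) / 2.0         # halve to the 8-bit [0,180) range
    s = np.where(mx == 0, 0.0, 255 * (mx - mn) / np.where(mx == 0, 1.0, mx))
    return np.stack([h, s, mx], axis=-1)

def count_in_range(hsv, lo, hi):
    """Number of pixels whose (h, s, v) all fall inside [lo, hi]."""
    return int(np.all((hsv >= lo) & (hsv <= hi), axis=-1).sum())
```

With the blue-uniform bounds from the case, `count_in_range(hsv, (105, 75, 25), (130, 255, 125))` gives the pixel sum that is compared against the 3200 threshold.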
4. Extracting decoration feature information
Taking extracting the chest card information as an example:
(1) Selecting feature points: feature points of an image can be understood simply as relatively salient points, such as contour points, bright points in darker areas, and dark points in lighter areas.
Feature-point detection is based on the image gray values around a candidate point: if the pixels in a ring around the candidate differ sufficiently from the candidate's gray value, the point is regarded as a feature point.
(2) Using the ORB detector, the BRIEF algorithm is used to describe each feature point.
In a compliant uniform picture, an ORB value exceeding 55 may be taken to indicate the presence of a chest card at that position.
5. Extracting other characteristic information of clothing
Taking the extraction of the garment red zipper center seam information as an example:
With the camera position and the human-body position known, positioning can be done from the face position. Let the face position be [l, t, r, b] and the picture width and height be w, h. In this case the red-zipper center-seam position is set to [0.3w, 0.2h, 0.6r, 0.65b], and the judgment uses a color-threshold pixel sum. For example, the HSV color range of the red-zipper center seam in the picture is:
lower bound: 160, 100, 25;
upper bound: 190, 225, 125.
In this case, a center seam with a pixel sum above 137 is considered satisfactory.
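The seam check above can be sketched as a region lookup plus an in-range pixel count. This is an illustrative sketch: the case mixes face-box and picture coordinates, so this version assumes the region is given as fractions of the picture size, and `hsv_crop` is assumed to already be in the 8-bit HSV convention used for the bounds.

```python
import numpy as np

def seam_region(w, h):
    """Center-seam box as (left, top, right, bottom) pixel coordinates,
    using the picture-fraction positions from the case above."""
    return (int(0.3 * w), int(0.2 * h), int(0.6 * w), int(0.65 * h))

def red_seam_pixels(hsv_crop, lo=(160, 100, 25), hi=(190, 225, 125)):
    """Count of pixels in the cropped seam region within the red HSV range."""
    mask = np.all((hsv_crop >= lo) & (hsv_crop <= hi), axis=-1)
    return int(mask.sum())
```

The returned count is compared against the 137 threshold from the case to decide whether the seam is satisfactory.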
Fourthly, inputting the clothing classification information, clothing pattern information, clothing color information, ornament feature information and other clothing feature information into the svm model and outputting the classification information
The attributes are normalized and combined into an attribute vector as follows.
The extracted clothing feature vector is treated as a vector X of shape [n, 1], with X_i = [A_u, A_p, A_uc, A_lc, A_mc, A_o], where:
- A_u: uniform probability output by NasnetMobile, ∈ [0, 1];
- A_p: probability that the pattern is solid, ∈ [0, 1];
- A_uc = min(Σ upper-body compliant-color pixels / compliant-color threshold, 1);
- A_lc = min(Σ lower-body compliant-color pixels / compliant-color threshold, 1);
- A_mc = min(Σ center-seam compliant-color pixels / compliant-color threshold, 1);
- A_o = min(ORB value / compliant-ornament threshold, 1).
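The normalization can be sketched directly from those definitions. The function name is an assumption; the threshold defaults reuse the case values quoted earlier (3200 for color, 137 for the seam, 55 for the ORB value).

```python
import numpy as np

def attribute_vector(p_uniform, p_solid, upper_sum, lower_sum, seam_sum, orb_val,
                     color_thresh=3200, seam_thresh=137, orb_thresh=55):
    """Normalize the six features into X_i = [A_u, A_p, A_uc, A_lc, A_mc, A_o]."""
    clamp = lambda v, t: min(v / t, 1.0)   # min(sum / threshold, 1)
    return np.array([p_uniform, p_solid,
                     clamp(upper_sum, color_thresh),
                     clamp(lower_sum, color_thresh),
                     clamp(seam_sum, seam_thresh),
                     clamp(orb_val, orb_thresh)])
```

Clamping each ratio at 1 keeps every component in [0, 1], so no single feature can dominate the svm input.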
The classification information is then output using the modified single-class svm of this patent, whose constraint is

(x_i − a)^T (x_i − a) ≤ R² + ξ_i, ξ_i ≥ 0

where x_i is each input attribute vector (uniform probability, color thresholds, pattern type, ornament feature value, other clothing feature value), a is the decision center, R is the decision radius, ξ_i is the decision-boundary slack (some positive vectors may fall outside the decision boundary), and C is the slack weight.
In this case, the final output value of the compliance garment should exceed 0.7.
That is, when the final output value is less than or equal to 0.7, the uniform is not compliant with the specification.
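The decision side of that constraint can be sketched as a hypersphere membership test. This is a minimal SVDD-style illustration under the assumption that the center a and radius R have already been learned by the single-class svm; it omits the slack term and the mapping to the 0.7 output score, which the patent does not spell out.

```python
import numpy as np

def svdd_decision(x, a, R):
    """True when the attribute vector lies inside the learned hypersphere,
    i.e. (x - a)^T (x - a) <= R^2 per the constraint above."""
    d2 = float((x - a) @ (x - a))   # squared distance to the decision center
    return d2 <= R * R
```

Vectors near the center (fully compliant attributes) pass; vectors far outside the radius fail, mirroring the compliant / non-compliant split at the final threshold.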
The above description is only a preferred embodiment of the present invention and is not intended to limit it in any way. Those skilled in the art can make various changes and modifications to equivalent embodiments without departing from the scope of the invention; all such changes, modifications, equivalents and improvements remain within the spirit and principle of the present invention.
Claims (7)
1. A uniform identification method is characterized by comprising the following steps:
s1: training a clothing classification model, a clothing pattern model and an svm model;
s2: acquiring a picture of an area where a uniform is located by a face detection technology;
s3: respectively sending the pictures of the uniform area into a trained clothing classification model and a clothing pattern model to obtain clothing classification information and clothing pattern information, and extracting clothing color information, ornament characteristic information and other characteristic information of the clothing;
s4: and inputting the clothing classification information, the clothing pattern information, the clothing color information, the ornament characteristic information and other characteristic information of the clothing into the svm model and outputting the classification information.
2. The method according to claim 1, wherein the clothing classification model is based on an existing NasnetMobile network and is obtained by transfer learning after adding part of the custom uniform data to the network's original "suit" data set.
3. The method for identifying a uniform as claimed in claim 1, wherein the clothing pattern model is based on an existing DenseBlock network and is obtained through deep learning on the DeepFashion data set.
4. The uniform identification method of claim 1, wherein the ornament feature information comprises chest card information.
5. The method of claim 1, wherein the other characteristic information of the garment includes red-zipper center-seam information.
6. The uniform identification method of claim 1, wherein the svm model is a single classification svm model.
7. The uniform identification method of claim 2, wherein the number of pictures of the partially customized uniform data is 150.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010018293.1A CN111242209A (en) | 2020-01-08 | 2020-01-08 | Uniform identification method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010018293.1A CN111242209A (en) | 2020-01-08 | 2020-01-08 | Uniform identification method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111242209A true CN111242209A (en) | 2020-06-05 |
Family
ID=70866261
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010018293.1A Pending CN111242209A (en) | 2020-01-08 | 2020-01-08 | Uniform identification method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111242209A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113822197A (en) * | 2021-09-23 | 2021-12-21 | 南方电网电力科技股份有限公司 | Work dressing identification method and device, electronic equipment and storage medium |
CN114359150A (en) * | 2021-12-03 | 2022-04-15 | 深圳市宏电技术股份有限公司 | Work clothes detection method based on edge detection and histogram threshold setting |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20200605 |