CN106503644B - Glasses attribute detection method based on edge projection and color characteristic - Google Patents

Glasses attribute detection method based on edge projection and color characteristic

Info

Publication number
CN106503644B
Authority
CN
China
Prior art keywords
glasses
frame
formula
face
condition
Prior art date
Legal status
Expired - Fee Related
Application number
CN201610910469.8A
Other languages
Chinese (zh)
Other versions
CN106503644A (en)
Inventor
赵明华
张鑫
张飞飞
陈棠
董博源
殷谭生
石争浩
Current Assignee
Xian University of Technology
Original Assignee
Xian University of Technology
Priority date
Filing date
Publication date
Application filed by Xian University of Technology filed Critical Xian University of Technology
Priority to CN201610910469.8A
Publication of CN106503644A
Application granted
Publication of CN106503644B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a glasses attribute detection method based on edge projection and color features. A face image to be detected is input, preprocessed, and subjected to edge detection to obtain the edge-information image of the face, and the face region is partitioned by locating the mouth. Whether the glasses have a frame is then determined from the horizontal and vertical projections of the edge-information image of the upper half of the face. For framed glasses the left and right lens regions are further separated along the vertical center line of the face, while for rimless glasses the lens regions are determined from their position relative to the face. For framed glasses, the frame width and the lens color are computed from the edge information of the lens region, a color depth coefficient, and a skin-color likelihood. Finally, a luminance coefficient of the left and right lenses is computed to detect lens reflection. The method requires little feature information and performs detection quickly and accurately.

Description

Glasses attribute detection method based on edge projection and color characteristic
Technical field
The invention belongs to the fields of face detection, image processing and computer vision, and in particular relates to a glasses attribute detection method based on edge projection and color features.
Background technique
In recent years, biometric identification technology has been widely applied in many fields, and face recognition, as one of the important topics in biometric identification, has made great progress. However, studies have found that glasses, the most widely worn facial accessory, seriously affect the accuracy of face detection and recognition. Glasses attributes such as the frame, the lens color and lens reflection have an important influence on face recognition, entry-exit document inspection, and the online upload of ID photos. Therefore, detecting the attributes of glasses in a face image has great significance and application value.
At present, there are relatively few methods for detecting glasses in a face image, and most of them use extracted feature information to determine whether glasses are worn, locate the spectacle frame with mathematical methods, and remove framed glasses from the face image with methods based on principal component analysis. The common way to locate the spectacle frame is binarization combined with mathematical morphology. This approach is only suitable when the glasses worn have a thick frame, and when repeated opening and closing operations are applied to the binary image, different thresholds must be set for different situations. Methods for detecting and locating dark-framed glasses generally use the maximum of the horizontal binary projection of the original image as a reference point for the hair and locate the glasses from it. Such methods require the face in the image to have dark hair, so their applicability is quite limited, and they can only detect and locate thick-framed glasses. Both kinds of methods can only locate thick-framed glasses and also ignore the localization of rimless glasses. In addition, current methods seldom address the detection of the individual attributes of the glasses worn in a face image.
Summary of the invention
The object of the present invention is to provide a glasses attribute detection method based on edge projection and color features, which detects whether the glasses worn in a face image have a frame, the frame width, the lens color, and whether the lenses contain reflective regions, thereby solving the problem that existing detection methods cover too little content.
The technical scheme adopted by the invention is a glasses attribute detection method based on edge projection and color features, the attributes including whether the glasses have a frame, comprising the following steps:
Step 1: extract the face edge-information image f'(x, y) and determine the mouth region; take the x-direction center of the located mouth as the position of the vertical center line of the face, and the highest point of the mouth in the y direction as the dividing line of the upper half of the face.
Step 2: extract the edge-information image f'_up(x, y) of the upper half of the face, apply horizontal and vertical projection to f'_up(x, y) according to formula (5) and formula (7) respectively, and determine the vertical and horizontal position ranges of the glasses according to formula (6) and formula (8) respectively, obtaining the glasses region;
Index_y(k) = find(R_j > μ × max(R))    (6)
Index_x(k) = find(C_i > max(C) - μ_1)    (8)
Then compute the aspect ratio of the glasses region: if the ratio of width to height is greater than 8.5, the glasses are judged to be rimless; otherwise they are judged to be framed glasses.
Further features of the present invention are as follows:
The detected attributes may also include the frame width of framed glasses. Specifically: taking the vertical center line of the face as the dividing line between the left and right lenses, apply vertical projection with formula (7) to the vertical and horizontal position information of the glasses obtained in step 2, compute the left and right boundary positions of the left and right lenses with formula (8), extract the lower half of the left and right lenses from the face edge-information image f'(x, y) or f'_up(x, y), and compute the frame width in the middle region of each lens in the horizontal direction.
The detected attributes may also include the color of the lenses of framed glasses. Specifically, on the basis of the obtained frame position and frame width information, the following steps are added: the area below the lower boundary of the spectacle frame is taken as the outside-frame region, and the area above the lower boundary of the frame minus the frame width is taken as the inside-frame region; a region of size m × n is acquired from each, and the lens color depth coefficients DL of the inside-frame region and the outside-frame region are computed separately using formula (12),
with a denoting the depth coefficient of the acquired outside-frame region and b the depth coefficient of the acquired inside-frame region; the lens color is judged from the values of a and b:
1) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| > 10, the lenses are judged to be colored;
2) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| ≤ 10, the lenses are judged to be colorless;
3) if a and b are both greater than or equal to 150, or both less than or equal to 150, the skin-color likelihood ratio τ of the outside-frame and inside-frame regions is computed with formula (13); if 0.001 < τ < 10 the lenses are judged to be colorless, otherwise colored.
Further, for colored lenses, the depth of the lens color can be judged from the value of b: if b ≥ 80 the lenses are light-colored, and if b < 80 they are dark-colored.
The detected attributes may also include whether the lenses are reflective. On the basis of the glasses region of the framed or rimless glasses obtained in step 2, the following steps are added: compute the luminance coefficient Bright of each pixel of the left and right lenses in the HSV color space, and count the number of pixels in each luminance-coefficient interval with formula (24).
If the pixel counts of the luminance intervals are evenly distributed and the intervals of large luminance occupy a large proportion, the light is judged to be evenly distributed and the lenses are judged to be non-reflective; otherwise the lenses are judged to be reflective.
Here, the criterion for judging that the pixel counts of the luminance intervals are evenly distributed and that the intervals of large luminance occupy a large proportion is: the pixel count of the largest luminance-coefficient interval exceeds 10% of the pixel count of the interval containing the most pixels; the pixel count of the second-largest luminance-coefficient interval exceeds 50% of that count; or the pixel count of the third-largest luminance-coefficient interval exceeds 70% of that count. If any one of these three conditions is met, the distribution is considered even with the large-luminance intervals occupying a large proportion.
For determining the lens region of rimless glasses, the following method is preferred: estimate the vertical position range of the eyes with the horizontal projection method of step 2 and determine the y-axis position of the eyes; then, centered on the eye position and taking the length and width of the face as a basis, estimate the extent of the rimless glasses.
The invention has the advantage that it detects whether the glasses worn in a face image have a frame, the frame width, whether the lenses are colored, and whether the lenses contain reflective regions, which helps to improve the performance of face detection and face recognition and allows effective judgment of whether the glasses worn in an ID photo uploaded online meet the standard.
Detailed description of the invention
Fig. 1 is the flow chart of the glasses attribute detection method based on edge projection and color features of the present invention;
Fig. 2 is the set of images to be detected input in the embodiment;
Fig. 3 shows the preprocessing results of representative images chosen from the image set of Fig. 2;
Fig. 4 shows the edge detection results corresponding to the images in Fig. 3;
Fig. 5 shows the mouth region localization, where the box marks the mouth region, the vertical line indicates the vertical center line of the face (i.e. x = the x-direction center of the mouth), and the horizontal line indicates the dividing line of the upper half of the face (i.e. y = the highest point of the mouth in the y direction);
Fig. 6a shows the region of framed glasses determined by horizontal and vertical projection;
Fig. 6b shows the region of rimless glasses determined by horizontal and vertical projection;
Fig. 7a shows the determination of the left and right lens regions for framed glasses;
Fig. 7b shows the determination of the left and right lens regions for rimless glasses;
Fig. 8 is the edge map of the glasses region, where the boxes mark the regions used to compute the frame width;
Fig. 9a shows the frame measurement result for framed glasses, with the frame width given above the picture;
Fig. 9b shows the judgment result for rimless glasses;
Fig. 10 shows the lens color detection result; the boxes inside and outside the frame mark the selected inside-frame and outside-frame regions respectively, which are shown enlarged on the right of the figure;
Fig. 11a is the luminance map of a reflective lens;
Fig. 11b is the luminance map of a non-reflective lens;
Fig. 11c is the luminance-coefficient statistics chart of Fig. 11a;
Fig. 11d is the luminance-coefficient statistics chart of Fig. 11b;
Fig. 12 shows the lens reflection detection result.
Specific embodiment
The present invention will be described in further detail below with reference to the accompanying drawings and specific embodiments.
In the glasses attribute detection method based on edge projection and color features of the invention, the first part preprocesses the face image and performs edge detection, determines the glasses region by horizontal and vertical projection of the edge-detected image, and judges from the aspect ratio of the glasses region whether the glasses have a frame; the second part computes the frame width from the edge detection result, and then judges the lens color and the reflection situation by computing a color depth coefficient, a skin-color similarity, and a luminance coefficient.
As shown in Fig. 1, the glasses attributes are detected according to the following steps.
Step 1: extract the face edge-information image and determine the mouth region.
(1) Extract the face edge-information image.
A face image f(x, y) is input, as shown in Fig. 2. Since the collected face image contains interference such as noise, the face region must be processed to reduce errors. First, the color face image is converted into a gray image; then Gaussian smoothing is applied to the gray image, as shown in Fig. 3; finally, the edge-information image f'(x, y) of the smoothed gray image is extracted, as shown in Fig. 4.
(2) Determine the mouth region.
The input color image f(x, y) is transformed from the RGB color space into the YCbCr color space, and a Gaussian model in the YCbCr color space is used with formula (1) to compute the likelihood P that a face pixel is a mouth pixel; the result is binarized to obtain candidate mouth regions, and the mouth region is determined from information such as shape and position, as shown in Fig. 5.
Taking the upper-left corner of the image as the coordinate origin, the x axis points to the right and the y axis points downward. The x-direction center of the located mouth is taken as the position of the vertical center line of the face, x = Cen, shown as the vertical line in Fig. 5, and the highest point Up of the mouth in the y direction is taken as the dividing line of the upper half of the face, y = Up, shown as the horizontal line in Fig. 5.
P = exp((-0.5) × (x - M)' × inv(cov) × (x - M))    (1)
Formula (1) is evaluated after transforming the input color face image f(x, y) from the RGB color space into the YCbCr color space, where x = (Cb, Cr)^T is the chrominance vector of a pixel; cov and M denote the covariance matrix and the mean vector of the chrominance vector, respectively; P indicates the similarity between the pixel and the mouth, a larger value meaning the pixel is more likely to belong to the mouth region, and vice versa. From experimental statistics, the mean and covariance are:
M = (156.5599, 117.4361)^T    (2)
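As an illustration of how formula (1) can be evaluated over a whole image, a minimal Python/NumPy sketch is given below. The covariance matrix used here is a placeholder assumption, because formula (3) giving its numerical value is not reproduced in this text; the function name mouth_likelihood and the 0.5 binarization threshold are likewise illustrative choices, not part of the patent.

```python
import cv2
import numpy as np

# Mean chrominance vector (Cb, Cr) of mouth pixels, from formula (2).
M_MOUTH = np.array([156.5599, 117.4361])
# Covariance of the chrominance vector; formula (3) is not reproduced in this
# text, so this 2x2 matrix is a placeholder assumption for illustration only.
COV_MOUTH = np.array([[160.0, 20.0],
                      [20.0, 180.0]])

def mouth_likelihood(bgr_image, mean=M_MOUTH, cov=COV_MOUTH):
    """Per-pixel likelihood P = exp(-0.5 (x - M)' inv(cov) (x - M)) of
    formula (1), evaluated on the (Cb, Cr) chrominance of each pixel."""
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb).astype(np.float64)
    cb, cr = ycrcb[..., 2], ycrcb[..., 1]      # OpenCV channel order is Y, Cr, Cb
    x = np.stack([cb, cr], axis=-1) - mean     # chrominance vector minus mean
    inv_cov = np.linalg.inv(cov)
    d2 = np.einsum('...i,ij,...j->...', x, inv_cov, x)   # Mahalanobis distance
    return np.exp(-0.5 * d2)

# Usage sketch: binarize the likelihood map and keep the mouth-shaped blob.
# face = cv2.imread('face.jpg')
# mouth_mask = (mouth_likelihood(face) > 0.5).astype(np.uint8) * 255
```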
Step 2: apply horizontal and vertical projection to the upper-half face edge-information image f'_up(x, y) to determine the glasses region, and judge from the aspect ratio of that region whether the glasses have a frame.
Step 2-1: determine the glasses region.
First, from the edge-information image obtained in step 1 and the dividing line of the upper half of the face obtained from the mouth region, the upper-half face edge-information image f'_up(x, y) is extracted with formula (4).
f'_up(1:M, 1:N) = f'(1:Up, 1:N)    (4)
Formula (4) assigns the pixel information of rows 1 to Up and columns 1 to N of f' to f'_up.
Then, horizontal projection is applied to the upper-half face edge detection result f'_up(x, y) in the manner of formula (5), obtaining the sum R_j of the pixels of every row of the edge-information image f'_up(x, y), i.e. R_j = Σ_i f'_up(i, j).
In formula (5), f'_up(i, j) denotes each pixel value of the upper-half face edge-information image f'_up(x, y), with i and j its horizontal and vertical coordinates; pro_r denotes the horizontal projection operation, which adds the element values of every row of the matrix to obtain a column vector, and R_j denotes the value of row j of that column vector.
The vertical position range [Glass_y1, Glass_y2] of the glasses is then determined with formula (6).
Index_y(k) = find(R_j > μ × max(R))    (6)
In formula (6), j = 1, 2, 3, ..., m; k = 1, 2, 3, ..., last; R is the column vector obtained in formula (5); max(R) is the maximum value in that column vector; μ is a threshold coefficient, taken as 0.55 in this method; find returns the subscript j of every R_j that satisfies the condition; Index_y(k) records the subscript of the k-th R_j satisfying the condition, i.e. the subscript j of a satisfying R_j is assigned to Index_y(k); last records the number of R_j satisfying the condition. The subscript of the first satisfying R_j is Index_y(1), which is the position Glass_y1 of the upper frame of the glasses; the subscript of the last satisfying R_j is Index_y(last), which is the position Glass_y2 of the lower frame.
Next, vertical projection is applied to the upper-half face edge-information image f'_up(x, y) in the manner of formula (7), obtaining the sum C_i of the pixels of each column of f'_up(x, y), i.e. C_i = Σ_j f'_up(i, j).
In formula (7), f'_up(i, j) denotes each pixel value of the upper-half face edge-information image f'_up(x, y), with i and j its horizontal and vertical coordinates; pro_c denotes the vertical projection operator, which adds the element values of each column of the matrix to obtain a row vector, and C_i denotes the value of column i of that row vector.
The horizontal position range [Glass_x1, Glass_x2] of the glasses is then determined with formula (8). The regions determined for the different kinds of glasses by horizontal and vertical projection are shown in Fig. 6a and Fig. 6b.
Index_x(k) = find(C_i > max(C) - μ_1)    (8)
In formula (8), i = 1, 2, 3, ..., n; k = 1, 2, 3, ..., last; C is the row vector obtained in formula (7); max(C) is the maximum value in that row vector; μ_1 is a threshold coefficient, taken as 10 in this method; find returns the subscript i of every C_i that satisfies the condition; Index_x(k) records the subscript of the k-th C_i satisfying the condition, i.e. the subscript i of a satisfying C_i is assigned to Index_x(k); last records the number of C_i satisfying the condition. The subscript of the first satisfying C_i is Index_x(1), which is the position Glass_x1 of the left frame of the glasses; the subscript of the last satisfying C_i is Index_x(last), which is the position Glass_x2 of the right frame.
The glasses position is thus known from its horizontal and vertical position ranges.
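The projections of formulas (5)-(8) amount to row and column sums followed by a thresholded index search. A minimal NumPy sketch under that reading is given below; edge_up stands for the binary upper-half edge image f'_up, and the values 0.55 and 10 are the μ and μ_1 stated in the text.

```python
import numpy as np

def glasses_band(edge_up, mu=0.55, mu1=10):
    """Vertical and horizontal extent of the glasses in the upper-half edge
    image f'_up, following formulas (5)-(8)."""
    # Formula (5): horizontal projection, sum of every row -> column vector R.
    R = edge_up.sum(axis=1)
    # Formula (7): vertical projection, sum of every column -> row vector C.
    C = edge_up.sum(axis=0)

    # Formula (6): rows whose projection exceeds mu * max(R).
    idx_y = np.flatnonzero(R > mu * R.max())
    glass_y1, glass_y2 = idx_y[0], idx_y[-1]   # upper / lower frame rows

    # Formula (8): columns whose projection exceeds max(C) - mu1.
    idx_x = np.flatnonzero(C > C.max() - mu1)
    glass_x1, glass_x2 = idx_x[0], idx_x[-1]   # left / right frame columns

    return (glass_y1, glass_y2), (glass_x1, glass_x2)
```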
Step 2-2: judge whether the glasses have a frame.
After the frame positions have been computed, the height and width of the glasses are computed with formula (9) and formula (10), and the aspect ratio of the glasses with formula (11):
Frame_y = Glass_y2 - Glass_y1    (9)
Frame_x = Glass_x2 - Glass_x1    (10)
Ratio = Frame_x / Frame_y    (11)
Whether the glasses have a frame is judged from this ratio: if the Ratio computed by formula (11) is greater than 8.5, the glasses are judged to be rimless, as in the judgment result of Fig. 9b, and their frame width is set to 0; otherwise they are judged to be framed glasses.
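The frame judgment of formulas (9)-(11) then reduces to an aspect-ratio test; the small sketch below uses the 8.5 threshold stated above and the band limits returned by the projection step.

```python
def has_frame(glass_y, glass_x, threshold=8.5):
    """Formulas (9)-(11): aspect ratio of the detected glasses band.
    Returns True for framed glasses, False for rimless ones."""
    frame_y = glass_y[1] - glass_y[0]          # formula (9)
    frame_x = glass_x[1] - glass_x[0]          # formula (10)
    ratio = frame_x / max(frame_y, 1)          # formula (11), guard against zero height
    return ratio <= threshold                  # Ratio > 8.5 means rimless glasses
```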
Step 3: compute the frame width.
1. For the glasses region determined for framed glasses, the center line x = Cen obtained in step 2 is taken as the dividing line between the left and right lenses. Vertical projection is applied with formula (7) to the edge information of the glasses region marked in step 2, and formula (8) is used to further refine the left and right boundary positions Glass_ll, Glass_lr, Glass_rl and Glass_rr of the left and right lenses Glass_l and Glass_r, as shown in Fig. 7a. The lower half of the left and right lenses Glass_l and Glass_r is extracted from the edge-information image f'(x, y) or the upper-half face edge-information image f'_up(x, y), and the frame width is computed in the middle region of each lens in the horizontal direction, as shown in Fig. 8. The frame-width result for framed glasses is shown in Fig. 9a.
2. The frame width of rimless glasses defaults to 0.
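The text does not spell out the exact counting rule used to measure the frame width inside the middle region of each lens, so the sketch below is only one plausible reading: in the lower half of the lens it measures, column by column over the middle third of the lens, the vertical run spanned by edge pixels and averages the result. Function and variable names are illustrative.

```python
import numpy as np

def frame_width(edge_img, lens_x1, lens_x2, glass_y1, glass_y2):
    """Rough frame-width estimate for one lens (step 3). In the lower half of
    the lens, over the middle third of its columns, measure the vertical run
    spanned by edge pixels in each column and average the runs. This counting
    rule is an interpretation; the patent text does not state it explicitly."""
    mid_y = (glass_y1 + glass_y2) // 2
    lower = edge_img[mid_y:glass_y2 + 1, lens_x1:lens_x2 + 1]
    w = lens_x2 - lens_x1 + 1
    widths = []
    for c in range(w // 3, 2 * w // 3):        # middle region of the lens
        rows = np.flatnonzero(lower[:, c] > 0)
        if rows.size:
            widths.append(rows[-1] - rows[0] + 1)
    return float(np.mean(widths)) if widths else 0.0
```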
Step 4: judge the lens color.
1. For framed glasses, the frame position and frame width obtained in step 3 are used to extract information at two places, inside the frame and outside the frame. Specifically, the area below the lower boundary of the spectacle frame is taken as the outside-frame region, and the area above the lower boundary of the frame minus the frame width is taken as the inside-frame region; a region of size m × n is acquired from each. The acquired regions are outlined in the face image on the left of Fig. 10 and shown enlarged on the right of Fig. 10. In this experiment m is 6 and n is 10. The lens color depth coefficients DL of the inside-frame region and the outside-frame region are computed separately with formula (12).
In formula (12), R, G and B denote the three color components of each pixel of the image.
Let a denote the depth coefficient of the acquired outside-frame region and b the depth coefficient of the acquired inside-frame region.
1) If one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| > 10, the lenses are judged to be colored. The shade is then further judged from the value of b: if b ≥ 80 the lenses are light-colored, and if b < 80 they are dark-colored.
2) If one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| ≤ 10, the lenses are judged to be colorless.
3) If a and b are both less than or equal to 150, the judgment is further made from the skin-color likelihood ratio.
The skin-color likelihood ratio τ is computed with formula (13): if 0.001 < τ < 10 the lenses are judged to be colorless, otherwise colored. The shade is then further judged from the value of b: if b ≥ 80 the lenses are light-colored, and if b < 80 they are dark-colored.
4) If a and b are both greater than or equal to 150, the judgment is likewise made from the skin-color likelihood ratio.
The skin-color likelihood ratio τ is computed with formula (13): if 0.001 < τ < 10 the lenses are judged to be colorless, otherwise colored; since b ≥ 150 ≥ 80 in this case, colored lenses are necessarily light-colored.
In formula (13), P_s is the skin-color likelihood of the acquired outside-frame region of the face computed with formula (1), and P_g is the skin-color likelihood of the acquired inside-frame region computed with formula (1). The mean and covariance used in formula (1) here are:
M_skin = (117.4316, 148.5599)^T    (14)
Here a denotes the depth coefficient of the acquired outside-frame region, b the depth coefficient of the acquired inside-frame region, and τ the skin-color likelihood ratio. Whether the lenses are colored is judged by the above procedure; the judgment results are shown in Fig. 10.
2. For rimless glasses, the lenses are taken to be colorless by default.
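A compact sketch of the step 4 decision rules follows. Formula (12) itself is not reproduced in this text, so the depth coefficient is assumed here to be the mean of the R, G and B components of the sampled m × n patch, and the direction of the ratio τ in formula (13) is likewise an assumption; skin_likelihood stands for formula (1) evaluated with the skin mean of formula (14).

```python
import numpy as np

def depth_coefficient(patch_rgb):
    """Assumed form of formula (12): mean of the R, G, B components over the
    sampled m x n patch (the exact formula is not reproduced in this text)."""
    return float(patch_rgb.astype(np.float64).mean())

def lens_color(outside_patch, inside_patch, skin_likelihood):
    """Step 4 decision rules. a and b are the depth coefficients of the
    patches just outside and just inside the lower frame; tau is the
    skin-likelihood ratio of formula (13) (direction of the ratio assumed)."""
    a = depth_coefficient(outside_patch)
    b = depth_coefficient(inside_patch)
    one_high = (a >= 150) != (b >= 150)
    if one_high and abs(a - b) > 10:           # rule 1): lens carries a tint
        colored = True
    elif one_high:                             # rule 2): |a - b| <= 10, colorless
        colored = False
    else:                                      # rules 3) and 4): both high or both low
        tau = skin_likelihood(outside_patch) / skin_likelihood(inside_patch)
        colored = not (0.001 < tau < 10)
    if not colored:
        return 'colorless'
    return 'light' if b >= 80 else 'dark'      # shade from the inside coefficient b
```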
Step 5: judge whether the lenses are reflective.
The left and right lens regions of the framed or rimless glasses are obtained according to the above steps, and the position of maximum brightness is searched for in the left and right lenses separately, as follows.
Step 5-1: obtain the lens regions.
1. For framed glasses, the left and right lens regions are obtained from step 3.
2. For rimless glasses, since the method of step 2 cannot estimate the boundary between the glasses and the face entirely accurately, the vertical position range [Glass_y1, Glass_y2] of the eyes is estimated with the horizontal projection method of step 2, and the y-axis position of the eyes is determined with formula (16). Then, centered on the eye position and taking the length and width of the face as a basis, the approximate extent of the rimless glasses is estimated according to formulas (17)-(22), as shown in Fig. 7b.
X_ll = θ × W    (17)
X_lr = Cen - (δ × W)    (18)
X_rl = Cen + (δ × W)    (19)
X_rr = (1 - θ) × W    (20)
In formulas (17)-(22), x = Cen is the position of the center line, Eye_y is the y-axis position of the eyes, W is the face width, H is the face length, X_ll and X_lr denote the left and right boundaries of the left lens of the rimless glasses, X_rl and X_rr the left and right boundaries of the right lens, Y_1 and Y_2 the upper and lower boundaries of the rimless glasses, and θ and δ, together with the coefficient used for Y_1 and Y_2 in formulas (21) and (22), are position offset ratios.
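Formulas (16), (21) and (22) are not reproduced in this text, so the sketch below estimates the rimless lens boxes from formulas (17)-(20) only and fills the vertical extent with an assumed band around the estimated eye row; the offset ratios theta, delta and phi are placeholder values, not the ones used in the patent.

```python
def rimless_lens_boxes(cen, eye_y, face_w, face_h,
                       theta=0.12, delta=0.05, phi=0.08):
    """Estimate the left/right lens boxes of rimless glasses from the face
    geometry following formulas (17)-(20); theta, delta and phi are assumed
    offset ratios, and the vertical band of +/- phi * H around eye_y stands
    in for formulas (21)-(22), which are not reproduced in the text."""
    x_ll = theta * face_w                  # formula (17): left edge of left lens
    x_lr = cen - delta * face_w            # formula (18): right edge of left lens
    x_rl = cen + delta * face_w            # formula (19): left edge of right lens
    x_rr = (1 - theta) * face_w            # formula (20): right edge of right lens
    y1 = eye_y - phi * face_h              # assumed upper boundary
    y2 = eye_y + phi * face_h              # assumed lower boundary
    left_box = (int(x_ll), int(y1), int(x_lr), int(y2))
    right_box = (int(x_rl), int(y1), int(x_rr), int(y2))
    return left_box, right_box
```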
Step 5-2: search for the position of maximum brightness in the lens regions.
First, the acquired left and right lenses are converted from the RGB color space into the HSV color space, and the luminance coefficient of each pixel is computed with formula (23):
Bright = (1/S) + (V × 100)    (23)
In formula (23), S and V are the S value and V value of each pixel of the target region after conversion from the RGB color space into the HSV color space, and Bright is the computed luminance coefficient of the pixel. Fig. 11a and Fig. 11b show the luminance coefficients, divided by 10 for display, of a reflective lens region and a non-reflective lens region respectively.
Then the number of pixels in each luminance-coefficient interval is counted with formula (24) and a statistics chart is made, as shown in Fig. 11c and Fig. 11d.
In formula (24), the luminance coefficients are divided into intervals, and the count of each luminance-coefficient interval is the number of pixels whose luminance coefficient falls into that interval; before counting, each count defaults to 0.
If the statistics chart is evenly distributed and the intervals of large luminance occupy a large proportion (i.e. the last three luminance-coefficient intervals are each compared with the interval containing the most pixels, and the criterion is that the pixel count of the largest luminance-coefficient interval exceeds 10% of the pixel count of that interval, the pixel count of the second-largest interval exceeds 50% of it, or the pixel count of the third-largest interval exceeds 70% of it, any one of the three conditions being sufficient), as in Fig. 11d, the light is judged to be evenly distributed and the lens is judged to have no reflective region.
If the statistics chart is unevenly distributed, the lens is judged to contain a reflective region; the lens region is then scanned with a 5 × 5 template to locate the position of maximum brightness in the lens. A threshold is set according to the brightness interval of the histogram, the image is binarized with this threshold, and the region of the binary image containing the maximum luminance-coefficient point is selected as the reflective region of the lens, as shown in Fig. 12.
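A sketch of step 5-2: the luminance coefficient of formula (23) is computed in HSV space, binned into a histogram in the spirit of formula (24), and the three-condition uniformity test described above decides whether the lens is reflective. The number of bins and the normalization of S and V to [0, 1] are assumptions, since the interval width used in formula (24) is not stated.

```python
import cv2
import numpy as np

def lens_is_reflective(lens_bgr, n_bins=10):
    """Step 5-2 sketch: Bright = 1/S + 100*V per pixel (formula (23)),
    histogram over n_bins equal intervals (in the spirit of formula (24)),
    then the three-condition uniformity test of the text."""
    hsv = cv2.cvtColor(lens_bgr, cv2.COLOR_BGR2HSV).astype(np.float64)
    s = hsv[..., 1] / 255.0                     # S normalized to [0, 1]
    v = hsv[..., 2] / 255.0                     # V normalized to [0, 1]
    bright = 1.0 / np.maximum(s, 1e-3) + 100.0 * v       # formula (23)

    counts, _ = np.histogram(bright, bins=n_bins)         # formula (24)
    most = counts.max()
    third, second, largest = counts[-3], counts[-2], counts[-1]
    evenly_distributed = (largest > 0.10 * most or
                          second > 0.50 * most or
                          third > 0.70 * most)
    return not evenly_distributed               # uneven distribution -> reflective
```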
This method detects whether the glasses worn in a face image have a frame, the frame width, whether the lenses are colored, and whether the lenses are reflective. It helps to improve the performance of face detection and face recognition, can effectively judge whether the glasses worn in an ID photo uploaded online meet the standard, and is suitable for face recognition, entry-exit document inspection, online checking of uploaded photos, and the like.

Claims (7)

1. A glasses attribute detection method based on edge projection and color features, characterized in that the attributes include whether the glasses have a frame, comprising the following steps:
Step 1: extract the face edge-information image f'(x, y) and determine the mouth region; take the x-direction center of the located mouth as the position of the vertical center line of the face, and the highest point of the mouth in the y direction as the dividing line of the upper half of the face;
Step 2: extract the upper-half face edge-information image f'_up(x, y), apply horizontal and vertical projection to f'_up(x, y) according to formula (5) and formula (7) respectively, and determine the vertical and horizontal position ranges of the glasses according to formula (6) and formula (8) respectively, obtaining the glasses region;
Index_y(k) = find(R_j > μ × max(R))    (6)
Index_x(k) = find(C_i > max(C) - μ_1)    (8)
then compute the aspect ratio of the glasses region: if the ratio of width to height is greater than 8.5, judge the glasses to be rimless; otherwise judge them to be framed glasses;
in formula (5), f'_up(i, j) denotes each pixel value of the upper-half face edge-information image f'_up(x, y), with i and j its horizontal and vertical coordinates; pro_r denotes the horizontal projection operation, which adds the element values of every row of the matrix to obtain a column vector, and R_j denotes the value of row j of that column vector;
in formula (6), j = 1, 2, 3, ..., m; k = 1, 2, 3, ..., last; R is the column vector obtained in formula (5); max(R) is the maximum value in the column vector; μ is a threshold coefficient, taken as 0.55 in this method; find returns the subscript j of every R_j satisfying the condition; Index_y(k) records the subscript of the k-th R_j satisfying the condition, i.e. the subscript j of a satisfying R_j is assigned to Index_y(k); last records the number of R_j satisfying the condition; the subscript of the first satisfying R_j is Index_y(1), which is the position Glass_y1 of the upper frame of the glasses; the subscript of the last satisfying R_j is Index_y(last), which is the position Glass_y2 of the lower frame;
in formula (7), f'_up(i, j) denotes each pixel value of the upper-half face edge-information image f'_up(x, y), with i and j its horizontal and vertical coordinates; pro_c denotes the vertical projection operator, which adds the element values of each column of the matrix to obtain a row vector, and C_i denotes the value of column i of that row vector;
in formula (8), i = 1, 2, 3, ..., n; k = 1, 2, 3, ..., last; C is the row vector obtained in formula (7); max(C) is the maximum value in the row vector; μ_1 is a threshold coefficient, taken as 10 in this method; find returns the subscript i of every C_i satisfying the condition; Index_x(k) records the subscript of the k-th C_i satisfying the condition, i.e. the subscript i of a satisfying C_i is assigned to Index_x(k); last records the number of C_i satisfying the condition; the subscript of the first satisfying C_i is Index_x(1), which is the position Glass_x1 of the left frame of the glasses; the subscript of the last satisfying C_i is Index_x(last), which is the position Glass_x2 of the right frame of the glasses.
2. The glasses attribute detection method based on edge projection and color features according to claim 1, characterized in that the attributes further include the frame width of framed glasses, the method being: take the vertical center line of the face as the dividing line between the left and right lenses, apply vertical projection with formula (7) to the vertical and horizontal position information of the glasses obtained in step 2, compute the left and right boundary positions of the left and right lenses with formula (8), extract the lower half of the left and right lenses from f'(x, y) or f'_up(x, y), and compute the frame width in the middle region of each lens in the horizontal direction.
3. The glasses attribute detection method based on edge projection and color features according to claim 2, characterized in that the attributes further include the color of the lenses of framed glasses, further comprising, on the basis of the obtained frame position and frame width information, the following steps: take the area below the lower boundary of the spectacle frame as the outside-frame region and the area above the lower boundary of the frame minus the frame width as the inside-frame region, acquire a region of size m × n from each, and compute the lens color depth coefficients DL of the inside-frame region and the outside-frame region separately using formula (12),
with a denoting the depth coefficient of the acquired outside-frame region and b the depth coefficient of the acquired inside-frame region; judge the lens color from the values of a and b:
1) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| > 10, judge the lenses to be colored;
2) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| ≤ 10, judge the lenses to be colorless;
3) if a and b are both greater than or equal to 150, or both less than or equal to 150, compute the skin-color likelihood ratio τ; if 0.001 < τ < 10, judge the lenses to be colorless, otherwise judge them to be colored;
where Ps is the skin-color likelihood of the acquired outside-frame region of the face and Pg is the skin-color likelihood of the acquired inside-frame region.
4. The glasses attribute detection method based on edge projection and color features according to claim 3, characterized in that, for colored lenses, the depth of the lens color is judged from the value of b: if b ≥ 80, the lenses are light-colored; if b < 80, the lenses are dark-colored.
5. The glasses attribute detection method based on edge projection and color features according to claim 2, characterized in that the attributes further include whether the lenses are reflective, further comprising, on the basis of the left and right lens regions of framed glasses obtained in step 3 and the rimless-glasses region obtained in step 2, the following steps: compute the luminance coefficient Bright of each pixel of the left and right lenses in the HSV color space, and count the number of pixels in each luminance-coefficient interval with formula (24);
if the pixel counts of the luminance intervals are evenly distributed and the intervals of large luminance occupy a large proportion, the light is judged to be evenly distributed and the lenses are judged to be non-reflective; otherwise the lenses are judged to be reflective;
in formula (24), the luminance coefficients are divided into intervals and the count of each luminance-coefficient interval is the number of pixels whose luminance coefficient falls into that interval; before counting, each count defaults to 0.
6. The glasses attribute detection method based on edge projection and color features according to claim 5, characterized in that, for accurate determination of the lens region of rimless glasses, the following method is used: estimate the vertical position range of the eyes with the horizontal projection method of step 2 and determine the y-axis position of the eyes; then, centered on the eye position and taking the length and width of the face as a basis, estimate the extent of the rimless glasses.
7. The glasses attribute detection method based on edge projection and color features according to claim 5 or 6, characterized in that the criterion for judging that the pixel counts of the luminance intervals are evenly distributed and that the intervals of large luminance occupy a large proportion is: the pixel count of the largest luminance-coefficient interval exceeds 10% of the pixel count of the interval containing the most pixels; the pixel count of the second-largest luminance-coefficient interval exceeds 50% of that count; or the pixel count of the third-largest luminance-coefficient interval exceeds 70% of that count; if any one of these three conditions is met, the pixel counts of the luminance intervals are considered evenly distributed with the large-luminance intervals occupying a large proportion.
CN201610910469.8A 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic Expired - Fee Related CN106503644B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610910469.8A CN106503644B (en) 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610910469.8A CN106503644B (en) 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic

Publications (2)

Publication Number Publication Date
CN106503644A CN106503644A (en) 2017-03-15
CN106503644B true CN106503644B (en) 2019-05-28

Family

ID=58294360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610910469.8A Expired - Fee Related CN106503644B (en) 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic

Country Status (1)

Country Link
CN (1) CN106503644B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107844742B (en) * 2017-09-26 2019-01-04 平安科技(深圳)有限公司 Facial image glasses minimizing technology, device and storage medium
CN107945126B (en) * 2017-11-20 2022-02-18 杭州登虹科技有限公司 Method, device and medium for eliminating spectacle frame in image
CN107992835A (en) * 2017-12-11 2018-05-04 浙江大学 A kind of glasses image-recognizing method
CN108564540B (en) * 2018-03-05 2020-07-17 Oppo广东移动通信有限公司 Image processing method and device for removing lens reflection in image and terminal equipment
CN108596067B (en) * 2018-04-15 2019-09-10 中少科普(北京)教育科技有限公司 A kind of Young Pioneer's salute detection bearing calibration
CN111488843A (en) * 2020-04-16 2020-08-04 贵州安防工程技术研究中心有限公司 Face sunglasses distinguishing method based on step-by-step inhibition of missing report and false report rate
CN116473501B (en) * 2023-04-28 2023-12-05 北京云柿信息技术有限公司 Automatic recording method, device and system for inserting-sheet type subjective refraction result
CN116343313B (en) * 2023-05-30 2023-08-11 乐山师范学院 Face recognition method based on eye features


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1606029A (en) * 2004-11-04 2005-04-13 上海交通大学 Infrared human face spectacle disturbance elimination method based on regional characteristic element compensation
CN101162502A (en) * 2006-10-13 2008-04-16 上海银晨智能识别科技有限公司 Method for removing glasses during human recognition
CN102163289A (en) * 2011-04-06 2011-08-24 北京中星微电子有限公司 Method and device for removing glasses from human face image, and method and device for wearing glasses in human face image
CN103020579A (en) * 2011-09-22 2013-04-03 上海银晨智能识别科技有限公司 Face recognition method and system, and removing method and device for glasses frame in face image
CN103632136A (en) * 2013-11-11 2014-03-12 北京天诚盛业科技有限公司 Method and device for locating human eyes
CN105095841A (en) * 2014-05-22 2015-11-25 小米科技有限责任公司 Method and device for generating eyeglasses
CN104050448A (en) * 2014-06-11 2014-09-17 青岛海信信芯科技有限公司 Human eye positioning method and device and human eye region positioning method and device
CN105046250A (en) * 2015-09-06 2015-11-11 广州广电运通金融电子股份有限公司 Glasses elimination method for face recognition
CN105787427A (en) * 2016-01-08 2016-07-20 上海交通大学 Lip area positioning method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A Fast Eye State Computing Algorithm; Xiaolin Zhang et al.; ICSESS 2015; 2015-09-25; pp. 587-590
Enhanced PCA Reconstruction Method for Eyeglass Frame Auto-Removal; Guo Pei et al.; Proceedings of IC-NIDC 2014; 2014-09-21; pp. 359-363
Image processing based forest fire detection using YCbCr colour model; C. Emmy Prema et al.; ICCPCT-2014; 2014-03-21; pp. 1229-1237

Also Published As

Publication number Publication date
CN106503644A (en) 2017-03-15

Similar Documents

Publication Publication Date Title
CN106503644B (en) Glasses attribute detection method based on edge projection and color characteristic
CN100354875C (en) Red eye moving method based on human face detection
CN105654436A (en) Backlight image enhancement and denoising method based on foreground-background separation
CN105139404A (en) Identification camera capable of detecting photographing quality and photographing quality detecting method
CN104299011A (en) Skin type and skin problem identification and detection method based on facial image identification
Shahin et al. A novel white blood cells segmentation algorithm based on adaptive neutrosophic similarity score
CN106570447B (en) Based on the matched human face photo sunglasses automatic removal method of grey level histogram
CN101162503A (en) Method for extracting and recognizing human ear characteristic by improved Hausdorff distance
Wesolkowski Color image edge detection and segmentation: A comparison of the vector angle and the euclidean distance color similarity measures
CN105205437B (en) Side face detection method and device based on contouring head verifying
Hammal et al. Parametric models for facial features segmentation
CN113128376B (en) Wrinkle identification method and device based on image processing and terminal equipment
JP2022099130A (en) Determination method, determination apparatus, and determination program
CN116091421A (en) Method for automatically dividing and calculating area of blastomere image of in-vitro fertilized embryo
CN109543518A (en) A kind of human face precise recognition method based on integral projection
CN110021029A (en) A kind of real-time dynamic registration method and storage medium suitable for RGBD-SLAM
CN107992835A (en) A kind of glasses image-recognizing method
KR101343623B1 (en) adaptive color detection method, face detection method and apparatus
Chen et al. Fully automated facial symmetry axis detection in frontal color images
Parikh et al. Effective approach for iris localization in nonideal imaging conditions
CN117197064A (en) Automatic non-contact eye red degree analysis method
CN111832464A (en) Living body detection method and device based on near-infrared camera
Sae-Tang et al. Exudates detection in fundus image using non-uniform illumination background subtraction
CN106503611B (en) Facial image eyeglass detection method based on marginal information projective iteration mirror holder crossbeam
Choukikar et al. Segmenting the Optic Disc in retinal images using bi-histogram equalization and thresholding the connected regions

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190528

Termination date: 20211019