CN106503644A - Glasses attribute detection method based on edge projection and color characteristic - Google Patents

Glasses attribute detection method based on edge projection and color characteristic

Info

Publication number
CN106503644A
Authority
CN
China
Prior art keywords
glasses
frame
eyeglass
formula
face
Prior art date
Legal status
Granted
Application number
CN201610910469.8A
Other languages
Chinese (zh)
Other versions
CN106503644B (en)
Inventor
赵明华
张鑫
张飞飞
陈棠
董博源
殷谭生
石争浩
Current Assignee
Xian University of Technology
Original Assignee
Xian University of Technology
Priority date
Filing date
Publication date
Application filed by Xian University of Technology
Priority to CN201610910469.8A
Publication of CN106503644A
Application granted
Publication of CN106503644B
Expired - Fee Related
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification


Abstract

The invention discloses a glasses attribute detection method based on edge projection and color features. A face image to be detected is input, pre-processed, and subjected to edge detection to obtain an edge information image of the face, and the mouth is located so that the face region can be partitioned. Horizontal and vertical projections of the upper-half-face edge image then determine whether the glasses have a frame. For framed glasses, the left and right lens regions are further delimited with the facial midline as the boundary; for rimless glasses, the left and right lens regions are estimated from their position relative to the face. For framed glasses, the frame width is computed from the edge information of the lens regions, and the lens color is judged from a color depth coefficient and a skin-color likelihood ratio. Finally, a luminance coefficient of the left and right lenses is computed to detect lens glare. The detection requires little feature information and is fast and accurate.

Description

Glasses attribute detection method based on edge projection and color characteristic
Technical field
The invention belongs to the fields of face detection, image processing and computer vision, and in particular relates to a glasses attribute detection method based on edge projection and color features.
Background technology
In recent years, biometric identification technology has been widely applied in many fields, and face recognition, one of its important topics, has made great progress. However, studies have shown that glasses, the most common facial accessory, seriously affect the accuracy of face detection and recognition. Glasses attributes such as the frame, lens color and lens glare have an important impact on face recognition, entry-exit document inspection, certificate photos and online photo upload. Detecting the attributes of the glasses in a face image therefore has significant value in practice.
At present, few methods address the detection of glasses in face images; most of them only judge from extracted feature information whether glasses are worn, locate the spectacle frame with mathematical methods, and remove framed glasses from face images with approaches based on principal component analysis. A common way to locate the spectacle frame is binarization combined with mathematical morphology. This only suits thick-framed glasses, and the repeated opening and closing operations on the binary image require different thresholds for different situations. Another method for detecting and locating dark-framed glasses uses the maximum of the horizontal binary projection of the original image as a reference point on the hair and locates the glasses from it. This requires the face in the image to have dark hair, so its applicability is limited, and it too can only detect and locate thick-framed glasses. Besides being limited to thick frames, both methods ignore the localization of rimless glasses. In addition, current methods rarely address the detection of the individual attributes of the glasses worn in a face image.
Content of the invention
The object of the invention is to provide a glasses attribute detection method based on edge projection and color features that detects whether the glasses worn in a face image have a frame, the frame width, the lens color, and whether the lenses contain glare regions, thereby addressing the limited scope of existing detection methods.
The technical solution adopted by the invention is a glasses attribute detection method based on edge projection and color features, the attributes including whether the glasses have a frame, comprising the following steps:
Step 1: extract the face edge information image f'(x, y) and determine the mouth region; the x-direction center of the located mouth is taken as the facial midline position, and the highest point of the mouth in the y direction as the dividing line of the upper half of the face.
Step 2: extract the upper-half-face edge information image f'_up(x, y); compute its horizontal and vertical projections according to formulas (5) and (7), and determine the vertical and horizontal position ranges of the glasses according to formulas (6) and (8) respectively, obtaining the glasses region;
Index_y(k) = find(R_j > μ × max(R))  (6)
Index_x(k) = find(C_i > max(C) - μ1)  (8)
The aspect ratio of the glasses region is then computed; if it is greater than 8.5 the glasses are judged rimless, otherwise they are judged to be framed glasses.
Further characteristics of the invention are as follows:
The detected attributes may also include the frame width of framed glasses. The method is: with the facial midline as the dividing line between the left and right lenses, apply the vertical projection of formula (7) to the glasses position information obtained in Step 2 and compute the left and right boundary positions of the two lenses with formula (8); extract the lower half of the left and right lenses from the edge information image f'(x, y) or f'_up(x, y), and compute the frame width in the middle zone of each lens in the horizontal direction.
The detected attributes may also include the color of the lenses of framed glasses. On the basis of the obtained frame position and frame width, the method further comprises: take the area below the lower boundary of the spectacle frame as the region outside the frame, and the area above the lower boundary by more than the frame width as the region inside the frame; collect a region of size m × n from each, and compute the color depth coefficient DL of the regions inside and outside the frame with formula (12).
Let a denote the depth coefficient of the collected region outside the frame and b the depth coefficient of the collected region inside the frame. The lens color is judged from the values of a and b:
1) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| > 10, the lenses are judged to be tinted;
2) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| ≤ 10, the lenses are judged to be colorless;
3) if a and b are both greater than or equal to 150, or both less than or equal to 150, the skin-color likelihood ratio τ is computed; if 0.001 < τ < 10 the lenses are judged colorless, otherwise tinted.
Further, for tinted lenses, the shade of the lens color is judged from the value of b: if b ≥ 80 the lenses are light-colored; if b < 80 they are dark.
The detected attributes may also include whether the lenses are reflective. On the basis of the lens regions of the framed or rimless glasses obtained in Step 2, the method further comprises: compute the luminance coefficient Bright of every pixel of the left and right lenses in the HSV color space, and count the pixels falling into each luminance-coefficient interval with formula (24).
If the pixels are distributed evenly across the intervals and the high-luminance intervals hold a large share, the illumination is considered even and the lens is judged non-reflective; otherwise the lens is judged reflective.
The criterion for "the pixels are distributed evenly across the intervals and the high-luminance intervals hold a large share" is: the number of pixels in the highest luminance-coefficient interval exceeds 10% of that of the interval containing the most pixels; or the number in the second-highest interval exceeds 50% of it; or the number in the third-highest interval exceeds 70% of it. Meeting any one of the three conditions is sufficient.
For determining the lens region of rimless glasses, the following method is preferred: estimate the vertical position range of the eyes with the horizontal projection of Step 2 and determine the y-axis position of the eyes; then, centered on the eye position, estimate the extent of the rimless glasses from the face length and width.
The beneficial effect of the invention is that it detects whether the glasses worn in a face image have a frame, the frame width, whether the lenses are tinted, and whether they are reflective. This helps improve the performance of face detection and recognition, and can effectively judge whether the glasses worn in an online-uploaded certificate photo conform to the standard.
Description of the drawings
Fig. 1 is the flowchart of the glasses attribute detection method based on edge projection and color features of the present invention;
Fig. 2 is a set of images to be detected that is input in the embodiment;
Fig. 3 shows the pre-processing result of a representative image chosen from the image set of Fig. 2;
Fig. 4 shows the edge detection result corresponding to the image in Fig. 3;
Fig. 5 shows the mouth region localization, where the box marks the mouth region, the vertical line marks the facial midline (x = the x-direction center of the mouth), and the horizontal line marks the dividing line of the upper half of the face (y = the highest point of the mouth in the y direction);
Fig. 6a shows the region of framed glasses determined by the horizontal and vertical projections;
Fig. 6b shows the region of rimless glasses framed by the horizontal and vertical projections;
Fig. 7a shows the determination of the left and right lens regions of framed glasses;
Fig. 7b shows the determination of the left and right lens regions of rimless glasses;
Fig. 8 is the edge map of the glasses region, where the boxes mark the regions used to compute the frame width;
Fig. 9a shows the frame measurement result for framed glasses, with the frame width displayed above the picture;
Fig. 9b shows the judgment result for rimless glasses;
Fig. 10 shows the lens color detection result; the boxes inside and outside the frame mark the selected inner and outer regions, which are shown enlarged on the right side of the figure;
Fig. 11a is the luminance map of a lens with glare;
Fig. 11b is the luminance map of a lens without glare;
Fig. 11c is the luminance coefficient statistics of Fig. 11a;
Fig. 11d is the luminance coefficient statistics of Fig. 11b;
Fig. 12 shows the lens glare detection result.
Specific embodiment
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
In the glasses attribute detection method based on edge projection and color features of the invention, the first part pre-processes the face image and performs edge detection, then determines the glasses region through horizontal and vertical projections of the edge image and judges from the aspect ratio of that region whether the glasses have a frame; the second part computes the frame width from the edge detection result, and then judges the lens color and glare by computing a color depth coefficient, a skin-color similarity and a luminance coefficient.
As shown in Fig. 1, the glasses attributes are detected according to the following steps.
Step 1: extract the face edge information image and determine the mouth region.
(1) Extract the face edge information image.
Input the face image f(x, y), as shown in Fig. 2. Because the collected face image contains interference such as noise, the face part must be processed to reduce errors. First, the color face image is converted into a gray-scale image; then Gaussian smoothing is applied to the gray-scale image, as shown in Fig. 3; finally, the edge information image f'(x, y) of the smoothed gray-scale image is extracted, as shown in Fig. 4.
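A minimal sketch of this pre-processing stage in Python, assuming OpenCV is available; the Gaussian kernel size and the Canny thresholds are illustrative choices, since the patent does not specify the smoothing or edge-operator parameters:

```python
import cv2

def extract_edge_image(path):
    """Pre-process a face image and extract its edge information image f'(x, y)."""
    f = cv2.imread(path)                          # input color face image f(x, y)
    gray = cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)    # convert to gray scale
    smooth = cv2.GaussianBlur(gray, (5, 5), 0)    # Gaussian smoothing to suppress noise
    edges = cv2.Canny(smooth, 50, 150)            # edge information image f'(x, y)
    return f, edges
```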
(2) Determine the mouth region.
The input color image f(x, y) is converted from the RGB color space to the YCbCr color space, and the similarity P of each face pixel to a mouth pixel is computed in YCbCr with a Gaussian model using formula (1); converting the result to a binary image yields candidate mouth regions, and the mouth region is then determined from shape, position and similar information, as shown in Fig. 5.
The upper-left corner of the image is taken as the origin, with the x axis pointing right and the y axis pointing down. The x-direction center of the located mouth is taken as the facial midline position x = Cen, shown as the vertical line in Fig. 5, and the highest point Up of the mouth in the y direction as the dividing line y = Up of the upper half of the face, shown as the horizontal line in Fig. 5.
P = exp((-0.5) × (x - M)' × inv(cov) × (x - M))  (1)
Formula (1) is computed after transforming the input color face image f(x, y) from the RGB color space to the YCbCr color space, where x = (Cb, Cr)^T is the chrominance vector of a pixel, and cov and M are the covariance matrix and mean vector of the chrominance vector. P is the similarity of the pixel to the mouth: the larger its value, the higher the probability that the pixel belongs to the mouth region, and vice versa. The mean and covariance were obtained by experimental statistics; the mean vector is:
M = (156.5599, 117.4361)^T  (2)
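A minimal sketch of the per-pixel similarity of formula (1), assuming OpenCV and NumPy; the covariance matrix below is only an illustrative placeholder, since the corresponding value (formula (3)) is not reproduced in this text:

```python
import numpy as np
import cv2

def mouth_similarity_map(bgr_img, M, cov):
    """Formula (1): per-pixel Gaussian similarity to the mouth color in YCbCr chrominance."""
    ycrcb = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2YCrCb)
    # OpenCV returns channels as Y, Cr, Cb; reorder to the (Cb, Cr) vector x used in the patent
    cb = ycrcb[:, :, 2].astype(np.float64)
    cr = ycrcb[:, :, 1].astype(np.float64)
    x = np.stack([cb, cr], axis=-1) - M            # (H, W, 2) centered chrominance
    inv_cov = np.linalg.inv(cov)
    # P = exp(-0.5 * (x - M)' * inv(cov) * (x - M)), evaluated for every pixel
    mahal = np.einsum('hwi,ij,hwj->hw', x, inv_cov, x)
    return np.exp(-0.5 * mahal)

# Mean vector from formula (2); the covariance is an illustrative placeholder value.
M = np.array([156.5599, 117.4361])
cov = np.array([[160.0, 12.0], [12.0, 299.0]])
```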
Step 2: compute the horizontal and vertical projections of the upper-half-face edge information image f'_up(x, y), determine the glasses region, and judge from its aspect ratio whether the glasses have a frame.
Step 2-1: determine the glasses region.
First, according to the edge information image obtained in Step 1 and the upper-half-face dividing line obtained from the mouth region, the upper-half edge information image f'_up(x, y) is extracted with formula (4):
f'_up(1:M, 1:N) = f'(1:Up, 1:N)  (4)
Formula (4) assigns the pixel information of rows 1 to Up and columns 1 to N of f' to f'_up.
Then, the horizontal projection of the upper-half-face edge detection result f'_up(x, y) is computed in the manner of formula (5) to obtain the row-wise pixel sum R_j of the upper-half edge map f'_up(x, y):
R_j = Σ_i f'_up(i, j)  (5)
In formula (5), f'_up(i, j) is the pixel value of the upper-half-face edge information image f'_up(x, y), i and j being its horizontal and vertical coordinates; pro_r denotes the horizontal projection operation, which adds the element values of every row of the matrix to obtain a column vector, and R_j is the value of row j of that column vector.
Next, the vertical position range [Glass_y1, Glass_y2] of the glasses is determined with formula (6):
Index_y(k) = find(R_j > μ × max(R))  (6)
In formula (6), j = 1, 2, 3, ..., m; k = 1, 2, 3, ..., last; R is the column vector obtained with formula (5) and max(R) is its maximum; μ is a threshold coefficient, taken as 0.55 in this method; find returns the subscripts j for which R_j satisfies the condition; Index_y(k) records the subscript of the k-th qualifying R_j, i.e. the subscript j of each qualifying R_j is assigned to Index_y(k); last is the number of R_j satisfying the condition. The subscript of the first qualifying R_j is Index_y(1), which is the upper frame position Glass_y1 of the glasses; the subscript of the last qualifying R_j is Index_y(last), which is the lower frame position Glass_y2.
Next, the vertical projection of the upper-half-face edge map f'_up(x, y) is computed in the manner of formula (7) to obtain the column-wise pixel sum C_i of f'_up(x, y):
C_i = Σ_j f'_up(i, j)  (7)
In formula (7), f'_up(i, j) is the pixel value of the upper-half-face edge information image f'_up(x, y), i and j being its horizontal and vertical coordinates; pro_c denotes the vertical projection operator, which adds the element values of every column of the matrix to obtain a row vector, and C_i is the value of column i of that row vector.
The horizontal position range [Glass_x1, Glass_x2] of the glasses is then determined with formula (8). The regions determined for the different kinds of glasses by the horizontal and vertical projections are shown in Figs. 6a and 6b.
Index_x(k) = find(C_i > max(C) - μ1)  (8)
In formula (8), i = 1, 2, 3, ..., n; k = 1, 2, 3, ..., last; C is the row vector obtained with formula (7) and max(C) is its maximum; μ1 is a threshold coefficient, taken as 10 in this method; find returns the subscripts i for which C_i satisfies the condition; Index_x(k) records the subscript of the k-th qualifying C_i, i.e. the subscript i of each qualifying C_i is assigned to Index_x(k); last is the number of C_i satisfying the condition. The subscript of the first qualifying C_i is Index_x(1), which is the left frame position Glass_x1 of the glasses; the subscript of the last qualifying C_i is Index_x(last), which is the right frame position Glass_x2.
The eye position is thus known from the horizontal and vertical position ranges of the glasses.
Step 2-2: judge whether the glasses have a frame.
After the frame positions are computed, the height and width of the glasses region are computed with formulas (9) and (10), and its aspect ratio with formula (11):
Frame_y = Glass_y2 - Glass_y1  (9)
Frame_x = Glass_x2 - Glass_x1  (10)
Ratio = Frame_x / Frame_y  (11)
Whether the glasses have a frame is judged from this ratio: if the Ratio computed with formula (11) is greater than 8.5, the glasses are judged rimless, the judgment result being as in Fig. 9b, and the frame width is set to 0; otherwise the glasses are judged to be framed.
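A minimal sketch of Step 2, treating the upper-half edge image as a NumPy array whose non-zero entries are edge pixels; the helper name locate_glasses and the use of np.where in place of the find/Index bookkeeping are illustrative:

```python
import numpy as np

def locate_glasses(edges_up, mu=0.55, mu1=10):
    """Locate the glasses region in the upper-half edge image f'_up and judge framed vs rimless."""
    e = (edges_up > 0).astype(int)           # binary edge map, so the projections count edge pixels
    R = e.sum(axis=1)                        # formula (5): horizontal projection, one value per row
    C = e.sum(axis=0)                        # formula (7): vertical projection, one value per column
    rows = np.where(R > mu * R.max())[0]     # formula (6): rows whose edge count exceeds mu * max(R)
    cols = np.where(C > C.max() - mu1)[0]    # formula (8): columns whose edge count exceeds max(C) - mu1
    y1, y2 = rows[0], rows[-1]               # Glass_y1 (upper frame), Glass_y2 (lower frame)
    x1, x2 = cols[0], cols[-1]               # Glass_x1 (left frame),  Glass_x2 (right frame)
    ratio = (x2 - x1) / max(y2 - y1, 1)      # formulas (9)-(11): width / height of the region
    return y1, y2, x1, x2, ratio > 8.5       # formula (11) test: Ratio > 8.5 -> rimless
```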
Step 3: compute the frame width.
1. For framed glasses, the glasses region determined in Step 2 is split into left and right lenses at the facial midline x = Cen. The vertical projection of formula (7) is applied separately to the edge information of the glasses region marked in Step 2, and formula (8) is used to refine the left and right boundary positions of the left and right lenses Glass_l and Glass_r, namely Glass_ll, Glass_lr, Glass_rl and Glass_rr, as shown in Fig. 7a. The lower halves of the left and right lenses Glass_l and Glass_r are extracted from the edge information image f'(x, y) or the upper-half edge information image f'_up(x, y), and the frame width is computed in the middle zone of each lens in the horizontal direction, as shown in Fig. 8. The frame-width computation result for framed glasses is shown in Fig. 9a.
2. The frame width of rimless glasses defaults to 0.
Step 4: judge the lens color.
1. For framed glasses, the frame position and frame width obtained in Step 3 are used to extract information at two places, inside and outside the spectacle frame. Specifically, the area below the lower boundary of the frame is taken as the region outside the frame, and the area above the lower boundary by more than the frame width as the region inside the frame; a region of size m × n is collected from each. The collected regions are outlined in the face image on the left of Fig. 10, and their enlargement is shown on the right of Fig. 10. In this experiment m = 6 and n = 10. The color depth coefficient DL of the regions inside and outside the frame is computed with formula (12):
DL = sum(R × 0.299 + G × 0.587 + B × 0.114) / (m × n)  (12)
In formula (12), R, G and B are the three primary color values of each pixel of the image.
Let a denote the depth coefficient of the collected region outside the frame and b the depth coefficient of the collected region inside the frame.
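A minimal sketch of formula (12), assuming the two sampled m × n patches are given as RGB NumPy arrays; how the patches are cut out of the face image is not shown here:

```python
import numpy as np

def depth_coefficient(patch_rgb):
    """Formula (12): mean luma of an m x n RGB patch, DL = sum(0.299R + 0.587G + 0.114B) / (m*n)."""
    r = patch_rgb[:, :, 0].astype(np.float64)
    g = patch_rgb[:, :, 1].astype(np.float64)
    b = patch_rgb[:, :, 2].astype(np.float64)
    m, n = patch_rgb.shape[:2]
    return (0.299 * r + 0.587 * g + 0.114 * b).sum() / (m * n)

# a: patch sampled just below the lower frame boundary (outside the frame)
# b: patch sampled above the lower frame boundary by more than the frame width (inside the frame)
```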
1) If one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| > 10, the lenses are judged to be tinted. The shade is then judged from the value of b: if b ≥ 80 the lenses are light-colored; if b < 80 they are dark.
2) If one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| ≤ 10, the lenses are judged to be colorless.
3) If a and b are both less than or equal to 150, the judgment is made further with the skin-color likelihood ratio: τ is computed with formula (13); if 0.001 < τ < 10 the lenses are judged colorless, otherwise tinted. The shade of tinted lenses is then judged from b: if b ≥ 80 the lenses are light-colored; if b < 80 they are dark.
4) If a and b are both greater than or equal to 150, the judgment is likewise made with the skin-color likelihood ratio: τ is computed with formula (13); if 0.001 < τ < 10 the lenses are judged colorless, otherwise tinted. Since b ≥ 150 ≥ 80 in this case, tinted lenses are necessarily light-colored.
In formula (13), P_s is the skin-color probability of the collected region outside the spectacle frame, computed with formula (1), and P_g is the skin-color probability of the collected region inside the frame, likewise computed with formula (1). The mean vector used in formula (1) here is:
M_skin = (117.4316, 148.5599)^T  (14)
With a the depth coefficient of the region outside the frame, b the depth coefficient of the region inside the frame and τ the skin-color similarity ratio, whether the lenses are tinted is judged by the above procedure; the judgment result is shown in Fig. 10, and the decision logic is sketched in code at the end of this step.
2. For rimless glasses, the lenses are colorless by default.
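A minimal sketch of the tint decision of cases 1) to 4), assuming formula (13) has the form τ = P_s / P_g (the exact expression is not reproduced in this text); the function and variable names are illustrative:

```python
def judge_lens_color(a, b, p_skin_outside, p_skin_inside):
    """Decide the tint ('colorless', 'light', 'dark') from the depth coefficients a (outside
    the frame) and b (inside the frame) and the skin-color probabilities of the two patches."""
    def shade():                          # shade of a tinted lens from b: light if b >= 80
        return 'light' if b >= 80 else 'dark'

    one_high = (a >= 150) != (b >= 150)   # exactly one of a, b reaches 150
    if one_high and abs(a - b) > 10:      # case 1): necessarily tinted
        return shade()
    if one_high and abs(a - b) <= 10:     # case 2): necessarily colorless
        return 'colorless'
    # cases 3) and 4): both high or both low -> use the skin-color likelihood ratio
    tau = p_skin_outside / p_skin_inside  # assumed form of formula (13)
    if 0.001 < tau < 10:
        return 'colorless'
    return shade()
```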
Step 5: judge whether the lenses are reflective.
From the left and right lens regions of the framed or rimless glasses obtained in the preceding steps, the position of maximum brightness is searched for in each lens, as follows.
Step 5-1: obtain the lens regions.
1. For framed glasses, the left and right lens regions are obtained from Step 3.
2. For rimless glasses, the method of Step 2 cannot estimate the boundary between the glasses and the face with complete accuracy. The vertical position range [Glass_y1, Glass_y2] of the eyes is therefore estimated with the horizontal projection of Step 2, and the y-axis position of the eyes is determined with formula (16). Then, centered on the eye position and based on the face length and width, the probable range of the rimless glasses is estimated according to formulas (17)-(22), as shown in Fig. 7b:
X_ll = θ × W  (17)
X_lr = Cen - (δ × W)  (18)
X_rl = Cen + (δ × W)  (19)
X_rr = (1 - θ) × W  (20)
In formulas (17)-(22), x = Cen is the midline position, Eye_y is the position of the eyes in the y direction, W is the face width, H is the face length, X_ll and X_lr are the left and right boundaries of the left lens of the rimless glasses, X_rl and X_rr are the left and right boundaries of the right lens, Y_1 and Y_2 are the upper and lower boundaries of the rimless glasses, and θ, δ and the remaining coefficient of formulas (21)-(22) are position deviation ratios.
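A minimal sketch of the rimless-lens bounds with formulas (17)-(20) taken literally; the coefficient values, the name phi, and the vertical bounds (formulas (16), (21) and (22) are not reproduced in this text) are illustrative assumptions:

```python
def rimless_lens_bounds(cen, w, eye_y, h, theta=0.15, delta=0.05, phi=0.08):
    """Estimate the lens bounds of rimless glasses. theta, delta and phi are assumed
    deviation ratios; Y1 and Y2 follow the spirit of the missing formulas (21)-(22) as
    offsets around the eye row proportional to the face length h."""
    x_ll = theta * w               # formula (17): left boundary of the left lens
    x_lr = cen - delta * w         # formula (18): right boundary of the left lens
    x_rl = cen + delta * w         # formula (19): left boundary of the right lens
    x_rr = (1 - theta) * w         # formula (20): right boundary of the right lens
    y1 = eye_y - phi * h           # assumed form of the upper boundary Y1
    y2 = eye_y + phi * h           # assumed form of the lower boundary Y2
    return x_ll, x_lr, x_rl, x_rr, y1, y2
```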
Step 5-2: search for the position of maximum brightness in the lens region.
First, the extracted left and right lenses are converted from the RGB color space to the HSV color space, and the luminance coefficient of each pixel is computed with formula (23):
Bright = (1 / S) + (V × 100)  (23)
In formula (23), S and V are the S and V values of each pixel of the target region after conversion from RGB to HSV, and Bright is the computed luminance coefficient of that pixel. Figs. 11a and 11b show, respectively, the luminance coefficient values of a reflective lens region and of a non-reflective lens region, divided by 10 for display.
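A minimal sketch of formula (23), assuming OpenCV's HSV conversion; scaling S and V to the 0-1 range and the small epsilon that avoids division by zero are assumptions not discussed in the patent:

```python
import cv2
import numpy as np

def luminance_coefficients(lens_bgr):
    """Formula (23): Bright = 1/S + 100*V for every pixel of a lens region."""
    hsv = cv2.cvtColor(lens_bgr, cv2.COLOR_BGR2HSV).astype(np.float64)
    s = hsv[:, :, 1] / 255.0          # scale OpenCV's 0..255 S channel to 0..1
    v = hsv[:, :, 2] / 255.0          # scale OpenCV's 0..255 V channel to 0..1
    return 1.0 / (s + 1e-6) + 100.0 * v
```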
Then the number of pixels in each luminance-coefficient interval is counted with formula (24), and a statistics chart is produced, as shown in Figs. 11c and 11d:
Sum(floor(Bright / 100)) = Sum(floor(Bright / 100)) + 1  (24)
In formula (24), floor(Bright / 100) is the index of the luminance-coefficient interval, Sum(floor(Bright / 100)) is the number of pixels in that interval, and its value defaults to 0 before counting.
If the statistics chart is evenly distributed and the high-luminance intervals hold a large share (i.e. each of the last three luminance-coefficient intervals is compared with the interval containing the most pixels, and the judgment is based on the pixel-count ratios: the number of pixels in the highest interval exceeds 10% of that of the most-populated interval, or the number in the second-highest interval exceeds 50% of it, or the number in the third-highest interval exceeds 70% of it; meeting any one of the three conditions suffices), as in Fig. 11d, the light is proven to be evenly distributed and the lens is judged to have no reflective region.
If the statistics chart is unevenly distributed, the lens is judged to contain a reflective region. The lens region is then scanned with a 5 × 5 template to locate the position of maximum brightness within the lens; a threshold is set according to the brightness intervals of the histogram, the image is binarized with this threshold, and the region of the binary image containing the maximum luminance-coefficient point is selected as the reflective region of the lens, as shown in Fig. 12.
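A minimal sketch of the interval statistics of formula (24) and the three-part uniformity test, taking the Bright map from the previous sketch as input:

```python
import numpy as np

def is_reflective(bright):
    """Bin the luminance coefficients (formula (24)) and apply the uniformity test:
    the lens is judged non-reflective if any of the three highest bins is sufficiently
    populated relative to the most-populated bin; otherwise it is judged reflective."""
    bins = np.floor(bright / 100).astype(int).ravel()   # interval index per pixel
    counts = np.bincount(bins)                           # Sum(.) of formula (24) per interval
    if counts.size < 3:
        return False                                     # too few intervals to apply the test
    most = counts.max()
    uniform = (counts[-1] > 0.10 * most or               # highest interval vs 10% of the largest bin
               counts[-2] > 0.50 * most or               # second-highest interval vs 50%
               counts[-3] > 0.70 * most)                 # third-highest interval vs 70%
    return not uniform                                   # uneven distribution -> reflective
```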
The method detects whether the glasses worn in a face image have a frame, the frame width, whether the lenses are tinted, and whether they are reflective. It helps improve the performance of face detection and recognition, can effectively judge whether the glasses worn in an online-uploaded certificate photo conform to the standard, and is applicable to face recognition, entry-exit document inspection, online checking of uploaded photos, and so on.

Claims (7)

1. A glasses attribute detection method based on edge projection and color features, characterized in that the attributes include whether the glasses have a frame, and the method comprises the following steps:
Step 1: extract the face edge information image f'(x, y) and determine the mouth region; take the x-direction center of the mouth position as the facial midline position and the highest point in the y direction as the dividing line of the upper half of the face;
Step 2: extract the upper-half-face edge information image f'_up(x, y); compute its horizontal and vertical projections according to formulas (5) and (7), and determine the vertical and horizontal position ranges of the glasses according to formulas (6) and (8) respectively, obtaining the glasses region;
Index_y(k) = find(R_j > μ × max(R))  (6)
Index_x(k) = find(C_i > max(C) - μ1)  (8)
then compute the aspect ratio of the glasses region; if the ratio is greater than 8.5, the glasses are judged rimless; otherwise they are judged to be framed glasses.
2. The glasses attribute detection method based on edge projection and color features according to claim 1, characterized in that the attributes further include the frame width of framed glasses, the method being: with the facial midline as the dividing line between the left and right lenses, apply the vertical projection of formula (7) to the glasses position information obtained in Step 2 and compute the left and right boundary positions of the two lenses with formula (8); extract the lower half of the left and right lenses from f'(x, y) or f'_up(x, y), and compute the frame width in the middle zone of each lens in the horizontal direction.
3. The glasses attribute detection method based on edge projection and color features according to claim 2, characterized in that the attributes further include the color of the lenses of framed glasses, and that, on the basis of the frame position and frame width obtained in claim 2, the method further comprises: take the area below the lower boundary of the spectacle frame as the region outside the frame and the area above the lower boundary by more than the frame width as the region inside the frame, collect a region of size m × n from each, and compute the color depth coefficient DL of the regions inside and outside the frame with formula (12),
DL = sum(R × 0.299 + G × 0.587 + B × 0.114) / (m × n)  (12)
with a denoting the depth coefficient of the collected region outside the frame and b the depth coefficient of the collected region inside the frame; the lens color is judged from the values of a and b:
1) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| > 10, the lenses are judged to be tinted;
2) if one of a and b is greater than or equal to 150, the other is less than 150, and |a - b| ≤ 10, the lenses are judged to be colorless;
3) if a and b are both greater than or equal to 150, or both less than or equal to 150, the skin-color likelihood ratio τ is computed; if 0.001 < τ < 10 the lenses are judged colorless, otherwise tinted.
4. The glasses attribute detection method based on edge projection and color features according to claim 3, characterized in that, for tinted lenses, the shade of the lens color is judged from the value of b: if b ≥ 80 the lenses are light-colored; if b < 80 they are dark.
5. The glasses attribute detection method based on edge projection and color features according to claim 2, characterized in that the attributes further include whether the lenses are reflective, and that, on the basis of the left and right lens regions of the framed glasses obtained in Step 3 and the rimless-glasses region obtained in Step 2, the method further comprises: compute the luminance coefficient Bright of every pixel of the left and right lenses in the HSV color space, and count the pixels in each luminance-coefficient interval with formula (24),
Sum(floor(Bright / 100)) = Sum(floor(Bright / 100)) + 1  (24)
if the pixels are distributed evenly across the intervals and the high-luminance intervals hold a large share, the light is proven to be evenly distributed and the lens is judged non-reflective; otherwise the lens is judged reflective.
6. The glasses attribute detection method based on edge projection and color features according to claim 5, characterized in that the lens region of rimless glasses is accurately determined as follows: estimate the vertical position range of the eyes with the horizontal projection of Step 2 and determine the y-axis position of the eyes; then, centered on the eye position, estimate the extent of the rimless glasses from the face length and width.
7. The glasses attribute detection method based on edge projection and color features according to claim 5 or 6, characterized in that the criterion for "the pixels are distributed evenly across the intervals and the high-luminance intervals hold a large share" is: the number of pixels in the highest luminance-coefficient interval exceeds 10% of that of the interval containing the most pixels; the number in the second-highest interval exceeds 50% of it; or the number in the third-highest interval exceeds 70% of it; meeting any one of the three conditions means that the pixels are distributed evenly across the intervals and the high-luminance intervals hold a large share.
CN201610910469.8A 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic Expired - Fee Related CN106503644B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610910469.8A CN106503644B (en) 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610910469.8A CN106503644B (en) 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic

Publications (2)

Publication Number Publication Date
CN106503644A true CN106503644A (en) 2017-03-15
CN106503644B CN106503644B (en) 2019-05-28

Family

ID=58294360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610910469.8A Expired - Fee Related CN106503644B (en) 2016-10-19 2016-10-19 Glasses attribute detection method based on edge projection and color characteristic

Country Status (1)

Country Link
CN (1) CN106503644B (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1606029A (en) * 2004-11-04 2005-04-13 上海交通大学 Infrared human face spectacle disturbance elimination method based on regional characteristic element compensation
CN101162502A (en) * 2006-10-13 2008-04-16 上海银晨智能识别科技有限公司 Method for removing glasses during human recognition
CN102163289A (en) * 2011-04-06 2011-08-24 北京中星微电子有限公司 Method and device for removing glasses from human face image, and method and device for wearing glasses in human face image
CN103020579A (en) * 2011-09-22 2013-04-03 上海银晨智能识别科技有限公司 Face recognition method and system, and removing method and device for glasses frame in face image
CN103632136A (en) * 2013-11-11 2014-03-12 北京天诚盛业科技有限公司 Method and device for locating human eyes
CN105095841A (en) * 2014-05-22 2015-11-25 小米科技有限责任公司 Method and device for generating eyeglasses
CN104050448A (en) * 2014-06-11 2014-09-17 青岛海信信芯科技有限公司 Human eye positioning method and device and human eye region positioning method and device
CN105046250A (en) * 2015-09-06 2015-11-11 广州广电运通金融电子股份有限公司 Glasses elimination method for face recognition
CN105787427A (en) * 2016-01-08 2016-07-20 上海交通大学 Lip area positioning method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
C. EMMY PREMA et al.: "Image processing based forest fire detection using YCbCr colour model", ICCPCT-2014 *
GUO PEI et al.: "Enhanced PCA reconstruction method for eyeglass frame auto-removal", Proceedings of IC-NIDC 2014 *
XIAOLIN ZHANG et al.: "A Fast Eye State Computing Algorithm", ICSESS 2015 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107844742A (en) * 2017-09-26 2018-03-27 平安科技(深圳)有限公司 Facial image glasses minimizing technology, device and storage medium
CN107844742B (en) * 2017-09-26 2019-01-04 平安科技(深圳)有限公司 Facial image glasses minimizing technology, device and storage medium
CN107945126A (en) * 2017-11-20 2018-04-20 杭州登虹科技有限公司 Spectacle-frame removing method, device and medium in a kind of image
CN107992835A (en) * 2017-12-11 2018-05-04 浙江大学 A kind of glasses image-recognizing method
CN108564540A (en) * 2018-03-05 2018-09-21 广东欧珀移动通信有限公司 Remove image processing method, device and the terminal device that eyeglass is reflective in image
CN108564540B (en) * 2018-03-05 2020-07-17 Oppo广东移动通信有限公司 Image processing method and device for removing lens reflection in image and terminal equipment
CN108596067A (en) * 2018-04-15 2018-09-28 中少科普(北京)教育科技有限公司 A kind of Young Pioneer's salute detection bearing calibration
CN111488843A (en) * 2020-04-16 2020-08-04 贵州安防工程技术研究中心有限公司 Face sunglasses distinguishing method based on step-by-step inhibition of missing report and false report rate
CN116473501A (en) * 2023-04-28 2023-07-25 北京云柿信息技术有限公司 Automatic recording method, device and system for inserting-sheet type subjective refraction result
CN116473501B (en) * 2023-04-28 2023-12-05 北京云柿信息技术有限公司 Automatic recording method, device and system for inserting-sheet type subjective refraction result
CN116343313A (en) * 2023-05-30 2023-06-27 乐山师范学院 Face recognition method based on eye features
CN116343313B (en) * 2023-05-30 2023-08-11 乐山师范学院 Face recognition method based on eye features

Also Published As

Publication number Publication date
CN106503644B (en) 2019-05-28

Similar Documents

Publication Publication Date Title
CN106503644B (en) Glasses attribute detection method based on edge projection and color characteristic
CN103761519B (en) Non-contact sight-line tracking method based on self-adaptive calibration
CN107292251B (en) Driver fatigue detection method and system based on human eye state
CN110097034A (en) A kind of identification and appraisal procedure of Intelligent human-face health degree
CN104299011A (en) Skin type and skin problem identification and detection method based on facial image identification
CN107578035A (en) Human body contour outline extracting method based on super-pixel polychrome color space
US20040146187A1 (en) Iris extraction method
CN105956578A (en) Face verification method based on identity document information
CN105913093A (en) Template matching method for character recognizing and processing
CN105139404A (en) Identification camera capable of detecting photographing quality and photographing quality detecting method
CN105654436A (en) Backlight image enhancement and denoising method based on foreground-background separation
CN103914699A (en) Automatic lip gloss image enhancement method based on color space
CN106570447B (en) Based on the matched human face photo sunglasses automatic removal method of grey level histogram
CN101359365A (en) Iris positioning method based on Maximum between-Cluster Variance and gray scale information
CN106204594A (en) A kind of direction detection method of dispersivity moving object based on video image
CN109409298A (en) A kind of Eye-controlling focus method based on video processing
CN103927509A (en) Eye locating method and device
CN106548139A (en) A kind of pedestrian recognition methodss again
CN104794693A (en) Human image optimization method capable of automatically detecting mask in human face key areas
CN101996317B (en) Method and device for identifying markers in human body
CN107992835A (en) A kind of glasses image-recognizing method
CN106557745A (en) Human eyeball's detection method and system based on maximum between-cluster variance and gamma transformation
KR20160036375A (en) Fast Eye Detection Method Using Block Contrast and Symmetry in Mobile Device
Parikh et al. Effective approach for iris localization in nonideal imaging conditions
CN104573743B (en) A kind of facial image detection filter method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190528

Termination date: 20211019