CN106558046A - Quality detection method and detection device for a certificate photo - Google Patents

Quality detection method and detection device for a certificate photo

Info

Publication number
CN106558046A
Authority
CN
China
Prior art keywords
face
region
background area
represent
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610927700.4A
Other languages
Chinese (zh)
Other versions
CN106558046B (en)
Inventor
韩智素
王珏
刘新科
谌波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen 666 Network Service Co.,Ltd.
Original Assignee
Shenzhen Fluttering Baby Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Fluttering Baby Co Ltd
Priority to CN201610927700.4A
Publication of CN106558046A
Application granted
Publication of CN106558046B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30168 - Image quality inspection
    • G06T 2207/30196 - Human being; Person
    • G06T 2207/30201 - Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a quality detection method for a certificate photo. Based on the coordinates of the face region in an acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected are located respectively. Based on the clothing region, the background region, the face region and the acquired eye regions, a face average brightness value, a left-right face brightness difference value and a face tilt angle value, a color consistency parameter of the background region, a smoothness parameter of the background region and a color fit parameter between the clothing region and the background region are calculated respectively. An evaluation result of the certificate photo image to be detected is determined based on the corresponding parameters, and the evaluation result is output and displayed. The method of the present invention is a complete quality detection method for certificate photos: through the evaluation result, the user can judge by himself whether the quality of the certificate photo picture has reached the standard, so that the quality of the certificate photo picture can be further improved according to the evaluation result.

Description

Quality detection method and detection device for a certificate photo
Technical field
The invention belongs to the technical field of certificate photo processing, and more particularly relates to a quality detection method and a detection device for a certificate photo.
Background technology
With the popularization of smart phones and the improvement of mobile phone camera quality, more and more users no longer wish to go to a photo studio to shoot a certificate photo, but prefer to shoot it with a mobile phone at home or in the office and then improve the photo quality by later retouching, for example replacing the photo background with a reference color. Whether a successful certificate photo that meets the standard can be made often depends on whether the original image is suitable for later image processing, because later image processing algorithms often have limitations and cannot automatically handle more complicated situations. If the quality of the originally shot image is too poor, it cannot be repaired into a qualified certificate photo even after later processing.
However, after the user has taken the original image, he cannot judge by himself whether the quality of the original image has reached the standard of a certificate photo or the standard required for later image processing, so that it is often extremely difficult for the user to make a successful certificate photo that meets the standard.
The content of the invention
The present invention provides a quality detection method and a detection device for a certificate photo, aiming to solve the problem that it cannot be judged whether the quality of a certificate photo picture reaches the standard.
To solve the above technical problem, the invention provides a quality detection method for a certificate photo, the detection method comprising:
locating, based on the coordinates of the face region in an acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected, respectively;
calculating a face average brightness value, a left-right face brightness difference value and a face tilt angle value, respectively, based on the coordinates of the face region and the coordinates of the acquired eye regions;
establishing a Gaussian distribution function based on the RGB color values of the pixels in the background region, determining a color consistency parameter of the background region according to the Gaussian distribution function, performing edge detection on the background region, and calculating a smoothness parameter of the background region based on the edge detection result;
calculating a color fit parameter between the clothing region and the background region based on the Gaussian distribution function and the pixel colors of the clothing region;
determining an evaluation result of the certificate photo image to be detected based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region, and outputting and displaying the evaluation result.
Further, calculating the face average brightness value, the left-right face brightness difference value and the face tilt angle value based on the coordinates of the face region and the coordinates of the acquired eye regions specifically comprises:
converting the certificate photo image to be detected into a grayscale image, and obtaining the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image;
(1) calculating the face average brightness value, based on the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image and the coordinates of the face region, according to the following formula:
I_face = (1 / (w_f * h_f)) * Σ_{p ∈ R_face} I_p
where I_face denotes the face average brightness value, w_f denotes the length of the face region, h_f denotes the width of the face region, R_face denotes the coordinates of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
(2) dividing the face region equally into two sub-regions of the same size, the two sub-regions being a left face region and a right face region, respectively;
calculating, based on the coordinates of the face region, the average brightness value of the left face region according to the following formula:
I_face^l = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^l} I_p
where I_face^l denotes the average brightness value of the left face region, R_face^l denotes the coordinates of the left face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
calculating, based on the coordinates of the face region, the average brightness value of the right face region according to the following formula:
I_face^r = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^r} I_p
where I_face^r denotes the average brightness value of the right face region, R_face^r denotes the coordinates of the right face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
calculating the left-right face brightness difference value, based on the average brightness value of the left face region and the average brightness value of the right face region, according to the following formula:
DI_face = |I_face^l - I_face^r|
where DI_face denotes the left-right face brightness difference value, I_face^l denotes the average brightness value of the left face region, and I_face^r denotes the average brightness value of the right face region;
(3) calculating the face tilt angle value, based on the centre coordinates of the two eye regions in the acquired face region, according to the following formula:
Ang = atan2(y_eye2 - y_eye1, x_eye1 - x_eye2)
where Ang denotes the face tilt angle value, (x_eye1, y_eye1) denotes the centre coordinates of the first eye region, and (x_eye2, y_eye2) denotes the centre coordinates of the second eye region.
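As an illustration of the three face measurements defined above, the following sketch computes them with NumPy on a grayscale image. It is a minimal sketch, not the patented implementation: the (x, y, w, h) layout of the face region coordinates and the splitting of the region into left and right halves along its width are assumptions, while the averages, the absolute left-right difference and the atan2 tilt angle follow the definitions given here.

```python
import numpy as np

def face_metrics(gray, face_box, eye1, eye2):
    """gray: 2-D array of initial pixel brightness values (the grayscale image).
    face_box: face region (x_f, y_f, w_f, h_f); eye1, eye2: eye centre coordinates (x, y).
    Returns (I_face, DI_face, Ang)."""
    x, y, w, h = face_box
    face = gray[y:y + h, x:x + w].astype(np.float64)
    i_face = face.mean()                           # face average brightness value I_face
    left, right = face[:, : w // 2], face[:, w // 2:]
    di_face = abs(left.mean() - right.mean())      # left-right face brightness difference DI_face
    (x1, y1), (x2, y2) = eye1, eye2
    ang = np.arctan2(y2 - y1, x1 - x2)             # Ang = atan2(y_eye2 - y_eye1, x_eye1 - x_eye2)
    return i_face, di_face, ang

# Example with synthetic data: a flat 160-brightness face patch and level eyes.
gray = np.full((400, 300), 160, dtype=np.uint8)
print(face_metrics(gray, (100, 80, 100, 120), (170, 120), (130, 120)))
```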
Further, establishing the Gaussian distribution function based on the RGB color values of the pixels in the background region and determining the color consistency parameter of the background region according to the Gaussian distribution function specifically comprises:
obtaining the RGB color values of all pixels in the background region of the certificate photo image to be detected;
averaging the RGB color values of all the pixels to obtain a background pixel average color value;
calculating a covariance matrix based on the RGB color values of the pixels;
establishing the Gaussian distribution function based on the background pixel average color value and the covariance matrix;
calculating the maximum eigenvalue of the covariance matrix, and taking the square root of the maximum eigenvalue to obtain the color consistency parameter of the background region.
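The colour-consistency computation just described (mean colour, covariance matrix, square root of the largest eigenvalue) maps directly onto a few NumPy calls. The sketch below is illustrative only; treating the background pixels as an N x 3 array of RGB samples is an assumption about the data layout.

```python
import numpy as np

def background_color_consistency(background_pixels):
    """background_pixels: N x 3 array of RGB color values of all background pixels.
    Returns (mean_color, covariance, sigma_max)."""
    pixels = np.asarray(background_pixels, dtype=np.float64).reshape(-1, 3)
    mean_color = pixels.mean(axis=0)                           # background pixel average color value
    cov = np.cov(pixels, rowvar=False)                         # 3x3 covariance of the RGB samples
    sigma_max = float(np.sqrt(np.linalg.eigvalsh(cov).max()))  # square root of the largest eigenvalue
    return mean_color, cov, sigma_max                          # (mean_color, cov) define the Gaussian model

# Example: an almost uniform background gives a small sigma_max.
rng = np.random.default_rng(0)
bg = rng.normal(loc=[60, 120, 200], scale=2.0, size=(5000, 3))
print(background_color_consistency(bg)[2])
```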
Further, calculating the color fit parameter between the clothing region and the background region based on the Gaussian distribution function and the pixel colors of the clothing region specifically comprises:
obtaining the RGB color values of all pixels in the clothing region;
calculating, based on the Gaussian distribution function and the RGB color values of the pixels in the clothing region, the fit between the RGB color value of each pixel in the clothing region and the background color according to the following formula:
where P_cloth^i denotes the fit between the RGB color value of the i-th pixel in the clothing region and the background color, x̄ denotes the background pixel average color value, Σ denotes the covariance matrix, and c_i denotes the color value of the i-th pixel in the clothing region;
calculating the color fit parameter between the clothing region and the background region according to the following formula:
P_cloth = (1 / N_cloth) * Σ_{i ∈ R_cloth} P_cloth^i
where P_cloth denotes the color fit parameter between the clothing region and the background region, N_cloth denotes the total area of the clothing region, P_cloth^i denotes the fit between the RGB color value of each pixel in the clothing region and the background color, and R_cloth denotes the coordinates of the clothing region.
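To illustrate the clothing/background colour-fit parameter, the sketch below assumes that the per-pixel fit P_cloth^i is the Gaussian likelihood of the pixel colour under the background model, normalised so that a pixel equal to the background mean scores 1 (i.e. exp(-0.5 * Mahalanobis^2)); the patent's exact per-pixel formula is not reproduced in this text, so that form is an assumption. The averaging over the clothing region follows the definition of P_cloth above.

```python
import numpy as np

def clothing_background_fit(clothing_pixels, mean_color, cov):
    """clothing_pixels: N x 3 RGB color values of the clothing region;
    mean_color, cov: parameters of the background color Gaussian.
    Returns P_cloth, the average per-pixel fit (values near 1: clothing close to background color)."""
    diff = np.asarray(clothing_pixels, dtype=np.float64).reshape(-1, 3) - mean_color
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(3))          # regularised inverse for numerical stability
    maha_sq = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # squared Mahalanobis distance per pixel
    per_pixel_fit = np.exp(-0.5 * maha_sq)                   # assumed form of the per-pixel fit P_cloth^i
    return float(per_pixel_fit.mean())                       # P_cloth: mean fit over the clothing region

# Example: clothing identical in color to the background mean gives P_cloth = 1.
print(clothing_background_fit(np.full((100, 3), 120.0), np.full(3, 120.0), 4.0 * np.eye(3)))
```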
Further, before locating, based on the coordinates of the face region in the acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected respectively, the method further comprises: receiving the certificate photo image to be detected, and locating the coordinates of the face region and the coordinates of the eye regions in the certificate photo image to be detected respectively according to a preset face detection method.
Further, determining the evaluation result of the certificate photo image to be detected based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region specifically comprises:
calculating corresponding scores respectively based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region;
evaluating the certificate photo image to be detected according to a preset evaluation criterion based on the face brightness score, the face light uniformity score, the face tilt score, the background region color monotonicity score, the background region structure monotonicity score and the clothing region color distinction score, to obtain the evaluation result.
Further, calculating the corresponding scores respectively based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region specifically comprises:
determining that the guaranteed output value range of the scoring criteria is between 0 and 100, the basis of the scoring criteria being the truncation T(x) = min(max(x, 0), 100), where T denotes a truncation function;
(1) calculating the face brightness score according to the following formula:
S_1 = T(100 - |170 - I_face| * 0.45)
where S_1 denotes the face brightness score and I_face denotes the face average brightness value;
(2) calculating the face light uniformity score according to the following formula:
S_DI = T(100 - DI_face)
where S_DI denotes the face light uniformity score and DI_face denotes the left-right face brightness difference value;
(3) calculating the face tilt score according to the following formula:
where S_Ang denotes the face tilt score and Ang denotes the face tilt angle value;
(4) calculating the background region color monotonicity score according to the following formula:
S_BC = 100 * exp(-σ_max / 20)
where S_BC denotes the background region color monotonicity score and σ_max denotes the color consistency parameter;
(5) calculating the background region structure monotonicity score according to the following formula:
S_BE = T(100 - R_e * 2000)
where S_BE denotes the background region structure monotonicity score and R_e denotes the smoothness parameter of the background region;
(6) calculating the clothing region color distinction score according to the following formula:
S_cloth = T(100 - P_cloth * 100)
where S_cloth denotes the clothing region color distinction score and P_cloth denotes the color fit parameter between the clothing region and the background region.
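The six scores and the truncation function T can be written down almost verbatim from the formulas above. In the sketch below, clamping to [0, 100] is taken from the stated guaranteed output range, and the face-tilt score uses an assumed linear per-degree penalty because its exact constant is not reproduced in this text.

```python
import math

def T(x):
    """Truncation function: clamp a raw score to the guaranteed output range [0, 100]."""
    return max(0.0, min(100.0, x))

def face_brightness_score(i_face):            # S_1 = T(100 - |170 - I_face| * 0.45)
    return T(100 - abs(170 - i_face) * 0.45)

def light_uniformity_score(di_face):          # S_DI = T(100 - DI_face)
    return T(100 - di_face)

def face_tilt_score(ang_rad, penalty_per_degree=2.0):
    # The exact formula for S_Ang is not reproduced in this text; a linear penalty
    # on the tilt angle in degrees is used here as an assumed placeholder.
    return T(100 - abs(math.degrees(ang_rad)) * penalty_per_degree)

def background_color_score(sigma_max):        # S_BC = 100 * exp(-sigma_max / 20)
    return 100 * math.exp(-sigma_max / 20)

def background_structure_score(r_e):          # S_BE = T(100 - R_e * 2000)
    return T(100 - r_e * 2000)

def clothing_distinction_score(p_cloth):      # S_cloth = T(100 - P_cloth * 100)
    return T(100 - p_cloth * 100)

# Example: ideal brightness, a slightly textured background, clothing unlike the background.
print(face_brightness_score(170), background_color_score(2.0), clothing_distinction_score(0.05))
```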
The present invention also provides a quality detection device for a certificate photo, the device comprising:
a locating module, configured to locate, based on the coordinates of the face region in an acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected, respectively;
a face feature extraction module, configured to calculate a face average brightness value, a left-right face brightness difference value and a face tilt angle value, respectively, based on the coordinates of the face region and the coordinates of the acquired eye regions;
a background feature extraction module, configured to establish a Gaussian distribution function based on the RGB color values of the pixels in the background region, determine a color consistency parameter of the background region according to the Gaussian distribution function, perform edge detection on the background region, and calculate a smoothness parameter of the background region based on the edge detection result;
a clothing feature extraction module, configured to calculate a color fit parameter between the clothing region and the background region based on the Gaussian distribution function and the pixel colors of the clothing region;
an evaluation module, configured to determine an evaluation result of the certificate photo image to be detected based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region, and to output and display the evaluation result.
Further, the face feature extraction module comprises:
a conversion module, configured to convert the certificate photo image to be detected into a grayscale image and obtain the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image;
a face brightness calculation module, configured to calculate the face average brightness value, based on the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image and the coordinates of the face region, according to the following formula:
I_face = (1 / (w_f * h_f)) * Σ_{p ∈ R_face} I_p
where I_face denotes the face average brightness value, w_f denotes the length of the face region, h_f denotes the width of the face region, R_face denotes the coordinates of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
a face brightness difference calculation module, configured to divide the face region equally into two sub-regions of the same size, the two sub-regions being a left face region and a right face region respectively, the face brightness difference calculation module comprising a left face brightness calculation module, a right face brightness calculation module and a left-right face brightness difference calculation module;
the left face brightness calculation module, configured to calculate the average brightness value of the left face region, based on the coordinates of the face region, according to the following formula:
I_face^l = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^l} I_p
where I_face^l denotes the average brightness value of the left face region, R_face^l denotes the coordinates of the left face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
the right face brightness calculation module, configured to calculate the average brightness value of the right face region, based on the coordinates of the face region, according to the following formula:
I_face^r = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^r} I_p
where I_face^r denotes the average brightness value of the right face region, R_face^r denotes the coordinates of the right face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
the left-right face brightness difference calculation module, configured to calculate the left-right face brightness difference value, based on the average brightness value of the left face region and the average brightness value of the right face region, according to the following formula:
DI_face = |I_face^l - I_face^r|
where DI_face denotes the left-right face brightness difference value, I_face^l denotes the average brightness value of the left face region, and I_face^r denotes the average brightness value of the right face region;
a face tilt angle calculation module, configured to calculate the face tilt angle value, based on the centre coordinates of the two eye regions in the acquired face region, according to the following formula:
Ang = atan2(y_eye2 - y_eye1, x_eye1 - x_eye2)
where Ang denotes the face tilt angle value, (x_eye1, y_eye1) denotes the centre coordinates of the first eye region, and (x_eye2, y_eye2) denotes the centre coordinates of the second eye region.
Further, the background feature extraction module comprises:
a color consistency parameter calculation module, configured to obtain the RGB color values of all pixels in the background region of the certificate photo image to be detected, average the RGB color values of all the pixels to obtain a background pixel average color value, calculate a covariance matrix based on the RGB color values of the pixels, establish the Gaussian distribution function based on the background pixel average color value and the covariance matrix, calculate the maximum eigenvalue of the covariance matrix, and take the square root of the maximum eigenvalue to obtain the color consistency parameter of the background region;
a smoothness parameter calculation module, configured to perform edge detection on the background region and calculate the smoothness parameter of the background region based on the edge detection result.
Further, the clothing feature extraction module is specifically configured to:
obtain the RGB color values of all pixels in the clothing region;
calculate, based on the Gaussian distribution function and the RGB color values of the pixels in the clothing region, the fit between the RGB color value of each pixel in the clothing region and the background color according to the following formula:
where P_cloth^i denotes the fit between the RGB color value of the i-th pixel in the clothing region and the background color, x̄ denotes the background pixel average color value, Σ denotes the covariance matrix, and c_i denotes the color value of the i-th pixel in the clothing region;
calculate the color fit parameter between the clothing region and the background region according to the following formula:
P_cloth = (1 / N_cloth) * Σ_{i ∈ R_cloth} P_cloth^i
where P_cloth denotes the color fit parameter between the clothing region and the background region, N_cloth denotes the total area of the clothing region, P_cloth^i denotes the fit between the RGB color value of each pixel in the clothing region and the background color, and R_cloth denotes the coordinates of the clothing region.
Further, the device also comprises a preliminary locating module;
the preliminary locating module is configured to receive the certificate photo image to be detected and to locate the coordinates of the face region and the coordinates of the eye regions in the certificate photo image to be detected respectively according to a preset face detection method.
Further, the evaluation module comprises:
a scoring module, configured to calculate corresponding scores respectively based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region;
an evaluation sub-module, configured to evaluate the certificate photo image to be detected according to a preset evaluation criterion based on the face brightness score, the face light uniformity score, the face tilt score, the background region color monotonicity score, the background region structure monotonicity score and the clothing region color distinction score, to obtain the evaluation result;
an output module, configured to output and display the evaluation result.
Further, the scoring module comprises:
a scoring criteria formulation module, configured to determine that the guaranteed output value range of the scoring criteria is between 0 and 100, the basis of the scoring criteria being the truncation T(x) = min(max(x, 0), 100), where T denotes a truncation function;
a face brightness score calculation module, configured to calculate the face brightness score according to the following formula:
S_1 = T(100 - |170 - I_face| * 0.45)
where S_1 denotes the face brightness score and I_face denotes the face average brightness value;
a face light uniformity score calculation module, configured to calculate the face light uniformity score according to the following formula:
S_DI = T(100 - DI_face)
where S_DI denotes the face light uniformity score and DI_face denotes the left-right face brightness difference value;
a face tilt score calculation module, configured to calculate the face tilt score according to the following formula:
where S_Ang denotes the face tilt score and Ang denotes the face tilt angle value;
a background color monotonicity score calculation module, configured to calculate the background region color monotonicity score according to the following formula:
S_BC = 100 * exp(-σ_max / 20)
where S_BC denotes the background region color monotonicity score and σ_max denotes the color consistency parameter;
a background structure monotonicity score calculation module, configured to calculate the background region structure monotonicity score according to the following formula:
S_BE = T(100 - R_e * 2000)
where S_BE denotes the background region structure monotonicity score and R_e denotes the smoothness parameter of the background region;
a clothing region color distinction score calculation module, configured to calculate the clothing region color distinction score according to the following formula:
S_cloth = T(100 - P_cloth * 100)
where S_cloth denotes the clothing region color distinction score and P_cloth denotes the color fit parameter between the clothing region and the background region.
Compared with the prior art, the beneficial effects of the present invention are as follows:
In the quality detection method for a certificate photo provided by the present invention, based on the coordinates of the face region in the acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected are located respectively; based on the clothing region, the background region, the face region and the coordinates of the acquired eye regions, the face average brightness value, the left-right face brightness difference value and the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region are calculated respectively; the evaluation result of the certificate photo image to be detected is determined based on the corresponding parameters, and the evaluation result is output and displayed. The invention provides a complete quality detection method for certificate photos: through the evaluation result, the user can judge by himself whether the quality of the certificate photo picture has reached the standard, so that the quality of the certificate photo picture can be further improved according to the evaluation result.
Description of the drawings
Fig. 1 is a schematic diagram of the quality detection method for a certificate photo provided by the first embodiment of the present invention;
Fig. 2 is a schematic diagram of the quality detection method for a certificate photo provided by the second embodiment of the present invention;
Fig. 3 is a schematic diagram of the quality detection device for a certificate photo provided by the third embodiment of the present invention;
Fig. 4 is a schematic diagram of the quality detection device for a certificate photo provided by the fourth embodiment of the present invention;
Fig. 5 is a schematic diagram of the face brightness difference calculation module provided by the fourth embodiment of the present invention;
Fig. 6 is a schematic diagram of the scoring module provided by the fourth embodiment of the present invention;
Fig. 7 is a schematic diagram of the region locating provided by an embodiment of the present invention.
Specific embodiment
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit the present invention.
As one embodiment of the present invention, Fig. 1 is a schematic diagram of the quality detection method for a certificate photo provided by the first embodiment of the present invention. The first embodiment of the present invention provides a quality detection method for a certificate photo, the detection method comprising:
Step S101: based on the coordinates of the face region in the acquired certificate photo image to be detected, locating the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected, respectively.
Step S102: based on the coordinates of the face region and the coordinates of the acquired eye regions, calculating the face average brightness value, the left-right face brightness difference value and the face tilt angle value, respectively.
Step S103: establishing a Gaussian distribution function based on the RGB color values of the pixels in the background region, determining the color consistency parameter of the background region according to the Gaussian distribution function, performing edge detection on the background region, and calculating the smoothness parameter of the background region based on the edge detection result.
Step S104: based on the Gaussian distribution function and the pixel colors of the clothing region, calculating the color fit parameter between the clothing region and the background region.
Step S105: based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region, determining the evaluation result of the certificate photo image to be detected, and outputting and displaying the evaluation result.
In summary, the first embodiment of the present invention provides a complete quality detection method for certificate photos; through the evaluation result, the user can judge by himself whether the quality of the certificate photo picture has reached the standard, so that the quality of the certificate photo picture can be further improved according to the evaluation result.
As a second embodiment of the present invention, Fig. 2 is a schematic diagram of the quality detection method for a certificate photo provided by the second embodiment of the present invention. The second embodiment of the present invention provides a quality detection method for a certificate photo, the detection method comprising:
Step S201: receiving the certificate photo image to be detected, and locating the coordinates of the face region and the coordinates of the eye regions in the certificate photo image to be detected respectively according to a preset face detection method.
In the prior art there are many methods that can accurately detect and locate the coordinates of the face region or the coordinates of the eye regions in a photo image, and they are not described in detail in the present invention. It should be noted, however, that a standard certificate photo should contain only one face; if no face coordinates are detected in the certificate photo image to be detected in step S201, or multiple face coordinates are detected, the certificate photo image already fails the standard at this initial detection stage. Therefore, the detection method provided by the embodiment of the present invention can prompt the user at this point that the certificate photo does not meet the standard, so that the next step is not carried out.
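As an illustration of this pre-check, the sketch below uses OpenCV's bundled Haar-cascade face detector to reject images in which zero faces or more than one face are found; the detector and its parameters are an example choice for the "preset face detection method", not the method prescribed by the patent.

```python
import cv2

def precheck_certificate_photo(image_bgr):
    """Return the single detected face rectangle (x, y, w, h), or None when the photo
    should be rejected because zero or multiple faces were found."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) != 1:
        return None   # prompt the user that the certificate photo does not meet the standard
    return tuple(int(v) for v in faces[0])
```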
Step S202: based on the coordinates of the face region in the acquired certificate photo image to be detected, locating the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected, respectively, as shown in Fig. 7, which is a schematic diagram of the region locating provided by an embodiment of the present invention.
In step S202, the embodiment of the present invention locates the coordinates of the clothing region and the background region as follows:
The coordinates of the face region in the certificate photo image to be detected are acquired in advance. The coordinates of the face region can be defined as R_face = (x_f, y_f, w_f, h_f), where R_face denotes the coordinates of the face region, (x_f, y_f) denotes the coordinates of the top-left vertex of the face region, w_f denotes the length of the face region, and h_f denotes the width of the face region;
Based on the coordinates of the face region, the coordinates of the clothing region in the certificate photo image to be detected are located as R_cloth = (x_f, y_f + h_f * 1.5, w_f, H - y_f - h_f * 1.5), where R_cloth denotes the coordinates of the clothing region, (x_f, y_f) denotes the coordinates of the top-left vertex of the face region, w_f denotes the length of the face region, h_f denotes the width of the face region, and H denotes the length of the certificate photo image to be detected;
In this embodiment, the background region is first divided into a left background region and a right background region, and the coordinates of the left background region and of the right background region are located respectively based on the coordinates of the face region: the left background region is located at coordinates R_bg1, and the right background region is located at coordinates R_bg2 = (x_f + w_f * 1.5, 0, W - x_f - w_f * 1.5, y_f + h_f), where R_bg1 denotes the coordinates of the left background region, R_bg2 denotes the coordinates of the right background region, and W denotes the width of the certificate photo image to be detected.
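The region layout above can be expressed as a small helper. In the sketch below, the clothing box and the right background box follow the formulas given in this embodiment, while the left background box, whose formula is not reproduced in this text, is assumed to mirror the right one.

```python
def locate_regions(face_box, image_width, image_height):
    """face_box: R_face = (x_f, y_f, w_f, h_f); image_width, image_height: W and H of the photo.
    Returns the clothing box and the left/right background boxes as (x, y, w, h)."""
    x_f, y_f, w_f, h_f = face_box
    W, H = image_width, image_height
    # R_cloth = (x_f, y_f + h_f * 1.5, w_f, H - y_f - h_f * 1.5)
    clothing = (x_f, y_f + h_f * 1.5, w_f, H - y_f - h_f * 1.5)
    # R_bg2 = (x_f + w_f * 1.5, 0, W - x_f - w_f * 1.5, y_f + h_f)   (right background, as given)
    right_bg = (x_f + w_f * 1.5, 0, W - x_f - w_f * 1.5, y_f + h_f)
    # Left background assumed to mirror the right one (its formula is not given in this text):
    left_bg = (0, 0, x_f - w_f * 0.5, y_f + h_f)
    return clothing, left_bg, right_bg

print(locate_regions((100, 80, 100, 120), 300, 400))
```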
Step S203: based on the coordinates of the face region and the coordinates of the acquired eye regions, the face average brightness value, the left-right face brightness difference value and the face tilt angle value are calculated respectively. Step S203 specifically comprises:
The certificate photo image to be detected is converted into a grayscale image, and the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image are obtained.
(1) Based on the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image and the coordinates of the face region, the face average brightness value is calculated according to the following formula:
I_face = (1 / (w_f * h_f)) * Σ_{p ∈ R_face} I_p
where I_face denotes the face average brightness value, w_f denotes the length of the face region, h_f denotes the width of the face region, R_face denotes the coordinates of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel.
(2) The face region is divided equally into two sub-regions of the same size, the two sub-regions being a left face region and a right face region respectively. Based on the coordinates of the face region, the average brightness value of the left face region is calculated according to the following formula:
I_face^l = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^l} I_p
where I_face^l denotes the average brightness value of the left face region, R_face^l denotes the coordinates of the left face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel.
Based on the coordinates of the face region, the average brightness value of the right face region is calculated according to the following formula:
I_face^r = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^r} I_p
where I_face^r denotes the average brightness value of the right face region, R_face^r denotes the coordinates of the right face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel.
Based on the average brightness value of the left face region and the average brightness value of the right face region, the left-right face brightness difference value is calculated according to the following formula:
DI_face = |I_face^l - I_face^r|
where DI_face denotes the left-right face brightness difference value, I_face^l denotes the average brightness value of the left face region, and I_face^r denotes the average brightness value of the right face region;
(3) Based on the centre coordinates of the two eye regions in the acquired face region, the face tilt angle value is calculated according to the following formula:
Ang = atan2(y_eye2 - y_eye1, x_eye1 - x_eye2)
where Ang denotes the face tilt angle value, (x_eye1, y_eye1) denotes the centre coordinates of the first eye region, and (x_eye2, y_eye2) denotes the centre coordinates of the second eye region.
Step S204: a Gaussian distribution function is established based on the RGB color values of the pixels in the background region, the color consistency parameter of the background region is determined according to the Gaussian distribution function, edge detection is performed on the background region, and the smoothness parameter of the background region is calculated based on the edge detection result. Step S204 specifically comprises:
(1) The RGB color values of all pixels in the background region of the certificate photo image to be detected are obtained; the RGB color values of all the pixels are averaged to obtain a background pixel average color value; a covariance matrix is calculated based on the RGB color values of the pixels; the Gaussian distribution function is established based on the background pixel average color value and the covariance matrix; the maximum eigenvalue of the covariance matrix is calculated, and the square root of the maximum eigenvalue is taken to obtain the color consistency parameter of the background region. In this embodiment, the Gaussian distribution function is denoted by G(x̄, Σ), where x̄ denotes the background pixel average color value and Σ denotes the covariance matrix, and the color consistency parameter of the background region is denoted by σ_max.
(2) According to a preset edge detection method, edge detection is performed on the background region to obtain a number of edge pixels, and the background region smoothness parameter is calculated based on the edge pixels, the calculation formula being R_e = N_e / N, where R_e denotes the background region smoothness parameter, N_e denotes the number of edge pixels, and N denotes the total area of the background region. The background region smoothness parameter mainly reflects the complexity of the background region of the certificate photo picture: the more complicated the background region is, the more edge pixels there are, and therefore the larger the value of the background region smoothness parameter.
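The smoothness parameter R_e = N_e / N can be computed with any edge detector. The sketch below uses OpenCV's Canny detector as one example of a "preset edge detection method"; the thresholds are arbitrary illustrative values.

```python
import cv2
import numpy as np

def background_smoothness(gray, bg_box, low_threshold=50, high_threshold=150):
    """gray: grayscale image; bg_box: a background region (x, y, w, h).
    Returns R_e = N_e / N, the fraction of background pixels that are edge pixels."""
    x, y, w, h = bg_box
    region = gray[y:y + h, x:x + w]
    edges = cv2.Canny(region, low_threshold, high_threshold)  # example edge detector
    n_e = int(np.count_nonzero(edges))                        # number of edge pixels N_e
    n = region.size                                           # total area N of the region
    return n_e / n

# Example: a perfectly flat background contains no edges, so R_e = 0.
print(background_smoothness(np.full((400, 300), 200, dtype=np.uint8), (0, 0, 100, 200)))
```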
Step S205: based on the Gaussian distribution function and the pixel colors of the clothing region, the color fit parameter between the clothing region and the background region is calculated. Step S205 specifically comprises:
The RGB color values of all pixels in the clothing region are obtained;
Based on the Gaussian distribution function and the RGB color values of the pixels in the clothing region, the fit between the RGB color value of each pixel in the clothing region and the background color is calculated according to the following formula:
where P_cloth^i denotes the fit between the RGB color value of the i-th pixel in the clothing region and the background color, x̄ denotes the background pixel average color value, Σ denotes the covariance matrix, and c_i denotes the color value of the i-th pixel in the clothing region;
The color fit parameter between the clothing region and the background region is calculated according to the following formula:
P_cloth = (1 / N_cloth) * Σ_{i ∈ R_cloth} P_cloth^i
where P_cloth denotes the color fit parameter between the clothing region and the background region, N_cloth denotes the total area of the clothing region, P_cloth^i denotes the fit between the RGB color value of each pixel in the clothing region and the background color, and R_cloth denotes the coordinates of the clothing region. The larger the color fit parameter between the clothing region and the background region is, the more similar the clothing color in the certificate photo picture is to the background color.
Step S206: based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region, the evaluation result of the certificate photo image to be detected is determined, and the evaluation result is output and displayed. Step S206 specifically comprises:
(1) Based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region, the corresponding scores are calculated respectively. How the corresponding scores are calculated is described in detail below:
The guaranteed output value range of the predetermined scoring criteria is between 0 and 100, the basis of the scoring criteria being the truncation T(x) = min(max(x, 0), 100), where T denotes a truncation function;
1) The face brightness score is calculated according to the following formula:
S_1 = T(100 - |170 - I_face| * 0.45)
where S_1 denotes the face brightness score and I_face denotes the face average brightness value. In general, an average brightness value of around 170 corresponds to a fairly ideal brightness effect.
2) The face light uniformity score is calculated according to the following formula:
S_DI = T(100 - DI_face)
where S_DI denotes the face light uniformity score and DI_face denotes the left-right face brightness difference value;
3) The face tilt score is calculated according to the following formula:
where S_Ang denotes the face tilt score and Ang denotes the face tilt angle value. The lower the face tilt score is, the more inclined the face in the certificate photo picture to be detected is.
4) The background region color monotonicity score is calculated according to the following formula:
S_BC = 100 * exp(-σ_max / 20)
where S_BC denotes the background region color monotonicity score and σ_max denotes the color consistency parameter;
5) The background region structure monotonicity score is calculated according to the following formula:
S_BE = T(100 - R_e * 2000)
where S_BE denotes the background region structure monotonicity score and R_e denotes the smoothness parameter of the background region;
6) The clothing region color distinction score is calculated according to the following formula:
S_cloth = T(100 - P_cloth * 100)
where S_cloth denotes the clothing region color distinction score and P_cloth denotes the color fit parameter between the clothing region and the background region.
(2) Based on the face brightness score, the face light uniformity score, the face tilt score, the background region color monotonicity score, the background region structure monotonicity score and the clothing region color distinction score, the certificate photo image to be detected is evaluated according to a preset evaluation criterion, and the evaluation result is obtained.
It should be noted that in step S206, when the evaluation result is output and displayed, the evaluation result may be the scores, or the scores together with an evaluation conclusion, or only an evaluation conclusion. The evaluation conclusion may be a prompt of "qualified" or "unqualified", or some recommendations or suggestions for improvement. In addition, in the method provided by this embodiment, thresholds may be set for the scores in advance; when a score is lower than its threshold, that item is highlighted or the user is prompted to pay attention to it.
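The behaviour described above (reporting scores, an optional qualified/unqualified verdict, and highlighting items that fall below preset thresholds) can be illustrated as follows; the threshold value and the item names are assumptions made for the example, not values taken from the patent.

```python
def build_evaluation(scores, threshold=60):
    """scores: dict mapping item names to 0-100 scores.
    Returns the scores plus a verdict and the list of items to highlight for the user."""
    flagged = [name for name, value in scores.items() if value < threshold]
    verdict = "qualified" if not flagged else "unqualified"
    return {"scores": scores, "verdict": verdict, "highlight": flagged}

print(build_evaluation({
    "face brightness": 88, "light uniformity": 91, "face tilt": 95,
    "background color": 42, "background structure": 97, "clothing distinction": 85,
}))
```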
In summary, with the quality detection method for certificate photos provided by the second embodiment of the present invention, the user can judge by himself through the evaluation result whether the quality of the certificate photo picture has reached the standard, and can further improve the quality of the certificate photo picture according to the evaluation result, thereby improving the success rate of later processing of the picture, so that a certificate photo that reaches the quality standard and satisfies the user can be obtained within a short time. Moreover, the algorithm of the detection method is simple and can quickly locate the detection regions, quickly extract features and quickly evaluate, which greatly improves the detection efficiency.
As a third embodiment of the present invention, Fig. 3 is a schematic diagram of the quality detection device for a certificate photo provided by the third embodiment of the present invention. The detection device comprises:
a locating module 11, configured to locate, based on the coordinates of the face region in the acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected, respectively;
a face feature extraction module 22, configured to calculate the face average brightness value, the left-right face brightness difference value and the face tilt angle value, respectively, based on the coordinates of the face region and the coordinates of the acquired eye regions;
a background feature extraction module 33, configured to establish a Gaussian distribution function based on the RGB color values of the pixels in the background region, determine the color consistency parameter of the background region according to the Gaussian distribution function, perform edge detection on the background region, and calculate the smoothness parameter of the background region based on the edge detection results;
a clothing feature extraction module 44, configured to calculate the color fit parameter between the clothing region and the background region based on the Gaussian distribution function and the pixel colors of the clothing region;
an evaluation module 55, configured to determine the evaluation result of the certificate photo image to be detected based on the face average brightness value, the left-right face brightness difference value, the face tilt angle value, the color consistency parameter of the background region, the smoothness parameter of the background region and the color fit parameter between the clothing region and the background region, and to output and display the evaluation result.
In summary, with the quality detection device for certificate photos provided by the third embodiment of the present invention, the user can judge by himself through the evaluation result whether the quality of the certificate photo picture has reached the standard, so that the quality of the certificate photo picture can be further improved according to the evaluation result.
As a fourth embodiment of the present invention, Fig. 4 is a schematic diagram of the quality detection device for a certificate photo provided by the fourth embodiment of the present invention. The detection device comprises:
a preliminary locating module 66, configured to receive the certificate photo image to be detected and to locate the coordinates of the face region and the coordinates of the eye regions in the certificate photo image to be detected respectively according to a preset face detection method. In the prior art there are many methods that can accurately detect and locate the coordinates of the face region or the coordinates of the eye regions in a photo image, and they are not described in detail in the present invention. It should be noted, however, that when the preliminary locating module 66 detects no face coordinates in the certificate photo image to be detected, or detects multiple face coordinates, the certificate photo image already fails the standard at this initial detection stage; therefore, the preliminary locating module 66 can prompt the user at this point that the certificate photo does not meet the standard, so that the next step is not carried out.
a locating module 11, configured to locate, based on the coordinates of the face region in the acquired certificate photo image to be detected, the coordinates of the clothing region and the coordinates of the background region in the certificate photo image to be detected, respectively. The locating module 11 is specifically configured to:
acquire in advance the coordinates of the face region in the certificate photo image to be detected, the coordinates of the face region being definable as R_face = (x_f, y_f, w_f, h_f), where R_face denotes the coordinates of the face region, (x_f, y_f) denotes the coordinates of the top-left vertex of the face region, w_f denotes the length of the face region, and h_f denotes the width of the face region;
locate, based on the coordinates of the face region, the coordinates of the clothing region in the certificate photo image to be detected as R_cloth = (x_f, y_f + h_f * 1.5, w_f, H - y_f - h_f * 1.5), where R_cloth denotes the coordinates of the clothing region, (x_f, y_f) denotes the coordinates of the top-left vertex of the face region, w_f denotes the length of the face region, h_f denotes the width of the face region, and H denotes the length of the certificate photo image to be detected;
in this embodiment, first divide the background region into a left background region and a right background region, and locate the coordinates of the left background region and of the right background region respectively based on the coordinates of the face region: the left background region is located at coordinates R_bg1, and the right background region is located at coordinates R_bg2 = (x_f + w_f * 1.5, 0, W - x_f - w_f * 1.5, y_f + h_f), where R_bg1 denotes the coordinates of the left background region, R_bg2 denotes the coordinates of the right background region, and W denotes the width of the certificate photo image to be detected.
a face feature extraction module 22, configured to calculate the face average brightness value, the left-right face brightness difference value and the face tilt angle value, respectively, based on the coordinates of the face region and the coordinates of the acquired eye regions. The face feature extraction module 22 specifically comprises:
a conversion module 201, configured to convert the certificate photo image to be detected into a grayscale image and obtain the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image;
a face brightness calculation module 202, configured to calculate the face average brightness value, based on the initial pixel brightness values of the certificate photo image to be detected converted into the grayscale image and the coordinates of the face region, according to the following formula:
I_face = (1 / (w_f * h_f)) * Σ_{p ∈ R_face} I_p
where I_face denotes the face average brightness value, w_f denotes the length of the face region, h_f denotes the width of the face region, R_face denotes the coordinates of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
a face brightness difference calculation module 203, configured to divide the face region equally into two sub-regions of the same size, the two sub-regions being a left face region and a right face region respectively; as shown in Fig. 5, the face brightness difference calculation module 203 comprises a left face brightness calculation module 2031, a right face brightness calculation module 2032 and a left-right face brightness difference calculation module 2033;
the left face brightness calculation module 2031, configured to calculate the average brightness value of the left face region, based on the coordinates of the face region, according to the following formula:
I_face^l = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^l} I_p
where I_face^l denotes the average brightness value of the left face region, R_face^l denotes the coordinates of the left face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
the right face brightness calculation module 2032, configured to calculate the average brightness value of the right face region, based on the coordinates of the face region, according to the following formula:
I_face^r = (2 / (w_f * h_f)) * Σ_{p ∈ R_face^r} I_p
where I_face^r denotes the average brightness value of the right face region, R_face^r denotes the coordinates of the right face region, w_f denotes the length of the face region, h_f denotes the width of the face region, p denotes any pixel in the certificate photo image to be detected converted into the grayscale image, and I_p denotes the initial pixel brightness value of that pixel;
the left-right face brightness difference calculation module 2033, configured to calculate the left-right face brightness difference value, based on the average brightness value of the left face region and the average brightness value of the right face region, according to the following formula:
DI_face = |I_face^l - I_face^r|
where DI_face denotes the left-right face brightness difference value, I_face^l denotes the average brightness value of the left face region, and I_face^r denotes the average brightness value of the right face region;
a face tilt angle calculation module 204, configured to calculate the face tilt angle value, based on the centre coordinates of the two eye regions in the acquired face region, according to the following formula:
Ang = atan2(y_eye2 - y_eye1, x_eye1 - x_eye2)
where Ang denotes the face tilt angle value, (x_eye1, y_eye1) denotes the centre coordinates of the first eye region, and (x_eye2, y_eye2) denotes the centre coordinates of the second eye region.
The background characteristics extraction module 33 is configured to establish a Gaussian distribution function based on the RGB color values of the pixels in the background area, to determine the colour consistency parameter of the background area from the Gaussian distribution function, and to perform edge detection on the background area and calculate the slickness parameter of the background area from the edge detection result. The background characteristics extraction module 33 specifically includes a colour consistency parameter calculating module 301 and a slickness parameter calculating module 302.

The colour consistency parameter calculating module 301 is configured to obtain the RGB color values of all pixels in the background region of the certificate photo image to be detected, average those RGB color values to obtain the background pixel average color value, calculate a covariance matrix from the RGB color values of the pixels, establish a Gaussian distribution function based on the background pixel average color value and the covariance matrix, calculate the maximum eigenvalue of the covariance matrix and take its square root to obtain the colour consistency parameter of the background area. In the present embodiment the Gaussian distribution function is written as

$$P(x) = \frac{1}{\sqrt{(2\pi)^{3}\,|\Sigma|}} \exp\!\left(-\tfrac{1}{2}\,(x - \bar{C})^{T}\,\Sigma^{-1}\,(x - \bar{C})\right)$$

wherein $\bar{C}$ denotes the background pixel average color value and $\Sigma$ the covariance matrix.
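A compact sketch of this colour-consistency computation; representing the background as a boolean mask and relying on NumPy's covariance and eigenvalue routines are choices of the sketch, not of the patent:

```python
import numpy as np

def background_color_consistency(image, background_mask):
    """sigma_max: square root of the largest eigenvalue of the background RGB covariance.

    image           -- H x W x 3 RGB array
    background_mask -- boolean H x W array marking background pixels
    Returns (mean_color, covariance, sigma_max) so the Gaussian model can be reused later.
    """
    pixels = image[background_mask].reshape(-1, 3).astype(np.float64)
    mean_color = pixels.mean(axis=0)        # background pixel average color value, C_bar
    cov = np.cov(pixels, rowvar=False)      # 3 x 3 covariance matrix, Sigma
    sigma_max = float(np.sqrt(np.linalg.eigvalsh(cov).max()))
    return mean_color, cov, sigma_max
```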
The slickness parameter calculating module 302 is configured to perform edge detection on the background area and to calculate the slickness parameter of the background area from the edge detection result. In the present embodiment, edge detection is performed on the background area according to a preset edge detection method to obtain a set of edge pixels, and the background area slickness parameter is calculated from those edge pixels as $R_e = N_e / N$, wherein $R_e$ denotes the background area slickness parameter, $N_e$ the number of edge pixels and $N$ the gross area of the background region.
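A sketch of this ratio using the Canny detector; the text leaves the "preset edge detection method" unspecified, so the detector and its thresholds here are assumptions of the sketch:

```python
import cv2
import numpy as np

def background_smoothness(gray, background_mask, low=50, high=150):
    """R_e = N_e / N: fraction of background pixels flagged as edge pixels.

    gray            -- 2-D uint8 grayscale image
    background_mask -- boolean H x W array marking the background region
    """
    edges = cv2.Canny(gray, low, high) > 0                   # boolean edge map
    n_edge = int(np.count_nonzero(edges & background_mask))  # N_e, edge pixels in the background
    n_total = int(np.count_nonzero(background_mask))         # N, gross area of the background
    return n_edge / n_total if n_total else 0.0
```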
The garment features extraction module 44 is configured to calculate the color degree of fitting parameter between the Garment region and the background area, based on the Gaussian distribution function and the pixel colors of the Garment region. Specifically, the garment features extraction module 44 is configured to:

obtain the RGB color values of all pixels in the Garment region;

calculate, based on the Gaussian distribution function and the RGB color values of the pixels in the Garment region, the degree of fitting between the RGB color value of each pixel in the Garment region and the background color according to the following formula:

$$P\!\left(C_{cloth}^{i}\right) = \frac{1}{\sqrt{(2\pi)^{3}\,|\Sigma|}} \exp\!\left(-\tfrac{1}{2}\,\left(C_{cloth}^{i} - \bar{C}\right)^{T}\,\Sigma^{-1}\,\left(C_{cloth}^{i} - \bar{C}\right)\right)$$

wherein $P(C_{cloth}^{i})$ denotes the degree of fitting between the RGB color value of the $i$-th pixel in the Garment region and the background color, $\bar{C}$ the background pixel average color value, $\Sigma$ the covariance matrix, and $C_{cloth}^{i}$ the color value of the $i$-th pixel in the Garment region;

and calculate the color degree of fitting parameter between the Garment region and the background area according to the following formula:

$$P_{cloth} = \frac{1}{N_{cloth}} \sum_{i \in R_{cloth}} P\!\left(C_{cloth}^{i}\right)$$

wherein $P_{cloth}$ denotes the color degree of fitting parameter between the Garment region and the background area, $N_{cloth}$ the gross area of the Garment region, $P(C_{cloth}^{i})$ the degree of fitting of each pixel in the Garment region, and $R_{cloth}$ the coordinates of the Garment region.
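A sketch of the fitting computation, reusing the mean color and covariance returned by the background sketch above; the vectorised Mahalanobis term and the square-root normalisation of the 3-D Gaussian follow our reconstruction of the formula:

```python
import numpy as np

def clothing_background_fit(image, clothing_mask, mean_color, cov):
    """P_cloth: average Gaussian fit of clothing pixel colors to the background model."""
    pixels = image[clothing_mask].reshape(-1, 3).astype(np.float64)
    diff = pixels - mean_color                                   # C_cloth_i - C_bar
    inv_cov = np.linalg.inv(cov)
    norm = 1.0 / np.sqrt(((2.0 * np.pi) ** 3) * np.linalg.det(cov))
    mahal = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)        # per-pixel Mahalanobis term
    fits = norm * np.exp(-0.5 * mahal)                           # P(C_cloth_i) for every pixel
    return float(fits.mean())                                    # average over the Garment region
```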
The evaluation module 55 is configured to determine the evaluation result of the certificate photo image to be detected based on the face average brightness value, the left and right face luminance difference value, the face inclination angle value, the colour consistency parameter of the background area, the slickness parameter of the background area and the color degree of fitting parameter between the Garment region and the background area, and to output and display the evaluation result. The evaluation module 55 includes a grading module 501, an evaluation submodule 502 and an output module 503.

The grading module 501 is configured to calculate a corresponding fraction for each of the face average brightness value, the left and right face luminance difference value, the face inclination angle value, the colour consistency parameter of the background area, the slickness parameter of the background area and the color degree of fitting parameter between the Garment region and the background area. As shown in Figure 6, the grading module 501 includes:
The standards-of-grading formulating module 5011 is configured to guarantee that the output value range of the grading standard lies between 0 and 100; the criterion of the grading standard applies a truncation function, denoted $T$, which clamps each raw score to this range.
The face brightness fraction computing module 5012 is configured to calculate the face brightness fraction according to the following formula:

$$S_{1} = T\!\left(100 - \left|170 - I_{face}\right| \times 0.45\right)$$

wherein $S_1$ denotes the face brightness fraction and $I_{face}$ the face average brightness value;
The human face light uniformity fraction computing module 5013 is configured to calculate the human face light uniformity fraction according to the following formula:

$$S_{DI} = T\!\left(100 - DI_{face}\right)$$

wherein $S_{DI}$ denotes the human face light uniformity fraction and $DI_{face}$ the left and right face luminance difference value;
The face gradient fraction computing module 5014 is configured to calculate the face gradient fraction according to the following formula:

$$S_{Ang} = T\!\left(100 - \frac{2\,|Ang|}{\pi}\right)$$

wherein $S_{Ang}$ denotes the face gradient fraction and $Ang$ the face inclination angle value;
The background color monotonicity fraction computing module 5015 is configured to calculate the background area color monotonicity fraction according to the following formula:

$$S_{BC} = 100 \cdot \exp\!\left(-\sigma_{max} / 20\right)$$

wherein $S_{BC}$ denotes the background area color monotonicity fraction and $\sigma_{max}$ the colour consistency parameter;
The background structure monotonicity fraction computing module 5016 is configured to calculate the background area structure monotonicity fraction according to the following formula:

$$S_{BE} = T\!\left(100 - R_e \times 2000\right)$$

wherein $S_{BE}$ denotes the background area structure monotonicity fraction and $R_e$ the background area slickness parameter;
The Garment region color-identifying degree fraction computing module 5017 is configured to calculate the Garment region color-identifying degree fraction according to the following formula:

$$S_{cloth} = T\!\left(100 - P_{cloth} \times 100\right)$$

wherein $S_{cloth}$ denotes the Garment region color-identifying degree fraction and $P_{cloth}$ the color degree of fitting parameter between the Garment region and the background area.
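Gathered together, a sketch of the six sub-scores; the clamp implemented by T follows the 0-to-100 guarantee stated above, and the tilt penalty 2|Ang|/pi follows our reading of the formula, so both are assumptions of the sketch:

```python
import math

def T(x, lo=0.0, hi=100.0):
    """Truncation function: clamp a raw score to the guaranteed [0, 100] output range."""
    return max(lo, min(hi, x))

def certificate_scores(I_face, DI_face, Ang, sigma_max, R_e, P_cloth):
    """The six sub-scores of the grading module, using the coefficients given in the text."""
    return {
        'face_brightness':      T(100 - abs(170 - I_face) * 0.45),
        'light_uniformity':     T(100 - DI_face),
        'face_tilt':            T(100 - 2 * abs(Ang) / math.pi),
        'background_color':     100 * math.exp(-sigma_max / 20),
        'background_structure': T(100 - R_e * 2000),
        'clothing_contrast':    T(100 - P_cloth * 100),
    }
```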
The evaluation submodule 502 is configured to evaluate the certificate photo image to be detected according to a preset evaluation criterion, based on the face brightness fraction, the human face light uniformity fraction, the face gradient fraction, the background area color monotonicity fraction, the background area structure monotonicity fraction and the Garment region color-identifying degree fraction, and to obtain the evaluation result.

The output module 503 is configured to output and display the evaluation result.
It should be noted that the output may be the score alone, the score together with an evaluation result, or the evaluation result alone, and that the evaluation result may be a prompt such as "qualified" or "unqualified", or one or more recommendations for improvement. With the method provided by this embodiment, a score threshold can also be set in advance, so that when a particular item's score falls below the threshold, that item is highlighted or the user is prompted to pay attention to it.
In summary, with the certificate photo quality detection device provided by the fourth embodiment of the present invention, the user can judge from the evaluation result whether the quality of the certificate photo has reached the standard, and can further improve the photo according to the evaluation result, which raises the success rate of later processing so that a certificate photo meeting the quality standard and satisfying the user can be obtained within a short time. Moreover, the detection device uses simple algorithms that locate the detection regions, extract the features and evaluate quickly, substantially improving detection efficiency.

The foregoing is only the preferred embodiment of the present invention and is not intended to limit the invention; any modification, equivalent replacement and improvement made within the spirit and principle of the present invention shall be included within the protection scope of the present invention.

Claims (14)

1. A quality detection method for a certificate photo, characterised in that the detection method comprises:
Based on the coordinate of human face region in the certificate photo image to be detected for obtaining, to clothes area in the certificate photo image to be detected The coordinate in domain and the coordinate of background area are positioned respectively;
The coordinate of coordinate and the eye areas for obtaining based on the human face region, calculates face average brightness value, left and right respectively Face luminance difference value and face inclination angle value;
Gauss of distribution function is set up based on the RGB color value of the pixel in the background area, according to the Gauss distribution letter Number determines the colour consistency parameter of the background area, and carries out rim detection to the background area, based on the edge Testing result calculates the slickness parameter of the background area;
Based on the gauss of distribution function and the pixel color of the Garment region, the Garment region and background area are calculated Between color degree of fitting parameter;
Based on the face average brightness value, left and right face luminance difference value, face inclination angle value, background area solid colour Property parameter, the slickness parameter of background area and the color degree of fitting parameter between Garment region and background area, determine institute The evaluation result of certificate photo image to be detected is stated, by the evaluation result output display.
2. detection method as claimed in claim 1, it is characterised in that the coordinate and the eye for obtaining based on the human face region The coordinate in eyeball region, calculates face average brightness value, left and right face luminance difference value and face inclination angle value respectively and specifically wraps Include:
The certificate photo image to be detected is converted into a gray-scale map, and the pixel intensity initial values of the gray-scale certificate photo image to be detected are obtained;
(1) based on the pixel intensity initial values of the gray-scale certificate photo image to be detected and the coordinates of the human face region, the face average brightness value is calculated according to the following formula:

$$I_{face} = \frac{1}{w_f \cdot h_f} \sum_{p \in R_{face}} I_p$$

wherein $I_{face}$ denotes the face average brightness value, $w_f$ the length and $h_f$ the width of the human face region, $R_{face}$ the coordinates of the human face region, $p$ any pixel of the gray-scale certificate photo image to be detected, and $I_p$ the pixel intensity initial value of that gray-scale image;
(2) the human face region is divided equally into two sub-regions of identical size, the two sub-regions being a left face region and a right face region respectively;
Based on the coordinate of the human face region, the average brightness value in the left face region is calculated according to equation below:
$$I_{face}^{l} = \frac{2}{w_f \cdot h_f} \sum_{p \in R_{face}^{l}} I_p$$

wherein $I_{face}^{l}$ denotes the average brightness value of the left face region, $R_{face}^{l}$ the coordinates of the left face region, $w_f$ the length and $h_f$ the width of the human face region, $p$ any pixel of the gray-scale certificate photo image to be detected, and $I_p$ the pixel intensity initial value of that gray-scale image;
Based on the coordinate of the human face region, the average brightness value in the right face region is calculated according to equation below:
$$I_{face}^{r} = \frac{2}{w_f \cdot h_f} \sum_{p \in R_{face}^{r}} I_p$$

wherein $I_{face}^{r}$ denotes the average brightness value of the right face region, $R_{face}^{r}$ the right face region, $w_f$ the length and $h_f$ the width of the human face region, $p$ any pixel of the gray-scale certificate photo image to be detected, and $I_p$ the pixel intensity initial value of that gray-scale image;
Average brightness value based on the left face region and the average brightness value in the right face region, calculate institute according to equation below State left and right face luminance difference value:
$$DI_{face} = \left| I_{face}^{l} - I_{face}^{r} \right|$$

wherein $DI_{face}$ denotes the left and right face luminance difference value, $I_{face}^{l}$ the average brightness value of the left face region and $I_{face}^{r}$ the average brightness value of the right face region;
(3) based on the centre coordinates of the two eye regions obtained in the human face region, the face inclination angle value is calculated according to the following formula:

$$Ang = \operatorname{atan2}\!\left(y_{eye2} - y_{eye1},\; x_{eye1} - x_{eye2}\right)$$

wherein $Ang$ denotes the face inclination angle value, $(x_{eye1}, y_{eye1})$ the centre coordinates of the first eye region and $(x_{eye2}, y_{eye2})$ the centre coordinates of the second eye region.
3. detection method as claimed in claim 1, it is characterised in that the pixel based in the background area it is red green Blue color value sets up gauss of distribution function, determines the colour consistency parameter of the background area according to the gauss of distribution function Specifically include:
Obtain the RGB color value of all pixels in the certificate photo image background regions to be detected;
Calculating of averaging is carried out to the RGB color value of all pixels, background pixel average color is obtained;
Covariance matrix is calculated based on the RGB color value of the pixel;
Based on the background pixel average color and the covariance matrix, gauss of distribution function is set up;
The eigenvalue of maximum of the covariance matrix is calculated, root operation of making even is carried out based on the eigenvalue of maximum, obtain institute State the colour consistency parameter of background area.
4. detection method as claimed in claim 3, it is characterised in that described based on the gauss of distribution function and the clothing The pixel color in region is taken, the color degree of fitting parameter calculated between the Garment region and background area is specifically included:
Obtain the RGB color value of all pixels in the Garment region;
Based on the Gaussian distribution function and the RGB color values of the pixels in the Garment region, the degree of fitting between the RGB color value of each pixel in the Garment region and the background color is calculated according to the following formula:

$$P\!\left(C_{cloth}^{i}\right) = \frac{1}{\sqrt{(2\pi)^{3}\,|\Sigma|}} \exp\!\left(-\tfrac{1}{2}\,\left(C_{cloth}^{i} - \bar{C}\right)^{T}\,\Sigma^{-1}\,\left(C_{cloth}^{i} - \bar{C}\right)\right)$$

wherein $P(C_{cloth}^{i})$ denotes the degree of fitting between the RGB color value of the $i$-th pixel in the Garment region and the background color, $\bar{C}$ the background pixel average color value, $\Sigma$ the covariance matrix, and $C_{cloth}^{i}$ the color value of the $i$-th pixel in the Garment region;
The color degree of fitting parameter between the Garment region and background area is calculated according to equation below:
$$P_{cloth} = \frac{1}{N_{cloth}} \sum_{i \in R_{cloth}} P\!\left(C_{cloth}^{i}\right)$$

wherein $P_{cloth}$ denotes the color degree of fitting parameter between the Garment region and the background area, $N_{cloth}$ the gross area of the Garment region, $P(C_{cloth}^{i})$ the degree of fitting between the RGB color value of each pixel in the Garment region and the background color, and $R_{cloth}$ the coordinates of the Garment region.
5. detection method as claimed in claim 1, it is characterised in that the face in based on the certificate photo image to be detected for obtaining The coordinate in region, is positioned respectively to the coordinate of Garment region in the certificate photo image to be detected and the coordinate of background area Before, methods described also includes:
Certificate photo image to be detected is received, according to default method for detecting human face to the face in the certificate photo image to be detected Area coordinate and eye areas coordinate carry out coordinate setting respectively.
6. the detection method as described in claim 1 to 5 any one, it is characterised in that described averagely bright based on the face Angle value, left and right face luminance difference value, face inclination angle value, the colour consistency parameter of background area, background area it is smooth Color degree of fitting parameter of the property between parameter and Garment region and background area, determines commenting for the certificate photo image to be detected Valency result is specifically included:
Based on the face average brightness value, left and right face luminance difference value, face inclination angle value, background area solid colour Property parameter, the slickness parameter of background area and the color degree of fitting parameter between Garment region and background area, are counted respectively Calculate corresponding fraction;
Based on the fraction of the face brightness, human face light uniformity fraction, face gradient fraction, the background area color Monotonicity fraction, the background area structure monotonicity fraction and the Garment region color-identifying degree fraction, according to default Evaluation criterion is evaluated to the certificate photo image to be detected, obtains evaluation result.
7. detection method as claimed in claim 6, it is characterised in that described based on the face average brightness value, left and right face Luminance difference value, face inclination angle value, the colour consistency parameter of background area, the slickness parameter of background area and clothing The color degree of fitting parameter between region and background area is taken, corresponding fraction is calculated respectively and is specifically included:
The output value range of the grading standard is guaranteed to lie between 0 and 100, the criterion of the grading standard applying a truncation function $T$ that clamps each score to this range;
(1) fraction of the face brightness is calculated according to equation below:
$$S_{1} = T\!\left(100 - \left|170 - I_{face}\right| \times 0.45\right)$$

wherein $S_1$ denotes the fraction of the face brightness and $I_{face}$ the face average brightness value;
(2) the human face light uniformity fraction is calculated according to equation below:
$$S_{DI} = T\!\left(100 - DI_{face}\right)$$

wherein $S_{DI}$ denotes the human face light uniformity fraction and $DI_{face}$ the left and right face luminance difference value;
(3) the face gradient fraction is calculated according to equation below:
$$S_{Ang} = T\!\left(100 - \frac{2\,|Ang|}{\pi}\right)$$

wherein $S_{Ang}$ denotes the face gradient fraction and $Ang$ the face inclination angle value;
(4) the background area color monotonicity fraction is calculated according to equation below:
$$S_{BC} = 100 \cdot \exp\!\left(-\sigma_{max} / 20\right)$$

wherein $S_{BC}$ denotes the background area color monotonicity fraction and $\sigma_{max}$ the colour consistency parameter;
(5) the background area structure monotonicity fraction is calculated according to equation below:
$$S_{BE} = T\!\left(100 - R_e \times 2000\right)$$

wherein $S_{BE}$ denotes the background area structure monotonicity fraction and $R_e$ the background area slickness parameter;
(6) the Garment region color-identifying degree fraction is calculated according to equation below:
$$S_{cloth} = T\!\left(100 - P_{cloth} \times 100\right)$$

wherein $S_{cloth}$ denotes the Garment region color-identifying degree fraction and $P_{cloth}$ the color degree of fitting parameter between the Garment region and the background area.
8. A quality detection device for a certificate photo, characterised in that the device comprises:
Locating module, for the coordinate based on human face region in the certificate photo image to be detected for obtaining, to the certificate to be detected The coordinate of coordinate and background area according to Garment region in image is positioned respectively;
Face characteristic extraction module, for the coordinate based on the human face region and the coordinate of the eye areas for obtaining, is counted respectively Calculate face average brightness value, left and right face luminance difference value and face inclination angle value;
Background characteristics extraction module, sets up Gauss distribution letter for the RGB color value based on the pixel in the background area Number, determines the colour consistency parameter of the background area according to the gauss of distribution function, and the background area is carried out Rim detection, calculates the slickness parameter of the background area based on the edge detection results;
Garment features extraction module, for the pixel color based on the gauss of distribution function and the Garment region, calculates Color degree of fitting parameter between the Garment region and background area;
Evaluation module, for based on the face average brightness value, left and right face luminance difference value, face inclination angle value, background The fitting of the colour consistency parameter in region, the slickness parameter of background area and the color between Garment region and background area Degree parameter, determines the evaluation result of the certificate photo image to be detected, by the evaluation result output display.
9. detection means as claimed in claim 8, it is characterised in that the face characteristic extraction module includes:
Conversion module, for converting the certificate photo image to be detected into a gray-scale map and obtaining the pixel intensity initial values of the gray-scale certificate photo image to be detected;

Face brightness calculation module, for calculating the face average brightness value, based on the pixel intensity initial values of the gray-scale certificate photo image to be detected and the coordinates of the human face region, according to the following formula:

$$I_{face} = \frac{1}{w_f \cdot h_f} \sum_{p \in R_{face}} I_p$$

wherein $I_{face}$ denotes the face average brightness value, $w_f$ the length and $h_f$ the width of the human face region, $R_{face}$ the coordinates of the human face region, $p$ any pixel of the gray-scale certificate photo image to be detected, and $I_p$ the pixel intensity initial value of that gray-scale image;
Face luminance difference computing module, for dividing the human face region equally into two sub-regions of identical size, the two sub-regions being a left face region and a right face region respectively, the face luminance difference computing module including a left face brightness calculation module, a right face brightness calculation module and a left and right face luminance difference computing module;

The left face brightness calculation module, for calculating, based on the coordinates of the human face region, the average brightness value of the left face region according to the following formula:

$$I_{face}^{l} = \frac{2}{w_f \cdot h_f} \sum_{p \in R_{face}^{l}} I_p$$

wherein $I_{face}^{l}$ denotes the average brightness value of the left face region, $R_{face}^{l}$ the coordinates of the left face region, $w_f$ the length and $h_f$ the width of the human face region, $p$ any pixel of the gray-scale certificate photo image to be detected, and $I_p$ the pixel intensity initial value of that gray-scale image;
The right face brightness calculation module, for calculating, based on the coordinates of the human face region, the average brightness value of the right face region according to the following formula:

$$I_{face}^{r} = \frac{2}{w_f \cdot h_f} \sum_{p \in R_{face}^{r}} I_p$$

wherein $I_{face}^{r}$ denotes the average brightness value of the right face region, $R_{face}^{r}$ the right face region, $w_f$ the length and $h_f$ the width of the human face region, $p$ any pixel of the gray-scale certificate photo image to be detected, and $I_p$ the pixel intensity initial value of that gray-scale image;
The left and right face luminance difference computing module, for calculating the left and right face luminance difference value, based on the average brightness value of the left face region and the average brightness value of the right face region, according to the following formula:

$$DI_{face} = \left| I_{face}^{l} - I_{face}^{r} \right|$$

wherein $DI_{face}$ denotes the left and right face luminance difference value, $I_{face}^{l}$ the average brightness value of the left face region and $I_{face}^{r}$ the average brightness value of the right face region;
Face angle of inclination computing module, for calculating the face inclination angle value, based on the centre coordinates of the two eye regions obtained in the human face region, according to the following formula:

$$Ang = \operatorname{atan2}\!\left(y_{eye2} - y_{eye1},\; x_{eye1} - x_{eye2}\right)$$

wherein $Ang$ denotes the face inclination angle value, $(x_{eye1}, y_{eye1})$ the centre coordinates of the first eye region and $(x_{eye2}, y_{eye2})$ the centre coordinates of the second eye region.
10. detection means as claimed in claim 8, it is characterised in that the background characteristics extraction module includes:
Colour consistency parameter calculating module, for obtaining all pixels in the certificate photo image background regions to be detected RGB color value, carries out calculating of averaging to the RGB color value of all pixels, obtains the average face of background pixel Colour, is calculated covariance matrix based on the RGB color value of the pixel, based on the background pixel average color With the covariance matrix, gauss of distribution function is set up, calculate the eigenvalue of maximum of the covariance matrix, based on the maximum Eigenvalue carries out root operation of making even, and obtains the colour consistency parameter of the background area;
Slickness parameter calculating module, for carrying out rim detection to the background area, based on the edge detection results meter Calculate the slickness parameter of the background area.
11. detection means as claimed in claim 10, it is characterised in that the garment features extraction module specifically for:
Obtain the RGB color value of all pixels in the Garment region;
Based on the Gaussian distribution function and the RGB color values of the pixels in the Garment region, the degree of fitting between the RGB color value of each pixel in the Garment region and the background color is calculated according to the following formula:

$$P\!\left(C_{cloth}^{i}\right) = \frac{1}{\sqrt{(2\pi)^{3}\,|\Sigma|}} \exp\!\left(-\tfrac{1}{2}\,\left(C_{cloth}^{i} - \bar{C}\right)^{T}\,\Sigma^{-1}\,\left(C_{cloth}^{i} - \bar{C}\right)\right)$$

wherein $P(C_{cloth}^{i})$ denotes the degree of fitting between the RGB color value of the $i$-th pixel in the Garment region and the background color, $\bar{C}$ the background pixel average color value, $\Sigma$ the covariance matrix, and $C_{cloth}^{i}$ the color value of the $i$-th pixel in the Garment region;
The color degree of fitting parameter between the Garment region and background area is calculated according to equation below:
$$P_{cloth} = \frac{1}{N_{cloth}} \sum_{i \in R_{cloth}} P\!\left(C_{cloth}^{i}\right)$$

wherein $P_{cloth}$ denotes the color degree of fitting parameter between the Garment region and the background area, $N_{cloth}$ the gross area of the Garment region, $P(C_{cloth}^{i})$ the degree of fitting between the RGB color value of each pixel in the Garment region and the background color, and $R_{cloth}$ the coordinates of the Garment region.
12. detection means as claimed in claim 8, it is characterised in that described device also includes Primary Location module;
The Primary Location module, for receiving certificate photo image to be detected, treats to described according to default method for detecting human face Human face region coordinate and eye areas coordinate in detection certificate photo image carries out coordinate setting respectively.
13. detection means as described in any one of claim 8 to 12, it is characterised in that the evaluation module includes:
Grading module, for based on the face average brightness value, left and right face luminance difference value, face inclination angle value, background The fitting of the colour consistency parameter in region, the slickness parameter of background area and the color between Garment region and background area Degree parameter, calculates corresponding fraction respectively;
Evaluate submodule, for based on the fraction of the face brightness, human face light uniformity fraction, face gradient fraction, The background area color monotonicity fraction, the background area structure monotonicity fraction and the Garment region color-identifying degree Fraction, evaluates to the certificate photo image to be detected according to default evaluation criterion, obtains evaluation result;
Output module, for by the evaluation result output display.
14. The detection means as claimed in claim 13, characterised in that the grading module comprises:

Standards-of-grading formulating module, for determining that the output value range of the grading standard is guaranteed to lie between 0 and 100, the criterion of the grading standard applying a truncation function $T$ that clamps each score to this range;
Face brightness fraction computing module, for calculating the fraction of the face brightness according to the following formula:

$$S_{1} = T\!\left(100 - \left|170 - I_{face}\right| \times 0.45\right)$$

wherein $S_1$ denotes the fraction of the face brightness and $I_{face}$ the face average brightness value;
Human face light uniformity fraction computing module, for calculating the human face light uniformity fraction according to equation below:
$$S_{DI} = T\!\left(100 - DI_{face}\right)$$

wherein $S_{DI}$ denotes the human face light uniformity fraction and $DI_{face}$ the left and right face luminance difference value;
Face gradient fraction computing module, for calculating the face gradient fraction according to equation below:
$$S_{Ang} = T\!\left(100 - \frac{2\,|Ang|}{\pi}\right)$$

wherein $S_{Ang}$ denotes the face gradient fraction and $Ang$ the face inclination angle value;
Background color monotonicity fraction computing module, for calculating the background area color monotonicity fraction according to the following formula:

$$S_{BC} = 100 \cdot \exp\!\left(-\sigma_{max} / 20\right)$$

wherein $S_{BC}$ denotes the background area color monotonicity fraction and $\sigma_{max}$ the colour consistency parameter;
Background structure monotonicity fraction computing module, for calculating the background area structure monotonicity fraction according to the following formula:

$$S_{BE} = T\!\left(100 - R_e \times 2000\right)$$

wherein $S_{BE}$ denotes the background area structure monotonicity fraction and $R_e$ the background area slickness parameter;
Garment region color-identifying degree fraction computing module, for calculating the Garment region color-identifying degree fraction according to the following formula:

$$S_{cloth} = T\!\left(100 - P_{cloth} \times 100\right)$$

wherein $S_{cloth}$ denotes the Garment region color-identifying degree fraction and $P_{cloth}$ the color degree of fitting parameter between the Garment region and the background area.
CN201610927700.4A 2016-10-31 2016-10-31 The quality determining method and detection means of a kind of certificate photo Active CN106558046B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610927700.4A CN106558046B (en) 2016-10-31 2016-10-31 The quality determining method and detection means of a kind of certificate photo


Publications (2)

Publication Number Publication Date
CN106558046A true CN106558046A (en) 2017-04-05
CN106558046B CN106558046B (en) 2018-02-06



Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010005222A1 (en) * 1999-12-24 2001-06-28 Yoshihiro Yamaguchi Identification photo system and image processing method
CN105120167A (en) * 2015-08-31 2015-12-02 广州市幸福网络技术有限公司 Certificate picture camera and certificate picture photographing method


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109165674A (en) * 2018-07-19 2019-01-08 南京富士通南大软件技术有限公司 A kind of certificate photo classification method based on multi-tag depth convolutional network
CN110889470A (en) * 2018-09-07 2020-03-17 京东数字科技控股有限公司 Method and apparatus for processing image
CN108921148A (en) * 2018-09-07 2018-11-30 北京相貌空间科技有限公司 Determine the method and device of positive face tilt angle
CN110889470B (en) * 2018-09-07 2023-11-07 京东科技控股股份有限公司 Method and apparatus for processing image
CN109359703A (en) * 2018-12-19 2019-02-19 佛山科学技术学院 A kind of ceramic tile grade separation method based on decision tree
CN110648321A (en) * 2019-09-24 2020-01-03 西北工业大学 Method for evaluating uniformity of temperature inside oven
CN110706296B (en) * 2019-10-11 2023-06-16 北京弘远博学科技有限公司 Batch automatic detection method for background color compliance of electronic certificate photos
CN110706296A (en) * 2019-10-11 2020-01-17 北京弘远博学科技有限公司 Batch automatic detection method for background color compliance of electronic certificate photos
CN112700396A (en) * 2019-10-17 2021-04-23 中国移动通信集团浙江有限公司 Illumination evaluation method and device for face picture, computing equipment and storage medium
CN111429535A (en) * 2020-03-13 2020-07-17 深圳市雄帝科技股份有限公司 Method, system, device and medium for evaluating difference degree between clothes and background in image
CN111429535B (en) * 2020-03-13 2023-09-08 深圳市雄帝科技股份有限公司 Method, system, equipment and medium for evaluating difference degree between clothes and background in image
CN112883771A (en) * 2020-09-17 2021-06-01 密传金 Face image quality detection method
CN112991159A (en) * 2021-04-29 2021-06-18 南京甄视智能科技有限公司 Face illumination quality evaluation method, system, server and computer readable medium
CN113724130A (en) * 2021-08-20 2021-11-30 深圳市飘飘宝贝有限公司 Width-variable portrait fine matting method, device, equipment and storage medium
CN113724130B (en) * 2021-08-20 2022-04-19 深圳市飘飘宝贝有限公司 Width-variable portrait fine matting method, device, equipment and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: A1-603, Building A, Kexing Science Park, No. 15 Keyuan Road, Science Park Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518035

Patentee after: Shenzhen 666 Network Service Co.,Ltd.

Address before: 518000 Science and Technology Building 401K, No. 9 Scientific Research Road, Nanshan Street, Nanshan District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN PIAOPIAO BAOBEI CO.,LTD.