CN107451555B: Hair direction judging method based on gradient direction (Google Patents)

Publication number: CN107451555B (application CN201710626890.0A)
Authority: CN (China)
Prior art keywords: gradient, hair, point, region, image
Prior art date: 2017-07-27
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications

- G PHYSICS > G06 COMPUTING; CALCULATING; COUNTING > G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS > G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints > G06K9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions > G06K9/00268 Feature extraction; face representation > G06K9/00281 Local features and components; facial parts; occluding parts, e.g. glasses; geometrical relationships
- G06K9/00 > G06K9/20 Image acquisition > G06K9/32 Aligning or centering of the image pickup or image field > G06K9/3233 Determination of region of interest > G06K9/3241 Recognising objects as potential recognition candidates based on visual cues, e.g. shape
Abstract
The invention relates to a hair direction judging method based on gradient direction, comprising the following steps: (1) detecting the 68 feature points of the face in a face portrait picture using the Dlib facial feature point detection algorithm; (2) rotating the image according to the 68 feature points so that the face is horizontal in the image, then cropping the whole photo region of the face including the hair; (3) cropping the hair-region image from the transformed complete face region, then calculating the gradient direction of each pixel of that region; (4) traversing the gradient direction of every pixel of the whole hair region and judging the hair direction from the change of the gradient direction. The method can process pictures shot at any angle and has high judgment accuracy; because the gradient direction is calculated and the gradient information traversed only within the hair region, the overall computation of the algorithm is effectively reduced and the processing speed improved.
Description
Technical Field
The invention relates to the technical field of face recognition, in particular to a hair direction judging method based on a gradient direction.
Background
Face portrait processing is an important research direction in the field of image processing. In recent years, the various face beautifying functions that mobile phone apps provide on the basis of face portrait processing have become very popular with users, particularly female users.
Hair processing based on the face portrait can provide rich face-beautifying functions, for example hair dyeing and hair replacement in face beautification, or hair style selection and hairline generation in face cartoons. Hair processing largely determines the overall effect of face beautification and offers users a rich, comprehensive beautifying experience.
Judging the hair direction is the basis of hair processing in the face portrait, and can guide subsequent hair-related processing such as hair replacement, hair dyeing and hair style selection. The existing hair direction judging method mainly binarizes the face and hair region image, extracts the edge lines of the image region above the eyebrows, and judges the hair direction from the bifurcation of the hair edge lines. That method is strongly affected by the illumination environment, cannot handle the branching of many complex edge lines, and has low accuracy.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a method for judging hair direction based on gradient direction, which can process images shot at any angle, and has high accuracy and high processing speed.
In order to achieve the technical purpose, the invention provides a hair direction judging method based on a gradient direction, which specifically comprises the following steps:
(1) detecting the 68 feature points of the face (marked B1-B68) in the face portrait picture using a Dlib face feature point detection algorithm;
(2) rotating the image according to 68 characteristic points of the face to enable the face to be in the horizontal position of the image, and then intercepting an integral photo area of the face containing hair;
(3) cutting out the image of the hair region aiming at the transformed complete face region, and then calculating the gradient direction of each pixel point of the image of the region;
(4) traversing the gradient direction of each pixel of the whole hair region, and judging the hair direction from the change of the gradient direction.
Specifically, the step (2) includes the steps of:
(2.1) calculating the included angle α between the horizontal line and the straight line through points B1 and B17 of the face contour;
(2.2) rotating the image clockwise by α to obtain a horizontal face image, and correcting the coordinates of the 68 feature points accordingly;
(2.3) determining the coordinates (X_C, Y_C) of the face center point C;
(2.4) calculating the X coordinates of the upper-left and lower-right corners of the rectangular face region from the coordinates of point C and the coordinates of points B1 and B17;
(2.5) calculating the Y coordinates of the upper-left and lower-right corners of the rectangular face region from the coordinates of point C and the coordinates of points B1 and B17;
(2.6) determining and cropping the face-and-hair region image from the upper-left and lower-right corner coordinates.
Specifically, the step (3) includes the steps of:
(3.1) determining a hair region in the face image;
(3.2) calculating the gradient angle θ of each pixel in the hair region;
(3.3) obtaining the gradient-angle matrix of the pixels in the hair region.
Specifically, the step (4) includes the steps of:
(4.1) normalizing the gradient angle matrix;
(4.2) dividing the hair region image into 2N small regions (N a positive integer), calculating the mean gradient angle of each region, and normalizing the mean;
(4.3) calculating the mean gradient angles of the N regions on the left and the N regions on the right;
(4.4) judging the hair direction from the mean gradient angles of the N regions on each side.
Specifically, the angle α = atan((Y_B17 - Y_B1) / (X_B17 - X_B1)), where (X_B1, Y_B1) are the coordinates of point B1 and (X_B17, Y_B17) those of point B17.
Specifically, the X coordinates of the upper-left and lower-right corners of the rectangular face region are calculated as follows: the upper-left X coordinate X_L can be calculated as X_L = X_C - 1.3 × (X_C - X_B1); if X_L < 0, then X_L = 0. The lower-right X coordinate X_R can be calculated as X_R = X_C + 1.3 × (X_B17 - X_C); if the calculated X_R is greater than the image width Width, then X_R = Width. The coordinates of point C are ((X_B1 + X_B17) / 2, (Y_B1 + Y_B17) / 2).
Specifically, the Y coordinates of the upper-left and lower-right corners of the rectangular face region are calculated as follows: the upper-left Y coordinate Y_L can be calculated as Y_L = Y_C - 1.5 × (Y_B9 - Y_C), where B9 is the face-contour point marking the chin boundary; if Y_L < 0, then Y_L = 0. The lower-right Y coordinate Y_R can be calculated as Y_R = Y_C + 1.5 × (Y_B9 - Y_C); if the calculated Y_R is greater than the image height Height, then Y_R = Height. The coordinates of point C are ((X_B1 + X_B17) / 2, (Y_B1 + Y_B17) / 2).
Specifically, the gradient angle of each pixel in the hair region is calculated as follows: let pixel P_i = (i, j) have pixel value I(i, j). First compute the gradient of P_i in the X direction, D_x(i, j), and in the Y direction, D_y(i, j): D_x(i, j) = I(i+1, j) - I(i-1, j), D_y(i, j) = I(i, j+1) - I(i, j-1). The gradient angle of P_i is θ = atan2(D_y(i, j), D_x(i, j)).
Specifically, the gradient-angle matrix is normalized as follows: when θ lies in the range indicating a rightward slope, θ is set to 1; when θ lies in the range indicating a leftward slope, θ is set to -1; otherwise the pixel is considered unbiased and θ is set to 0.
Specifically, the hair region image is divided into 12 small regions as follows: taking the midpoint C of feature points B1 and B17 as the centre, sector regions are drawn at 15° intervals starting from point B1 and sweeping to point B17, giving exactly 12 sectors, marked S1 to S12. The mean gradient angle of each sector is calculated and normalized as follows: the gradient angles of the pixels in sector Si are summed and divided by the number of pixels, giving the mean gradient value Ti of Si; when 0.3 < Ti ≤ 1, the region is considered biased to the right and Ti is set to 1; when -1 ≤ Ti ≤ -0.3, the region is considered biased to the left and Ti is set to -1; when -0.3 < Ti ≤ 0.3, the region is considered unbiased and Ti is set to 0.
According to the above technical scheme, the gradient-direction-based hair direction judging method detects the 68 feature points of the face in the face portrait picture with the Dlib facial feature point detection algorithm and rotates the image according to these feature points so that the face portrait is horizontal, so images shot at any angle can be processed. During processing, the face image region containing the hair is cropped from the whole image, and during hair direction judgment the gradient-direction calculation and the traversal of the gradient information are carried out only within the hair region, which effectively reduces the overall computation of the algorithm and improves the processing speed. Because the gradient direction of the hair region reflects the run of the hair well, the hair direction can be naturally estimated from the change of the gradient direction.
Drawings
FIG. 1 is a flow chart of the steps of the present invention.
FIG. 2 is a detailed flowchart of step (2) of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
Referring to FIGS. 1-2, the gradient-direction-based hair direction judging method provided in this embodiment includes the following steps:
(1) detecting the 68 feature points of the face in the face portrait picture using the Dlib facial feature point detection algorithm. Face feature point detection can be completed with the feature point detection algorithm of the open-source library Dlib: a Dlib library function detects the 68 feature points of a face with fast detection and accurate point positions, and is a currently popular face feature point detection method. 17 points mark the face contour edge, 10 the two eyebrows, 12 the two eyes, 9 the nose and 20 the mouth, so the positions of the facial organs can be conveniently determined from the 68 points;
(2) marking the 68 feature points detected by Dlib as B1-B68, the 17 points of the face contour edge being B1-B17. From the straight line through B1 and B17 the slope (Y_B17 - Y_B1) / (X_B17 - X_B1) is calculated, from which the angle between this line and the horizontal, α = atan((Y_B17 - Y_B1) / (X_B17 - X_B1)), is obtained, where (X_B1, Y_B1) are the coordinates of point B1 and (X_B17, Y_B17) those of point B17;
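The rotation angle of step (2) reduces to a single arctangent. A minimal sketch in plain Python; the tuple-based landmark representation is an assumption, since the patent only specifies Dlib's 68-point output:

```python
import math

def rotation_angle(b1, b17):
    """Angle (radians) between the B1-B17 face-contour line and the
    horizontal, from the slope (Y_B17 - Y_B1) / (X_B17 - X_B1)."""
    return math.atan2(b17[1] - b1[1], b17[0] - b1[0])
```

Rotating the image clockwise by this angle, as in step (3) below, levels the B1-B17 line.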
(3) rotating the whole picture clockwise by the angle α about B1 as centre to obtain a corrected face image with the face in horizontal position, and simultaneously rotating the 68 feature points clockwise by α to obtain the corrected feature points of the horizontal face;
(4) the midpoint of the 1st point B1 and the 17th point B17 among the corrected 68 feature points is marked as point C; point C is the centre point of the whole hair-and-face image, and its coordinates can be calculated as ((X_B1 + X_B17) / 2, (Y_B1 + Y_B17) / 2);
(5) calculating the X coordinates of the upper-left and lower-right corner points of the rectangular frame of the cropped face-and-hair region from the coordinates of point C and of the face-edge start point B1 and end point B17: the upper-left X coordinate X_L can be calculated as X_L = X_C - 1.3 × (X_C - X_B1); if X_L < 0, then X_L = 0; the lower-right X coordinate X_R can be calculated as X_R = X_C + 1.3 × (X_B17 - X_C); if the calculated X_R is greater than the image width Width, then X_R = Width;
(6) calculating the Y coordinates of the upper-left and lower-right corner points of the rectangular frame of the cropped face-and-hair region from the coordinates of point C and of points B1 and B17: the upper-left Y coordinate Y_L can be calculated as Y_L = Y_C - 1.5 × (Y_B9 - Y_C), where B9 is the face-contour point marking the chin boundary; if Y_L < 0, then Y_L = 0; the lower-right Y coordinate Y_R can be calculated as Y_R = Y_C + 1.5 × (Y_B9 - Y_C); if the calculated Y_R is greater than the image height Height, then Y_R = Height;
(7) the rectangle marked by the upper-left corner (X_L, Y_L) and the lower-right corner (X_R, Y_R) is the face-and-hair region image; this image region is scaled so that the scaled image width is 480 pixels;
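Steps (5) to (7) amount to one clamped box computation. A sketch using the patent's 1.3x horizontal and 1.5x vertical margins; the function name and argument layout are illustrative, not from the patent:

```python
def face_hair_box(c, b1, b17, b9, width, height):
    """Crop rectangle for the face-and-hair region using the patent's
    1.3x horizontal and 1.5x vertical margins, clamped to the image."""
    xc, yc = c
    x_l = max(0.0, xc - 1.3 * (xc - b1[0]))            # upper-left X
    x_r = min(float(width), xc + 1.3 * (b17[0] - xc))  # lower-right X
    y_l = max(0.0, yc - 1.5 * (b9[1] - yc))            # upper-left Y, B9 = chin
    y_r = min(float(height), yc + 1.5 * (b9[1] - yc))  # lower-right Y
    return x_l, y_l, x_r, y_r
```

The clamps implement the patent's boundary cases (X_L < 0, X_R > Width, and the analogous Y conditions).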
(8) a feature point B_i with original coordinates (X_Bi, Y_Bi) has coordinates (X'_Bi, Y'_Bi) in the new cropped and scaled image, which can be calculated as X'_Bi = (X_Bi - X_L) × 480 / (X_R - X_L) for the X coordinate and Y'_Bi = (Y_Bi - Y_L) × 480 / (X_R - X_L) for the Y coordinate;
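Correcting a landmark into the cropped, width-480 image is a translate-then-scale. A sketch under the assumption that both axes use the same scale factor, since the patent fixes only the scaled width; the function name is illustrative:

```python
def rescale_landmark(p, x_l, y_l, x_r, target_w=480.0):
    """Map an original-image point into the cropped image scaled to
    target_w pixels wide; the uniform scale factor is an assumption."""
    s = target_w / (x_r - x_l)           # scale so the crop is target_w wide
    return ((p[0] - x_l) * s, (p[1] - y_l) * s)
```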
(9) taking the area above the ears as the hair region: the part of the image with Y coordinate less than Y = max(Y'_B1, Y'_B17) is cropped as the hair region;
(10) calculating the gradient angle of each pixel (i, j) in the hair region, where the pixel value at (i, j) is I(i, j): first compute the gradient in the X direction, D_x(i, j), and in the Y direction, D_y(i, j), as D_x(i, j) = I(i+1, j) - I(i-1, j) and D_y(i, j) = I(i, j+1) - I(i, j-1); the gradient angle of the pixel is then θ = atan2(D_y(i, j), D_x(i, j));
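The central-difference gradient of step (10) at an interior pixel, written out in plain Python; representing the image as a 2-D list indexed img[y][x] is an assumption for illustration:

```python
import math

def gradient_angle(img, x, y):
    """Gradient angle at interior pixel (x, y) by central differences:
    D_x = I(x+1, y) - I(x-1, y), D_y = I(x, y+1) - I(x, y-1),
    theta = atan2(D_y, D_x)."""
    d_x = img[y][x + 1] - img[y][x - 1]
    d_y = img[y + 1][x] - img[y - 1][x]
    return math.atan2(d_y, d_x)
```

On a horizontal brightness ramp the angle is 0; on a vertical ramp it is π/2, matching the atan2 convention.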
(11) obtaining the gradient-angle matrix of the new image; in the region where the Y coordinate is greater than Y = max(Y'_B1, Y'_B17), i.e. outside the hair region, the gradient angle θ of each pixel is set to 0;
(12) normalizing the gradient-angle matrix of the image by normalizing the gradient angle of each pixel: when θ lies in the range indicating a rightward slope, θ is set to 1; when θ lies in the range indicating a leftward slope, θ is set to -1; otherwise the pixel is considered unbiased and θ is set to 0;
(13) taking the midpoint C of feature points B1 and B17 as the centre, sector regions are drawn at 15° intervals starting from point B1 and sweeping to point B17, giving exactly 12 sectors, marked S1 to S12; according to the normalized gradient-angle matrix, the gradient angles of the pixels in each sector Si are summed and divided by the number of pixels to obtain the mean gradient value Ti of Si;
(14) normalizing the mean gradient value Ti of each of the 12 sector regions: when 0.3 < Ti ≤ 1, the region is considered biased to the right and Ti is set to 1; when -1 ≤ Ti ≤ -0.3, the region is considered biased to the left and Ti is set to -1; when -0.3 < Ti ≤ 0.3, the region is considered unbiased and Ti is set to 0;
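The 0.3-threshold normalization of step (14) is a three-way bucketing. A minimal sketch; the function name is illustrative:

```python
def normalize_bias(t):
    """Map a sector's mean gradient value to +1 (right bias), -1 (left
    bias) or 0 (no bias) using the patent's 0.3 threshold."""
    if 0.3 < t <= 1:
        return 1
    if -1 <= t <= -0.3:
        return -1
    return 0
```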
(15) summing the mean gradient values of the 6 sector regions in the left half of the hair image gives T_L; summing those of the 6 sector regions in the right half gives T_R; T_L and T_R are then normalized: a value less than 0 indicates that the hair is biased to the left and is set to -1; a value greater than 0 indicates that the hair is biased to the right and is set to 1;
(16) the hair direction is judged from the values of T_L and T_R. The hair distribution has 4 possible states: middle parting, left parting, right parting and no parting. When T_L = -1 and T_R = 1, a middle parting is judged; when T_L + T_R < 0, a left parting is judged; when T_L + T_R > 0, a right parting is judged; apart from the middle-parting case, T_L + T_R = 0 means the hair has no parting direction.
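The four-way decision of step (16) can be sketched directly; the English state names are illustrative translations:

```python
def parting_direction(t_l, t_r):
    """Classify the hair parting from the left/right half-region sums.

    t_l and t_r are first reduced to their sign (-1, 0 or +1), as in
    step (15), then mapped to one of the four states of step (16)."""
    s_l = (t_l > 0) - (t_l < 0)   # sign of the left-half gradient sum
    s_r = (t_r > 0) - (t_r < 0)   # sign of the right-half gradient sum
    if s_l == -1 and s_r == 1:    # checked first: these signs cancel in the sum
        return "middle parting"
    if s_l + s_r < 0:
        return "left parting"
    if s_l + s_r > 0:
        return "right parting"
    return "no parting"
```

The middle-parting test must come before the sum tests, because its signs add to zero just like the no-parting case.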
The invention uses the Dlib facial feature point detection algorithm to detect the 68 feature points of the face in the face portrait picture, and can process images shot at any angle. During processing, the face image region containing the hair is cropped from the whole image, and during hair direction judgment the gradient-direction calculation and the traversal of the gradient information are carried out only within the hair region, which effectively reduces the overall computation of the algorithm and improves the processing speed. Because the gradient direction of the hair region reflects the run of the hair well, the hair direction can be naturally estimated from the change of the gradient direction; on this principle the invention judges the hair direction accurately, with high accuracy, and has good application value.
The abovementioned embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solution of the present invention by those skilled in the art should fall within the protection scope defined by the claims of the present invention without departing from the spirit of the present invention.
Claims (8)
1. A hair direction judging method based on gradient direction is characterized in that: the method comprises the following steps:
(1) detecting the 68 feature points of the human face in the face portrait picture using a Dlib face feature point detection algorithm, and marking the 68 feature points in order as B1-B68;
(2) rotating the image according to 68 characteristic points of the face to enable the face to be in the horizontal position of the image, and then intercepting an integral photo area of the face containing hair;
(3) cropping the image of the hair region from the transformed complete face region, and then calculating the gradient direction of each pixel of that region image, wherein the step (3) comprises the following steps:
(3.1) determining a hair region in the face image;
(3.2) calculating the gradient angle theta of each pixel point in the hair region;
(3.3) obtaining a gradient angle matrix of the pixel points in the hair area;
(4) traversing the gradient direction of each pixel point of the whole hair area, and judging the direction of the hair according to the change condition of the gradient direction, wherein the step (4) comprises the following steps:
(4.1) normalizing the gradient angle matrix;
(4.2) dividing the hair region image into 2N small regions, wherein N is a positive integer, calculating a gradient angle average value of each region, and normalizing the average value;
(4.3) calculating the average value of the gradient angles of the left N blocks of regions and the right N blocks of regions;
(4.4) judging the hair direction from the mean gradient angles of the N regions on the left and right sides.
2. The method for judging the direction of hair separation based on a gradient direction according to claim 1, characterized in that: the step (2) comprises the following steps:
(2.1) calculating an included angle alpha between a straight line formed by points B1 and B17 of the human face contour and a horizontal line;
(2.2) rotating the image clockwise by alpha to obtain an image of a horizontal face, and correcting 68 coordinates of the characteristic point of the face;
(2.3) determining the coordinates of the face center point C as ((X_B1 + X_B17) / 2, (Y_B1 + Y_B17) / 2), where (X_B1, Y_B1) are the coordinates of point B1 and (X_B17, Y_B17) those of point B17;
(2.4) calculating the X coordinates of the upper-left and lower-right corners of the rectangular face region from the coordinates of point C and the coordinates of points B1 and B17;
(2.5) calculating the Y coordinates of the upper-left and lower-right corners of the rectangular face region from the coordinates of point C and the coordinates of points B1 and B17;
(2.6) determining and cropping the face-and-hair region image from the upper-left and lower-right corner coordinates.
3. The method for judging the direction of hair separation based on a gradient direction according to claim 2, characterized in that: the angle α = atan((Y_B17 - Y_B1) / (X_B17 - X_B1)), where (X_B1, Y_B1) are the coordinates of point B1 and (X_B17, Y_B17) those of point B17.
4. The method for judging the direction of hair separation based on a gradient direction according to claim 2, characterized in that: the X coordinates of the upper-left and lower-right corners of the rectangular face region are calculated as follows: the upper-left X coordinate X_L can be calculated as X_L = X_C - 1.3 × (X_C - X_B1); if X_L < 0, then X_L = 0; the lower-right X coordinate X_R can be calculated as X_R = X_C + 1.3 × (X_B17 - X_C); if the calculated X_R is greater than the image width Width, then X_R = Width; the coordinates of point C are ((X_B1 + X_B17) / 2, (Y_B1 + Y_B17) / 2).
5. The method for judging the direction of hair separation based on a gradient direction according to claim 2, characterized in that: the Y coordinates of the upper-left and lower-right corners of the rectangular face region are calculated as follows: the upper-left Y coordinate Y_L can be calculated as Y_L = Y_C - 1.5 × (Y_B9 - Y_C), where B9 is the face-contour point marking the chin boundary; if Y_L < 0, then Y_L = 0; the lower-right Y coordinate Y_R can be calculated as Y_R = Y_C + 1.5 × (Y_B9 - Y_C); if the calculated Y_R is greater than the image height Height, then Y_R = Height; the coordinates of point C are ((X_B1 + X_B17) / 2, (Y_B1 + Y_B17) / 2).
6. The method for judging the direction of hair separation based on a gradient direction according to claim 1, characterized in that: the gradient angle θ of each pixel in the hair region is calculated as follows: let pixel P_i = (i, j) have pixel value I(i, j); first compute the gradient of P_i in the X direction, D_x(i, j), and in the Y direction, D_y(i, j): D_x(i, j) = I(i+1, j) - I(i-1, j), D_y(i, j) = I(i, j+1) - I(i, j-1); the gradient angle of the pixel is θ = atan2(D_y(i, j), D_x(i, j)).
7. The method for judging the direction of hair separation based on a gradient direction according to claim 1, characterized in that: the gradient-angle matrix is normalized as follows: when θ lies in the range indicating a rightward slope, θ is set to 1; when θ lies in the range indicating a leftward slope, θ is set to -1; otherwise the pixel is considered unbiased and θ is set to 0.
8. The method for judging the direction of hair separation based on a gradient direction according to claim 1, characterized in that: the hair region image is divided into 12 small regions, the dividing method being: taking the midpoint C of feature points B1 and B17 as the centre, sector regions are drawn at 15° intervals starting from point B1 and sweeping to point B17, giving exactly 12 sectors, marked S1 to S12; the mean gradient angle of each sector is calculated and normalized as follows: the gradient angles of the pixels in sector Si are summed and divided by the number of pixels, giving the mean gradient value Ti of Si; when 0.3 < Ti ≤ 1, the region is considered biased to the right and Ti is set to 1; when -1 ≤ Ti ≤ -0.3, the region is considered biased to the left and Ti is set to -1; when -0.3 < Ti ≤ 0.3, the region is considered unbiased and Ti is set to 0.
Priority application: CN201710626890.0A (CN), filed 2017-07-27, "Hair direction judging method based on gradient direction"
Publications: CN107451555A, published 2017-12-08; CN107451555B, granted 2020-08-25
Family ID: 60489739
Cited by (1)

- CN110245568B * (2019-05-15, 特斯联（北京）科技有限公司): Area security method and system based on face recognition
Citations (7)

- JP2006181100A * (2004-12-27, Kao Corp): Method of measuring shape of hair
- CN101163189A * (2006-10-13, 上海银晨智能识别科技有限公司): Face image correcting method
- CN101488224A * (2008-01-16, 中国科学院自动化研究所): Characteristic point matching method based on correlation measurement
- CN102436636A * (2010-09-29, 中国科学院计算技术研究所): Method and system for segmenting hair automatically
- CN103093488A * (2013-02-02, 浙江大学): Virtual haircut interpolation and tweening animation producing method
- CN105718869A * (2016-01-15, 网易（杭州）网络有限公司): Method and device for estimating face score in picture
- CN106503625A * (2016-09-28, 维沃移动通信有限公司): Method of detecting hair distribution and mobile terminal

Non-Patent Citations (4)

- Wenzhangzhi Guo, "Hair Segmentation Using Heuristically-Trained Neural Networks", MAS thesis, https://tspace.library.utoronto.ca/bitstream/1807/72673/3/Guo_Wenzhangzhi_201606_MAS_thesis.pdf, June 2016, sections 2.2.2 and 2.4 *
- Wenzhangzhi Guo et al., "Quantitative Evaluation of Hair Texture", 2015 IEEE International Symposium on Multimedia, 2015-12-16, pp. 77-80 *
- Deividas Skiparis, "Semantic face segmentation from video streams in the wild", http://sergioescalera.com/wp-content/uploads/2017/07/MAI_Thesis_DeividasSkiparis.pdf, 2017-06-16, p. 2 *
- Ma Xiao et al., "Face recognition method based on sparse representation of deep learning features" (基于深度学习特征的稀疏表示的人脸识别方法), CAAI Transactions on Intelligent Systems (智能系统学报), vol. 11, no. 3, 2016-05-13, pp. 279-286 *
Legal Events

- PB01: Publication
- SE01: Entry into force of request for substantive examination
- CB02: Change of applicant information. Applicant: ANHUI HUISHI JINTONG TECHNOLOGY Co.,Ltd. Address after: Yafu Park, Juchao Economic Development Zone, Chaohu City, Hefei City, Anhui Province, 230000. Address before: Room 602, District C, Hefei National University, Mount Huangshan Road, Hefei 230000, Anhui, China
- GR01: Patent grant