CN102184543A - Method of face and eye location and distance measurement - Google Patents

Method of face and eye location and distance measurement Download PDF

Info

Publication number
CN102184543A
Authority
CN
China
Prior art keywords
projection function
circle
eyes
integral projection
integral
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201110125628
Other languages
Chinese (zh)
Other versions
CN102184543B (en)
Inventor
陈国庆
赵军庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Original Assignee
SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SUZHOU LIANGJIANG TECHNOLOGY Co Ltd filed Critical SUZHOU LIANGJIANG TECHNOLOGY Co Ltd
Priority to CN 201110125628 priority Critical patent/CN102184543B/en
Publication of CN102184543A publication Critical patent/CN102184543A/en
Application granted granted Critical
Publication of CN102184543B publication Critical patent/CN102184543B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention relates to a method of face and eye location and distance measurement, belonging to the technical field of face detection and location. The method comprises the following steps: 1) find the vertical coordinate of the eyes using the circle difference operator projection function; 2) find the horizontal coordinates of the two eyes using the hybrid projection function; 3) display the integral projection curves in the two directions; 4) define the vertical coordinate of the eyes as y and, from the two peaks of the integral projection, the horizontal coordinates of the two eyes as x1 and x2; the distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y). The method improves the accuracy and speed of eye location and distance measurement in face detection.

Description

A method of face and eye location and distance measurement
Technical field
The present invention relates to a method of face and eye location and distance measurement, belonging to the technical field of face detection and location.
Background technology
Face detection is a very difficult task; to some extent its complexity even exceeds that of face recognition.
After years of research, many face detection methods have appeared. Because the eyes are one of the most salient facial features, eye location has become a key step in many face detection methods. Once the positions of the left and right eyes are determined, the position of the face is also essentially fixed, and from the length and direction of the line connecting the two eyes, the size and orientation of the face region can be roughly estimated as well.
Several effective eye locating methods already exist. For example, Bala et al. proposed an eye locating method based on genetic algorithms and decision trees: a hybrid genetic structure continuously evolves basic eye rules and finally yields decision-tree-form eye rules that can be used for eye location. Reinders et al. proposed an eye locating method based on neural networks, which takes the pixels of a search window as the network input; if the window contains an eye image, the network output is large. Wu and Zhou proposed an eye locating method based on intensity contrast, which exploits the strong gray-level contrast of the eye region to find the position of the eyes. However, these methods only give the approximate location of the eyes and cannot accurately locate the eye centers. To improve the accuracy of face detection, it is therefore necessary to study methods for precise eye location.
Projection is an effective way of extracting image features. In general, a two-dimensional image can be analyzed through two orthogonal one-dimensional projection functions. The reduction in dimensionality makes it easier to analyze image features and reduces the amount of computation, so projection has become an important image analysis method. Many researchers have successfully applied projection functions to locating facial features. Kanade was the first to apply the integral projection function to face recognition: he first binarized the original gray-scale image with the Laplace operator and then analyzed the binary image with the integral projection function. Brunelli and Poggio improved Kanade's algorithm by applying the integral projection function to edge maps, thereby determining the positions of the facial features. The concept of the variance projection function was first proposed by Feng and Yuen, who also proposed a simple method of locating eyes with the variance projection function; they later proposed a multi-cue eye locating method using an eye variance filter, which is itself built from the variance projection function. Projection is thus a commonly used localization technique in face recognition.
However, the eye detection result obtained by applying the traditional hybrid integral projection function over the whole face region is unsatisfactory. When the circle difference operator projection function alone is applied to the whole face, the detection of the eyes' vertical coordinate is relatively accurate, but the detection of the two eyes' horizontal coordinates is affected by the ears and the hair at the temples, and the speed is slow. There is therefore still room for improvement.
Summary of the invention
To solve the above problems, the present invention proposes a method of face and eye location and distance measurement that improves the accuracy and speed of eye location and distance measurement in face detection.
To solve its technical problem, the present invention adopts the following technical scheme:
A method of face and eye location and distance measurement comprises the following steps:
1) Use the circle difference operator projection function to find the vertical coordinate of the eyes.
Definition of the circle difference operator: establish an xy coordinate system on the image, with the origin usually at the center of the image. Let f(x, y) be the gray value of the image at coordinate point (x, y), and let h be the difference threshold. With (x, y) as the center, draw a circle of radius r; all the pixels close to the circumference form the set S on the circle, containing n pixels in total. Given the gray value f(x, y) of the center point, let n1 be the number of pixels in S whose gray value is greater than or equal to f(x, y) + h, let n2 be the number of pixels in S whose gray value is less than or equal to f(x, y) - h, and let Favg be the average gray value of all pixels in S. The three circle variation coefficients of the center point are defined as follows:
Circle dark variation coefficient: b(x, y) = n1/n;
Circle bright variation coefficient: c(x, y) = n2/n;
Circle mean difference coefficient: v(x, y) = Favg - f(x, y);
Circle difference operator CDO(x, y):
if v(x, y) <= 0 or b(x, y) < 0.6, then CDO(x, y) = 0;
if v(x, y) > 0 and b(x, y) >= 0.6, then CDO(x, y) = v(x, y);
In the algorithm, 1 <= h <= 10 and 2 <= r <= 10 are generally used; the concrete values are determined in practice.
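For illustration, a minimal NumPy sketch of the circle difference operator as defined above; sampling the circumference at a fixed number of angles, and the function and parameter names, are assumptions made for this sketch rather than part of the patent text.

import numpy as np

def circle_difference_operator(img, x, y, r=5, h=5, n_samples=32):
    """Circle difference operator CDO(x, y) for a 2D grayscale array.

    (x, y) is the candidate center, r the circle radius, h the gray
    difference threshold; the sampled circumference pixels form the set S.
    """
    center = float(img[y, x])
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    xs = np.clip(np.rint(x + r * np.cos(angles)).astype(int), 0, img.shape[1] - 1)
    ys = np.clip(np.rint(y + r * np.sin(angles)).astype(int), 0, img.shape[0] - 1)
    S = img[ys, xs].astype(float)

    n = S.size
    n1 = int(np.count_nonzero(S >= center + h))  # ring pixels clearly brighter than the center
    favg = float(S.mean())

    b = n1 / n            # circle dark variation coefficient b(x, y) = n1/n
    v = favg - center     # circle mean difference coefficient v(x, y) = Favg - f(x, y)

    # CDO(x, y) is nonzero only when the center is clearly darker than its surrounding ring.
    if v <= 0 or b < 0.6:
        return 0.0
    return v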
2) Use the hybrid projection function to find the horizontal coordinates of the two eyes.
The hybrid projection function is defined as the sum of the integral projection function and the variance projection function.
Integral projection function: let I(x, y) denote the gray value of the pixel at point (x, y); the vertical integral projection function over the interval [y1, y2] is denoted Sv(x):
Formula 1: Sv(x) = ∫[y1, y2] I(x, y) dy    (1)
The mean integral projection function Mv(x) is expressed as:
Formula 2: Mv(x) = Sv(x) / (y2 - y1)    (2)
Variance projection function: the vertical variance projection function over the interval [y1, y2] is denoted σv²(x):
Formula 3: σv²(x) = ∫[y1, y2] [I(x, y) - Mv(x)]² dy / (y2 - y1)    (3)
The vertical hybrid projection function Hv(x) is defined as:
Formula 4: Hv(x) = Mv(x)/2 + σv²(x)/2    (4)
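As a discrete sketch of formulas (1)-(4), the integrals can be replaced by sums over image rows; the function name and the use of a half-open row band are assumptions made for illustration, while the equal 1/2 weighting follows formula (4) as reconstructed above.

import numpy as np

def vertical_projections(img, y1, y2):
    """Discrete versions of formulas (1)-(4) over the row band [y1, y2).

    Returns (Sv, Mv, Var_v, Hv), each a 1D array indexed by column x.
    """
    band = img[y1:y2, :].astype(float)  # I(x, y) for y in the band
    height = band.shape[0]              # plays the role of (y2 - y1)

    Sv = band.sum(axis=0)                            # (1) integral projection Sv(x)
    Mv = Sv / height                                 # (2) mean integral projection Mv(x)
    Var_v = ((band - Mv) ** 2).sum(axis=0) / height  # (3) variance projection σv²(x)
    Hv = Mv / 2.0 + Var_v / 2.0                      # (4) hybrid projection Hv(x)
    return Sv, Mv, Var_v, Hv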
First apply the horizontal integral projection of the circle difference operator to the image and take the row of the highest peak as the vertical coordinate of the eyes; then apply the vertical hybrid integral projection to each horizontal coordinate around that vertical coordinate, and take the left and right peaks as the horizontal coordinates of the two eyes.
3) Display the integral projection curves in the two directions.
4) Define the vertical coordinate of the eyes obtained above as y, and the horizontal coordinates of the two eyes obtained from the two peaks of the integral projection as x1 and x2. The distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y).
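Combining the pieces, a sketch of the whole locate-and-measure pipeline built on the two sketches above; the band half-width, the minimum peak separation of cols // 10, and the simple peak-picking strategy are assumptions for illustration rather than part of the claimed method.

import numpy as np

def locate_eyes(img, r=5, h=5, band=10):
    """Steps 1)-4): return the two eye coordinates and the distance |x1 - x2|."""
    rows, cols = img.shape

    # Step 1: horizontal integral projection of the circle difference operator;
    # the row with the strongest response is taken as the eyes' vertical coordinate y.
    cdo_rows = np.array([
        sum(circle_difference_operator(img, x, y, r, h) for x in range(r, cols - r))
        for y in range(r, rows - r)
    ])
    y = r + int(np.argmax(cdo_rows))

    # Step 2: vertical hybrid projection Hv(x) over a band of rows around y.
    y1, y2 = max(0, y - band), min(rows, y + band)
    _, _, _, Hv = vertical_projections(img, y1, y2)

    # Steps 3)-4): the two strongest, well-separated peaks of Hv give x1 and x2.
    order = np.argsort(Hv)[::-1]
    x1 = int(order[0])
    x2 = int(next(x for x in order[1:] if abs(int(x) - x1) > cols // 10))
    return (x1, y), (x2, y), abs(x1 - x2)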
The beneficial effects of the present invention are as follows:
1. Using the circle difference projection function to find the vertical coordinate of the eyes and then the hybrid integral projection function to find their horizontal coordinates gives better results, in both algorithmic complexity and accuracy, than either method alone.
2. The time complexity of the circle difference operator projection function is O(n^4), while that of the hybrid integral projection function is O(n^2). Applying the circle difference operator projection function only to the vertical coordinate and the hybrid integral projection function to the horizontal coordinates therefore takes less time than applying the circle difference operator projection function to the horizontal coordinates as well; the time used by this method is roughly 1/2 of that of the circle difference operator projection function alone. The method thus improves on the classical approach in both speed and accuracy.
Description of drawings
Fig. 1 is the processing flow chart.
Embodiment
The invention is described in further detail below in conjunction with the accompanying drawing.
The processing flow is shown in Fig. 1. The prerequisite of this method is that face recognition has been completed and that image pre-processing (image denoising, image enhancement, image reconstruction) has been carried out on that basis. The implementation steps are as follows:
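The pre-processing prerequisite can be sketched with OpenCV, assuming a detected face crop is already at hand; the specific operators chosen here (non-local-means denoising, histogram equalization) are illustrative, since the method only requires that denoising and enhancement have been performed.

import cv2

def preprocess_face(face_bgr):
    """Denoise and enhance a detected face crop; returns a grayscale image."""
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)
    denoised = cv2.fastNlMeansDenoising(gray, h=10)  # image denoising
    enhanced = cv2.equalizeHist(denoised)            # image enhancement (contrast)
    return enhanced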
1. Use the circle difference operator projection function to find the vertical coordinate of the eyes. Definition of the circle difference operator: establish an xy coordinate system on the image, with the origin usually at the center of the image. Let f(x, y) be the gray value of the image at coordinate point (x, y), and let h be the difference threshold. With (x, y) as the center, draw a circle of radius r; all the pixels close to the circumference form the set S on the circle, containing n pixels in total. Given the gray value f(x, y) of the center point, let n1 be the number of pixels in S whose gray value is greater than or equal to f(x, y) + h, let n2 be the number of pixels in S whose gray value is less than or equal to f(x, y) - h, and let Favg be the average gray value of all pixels in S. The three circle variation coefficients of the center point are defined as follows:
Circle dark variation coefficient: b(x, y) = n1/n; circle bright variation coefficient: c(x, y) = n2/n;
Circle mean difference coefficient: v(x, y) = Favg - f(x, y).
Circle difference operator CDO(x, y):
if v(x, y) <= 0 or b(x, y) < 0.6, then CDO(x, y) = 0;
if v(x, y) > 0 and b(x, y) >= 0.6, then CDO(x, y) = v(x, y).
In the algorithm, 1 <= h <= 10 and 2 <= r <= 10 are generally used; the concrete values are determined in practice.
2. Then use the hybrid projection function to find the horizontal coordinates of the two eyes.
The hybrid projection function is defined as the sum of the integral projection function and the variance projection function.
Integral projection function: let I(x, y) denote the gray value of the pixel at point (x, y); the vertical integral projection function over the interval [y1, y2] is denoted Sv(x):
Formula 1: Sv(x) = ∫[y1, y2] I(x, y) dy    (1)
The mean integral projection function Mv(x) is expressed as:
Formula 2: Mv(x) = Sv(x) / (y2 - y1)    (2)
Variance projection function: the vertical variance projection function over the interval [y1, y2] is denoted σv²(x):
Formula 3: σv²(x) = ∫[y1, y2] [I(x, y) - Mv(x)]² dy / (y2 - y1)    (3)
The vertical hybrid projection function Hv(x) is defined as:
Formula 4: Hv(x) = Mv(x)/2 + σv²(x)/2    (4)
In the algorithm of the present invention, first apply the horizontal integral projection of the circle difference operator to the image and take the row of the highest peak as the vertical coordinate of the eyes; then apply the vertical hybrid integral projection to each horizontal coordinate around that vertical coordinate, and take the left and right peaks as the horizontal coordinates of the two eyes.
3. Display the integral projection curves in the two directions.
4. Define the vertical coordinate of the eyes obtained above as y, and the horizontal coordinates of the two eyes obtained from the two peaks of the integral projection as x1 and x2. The distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y).
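Step 3 only calls for displaying the two projection curves; a small matplotlib sketch, assuming the row-wise CDO projection and Hv(x) computed in the sketches above are passed in.

import matplotlib.pyplot as plt

def show_projection_curves(cdo_rows, Hv):
    """Step 3: display the two projection curves side by side."""
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.plot(cdo_rows)
    ax1.set_title("Horizontal CDO integral projection")
    ax1.set_xlabel("row y")
    ax2.plot(Hv)
    ax2.set_title("Vertical hybrid projection Hv(x)")
    ax2.set_xlabel("column x")
    fig.tight_layout()
    plt.show()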

Claims (1)

1. the method for people's face eye location and distance measuring and calculating is characterized in that, comprises the steps:
1) utilize circle difference operator projection function to find the ordinate of eyes;
The definition of circle difference operator: set up the xy coordinate system on image, initial point is at the image middle position usually; If image (x, y) gray-scale value on the coordinate points is f(x, y); The threshold value of setting difference is h; So that (x y) is the center of circle, makes the circle that radius is r, and all form circle upper set S near the pixel of circumference, and sum of all pixels is n; If the gray-scale value of centre point be f (x, y), then in the S all gray scale more than or equal to f (x, y)+number of pixels of h is designated as nl, in the S all gray scales smaller or equal to f (x, y)-number of pixels of h is designated as n2, the average gray of all pixels is designated as Favg in the S;
the three circle variation coefficients of the center point are defined as follows:
circle dark variation coefficient: b(x, y) = n1/n;
circle bright variation coefficient: c(x, y) = n2/n;
circle mean difference coefficient: v(x, y) = Favg - f(x, y);
circle difference operator CDO(x, y):
if v(x, y) <= 0 or b(x, y) < 0.6, then CDO(x, y) = 0;
if v(x, y) > 0 and b(x, y) >= 0.6, then CDO(x, y) = v(x, y);
in the algorithm, 1 <= h <= 10 and 2 <= r <= 10 are generally used; the concrete values are determined in practice;
2) use the hybrid projection function to find the horizontal coordinates of the two eyes;
the hybrid projection function is defined as the sum of the integral projection function and the variance projection function;
integral projection function: let I(x, y) denote the gray value of the pixel at point (x, y); the vertical integral projection function over the interval [y1, y2] is denoted Sv(x):
Formula 1: Sv(x) = ∫[y1, y2] I(x, y) dy    (1)
the mean integral projection function Mv(x) is expressed as:
Formula 2: Mv(x) = Sv(x) / (y2 - y1)    (2)
variance projection function: the vertical variance projection function over the interval [y1, y2] is denoted σv²(x):
Formula 3: σv²(x) = ∫[y1, y2] [I(x, y) - Mv(x)]² dy / (y2 - y1)    (3)
the vertical hybrid projection function Hv(x) is defined as:
Formula 4: Hv(x) = Mv(x)/2 + σv²(x)/2    (4)
first apply the horizontal integral projection of the circle difference operator to the image and take the row of the highest peak as the vertical coordinate of the eyes; then apply the vertical hybrid integral projection to each horizontal coordinate around that vertical coordinate, and take the left and right peaks as the horizontal coordinates of the two eyes;
3) display the integral projection curves in the two directions;
4) define the vertical coordinate of the eyes obtained above as y, and the horizontal coordinates of the two eyes obtained from the two peaks of the integral projection as x1 and x2; the distance between the two eyes is then |x1 - x2|, and the coordinates of the two eyes are (x1, y) and (x2, y).
CN 201110125628 2011-05-16 2011-05-16 Method of face and eye location and distance measurement Expired - Fee Related CN102184543B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110125628 CN102184543B (en) 2011-05-16 2011-05-16 Method of face and eye location and distance measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110125628 CN102184543B (en) 2011-05-16 2011-05-16 Method of face and eye location and distance measurement

Publications (2)

Publication Number Publication Date
CN102184543A true CN102184543A (en) 2011-09-14
CN102184543B CN102184543B (en) 2013-03-27

Family

ID=44570713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110125628 Expired - Fee Related CN102184543B (en) 2011-05-16 2011-05-16 Method of face and eye location and distance measurement

Country Status (1)

Country Link
CN (1) CN102184543B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113011393A (en) * 2021-04-25 2021-06-22 中国民用航空飞行学院 Human eye positioning method based on improved hybrid projection function

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1474357A (en) * 2003-06-13 2004-02-11 南京大学 Accurately automatically positioning method for centre of human face and eyes in digital grey scale image
US20060147094A1 (en) * 2003-09-08 2006-07-06 Woong-Tuk Yoo Pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1474357A (en) * 2003-06-13 2004-02-11 南京大学 Accurately automatically positioning method for centre of human face and eyes in digital grey scale image
US20060147094A1 (en) * 2003-09-08 2006-07-06 Woong-Tuk Yoo Pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhi-Hua Zhou et al., "Projection functions for eye detection", Pattern Recognition, vol. 37, 2004. *
Chen Xueyun et al., "A vertical eye location method based on the circle difference operator", Journal of Guangxi University (Natural Science Edition), vol. 33, no. 2, 2008. *
Chen Xueyun et al., "An eye location method based on Haar wavelets", Computer Engineering, vol. 36, no. 1, 2010. *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113011393A (en) * 2021-04-25 2021-06-22 中国民用航空飞行学院 Human eye positioning method based on improved hybrid projection function

Also Published As

Publication number Publication date
CN102184543B (en) 2013-03-27

Similar Documents

Publication Publication Date Title
CN103810490B (en) A kind of method and apparatus for the attribute for determining facial image
US8831337B2 (en) Method, system and computer program product for identifying locations of detected objects
CN103310194B (en) Pedestrian based on crown pixel gradient direction in a video shoulder detection method
CN103208123B (en) Image partition method and system
CN104966285B (en) A kind of detection method of salient region
CN102184544B (en) Method for correcting deformity and identifying image of go notation
CN102609724B (en) Method for prompting ambient environment information by using two cameras
CN108615014B (en) Eye state detection method, device, equipment and medium
CN104615996B (en) A kind of various visual angles two-dimension human face automatic positioning method for characteristic point
CN103218605A (en) Quick eye locating method based on integral projection and edge detection
CN103839038A (en) People counting method and device
CN106570538B (en) Character image processing method and device
CN106529432A (en) Hand area segmentation method deeply integrating significance detection and prior knowledge
CN104050448A (en) Human eye positioning method and device and human eye region positioning method and device
CN105869148A (en) Target detection method and device
CN105405130A (en) Cluster-based license image highlight detection method and device
CN103914829B (en) Method for detecting edge of noisy image
CN104599291A (en) Structural similarity and significance analysis based infrared motion target detection method
CN109840905A (en) Power equipment rusty stain detection method and system
Devadethan et al. Face detection and facial feature extraction based on a fusion of knowledge based method and morphological image processing
CN108416304B (en) Three-classification face detection method using context information
CN105069403B (en) A kind of three-dimensional human ear identification based on block statistics feature and the classification of dictionary learning rarefaction representation
CN107145892B (en) A kind of image significance object detection method based on adaptive syncretizing mechanism
CN110557622B (en) Depth information acquisition method and device based on structured light, equipment and medium
CN102184543B (en) Method of face and eye location and distance measurement

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130327

Termination date: 20160516
