CN100514353C - Living body detecting method and system based on human face physiologic moving - Google Patents
Living body detecting method and system based on human face physiologic moving
- Publication number
- CN100514353C · CNB2007101780886A · CN200710178088A
- Authority
- CN
- China
- Prior art keywords
- human face
- motion
- detection result
- result frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/40—Spoof detection, e.g. liveness detection
Abstract
The invention discloses a liveness detection method and system based on physiological facial motion, in the field of face recognition. The method includes: Step A, detecting the motion region and motion direction of objects within the detection system's camera view, and locking a face detection result frame; Step B, judging whether valid facial motion exists inside the face detection result frame — if not, a photo face is determined; if so, proceeding to Step C; Step C, judging whether the facial motion inside the face detection result frame is physiological motion — if not, a photo face is determined; if so, a real face is determined. The system comprises a motion detection module, a valid-facial-motion judgment module, and a physiological-motion judgment module. The technical scheme of the invention can discriminate between real faces and photo faces and enhance the reliability of face recognition systems.
Description
Technical field
The present invention relates to the field of face recognition technology, and in particular to a liveness detection method and system based on physiological facial motion.
Background technology
In recent years, biometric identification technology has made considerable progress; commonly used biometric features include the face, fingerprint, and iris. Person identification based on biometric features is widely applied around the world, since these features can accurately distinguish genuine users from impostors. However, biometric recognition faces various threats, such as login attempts using photos of forged faces, fingerprints, or irises. Determining whether the biometric feature submitted to the system comes from a living individual — thereby preventing a malicious forger from using a stolen biometric for identification — constitutes the liveness detection problem of biometric recognition systems. Face recognition technology has been widely applied in recent years to identity verification, video surveillance, and video data retrieval and analysis, owing to its convenience and ready acceptance by users. However, as face recognition moves from research toward practical application, its security threats must be addressed. Typically, attempts to spoof a face recognition system fall into the following categories: photo faces, face video clips, and imitation three-dimensional face models. Among these, photo faces are easier to obtain than the other means and appear most often in spoofing attacks on face recognition systems. For face recognition systems to become practical, a face liveness detection system must be designed that can resist login threats from photo faces. Face liveness detection complements face recognition, and the maturity of face liveness detection technology determines whether face recognition can move toward practical application.
In the field of face liveness detection, conventional methods mainly fall into the following categories. The first measures three-dimensional depth information through motion. A real face differs from a photo face in that the real face is a three-dimensional object with depth information while a photo is a two-dimensional plane; one can therefore reconstruct the face with a three-dimensional model, compute depth from motion, and thereby distinguish real faces from photo faces. The drawback of this method is that reconstructing a face with a three-dimensional model is difficult, and depth information is hard to compute accurately. The second analyzes the proportion of high-frequency components in photo faces versus real faces. Its basic assumption is that, compared with imaging a real face, imaging a photo face loses high-frequency information. This holds for some low-resolution photo faces, but the method is inapplicable to high-resolution photos. The third performs real-time face tracking over a video sequence, extracting features and designing a classifier for the judgment. Its idea is to divide real faces and photo faces into two classes, requiring a specially designed and trained classifier. This method is time-consuming, however, and in particular ignores the fundamental difference between real faces and photo faces in terms of physiological motion.
Summary of the invention
To distinguish real faces from photo faces simply and effectively, and to improve the reliability of face recognition systems, embodiments of the invention provide a liveness detection method and system based on physiological facial motion. The technical scheme is as follows:
A liveness detection method based on physiological facial motion, the method comprising:
Step A: detecting the motion region and motion direction of objects within the detection system's camera view, and locking a face detection result frame;
Step B: judging whether valid facial motion exists inside the face detection result frame; if not, a photo face is determined; if so, proceeding to Step C;
Step C: judging whether the facial motion inside the face detection result frame is physiological motion; if not, a photo face is determined; if so, a real face is determined.
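The three steps above can be sketched end to end as follows. This is a hedged illustration, not the patent's implementation: the data layout (a list of motion-region dicts with `center`, `extent`, and `directions` in degrees), the 20-degree tolerance, and the 40-pixel extent threshold are assumptions, and the landmark-proximity test of Step B2 is reduced to an inside-the-frame test for brevity.

```python
def detect_liveness(motion_regions, face_box):
    """Pipeline sketch of Steps A-C. motion_regions: list of dicts with
    'center' (x, y), 'extent' (pixels), 'directions' (degrees), assumed
    to be produced by the motion detection of Step A."""
    PHOTO, REAL = "photo face", "real face"
    left, top, right, bottom = face_box
    # Step B1: consistent motion outside the result frame -> photo face.
    for r in motion_regions:
        dirs = r["directions"]
        coherent = max(dirs) - min(dirs) < 5          # 5-degree tolerance
        x, y = r["center"]
        outside = not (left <= x <= right and top <= y <= bottom)
        if coherent and outside and r["extent"] > 40:
            return PHOTO
    # Step B2 (simplified): require some motion region inside the frame.
    inside = [r for r in motion_regions
              if left <= r["center"][0] <= right
              and top <= r["center"][1] <= bottom]
    if not inside:
        return PHOTO                                   # no valid facial motion
    # Step C: vertically opposite motion (modes near +90 and -90 degrees).
    for r in inside:
        up = any(abs(d - 90) < 20 for d in r["directions"])
        down = any(abs(d + 90) < 20 for d in r["directions"])
        if up and down:
            return REAL
    return PHOTO
```

A usage sketch: a blinking eye region yields directions clustered around both +90 and −90 degrees and is judged a real face, whereas a waved photo produces coherent motion outside the frame and is rejected.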
Step B is specifically:
Step B1: judging whether consistent motion within a preset range exists outside the face detection result frame; if so, a photo face is determined; if not, proceeding to Step B2;
Step B2: judging whether the facial motion inside the face detection result frame occurs near the eyes and mouth; if not, a photo face is determined; if so, proceeding to Step C; or
judging whether the facial motion inside the face detection result frame occurs near the mouth; if not, a photo face is determined; if so, proceeding to Step C; or
judging whether the facial motion inside the face detection result frame occurs near the eyes; if not, a photo face is determined; if so, proceeding to Step C.
Step B1 is specifically:
Step D1: collecting statistics on the motion directions within the motion region, and judging whether the differences between the motion directions are less than a predetermined angle; if not, no consistent motion is deemed to exist; if so, consistent motion is deemed to exist and Step D2 follows;
Step D2: calculating whether the center coordinate of the motion region lies outside the face detection result frame, and whether the extent of the motion region exceeds a predetermined threshold; if so, consistent motion within the preset range is deemed to exist outside the face detection result frame.
Step B2 is specifically:
calculating the Euclidean distances between the center coordinate of the motion region and the position coordinates of the eyes and of the mouth of the face; if the Euclidean distances are less than a predetermined threshold, the facial motion is deemed to occur near the eyes and mouth; or calculating the Euclidean distance between the center coordinate of the motion region and the position coordinate of the mouth; if the Euclidean distance is less than a predetermined threshold, the facial motion is deemed to occur near the mouth; or
calculating the Euclidean distance between the center coordinate of the motion region and the position coordinates of the eyes; if the Euclidean distance is less than a predetermined threshold, the facial motion is deemed to occur near the eyes.
Step C is specifically:
collecting statistics on the motion directions within the motion region; if the motion directions within the motion region are vertically opposite, the facial motion is determined to be physiological motion.
A liveness detection system based on physiological facial motion, the system comprising:
a motion detection module, for detecting the motion region and motion direction of objects within the detection system's camera view and locking a face detection result frame;
a valid-facial-motion judgment module, for judging whether valid facial motion exists inside the face detection result frame;
a physiological-motion judgment module, for judging whether the facial motion inside the face detection result frame occurs near the eyes and mouth; if not, a photo face is determined; if so, a real face is determined.
The valid-facial-motion judgment module comprises:
a consistent-motion judgment module, for judging whether consistent motion within a preset range exists outside the face detection result frame; if so, a photo face is determined; if not, control passes to the facial-motion-range judgment module;
a facial-motion-range judgment module, for judging whether the facial motion inside the face detection result frame occurs near the eyes and mouth; or
for judging whether the facial motion inside the face detection result frame occurs near the mouth; or
for judging whether the facial motion inside the face detection result frame occurs near the eyes.
The consistent-motion judgment module comprises:
a consistent-motion existence judgment module, for judging whether the differences between the motion directions are less than a predetermined angle; if not, no consistent motion is deemed to exist; if so, consistent motion is deemed to exist and control passes to the consistent-motion-range judgment module;
a consistent-motion-range judgment module, for calculating whether the center coordinate of the motion region lies outside the face detection result frame and whether the extent of the motion region exceeds a predetermined threshold; if so, consistent motion within the preset range is deemed to exist outside the face detection result frame.
The facial-motion-range judgment module is specifically:
a mouth-and-eye distance judgment module, for calculating the Euclidean distances between the center coordinate of the motion region and the position coordinates of the eyes and of the mouth of the face; if the Euclidean distances are less than a predetermined threshold, the facial motion is deemed to occur near the eyes and mouth; or
a mouth distance judgment module, for calculating the Euclidean distance between the center coordinate of the motion region and the position coordinate of the mouth of the face; if the Euclidean distance is less than a predetermined threshold, the facial motion is deemed to occur near the mouth; or
an eye distance judgment module, for calculating the Euclidean distance between the center coordinate of the motion region and the position coordinates of the eyes of the face; if the Euclidean distance is less than a predetermined threshold, the facial motion is deemed to occur near the eyes.
The physiological-motion judgment module is specifically:
a motion-direction judgment module, for collecting statistics on the motion directions within the motion region; if the motion directions within the motion region are vertically opposite, the facial motion is determined to be physiological motion.
The beneficial effect of the technical scheme provided by the embodiments of the invention is that it distinguishes real faces from photo faces simply and effectively, reduces the intrusiveness of the face recognition system, and helps improve the performance of face liveness detection.
Description of drawings
Fig. 1 is the flowchart of a liveness detection method based on physiological facial motion provided by Embodiment 1 of the invention;
Fig. 2 is the schematic diagram of a liveness detection system based on physiological facial motion provided by Embodiment 2 of the invention.
Embodiment
To make the purpose, technical solutions, and advantages of the present invention clearer, embodiments of the invention are described in further detail below with reference to the accompanying drawings.
Embodiment 1
To distinguish real faces from photo faces, the embodiment of the invention provides a liveness detection method and system based on physiological facial motion. By judging the physiological motion of the face, the method can effectively distinguish real faces from photo faces. As shown in Fig. 1, the concrete implementation steps are as follows:
Step 101: detect the motion regions and motion directions of objects within the detection system's camera view, and lock the face detection result frame.
Face detection is performed within the current camera view, and the rectangular frame enclosing the face is locked, thereby locking the face detection result frame. The motion regions of objects in the current camera view can be detected by differencing two adjacent frames; there may be one motion region or several. The motion direction of an object is detected by computing horizontal and vertical gradients, yielding, for each motion region in the camera view, its center coordinate, its extent, and the motion directions of objects within it.
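Step 101 can be sketched as follows. This is a minimal illustration under stated assumptions: grayscale frames as NumPy arrays, a single merged motion region, an illustrative difference threshold, and a gradient-based direction estimate — none of these specifics are fixed by the patent.

```python
import numpy as np

def detect_motion_regions(prev_frame, curr_frame, diff_thresh=25):
    """Detect moving pixels by adjacent-frame differencing and estimate
    the region's center, extent, and a motion direction from
    horizontal/vertical image gradients."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    mask = diff > diff_thresh                      # moving pixels
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                                # no motion detected
    center = (float(xs.mean()), float(ys.mean()))  # region center (x, y)
    extent = (int(xs.max() - xs.min()), int(ys.max() - ys.min()))
    # Direction estimate from gradients at the moving pixels (illustrative):
    gy, gx = np.gradient(curr_frame.astype(float))
    angle = float(np.degrees(np.arctan2(gy[mask].mean(), gx[mask].mean() + 1e-9)))
    return {"center": center, "extent": extent, "direction_deg": angle}
```

In a real pipeline the mask would be split into connected components so that several motion regions can be reported, as the description allows.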
Step 102: judge whether consistent motion within a preset range exists outside the face detection result frame; if so, a photo face is determined; if not, proceed to step 103.
Consistent motion means motion in which all the points in a motion region move in the same direction. Statistics are collected on the motion directions within a motion region; when the angular differences between the motion directions are less than 5 degrees, the motion of the region is considered consistent. For each motion region, the distance from the center coordinate of the region to the face detection result frame is calculated, and it is checked whether the extent of the region exceeds a predetermined threshold (typically in the range of 30–50 pixels). If the center coordinate of the region lies outside the face detection result frame and the extent of the region exceeds the threshold, consistent motion within the preset range is judged to exist outside the face detection result frame.
When a real face remains still, there is in general essentially no consistent motion outside the face. If consistent motion within the preset range is detected outside the face detection result frame, the content of the frame is taken to be a photo face. This causes a certain rate of false rejection — for example, when a real face logs in with background interference, or someone walks past behind the user — but it guarantees a very low false acceptance rate and thus the security of the system. Moreover, once a false rejection occurs, the user can simply adjust and log in again.
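The consistency test of step 102 can be sketched as follows. The 5-degree angle tolerance and the 30–50-pixel extent threshold come from the description; the data layout (direction list, center tuple, box as left/top/right/bottom) is an assumption for illustration.

```python
def is_outside_consistent_motion(directions_deg, region_center, region_extent,
                                 face_box, angle_tol=5.0, extent_thresh=40):
    """Return True if the region moves coherently (all directions within
    angle_tol of each other), lies outside the face detection result frame,
    and its extent exceeds the predetermined threshold."""
    directions = list(directions_deg)
    coherent = (max(directions) - min(directions)) < angle_tol
    if not coherent:
        return False                                # not a consistent motion
    x, y = region_center
    left, top, right, bottom = face_box
    outside = not (left <= x <= right and top <= y <= bottom)
    return outside and region_extent > extent_thresh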
Step 103: judge whether the motion regions inside the face detection result frame occur near the eyes and mouth; if not, a photo face is determined; if so, proceed to step 104.
Classifiers for the eyes and mouth are obtained by training on a large number of eye and mouth samples from faces. The trained classifiers detect the eyes and mouth inside the face detection result frame and output their position coordinates. The Euclidean distance between the center coordinate of each motion region and the eyes of the face is calculated, as is the Euclidean distance between the region center and the mouth. When this Euclidean distance is less than a predetermined threshold (typically 6–10 pixels), the motion region is judged to occur near the eyes and mouth inside the face detection result frame; if the Euclidean distance exceeds the threshold, a photo face is determined.
From the standpoint of system security, this step is necessary: if no eye or mouth motion of the face exists across successive frames, a photo face is determined.
As a preferred scheme, the Euclidean distance between the center coordinate of the motion region and the eyes of the face can instead be calculated alone; if this distance is less than a predetermined threshold (typically 6–10 pixels), the motion region is judged to occur near the eyes inside the face detection result frame; otherwise a photo face is determined.
As another preferred scheme, the Euclidean distance between the center coordinate of the motion region and the mouth of the face can be calculated alone; if this distance is less than a predetermined threshold (typically 10–15 pixels), the motion region is judged to occur near the mouth inside the face detection result frame; otherwise a photo face is determined.
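The distance test of step 103 and its variants can be sketched as follows. The 6–10-pixel default threshold follows the description; `math.dist`, the coordinate layout, and the use of `any` over the two eyes are illustrative assumptions, and here a region counts if it is close to any landmark, whereas the patent's variants can restrict the test to the eyes only or the mouth only.

```python
import math

def motion_near_landmarks(region_center, eye_centers, mouth_center, thresh=8):
    """Judge whether a motion region center falls near the detected eye or
    mouth positions (Euclidean distance below a predetermined threshold)."""
    near_eyes = any(math.dist(region_center, eye) < thresh
                    for eye in eye_centers)
    near_mouth = math.dist(region_center, mouth_center) < thresh
    return near_eyes or near_mouth   # motion at either landmark counts
```

The landmark coordinates themselves would come from the trained eye and mouth classifiers described above.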
Step 104: judge whether the motion occurring near the eyes and mouth inside the face detection result frame is physiological motion; if not, a photo face is determined; if so, a real face is determined.
Physiological motion includes physiological facial actions such as blinking, speaking, and smiling — motions necessary to a living person. The eye and mouth motion produced by a real face is constrained by positional relationships and is vertically opposite, whereas the motion simulated with a photo face lacks this property. It is checked whether the motion directions of the regions near the eyes and mouth are uniform; if the directions within such a region are all the same, the motion is not physiological. Concretely, for the motion regions near the eyes and mouth, statistics are collected on the motion directions within each region; when the directions within a region show two dominant modes at positive 90 degrees and negative 90 degrees, the region is considered to contain vertically opposite motion, the motion is judged to be physiological, and a real face is determined.
As a preferred scheme, real faces and photo faces can also be distinguished by judging only whether the mouth motion of the face is physiological; the concrete method is similar to this embodiment and is not repeated.
As another preferred scheme, real faces and photo faces can also be distinguished by judging only whether the eye motion of the face is physiological; the concrete method is similar to this embodiment and is not repeated.
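The vertical-opposite test of step 104 can be sketched as follows. The two dominant modes at +90 and −90 degrees are taken from the description; the angular tolerance and the minimum fraction required for each mode are assumed parameters, not values fixed by the patent.

```python
def is_physiological_motion(directions_deg, tol=20.0, min_fraction=0.3):
    """Judge physiological motion: the region's motion directions should
    cluster into two opposite vertical modes (about +90 and -90 degrees)."""
    total = len(directions_deg)
    if total == 0:
        return False
    up = sum(1 for d in directions_deg if abs(d - 90) < tol)
    down = sum(1 for d in directions_deg if abs(d + 90) < tol)
    # Both vertical modes must be substantially present.
    return up / total >= min_fraction and down / total >= min_fraction
```

A uniformly translated photo (one dominant direction) fails this test, while a blinking eyelid or opening mouth produces both upward and downward motion and passes.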
Embodiment 2
The embodiment of the invention provides a liveness detection system based on physiological facial motion. As shown in Fig. 2, the system comprises:
a motion detection module, for detecting the motion regions and motion directions of objects within the detection system's camera view and locking the face detection result frame;
a valid-facial-motion judgment module, for judging whether valid facial motion exists inside the face detection result frame;
a physiological-motion judgment module, for judging whether the facial motion inside the face detection result frame occurs near the eyes and mouth; if not, a photo face is determined; if so, a real face is determined.
The valid-facial-motion judgment module comprises:
a consistent-motion judgment module, for judging whether consistent motion within a preset range exists outside the face detection result frame; if so, a photo face is determined; if not, control passes to the facial-motion-range judgment module;
a facial-motion-range judgment module, for judging whether the facial motion inside the face detection result frame occurs near the eyes and mouth.
The consistent-motion judgment module comprises:
a consistent-motion existence judgment module, for judging whether the differences between the motion directions are less than a predetermined angle; if not, no consistent motion is deemed to exist; if so, consistent motion is deemed to exist and control passes to the consistent-motion-range judgment module;
a consistent-motion-range judgment module, for calculating whether the center coordinate of the motion region lies outside the face detection result frame and whether the extent of the motion region exceeds a predetermined threshold; if so, consistent motion within the preset range is deemed to exist outside the face detection result frame.
The facial-motion-range judgment module is specifically:
a distance judgment module, for calculating the Euclidean distances between the center coordinate of the motion region and the position coordinates of the eyes and of the mouth of the face; if the Euclidean distances are less than a predetermined threshold, the facial motion is deemed to occur near the eyes and mouth; or
a mouth distance judgment module, for calculating the Euclidean distance between the center coordinate of the motion region and the position coordinate of the mouth of the face; if the Euclidean distance is less than a predetermined threshold, the facial motion is deemed to occur near the mouth; or
an eye distance judgment module, for calculating the Euclidean distance between the center coordinate of the motion region and the position coordinates of the eyes of the face; if the Euclidean distance is less than a predetermined threshold, the facial motion is deemed to occur near the eyes.
The physiological-motion judgment module is specifically:
a motion-direction judgment module, for collecting statistics on the motion directions within the motion region; if the motion directions within the motion region are vertically opposite, the facial motion is determined to be physiological motion.
The performance of the embodiment of the invention can be evaluated by test experiments. For the tests, a database of 400 live-face sequences and 200 photo-face sequences was built. The 400 live-face sequences fall into two classes: cooperative live-face sequences, in which the head remains essentially still and only motion such as blinking or speaking occurs on the face; and uncooperative live-face sequences, in which the subject sits freely in front of the camera and may move arbitrarily, including turning or raising the head. The inter-eye distances of the faces range from 25 to 100 pixels, and the image size is 240 × 320. In addition, the tests were run on 53 talking-face video segments from the CMU database (these videos belong to the cooperative class of face sequences); in them the inter-eye distance is roughly 100 pixels and the image size is 486 × 640. The test results are shown in Table 1:
Table 1:
Sequence | Count | Passed | Rejected |
Photo face sequences | 200 | 0 | 200 |
Cooperative face sequences | 200 | 195 | 5 |
Uncooperative face sequences | 200 | 120 | 80 |
CMU talking faces | 53 | 48 | 5 |
As Table 1 shows, the pass rate of the cooperative live-face sequences is far higher than that of the uncooperative ones. The system requires a degree of cooperation from the user; the purpose is to keep the pass rate of photo sequences very low. In a biometric recognition system, to guarantee security, forged biometrics such as photos should as far as possible never pass, which demands a very low FAR (False Acceptance Rate). Because a living person has the capacity to cooperate, this requirement can be met while still reducing the intrusiveness of the system.
Face liveness detection is an indispensable component of face recognition systems, and the quality of its performance determines whether face recognition can move from research toward practical application. The technical scheme of the invention distinguishes real faces from photo faces, reduces the intrusiveness of the face recognition system, and helps improve the performance of face liveness detection.
In addition, there are many other ways to spoof a face recognition system besides photos; a common one is to replay recorded video. Against video-replay spoofing, the system can detect the user's blinking, speaking, and mouth opening, and add interactive instructions — for example, requiring the user in real time to open the mouth wide, close the eyes, or speak — then detect the user's reaction in real time and make a judgment.
The above are merely preferred embodiments of the present invention and are not intended to limit it. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within its protection scope.
Claims (8)
1. A liveness detection method based on physiological facial motion, characterized in that the method comprises:
Step A: detecting the motion region and motion direction of objects within the detection system's camera view, and locking a face detection result frame;
Step B: judging whether valid facial motion exists inside the face detection result frame; if not, a photo face is determined; if so, proceeding to Step C;
wherein the step of judging whether valid facial motion exists inside the face detection result frame is specifically:
Step B1: judging whether consistent motion within a preset range exists outside the face detection result frame; if so, a photo face is determined; if not, proceeding to Step B2;
Step B2: judging whether the facial motion inside the face detection result frame occurs near the eyes and mouth; if not, a photo face is determined; if so, proceeding to Step C; or
judging whether the facial motion inside the face detection result frame occurs near the mouth; if not, a photo face is determined; if so, proceeding to Step C; or
judging whether the facial motion inside the face detection result frame occurs near the eyes; if not, a photo face is determined; if so, proceeding to Step C;
wherein the step of judging whether consistent motion within a preset range exists outside the face detection result frame is specifically:
Step D1: collecting statistics on the motion directions within the motion region, and judging whether the differences between the motion directions are less than a predetermined angle; if not, no consistent motion is deemed to exist; if so, consistent motion is deemed to exist and Step D2 follows;
Step D2: calculating whether the center coordinate of the motion region lies outside the face detection result frame, and whether the extent of the motion region exceeds a predetermined threshold; if so, consistent motion within the preset range is deemed to exist outside the face detection result frame;
Step C: judging whether the facial motion inside the face detection result frame is physiological motion; if not, a photo face is determined; if so, a real face is determined;
wherein the step of determining that the facial motion inside the face detection result frame is physiological motion is specifically:
collecting statistics on the motion directions within the motion region; if the motion directions within the motion region are vertically opposite, the facial motion is determined to be physiological motion.
According to claim 1 based on the biopsy method of people's face physiological motion, it is characterized in that 2, near the described step of judging whether the described people's face portion motion in described people's face testing result frame is created in eyes and the mouth is specially:
Calculate between the position coordinates of eyes of described moving region centre coordinate and people's face, and and the position coordinates of mouth between Euclidean distance, motion is created near eyes and the mouth if described Euclidean distance, is then thought described people's face portion less than predetermined threshold.
3. The living body detection method based on facial physiological motion according to claim 1, characterized in that the step of judging whether the facial motion within said face detection result frame occurs near the mouth specifically comprises:
calculating the Euclidean distance between the center coordinate of said moving region and the position coordinate of the mouth; if said Euclidean distance is less than a predetermined threshold, concluding that said facial motion occurs near the mouth.
4. The living body detection method based on facial physiological motion according to claim 1, characterized in that the step of judging whether the facial motion within said face detection result frame occurs near the eyes specifically comprises:
calculating the Euclidean distance between the center coordinate of said moving region and the position coordinates of the eyes; if said Euclidean distance is less than a predetermined threshold, concluding that said facial motion occurs near the eyes.
5. A living body detection system based on facial physiological motion, characterized in that said system comprises:
a motion detection module, configured to detect the moving region and motion direction of an object within the imaging view of the detection system, and to lock a face detection result frame;
an effective facial motion judgment module, configured to judge whether effective facial motion exists within said face detection result frame;
wherein said effective facial motion judgment module comprises:
a consistent motion judgment module, configured to judge whether consistent motion exists within a preset range outside said face detection result frame; if it exists, the face is judged to be a photograph face; if it does not exist, control passes to the facial motion range judgment module;
a facial motion range judgment module, configured to judge whether the facial motion within said face detection result frame occurs near the eyes and the mouth; or
configured to judge whether the facial motion within said face detection result frame occurs near the mouth; or
configured to judge whether the facial motion within said face detection result frame occurs near the eyes;
wherein said consistent motion judgment module comprises:
a consistent motion existence judgment module, configured to judge whether the differences among said motion directions are less than a predetermined angle; if not, concluding that said consistent motion does not exist; if so, concluding that said consistent motion exists, and passing control to the consistent motion range judgment module;
a consistent motion range judgment module, configured to calculate whether the center coordinate of said moving region lies outside said face detection result frame, and whether the extent of said moving region is greater than a predetermined threshold; if so, concluding that consistent motion exists within a preset range outside said face detection result frame;
a physiological motion judgment module, configured to judge whether the facial motion within said face detection result frame occurs near the eyes and the mouth; if not, the face is judged to be a photograph face; if so, a real face;
wherein said physiological motion judgment module specifically comprises:
a motion direction judgment module, configured to tally the motion directions in said moving region; if the motion directions in said moving region are vertical and mutually opposite, determining that said facial motion is a physiological motion.
6. The living body detection system based on facial physiological motion according to claim 5, characterized in that said facial motion range judgment module specifically comprises:
a mouth-and-eye distance judgment module, configured to calculate the Euclidean distances between the center coordinate of said moving region and the position coordinates of the eyes of the face, and between said center coordinate and the position coordinate of the mouth; if said Euclidean distances are less than a predetermined threshold, concluding that said facial motion occurs near the eyes and the mouth.
7. The living body detection system based on facial physiological motion according to claim 5, characterized in that said facial motion range judgment module specifically comprises:
a mouth distance judgment module, configured to calculate the Euclidean distance between the center coordinate of said moving region and the position coordinate of the mouth of the face; if said Euclidean distance is less than a predetermined threshold, concluding that said facial motion occurs near the mouth.
8. The living body detection system based on facial physiological motion according to claim 5, characterized in that said facial motion range judgment module specifically comprises:
an eye distance judgment module, configured to calculate the Euclidean distance between the center coordinate of said moving region and the position coordinates of the eyes of the face; if said Euclidean distance is less than a predetermined threshold, concluding that said facial motion occurs near the eyes.
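Claims 5–8 chain these modules in the order the abstract describes: motion detection, then the effective-motion (consistency and range) judgments, then the physiological-motion judgment. A minimal sketch of that pipeline follows; every name and threshold here is a hypothetical illustration of how the modules could be wired, not the patent's implementation:

```python
import math

def classify_face(angles_deg, region_center, region_area, face_box,
                  eye_positions, mouth_position,
                  angle_thresh=15.0, area_thresh=400, dist_thresh=40.0):
    """Pipeline sketch: a photograph face is concluded either from consistent
    motion outside the face box, from motion away from the eyes/mouth, or
    from the absence of vertical, mutually opposite motion directions."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    # Consistent motion judgment module: a photo waved in front of the
    # camera moves the whole region in one direction outside the face frame.
    spread = max((abs(a - b) for a in angles_deg for b in angles_deg),
                 default=0.0)
    x, y = region_center
    left, top, right, bottom = face_box
    outside = not (left <= x <= right and top <= y <= bottom)
    if spread < angle_thresh and outside and region_area > area_thresh:
        return "photo"

    # Facial motion range judgment module: effective facial motion must
    # occur near an eye or the mouth.
    landmarks = list(eye_positions) + [mouth_position]
    if min(dist(region_center, p) for p in landmarks) >= dist_thresh:
        return "photo"

    # Physiological motion judgment module: vertical, mutually opposite
    # directions (e.g. a blink) indicate a real face.
    near = lambda a, tgt: abs((a - tgt + 180) % 360 - 180) <= 20.0
    if (any(near(a, 90.0) for a in angles_deg)
            and any(near(a, 270.0) for a in angles_deg)):
        return "real"
    return "photo"
```

Each early return corresponds to one judgment module rejecting the input, so the checks can run in sequence and stop at the first failure.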
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNB2007101780886A CN100514353C (en) | 2007-11-26 | 2007-11-26 | Living body detecting method and system based on human face physiologic moving |
US12/129,708 US20090135188A1 (en) | 2007-11-26 | 2008-05-30 | Method and system of live detection based on physiological motion on human face |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CNB2007101780886A CN100514353C (en) | 2007-11-26 | 2007-11-26 | Living body detecting method and system based on human face physiologic moving |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101159016A CN101159016A (en) | 2008-04-09 |
CN100514353C true CN100514353C (en) | 2009-07-15 |
Family
ID=39307106
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB2007101780886A Expired - Fee Related CN100514353C (en) | 2007-11-26 | 2007-11-26 | Living body detecting method and system based on human face physiologic moving |
Country Status (2)
Country | Link |
---|---|
US (1) | US20090135188A1 (en) |
CN (1) | CN100514353C (en) |
Families Citing this family (63)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4508257B2 (en) * | 2008-03-19 | 2010-07-21 | ソニー株式会社 | Composition determination apparatus, composition determination method, and program |
CN101908140A (en) * | 2010-07-29 | 2010-12-08 | 中山大学 | Biopsy method for use in human face identification |
EP2439700B1 (en) * | 2010-10-06 | 2013-05-01 | Alcatel Lucent | Method and Arrangement for Identifying Virtual Visual Information in Images |
CN102004904B (en) * | 2010-11-17 | 2013-06-19 | 东软集团股份有限公司 | Automatic teller machine-based safe monitoring device and method and automatic teller machine |
CN102323817A (en) * | 2011-06-07 | 2012-01-18 | 上海大学 | Service robot control platform system and multimode intelligent interaction and intelligent behavior realizing method thereof |
EP2546782B1 (en) | 2011-07-11 | 2014-06-25 | Accenture Global Services Limited | Liveness detection |
DE102011054658A1 (en) * | 2011-10-20 | 2013-04-25 | Bioid Ag | Method for distinguishing between a real face and a two-dimensional image of the face in a biometric capture process |
KR101901591B1 (en) * | 2011-11-01 | 2018-09-28 | 삼성전자주식회사 | Face recognition apparatus and control method for the same |
US9137246B2 (en) | 2012-04-09 | 2015-09-15 | Brivas Llc | Systems, methods and apparatus for multivariate authentication |
US8542879B1 (en) | 2012-06-26 | 2013-09-24 | Google Inc. | Facial recognition |
CN102902271A (en) * | 2012-10-23 | 2013-01-30 | 上海大学 | Binocular vision-based robot target identifying and gripping system and method |
CN103778360A (en) * | 2012-10-26 | 2014-05-07 | 华为技术有限公司 | Face unlocking method and device based on motion analysis |
JP5885645B2 (en) * | 2012-12-05 | 2016-03-15 | 京セラドキュメントソリューションズ株式会社 | Information processing apparatus and authentication method |
CN104348778A (en) * | 2013-07-25 | 2015-02-11 | 信帧电子技术(北京)有限公司 | Remote identity authentication system, terminal and method carrying out initial face identification at handset terminal |
CN103440479B (en) * | 2013-08-29 | 2016-12-28 | 湖北微模式科技发展有限公司 | A kind of method and system for detecting living body human face |
TWI546052B (en) | 2013-11-14 | 2016-08-21 | 財團法人工業技術研究院 | Apparatus based on image for detecting heart rate activity and method thereof |
CN103593598B (en) * | 2013-11-25 | 2016-09-21 | 上海骏聿数码科技有限公司 | User's on-line authentication method and system based on In vivo detection and recognition of face |
CN104751110B (en) * | 2013-12-31 | 2018-12-04 | 汉王科技股份有限公司 | A kind of biopsy method and device |
US9773151B2 (en) * | 2014-02-06 | 2017-09-26 | University Of Massachusetts | System and methods for contactless biometrics-based identification |
US9633269B2 (en) * | 2014-09-05 | 2017-04-25 | Qualcomm Incorporated | Image-based liveness detection for ultrasonic fingerprints |
JP6482816B2 (en) * | 2014-10-21 | 2019-03-13 | Kddi株式会社 | Biological detection device, system, method and program |
CN115457664A (en) * | 2015-01-19 | 2022-12-09 | 创新先进技术有限公司 | Living body face detection method and device |
CN105989264B (en) * | 2015-02-02 | 2020-04-07 | 北京中科奥森数据科技有限公司 | Biological characteristic living body detection method and system |
US9934443B2 (en) | 2015-03-31 | 2018-04-03 | Daon Holdings Limited | Methods and systems for detecting head motion during an authentication transaction |
US10133918B1 (en) * | 2015-04-20 | 2018-11-20 | Snap Inc. | Generating a mood log based on user images |
CN104835231B (en) * | 2015-05-25 | 2018-02-27 | 安恒世通(北京)网络科技有限公司 | A kind of recognition of face lockset |
CN104835232B (en) * | 2015-05-25 | 2018-02-27 | 安恒世通(北京)网络科技有限公司 | A kind of acoustic control lockset |
CN105612533B (en) * | 2015-06-08 | 2021-03-02 | 北京旷视科技有限公司 | Living body detection method, living body detection system, and computer program product |
WO2017000217A1 (en) * | 2015-06-30 | 2017-01-05 | 北京旷视科技有限公司 | Living-body detection method and device and computer program product |
CN105518582B (en) * | 2015-06-30 | 2018-02-02 | 北京旷视科技有限公司 | Biopsy method and equipment |
CN105518714A (en) * | 2015-06-30 | 2016-04-20 | 北京旷视科技有限公司 | Vivo detection method and equipment, and computer program product |
CN105138967B (en) * | 2015-08-05 | 2018-03-27 | 三峡大学 | Biopsy method and device based on human eye area active state |
US9794260B2 (en) * | 2015-08-10 | 2017-10-17 | Yoti Ltd | Liveness detection |
CN105184246B (en) | 2015-08-28 | 2020-05-19 | 北京旷视科技有限公司 | Living body detection method and living body detection system |
CN105119723A (en) * | 2015-09-15 | 2015-12-02 | 重庆智韬信息技术中心 | Identity authentication and authorization method based on human eye recognition |
CN105335473B (en) * | 2015-09-30 | 2019-02-12 | 小米科技有限责任公司 | Picture playing method and device |
CN105335722B (en) * | 2015-10-30 | 2021-02-02 | 商汤集团有限公司 | Detection system and method based on depth image information |
CN105450664B (en) * | 2015-12-29 | 2019-04-12 | 腾讯科技(深圳)有限公司 | A kind of information processing method and terminal |
CN105760817A (en) * | 2016-01-28 | 2016-07-13 | 深圳泰首智能技术有限公司 | Method and device for recognizing, authenticating, unlocking and encrypting storage space by using human face |
CN107273794A (en) * | 2017-04-28 | 2017-10-20 | 北京建筑大学 | Live body discrimination method and device in a kind of face recognition process |
CN107358152B (en) * | 2017-06-02 | 2020-09-08 | 广州视源电子科技股份有限公司 | Living body identification method and system |
CN107358157B (en) | 2017-06-07 | 2020-10-02 | 创新先进技术有限公司 | Face living body detection method and device and electronic equipment |
CN107491757A (en) * | 2017-08-18 | 2017-12-19 | 上海二三四五金融科技有限公司 | A kind of antifraud system and control method based on living body characteristics |
CN108124486A (en) * | 2017-12-28 | 2018-06-05 | 深圳前海达闼云端智能科技有限公司 | Face living body detection method based on cloud, electronic device and program product |
CN108537103B (en) * | 2018-01-19 | 2022-06-10 | 东北电力大学 | Living body face detection method and device based on pupil axis measurement |
CN108537131B (en) * | 2018-03-15 | 2022-04-15 | 中山大学 | Face recognition living body detection method based on face characteristic points and optical flow field |
EP3576016A4 (en) | 2018-04-12 | 2020-03-18 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Face recognition method and apparatus, and mobile terminal and storage medium |
CN110400161A (en) * | 2018-04-25 | 2019-11-01 | 鸿富锦精密电子(天津)有限公司 | Customer behavior analysis method, customer behavior analysis system and storage device |
CN108809992B (en) * | 2018-06-15 | 2021-07-13 | 黄玉新 | Face recognition verification system and correlation method of face recognition verification system and target system |
CN108960088A (en) * | 2018-06-20 | 2018-12-07 | 天津大学 | The detection of facial living body characteristics, the recognition methods of specific environment |
WO2020095350A1 (en) * | 2018-11-05 | 2020-05-14 | 日本電気株式会社 | Information processing device, information processing method, and recording medium |
CN109271978A (en) * | 2018-11-23 | 2019-01-25 | 四川长虹电器股份有限公司 | Recognition of face anti-fraud method |
CN111382592B (en) * | 2018-12-27 | 2023-09-29 | 杭州海康威视数字技术股份有限公司 | Living body detection method and apparatus |
CN111382607A (en) * | 2018-12-28 | 2020-07-07 | 北京三星通信技术研究有限公司 | Living body detection method and device and face authentication system |
CN110084152B (en) * | 2019-04-10 | 2022-03-15 | 武汉大学 | Disguised face detection method based on micro-expression recognition |
CN110059624B (en) * | 2019-04-18 | 2021-10-08 | 北京字节跳动网络技术有限公司 | Method and apparatus for detecting living body |
US10726246B1 (en) | 2019-06-24 | 2020-07-28 | Accenture Global Solutions Limited | Automated vending machine with customer and identification authentication |
USD963407S1 (en) | 2019-06-24 | 2022-09-13 | Accenture Global Solutions Limited | Beverage dispensing machine |
CN110287900B (en) * | 2019-06-27 | 2023-08-01 | 深圳市商汤科技有限公司 | Verification method and verification device |
CN111241945A (en) * | 2019-12-31 | 2020-06-05 | 杭州艾芯智能科技有限公司 | Method and device for testing face recognition performance, computer equipment and storage medium |
EP3869395A1 (en) | 2020-02-21 | 2021-08-25 | Accenture Global Solutions Limited | Identity and liveness verification |
CN114694220A (en) * | 2022-03-25 | 2022-07-01 | 上海大学 | Double-flow face counterfeiting detection method based on Swin transform |
CN114821404B (en) * | 2022-04-08 | 2023-07-25 | 马上消费金融股份有限公司 | Information processing method, device, computer equipment and storage medium |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5802220A (en) * | 1995-12-15 | 1998-09-01 | Xerox Corporation | Apparatus and method for tracking facial motion through a sequence of images |
US5774591A (en) * | 1995-12-15 | 1998-06-30 | Xerox Corporation | Apparatus and method for recognizing facial expressions and facial gestures in a sequence of images |
JP2001216515A (en) * | 2000-02-01 | 2001-08-10 | Matsushita Electric Ind Co Ltd | Method and device for detecting face of person |
WO2008156437A1 (en) * | 2006-04-10 | 2008-12-24 | Avaworks Incorporated | Do-it-yourself photo realistic talking head creation system and method |
US6919892B1 (en) * | 2002-08-14 | 2005-07-19 | Avaworks, Incorporated | Photo realistic talking head creation system and method |
JP2005323340A (en) * | 2004-04-07 | 2005-11-17 | Matsushita Electric Ind Co Ltd | Communication terminal and communication method |
US20080166052A1 (en) * | 2007-01-10 | 2008-07-10 | Toshinobu Hatano | Face condition determining device and imaging device |
US20080260212A1 (en) * | 2007-01-12 | 2008-10-23 | Moskal Michael D | System for indicating deceit and verity |
JP4389956B2 (en) * | 2007-04-04 | 2009-12-24 | ソニー株式会社 | Face recognition device, face recognition method, and computer program |
JP4959445B2 (en) * | 2007-07-04 | 2012-06-20 | オリンパス株式会社 | Image processing apparatus and image processing program |
2007
- 2007-11-26 CN CNB2007101780886A patent/CN100514353C/en not_active Expired - Fee Related

2008
- 2008-05-30 US US12/129,708 patent/US20090135188A1/en not_active Abandoned
Non-Patent Citations (6)
Title |
---|
Fast motion vector construction algorithm based on neighborhood motion consistency. Chen Xi, Zhou Jun. Information Technology, Vol. 2007, No. 2. 2007 *
Face recognition method combining global and local information. Wang Ning, Ding Xiaoqing. Computer Engineering, Vol. 30, No. 5. 2004 *
Discriminative local feature analysis and its application in face recognition. Yang Qiong, Ding Xiaoqing. Journal of Tsinghua University (Science and Technology), Vol. 44, No. 4. 2004 *
Also Published As
Publication number | Publication date |
---|---|
US20090135188A1 (en) | 2009-05-28 |
CN101159016A (en) | 2008-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100514353C (en) | Living body detecting method and system based on human face physiologic moving | |
WO2020151489A1 (en) | Living body detection method based on facial recognition, and electronic device and storage medium | |
CN110837784B (en) | Examination room peeping and cheating detection system based on human head characteristics | |
CN102214291B (en) | Method for quickly and accurately detecting and tracking human face based on video sequence | |
CN102521565B (en) | Garment identification method and system for low-resolution video | |
CN105893946B (en) | A kind of detection method of front face image | |
CN101246544B (en) | Iris positioning method based on boundary point search and minimum kernel value similarity region edge detection | |
CN108805009A (en) | Classroom learning state monitoring method based on multimodal information fusion and system | |
CN107316031A (en) | The image characteristic extracting method recognized again for pedestrian | |
CN109101871A (en) | A kind of living body detection device based on depth and Near Infrared Information, detection method and its application | |
CN108182409A (en) | Biopsy method, device, equipment and storage medium | |
CN106295568A (en) | The mankind's naturalness emotion identification method combined based on expression and behavior bimodal | |
CN106682573B (en) | A kind of pedestrian tracting method of single camera | |
CN107330371A (en) | Acquisition methods, device and the storage device of the countenance of 3D facial models | |
CN103902970B (en) | Automatic fingerprint Attitude estimation method and system | |
CN107992797A (en) | Face identification method and relevant apparatus | |
CN106485735A (en) | Human body target recognition and tracking method based on stereovision technique | |
CN106169071A (en) | A kind of Work attendance method based on dynamic human face and chest card recognition and system | |
CN106372570A (en) | Visitor flowrate statistic method | |
CN106203256A (en) | A kind of low resolution face identification method based on sparse holding canonical correlation analysis | |
CN105701466A (en) | Rapid all angle face tracking method | |
CN109784130A (en) | Pedestrian recognition methods and its device and equipment again | |
CN106384345A (en) | RCNN based image detecting and flow calculating method | |
CN106709438A (en) | Method for collecting statistics of number of people based on video conference | |
CN103577804B (en) | Based on SIFT stream and crowd's Deviant Behavior recognition methods of hidden conditional random fields |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 2009-07-15. Termination date: 2019-11-26 |
CF01 | Termination of patent right due to non-payment of annual fee |