WO2017092573A1 - Method, device and system for living body detection based on eyeball tracking - Google Patents

Method, device and system for living body detection based on eyeball tracking

Info

Publication number
WO2017092573A1
Related publication numbers: WO2017092573A1, PCT/CN2016/106248, CN2016106248W
Authority
WO
WIPO (PCT)
Prior art keywords
curve
motion
trajectory
sequence
matching degree
Prior art date
Application number
PCT/CN2016/106248
Other languages
English (en)
French (fr)
Inventor
蔡子豪 (Cai Zihao)
冯亮 (Feng Liang)
Original Assignee
China UnionPay Co., Ltd. (中国银联股份有限公司)
Priority date
Filing date
Publication date
Application filed by China UnionPay Co., Ltd.
Publication of WO2017092573A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40 - Spoof detection, e.g. liveness detection
    • G06V40/45 - Detection of the body part being alive
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 - Eye characteristics, e.g. of the iris

Definitions

  • The invention relates to the field of biometric identification, in particular to face recognition, and more specifically to a method, device and system for living body detection based on eyeball tracking.
  • Face recognition, as a new type of identity verification, has become a trend in financial technology and can be widely applied in payment, transfer and other scenarios.
  • With the rapid development and wide application of face recognition technology, its security has also been severely challenged.
  • The main threats in the field of face recognition generally come from photos, videos or 3D models of legitimate users. Therefore, living body detection is a very important task in the face recognition process.
  • Living body detection mainly proves that a living person, not a photo, a video or a 3D model, is in front of the camera.
  • The face in a photo is flat and exhibits defects such as quality loss and blurring caused by secondary acquisition.
  • The face in a replayed video carries reflections introduced by the LCD screen of the video player.
  • The movement of a face formed by a 3D model is rigid motion. Living body detection can effectively prevent related fraud and improve the security and practicability of authentication based on face recognition.
  • Existing living body detection schemes suffer from defects such as complicated detection equipment (for example, thermal image sensors), high detection cost, low accuracy and poor user experience. Therefore, how to develop a living body detection solution with high security and good user experience is a technical problem to be solved in the field.
  • To this end, the present invention provides a method, device and system for living body detection based on eye tracking: the display shows a moving ball and guides the user to gaze at it; while the user gazes, the camera captures a video of the user's eyes; the living body detecting device then acquires the activity trajectory of the eyeball from the video and matches it against the motion trajectory of the ball to generate a living body detection result.
  • Because the matching of the motion trajectory with the activity trajectory analyzes the two main characteristics of a trajectory curve, namely slope and distance, it resolves the translation and scale transformations of the trajectory curves across different coordinate systems, which suits the eyeball-trajectory scenario.
  • One of the objectives of the present invention is to provide a method for living body detection based on eye tracking, the method comprising: outputting a motion trajectory, the motion trajectory being irregular; acquiring a video of the eyes while the current user gazes at the motion trajectory; drawing the activity trajectory of the user's eyeball according to the video; and matching the motion trajectory with the activity trajectory to output a living body detection result.
  • the motion trajectory is a small ball randomly moving in different directions.
  • drawing the activity trajectory of the user's eyeball according to the video comprises: converting the video into multiple pictures; obtaining the relative position of the user's eyeball and eye in each picture; and drawing the activity trajectory of the user's eyeball according to the relative positions.
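As a concrete illustration of this step, the sketch below (with hypothetical helper names; it assumes the video has already been decoded into frames and that some eye detector supplies, per frame, the pupil centre and the bounding box of the eye region) turns per-frame detections into an activity trajectory of normalized relative positions:

```python
def relative_position(pupil, eye_box):
    """Normalize the pupil centre against the surrounding eye region.

    pupil   -- (x, y) pupil centre in image coordinates
    eye_box -- (x, y, w, h) bounding box of the eye in the same frame
    Returns (rx, ry) in [0, 1], which is insensitive to small head shifts.
    """
    x, y, w, h = eye_box
    px, py = pupil
    return ((px - x) / w, (py - y) / h)

def activity_trajectory(per_frame_detections):
    """Build the eyeball activity trajectory from per-frame detections."""
    return [relative_position(pupil, box) for pupil, box in per_frame_detections]

# Two frames with the same eye box and a pupil drifting right and down.
detections = [((110, 52), (100, 40, 40, 20)),
              ((120, 55), (100, 40, 40, 20))]
print(activity_trajectory(detections))  # [(0.25, 0.6), (0.5, 0.75)]
```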
  • matching the motion trajectory with the activity trajectory and outputting the living body detection result comprises: drawing a motion curve according to the motion trajectory; drawing an activity curve according to the activity trajectory; determining the matching degree between the motion curve and the activity curve; acquiring a preset matching degree threshold; and outputting the living body detection result according to the matching degree and the matching degree threshold.
  • determining the matching degree between the motion curve and the activity curve comprises: drawing a first curve according to the motion curve; drawing a second curve according to the activity curve; determining according to a recursive algorithm a length of the longest common subsequence of the first curve and the second curve; determining a matching degree of the first curve and the second curve according to the length of the longest common subsequence.
  • drawing the first curve according to the motion curve comprises: acquiring a preset sampling interval; sampling the motion curve at that interval to obtain a plurality of motion sampling points; sequentially determining the slope between each pair of adjacent motion sampling points to form a motion slope sequence; and drawing the first curve according to the motion slope sequence.
  • drawing the second curve according to the activity curve comprises: acquiring a preset sampling interval; sampling the activity curve at that interval to obtain a plurality of activity sampling points; sequentially determining the slope between each pair of adjacent activity sampling points to form an activity slope sequence; and drawing the second curve according to the activity slope sequence.
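A minimal sketch of the sampling-and-slope step, assuming the trajectory has already been sampled into (x, y) points at the preset interval. Representing each slope as an angle folded into [0, 180) degrees is an implementation choice of this sketch, not something the text mandates:

```python
import math

def slope_sequence(points):
    """Slope, expressed as an angle in [0, 180) degrees, between each pair
    of consecutive sampling points of a trajectory."""
    seq = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # atan2 gives (-180, 180]; folding modulo 180 treats a segment and
        # its reverse as the same slope.
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0
        seq.append(angle)
    return seq

pts = [(0, 0), (1, 1), (2, 1), (2, 0)]
print(slope_sequence(pts))  # [45.0, 0.0, 90.0]
```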
  • determining the longest common subsequence length of the first curve and the second curve according to a recursive algorithm is performed by the following formula:

    LS(m, n) = 0,                                      if a = 0 or b = 0
    LS(m, n) = 1 + LS(Head(m), Head(n)),               if m_a = n_b
    LS(m, n) = max(LS(Head(m), n), LS(m, Head(n))),    otherwise

  • where m is an element of the sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence and a is its length; n is an element of the sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence and b is its length; LS(m, n) is the longest common subsequence length of M and N; and Head(m) is the subsequence of M with its last element removed.
  • determining the matching degree of the first curve and the second curve according to the length of the longest common subsequence is performed by the following formula:

    S(m, n) = LS(m, n) / min(a, b)

  • where S(m, n) is the matching degree and min(a, b) is the smaller of the lengths of the two sequences.
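The two formulas above can be sketched together: the longest-common-subsequence length is computed bottom-up (which is equivalent to the recursive definition), and the matching degree divides it by the shorter sequence length:

```python
def lcs_length(M, N):
    """Length of the longest common subsequence of two slope sequences,
    computed with a bottom-up dynamic-programming table."""
    a, b = len(M), len(N)
    table = [[0] * (b + 1) for _ in range(a + 1)]
    for i in range(1, a + 1):
        for j in range(1, b + 1):
            if M[i - 1] == N[j - 1]:
                table[i][j] = table[i - 1][j - 1] + 1
            else:
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[a][b]

def matching_degree(M, N):
    """S(M, N) = LS(M, N) / min(a, b), a value in [0, 1]."""
    return lcs_length(M, N) / min(len(M), len(N))

M = [0, 1, 2, 2, 3]
N = [0, 2, 2, 4]
print(lcs_length(M, N))       # 3  (common subsequence 0, 2, 2)
print(matching_degree(M, N))  # 0.75
```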
  • outputting the living body detection result according to the matching degree and the matching degree threshold comprises: determining whether the matching degree is greater than the threshold; if so, outputting a result indicating that living body detection of the current user succeeded; otherwise, outputting a result indicating that living body detection of the current user failed.
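The threshold decision in this step is a one-liner; the value 0.8 used below is purely illustrative, since the text leaves the matching degree threshold as a preset parameter:

```python
def liveness_result(matching_degree, threshold=0.8):
    """Return True (living body detection succeeded) when the matching
    degree exceeds the preset threshold, False otherwise.
    The 0.8 default is an illustrative assumption, not from the patent."""
    return matching_degree > threshold

print(liveness_result(0.9))   # True
print(liveness_result(0.75))  # False
```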
  • Another objective of the present invention is to provide an apparatus for living body detection based on eye tracking, the apparatus comprising: a motion trajectory output module for outputting an irregular motion trajectory; a video acquisition module for acquiring a video of the eyes while the current user gazes at the motion trajectory; an activity trajectory determining module for drawing the activity trajectory of the user's eyeball according to the video; and a trajectory matching module for matching the motion trajectory with the activity trajectory and outputting a living body detection result.
  • the motion trajectory is a small ball randomly moving in different directions.
  • the activity trajectory determining module includes: a converting unit for converting the video into multiple pictures; a relative position acquiring unit for acquiring the relative position of the user's eyeball and eye in each picture; and an activity trajectory drawing unit for drawing the activity trajectory of the user's eyeball according to the relative positions.
  • the trajectory matching module includes: a motion curve drawing unit for drawing a motion curve according to the motion trajectory; an activity curve drawing unit for drawing an activity curve according to the activity trajectory; a matching degree determining unit for determining the matching degree of the motion curve and the activity curve; a threshold acquiring unit for acquiring a preset matching degree threshold; and a detection result output unit for outputting the living body detection result according to the matching degree and the matching degree threshold.
  • the matching degree determining unit includes: a first curve drawing unit for drawing a first curve according to the motion curve; a second curve drawing unit for drawing a second curve according to the activity curve; a longest common subsequence length determining unit for determining the longest common subsequence length of the first curve and the second curve according to a recursive algorithm; and a matching degree determining sub-unit for determining the matching degree of the first curve and the second curve according to that length.
  • the first curve drawing unit includes: a first sampling interval acquiring unit, configured to acquire a preset sampling interval;
  • a first sampling unit, configured to sample the motion curve at the sampling interval to obtain a plurality of motion sampling points;
  • a first slope determining unit, configured to sequentially determine the slope between each pair of adjacent motion sampling points to form a motion slope sequence;
  • a first curve drawing sub-unit, configured to draw the first curve according to the motion slope sequence.
  • the second curve drawing unit includes: a second sampling interval acquiring unit, configured to acquire a preset sampling interval;
  • a second sampling unit, configured to sample the activity curve at the sampling interval to obtain a plurality of activity sampling points;
  • a second slope determining unit, configured to sequentially determine the slope between each pair of adjacent activity sampling points to form an activity slope sequence;
  • a second curve drawing sub-unit, configured to draw the second curve according to the activity slope sequence.
  • the detection result output unit includes: a determining unit, configured to determine whether the matching degree is greater than the matching degree threshold; a first detection result output unit, configured to output, when the determining unit determines yes, a result indicating that living body detection of the current user succeeded; and a second detection result output unit, configured to output, when the determining unit determines no, a result indicating that living body detection of the current user failed.
  • A further objective of the present invention is to provide a system for living body detection based on eyeball tracking. The system includes a motion trajectory output module for outputting an irregular motion trajectory; a display for receiving and displaying the motion trajectory; and a camera for capturing a video of the eyes while the current user gazes at the motion trajectory. The device for living body detection based on eyeball tracking further includes: a video acquiring device for acquiring the video; an activity trajectory determining device for drawing the activity trajectory of the user's eyeball according to the video; and a living body detecting device for matching the motion trajectory with the activity trajectory and outputting a living body detection result.
  • The beneficial effects of the invention are a method, device and system for living body detection based on eyeball tracking: the display shows a moving ball and guides the user to gaze at it; the camera captures a video of the user's eyes; the living body detecting device then obtains the activity trajectory of the eyeball from the video and matches it against the motion trajectory of the ball to generate a living body detection result. Because the matching process analyzes the two main characteristics of a trajectory curve, namely slope and distance, it resolves the translation and scale transformations of the trajectory curves across different coordinate systems, which suits the eyeball-trajectory scenario. The invention is also easy to use: it works with existing face recognition equipment without adding new devices.
  • FIG. 1 is a flowchart of a method for detecting a living body based on eyeball tracking according to an embodiment of the present invention
  • FIG. 2 is a specific flowchart of step S103 in FIG. 1;
  • FIG. 3 is a specific flowchart of step S104 in FIG. 1;
  • FIG. 4 is a specific flowchart of step S303 in FIG. 3;
  • FIG. 5 is a specific flowchart of step S401 in FIG. 4;
  • FIG. 6 is a specific flowchart of step S402 in FIG. 4;
  • FIG. 7 is a specific flowchart of step S305 in FIG. 3;
  • FIG. 8 is a schematic structural diagram of a system for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • FIG. 10 is a connection scene diagram of a system for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • FIG. 11 is a structural block diagram of an active trajectory determining module 30 in a device for detecting a living body based on eyeball tracking according to an embodiment of the present invention
  • FIG. 12 is a structural block diagram of a trajectory matching module 40 in a device for detecting a living body based on eyeball tracking according to an embodiment of the present invention
  • FIG. 13 is a structural block diagram of a matching degree determining unit 43 in an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention
  • FIG. 14 is a structural block diagram of a first curve drawing unit 431 in an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention
  • FIG. 15 is a structural block diagram of a second curve drawing unit 432 in an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention
  • FIG. 16 is a structural block diagram of a detection result output unit 45 in an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • Live detection is mainly to prove that a living person is in front of the camera instead of a photo or video. This work can effectively prevent related fraud and improve the security and practicability of face recognition based authentication.
  • Eye tracking mainly studies the acquisition, modeling and simulation of eye movement information. With the popularity of cameras in ATMs, mobile phones, notebook computers, PCs and the like, eye tracking has been widely applied in scenes such as automobile driver fatigue detection and command control.
  • FIG. 10 is a connection scene diagram of a system for living body detection based on eyeball tracking.
  • As shown in FIG. 10, the user's face faces the display screen and the eyes gaze at it; a camera on the display screen can clearly capture the user's face.
  • FIG. 8 is a schematic structural diagram of a system for detecting a living body based on eyeball tracking according to an embodiment of the present invention. As shown in FIG. 8, the system includes a display 1, a camera 2, and a device 3 for detecting a living body based on eyeball tracking.
  • FIG. 9 is a schematic structural diagram of an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • the apparatus 3 for living body detection based on eyeball tracking includes: a motion trajectory output module 10, configured to output a motion trajectory, the motion trajectory being irregular.
  • the display 1 is configured to receive and display the motion trajectory.
  • the motion trajectory may be formed by the random movement of a small ball in different directions: a small ball appears on the display and moves randomly in the four directions (up, down, left and right) at a constant or variable speed, so the motion trajectory is irregular.
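A sketch of how such an irregular ball trajectory could be generated; the step length, start position and use of a seeded `random.Random` are illustrative assumptions, not details from the patent:

```python
import random

def ball_trajectory(steps, step_len=5, start=(200, 200), seed=None):
    """Irregular stimulus trajectory: at every step the ball moves a fixed
    distance in a randomly chosen direction (up, down, left or right)."""
    rng = random.Random(seed)
    directions = [(0, -step_len), (0, step_len),
                  (-step_len, 0), (step_len, 0)]
    x, y = start
    path = [(x, y)]
    for _ in range(steps):
        dx, dy = rng.choice(directions)
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

path = ball_trajectory(50, seed=1)
print(len(path))  # 51
```

In practice the per-step speed could also vary, as the text allows either constant or variable speed.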
  • the camera 2 is configured to capture a video of the eyes while the current user gazes at the motion trajectory; that is, during display of the motion trajectory, the user gazes at the ball while keeping body and head still, and the camera captures a video of the user's eyes.
  • the device for detecting a living body based on eye tracking further includes:
  • a video acquiring module 20, configured to acquire the video;
  • an activity trajectory determining module 30, configured to draw the activity trajectory of the user's eyeball according to the video;
  • FIG. 11 is a structural block diagram of the activity trajectory determining module 30 in the living body detecting device.
  • a trajectory matching module 40, configured to match the motion trajectory with the activity trajectory and output a living body detection result.
  • FIG. 12 is a structural block diagram of the track matching module 40.
  • FIG. 11 is a structural block diagram of an active trajectory determining module 30 in an apparatus for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • the active trajectory determining module 30 includes:
  • a converting unit 31 configured to convert the video into multiple pictures
  • the relative position obtaining unit 32 is configured to acquire a relative position of the user's eyeball and the eye in each picture;
  • the activity trajectory drawing unit 33 is configured to draw an activity trajectory of the user's eyeball according to the relative position.
  • FIG. 12 is a structural block diagram of a trajectory matching module 40 in a device for detecting a living body based on eyeball tracking according to an embodiment of the present invention. As shown in FIG. 12, the trajectory matching module 40 includes:
  • a motion curve drawing unit 41 configured to draw a motion curve according to the motion trajectory
  • An activity curve drawing unit 42 is configured to draw an activity curve according to the activity trajectory
  • the matching degree determining unit 43 is configured to determine the matching degree of the motion curve and the activity curve, and FIG. 13 is a structural block diagram of the matching degree determining unit 43.
  • the threshold obtaining unit 44 is configured to acquire a preset matching degree threshold.
  • the matching degree threshold is a preset parameter: the larger it is set, the lower the false positive rate but the higher the rejection rate, and vice versa.
  • the detection result output unit 45 is configured to output a living body detection result according to the matching degree and the matching degree threshold value, and FIG. 16 is a structural block diagram of the detection result output unit 45.
  • FIG. 13 is a structural block diagram of a matching degree determining unit 43 in an apparatus for detecting a living body based on an eyeball tracking according to an embodiment of the present invention.
  • the matching degree determining unit 43 includes:
  • the first curve drawing unit 431 is configured to draw a first curve according to the motion curve
  • FIG. 14 is a structural block diagram of the first curve drawing unit 431. As shown in FIG. 14, the first curve drawing unit includes:
  • the first sampling interval obtaining unit 4311 is configured to acquire a preset sampling interval.
  • the motion curve and the activity curve are sampled with identical sampling intervals; the shorter the interval, the more accurate the final result, but the more computation is required.
  • the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required first.
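One way to carry out the normalization mentioned above is to linearly interpolate each timestamped trajectory onto a common uniform time grid; the function below is a sketch under that assumption (it extrapolates linearly past the last sample if the requested duration exceeds it):

```python
def resample(samples, interval, duration):
    """Linearly interpolate a timestamped trajectory onto a uniform grid.

    samples  -- list of (t, x, y) tuples sorted by t
    interval -- target sampling interval
    duration -- total time span to cover (t = 0 .. duration)
    """
    out = []
    t = 0.0
    i = 0
    while t <= duration + 1e-9:
        # Advance to the segment [samples[i], samples[i+1]] containing t.
        while i + 1 < len(samples) and samples[i + 1][0] < t:
            i += 1
        t0, x0, y0 = samples[i]
        t1, x1, y1 = samples[min(i + 1, len(samples) - 1)]
        if t1 == t0:
            out.append((t, float(x0), float(y0)))
        else:
            w = (t - t0) / (t1 - t0)
            out.append((t, x0 + w * (x1 - x0), y0 + w * (y1 - y0)))
        t += interval
    return out

print(resample([(0, 0, 0), (2, 2, 0)], 1, 2))
# [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 2.0, 0.0)]
```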
  • a first sampling unit 4312 configured to sample the motion curve according to the sampling interval to obtain a plurality of motion sampling points
  • a first slope determining unit 4313 configured to sequentially determine a slope between two pairs of motion sampling points to form a sequence of motion slopes
  • the first curve drawing sub-unit 4314 is configured to draw the first curve according to the sequence of motion slopes.
  • the motion slope sequence is plotted with time (or sequence number) as the abscissa and slope as the ordinate, yielding the first curve in a new coordinate system space.
  • optionally, the slopes may be quantized into reference intervals: for example, the range [0, 180] degrees is divided into 36 intervals of 5 degrees each, so a slope in [0, 5) maps to the value 0, a slope in [5, 10) to 1, a slope in [10, 15) to 2, and so on. The motion slope sequence M is thus obtained.
  • the slope-quantized data is then plotted with time (or sequence number) as the abscissa and the slope interval as the ordinate, yielding the first curve in a new coordinate system space.
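The five-degree quantization described above reduces to integer division; a minimal sketch (folding 180 degrees back into bin 0 is an assumption of this sketch, since a 180-degree slope coincides with a 0-degree slope):

```python
def bin_slope(angle):
    """Quantize an angle in [0, 180] degrees into 36 five-degree bins:
    [0, 5) -> 0, [5, 10) -> 1, ..., [175, 180) -> 35.
    An input of exactly 180 folds into bin 0."""
    return int((angle % 180.0) // 5)

print([bin_slope(a) for a in (0, 4.9, 5, 12, 179, 180)])  # [0, 0, 1, 2, 35, 0]
```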
  • the matching degree determining unit 43 further includes:
  • the second curve drawing unit 432 is configured to draw a second curve according to the activity curve
  • FIG. 15 is a structural block diagram of the second curve drawing unit 432.
  • the second curve drawing unit includes:
  • the second sampling interval obtaining unit 4321 is configured to acquire a preset sampling interval.
  • the motion curve and the activity curve are sampled with identical sampling intervals; the shorter the interval, the more accurate the final result, but the more computation is required.
  • the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required first.
  • a second sampling unit 4322 configured to sample the activity curve according to the sampling interval to obtain a plurality of active sampling points
  • a second slope determining unit 4323 configured to sequentially determine a slope between two pairs of active sampling points to form a sequence of active slopes
  • a second curve drawing sub-unit 4324 is configured to draw a second curve according to the sequence of active slopes.
  • the activity slope sequence is plotted with time (or sequence number) as the abscissa and slope as the ordinate, yielding the second curve in a new coordinate system space.
  • optionally, the slopes may be quantized into reference intervals: for example, the range [0, 180] degrees is divided into 36 intervals of 5 degrees each, so a slope in [0, 5) maps to the value 0, a slope in [5, 10) to 1, a slope in [10, 15) to 2, and so on.
  • the activity slope sequence N is thus obtained.
  • the slope-quantized data is then plotted with time (or sequence number) as the abscissa and the slope interval as the ordinate, yielding the second curve in a new coordinate system space.
  • the matching degree determining unit 43 further includes:
  • the longest common subsequence length determining unit 433 is configured to determine a longest common subsequence length of the first curve and the second curve according to a recursive algorithm. Calculate the length of the longest common subsequence of the first curve and the second curve, that is, the degree of coincidence of the two curves.
  • in a specific embodiment, the longest common subsequence may be computed by a recursive algorithm, for example by the following formula:

    LS(m, n) = 0,                                      if a = 0 or b = 0
    LS(m, n) = 1 + LS(Head(m), Head(n)),               if m_a = n_b
    LS(m, n) = max(LS(Head(m), n), LS(m, Head(n))),    otherwise
  • M(m 1 , m 2 , . . . m a ) is a sequence of motion slopes
  • a is the length of the sequence of motion slopes M
  • N(n 1 , n 2 , . . . n b ) is a sequence of active slopes
  • b is the length of the sequence of activity slopes N.
  • the formula LS(m,n) represents the length of the longest common subsequence of sequence M and sequence N, which is a measure of the degree to which the two curves coincide.
  • when the last elements of the two sequences are equal, the value of LS(m, n) equals 1 + LS(Head(m), Head(n)), where Head(m) denotes the subsequence of M with its last term removed, namely (m1, m2, ... m(a-1)), of length a-1; Head(n) is defined likewise.
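The Head-based recursion can be transcribed directly; memoization keeps the running time at O(a*b), matching the bottom-up dynamic-programming formulation:

```python
from functools import lru_cache

def lcs_recursive(M, N):
    """Direct transcription of the recursive definition, where Head(S)
    drops the last element of S. Memoized via lru_cache."""
    M, N = tuple(M), tuple(N)

    @lru_cache(maxsize=None)
    def ls(a, b):                 # a, b = lengths of the prefixes in play
        if a == 0 or b == 0:      # an empty sequence shares nothing
            return 0
        if M[a - 1] == N[b - 1]:  # last elements match
            return 1 + ls(a - 1, b - 1)
        return max(ls(a - 1, b), ls(a, b - 1))

    return ls(len(M), len(N))

print(lcs_recursive([0, 1, 2, 2, 3], [0, 2, 2, 4]))  # 3
```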
  • the matching degree determining sub-unit 434 is configured to determine a matching degree of the first curve and the second curve according to the length of the longest common sub-sequence.
  • S(m, n) is the value of LS(m, n) divided by the minimum of the lengths of M and N, that is, S(m, n) = LS(m, n) / min(a, b).
  • the value of S(m, n) ranges between 0 and 1; the closer it is to 1, the more similar the two curves are.
  • FIG. 16 is a structural block diagram of a detection result output unit 45 in a device for detecting a living body based on eyeball tracking according to an embodiment of the present invention. As shown in FIG. 16, the detection result output unit includes:
  • the determining unit 451 is configured to determine whether the matching degree is greater than a matching degree threshold
  • the first detection result output unit 452 is configured to output, when the determining unit determines yes, a living body detection result indicating that living body detection of the current user succeeded;
  • the second detection result output unit 453 is configured to output, when the determining unit determines no, a living body detection result indicating that living body detection of the current user failed.
  • if the matching degree is greater than the matching degree threshold, the matching succeeds and living body detection of the current user succeeds; otherwise, living body detection fails.
  • The above describes the device and system for living body detection based on eyeball tracking provided by the present invention.
  • A moving ball is displayed to guide the user to gaze at it, the camera captures a video of the user's eyes, and the living body detecting device obtains the activity trajectory of the eyeball from the video and matches it against the motion trajectory of the ball to generate a living body detection result.
  • Because the matching process analyzes the two main characteristics of a trajectory curve, namely slope and distance, it resolves the translation and scale transformations of the trajectory curves across different coordinate systems, which suits the eyeball-trajectory scenario; the invention is easy to use and works with existing face recognition equipment without adding new devices.
  • FIG. 1 is a flowchart of a method for detecting a living body based on eye tracking.
  • As shown in FIG. 1, the method for living body detection based on eye tracking includes:
  • S101: Output a motion trajectory; the motion trajectory is irregular.
  • the motion trajectory may be formed by the random movement of a small ball in different directions: the ball moves randomly in the four directions (up, down, left and right) at a constant or variable speed, so the motion trajectory is irregular.
  • S102 Acquire a video of an eye when the current user looks at the motion track.
  • the user looks at the ball and keeps the body and head motionless.
  • S103: Draw the activity trajectory of the user's eyeball according to the video.
  • S104: Match the motion trajectory with the activity trajectory and output a living body detection result. FIG. 3 is a specific flowchart of step S104.
  • FIG. 2 is a specific flowchart of step S103 in FIG. 1. As can be seen from FIG. 2, step S103 includes:
  • S201: Convert the video into multiple pictures.
  • S202: Obtain the relative position of the user's eyeball and eye in each picture.
  • S203: Draw the activity trajectory of the user's eyeball according to the relative positions.
  • FIG. 3 is a specific flowchart of step S104 in FIG. 1. As can be seen from FIG. 3, step S104 includes:
  • S301: Draw a motion curve according to the motion trajectory.
  • S302: Draw an activity curve according to the activity trajectory.
  • S303: Determine the matching degree between the motion curve and the activity curve. FIG. 4 is a specific flowchart of step S303.
  • S304 Acquire a preset matching degree threshold.
  • the matching degree threshold is a preset parameter: the larger it is set, the lower the false positive rate but the higher the rejection rate, and vice versa.
  • S305: Output a living body detection result according to the matching degree and the matching degree threshold.
  • FIG. 7 is a specific flowchart of step S305.
  • step S303 includes:
  • S401: Draw a first curve according to the motion curve. FIG. 5 is a specific flowchart of step S401. As shown in FIG. 5, step S401 includes:
  • S501 Acquire a preset sampling interval.
  • the motion curve and the activity curve are sampled with identical sampling intervals; the shorter the interval, the more accurate the final result, but the more computation is required.
  • the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required first.
  • S502: Sample the motion curve at the sampling interval to obtain a plurality of motion sampling points.
  • S503: Sequentially determine the slope between each pair of adjacent motion sampling points to form a motion slope sequence.
  • S504: Draw the first curve according to the motion slope sequence.
  • the motion slope sequence is plotted with time (or sample index) on the abscissa and slope on the ordinate; the resulting curve is the first curve in a new coordinate space.
  • alternatively, when computing slopes, the pairwise slopes may be quantized: for example, the slope range [0, 180] degrees is divided into 36 bins of 5 degrees each, so a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on. This yields the motion slope sequence M.
  • plotting these quantized values with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields the first curve in a new coordinate space.
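The 5-degree slope binning described above can be sketched directly. The function name, the use of atan2 to obtain each segment's direction, and folding angles into [0, 180) degrees are illustrative assumptions rather than details fixed by the patent:

```python
import math

def slope_sequence(points, bin_width_deg=5.0):
    """Quantize the slope between consecutive sampled points into integer
    bins covering [0, 180) degrees (36 bins of 5 degrees each)."""
    seq = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        # atan2 gives the segment direction; fold it into [0, 180) degrees
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0
        seq.append(int(angle // bin_width_deg))  # [0,5) -> 0, [5,10) -> 1, ...
    return seq
```

For example, a horizontal segment maps to bin 0 and a 45-degree segment maps to bin 9; the sequence of bins is the slope sequence M (or N) used in the matching step.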
  • step S303 further includes:
  • step S402 Draw a second curve according to the activity curve
  • FIG. 6 is a specific flowchart of step S402. As can be seen from FIG. 6, step S402 includes:
  • S601 Acquire a preset sampling interval.
  • the motion curve and the activity curve are sampled with identical sampling interval lengths. The shorter the interval, the more accurate the final result, but the more computation is required.
  • the starting points of the motion curve and the activity curve are set to be the same. If the acquired sample data of the motion curve and the activity curve were sampled at different time intervals, normalization is required.
  • S602 sampling the activity curve according to the sampling interval to obtain a plurality of active sampling points
  • S603 sequentially determining a slope between two active sampling points to form an active slope sequence
  • S604 Draw a second curve according to the sequence of activity slopes.
  • the activity slope sequence is plotted with time (or sample index) on the abscissa and slope on the ordinate; the resulting curve is the second curve in a new coordinate space.
  • alternatively, when computing slopes, the pairwise slopes may be quantized: for example, the slope range [0, 180] degrees is divided into 36 bins of 5 degrees each, so a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on.
  • this yields the activity slope sequence N.
  • plotting these quantized values with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields the second curve in a new coordinate space.
  • step S303 further includes:
  • S403 Determine the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm. This length measures the degree to which the two curves coincide.
  • in a specific embodiment, the longest common subsequence may be computed by a recursive algorithm, for example by the following formula: LS(m,n) = 0, if the length of either M or N is 0; LS(m,n) = 1 + LS(Head(m), Head(n)), if the difference between a and b is less than η and the difference between ma and nb is less than ε; otherwise LS(m,n) = max(LS(Head(m), n), LS(m, Head(n)))
  • where m is an element of sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence and a is its length; n is an element of sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence and b is its length; LS(m,n) is the length of the longest common subsequence of M and N; Head(m) is the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), of length a-1, and Head(n) is defined analogously; and ε and η are preset parameters.
  • the formula LS(m,n) represents the length of the longest common subsequence of sequence M and sequence N, and measures the degree to which the two curves coincide.
  • S404 Determine the matching degree between the first curve and the second curve according to the length of the longest common subsequence, by the following formula: S(m,n) = LS(m,n) / min(m,n), where S(m,n) is the matching degree and min(m,n) is the minimum of the numbers of points in M and N.
  • the value of S(m,n) ranges from 0 to 1; the closer the value is to 1, the more similar the two curves are.
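Putting the LS recursion and the normalization S(m,n) = LS(m,n) / min(m,n) together, a memoized sketch might look like this. The memoization via lru_cache and the default values for ε and η are implementation choices, not specified by the patent:

```python
from functools import lru_cache

def matching_degree(M, N, eps=1.0, eta=2.0):
    """Compute S(m,n) = LS(m,n) / min(len(M), len(N)) for two slope
    sequences, following the recursive LS definition in the text."""
    M, N = tuple(M), tuple(N)

    @lru_cache(maxsize=None)
    def LS(a, b):
        # a, b are the lengths of the prefixes of M and N under consideration
        if a == 0 or b == 0:
            return 0
        # condition 1: the difference between the lengths is below eta;
        # condition 2: the last elements differ by less than eps
        if abs(a - b) < eta and abs(M[a - 1] - N[b - 1]) < eps:
            return 1 + LS(a - 1, b - 1)
        return max(LS(a - 1, b), LS(a, b - 1))

    return LS(len(M), len(N)) / min(len(M), len(N))
```

With quantized integer slope bins, eps = 1.0 means the last elements must fall in the same bin; identical sequences give a matching degree of 1.0 and completely disjoint ones give 0.0.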
  • step S305 includes:
  • S701 determining whether the matching degree is greater than a matching degree threshold
  • if the matching degree is greater than the matching degree threshold, the match succeeds and liveness detection of the current user succeeds; otherwise, liveness detection fails.
  • the above is the method for liveness detection based on eye tracking provided by the present invention.
  • a moving ball is displayed on the display to guide the user to gaze at it, and the camera captures a video of the user's eyes; the liveness detection apparatus then obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result.
  • in matching the motion trajectory with the activity trajectory, two main characteristics of the trajectory curves, namely slope and distance, are combined for multi-dimensional analysis; this handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios.
  • FIG. 10 is a connection scene diagram of a system for detecting a living body based on eyeball tracking according to an embodiment of the present invention.
  • a moving ball is displayed in the display to guide the user to look at the ball and keep the body and head still.
  • during face recognition, the user's face faces the display and the eyes gaze at the display; a camera at the display can clearly capture the user's face.
  • the camera captures the video of the user's eyes during the user's gaze.
  • the liveness detection apparatus obtains the eye movement trajectory from the video and matches it against the motion path of the moving ball; if the match succeeds, liveness detection of the current user is judged successful.
  • the display prompts the user with the upcoming operation rules and gives the user sufficient time to prepare; once the user has understood the rules, the user confirms.
  • the liveness detection apparatus receives the captured video, converts it into a number of pictures, and draws the activity trajectory of the user's eyeballs according to the relative position of the eyeball and the eye, thereby determining the activity trajectory curve.
  • the liveness detection apparatus obtains the motion trajectory of the ball, determines the motion trajectory curve, and matches it against the activity trajectory curve of the user's eyeballs. The matching degree of the two curves is computed; if it is greater than the matching degree threshold, the match succeeds and liveness detection of the current user succeeds; otherwise, liveness detection fails.
  • the matching degree threshold is a parameter set by the system: the larger it is set, the lower the false acceptance rate, but the higher the false rejection rate, and vice versa.
  • the process of calculating the matching degree of the two trajectory curves in this step is complicated, and will be described in detail below.
  • the technical solution of the present invention combines two main characteristics of the trajectory curve, namely slope and distance, in the process of calculating the matching degree of the curve, and performs analysis of two dimensions.
  • two curves are considered similar if their slope variations are close; they are also considered similar if they largely coincide.
  • combining the two dimensions improves accuracy, solves the problems of translation and scale change between the two curves, and gives stronger applicability.
  • the following is a detailed description of its process:
  • the two curves are sampled with identical interval lengths. The shorter the interval, the more accurately the similarity is computed, but the more computation is required; in addition, the starting points of the two curves are set to be the same. If the input curve samples acquired by the system were taken at different time intervals, normalization is required.
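The normalization mentioned here, resampling both trajectories onto a common uniform time grid with a shared start point, might be sketched as follows. The (t, x, y) row format and the use of numpy.interp for linear interpolation are assumptions for illustration:

```python
import numpy as np

def resample(trajectory, t_start, t_end, num_points):
    """Linearly resample a trajectory of (t, x, y) rows onto a uniform
    time grid so both curves share interval length and start point."""
    traj = np.asarray(trajectory, dtype=float)
    grid = np.linspace(t_start, t_end, num_points)
    x = np.interp(grid, traj[:, 0], traj[:, 1])
    y = np.interp(grid, traj[:, 0], traj[:, 2])
    return list(zip(x, y))
```

Resampling the ball trajectory and the eyeball trajectory with the same t_start, t_end and num_points yields two point lists of equal spacing, ready for the slope quantization and matching steps.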
  • LS(m,n) is calculated as follows: if condition 1 is satisfied (the difference between a and b is less than the parameter η) and condition 2 is satisfied (the difference between ma and nb is less than the parameter ε), then LS(m,n) equals 1 + LS(Head(m), Head(n)), where Head(m) denotes the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), of length a-1, and Head(n) is defined analogously.
  • if condition 1 and condition 2 cannot both be satisfied, then before computing LS(m,n) the values of LS(Head(m), n) and LS(m, Head(n)) are computed, and the larger of the two is taken as the value of LS(m,n); if the length of either M or N is 0, LS(m,n) is 0. Here ε and η are parameters preset by the system.
  • finally, the similarity S(m,n) of the two curves is the value of LS(m,n) divided by the minimum of the numbers of points in M and N, that is, S(m,n) equals LS(m,n) divided by min(m,n).
  • the value of S(m,n) ranges from 0 to 1; the closer the value is to 1, the more similar the two curves are.
  • the present invention provides a method, apparatus and system for liveness detection based on eye tracking: a display shows a moving ball to guide the user to gaze at it, and while the user gazes the camera captures a video of the user's eyes.
  • the liveness detection apparatus then obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result.
  • the motion trajectory matching method combines two main characteristics of the trajectory curves, namely slope and distance, for multi-dimensional analysis. It handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A method, an apparatus (3) and a system for liveness detection based on eye tracking. The system comprises a display (1), a camera (2) and an apparatus (3) for liveness detection based on eye tracking, wherein the apparatus (3) comprises a motion trajectory output module (10) for outputting a motion trajectory; the display (1) is configured to receive and display the motion trajectory; the camera (2) is configured to capture a video of the eyes while the current user gazes at the motion trajectory; the apparatus (3) further comprises a video acquisition device (20) for acquiring said video, an activity trajectory determination device (30) for drawing the activity trajectory of the user's eyeballs according to the video, and a liveness detection device (40) for matching the motion trajectory with the activity trajectory and outputting a liveness detection result. The method handles the translation and scale transformation of motion trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios; the invention is easy to use and employs existing face recognition equipment, requiring no new equipment to be installed.

Description

Method, Apparatus and System for Liveness Detection Based on Eye Tracking

Technical Field
The present invention relates to the technical field of biometric recognition, in particular to face recognition technology, and specifically to a method, apparatus and system for liveness detection based on eye tracking.
Background Art
At present, face recognition, as a new method of identity verification, has become a trend in the development of financial technology and can be widely applied in scenarios such as payment and transfer. However, with the rapid development and wide application of face recognition technology, its security faces severe challenges. In the prior art, the main threats in the field of face recognition generally come from a photograph of a legitimate user, a video of a legitimate user, or a 3D model of a legitimate user. Therefore, how to detect a living body is a very important task in the face recognition process.
Liveness detection mainly proves that a live person is in front of the camera, rather than a photograph, a video, or a 3D model. Compared with a real face, the face in a photograph is planar and suffers from defects such as quality loss and blurring introduced by secondary capture. The face in a video exhibits phenomena such as LCD reflections caused by the video player. The motion of a face formed by a 3D model is rigid motion. Liveness detection can effectively prevent related fraud and improve the security and practicality of identity verification based on face recognition.
Liveness detection schemes in the prior art suffer from technical drawbacks such as complex detection equipment (for example, thermal image sensors), high detection cost, low accuracy, and poor user experience. Therefore, how to research and develop a liveness detection scheme with high security and good user experience is a technical problem to be urgently solved in this field.
Summary of the Invention
In order to overcome the above technical problems of the prior art, the present invention provides a method, apparatus and system for liveness detection based on eye tracking. A display shows a moving ball to guide the user to gaze at it; while the user gazes, a camera captures a video of the user's eyes; then a liveness detection apparatus obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result. Because the matching of the motion trajectory and the activity trajectory combines two main characteristics of the trajectory curves, namely slope and distance, for multi-dimensional analysis, it handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios.
One object of the present invention is to provide a method for liveness detection based on eye tracking, the method comprising: outputting a motion trajectory, the motion trajectory being irregular; acquiring a video of the eyes while the current user gazes at the motion trajectory; drawing the activity trajectory of the user's eyeballs according to the video; and matching the motion trajectory with the activity trajectory and outputting a liveness detection result.
In a preferred embodiment of the present invention, the motion trajectory is a small ball moving randomly in different directions.
In a preferred embodiment of the present invention, drawing the activity trajectory of the user's eyeballs according to the video comprises: converting the video into a plurality of pictures; obtaining the relative position of the user's eyeball and eye in each picture; and drawing the activity trajectory of the user's eyeballs according to the relative positions.
In a preferred embodiment of the present invention, matching the motion trajectory with the activity trajectory and outputting a liveness detection result comprises: drawing a motion curve according to the motion trajectory; drawing an activity curve according to the activity trajectory; determining the matching degree between the motion curve and the activity curve; obtaining a preset matching degree threshold; and outputting a liveness detection result according to the matching degree and the matching degree threshold.
In a preferred embodiment of the present invention, determining the matching degree between the motion curve and the activity curve comprises: drawing a first curve according to the motion curve; drawing a second curve according to the activity curve; determining the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm; and determining the matching degree between the first curve and the second curve according to the length of the longest common subsequence.
In a preferred embodiment of the present invention, drawing the first curve according to the motion curve comprises: obtaining a preset sampling interval; sampling the motion curve according to the sampling interval to obtain a plurality of motion sampling points; determining in turn the slope between each pair of adjacent motion sampling points to form a motion slope sequence; and drawing the first curve according to the motion slope sequence.
In a preferred embodiment of the present invention, drawing the second curve according to the activity curve comprises: obtaining a preset sampling interval; sampling the activity curve according to the sampling interval to obtain a plurality of activity sampling points; determining in turn the slope between each pair of adjacent activity sampling points to form an activity slope sequence; and drawing the second curve according to the activity slope sequence.
In a preferred embodiment of the present invention, the length of the longest common subsequence of the first curve and the second curve is determined according to the recursive algorithm by the following formula:
LS(m,n) = 0, if the length of either M or N is 0; LS(m,n) = 1 + LS(Head(m), Head(n)), if the difference between a and b is less than η and the difference between ma and nb is less than ε; otherwise LS(m,n) = max(LS(Head(m), n), LS(m, Head(n)))
where m is an element of sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence; a is the length of the motion slope sequence M; n is an element of sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence; b is the length of the activity slope sequence N; LS(m,n) is the length of the longest common subsequence of M and N; Head(m) is the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1); Head(n) is the subsequence of N with the last element removed, i.e. (n1, n2, ... nb-1); and ε and η are preset parameters.
In a preferred embodiment of the present invention, the matching degree between the first curve and the second curve is determined according to the length of the longest common subsequence by the following formula:
S(m,n) = LS(m,n) / min(m,n)
where S(m,n) is the matching degree and min(m,n) is the minimum of m and n.
In a preferred embodiment of the present invention, outputting a liveness detection result according to the matching degree and the matching degree threshold comprises: judging whether the matching degree is greater than the matching degree threshold; if so, outputting a liveness detection result indicating that liveness detection of the current user has succeeded; otherwise, outputting a liveness detection result indicating that liveness detection of the current user has failed.
Another object of the present invention is to provide an apparatus for liveness detection based on eye tracking, the apparatus comprising: a motion trajectory output module for outputting a motion trajectory, the motion trajectory being irregular; a video acquisition module for acquiring a video of the eyes while the current user gazes at the motion trajectory; an activity trajectory determination module for drawing the activity trajectory of the user's eyeballs according to the video; and a trajectory matching module for matching the motion trajectory with the activity trajectory and outputting a liveness detection result.
In a preferred embodiment of the present invention, the motion trajectory is a small ball moving randomly in different directions.
In a preferred embodiment of the present invention, the activity trajectory determination module comprises: a conversion unit for converting the video into a plurality of pictures; a relative position acquisition unit for obtaining the relative position of the user's eyeball and eye in each picture; and an activity trajectory drawing unit for drawing the activity trajectory of the user's eyeballs according to the relative positions.
In a preferred embodiment of the present invention, the trajectory matching module comprises: a motion curve drawing unit for drawing a motion curve according to the motion trajectory; an activity curve drawing unit for drawing an activity curve according to the activity trajectory; a matching degree determination unit for determining the matching degree between the motion curve and the activity curve; a threshold acquisition unit for obtaining a preset matching degree threshold; and a detection result output unit for outputting a liveness detection result according to the matching degree and the matching degree threshold.
In a preferred embodiment of the present invention, the matching degree determination unit comprises: a first curve drawing unit for drawing a first curve according to the motion curve; a second curve drawing unit for drawing a second curve according to the activity curve; a longest common subsequence length determination unit for determining the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm; and a matching degree determination subunit for determining the matching degree between the first curve and the second curve according to the length of the longest common subsequence.
In a preferred embodiment of the present invention, the first curve drawing unit comprises: a first sampling interval acquisition unit for obtaining a preset sampling interval; a first sampling unit for sampling the motion curve according to the sampling interval to obtain a plurality of motion sampling points; a first slope determination unit for determining in turn the slope between each pair of adjacent motion sampling points to form a motion slope sequence; and a first curve drawing subunit for drawing the first curve according to the motion slope sequence.
In a preferred embodiment of the present invention, the second curve drawing unit comprises: a second sampling interval acquisition unit for obtaining a preset sampling interval; a second sampling unit for sampling the activity curve according to the sampling interval to obtain a plurality of activity sampling points; a second slope determination unit for determining in turn the slope between each pair of adjacent activity sampling points to form an activity slope sequence; and a second curve drawing subunit for drawing the second curve according to the activity slope sequence.
In a preferred embodiment of the present invention, the detection result output unit comprises: a judging unit for judging whether the matching degree is greater than the matching degree threshold; a first detection result output unit for outputting, when the judging unit judges yes, a liveness detection result indicating that liveness detection of the current user has succeeded; and a second detection result output unit for outputting, when the judging unit judges no, a liveness detection result indicating that liveness detection of the current user has failed.
Another object of the present invention is to provide a system for liveness detection based on eye tracking, the system comprising a display, a camera and an apparatus for liveness detection based on eye tracking, wherein the apparatus comprises: a motion trajectory output module for outputting a motion trajectory, the motion trajectory being irregular; the display is configured to receive and display the motion trajectory; the camera is configured to capture a video of the eyes while the current user gazes at the motion trajectory; and the apparatus further comprises: a video acquisition device for acquiring said video; an activity trajectory determination device for drawing the activity trajectory of the user's eyeballs according to the video; and a liveness detection device for matching the motion trajectory with the activity trajectory and outputting a liveness detection result.
The beneficial effects of the present invention are that it provides a method, apparatus and system for liveness detection based on eye tracking: a display shows a moving ball to guide the user to gaze at it; while the user gazes, a camera captures a video of the user's eyes; then a liveness detection apparatus obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result. Because the matching of the motion trajectory and the activity trajectory combines two main characteristics of the trajectory curves, namely slope and distance, for multi-dimensional analysis, it handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios; moreover, the invention is easy to use and employs existing face recognition equipment, requiring no new equipment to be installed.
To make the above and other objects, features and advantages of the present invention more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are merely some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of a method for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 2 is a specific flowchart of step S103 in FIG. 1;
FIG. 3 is a specific flowchart of step S104 in FIG. 1;
FIG. 4 is a specific flowchart of step S303 in FIG. 3;
FIG. 5 is a specific flowchart of step S401 in FIG. 4;
FIG. 6 is a specific flowchart of step S402 in FIG. 4;
FIG. 7 is a specific flowchart of step S305 in FIG. 3;
FIG. 8 is a schematic structural diagram of a system for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 10 is a connection scene diagram of a system for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 11 is a structural block diagram of the activity trajectory determination module 30 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 12 is a structural block diagram of the trajectory matching module 40 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 13 is a structural block diagram of the matching degree determination unit 43 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 14 is a structural block diagram of the first curve drawing unit 431 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 15 is a structural block diagram of the second curve drawing unit 432 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention;
FIG. 16 is a structural block diagram of the detection result output unit 45 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are merely some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Liveness detection mainly proves that a live person, rather than a photograph or a video, is in front of the camera. This work can effectively prevent related fraud and improve the security and practicality of identity verification based on face recognition. Eye tracking mainly studies the acquisition, modeling and simulation of eyeball movement information. As cameras have become widespread in ATMs, mobile phones, laptops, PCs and other devices, eye tracking has been widely used in scenarios such as driver fatigue detection and command control.
The present invention proposes a system for liveness detection based on eye tracking. FIG. 10 is a connection scene diagram of the system. As shown in FIG. 10, during face recognition the user's face faces the display, the eyes gaze at the display, and a camera at the display can clearly capture the user's face.
FIG. 8 is a schematic structural diagram of a system for liveness detection based on eye tracking according to an embodiment of the present invention. As shown in FIG. 8, the system comprises a display 1, a camera 2 and an apparatus 3 for liveness detection based on eye tracking.
FIG. 9 is a schematic structural diagram of an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention. As shown in FIG. 9, the apparatus 3 comprises a motion trajectory output module 10 for outputting a motion trajectory, the motion trajectory being irregular.
The display 1 is configured to receive and display the motion trajectory. In a specific embodiment, the motion trajectory may be formed by a small ball moving randomly in different directions: a ball appears on the display and moves randomly in all directions (for example the four directions up, down, left and right) at a constant or variable speed, so the motion trajectory is irregular.
The camera 2 is configured to capture a video of the eyes while the current user gazes at the motion trajectory; that is, while the display shows the motion trajectory, the user gazes at the ball and keeps the body and head still, and the camera captures a video of the user's eyes.
As shown in FIG. 9, the apparatus for liveness detection based on eye tracking further comprises:
a video acquisition device 20 for acquiring said video;
an activity trajectory determination device 30 for drawing the activity trajectory of the user's eyeballs according to the video (FIG. 11 is a structural block diagram of the activity trajectory determination module 30); and
a liveness detection device 40 for matching the motion trajectory with the activity trajectory and outputting a liveness detection result (FIG. 12 is a structural block diagram of the trajectory matching module 40).
FIG. 11 is a structural block diagram of the activity trajectory determination module 30 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention. As shown in FIG. 11, the activity trajectory determination module 30 comprises:
a conversion unit 31 for converting the video into a plurality of pictures;
a relative position acquisition unit 32 for obtaining the relative position of the user's eyeball and eye in each picture; and
an activity trajectory drawing unit 33 for drawing the activity trajectory of the user's eyeballs according to the relative positions.
FIG. 12 is a structural block diagram of the trajectory matching module 40 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention. As shown in FIG. 12, the trajectory matching module 40 comprises:
a motion curve drawing unit 41 for drawing a motion curve according to the motion trajectory;
an activity curve drawing unit 42 for drawing an activity curve according to the activity trajectory;
a matching degree determination unit 43 for determining the matching degree between the motion curve and the activity curve (FIG. 13 is a structural block diagram of the matching degree determination unit 43);
a threshold acquisition unit 44 for obtaining a preset matching degree threshold, the matching degree threshold being a preset parameter: the larger it is set, the lower the false acceptance rate, but the higher the false rejection rate, and vice versa; and
a detection result output unit 45 for outputting a liveness detection result according to the matching degree and the matching degree threshold (FIG. 16 is a structural block diagram of the detection result output unit 45).
FIG. 13 is a structural block diagram of the matching degree determination unit 43 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention. As shown in FIG. 13, the matching degree determination unit 43 comprises:
a first curve drawing unit 431 for drawing a first curve according to the motion curve. FIG. 14 is a structural block diagram of the first curve drawing unit 431. As shown in FIG. 14, the first curve drawing unit comprises:
a first sampling interval acquisition unit 4311 for obtaining a preset sampling interval. In a specific embodiment, the motion curve and the activity curve are sampled with identical sampling interval lengths. The shorter the interval, the more accurate the final result, but the more computation is required. In addition, the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required;
a first sampling unit 4312 for sampling the motion curve according to the sampling interval to obtain a plurality of motion sampling points;
a first slope determination unit 4313 for determining in turn the slope between each pair of adjacent motion sampling points to form a motion slope sequence; and
a first curve drawing subunit 4314 for drawing the first curve according to the motion slope sequence. In a specific embodiment, the motion slope sequence is plotted with time (or sample index) on the abscissa and slope on the ordinate, yielding the first curve in a new coordinate space.
In other embodiments of the present invention, when computing the slopes of the motion curve, the pairwise slopes may also be quantized: for example, the slope range [0, 180] degrees is divided into 36 bins of 5 degrees each, so a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on. This yields the motion slope sequence M. Plotting the quantized data with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields the first curve in a new coordinate space.
As shown in FIG. 13, the matching degree determination unit 43 further comprises:
a second curve drawing unit 432 for drawing a second curve according to the activity curve. FIG. 15 is a structural block diagram of the second curve drawing unit 432. As shown in FIG. 15, the second curve drawing unit comprises:
a second sampling interval acquisition unit 4321 for obtaining a preset sampling interval. In a specific embodiment, the motion curve and the activity curve are sampled with identical sampling interval lengths. The shorter the interval, the more accurate the final result, but the more computation is required. In addition, the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required;
a second sampling unit 4322 for sampling the activity curve according to the sampling interval to obtain a plurality of activity sampling points;
a second slope determination unit 4323 for determining in turn the slope between each pair of adjacent activity sampling points to form an activity slope sequence; and
a second curve drawing subunit 4324 for drawing the second curve according to the activity slope sequence. In a specific embodiment, the activity slope sequence is plotted with time (or sample index) on the abscissa and slope on the ordinate, yielding the second curve in a new coordinate space.
In other embodiments of the present invention, when computing the slopes of the activity curve, the pairwise slopes may also be quantized: for example, the slope range [0, 180] degrees is divided into 36 bins of 5 degrees each, so a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on. This yields the activity slope sequence N. Plotting the quantized data with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields the second curve in a new coordinate space.
As shown in FIG. 13, the matching degree determination unit 43 further comprises:
a longest common subsequence length determination unit 433 for determining the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm. The length of the longest common subsequence of the two curves measures their degree of coincidence. In a specific embodiment, the longest common subsequence may be computed by a recursive algorithm, for example by the following formula:
LS(m,n) = 0, if the length of either M or N is 0; LS(m,n) = 1 + LS(Head(m), Head(n)), if the difference between a and b is less than η and the difference between ma and nb is less than ε; otherwise LS(m,n) = max(LS(Head(m), n), LS(m, Head(n)))
where m is an element of sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence; a is the length of the motion slope sequence M; n is an element of sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence; b is the length of the activity slope sequence N; LS(m,n) is the length of the longest common subsequence of M and N; Head(m) is the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1); Head(n) is the subsequence of N with the last element removed, i.e. (n1, n2, ... nb-1); and ε and η are preset parameters.
That is, LS(m,n) denotes the length of the longest common subsequence of sequences M and N and measures the degree to which the two curves coincide. If the difference between a and b is less than the parameter η and the difference between ma and nb is less than the parameter ε, then LS(m,n) equals 1 + LS(Head(m), Head(n)), where Head(m) denotes the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), of length a-1, and Head(n) is defined analogously. If these conditions cannot both be satisfied, then before computing LS(m,n) the values of LS(Head(m), n) and LS(m, Head(n)) are computed, and the larger of the two is taken as the value of LS(m,n); if the length of either M or N is 0, LS(m,n) is 0.
The matching degree determination unit 43 also comprises a matching degree determination subunit 434 for determining the matching degree between the first curve and the second curve according to the length of the longest common subsequence. In a specific embodiment, S(m,n) is the value of LS(m,n) divided by the minimum of the numbers of points in M and N, i.e. S(m,n) equals LS(m,n) divided by min(m,n). The value of S(m,n) ranges from 0 to 1; the closer it is to 1, the more similar the two curves are. That is, by the following formula:
S(m,n) = LS(m,n) / min(m,n)
where S(m,n) is the matching degree and min(m,n) is the minimum of m and n.
FIG. 16 is a structural block diagram of the detection result output unit 45 in an apparatus for liveness detection based on eye tracking according to an embodiment of the present invention. As shown in FIG. 16, the detection result output unit comprises:
a judging unit 451 for judging whether the matching degree is greater than the matching degree threshold;
a first detection result output unit 452 for outputting, when the judging unit judges yes, a liveness detection result indicating that liveness detection of the current user has succeeded; and
a second detection result output unit 453 for outputting, when the judging unit judges no, a liveness detection result indicating that liveness detection of the current user has failed.
That is, if the matching degree is greater than the matching degree threshold, the match succeeds and liveness detection of the current user is judged successful; otherwise, a liveness detection failure is indicated.
The above is the apparatus and system for liveness detection based on eye tracking provided by the present invention: a display shows a moving ball to guide the user to gaze at it; while the user gazes, a camera captures a video of the user's eyes; then a liveness detection apparatus obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result. Because the matching of the motion trajectory and the activity trajectory combines two main characteristics of the trajectory curves, namely slope and distance, for multi-dimensional analysis, it handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios; moreover, the invention is easy to use and employs existing face recognition equipment, requiring no new equipment to be installed.
The present invention also provides a method for liveness detection based on eye tracking. FIG. 1 is a flowchart of the method. As shown in FIG. 1, the method comprises:
S101: Output a motion trajectory, the motion trajectory being irregular. In a specific embodiment, the motion trajectory may be formed by a small ball moving randomly in different directions: the ball moves randomly in all directions (for example the four directions up, down, left and right) at a constant or variable speed, so the motion trajectory is irregular.
S102: Acquire a video of the eyes while the current user gazes at the motion trajectory. In a specific embodiment, the user gazes at the ball and keeps the body and head still.
S103: Draw the activity trajectory of the user's eyeballs according to the video. FIG. 2 is a specific flowchart of step S103.
S104: Match the motion trajectory with the activity trajectory and output a liveness detection result. FIG. 3 is a specific flowchart of step S104.
FIG. 2 is a specific flowchart of step S103 in FIG. 1. As shown in FIG. 2, step S103 comprises:
S201: Convert the video into a plurality of pictures;
S202: Obtain the relative position of the user's eyeball and eye in each picture;
S203: Draw the activity trajectory of the user's eyeballs according to the relative positions.
FIG. 3 is a specific flowchart of step S104 in FIG. 1. As shown in FIG. 3, step S104 comprises:
S301: Draw a motion curve according to the motion trajectory;
S302: Draw an activity curve according to the activity trajectory;
S303: Determine the matching degree between the motion curve and the activity curve. FIG. 4 is a specific flowchart of step S303.
S304: Obtain a preset matching degree threshold. The matching degree threshold is a preset parameter: the larger it is set, the lower the false acceptance rate, but the higher the false rejection rate, and vice versa.
S305: Output a liveness detection result according to the matching degree and the matching degree threshold. FIG. 7 is a specific flowchart of step S305.
FIG. 4 is a specific flowchart of step S303 in FIG. 3. As shown in FIG. 4, step S303 comprises:
S401: Draw a first curve according to the motion curve. FIG. 5 is a specific flowchart of step S401. As shown in FIG. 5, step S401 comprises:
S501: Obtain a preset sampling interval. In a specific embodiment, the motion curve and the activity curve are sampled with identical sampling interval lengths. The shorter the interval, the more accurate the final result, but the more computation is required. In addition, the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required.
S502: Sample the motion curve according to the sampling interval to obtain a plurality of motion sampling points;
S503: Determine in turn the slope between each pair of adjacent motion sampling points to form a motion slope sequence;
S504: Draw the first curve according to the motion slope sequence. In a specific embodiment, the motion slope sequence is plotted with time (or sample index) on the abscissa and slope on the ordinate, yielding the first curve in a new coordinate space.
In other embodiments of the present invention, when computing the slopes of the motion curve, the pairwise slopes may also be quantized: for example, the slope range [0, 180] degrees is divided into 36 bins of 5 degrees each, so a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on. This yields the motion slope sequence M. Plotting the quantized data with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields the first curve in a new coordinate space.
As shown in FIG. 4, step S303 further comprises:
S402: Draw a second curve according to the activity curve. FIG. 6 is a specific flowchart of step S402. As shown in FIG. 6, step S402 comprises:
S601: Obtain a preset sampling interval. In a specific embodiment, the motion curve and the activity curve are sampled with identical sampling interval lengths. The shorter the interval, the more accurate the final result, but the more computation is required. In addition, the starting points of the motion curve and the activity curve are set to be the same; if the acquired sample data of the two curves were sampled at different time intervals, normalization is required.
S602: Sample the activity curve according to the sampling interval to obtain a plurality of activity sampling points;
S603: Determine in turn the slope between each pair of adjacent activity sampling points to form an activity slope sequence;
S604: Draw the second curve according to the activity slope sequence. In a specific embodiment, the activity slope sequence is plotted with time (or sample index) on the abscissa and slope on the ordinate, yielding the second curve in a new coordinate space.
In other embodiments of the present invention, when computing the slopes of the activity curve, the pairwise slopes may also be quantized: for example, the slope range [0, 180] degrees is divided into 36 bins of 5 degrees each, so a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on. This yields the activity slope sequence N. Plotting the quantized data with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields the second curve in a new coordinate space.
As shown in FIG. 4, step S303 further comprises:
S403: Determine the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm. This length measures the degree to which the two curves coincide. In a specific embodiment, the longest common subsequence may be computed by a recursive algorithm, for example by the following formula:
LS(m,n) = 0, if the length of either M or N is 0; LS(m,n) = 1 + LS(Head(m), Head(n)), if the difference between a and b is less than η and the difference between ma and nb is less than ε; otherwise LS(m,n) = max(LS(Head(m), n), LS(m, Head(n)))
where m is an element of sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence; a is the length of the motion slope sequence M; n is an element of sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence; b is the length of the activity slope sequence N; LS(m,n) is the length of the longest common subsequence of M and N; Head(m) is the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1); Head(n) is the subsequence of N with the last element removed, i.e. (n1, n2, ... nb-1); and ε and η are preset parameters.
That is, LS(m,n) denotes the length of the longest common subsequence of sequences M and N and measures the degree to which the two curves coincide. If the difference between a and b is less than the parameter η and the difference between ma and nb is less than the parameter ε, then LS(m,n) equals 1 + LS(Head(m), Head(n)), where Head(m) denotes the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), of length a-1, and Head(n) is defined analogously. If these conditions cannot both be satisfied, then before computing LS(m,n) the values of LS(Head(m), n) and LS(m, Head(n)) are computed, and the larger of the two is taken as the value of LS(m,n); if the length of either M or N is 0, LS(m,n) is 0.
S404: Determine the matching degree between the first curve and the second curve according to the length of the longest common subsequence. In a specific embodiment, S(m,n) is the value of LS(m,n) divided by the minimum of the numbers of points in M and N, i.e. S(m,n) equals LS(m,n) divided by min(m,n). The value of S(m,n) ranges from 0 to 1; the closer it is to 1, the more similar the two curves are. That is, by the following formula:
S(m,n) = LS(m,n) / min(m,n)
where S(m,n) is the matching degree and min(m,n) is the minimum of m and n.
FIG. 7 is a specific flowchart of step S305. As shown in FIG. 7, step S305 comprises:
S701: Judge whether the matching degree is greater than the matching degree threshold;
S702: If yes, output a liveness detection result indicating that liveness detection of the current user has succeeded;
S703: If no, output a liveness detection result indicating that liveness detection of the current user has failed.
That is, if the matching degree is greater than the matching degree threshold, the match succeeds and liveness detection of the current user is judged successful; otherwise, a liveness detection failure is indicated.
The above is the method for liveness detection based on eye tracking provided by the present invention: a display shows a moving ball to guide the user to gaze at it; while the user gazes, a camera captures a video of the user's eyes; then a liveness detection apparatus obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result. Because the matching combines two main characteristics of the trajectory curves, namely slope and distance, for multi-dimensional analysis, it handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios.
The technical solution of the present invention is described in detail below with reference to a specific embodiment. FIG. 10 is a connection scene diagram of a system for liveness detection based on eye tracking according to an embodiment of the present invention. In this solution, first, a moving ball is displayed on the display to guide the user to gaze at it while keeping the body and head still. During face recognition, the user's face faces the display and the eyes gaze at it; a camera at the display can clearly capture the user's face. While the user gazes, the camera captures a video of the user's eyes. Then the liveness detection apparatus obtains the eye movement trajectory from the video and matches it against the motion path of the moving ball; if the match succeeds, liveness detection of the current user is judged successful.
Specifically: 1) The display prompts the user with the upcoming operation rules and gives the user sufficient time to prepare; once the user has understood the rules, the user confirms.
2) A ball appears on the display and moves randomly up, down, left and right at a constant speed; the motion trajectory is irregular. During this period, the user is guided to gaze at the ball and keep the body and head still, while the camera captures a video of the user's eyes.
3) The liveness detection apparatus receives the captured video, converts it into a number of pictures, and draws the activity trajectory of the user's eyeballs according to the relative position of the eyeball and the eye, thereby determining the activity trajectory curve.
4) The liveness detection apparatus obtains the motion trajectory of the ball, determines the motion trajectory curve, and matches it against the activity trajectory curve of the user's eyeballs. The matching degree of the two curves is computed; if it is greater than the matching degree threshold, the match succeeds and liveness detection of the current user is judged successful; otherwise, a liveness detection failure is indicated. The matching degree threshold is a parameter set by the system: the larger it is set, the lower the false acceptance rate, but the higher the false rejection rate, and vice versa.
The process of computing the matching degree of the two trajectory curves in this step is relatively complex and is described in detail below. In computing the matching degree, the technical solution of the present invention combines two main characteristics of the trajectory curves, namely slope and distance, and analyzes these two dimensions: two curves are considered similar if their slope variations are close, and they are also considered similar if they largely coincide. Combining the two dimensions improves accuracy, solves the problems of translation and scale change between the two curves, and gives stronger applicability. The flow is as follows:
(1) The two curves are sampled with identical interval lengths. The shorter the interval, the more accurately the similarity is computed, but the more computation is required; in addition, the starting points of the two curves are set to be the same. If the input curve samples acquired by the system were taken at different time intervals, normalization is required.
(2) The slopes of the two curves are computed, taking the slope between each pair of points as reference. The slope range [0, 180] degrees is divided into 36 bins of 5 degrees each: a slope in the [0, 5) degree range takes the value 0, [5, 10) takes 1, [10, 15) takes 2, and so on. The system thus obtains two slope value sequences M and N. Plotting the quantized data with time (or index) on the abscissa and the value represented by the slope bin on the ordinate yields two curves in a new coordinate space.
(3) The length of the longest common subsequence of the two slope value sequences, i.e. the degree of coincidence of the two curves, is computed by a recursive algorithm. Given sequences M(m1, m2, ... ma) and N(n1, n2, ... nb), where a and b denote the lengths of M and N, LS(m,n) denotes the length of the longest common subsequence of M and N and measures the degree to which the two curves coincide. LS(m,n) is computed as follows: if condition 1 is satisfied (the difference between a and b is less than the parameter η) and condition 2 is satisfied (the difference between ma and nb is less than the parameter ε), then LS(m,n) equals 1 + LS(Head(m), Head(n)), where Head(m) denotes the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), of length a-1, and Head(n) is defined analogously. If condition 1 and condition 2 cannot both be satisfied, then before computing LS(m,n) the values of LS(Head(m), n) and LS(m, Head(n)) are computed, and the larger of the two is taken as the value of LS(m,n); if the length of either M or N is 0, LS(m,n) is 0. Here ε and η are parameters preset by the system.
(4) Finally, the similarity S(m,n) of the two curves is LS(m,n) divided by the minimum of the numbers of points in M and N, i.e. S(m,n) equals LS(m,n) divided by min(m,n). The value of S(m,n) ranges from 0 to 1; the closer it is to 1, the more similar the two curves are.
In summary, the present invention provides a method, apparatus and system for liveness detection based on eye tracking: a display shows a moving ball to guide the user to gaze at it; while the user gazes, a camera captures a video of the user's eyes; then a liveness detection apparatus obtains the activity trajectory of the eyeballs from the video and matches it against the motion trajectory of the moving ball, thereby generating a liveness matching result.
The beneficial effects of the present invention are:
1) Ease of use: existing face recognition equipment is used, and no new equipment needs to be installed.
2) The ball moves irregularly, giving strong resistance to attacks.
3) The motion trajectory matching method combines two main characteristics of the trajectory curves, namely slope and distance, for multi-dimensional analysis. It handles the translation and scale transformation of trajectory curves across different coordinate systems and is suitable for eyeball trajectory scenarios.
Those of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments can be implemented by a computer program instructing the relevant hardware. The program may be stored in a general computer-readable storage medium and, when executed, may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM), etc.
Those skilled in the art can also appreciate that whether the various functions listed in the embodiments of the present invention are implemented by hardware or software depends on the specific application and the design requirements of the overall system. Skilled persons may use various methods to implement the described functions for each specific application, but such implementations should not be understood as going beyond the protection scope of the embodiments of the present invention.
The principles and implementations of the present invention have been described herein with reference to specific embodiments. The description of the above embodiments is only intended to help understand the method of the present invention and its core idea; meanwhile, for those of ordinary skill in the art, there will be changes in the specific implementation and application scope in accordance with the idea of the present invention. In summary, the contents of this specification should not be understood as limiting the present invention.

Claims (21)

  1. A method for liveness detection based on eye tracking, characterized in that the method comprises:
    outputting a motion trajectory, the motion trajectory being irregular;
    acquiring a video of the eyes while the current user gazes at the motion trajectory;
    drawing the activity trajectory of the user's eyeballs according to the video;
    matching the motion trajectory with the activity trajectory and outputting a liveness detection result.
  2. The method according to claim 1, characterized in that the motion trajectory is a small ball moving randomly in different directions.
  3. The method according to claim 1, characterized in that drawing the activity trajectory of the user's eyeballs according to the video comprises:
    converting the video into a plurality of pictures;
    obtaining the relative position of the user's eyeball and eye in each picture;
    drawing the activity trajectory of the user's eyeballs according to the relative positions.
  4. The method according to claim 1 or 3, characterized in that matching the motion trajectory with the activity trajectory and outputting a liveness detection result comprises:
    drawing a motion curve according to the motion trajectory;
    drawing an activity curve according to the activity trajectory;
    determining the matching degree between the motion curve and the activity curve;
    obtaining a preset matching degree threshold;
    outputting a liveness detection result according to the matching degree and the matching degree threshold.
  5. The method according to claim 4, characterized in that determining the matching degree between the motion curve and the activity curve comprises:
    drawing a first curve according to the motion curve;
    drawing a second curve according to the activity curve;
    determining the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm;
    determining the matching degree between the first curve and the second curve according to the length of the longest common subsequence.
  6. The method according to claim 5, characterized in that drawing the first curve according to the motion curve comprises:
    obtaining a preset sampling interval;
    sampling the motion curve according to the sampling interval to obtain a plurality of motion sampling points;
    determining in turn the slope between each pair of adjacent motion sampling points to form a motion slope sequence;
    drawing the first curve according to the motion slope sequence.
  7. The method according to claim 6, characterized in that drawing the second curve according to the activity curve comprises:
    obtaining a preset sampling interval;
    sampling the activity curve according to the sampling interval to obtain a plurality of activity sampling points;
    determining in turn the slope between each pair of adjacent activity sampling points to form an activity slope sequence;
    drawing the second curve according to the activity slope sequence.
  8. The method according to claim 7, characterized in that the length of the longest common subsequence of the first curve and the second curve is determined according to the recursive algorithm by the following formula:
    LS(m,n) = 0, if the length of either M or N is 0; LS(m,n) = 1 + LS(Head(m), Head(n)), if the difference between a and b is less than η and the difference between ma and nb is less than ε; otherwise LS(m,n) = max(LS(Head(m), n), LS(m, Head(n)))
    where m is an element of sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence, a is the length of the motion slope sequence M, n is an element of sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence, b is the length of the activity slope sequence N, LS(m,n) is the length of the longest common subsequence of M and N, Head(m) is the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), Head(n) is the subsequence of N with the last element removed, i.e. (n1, n2, ... nb-1), and ε and η are preset parameters.
  9. The method according to claim 8, characterized in that the matching degree between the first curve and the second curve is determined according to the length of the longest common subsequence by the following formula:
    S(m,n) = LS(m,n) / min(m,n)
    where S(m,n) is the matching degree and min(m,n) is the minimum of m and n.
  10. The method according to claim 4, characterized in that outputting a liveness detection result according to the matching degree and the matching degree threshold comprises:
    judging whether the matching degree is greater than the matching degree threshold;
    if yes, outputting a liveness detection result indicating that liveness detection of the current user has succeeded;
    otherwise, outputting a liveness detection result indicating that liveness detection of the current user has failed.
  11. An apparatus for liveness detection based on eye tracking, characterized in that the apparatus comprises:
    a motion trajectory output module for outputting a motion trajectory, the motion trajectory being irregular;
    a video acquisition module for acquiring a video of the eyes while the current user gazes at the motion trajectory;
    an activity trajectory determination module for drawing the activity trajectory of the user's eyeballs according to the video;
    a trajectory matching module for matching the motion trajectory with the activity trajectory and outputting a liveness detection result.
  12. The apparatus according to claim 11, characterized in that the motion trajectory is a small ball moving randomly in different directions.
  13. The apparatus according to claim 11, characterized in that the activity trajectory determination module comprises:
    a conversion unit for converting the video into a plurality of pictures;
    a relative position acquisition unit for obtaining the relative position of the user's eyeball and eye in each picture;
    an activity trajectory drawing unit for drawing the activity trajectory of the user's eyeballs according to the relative positions.
  14. The apparatus according to claim 11 or 13, characterized in that the trajectory matching module comprises:
    a motion curve drawing unit for drawing a motion curve according to the motion trajectory;
    an activity curve drawing unit for drawing an activity curve according to the activity trajectory;
    a matching degree determination unit for determining the matching degree between the motion curve and the activity curve;
    a threshold acquisition unit for obtaining a preset matching degree threshold;
    a detection result output unit for outputting a liveness detection result according to the matching degree and the matching degree threshold.
  15. The apparatus according to claim 14, characterized in that the matching degree determination unit comprises:
    a first curve drawing unit for drawing a first curve according to the motion curve;
    a second curve drawing unit for drawing a second curve according to the activity curve;
    a longest common subsequence length determination unit for determining the length of the longest common subsequence of the first curve and the second curve according to a recursive algorithm;
    a matching degree determination subunit for determining the matching degree between the first curve and the second curve according to the length of the longest common subsequence.
  16. The apparatus according to claim 15, characterized in that the first curve drawing unit comprises:
    a first sampling interval acquisition unit for obtaining a preset sampling interval;
    a first sampling unit for sampling the motion curve according to the sampling interval to obtain a plurality of motion sampling points;
    a first slope determination unit for determining in turn the slope between each pair of adjacent motion sampling points to form a motion slope sequence;
    a first curve drawing subunit for drawing the first curve according to the motion slope sequence.
  17. The apparatus according to claim 16, characterized in that the second curve drawing unit comprises:
    a second sampling interval acquisition unit for obtaining a preset sampling interval;
    a second sampling unit for sampling the activity curve according to the sampling interval to obtain a plurality of activity sampling points;
    a second slope determination unit for determining in turn the slope between each pair of adjacent activity sampling points to form an activity slope sequence;
    a second curve drawing subunit for drawing the second curve according to the activity slope sequence.
  18. The apparatus according to claim 17, characterized in that the longest common subsequence length determination unit operates by the following formula:
    LS(m,n) = 0, if the length of either M or N is 0; LS(m,n) = 1 + LS(Head(m), Head(n)), if the difference between a and b is less than η and the difference between ma and nb is less than ε; otherwise LS(m,n) = max(LS(Head(m), n), LS(m, Head(n)))
    where m is an element of sequence M, i.e. M(m1, m2, ... ma) is the motion slope sequence, a is the length of the motion slope sequence M, n is an element of sequence N, i.e. N(n1, n2, ... nb) is the activity slope sequence, b is the length of the activity slope sequence N, LS(m,n) is the length of the longest common subsequence of M and N, Head(m) is the subsequence of M with the last element removed, i.e. (m1, m2, ... ma-1), Head(n) is the subsequence of N with the last element removed, i.e. (n1, n2, ... nb-1), and ε and η are preset parameters.
  19. The apparatus according to claim 18, characterized in that the matching degree determination subunit operates by the following formula:
    S(m,n) = LS(m,n) / min(m,n)
    where S(m,n) is the matching degree and min(m,n) is the minimum of m and n.
  20. The apparatus according to claim 14, characterized in that the detection result output unit comprises:
    a judging unit for judging whether the matching degree is greater than the matching degree threshold;
    a first detection result output unit for outputting, when the judging unit judges yes, a liveness detection result indicating that liveness detection of the current user has succeeded;
    a second detection result output unit for outputting, when the judging unit judges no, a liveness detection result indicating that liveness detection of the current user has failed.
  21. A system for liveness detection based on eye tracking, characterized in that the system comprises a display, a camera and the apparatus for liveness detection based on eye tracking according to any one of claims 11 to 20,
    wherein the apparatus comprises: a motion trajectory output module for outputting a motion trajectory, the motion trajectory being irregular;
    the display is configured to receive and display the motion trajectory;
    the camera is configured to capture a video of the eyes while the current user gazes at the motion trajectory;
    the apparatus further comprises:
    a video acquisition device for acquiring said video;
    an activity trajectory determination device for drawing the activity trajectory of the user's eyeballs according to the video;
    a liveness detection device for matching the motion trajectory with the activity trajectory and outputting a liveness detection result.
PCT/CN2016/106248 2015-11-30 2016-11-17 一种基于眼球跟踪的活体检测的方法、装置及系统 WO2017092573A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510854243.6 2015-11-30
CN201510854243.6A CN105930761A (zh) 2015-11-30 2015-11-30 一种基于眼球跟踪的活体检测的方法、装置及系统

Publications (1)

Publication Number Publication Date
WO2017092573A1 true WO2017092573A1 (zh) 2017-06-08

Family

ID=56839971

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/106248 WO2017092573A1 (zh) 2015-11-30 2016-11-17 一种基于眼球跟踪的活体检测的方法、装置及系统

Country Status (3)

Country Link
CN (1) CN105930761A (zh)
TW (1) TWI707243B (zh)
WO (1) WO2017092573A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108229120A (zh) * 2017-09-07 2018-06-29 北京市商汤科技开发有限公司 人脸解锁及其信息注册方法和装置、设备、程序、介质

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105930761A (zh) * 2015-11-30 2016-09-07 中国银联股份有限公司 一种基于眼球跟踪的活体检测的方法、装置及系统
DE112016007124T5 (de) * 2016-09-08 2019-05-16 Ford Motor Company Verfahren und Vorrichtungen zur Überwachung eines Aktivitätsniveaus eines Fahrers
CN108875469A (zh) * 2017-06-14 2018-11-23 北京旷视科技有限公司 活体检测与身份认证的方法、装置及计算机存储介质
CN109063448A (zh) * 2018-08-20 2018-12-21 中国联合网络通信集团有限公司 身份验证方法和系统
CN110909704A (zh) * 2019-11-29 2020-03-24 北京奇艺世纪科技有限公司 一种活体检测方法、装置、电子设备及存储介质
CN113837326B (zh) * 2021-11-30 2022-03-25 自然资源部第一海洋研究所 一种基于特征曲线的机载激光测深数据配准方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1924892A (zh) * 2006-09-21 2007-03-07 杭州电子科技大学 虹膜识别中的活体检测方法及装置
US8235529B1 (en) * 2011-11-30 2012-08-07 Google Inc. Unlocking a screen using eye tracking information
CN102842040A (zh) * 2012-08-13 2012-12-26 高艳玲 运用眼球跟踪进行活体化检测的方法
CN105930761A (zh) * 2015-11-30 2016-09-07 中国银联股份有限公司 一种基于眼球跟踪的活体检测的方法、装置及系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102722556B (zh) * 2012-05-29 2014-10-22 清华大学 一种基于相似性度量的模型比对方法
CN103942542A (zh) * 2014-04-18 2014-07-23 重庆卓美华视光电有限公司 人眼跟踪方法及装置
CN104966086B (zh) * 2014-11-14 2017-10-13 深圳市腾讯计算机系统有限公司 活体鉴别方法及装置
CN105072431A (zh) * 2015-07-28 2015-11-18 上海玮舟微电子科技有限公司 一种基于人眼跟踪的裸眼3d播放方法及系统

Also Published As

Publication number Publication date
TWI707243B (zh) 2020-10-11
CN105930761A (zh) 2016-09-07
TW201719477A (zh) 2017-06-01

Similar Documents

Publication Publication Date Title
WO2017092573A1 (zh) Eye-tracking-based liveness detection method, apparatus and system
US10515199B2 (en) Systems and methods for facial authentication
Wang et al. Scene flow to action map: A new representation for rgb-d based action recognition with convolutional neural networks
CN108197586B (zh) Face recognition method and apparatus
US8515124B2 (en) Method and apparatus for determining fake image
JP2018160237A (ja) Face authentication method and apparatus
US20140254891A1 (en) Method and apparatus for registering face images, and apparatus for inducing pose change, and apparatus for recognizing faces
CN109766785B (zh) Face liveness detection method and apparatus
Alletto et al. From ego to nos-vision: Detecting social relationships in first-person views
US10254831B2 (en) System and method for detecting a gaze of a viewer
US20210027080A1 (en) Spoof detection by generating 3d point clouds from captured image frames
CN112149615B (zh) Face liveness detection method and apparatus, medium, and electronic device
Smith-Creasey et al. Continuous face authentication scheme for mobile devices with tracking and liveness detection
US9280209B2 (en) Method for generating 3D coordinates and mobile terminal for generating 3D coordinates
US9323989B2 (en) Tracking device
KR20150127381A (ko) Facial landmark extraction method and apparatus for performing the same
US20140232748A1 (en) Device, method and computer readable recording medium for operating the same
WO2018103416A1 (zh) Detection method and apparatus for face images
KR20150065445A (ko) Frontal face detection apparatus and method using face pose
KR20130043366A (ko) Gaze tracking device, display device using the same, and method thereof
Xu et al. Sensor-assisted face recognition system on smart glass via multi-view sparse representation classification
WO2023168957A1 (zh) Pose determination method and apparatus, electronic device, storage medium, and program
Zavan et al. Benchmarking parts based face processing in-the-wild for gender recognition and head pose estimation
WO2023015938A1 (zh) Three-dimensional point detection method and apparatus, electronic device, and storage medium
US20210295016A1 (en) Living body recognition detection method, medium and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 16869883
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 16869883
Country of ref document: EP
Kind code of ref document: A1