CN109522868B - Method and device for detecting blink - Google Patents

Method and device for detecting blink

Info

Publication number
CN109522868B
CN109522868B
Authority
CN
China
Prior art keywords
frame
eye
eye image
image
blinking
Prior art date
Legal status
Active
Application number
CN201811455963.5A
Other languages
Chinese (zh)
Other versions
CN109522868A (en)
Inventor
王钦民
王云飞
黄通兵
Current Assignee
Beijing 7Invensun Technology Co Ltd
Original Assignee
Beijing 7Invensun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing 7Invensun Technology Co Ltd
Priority to CN201811455963.5A
Publication of CN109522868A
Application granted
Publication of CN109522868B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/193: Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a method and a device for detecting blinking. The method comprises the following steps: obtaining at least the eye features of a first frame eye image and a second frame eye image, and a detection flag of the first frame eye image indicating its open or closed eye state; based on the detection flag, first judging whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image; if yes, judging whether the eye features of the second frame eye image satisfy a second preset condition; and if so, determining that the second frame eye image is a blink state image. Because an actual blink consists of the upper eyelid moving up and down while the pupil remains substantially stationary, the first preset condition and the second preset condition related to blinking are set accordingly, and whether an eye image is in the blinking state is judged based on whether the eye features in two successive eye images meet these blinking conditions. The method therefore has good robustness, reduces the detection error rate, and improves the detection accuracy.

Description

Method and device for detecting blink
Technical Field
The present application relates to the field of image recognition and analysis technologies, and in particular, to a method and an apparatus for detecting blinking.
Background
With the rapid development of biological information recognition technology and image recognition technology, blink detection based on image processing technology is urgently needed in many fields, such as the field of Virtual Reality (VR), the field of Augmented Reality (AR), the field of artificial intelligence, and the like.
In the prior art, blink detection is generally performed with simple threshold or logic judgments on an eye image, for example, determining whether the eye center lies within the current connected region. However, the inventors have found that such methods do not capture the essence of blinking: it is difficult to catch the actual moment of a blink, and in particular it is difficult to capture the entire blinking process when the frame rate is low, so detection errors easily occur.
Disclosure of Invention
The technical problem to be solved by the application is to provide a method and a device for detecting blinking, which judge whether the eye is in a blinking state based on whether the upper eyelid edge and pupil features in the eye images satisfy the blinking conditions, thereby achieving good robustness, reducing the detection error rate and improving the detection accuracy.
In a first aspect, an embodiment of the present application provides a method for detecting an eye blink, including:
obtaining at least the eye features of a first frame eye image and a second frame eye image, and a detection flag of the first frame eye image, wherein the detection flag is used for representing the open or closed eye state of the eye image;
judging whether the eye features of the second frame of eye image meet a first preset condition relative to the eye features of the first frame of eye image or not based on the detection mark of the first frame of eye image;
if yes, judging whether the eye features of the second frame of eye image meet a second preset condition;
if so, determining that the second frame eye image is a blink state image.
Optionally, the eye features include the upper eyelid edge and the pupil position.
Optionally, the first frame eye image and the second frame eye image are obtained by preprocessing two successive frames of eye movement images according to a preset image preprocessing mode, where the preset image preprocessing mode at least includes gray histogram processing, erosion processing and binarization processing; the upper eyelid edge is a gray value boundary point in the first frame eye image and the second frame eye image.
Optionally, the first preset condition is a blinking eyelid movement condition, where the blinking eyelid movement condition includes that the upper eyelid edge of the second frame of eye image moves downward by a first preset distance relative to the upper eyelid edge of the first frame of eye image or the upper eyelid edge of the second frame of eye image moves upward by a second preset distance relative to the upper eyelid edge of the first frame of eye image when the detection flag of the first frame of eye image is the eye closing completion flag.
Optionally, the second preset condition is a blinking pupil position condition or a blinking pupil position moving condition.
Optionally, the blinking pupil position condition includes that the pupil position of the second frame of eye image is invalid.
Optionally, the blinking pupil position moving condition includes that pupil displacement of the pupil position of the second frame of eye image relative to the pupil position of the first frame of eye image is smaller than a third preset distance under the condition that the pupil position in the second frame of eye image is valid.
Optionally, the method further includes:
and if the eye features of the second frame of eye image do not meet a first preset condition relative to the eye features of the first frame of eye image, determining that the second frame of eye image is a non-blinking state image.
Optionally, the method further includes:
and if the eye features of the second frame of eye images do not meet a second preset condition, determining that the second frame of eye images are non-blinking state images.
In a second aspect, an embodiment of the present application provides an apparatus for detecting an eye blink, including:
a first obtaining unit, configured to obtain at least a first frame eye image, an eye feature of a second frame eye image, and a detection flag of the first frame eye image, where the detection flag is used to indicate an open/close eye state of the eye image;
a first judging unit, configured to judge, based on the detection flag of the first frame eye image, whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image, and if yes, to trigger the second judging unit;
a second judging unit, configured to judge whether the eye features of the second frame eye image satisfy a second preset condition, and if yes, to trigger the first determining unit;
and the first determining unit is used for determining the second frame eye image as a blink state image.
Compared with the prior art, the method has the advantages that:
by adopting the technical scheme of the embodiment of the application, the eye characteristics of at least a first frame of eye image and a second frame of eye image and the detection marks of the first frame of eye image representing the open-close eye state of the eye image are obtained; based on the detection mark of the first frame of eye image, firstly judging whether the eye features of the second frame of eye image meet a first preset condition relative to the eye features of the first frame of eye image; if yes, judging whether the eye features of the second frame of eye image meet a second preset condition; and if so, determining that the second frame eye image is a blink state image. Therefore, according to the fact that the actual blinking characteristics in the blinking process are the up-down movement characteristics of the upper eyelid and the basic pupil immobility characteristics, the first preset condition and the second preset condition related to blinking are set, whether the eye images are in the blinking state is judged based on whether the eye characteristics in the front and back eye images meet the blinking conditions, robustness and robustness are good, the detection error rate is reduced, and the detection accuracy is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the description of the embodiments of the present application will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present application, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of a system framework related to an application scenario in an embodiment of the present application;
fig. 2 is a flowchart illustrating a method for detecting blinking according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another method for detecting a blink according to an embodiment of the application;
fig. 4 is a schematic structural diagram of an apparatus for detecting a blink according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
At present, blink detection based on image processing technology is urgently needed in many fields, such as the VR field, the AR field and the artificial intelligence field. Blink detection is generally performed with simple threshold or logic judgments on an eye image, for example, determining whether the eye center lies within the current connected region. However, the inventors have found that such methods do not capture the essence of blinking: it is difficult to catch the actual moment of a blink, and in particular it is difficult to capture the entire blinking process when the frame rate is low, so detection errors easily occur.
In order to solve this problem, in the embodiments of the present application, at least the eye features of a first frame eye image and a second frame eye image, and a detection flag of the first frame eye image indicating its open or closed eye state, are obtained; based on the detection flag of the first frame eye image, it is first judged whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image; if yes, it is judged whether the eye features of the second frame eye image satisfy a second preset condition; and if so, the second frame eye image is determined to be a blink state image. Because an actual blink consists of the upper eyelid moving up and down while the pupil remains substantially stationary, the first preset condition and the second preset condition related to blinking are set accordingly, and whether an eye image is in the blinking state is judged based on whether the eye features in two successive eye images meet these blinking conditions. The method therefore has good robustness, reduces the detection error rate, and improves the detection accuracy.
For example, one scenario of the embodiments of the present application may be the scenario shown in fig. 1, which includes a processor 101 and an image device 102. The image device 102 is a device that acquires eye movement images, such as a camera, a VR device or an AR device. The image device 102 acquires eye movement images and transmits them to the processor 101, and the processor 101 implements the method for detecting blinking of the embodiments of the application.
It is to be understood that, in the above application scenarios, although the actions of the embodiments of the present application are described as being performed by the processor 101, the present application is not limited in terms of the subject of execution as long as the actions disclosed in the embodiments of the present application are performed.
It is to be understood that the above scenario is only one example of a scenario provided in the embodiment of the present application, and the embodiment of the present application is not limited to this scenario.
The following describes in detail a specific implementation manner of the method and apparatus for detecting a blink according to the embodiment of the present application, by way of example, with reference to the accompanying drawings.
Exemplary method
Referring to fig. 2, a flowchart of a method for detecting a blink in an embodiment of the application is shown. In this embodiment, the method may include, for example, the steps of:
step 201: the method comprises the steps of obtaining at least a first frame eye image, eye characteristics of a second frame eye image and detection marks of the first frame eye image, wherein the detection marks are used for representing the open and closed eye states of the eye images.
It will be appreciated that, physically, a blink is essentially the upper eyelid moving up and down while the pupil remains substantially stationary. In a sequence of eye images, the up-and-down movement of the upper eyelid appears as a change of the upper eyelid edge (i.e., the lowest edge of the upper eyelid) between frames, and pupil movement appears as a change of the pupil position between frames. Therefore, at least the upper eyelid edge and the pupil position of two successive frames, namely the first frame eye image and the second frame eye image, need to be obtained. The detection flag of the first frame eye image also needs to be obtained so that the change of the eye features of the second frame eye image relative to those of the first frame eye image can be judged accurately; this detection flag is the open or closed eye state of the first frame eye image, obtained by analysis based on the open or closed eye states (i.e., the detection flags) of the preceding frames. Thus, in some implementations of this embodiment, the eye features include the upper eyelid edge and the pupil position.
It should be noted that the two acquired successive eye movement images have a high resolution, in which the upper eyelid edge is difficult to determine directly. To obtain the upper eyelid edge accurately, conveniently and quickly, a preset image preprocessing mode including gray histogram processing, erosion processing and binarization processing can be configured in advance, and the first frame eye image and the second frame eye image are obtained through this preprocessing. Therefore, in some implementations of this embodiment, the first frame eye image and the second frame eye image are obtained by preprocessing two successive frames of eye movement images according to a preset image preprocessing mode, where the preset image preprocessing mode at least includes gray histogram processing, erosion processing and binarization processing.
Specifically, preprocessing the two successive frames of eye movement images according to the preset image preprocessing mode may, for example, include the following steps:
step A: and respectively carrying out gray level histogram processing on the front and rear eye moving images to obtain the minimal gray level threshold values of the eyelids corresponding to the front and rear eye moving images.
The purpose of the gray histogram processing is to obtain the minimum eyelid gray threshold in each of the two frames of eye movement images, so that this threshold can later serve as the reference value for binarization and the eyelid can be distinguished from the rest of the image. Gray histogram processing means counting the number of pixels at each gray value in an image and analyzing how that count changes from low gray values to high gray values. For an eye image, the lowest gray value at which the pixel count changes abruptly is the minimum pupil gray threshold. Based on a study of the gray values of many eye images, there is a consistent difference between the pupil gray value and the eyelid gray value; this difference is taken as a preset difference, so once the minimum pupil gray threshold is obtained, the minimum eyelid gray threshold can be derived from it and the preset difference. Thus, step A may, for example, include the following steps:
step A1: respectively counting the number of pixel points of each gray value in the eye moving images of the front frame and the back frame;
step A2: acquiring pupil minimum gray threshold values corresponding to the front and rear eye moving images according to the change of the number of gray value pixel points in the front and rear eye moving images;
step A3: and obtaining the minimum gray threshold of the eyelid corresponding to the front and rear eye moving images according to the minimum gray threshold of the pupil corresponding to the front and rear eye moving images and a preset difference value.
Step B: perform erosion processing on each of the two frames of eye movement images to obtain two eroded eye movement images.
The purpose of the erosion processing is to filter out the light spots (glints) in the two frames of eye movement images, so that the connected region of the pupil and iris is complete, the subsequent binarization is more accurate, and a light spot is not mistaken for the upper eyelid.
Step C: perform binarization on the two eroded eye movement images based on their corresponding minimum eyelid gray thresholds to obtain the first frame eye image and the second frame eye image.
Specifically, binarization based on the minimum eyelid gray threshold sets the pixels of the eroded eye movement images whose gray value is greater than or equal to the minimum eyelid gray threshold to 0, and the pixels whose gray value is less than the minimum eyelid gray threshold to 255. As a result, in the binarized first frame eye image and second frame eye image the eyelid part is 0 (black) and the rest is 255 (white).
Naturally, before step A, the two frames of eye movement images may each be reduced (down-sampled). The reduction ratio is set according to actual conditions so that the reduced images are not distorted; for example, each frame can be reduced to one quarter of the original eye movement image. This removes noise, shortens the preprocessing time and improves preprocessing efficiency.
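The preprocessing pipeline of steps A to C can be sketched as follows. This is only a minimal illustration, assuming 8-bit grayscale infrared eye images; the preset pupil-to-eyelid gray difference (PRESET_DIFF), the heuristic used to detect the abrupt rise in the histogram (JUMP_RATIO), the erosion kernel and the reduction ratio are all assumed values that the patent does not specify.

```python
import cv2
import numpy as np

PRESET_DIFF = 60   # assumed pupil-to-eyelid gray difference (hypothetical value)
JUMP_RATIO = 5.0   # assumed factor defining an "abrupt" rise in the histogram

def eyelid_threshold(gray):
    """Step A (A1-A3): estimate the minimum eyelid gray threshold from the gray histogram."""
    hist = cv2.calcHist([gray], [0], None, [256], [0, 256]).ravel()
    pupil_thr = 0
    for g in range(1, 256):
        # lowest gray value at which the pixel count rises abruptly -> pupil threshold
        if hist[g] > JUMP_RATIO * max(hist[g - 1], 1.0):
            pupil_thr = g
            break
    return pupil_thr + PRESET_DIFF

def preprocess(frame):
    """Down-sample, erode and binarize one eye movement image (steps A to C)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) if frame.ndim == 3 else frame
    small = cv2.resize(gray, None, fx=0.5, fy=0.5)        # reduce to a quarter of the area
    thr = eyelid_threshold(small)                          # step A
    eroded = cv2.erode(small, np.ones((3, 3), np.uint8))   # step B: suppress light spots
    # step C: eyelid (>= threshold) -> 0 (black), the rest -> 255 (white)
    return np.where(eroded >= thr, 0, 255).astype(np.uint8)
```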
Based on the above, since the eyelid portion in the first frame eye image and the second frame eye image is set to 0 (black) and the rest to 255 (white), the upper eyelid edge is the first gray value boundary point encountered when scanning the image from top to bottom. Therefore, in some implementations of this embodiment, the upper eyelid edge in the first frame eye image and the second frame eye image refers to a gray value boundary point in those images, as sketched below.
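The following sketch shows one way the upper eyelid edge could be read off a binarized frame produced above; it is an illustration, not the patent's exact procedure, and it summarizes the edge as a single row index.

```python
def upper_eyelid_edge(binary):
    """Return the row index of the first black-to-white boundary from the top.

    binary is a 2-D uint8 NumPy array in which the eyelid is 0 (black) and
    the rest is 255 (white); the first row where white pixels appear directly
    below black pixels marks the upper eyelid edge. Returns None if no
    boundary is found.
    """
    h, _ = binary.shape
    for row in range(1, h):
        transition = (binary[row] == 255) & (binary[row - 1] == 0)
        if transition.any():
            return row
    return None
```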
Step 202: based on the detection flag of the first frame eye image, judge whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image; if yes, go to step 203.
It is to be understood that, since a blink essentially consists of the upper eyelid moving up and down while the pupil remains substantially stationary, detecting whether the second frame eye image is a blink state image should start by determining whether the upper eyelid in the second frame eye image has moved down or up relative to the upper eyelid in the first frame eye image. Specifically, the blinking eyelid movement condition is preset as the first preset condition based on the up-and-down movement characteristic of the upper eyelid during a blink. In practice, it is therefore judged whether the eye features of the second frame eye image satisfy this condition relative to the eye features of the first frame eye image, i.e., whether the upper eyelid in the second frame eye image has moved downward or upward relative to the upper eyelid in the first frame eye image; only if the condition is satisfied is step 203 executed.
It should be noted that a blink consists of an eye-closing phase followed by an eye-opening phase: when a blink occurs, the upper eyelid first moves downward and then moves upward. For two successive frames, either the upper eyelid edge of the second frame moves downward relative to that of the first frame, or, after the eye has finished closing, it moves upward; both cases match the eyelid movement characteristic of a blink. Whether the eyelid has moved is decided against a preset distance: movement is recognized only when the displacement exceeds that distance. Therefore, in some implementations of this embodiment, the blinking eyelid movement condition includes that the upper eyelid edge of the second frame eye image has moved downward by a first preset distance relative to the upper eyelid edge of the first frame eye image, or, when the detection flag of the first frame eye image is the eye-closing completion flag, that the upper eyelid edge of the second frame eye image has moved upward by a second preset distance relative to the upper eyelid edge of the first frame eye image.
It should also be noted that the first preset condition may not be satisfied: for example, the upper eyelid edge in the second frame eye image has not moved downward by the first preset distance relative to the upper eyelid edge in the first frame eye image, and at the same time either the detection flag of the first frame eye image is not the eye-closing completion flag or the upper eyelid edge has not moved upward by the second preset distance. In that case the second frame eye image is considered a non-blinking state image. Therefore, in some implementations of this embodiment, the method further comprises: if the eye features of the second frame eye image do not satisfy the first preset condition relative to the eye features of the first frame eye image, determining that the second frame eye image is a non-blinking state image.
Specifically, when the movement of the upper eyelid edge in the second frame eye image relative to that in the first frame eye image does not satisfy the first preset condition, the second frame eye image may be classified, based on whether its pupil position is valid (i.e., whether a pupil is detected in it), as either an eye-closing state image or an undetermined state image.
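A minimal sketch of the first-condition check of step 202, assuming the upper eyelid edge is summarized by a single row coordinate per frame (as in the earlier sketch) and that D1 and D2 are hypothetical preset distances in pixels, since the patent gives no concrete values:

```python
D1 = 8   # first preset distance (downward movement), hypothetical value in pixels
D2 = 8   # second preset distance (upward movement), hypothetical value in pixels

def first_condition(edge_prev, edge_curr, prev_flag_closed):
    """Blinking eyelid movement condition.

    edge_prev / edge_curr: upper eyelid edge rows of the first / second frame
    (a larger row index means lower in the image).
    prev_flag_closed: True if the detection flag of the first frame eye image
    is the eye-closing completion flag.
    """
    if edge_prev is None or edge_curr is None:
        return False                    # assumed handling when no edge is found
    moved_down = (edge_curr - edge_prev) >= D1
    moved_up = prev_flag_closed and (edge_prev - edge_curr) >= D2
    return moved_down or moved_up
```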
Step 203: judge whether the eye features of the second frame eye image satisfy a second preset condition; if yes, go to step 204.
It is understood that the blinking pupil position condition and the blinking pupil position movement condition are preset based on the characteristic that the pupil does not substantially move during blinking. In some embodiments of this embodiment, the blinking pupil position condition or the blinking pupil position movement condition is used as a second preset condition.
In practical applications, after it has been determined that the upper eyelid actually moved downward or upward, it is judged, based on the pupil positions in the first frame eye image and the second frame eye image, whether either of these two conditions is satisfied, i.e., whether the pupil is substantially stationary; the subsequent step 204 is performed only if one of them is satisfied.
It should be noted that the pupil may be detected in the second frame eye image (the pupil position is valid) or not detected (the pupil position is invalid). An undetected pupil is consistent with the characteristic that the pupil does not substantially move during a blink, so the preset blinking pupil position condition includes that the pupil position in the second frame eye image is invalid. If the pupil position in the second frame eye image is valid, the pupil should have moved only slightly relative to its position in the first frame eye image; whether the movement is negligible is decided against a preset distance, and a displacement smaller than that distance is also consistent with the characteristic that the pupil does not substantially move during a blink. Accordingly, the blinking pupil position movement condition includes that, when the pupil position in the second frame eye image is valid, the pupil displacement relative to the pupil position of the first frame eye image is smaller than a third preset distance.
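A corresponding sketch of the second-condition check of step 203, with a hypothetical third preset distance D3; pupil positions are (x, y) tuples, or None when the pupil position is invalid:

```python
import math

D3 = 5   # third preset distance for pupil displacement, hypothetical value in pixels

def second_condition(pupil_prev, pupil_curr):
    """Blinking pupil position condition / blinking pupil position movement condition."""
    if pupil_curr is None:
        return True                 # pupil position invalid -> condition satisfied
    if pupil_prev is None:
        return False                # cannot measure displacement (assumed handling)
    dx = pupil_curr[0] - pupil_prev[0]
    dy = pupil_curr[1] - pupil_prev[1]
    return math.hypot(dx, dy) < D3  # pupil essentially stationary
```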
The determination of valid or invalid pupil position is as follows:
the pupil position may be obtained by eye tracking, which may be referred to as eye gaze tracking, which is a technique for estimating the gaze and/or fixation point of the eye by measuring the movement of the eye.
The eye tracking module may include: a light source, which may be an infrared LED light source, since infrared light does not affect the user's vision; the infrared LED light sources may comprise two or more groups, each containing at least one LED, and may be arranged in a preset pattern such as a circle or a triangle. It may further include an image acquisition device, such as an infrared camera, an infrared image sensor, a camera or a video camera, which captures the eye images required for eye tracking; and a driving device, which drives the image acquisition device and the infrared LED light source, performs image preprocessing, and communicates with other devices.
in this embodiment, specifically, the infrared LED light source irradiates the eye to be detected, the image acquisition device shoots the eye to be detected, and shoots a reflection point, i.e., a light spot, of the infrared LED light source on the cornea; when the eyeball finishes blinking, the relative position relationship between the pupil center and the light spot is changed, and the position change relationship can be reflected by a plurality of correspondingly acquired eye images with the light spot; the pupil position can be determined by analyzing the position relationship between the spot characteristic and the pupil characteristic in the eye image; when the user image does not contain light spots and the sight of the user cannot be determined according to a pre-constructed sight model, determining that the pupil position is invalid; wherein the pre-constructed gaze model may be a model determined when the user first uses the image capture device for eye tracking.
It should further be noted that the second preset condition may not be satisfied either, i.e., neither the blinking pupil position condition nor the blinking pupil position movement condition holds: for example, the pupil position in the second frame eye image is valid and its displacement relative to the pupil position in the first frame eye image is greater than the third preset distance. In that case the second frame eye image is considered a non-blinking state image. Therefore, in some implementations of this embodiment, the method further comprises: if the eye features of the second frame eye image do not satisfy the second preset condition, determining that the second frame eye image is a non-blinking state image.
Step 204: determine that the second frame eye image is a blink state image.
It can be understood that when the judgment of step 202 is satisfied, step 203 is performed, and when the judgment of step 203 is also satisfied, the upper eyelid edge and pupil position in the second frame eye image, relative to those in the first frame eye image, match the blink characteristics that the upper eyelid moves upward or downward while the pupil remains substantially stationary; the second frame eye image is therefore a blink state image.
Through the various implementations provided by this embodiment, at least the eye features of a first frame eye image and a second frame eye image, and a detection flag of the first frame eye image indicating its open or closed eye state, are obtained; based on the detection flag of the first frame eye image, it is first judged whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image; if yes, it is judged whether the eye features of the second frame eye image satisfy a second preset condition; and if so, the second frame eye image is determined to be a blink state image. Because an actual blink consists of the upper eyelid moving up and down while the pupil remains substantially stationary, the first preset condition and the second preset condition related to blinking are set accordingly, and whether an eye image is in the blinking state is judged based on whether the eye features in two successive eye images meet these blinking conditions. The method therefore has good robustness, reduces the detection error rate, and improves the detection accuracy.
Referring to fig. 3, a schematic flow chart of another method for detecting a blink in the embodiment of the application is shown. In this embodiment, the method may include, for example, the steps of:
step 301: preprocessing two frames of eye moving images before and after according to a preset image preprocessing mode, and acquiring a first frame of eye image and a second frame of eye image in advance, wherein the preset image preprocessing mode at least comprises gray histogram processing, corrosion processing and binarization processing.
Step 302: acquire the upper eyelid edge and pupil position in the first frame eye image and the second frame eye image, as well as the detection flag of the first frame eye image, where the first frame eye image precedes the second frame eye image and the detection flag is used for indicating the open or closed eye state of the eye image.
Step 303: judge whether the upper eyelid edge in the second frame eye image has moved downward by the first preset distance relative to the upper eyelid edge in the first frame eye image; if so, go to step 305; if not, go to step 304.
Step 304: judge whether, with the detection flag of the first frame eye image being the eye-closing completion flag, the upper eyelid edge in the second frame eye image has moved upward by the second preset distance relative to the upper eyelid edge in the first frame eye image; if so, go to step 305; if not, go to step 308.
Step 305: judge whether the pupil position in the second frame eye image is invalid; if so, go to step 307; if not, go to step 306.
Step 306: judge whether the pupil displacement of the pupil position in the second frame eye image relative to the pupil position in the first frame eye image is less than the third preset distance; if so, go to step 307; if not, go to step 308.
Step 307: determine that the second frame eye image is a blink state image.
Step 308: determine that the second frame eye image is a non-blinking state image.
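Putting the pieces together, the decision flow of steps 303 to 308 could be sketched as below, reusing the hypothetical helpers and thresholds from the earlier sketches (first_condition folds steps 303 and 304 together); the function names and distances are illustrative and not part of the patent.

```python
def detect_blink(edge_prev, edge_curr, pupil_prev, pupil_curr, prev_flag_closed):
    """Return True if the second frame is a blink state image (steps 303 to 308)."""
    # steps 303/304: eyelid moved down by D1, or up by D2 after eye closure
    if not first_condition(edge_prev, edge_curr, prev_flag_closed):
        return False                 # step 308: non-blinking state image
    # steps 305/306: pupil position invalid, or pupil displacement < D3
    if not second_condition(pupil_prev, pupil_curr):
        return False                 # step 308: non-blinking state image
    return True                      # step 307: blink state image
```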
Through the various implementations provided by this embodiment, at least the eye features of a first frame eye image and a second frame eye image, and a detection flag of the first frame eye image indicating its open or closed eye state, are obtained; based on the detection flag of the first frame eye image, it is first judged whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image; if yes, it is judged whether the eye features of the second frame eye image satisfy a second preset condition; and if so, the second frame eye image is determined to be a blink state image. Because an actual blink consists of the upper eyelid moving up and down while the pupil remains substantially stationary, the first preset condition and the second preset condition related to blinking are set accordingly, and whether an eye image is in the blinking state is judged based on whether the eye features in two successive eye images meet these blinking conditions. The method therefore has good robustness, reduces the detection error rate, and improves the detection accuracy.
Exemplary device
Referring to fig. 4, a schematic structural diagram of an apparatus for detecting a blink in the embodiment of the application is shown. In this embodiment, the apparatus may specifically include:
a first obtaining unit 401, configured to obtain at least a first frame eye image, an eye feature of a second frame eye image, and a detection flag of the first frame eye image, where the detection flag is used to indicate an open/close eye state of the eye image;
a first judging unit 402, configured to judge, based on the detection flag of the first frame eye image, whether the eye features of the second frame eye image satisfy a first preset condition relative to the eye features of the first frame eye image, and if yes, to trigger the second judging unit;
a second judging unit 403, configured to judge whether the eye features of the second frame eye image satisfy a second preset condition, and if yes, to trigger the first determining unit;
a first determining unit 404, configured to determine that the second frame eye image is a blink state image.
Optionally, the eye features include the upper eyelid edge and the pupil position.
Optionally, the first frame eye image and the second frame eye image are obtained by preprocessing two successive frames of eye movement images according to a preset image preprocessing mode, where the preset image preprocessing mode at least includes gray histogram processing, erosion processing and binarization processing; the upper eyelid edge is a gray value boundary point in the first frame eye image and the second frame eye image.
Optionally, the first preset condition is a blinking eyelid movement condition, where the blinking eyelid movement condition includes that the upper eyelid edge of the second frame of eye image moves downward by a first preset distance relative to the upper eyelid edge of the first frame of eye image or the upper eyelid edge of the second frame of eye image moves upward by a second preset distance relative to the upper eyelid edge of the first frame of eye image when the detection flag of the first frame of eye image is the eye closing completion flag.
Optionally, the second preset condition is a blinking pupil position condition or a blinking pupil position moving condition.
Optionally, the blinking pupil position condition includes that the pupil position of the second frame of eye image is invalid.
Optionally, the blinking pupil position moving condition includes that pupil displacement of the pupil position of the second frame of eye image relative to the pupil position of the first frame of eye image is smaller than a third preset distance under the condition that the pupil position in the second frame of eye image is valid.
Optionally, the apparatus further includes:
and the second determining unit is used for determining that the second frame eye image is a non-blinking state image if the eye features of the second frame eye image do not meet a first preset condition relative to the eye features of the first frame eye image.
Optionally, the apparatus further includes:
and the third determining unit is used for determining that the second frame of eye image is a non-blinking state image if the eye features of the second frame of eye image do not meet a second preset condition.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing is merely a preferred embodiment of the present application and is not intended to limit the present application in any way. Although the present application has been described with reference to the preferred embodiments, it is not intended to limit the present application. Those skilled in the art can now make numerous possible variations and modifications to the disclosed embodiments, or modify equivalent embodiments, using the methods and techniques disclosed above, without departing from the scope of the claimed embodiments. Therefore, any simple modification, equivalent change and modification made to the above embodiments according to the technical essence of the present application still fall within the protection scope of the technical solution of the present application without departing from the content of the technical solution of the present application.

Claims (5)

1. A method of detecting an eye blink, comprising:
obtaining at least the eye features of a first frame eye image and a second frame eye image, and a detection flag of the first frame eye image, wherein the detection flag is used for representing the open or closed eye state of the eye image, and the eye features comprise an upper eyelid edge and a pupil position;
based on the detection flag of the first frame of eye image, judging whether the eye features of the second frame of eye image meet a first preset condition relative to the eye features of the first frame of eye image, wherein the first preset condition is preset based on the up-and-down movement characteristic of the upper eyelid in a blinking process, the first preset condition is a blinking eyelid movement condition, and the blinking eyelid movement condition comprises that the upper eyelid edge of the second frame of eye image moves downwards relative to the upper eyelid edge of the first frame of eye image by a first preset distance or, under the condition that the detection flag of the first frame of eye image is an eye closing completion flag, the upper eyelid edge of the second frame of eye image moves upwards relative to the upper eyelid edge of the first frame of eye image by a second preset distance;
if so, judging whether the eye features of the second frame of eye image meet a second preset condition, wherein the second preset condition is a blinking pupil position condition or a blinking pupil position moving condition, the blinking pupil position condition comprises that the pupil position of the second frame of eye image is invalid, and the blinking pupil position moving condition comprises that the pupil position of the second frame of eye image relative to the pupil position of the first frame of eye image has a pupil displacement smaller than a third preset distance under the condition that the pupil position in the second frame of eye image is valid;
if so, determining that the second frame eye image is a blink state image.
2. The method according to claim 1, wherein the first frame eye image and the second frame eye image are obtained by preprocessing two frames of eye moving images before and after according to a preset image preprocessing mode, wherein the preset image preprocessing mode at least comprises gray histogram processing, erosion processing and binarization processing; the upper eyelid edge is a gray value boundary point in the first frame eye image and the second frame eye image.
3. The method of claim 1, further comprising:
and if the eye features of the second frame of eye image do not meet a first preset condition relative to the eye features of the first frame of eye image, determining that the second frame of eye image is a non-blinking state image.
4. The method of claim 1, further comprising:
and if the eye features of the second frame of eye images do not meet a second preset condition, determining that the second frame of eye images are non-blinking state images.
5. An apparatus for detecting eye blinking, comprising:
a first obtaining unit, configured to obtain at least a first frame of eye image, eye features of a second frame of eye image, and detection flags of the first frame of eye image, where the detection flags are used to indicate open and closed eye states of the eye images, and the eye features include an upper eyelid edge and a pupil position;
a first determining unit, configured to determine, based on the detection flag of the first frame of eye image, whether the eye feature of the second frame of eye image satisfies a first preset condition with respect to the eye feature of the first frame of eye image, where the first preset condition is preset based on an upper eyelid vertical movement characteristic in a blinking process, the first preset condition is a blinking eyelid movement condition, and the blinking eyelid movement condition includes that an upper eyelid edge of the second frame of eye image moves downward by a first preset distance with respect to an upper eyelid edge of the first frame of eye image or that the upper eyelid edge of the second frame of eye image moves upward by a second preset distance with respect to the upper eyelid edge of the first frame of eye image when the detection flag of the first frame of eye image is an eye closing completion flag; if yes, entering a second judgment unit;
a second determining unit, configured to determine whether an eye feature of the second frame of eye image meets a second preset condition, where the second preset condition is a blinking pupil position condition or a blinking pupil position movement condition, the blinking pupil position condition includes that a pupil position of the second frame of eye image is invalid, and the blinking pupil position movement condition includes that a pupil displacement of the pupil position of the second frame of eye image relative to a pupil position of the first frame of eye image is smaller than a third preset distance under a condition that the pupil position in the second frame of eye image is valid; if yes, entering a first determining unit;
and the first determining unit is used for determining the second frame eye image as a blink state image.
CN201811455963.5A 2018-11-30 2018-11-30 Method and device for detecting blink Active CN109522868B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811455963.5A CN109522868B (en) 2018-11-30 2018-11-30 Method and device for detecting blink

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811455963.5A CN109522868B (en) 2018-11-30 2018-11-30 Method and device for detecting blink

Publications (2)

Publication Number Publication Date
CN109522868A CN109522868A (en) 2019-03-26
CN109522868B (en) 2021-07-23

Family

ID=65793684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811455963.5A Active CN109522868B (en) 2018-11-30 2018-11-30 Method and device for detecting blink

Country Status (1)

Country Link
CN (1) CN109522868B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107766840A (en) * 2017-11-09 2018-03-06 杭州有盾网络科技有限公司 A kind of method, apparatus of blink detection, equipment and computer-readable recording medium
CN107862298A (en) * 2017-11-27 2018-03-30 电子科技大学 It is a kind of based on the biopsy method blinked under infrared eye
CN107977622A (en) * 2017-11-30 2018-05-01 西安科锐盛创新科技有限公司 Eyes detection method based on pupil feature
CN108629293A (en) * 2018-04-16 2018-10-09 西安交通大学 A kind of adaptive near-infrared iris image acquiring method with feedback mechanism
CN108875541A (en) * 2018-03-16 2018-11-23 中国计量大学 A kind of visual fatigue detection algorithm based on virtual reality technology
CN108898093A (en) * 2018-02-11 2018-11-27 陈佳盛 A kind of face identification method and the electronic health record login system using this method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130021226A1 (en) * 2011-07-21 2013-01-24 Jonathan Arnold Bell Wearable display devices
US9305225B2 (en) * 2013-10-14 2016-04-05 Daon Holdings Limited Methods and systems for determining user liveness
CN106897659B (en) * 2015-12-18 2019-05-24 腾讯科技(深圳)有限公司 The recognition methods of blink movement and device
CN107358151A (en) * 2017-06-02 2017-11-17 广州视源电子科技股份有限公司 A kind of eye motion detection method and device and vivo identification method and system


Also Published As

Publication number Publication date
CN109522868A (en) 2019-03-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant