CN107862304A - Eye state judging method - Google Patents


Info

Publication number
CN107862304A
CN107862304A (application CN201711241703.3A)
Authority
CN
China
Prior art keywords
pupil
point
area
eye
eye state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711241703.3A
Other languages
Chinese (zh)
Other versions
CN107862304B (en)
Inventor
张捷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Cresun Innovation Technology Co Ltd
Original Assignee
Xian Cresun Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Cresun Innovation Technology Co Ltd filed Critical Xian Cresun Innovation Technology Co Ltd
Priority to CN201711241703.3A priority Critical patent/CN107862304B/en
Publication of CN107862304A publication Critical patent/CN107862304A/en
Application granted granted Critical
Publication of CN107862304B publication Critical patent/CN107862304B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/197Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention relates to a method for judging eye state, including: locating a first pupil center point from an eye image; extracting first pupil boundary points from the first pupil center point; locating a second pupil center point from the first pupil boundary points; calculating a first pupil area and a second pupil area from the second pupil center point; and judging the eye state from the first pupil area and the second pupil area. The method judges the eye state from features of the pupil boundary points; it places few restrictions on conditions such as the face, deflection angle, background, illumination, eyelids, eyelashes and specular reflection, achieves high accuracy, and requires no expensive equipment, so its cost is low.

Description

Eye state judging method
Technical Field
The invention relates to the field of image recognition, in particular to a method for judging eye states.
Background
An image is a representation of an objective scene formed in the human brain; it is the most important information source for human beings, is obtained from the objective world through various observation systems, and is intuitive and comprehensible. With the rapid development of computer technology, multimedia technology and artificial intelligence, image recognition technology is applied ever more widely and has achieved notable results in scientific research, education management, medical care, the military and other fields. Image recognition technology is significantly changing people's lifestyles and means of production: for example, people can view the Moon by means of image recognition technology, license plate recognition systems are used in traffic management, and computer vision is used in robotics. Image recognition is a technique that uses a computer to process, analyze and understand images in order to identify objects and patterns. However, because acquired images are complex and varied, image processing and recognition technology remains a research hotspot.
Eye recognition is one of image recognition processes, and has wide applications. If the driving fatigue automatic alarm system judges whether the driver is tired by detecting the eye state and continuously tracking to determine the eye closing duration or the blinking frequency; living body judgment can also be carried out by continuously tracking the eye state; the judgment and tracking of the eye state can also provide rich information for facial expression analysis, man-machine interaction and the like. Therefore, the eye state judgment has very important significance.
Currently, there are many methods for eye state detection, including symmetric transformation, Hough transformation, special light sources, and statistical, template-based and gray-scale projection methods. These methods suffer from heavy computation, low program execution efficiency, low detection accuracy, and algorithms that are difficult to realize under existing software and hardware conditions. They are also constrained by conditions such as background, illumination, eyelids, eyelashes and specular reflection, and their large computation load makes processing slow.
Disclosure of Invention
Therefore, in order to solve the technical defects and shortcomings of the prior art, the invention provides a method for judging eye states.
Specifically, an embodiment of the present invention provides an eye state detection method based on pupil characteristics, including:
positioning a first pupil center point according to the eye image;
extracting a first pupil boundary point according to the first pupil center point;
locating a second pupil center point through the first pupil boundary point;
respectively calculating a first pupil area and a second pupil area according to the second pupil center point;
and judging the eye state according to the first pupil area and the second pupil area.
In one embodiment of the present invention, locating the first pupil center point from the eye image comprises:
estimating an eye central region from the gray values of the eye image, and locating the point with the minimum gray value in that region as the first pupil center point.
In an embodiment of the present invention, extracting a first pupil boundary point according to the first pupil center point includes:
taking the first pupil center point as the origin, emitting M first rays in the directions of the positive and negative half-axes of the y axis, the first rays being symmetric about the x axis;
calculating a gray scale gradient of the first ray direction;
and selecting the point with the maximum gray gradient as the first pupil boundary point.
In one embodiment of the invention, locating a second pupil center point with the first pupil boundary point comprises:
fitting the first pupil boundary point, and acquiring a central point of the first pupil boundary point by an averaging method;
taking the central point as the second pupil central point.
In an embodiment of the present invention, calculating the first pupil area and the second pupil area according to the second pupil center point respectively includes:
extracting a second pupil boundary point according to the second pupil center point;
and respectively calculating a first pupil area and a second pupil area through the second pupil boundary point.
In an embodiment of the present invention, extracting a second pupil boundary point according to the second pupil center point includes:
taking the second pupil center point as the origin, emitting N second rays in the directions of the positive and negative half-axes of the y axis, the second rays being symmetric about the x axis;
calculating a gray scale gradient of the second ray direction;
and selecting the point with the maximum gray gradient as the second pupil boundary point.
In one embodiment of the present invention, calculating the first pupil area from the second pupil boundary point comprises:
fitting the second pupil boundary points to a circle;
calculating the circular area as the first pupil area.
In one embodiment of the present invention, calculating a second pupil area according to the second pupil center point includes:
connecting the second pupil boundary points pairwise with straight lines to form a polygon;
and calculating the polygonal area as the second pupil area.
In one embodiment of the present invention, determining the eye state according to the first pupil area and the second pupil area comprises:
determining an eye state value by adopting an eye state formula according to the first pupil area and the second pupil area;
selecting a first eye state threshold value and a second eye state threshold value;
and comparing the eye state value with a first eye state threshold value and a second eye state threshold value respectively to determine the eye state.
In one embodiment of the present invention, the eye state formula is:
θ = (S1 - S2) / S1
where S1 is the first pupil area and S2 is the second pupil area.
Based on this, the invention has the following advantages:
the method for judging the eye state based on the pupil characteristics has the advantages that:
1) The method has few restrictions on conditions such as human faces, deflection angles, backgrounds, illumination, eyelids, eyelashes and mirror reflection, and has high accuracy;
2) The method does not need a large number of training samples, has simple and convenient algorithm and high calculation efficiency;
3) The invention does not need expensive and complicated equipment and has low cost.
Other aspects and features of the present invention will become apparent from the following detailed description, which proceeds with reference to the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the invention, for which reference should be made to the appended claims. It should be further understood that the drawings are not necessarily drawn to scale and that, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Drawings
The following detailed description of embodiments of the invention will be made with reference to the accompanying drawings.
Fig. 1 is a schematic view illustrating a method for determining an eye state according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an eye-open state provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of an eye closure state provided by an embodiment of the present invention;
fig. 4 is a schematic diagram of a half-open and half-closed state of eyes according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Example one
Referring to fig. 1, fig. 1 is a schematic diagram of an eye state determination method according to an embodiment of the present invention. The method comprises the following steps:
step 1, positioning a first pupil center point according to an eye image;
step 2, extracting a first pupil boundary point according to the first pupil center point;
step 3, positioning a second pupil center point through the first pupil boundary point;
step 4, respectively calculating a first pupil area and a second pupil area according to the second pupil center point;
and 5, judging the eye state according to the first pupil area and the second pupil area.
Wherein, for step 1, it may include:
estimating an eye central region from the gray values of the eye image, and locating the point with the minimum gray value in that region as the first pupil center point.
Wherein, for step 2, the method may include:
taking the first pupil center point as the origin, emitting M first rays in the directions of the positive and negative half-axes of the y axis, the first rays being symmetric about the x axis;
calculating a gray scale gradient of the first ray direction;
and selecting the point with the maximum gray gradient as the first pupil boundary point.
Alternatively, step 2 may include:
emitting straight lines along the direction of the upper eyelid by taking the pupil center point as a starting point to form M rays;
and emitting straight lines along the direction of the lower eyelid by taking the pupil center point as a starting point to form N rays.
Wherein, for step 3, it may include:
fitting the first pupil boundary point, and acquiring a central point of the first pupil boundary point by an averaging method;
and taking the central point as the second pupil central point.
Wherein, for step 4, the method may include:
extracting a second pupil boundary point according to the second pupil center point;
and respectively calculating a first pupil area and a second pupil area through the second pupil boundary point.
For extracting a second pupil boundary point according to the second pupil center point in step 4, the extracting may include:
taking the second pupil center point as the origin, emitting N second rays in the directions of the positive and negative half-axes of the y axis, the second rays being symmetric about the x axis;
calculating a gray scale gradient of the second ray direction;
and selecting the point with the maximum gray gradient as the second pupil boundary point.
Wherein, calculating the first pupil area through the second pupil boundary point in step 4 may include:
fitting the second pupil boundary point to a circle;
calculating the circular area as the first pupil area.
For calculating the second pupil area according to the second pupil center point in step 4, the method may include:
connecting the second pupil boundary points in a pairwise manner to form a polygon;
and calculating the area of the polygon as the area of the second pupil.
Wherein, for step 5, the method may include:
determining an eye state value by adopting an eye state formula according to the first pupil area and the second pupil area;
selecting a first eye state threshold value and a second eye state threshold value;
and comparing the eye state value with a first eye state threshold value and a second eye state threshold value respectively to determine the eye state.
Further, the eye state formula in step 5 is:
θ = (S1 - S2) / S1
where S1 is the first pupil area and S2 is the second pupil area.
The eye state judging method provided by the invention does not need a large number of high-definition training images, reduces computational complexity, improves real-time performance, is highly reliable, has wide application prospects, and needs no expensive or complicated equipment, so its cost is low.
Example two
On the basis of the above embodiments, the present embodiment further describes an eye state detection method based on pupil characteristics.
The method comprises the following steps:
step 1, obtaining an eye image
After the eye image is acquired, it is processed so that the eye region lies horizontally. The eye image is converted into a gray-scale image, and gray contrast enhancement preprocessing is applied to it as follows:
f = c * log(1 + f0)
where f0 is the original image, f is the contrast-enhanced image, and c is a scaling constant.
Laplace filtering is then applied to the contrast-enhanced image.
The gray contrast enhancement preprocessing makes it easier to distinguish the pupil from the surrounding regions; in addition, non-directional Laplace filtering denoises the eye image in all directions.
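The preprocessing above can be sketched in Python (a minimal illustration assuming NumPy and SciPy; the choice of the scaling constant c and the subtract-the-Laplacian step are common conventions, not fixed by the patent):

```python
import numpy as np
from scipy import ndimage

def preprocess_eye_image(f0: np.ndarray) -> np.ndarray:
    """Log contrast enhancement f = c*log(1 + f0), then Laplacian filtering.

    The scaling constant c (normalising the log output back to [0, 255])
    and the subtract-the-Laplacian sharpening step are common choices;
    the patent does not fix either.
    """
    f0 = f0.astype(np.float64)
    c = 255.0 / np.log1p(f0.max())   # keep f within [0, 255]
    f = c * np.log1p(f0)             # f = c * log(1 + f0)
    lap = ndimage.laplace(f)         # non-directional Laplacian filter
    return f - lap                   # emphasise edges, suppress noise blobs

# toy usage on a synthetic horizontal gradient image
img = np.tile(np.arange(256, dtype=np.uint8), (8, 1))
out = preprocess_eye_image(img)
```

The log transform compresses bright values and stretches dark ones, which helps separate the dark pupil from the brighter sclera and skin before the gradient-based boundary search.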
Step 2, positioning the center point of the first pupil
Estimating the eye central area from the eye gray scale image processed in the step 1, searching a point with the minimum gray scale value of the eye central area, and if the point is approximately positioned at the middle point of the eye central area, positioning the point as a first pupil central point; otherwise, the search continues until a gray value minimum point approximately near the midpoint of the central region of the eye is found.
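This minimum-gray search can be sketched as follows (a hedged illustration assuming NumPy; the function name, and the idea of approximating the "eye central region" by cropping a fixed margin from each border, are illustrative choices, not from the patent):

```python
import numpy as np

def locate_first_pupil_center(gray, margin=0.25):
    """Return (row, col) of the minimum-gray point in the central region.

    The "eye central region" is approximated by cropping `margin` of the
    image away from each border; the margin value is illustrative.
    """
    h, w = gray.shape
    top, left = int(h * margin), int(w * margin)
    region = gray[top:h - top, left:w - left]
    i, j = np.unravel_index(np.argmin(region), region.shape)
    return top + i, left + j    # coordinates back in the full image

# synthetic eye: dark disc (pupil, gray 20) on a bright background (200)
img = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
img[(yy - 50) ** 2 + (xx - 50) ** 2 < 100] = 20
center = locate_first_pupil_center(img)
```

As in the patent's step 2, a real implementation would additionally check that the found minimum lies near the midpoint of the central region and keep searching otherwise.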
Step 3, extracting a first pupil boundary point
Taking the center point of the first pupil as an origin, respectively emitting M first rays in the directions of a positive half shaft and a negative half shaft of the y axis, wherein the first rays are symmetrical to the x axis;
calculating the gray gradient along the first ray direction as follows:
a) Calculate the partial differentials of the gray value along the first ray direction:
∂f/∂i = f(i+1, j) - f(i, j), ∂f/∂j = f(i, j+1) - f(i, j)
where f(i, j) is the gray value of the eye image at coordinate (i, j).
b) Calculate the gray gradient of the first ray direction:
D = sqrt((∂f/∂i)² + (∂f/∂j)²)
Extract the point with the maximum D and record its value as Dmax; when Dmax > the boundary point threshold, that point is a pupil boundary point. The boundary point threshold is chosen as a value larger than the gray gradient at the junction of the pupil and the skin and smaller than the gray gradient at the junction of the pupil and the white of the eye, and is adjusted for individual differences. At the pupil boundary, pupil segments and white-of-eye segments alternate.
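The ray-casting boundary search of this step can be sketched as follows (a hedged illustration assuming NumPy; for brevity the rays here are spaced over the full circle rather than only in fans about the vertical half-axes, and the threshold value `d_thresh` is an illustrative guess at the boundary point threshold):

```python
import numpy as np

def boundary_points_on_rays(gray, center, n_rays=16, d_thresh=30.0, max_r=60):
    """Walk rays out from `center`; keep the max-gradient point per ray.

    The patent casts M rays toward the +y and -y half-axes, symmetric
    about the x axis; this sketch simply spaces `n_rays` rays over the
    full circle. `d_thresh` stands in for the boundary point threshold.
    """
    ci, cj = center
    points = []
    for ang in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        best_d, best_pt = 0.0, None
        prev = float(gray[ci, cj])
        for r in range(1, max_r):
            i = int(round(ci + r * np.sin(ang)))
            j = int(round(cj + r * np.cos(ang)))
            if not (0 <= i < gray.shape[0] and 0 <= j < gray.shape[1]):
                break
            cur = float(gray[i, j])
            d = abs(cur - prev)          # 1-D gray gradient along the ray
            if d > best_d:
                best_d, best_pt = d, (i, j)
            prev = cur
        if best_pt is not None and best_d > d_thresh:
            points.append(best_pt)
    return points

# synthetic pupil: dark disc of radius 10 centred at (50, 50)
img = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
img[(yy - 50) ** 2 + (xx - 50) ** 2 < 100] = 20
pts = boundary_points_on_rays(img, (50, 50))
```

On this synthetic image every ray crosses the pupil edge, so each returns one boundary point at roughly radius 10 from the centre.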
Step 4, positioning the center point of the second pupil
Fit the first pupil boundary points determined in step 3 approximately to a circle, and extract their central point by the averaging method;
the center point is taken as the second pupil center point.
Step 5, extracting a second pupil boundary point
Taking the second pupil center point as the origin, emit N second rays in the directions of the positive and negative half-axes of the y axis, the second rays being symmetric about the x axis;
Calculate the gray gradient along the second ray direction as follows:
a) Calculate the partial differentials of the gray value along the second ray direction:
∂f/∂i = f(i+1, j) - f(i, j), ∂f/∂j = f(i, j+1) - f(i, j)
where f(i, j) is the gray value of the eye image at coordinate (i, j).
b) Calculate the gray gradient of the second ray direction:
D = sqrt((∂f/∂i)² + (∂f/∂j)²)
Select the point with the maximum gray gradient D as a second pupil boundary point.
Step 6, calculating the area of the first pupil
Fit the second pupil boundary points determined in step 5 approximately to a circle, calculate the area of that circle, and take it as the first pupil area.
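One simple way to realise this circle fit is sketched below (assuming NumPy; the patent does not prescribe a fitting method, so taking the mean of the points as the centre and the mean distance as the radius is an illustrative choice):

```python
import numpy as np

def first_pupil_area(boundary_pts):
    """Circle-fit area: mean of the points as centre, mean distance as radius.

    A simple realisation of "approximately fitting the boundary points
    into a circle"; the patent does not fix the fitting method.
    """
    pts = np.asarray(boundary_pts, dtype=float)
    center = pts.mean(axis=0)
    r = np.linalg.norm(pts - center, axis=1).mean()
    return np.pi * r ** 2

# sanity check: 32 points on a circle of radius 2 recover area 4*pi
t = np.linspace(0.0, 2.0 * np.pi, 32, endpoint=False)
area = first_pupil_area(np.c_[2.0 * np.cos(t), 2.0 * np.sin(t)])
```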
Step 7, calculating the area of the second pupil
Connecting the second pupil boundary points in pairs to form a polygon;
and calculating the polygonal area as the second pupil area.
Step 8, judging the eye state
The eye state formula is:
θ = (S1 - S2) / S1
where S1 is the first pupil area and S2 is the second pupil area;
substituting the first pupil area and the second pupil area obtained in the steps 6 and 7 into an eye state formula to solve an eye state value;
selecting a first eye state threshold value cth1 and a second eye state threshold value cth2;
when θ > cth1, the eye is in a closed state;
when θ < cth2, the eye is in an open state;
when cth2 ≤ θ ≤ cth1, the eye is in a half-open, half-closed state.
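The decision rule above reduces to a few lines (a sketch; the form θ = (S1 - S2)/S1 is inferred from, and consistent with, the worked numbers of figs. 2-4, and cth1 = 0.8 / cth2 = 0.2 are the threshold values chosen there):

```python
def classify_eye_state(s1, s2, cth1=0.8, cth2=0.2):
    """Compare theta = (S1 - S2) / S1 against the two eye state thresholds.

    The formula is inferred from the worked examples (figs. 2-4);
    cth1 = 0.8 and cth2 = 0.2 are the threshold values used there.
    """
    theta = (s1 - s2) / s1
    if theta > cth1:
        return "closed"
    if theta < cth2:
        return "open"
    return "half-open"

# the three worked cases: S1 = 3.14 with S2 = 2.6, 0.42 and 1.7
states = [classify_eye_state(3.14, s2) for s2 in (2.6, 0.42, 1.7)]
```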
A person's pupil is small and its gray level is low. Provided the pupil is not occluded by the eyelid for individual physiological reasons, the pupil is complete when the eye is normally open, the pupil disappears when the eye is closed, and the upper and lower edges of the pupil are occluded when the eye is in an intermediate state between open and closed; therefore the opening and closing of the eye can be judged by detecting the pupil boundary.
Example three
On the basis of the above embodiments, the present embodiment exemplifies a method for determining the eye state.
Referring to fig. 2, fig. 2 is a schematic diagram of the eye-open state according to an embodiment of the present invention. The first eye state threshold is chosen as cth1 = 0.8 and the second as cth2 = 0.2. The second pupil boundary points are fitted approximately to a circle, whose area is calculated as 3.14; connecting the second pupil boundary points pairwise with straight lines forms a polygon, whose area is calculated as 2.6. Substituting into the eye state formula gives θ = 0.17; since θ < cth2 = 0.2, the eye is in the open state.
With continued reference to fig. 3, fig. 3 is a schematic diagram of the eye-closed state according to an embodiment of the present invention. With the same thresholds cth1 = 0.8 and cth2 = 0.2, the fitted circle area is 3.14 and the polygon area is 0.42, giving θ = 0.86; since θ > cth1 = 0.8, the eye is in the closed state.
Referring to fig. 4, fig. 4 is a schematic diagram of the half-open, half-closed state according to an embodiment of the present invention. With the same thresholds, the fitted circle area is 3.14 and the polygon area is 1.7, giving θ = 0.46; since cth2 = 0.2 ≤ θ ≤ cth1 = 0.8, the eye is in a half-open, half-closed state.
In summary, the present invention provides a method for judging eye state, illustrated by specific examples; the above description of the embodiments is only intended to help understand the method and the core idea of the invention. Meanwhile, a person skilled in the art may vary the specific embodiments and the application scope according to the idea of the invention. In view of the above, the content of this specification should not be construed as limiting the invention, whose scope is defined by the appended claims.

Claims (10)

1. An eye state determination method, comprising:
positioning a first pupil center point according to the eye image;
extracting a first pupil boundary point according to the first pupil center point;
locating a second pupil center point through the first pupil boundary point;
respectively calculating a first pupil area and a second pupil area according to the second pupil center point;
and judging the eye state according to the first pupil area and the second pupil area.
2. The method of claim 1, wherein locating the first pupil center point from the eye image comprises:
estimating an eye central region from the gray values of the eye image, and locating the point with the minimum gray value in that region as the first pupil center point.
3. The method of claim 1, wherein extracting a first pupil boundary point from the first pupil center point comprises:
taking the first pupil center point as the origin, emitting M first rays in the directions of the positive and negative half-axes of the y axis, the first rays being symmetric about the x axis;
calculating a gray scale gradient of the first ray direction;
and selecting the point with the maximum gray gradient as the first pupil boundary point.
4. The method of claim 1, wherein locating a second pupil center point with the first pupil boundary point comprises:
fitting the first pupil boundary point, and acquiring a central point of the first pupil boundary point by using an averaging method;
and taking the central point as the second pupil central point.
5. The method of claim 1, wherein calculating a first pupil area and a second pupil area from the second pupil center point, respectively, comprises:
extracting a second pupil boundary point according to the second pupil center point;
and respectively calculating a first pupil area and a second pupil area through the second pupil boundary point.
6. The method of claim 5, wherein extracting a second pupil boundary point from the second pupil center point comprises:
taking the second pupil center point as the origin, emitting N second rays in the directions of the positive and negative half-axes of the y axis, the second rays being symmetric about the x axis;
calculating a gray scale gradient of the second ray direction;
and selecting the point with the maximum gray gradient as the second pupil boundary point.
7. The method of claim 5, wherein calculating a first pupil area from the second pupil boundary point comprises:
fitting the second pupil boundary points to a circle;
calculating the circular area as the first pupil area.
8. The method of claim 5, wherein calculating a second pupil area from the second pupil center point comprises:
connecting the second pupil boundary points in a pairwise manner to form a polygon;
and calculating the area of the polygon as the area of the second pupil.
9. The method of claim 1, wherein determining the eye state from the first pupil area and the second pupil area comprises:
determining an eye state value by adopting an eye state formula according to the first pupil area and the second pupil area;
selecting a first eye state threshold value and a second eye state threshold value;
comparing the eye state value with a first eye state threshold and a second eye state threshold, respectively, to determine the eye state.
10. The method of claim 9, wherein the eye state formula is:
θ = (S1 - S2) / S1
where S1 is the first pupil area and S2 is the second pupil area.
CN201711241703.3A 2017-11-30 2017-11-30 Eye state judging method Active CN107862304B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711241703.3A CN107862304B (en) 2017-11-30 2017-11-30 Eye state judging method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711241703.3A CN107862304B (en) 2017-11-30 2017-11-30 Eye state judging method

Publications (2)

Publication Number Publication Date
CN107862304A true CN107862304A (en) 2018-03-30
CN107862304B CN107862304B (en) 2021-11-26

Family

ID=61704297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711241703.3A Active CN107862304B (en) 2017-11-30 2017-11-30 Eye state judging method

Country Status (1)

Country Link
CN (1) CN107862304B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109086713A (en) * 2018-07-27 2018-12-25 腾讯科技(深圳)有限公司 Eye recognition method, apparatus, terminal and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
JP2015231458A (en) * 2014-06-10 2015-12-24 株式会社エンファシス Device for transmitting signal by detecting visual line
US20170124394A1 (en) * 2015-11-02 2017-05-04 Fotonation Limited Iris liveness detection for mobile devices
CN106774863A (en) * 2016-12-03 2017-05-31 西安中科创星科技孵化器有限公司 A kind of method that Eye-controlling focus are realized based on pupil feature
CN107016381A (en) * 2017-05-11 2017-08-04 南宁市正祥科技有限公司 A kind of driven fast person's fatigue detection method

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
JP2015231458A (en) * 2014-06-10 2015-12-24 株式会社エンファシス Device for transmitting signal by detecting visual line
US20170124394A1 (en) * 2015-11-02 2017-05-04 Fotonation Limited Iris liveness detection for mobile devices
CN106774863A (en) * 2016-12-03 2017-05-31 西安中科创星科技孵化器有限公司 A kind of method that Eye-controlling focus are realized based on pupil feature
CN107016381A (en) * 2017-05-11 2017-08-04 南宁市正祥科技有限公司 A kind of driven fast person's fatigue detection method

Non-Patent Citations (2)

Title
Celeste Barthel et al.: "Visitor interactions with 3-D visualizations on a spherical display at a science museum", OCEANS 2008 *
JIANG Wenbo et al.: "A fast driver fatigue detection method", Electronic Design Engineering *

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN109086713A (en) * 2018-07-27 2018-12-25 腾讯科技(深圳)有限公司 Eye recognition method, apparatus, terminal and storage medium

Also Published As

Publication number Publication date
CN107862304B (en) 2021-11-26

Similar Documents

Publication Publication Date Title
US9842247B2 (en) Eye location method and device
Li et al. Learning to predict gaze in egocentric video
CN104794464B (en) A kind of biopsy method based on relative priority
CN104766059A (en) Rapid and accurate human eye positioning method and sight estimation method based on human eye positioning
CN106796449A (en) Eye-controlling focus method and device
JPH11175246A (en) Sight line detector and method therefor
CN111460950B (en) Cognitive distraction method based on head-eye evidence fusion in natural driving conversation behavior
CN104794465A (en) In-vivo detection method based on attitude information
CN112001215B (en) Text irrelevant speaker identity recognition method based on three-dimensional lip movement
CN110705454A (en) Face recognition method with living body detection function
CN107977622B (en) Eye state detection method based on pupil characteristics
CN116051631A (en) Light spot labeling method and system
Jingchao et al. Recognition of classroom student state features based on deep learning algorithms and machine learning
Yang et al. Student eye gaze tracking during MOOC teaching
CN107862304B (en) Eye state judging method
CN107315997B (en) Sight orientation judgment method and system based on rapid feature point positioning
Chen et al. Attention estimation system via smart glasses
Amudha et al. A fuzzy based eye gaze point estimation approach to study the task behavior in autism spectrum disorder
Cao et al. Gaze tracking on any surface with your phone
Guo et al. Monitoring and detection of driver fatigue from monocular cameras based on Yolo v5
CN107798316B (en) Method for judging eye state based on pupil characteristics
CN105760848B (en) A kind of pupil positioning method based on annular mask convolution
CN113627300A (en) Face recognition and living body detection method based on deep learning
Pangestu et al. Electric Wheelchair Control Mechanism Using Eye-mark Key Point Detection.
CN107944408A (en) Method based on canthus angle-determining eye state

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant