CN107798316A - Method for judging eye state based on pupil features - Google Patents

Method for judging eye state based on pupil features

Info

Publication number
CN107798316A
Authority
CN
China
Prior art keywords
pupil
point
eyes
information map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711237771.2A
Other languages
Chinese (zh)
Other versions
CN107798316B (en)
Inventor
Zhang Jie (张捷)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yongmutang Co., Ltd.
Original Assignee
Xian Cresun Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Cresun Innovation Technology Co Ltd filed Critical Xian Cresun Innovation Technology Co Ltd
Priority to CN201711237771.2A priority Critical patent/CN107798316B/en
Publication of CN107798316A publication Critical patent/CN107798316A/en
Application granted granted Critical
Publication of CN107798316B publication Critical patent/CN107798316B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention provides a method for judging eye state based on pupil features. The method includes: obtaining an eye image information map; determining a pupil center point according to the eye image information map; extracting pupil edge points according to the pupil center point; and judging the symmetry of the pupil edge points to determine the state of the eyes. The advantages of the method for judging eye state based on pupil features provided by the present invention are: it is not easily disturbed by eyelids, eyelashes or specular reflections, and its accuracy is high; it requires no large number of training samples, the algorithm is simple, and the computational efficiency is high; it requires no expensive equipment, and the cost is low.

Description

Method for judging eye state based on pupil features
Technical field
The present invention relates to eye tracking technology, and in particular to a method for judging eye state based on pupil features.
Background art
Eye tracking technology is a human-computer interaction approach that has risen in recent years. Gaze tracking, also called eye movement tracking, plays an important role in fields such as military applications, assistance for the disabled, psychological research and web page analysis. The methods by which eye tracking is realized have gone through a century of development, successively employing mechanical recording, electro-oculographic (current) recording, electromagnetic recording and the modern optical recording method; the intrusion on the subject's eyes has become smaller and smaller, and the precision higher and higher.
In practical applications, human eyes blink randomly. When the eyes are occluded, the accurate position of the pupil center cannot be obtained, and the position of the fixation point therefore cannot be located accurately. As a result, gaze tracking may be interrupted and normal interaction cannot be completed. In addition, in a gaze tracking system it is usually necessary to stare at a specific target for a certain time before a subsequent operation can be carried out. If a blink occurs during this period, the position of the fixation point will drift, which may cause an erroneous action. It is therefore necessary to identify the open or closed state of the eyes in the current image so as to avoid misoperation.
Current eye state detection methods fall into two major classes: image-based methods and learning-based methods. Image-based methods make full use of the differences in eye features between open and closed eyes, for example whether an iris edge can be detected, whether the direction of the eyelid is the same, and the distance between the upper and lower eyelids. Such methods are easily disturbed by eyelids, eyelashes and specular reflections, and their accuracy is not high. Learning-based methods treat eye state detection as a classification problem: features are extracted, a classifier is trained, and eye state detection is realized according to the learning result. Such methods require effective features to be selected and a large number of samples for training before ideal results can be obtained.
Summary of the invention
Therefore, to overcome the technical defects and deficiencies of the above prior art, the present invention provides a method for judging eye state based on pupil features.
Specifically, an embodiment of the present invention provides a method for judging eye state based on pupil features, characterized by including:
obtaining an eye image information map;
determining a pupil center point according to the eye image information map;
extracting pupil edge points according to the pupil center point;
judging the symmetry of the pupil edge points to determine the state of the eyes.
On the basis of the above embodiment, before determining the pupil center point according to the eye image information map, the method further includes:
performing gray-scale contrast enhancement preprocessing on the eye image information map;
performing Laplacian filtering on the eye image information map.
On the basis of the above embodiment, the gray-scale contrast enhancement preprocessing formula is:
f = c*log(1 + double(f0))
where c is a constant coefficient, f0 represents the original eye image information map, and f represents the eye image information map after gray-scale contrast enhancement preprocessing.
On the basis of the above embodiment, determining the pupil center point according to the eye image information map includes:
estimating the pupil center point coordinates (xmin, ymin) according to the gray-scale contrast of the eye image information map, and taking them as the pupil center point.
On the basis of the above embodiment, extracting pupil edge points according to the pupil center point includes:
taking the pupil center point as the starting point, calculating the gray gradient value of the pupil region along specified ray directions on the eye image information map, and determining the position where the gray gradient value reaches its maximum as a pupil edge point.
On the basis of the above embodiment, the specified rays include a plurality of first rays and second rays; wherein the first rays take the pupil center point as the starting point and are sent along the positive y-axis direction, and the second rays take the pupil center point as the starting point and are sent along the negative y-axis direction, opposite to the direction of the first rays.
On the basis of the above embodiment, judging the symmetry of the pupil edge points includes:
determining the center coordinates (x0, y0) of the plurality of pupil edge points using an averaging method;
establishing a plane coordinate system with the center coordinates (x0, y0) as the origin;
calculating the horizontal symmetry degree of the pupil edge points with the Y axis of the plane coordinate system as the symmetry axis;
calculating the vertical symmetry degree of the pupil edge points with the X axis of the plane coordinate system as the symmetry axis.
On the basis of the above embodiment, the calculation formula of the horizontal symmetry degree is:
|Mx - (N - Mx)| / N
where N is the total number of pupil edge points and Mx is the number of pupil edge points on one side of the Y axis.
On the basis of the above embodiment, the calculation formula of the vertical symmetry degree is:
|My - (N - My)| / N
where N is the total number of pupil edge points and My is the number of pupil edge points on one side of the X axis.
On the basis of the above embodiment, determining the open or closed state of the eyes includes:
comparing the horizontal symmetry degree with a preset horizontal symmetry degree threshold, comparing the vertical symmetry degree with a preset vertical symmetry degree threshold, and judging according to the comparison results whether the eyes are in the open state, the closed state or an intermediate state.
The advantages of the method for judging eye state based on pupil features provided by the present invention are:
1) the present invention is not easily disturbed by eyelids, eyelashes or specular reflections, and its accuracy is high;
2) the present invention requires no large number of training samples, the algorithm is simple, and the computational efficiency is high;
3) the present invention requires no expensive equipment, and the cost is low.
Brief description of the drawings
In order to illustrate the technical solutions of the present invention or the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Evidently, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can also be obtained from these drawings without creative effort. The embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for judging eye state based on pupil features provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of pupil features when the eyes are in the open state, provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of pupil features when the eyes are in an intermediate state, provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the technical solutions of the present invention are described clearly and completely below with reference to the accompanying drawings of the present invention. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
Embodiment one
An embodiment of the present invention provides a method for judging eye state based on pupil features. Please refer to Fig. 1, which is a flowchart of a method for judging eye state based on pupil features provided by an embodiment of the present invention. The method includes:
obtaining an eye image information map;
determining a pupil center point according to the eye image information map;
extracting pupil edge points according to the pupil center point;
judging the symmetry of the pupil edge points to determine the state of the eyes.
After the eye image information map is obtained, the eye image information map is processed so that the eyes in the information map are adjusted to a horizontal position.
In the above embodiment, before determining the pupil center point according to the eye image information map, the method further includes:
performing gray-scale contrast enhancement preprocessing on the eye image information map;
performing Laplacian filtering on the eye image information map.
Specifically, the gray-scale contrast enhancement preprocessing formula is:
f = c*log(1 + double(f0))
where c is a constant coefficient, f0 represents the original eye image information map, and f represents the eye image information map after gray-scale contrast enhancement preprocessing.
Performing gray-scale contrast enhancement preprocessing on the eye image information map helps to distinguish the pupil from the surrounding region. Performing Laplacian filtering on the eye image information map helps to remove interference from the image in all directions.
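As a minimal, non-limiting sketch (not part of the claimed method itself), the two preprocessing steps could be written in Python with NumPy and OpenCV roughly as follows; the function name, the value of the constant coefficient c and the normalization step are illustrative assumptions:

```python
import cv2
import numpy as np

def preprocess_eye_image(f0, c=1.0):
    """Gray-scale contrast enhancement f = c*log(1 + double(f0)), then Laplacian filtering."""
    f0 = f0.astype(np.float64)                            # corresponds to double(f0) in the formula
    f = c * np.log(1.0 + f0)                              # logarithmic contrast enhancement
    f = cv2.normalize(f, None, 0, 255, cv2.NORM_MINMAX)   # rescale to the usual 0-255 gray range
    lap = cv2.Laplacian(f, cv2.CV_64F)                    # Laplacian filtering of the enhanced image
    return f, lap
```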
Further, on the basis of the above embodiment, determining the pupil center point according to the eye image information map includes:
estimating the pupil center point coordinates (xmin, ymin) according to the gray-scale contrast of the eye image information map, and taking them as the pupil center point.
Preferably, the region with the lowest gray level in the eye image information map is taken as the pupil region, the region is fitted to a circle, and the center of the circle is taken as the pupil center point.
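The following sketch, under the assumption that the preprocessed image is a 2-D NumPy gray array, estimates the pupil center from the darkest region as described above; the fixed gray threshold and the use of the region centroid in place of an explicit circle fit are simplifications introduced here for illustration:

```python
import numpy as np

def estimate_pupil_center(gray, dark_threshold=40):
    """Take the lowest-gray region as the pupil region and return its center (xmin, ymin)."""
    mask = gray <= dark_threshold                  # darkest pixels = candidate pupil region
    ys, xs = np.nonzero(mask)
    if xs.size == 0:                               # no sufficiently dark region was found
        return None
    return int(xs.mean()), int(ys.mean())          # centroid stands in for the fitted circle center
```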
Further, on the basis of the above embodiment, extracting pupil edge points according to the pupil center point includes:
taking the pupil center point as the starting point, calculating the gray gradient value of the pupil region along specified ray directions on the eye image information map, and determining the position where the gray gradient value reaches its maximum as a pupil edge point.
Specifically, with the pupil center point (xmin, ymin) as the starting point, a plurality of rays are sent along the positive y-axis direction;
correspondingly, rays symmetric to those sent in the positive y-axis direction, with the pupil center as the reference, are sent along the negative y-axis direction.
The gray gradient changes most violently at the edge position where the pupil and the white of the eye meet. Let f(i, j) be the gray value of the eye image at coordinate (i, j); the gray gradient is taken as the partial difference of the gray value along the ray direction.
For a ray along the y direction, the gray gradient of that direction is:
D = |f(i, j+1) - f(i, j)|
where D denotes the gray gradient.
The point where D is maximal is extracted and denoted Dmax. If Dmax is greater than the edge point threshold, the point is a pupil edge point. The edge point threshold is chosen as a specific value greater than the gray gradient at the boundary between the pupil and the skin and less than the gray gradient at the boundary between the pupil and the white of the eye, and is defined according to individual differences. Pupil edge points lie where the pupil and the white of the eye meet.
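A sketch of the ray-based edge extraction, assuming the names from the previous sketches; here the plurality of rays in the upper and lower halves is approximated by a few vertical scan lines offset around the pupil center, and the gradient is the absolute gray difference between adjacent pixels along each line (the offsets, maximum ray length and edge point threshold are assumed values):

```python
import numpy as np

def extract_pupil_edge_points(gray, center, num_rays=5, spread=4,
                              edge_threshold=30.0, max_len=60):
    """Walk rays up (+y) and down (-y) from the pupil center; the point of maximum gray
    gradient D = |f(i, j+1) - f(i, j)| along each ray is kept if D exceeds the threshold."""
    cx, cy = center
    h, w = gray.shape
    edge_points = []
    offsets = np.linspace(-spread, spread, num_rays).astype(int)  # a few rays per half
    for direction in (+1, -1):                    # +1: downward (positive y), -1: upward
        for dx in offsets:
            x = cx + dx
            if not (0 <= x < w):
                continue
            best_d, best_y = 0.0, None
            for step in range(1, max_len):
                y = cy + direction * step
                if not (1 <= y < h - 1):
                    break
                d = abs(float(gray[y + 1, x]) - float(gray[y, x]))  # gray gradient along the ray
                if d > best_d:
                    best_d, best_y = d, y
            if best_y is not None and best_d > edge_threshold:      # keep only strong transitions
                edge_points.append((x, best_y))
    return edge_points
```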
Further, on the basis of the above embodiment, judging the symmetry of the pupil edge points includes:
determining the center coordinates (x0, y0) of the plurality of pupil edge points using an averaging method;
calculating the horizontal symmetry degree of the pupil edge points on the two sides of the x-axis coordinate x0 of the center coordinates;
calculating the vertical symmetry degree of the pupil edge points on the two sides of the y-axis coordinate y0 of the center coordinates.
Specifically, the calculation formula of the horizontal symmetry degree is:
|Mx - (N - Mx)| / N
where N is the total number of pupil edge points and Mx is the number of pupil edge points on one side of the Y axis, i.e., the pupil edge points whose coordinate is greater than the corresponding center coordinate.
The calculation formula of the vertical symmetry degree is:
|My - (N - My)| / N
where N is the total number of pupil edge points and My is the number of pupil edge points on one side of the X axis.
Further, on the basis of the above embodiment, determining the open or closed state of the eyes includes:
comparing the horizontal symmetry degree with a preset horizontal symmetry degree threshold, comparing the vertical symmetry degree with a preset vertical symmetry degree threshold, and judging according to the comparison results whether the eyes are in the open state, the closed state or an intermediate state.
Specifically, a horizontal symmetry degree threshold and a vertical symmetry degree threshold are set in advance. For example, assume there are N pupil edge points in total. The center coordinates (x0, y0) of the N points are obtained by the averaging method, and the symmetry is calculated quantitatively from the ratio of the numbers of points on the two sides. A plane coordinate system is established with the center coordinates (x0, y0) as the origin; the horizontal symmetry degree of the pupil edge points is calculated with the Y axis of the plane coordinate system as the symmetry axis, and the vertical symmetry degree of the pupil edge points is calculated with the X axis of the plane coordinate system as the symmetry axis.
The calculation formulas are as follows:
Horizontal symmetry degree:
|Mx - (N - Mx)| / N
where N is the total number of pupil edge points and Mx is the number of pupil edge points on one side of the Y axis.
Vertical symmetry degree:
|My - (N - My)| / N
where N is the total number of pupil edge points and My is the number of pupil edge points on one side of the X axis.
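A sketch of the symmetry computation following the two formulas above (the helper name and the handling of points lying exactly on an axis are assumptions):

```python
def symmetry_degrees(edge_points):
    """Compute the (horizontal, vertical) symmetry degrees
    Sx = |Mx - (N - Mx)| / N and Sy = |My - (N - My)| / N."""
    n = len(edge_points)
    if n == 0:
        return None                                   # no edge points at all
    x0 = sum(p[0] for p in edge_points) / n           # center coordinates by averaging
    y0 = sum(p[1] for p in edge_points) / n
    mx = sum(1 for x, y in edge_points if x > x0)     # points on one side of the Y axis
    my = sum(1 for x, y in edge_points if y > y0)     # points on one side of the X axis
    sx = abs(mx - (n - mx)) / n
    sy = abs(my - (n - my)) / n
    return sx, sy
```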
The horizontal symmetry degree is compared with the preset horizontal symmetry degree threshold, and the vertical symmetry degree is compared with the preset vertical symmetry degree threshold. Let Tx and Ty denote the horizontal symmetry degree threshold and the vertical symmetry degree threshold, respectively. The judgment results are as follows:
a) When the horizontal symmetry degree is not greater than Tx and the vertical symmetry degree is not greater than Ty,
it can be judged that the symmetry of the pupil edge points is good, i.e., the eyes are in the open state.
b) When the horizontal symmetry degree is greater than Tx and the vertical symmetry degree is greater than Ty,
it can be judged that the symmetry of the pupil edge points is too poor, and the eyes are in the closed state.
c) When the horizontal symmetry degree is greater than Tx and the vertical symmetry degree is not greater than Ty,
it can be judged that the horizontal symmetry of the pupil edge points is poor while the vertical symmetry is good, and the eyes are in an intermediate state.
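The comparison with the thresholds Tx and Ty can then be sketched as below; the exact inequality forms of cases a), b) and c) are reconstructed assumptions matching the verbal descriptions above, and the default threshold value 0.3 is taken from embodiment two:

```python
def eye_state(sym, tx=0.3, ty=0.3):
    """Map the (horizontal, vertical) symmetry degrees to an eye state using thresholds Tx, Ty."""
    if sym is None:                    # no pupil edge points were found at all
        return "closed"
    sx, sy = sym
    if sx <= tx and sy <= ty:          # case a): good symmetry in both directions
        return "open"
    if sx > tx and sy > ty:            # case b): poor symmetry in both directions
        return "closed"
    return "intermediate"              # case c): symmetry poor in one direction only
```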
The pupil of the human eye is small and low in gray level, and individual physiological factors do not cause the pupil image to be occluded by the eyelid. When the eyes are normally open, the pupil is complete; when the eyes are closed, the pupil disappears; when the eyes are in an intermediate state between open and closed, the upper and lower edges of the pupil are occluded. Therefore, whether the eyes are open or closed is judged by detecting the symmetry of the pupil edge.
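Putting the pieces together, an end-to-end sketch of the judgment pipeline, using only the illustrative helpers assumed in the sketches above rather than an official implementation of the patent, might read:

```python
def judge_eye_state(eye_image, c=1.0, tx=0.3, ty=0.3):
    """Full pipeline: preprocess, locate pupil center, extract edge points, judge symmetry."""
    enhanced, _lap = preprocess_eye_image(eye_image, c=c)        # contrast enhancement + Laplacian
    center = estimate_pupil_center(enhanced.astype('uint8'))     # darkest-region pupil center
    if center is None:
        return "closed"                                          # no pupil region visible at all
    edges = extract_pupil_edge_points(enhanced.astype('uint8'), center)
    return eye_state(symmetry_degrees(edges), tx=tx, ty=ty)      # threshold the symmetry degrees
```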
Embodiment two
Please refer to Fig. 2, which is a schematic diagram of pupil features when the eyes are in the open state, provided by an embodiment of the present invention.
As shown in Fig. 2, the black dots at the boundary between the white of the eye and the pupil are pupil edge points. With the pupil center point as the starting point, five rays are drawn in the upper half above the pupil center point; correspondingly, five rays are drawn symmetrically in the lower half below the pupil center point. Five pupil edge points are obtained in the upper half of the pupil and five pupil edge points are obtained in the lower half. Assume that the horizontal symmetry degree threshold Tx and the vertical symmetry degree threshold Ty are both 0.3.
By the formulas, both symmetry degrees are |5 - (10 - 5)| / 10 = 0, which is less than 0.3.
Therefore, the horizontal and vertical symmetry of the pupil edge points is good, and it can be judged that the eyes are in the open state.
Please refer to Fig. 3, which is a schematic diagram of pupil features when the eyes are in an intermediate state, provided by an embodiment of the present invention.
As shown in Fig. 3, the black dots at the boundary between the white of the eye and the pupil are pupil edge points. With the pupil center point as the starting point, five rays are drawn in the upper half above the pupil center point; correspondingly, five rays are drawn symmetrically in the lower half below the pupil center point. Two pupil edge points are obtained in the upper half of the pupil and five pupil edge points are obtained in the lower half.
By the formulas, the horizontal symmetry degree is |2 - (7 - 2)| / 7 = 3/7 ≈ 0.43, which is greater than 0.3, while the vertical symmetry degree remains below 0.3.
Therefore, the vertical symmetry of the pupil edge points is good while the horizontal symmetry is poor, and it can be judged that the eyes are in an intermediate state.
Assume the limiting case in which the eyes are completely closed. Then the number of pupil edge points is 0,
so no symmetric distribution of pupil edge points can be established by the formulas,
and the eyes are judged to be in the closed state.
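Connecting the three worked cases of this embodiment with the sketches above (the function eye_state is the illustrative helper assumed earlier, not part of the patent), a quick check might look like:

```python
# Fig. 2: 5 edge points above and 5 below the center, Tx = Ty = 0.3
print(eye_state((abs(5 - 5) / 10, abs(5 - 5) / 10)))   # -> "open"
# Fig. 3: 2 points above and 5 below; one symmetry degree is |2 - 5| / 7 ≈ 0.43 > 0.3
print(eye_state((abs(2 - 5) / 7, 0.0)))                # -> "intermediate"
# Fully closed eyes: no pupil edge points were detected at all
print(eye_state(None))                                  # -> "closed"
```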
In summary, specific examples have been used herein to explain the principle and embodiments of the present invention. The description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes may be made to the specific embodiments and the application scope according to the idea of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention, and the protection scope of the present invention shall be defined by the appended claims.

Claims (10)

  1. A method for judging eye state based on pupil features, characterized by comprising:
    obtaining an eye image information map;
    determining a pupil center point according to the eye image information map;
    extracting pupil edge points according to the pupil center point;
    judging the symmetry of the pupil edge points to determine the state the eyes are in.
  2. The method according to claim 1, characterized in that, before determining the pupil center point according to the eye image information map, the method further comprises:
    performing gray-scale contrast enhancement preprocessing on the eye image information map;
    performing Laplacian filtering on the eye image information map.
  3. The method according to claim 2, characterized in that the gray-scale contrast enhancement preprocessing formula is:
    f = c*log(1 + double(f0))
    wherein c is a constant coefficient, f0 represents the original eye image information map, and f represents the eye image information map after gray-scale contrast enhancement preprocessing.
  4. The method according to claim 3, characterized in that determining the pupil center point according to the eye image information map comprises:
    estimating the pupil center point coordinates (xmin, ymin) according to the gray-scale contrast of the eye image information map, and taking them as the pupil center point.
  5. The method according to claim 4, characterized in that extracting pupil edge points according to the pupil center point comprises:
    taking the pupil center point as the starting point, calculating the gray gradient value of the pupil region along specified ray directions on the eye image information map, and determining the position where the gray gradient value reaches its maximum as a pupil edge point.
  6. The method according to claim 5, characterized in that the specified rays comprise a plurality of first rays and second rays; wherein the first rays take the pupil center point as the starting point and are sent along the positive y-axis direction, and the second rays take the pupil center point as the starting point and are sent along the negative y-axis direction, opposite to the direction of the first rays.
  7. The method according to claim 1, characterized in that judging the symmetry of the pupil edge points comprises:
    determining the center coordinates (x0, y0) of the plurality of pupil edge points using an averaging method;
    establishing a plane coordinate system with the center coordinates (x0, y0) as the origin;
    calculating the horizontal symmetry degree of the pupil edge points with the Y axis of the plane coordinate system as the symmetry axis;
    calculating the vertical symmetry degree of the pupil edge points with the X axis of the plane coordinate system as the symmetry axis.
  8. The method according to claim 7, characterized in that the calculation formula of the horizontal symmetry degree is:
    |Mx - (N - Mx)| / N;
    wherein N is the total number of pupil edge points, and Mx is the number of pupil edge points on one side of the Y axis.
  9. The method according to claim 7, characterized in that the calculation formula of the vertical symmetry degree is:
    |My - (N - My)| / N
    wherein N is the total number of pupil edge points, and My is the number of pupil edge points on one side of the X axis.
  10. The method according to claim 7, characterized in that determining the open or closed state of the eyes comprises:
    comparing the horizontal symmetry degree with a preset horizontal symmetry degree threshold, comparing the vertical symmetry degree with a preset vertical symmetry degree threshold, and judging according to the comparison results whether the eyes are in the open state, the closed state or an intermediate state.
CN201711237771.2A 2017-11-30 2017-11-30 Method for judging eye state based on pupil characteristics Active CN107798316B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711237771.2A CN107798316B (en) 2017-11-30 2017-11-30 Method for judging eye state based on pupil characteristics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711237771.2A CN107798316B (en) 2017-11-30 2017-11-30 Method for judging eye state based on pupil characteristics

Publications (2)

Publication Number Publication Date
CN107798316A true CN107798316A (en) 2018-03-13
CN107798316B CN107798316B (en) 2021-05-14

Family

ID=61538097

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711237771.2A Active CN107798316B (en) 2017-11-30 2017-11-30 Method for judging eye state based on pupil characteristics

Country Status (1)

Country Link
CN (1) CN107798316B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111513671A (en) * 2020-01-20 2020-08-11 明月镜片股份有限公司 Glasses comfort evaluation method based on eye image

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101059836A (en) * 2007-06-01 2007-10-24 华南理工大学 Human eye positioning and human eye state recognition method
US20100002913A1 (en) * 2005-01-26 2010-01-07 Honeywell International Inc. distance iris recognition
TW201140511A (en) * 2010-05-11 2011-11-16 Chunghwa Telecom Co Ltd Drowsiness detection method
US20120177266A1 (en) * 2010-07-20 2012-07-12 Panasonic Corporation Pupil detection device and pupil detection method
CN103870796A (en) * 2012-12-13 2014-06-18 汉王科技股份有限公司 Eye sight evaluation method and device
CN106214166A (en) * 2016-09-30 2016-12-14 防城港市港口区高创信息技术有限公司 One is worn glasses Driver Fatigue Detection
CN106774863A (en) * 2016-12-03 2017-05-31 西安中科创星科技孵化器有限公司 A kind of method that Eye-controlling focus are realized based on pupil feature

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100002913A1 (en) * 2005-01-26 2010-01-07 Honeywell International Inc. distance iris recognition
CN101059836A (en) * 2007-06-01 2007-10-24 华南理工大学 Human eye positioning and human eye state recognition method
TW201140511A (en) * 2010-05-11 2011-11-16 Chunghwa Telecom Co Ltd Drowsiness detection method
US20120177266A1 (en) * 2010-07-20 2012-07-12 Panasonic Corporation Pupil detection device and pupil detection method
CN103870796A (en) * 2012-12-13 2014-06-18 汉王科技股份有限公司 Eye sight evaluation method and device
CN106214166A (en) * 2016-09-30 2016-12-14 防城港市港口区高创信息技术有限公司 One is worn glasses Driver Fatigue Detection
CN106774863A (en) * 2016-12-03 2017-05-31 西安中科创星科技孵化器有限公司 A kind of method that Eye-controlling focus are realized based on pupil feature

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BEI Y. et al.: "A robust algorithm for pupil center detection", 2011 6th IEEE Conference on Industrial Electronics and Applications *
MOHAMMAD D. et al.: "Design and implementation of a real time and train less eye state recognition system", EURASIP Journal on Advances in Signal Processing *
ZHANG Wencong et al.: "Eye open/closed state detection based on radial symmetry transform", Journal of University of Science and Technology of China *
DENG Hongping et al.: "Research on open/closed eye detection algorithms in gaze tracking systems", Journal of Circuits and Systems *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111513671A (en) * 2020-01-20 2020-08-11 明月镜片股份有限公司 Glasses comfort evaluation method based on eye image

Also Published As

Publication number Publication date
CN107798316B (en) 2021-05-14

Similar Documents

Publication Publication Date Title
CN103632136B (en) Human-eye positioning method and device
CN104123543B Eye movement recognition method based on face recognition
US11849998B2 (en) Method for pupil detection for cognitive monitoring, analysis, and biofeedback-based treatment and training
CN110221699B (en) Eye movement behavior identification method of front-facing camera video source
CN105224285A Eye open/closed state detection device and method
CN108053615A Driver fatigue driving state detection method based on micro-expressions
CN106846734A Fatigue driving detection device and method
CN107895157B (en) Method for accurately positioning iris center of low-resolution image
CN106650574A (en) Face identification method based on PCANet
CN106503644A (en) Glasses attribute detection method based on edge projection and color characteristic
CN109409298A Gaze tracking method based on video processing
CN106934365A Reliable glaucoma patient self-detection method
CN106203338B Method for quickly identifying human eye state based on grid region segmentation and adaptive thresholding
CN106557745A Human eyeball detection method and system based on maximum between-class variance and gamma transformation
CN110226913A Intelligent processing method and device for eyesight testing on a self-service examination machine
CN107977622B (en) Eye state detection method based on pupil characteristics
CN110097012A (en) The fatigue detection method of eye movement parameter monitoring based on N-range image processing algorithm
CN108288040A (en) Multi-parameter face identification system based on face contour
CN107798316A (en) A kind of method that eye state is judged based on pupil feature
Amudha et al. A fuzzy based eye gaze point estimation approach to study the task behavior in autism spectrum disorder
CN109919050A (en) Identity recognition method and device
Gao et al. Research on facial expression recognition of video stream based on OpenCV
Zhao et al. Fast localization algorithm of eye centers based on improved hough transform
CN109409347A Fatigue driving detection method based on facial feature localization
CN107862304B (en) Eye state judging method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Zheng Yuhui

Inventor after: Zhang Jie

Inventor before: Zhang Jie

CB03 Change of inventor or designer information
TA01 Transfer of patent application right

Effective date of registration: 20210415

Address after: 276003 Yinqueshan street, Lanshan District, Linyi City, Shandong Province

Applicant after: Yongmutang Co.,Ltd.

Address before: 710065 Xi'an new hi tech Zone, Shaanxi, No. 86 Gaoxin Road, No. second, 1 units, 22 stories, 12202 rooms, 51, B block.

Applicant before: XI'AN CREATION KEJI Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 276017 West Road, 100m south of the intersection of Tongda South Road and Dianchang Road, chenbaizhuang community, Shengzhuang street, Luozhuang District, Linyi City, Shandong Province

Patentee after: Yongmutang Co.,Ltd.

Address before: 276003 Yinqueshan street, Lanshan District, Linyi City, Shandong Province

Patentee before: Yongmutang Co.,Ltd.

CP02 Change in the address of a patent holder