CN108595008B - Human-computer interaction method based on eye movement control


Info

Publication number
CN108595008B
CN108595008B (application CN201810390132.8A)
Authority
CN
China
Prior art keywords
human
eye movement
pupil
stage
eye
Prior art date
Legal status
Active
Application number
CN201810390132.8A
Other languages
Chinese (zh)
Other versions
CN108595008A (en)
Inventor
蒋欣欣
冯帆
陈树峰
Current Assignee
Beijing Institute of Computer Technology and Applications
Original Assignee
Beijing Institute of Computer Technology and Applications
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Computer Technology and Applications filed Critical Beijing Institute of Computer Technology and Applications
Priority to CN201810390132.8A priority Critical patent/CN108595008B/en
Publication of CN108595008A publication Critical patent/CN108595008A/en
Application granted granted Critical
Publication of CN108595008B publication Critical patent/CN108595008B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/165Detection; Localisation; Normalisation using facial parts and geometric relationships
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a human-computer interaction method based on eye movement control, which comprises: locating the pupil of the human eye and extracting eye-movement features. Pupil-center location based on gray-scale information comprises three stages: a first stage of human-eye localization; a second stage of pupil-edge detection; and a third stage of pupil-center positioning. Eye-movement feature extraction based on viewpoint movement comprises calculating the gaze-point offset and mapping the viewpoint offset: based on the displacement difference obtained between two successive frames, the eye is moved back and forth over calibration points marked at known screen coordinates, and the mapping function is solved with a least-squares curve-fitting algorithm. After the eye-movement feature information is obtained, the corresponding system message response is issued on the basis of the extracted eye-movement displacement and angle information. The human-computer interaction method based on eye movement control reduces the user's effort, supports direct control, and achieves more efficient and more natural interaction.

Description

Human-computer interaction method based on eye movement control
Technical Field
The invention belongs to the field of computer software, and particularly relates to a human-computer interaction method based on eye movement control for mobile terminal equipment.
Background
With the development of science and technology, human-computer interaction is gradually shifting from precise interaction modes to non-precise ones.
Precise interaction means that the user enters interaction information through a precise input means. Common examples in daily life are: 1) typing operation commands, as in a DOS command line; 2) precise input through devices with positioning capability, such as a mouse or keyboard.
Due to the development of multi-channel interactive systems, people no longer focus on how to replace one precise interaction technology with another, but try to shift to and develop non-precise interaction technologies.
Non-precise interaction refers to the user interacting through means that cannot provide precise input, such as a speech recognition system. In everyday life, people frequently exchange information in non-precise ways, so the non-precise interaction mode matches human habits. Non-precise human-computer interaction can exchange information in many ways; the most common are:
1) Speech recognition: a speech signal is used as input to operate the machine directly and complete an interaction task. The two most obvious advantages of a speech interaction system are that language input is faster and more effective than keyboard input, and that under multitasking it gives the user an additional response channel, relieving the hands, especially when the user must handle two or more operations at once. For example, some current vehicle-mounted GPS units provide voice input and output, which is convenient for a driver whose hands and eyes cannot leave the road to operate the navigation system, so that the person's available resources are allocated more efficiently.
2) Gesture recognition: the motion of the hands or other body parts is tracked and recognized, mainly with devices such as data gloves, to complete an interaction task. For example, a human-machine sign-language translation system combines gesture actions with expression recognition and builds a good communication platform for deaf-mute users.
3) Visual tracking: the direction of the human gaze is detected, mainly using a high-definition camera together with methods such as electromagnetic sensing, ultrasound, or invisible infrared light, to complete an interaction task.
Although these non-precise interaction modes can, to a certain extent, achieve interaction between people and mobile devices, they also have obvious drawbacks. Speech interaction realizes actions by recognizing spoken input, so it places high demands on the external environment: when the environment is too noisy or too complex, the clarity of the speech signal and the accuracy of speech recognition suffer, which degrades recognition and interaction. Gesture-based interaction, in turn, is an action language that expresses intent and transmits commands through the position and configuration of arm, palm, and finger movements; it requires fine control of the gestures, the amplitude of the movements is difficult to control precisely, and it increases the burden on both hands.
Disclosure of Invention
The invention aims to provide a human-computer interaction method based on eye movement control, which is used for solving the defects and shortcomings of the existing intelligent interaction mode.
The invention relates to a human-computer interaction method based on eye movement control, which comprises: locating the pupil of the human eye and extracting eye-movement features. Pupil-center location based on gray-scale information comprises: a first stage, human-eye localization, in which a face image is acquired and converted into a corresponding gray-scale image, eye features are extracted, and the face gray-scale image is processed so that the position of the human eyes is determined from the change of the extracted eye feature values; a second stage, pupil-edge detection, in which edge detection is performed on the obtained human-eye gray-scale region, exploiting the sharp brightness change at image edges, to obtain the pupil region; and a third stage, pupil-center positioning, in which, based on the obtained pupil region, the original pupil region is processed with a method combining neighborhood gray-value comparison and the geometric center, yielding the pupil center. Eye-movement feature extraction based on viewpoint movement comprises: calculating the gaze-point offset: with the time threshold set to T and the eye-movement time set to t, then:
(the classification rule relating t and T is given as a formula image in the original document)
The current pupil center O1 is recorded from the pupil center obtained while the eye is still, with coordinates (x1, y1); from two adjacent frames, the pupil center O2 of the second frame is obtained, with coordinates (x2, y2); in a suitable coordinate system established from the two pupil centers, the displacement Δ(xd, yd) produced by one eye movement and the rotation angle θ are calculated, giving the gaze-point offset. Viewpoint-offset mapping: based on the obtained displacement difference between the two frames, the eye is moved back and forth over calibration points marked at known coordinates on the screen, and the mapping function is solved with a least-squares curve-fitting algorithm. After the eye-movement feature information is obtained, the corresponding system message response is issued on the basis of the extracted eye-movement displacement and angle information.
According to an embodiment of the human-computer interaction method based on eye movement control, eye features are extracted in a Haar matrix-based mode.
According to an embodiment of the human-computer interaction method based on eye movement control, the pupil edge detection stage specifically includes: defining a Gaussian weighted smoothing function, and performing convolution smoothing on the image, wherein the Gaussian weighted smoothing function is defined as:
h(xd, yd) = (1/(2πσ²)) · exp(-(xd² + yd²)/(2σ²))
where σ is the mean square deviation of the Gaussian distribution, and xd and yd are the coordinate values of a pixel in the two-dimensional image; the image gradient is computed with a gradient operator, partial-derivative matrices of the convolution-smoothed image in the x and y directions are obtained, and non-maximum suppression is applied to them to obtain a binary image; threshold screening is then performed on the binary image, so that the screened image contains only the high-brightness pupil boundary and its interior, yielding the pupil region.
According to an embodiment of the human-computer interaction method based on eye movement control, in the pupil center positioning stage, in the gray value comparison stage, the whole pupil region is traversed by using the window region of N × N, and the region with the largest sum of the gray values in the whole pupil region is obtained.
According to an embodiment of the human-computer interaction method based on eye movement control, in the stage of solving the central point, based on the N × N region with the largest sum of the obtained gray values, a geometric method is applied to take a point where diagonals of the window region intersect as a pupil central point.
According to an embodiment of the human-computer interaction method based on eye movement control, the eyes move back and forth to calibrate points calibrated by coordinates on the screen based on the obtained displacement difference of the front frame image and the rear frame image, the mapping function is solved by using a least square curve fitting algorithm, and the solved displacement is mapped onto the screen.
The human-computer interaction method based on eye movement control takes the change of the visual center point and eye-movement characteristics such as fixation, horizontal movement, and vertical movement as its entry points. Realizing human-computer interaction through eye-movement control frees users from having to operate smart devices by hand, thereby reducing the user's effort, enabling direct control, and achieving more efficient and more natural interaction.
Drawings
FIG. 1 is a basic flow chart of a human-computer interaction method based on eye movement control according to the present invention;
fig. 2 is a flow chart of pupil center positioning.
Detailed Description
In order to make the objects, contents, and advantages of the present invention clearer, the following detailed description of the embodiments of the present invention will be made in conjunction with the accompanying drawings and examples.
FIG. 1 is a basic flow chart of the human-computer interaction method based on eye movement control according to the present invention. As shown in FIG. 1, the method mainly comprises locating the pupil of the human eye and extracting eye-movement features, and specifically comprises:
performing a pupil center location based on gray scale information, comprising:
in order to obtain eye movement information in more detail and improve the accuracy of eye movement control, a pupil positioning method based on gray scale information is adopted, which is mainly divided into the following three stages:
the first stage is as follows: human eye positioning stage:
firstly, acquiring a face image, and converting the face image into a corresponding gray image;
secondly, extracting eye features by adopting a Haar matrix-based mode;
and finally, processing the face gray level image, and determining the positions of the human eyes by using the change of the extracted eye feature values.
And a second stage: pupil edge detection stage:
after the human eye position is obtained, edge detection is carried out on the obtained human eye gray scale region by utilizing the characteristic that the edge brightness of the image changes remarkably, namely the gray scale value of the pupil region is generally lower than the gray scale values of other regions.
Firstly, a Gaussian weighted smoothing function is defined and the image is smoothed by convolution with it. The Gaussian weighted smoothing function h(xd, yd) is defined as:
h(xd, yd) = (1/(2πσ²)) · exp(-(xd² + yd²)/(2σ²))
where σ is the mean square deviation of the Gaussian distribution: the smaller σ is, the more accurate the edge localization; the larger σ is, the stronger the noise resistance. xd and yd are the coordinate values of a pixel in the two-dimensional image.
Secondly, calculating the image gradient by using a gradient operator, obtaining partial derivative matrixes of the image after the convolution smoothing in the x and y directions, and performing non-maximum suppression on the matrixes to obtain a binary image.
And finally, threshold value screening is carried out on the binary image, and the screened image only contains the pupil boundary with high brightness and the internal region thereof, namely the pupil region.
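A minimal sketch of this pupil-edge detection stage follows; cv2.Canny is used as a stand-in for the gradient computation and non-maximum suppression steps, and the sigma and threshold values are illustrative assumptions rather than values from the patent.

```python
import cv2
import numpy as np

# Minimal sketch of the pupil-edge detection stage.
# Assumptions: illustrative sigma and thresholds; cv2.Canny stands in for
# the gradient-operator plus non-maximum-suppression step in the text.
def detect_pupil_region(eye_gray, sigma=1.5, canny_lo=40, canny_hi=120,
                        dark_threshold=60):
    # Gaussian weighted smoothing (convolution with h(xd, yd)).
    smoothed = cv2.GaussianBlur(eye_gray, (0, 0), sigmaX=sigma)

    # Gradient + non-maximum suppression, yielding a binary edge image.
    edges = cv2.Canny(smoothed, canny_lo, canny_hi)

    # Threshold screening: the pupil is darker than the surrounding iris and
    # sclera, so keep only edge pixels that lie on or inside the dark region.
    dark_mask = (smoothed < dark_threshold).astype(np.uint8) * 255
    pupil_region = cv2.bitwise_and(edges, dark_mask)
    return pupil_region
```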
And a third stage: pupil center positioning stage:
in order to better realize the extraction of the features, the pupil center needs to be positioned. And processing the original pupil area by adopting a method of combining neighborhood gray value comparison and a geometric center based on the pupil area obtained in the last step, thereby obtaining the pupil center.
Fig. 2 is a flow chart of pupil center location. As shown in FIG. 2, gray-value comparison and center-point solving are the two key steps. In the gray-value comparison stage, exploiting the relatively large gray values at the pupil in the screened image, the whole pupil region is traversed with an N × N window (N > 1) to find the region whose sum of gray values is largest; in the center-point solving stage, based on this N × N region with the largest gray-value sum, the point where the diagonals of the window intersect is taken geometrically as the pupil center.
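A minimal sketch of this pupil-center positioning stage is given below; the window size N and the use of an integral image to speed up the window-sum search are illustrative choices, not taken from the patent.

```python
import numpy as np

# Minimal sketch of the pupil-center positioning stage.
# Assumptions: N is an illustrative window size; `pupil_region` is the
# screened gray image from the previous stage, with the pupil brighter
# than its surroundings.
def locate_pupil_center(pupil_region, N=15):
    img = pupil_region.astype(np.float64)
    h, w = img.shape

    # Integral image lets us evaluate every N x N window sum in O(1).
    integral = np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)
    best_sum, best_xy = -1.0, (0, 0)
    for y in range(0, h - N + 1):
        for x in range(0, w - N + 1):
            s = (integral[y + N, x + N] - integral[y, x + N]
                 - integral[y + N, x] + integral[y, x])
            if s > best_sum:
                best_sum, best_xy = s, (x, y)

    # Geometric center: the intersection of the window's diagonals.
    x0, y0 = best_xy
    return (x0 + (N - 1) / 2.0, y0 + (N - 1) / 2.0)
```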
The eye movement characteristic information extraction based on the visual point movement comprises the following steps:
after the central position of the pupil of the human eye is accurately acquired, the calculation and extraction of the related characteristic information can be carried out according to the shift of the sight line of the pupil, and the corresponding system control response is carried out based on the acquired characteristic information, so that the expected eye movement control is realized. The extraction of the eye movement characteristic information based on the movement of the visual points is mainly realized by the following steps:
the first stage is as follows: calculating the fixation point offset:
the time threshold is set to be T, the eye movement time is T, and the following rules are provided:
(the classification rule relating t and T is given as a formula image in the original document)
The current pupil center O1 is recorded from the pupil center obtained while the eye is still, with its coordinates denoted (x1, y1). From two adjacent frames, the pupil center O2 of the second frame is obtained, with coordinates (x2, y2). A suitable coordinate system is established from the two pupil centers and used to calculate the displacement Δ(xd, yd) produced by one eye movement and the rotation angle (direction) θ, i.e. the direction of the latter pupil center relative to the former, which gives the gaze-point offset.
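A minimal sketch of this gaze-point offset calculation is given below; the interpretation of the time-threshold rule (movements shorter than T are discarded as noise) is an assumption, since the rule itself appears only as a formula image in the original.

```python
import math

# Minimal sketch of the gaze-point offset calculation.
# Assumptions: `t` is the elapsed eye-movement time and `T` the time
# threshold named in the text; movements shorter than T are treated here
# as noise/blinks and ignored, which is an assumed reading of the rule.
def gaze_offset(o1, o2, t, T):
    (x1, y1), (x2, y2) = o1, o2
    if t < T:                      # assumed interpretation of the threshold rule
        return None
    xd, yd = x2 - x1, y2 - y1      # displacement of the pupil center
    theta = math.atan2(yd, xd)     # rotation angle (direction) of the movement
    return (xd, yd), theta
```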
And a second stage: and (3) view offset mapping:
based on the displacement difference of the two acquired images, the eyes move back and forth by using a statistical method to calibrate the points marked by the coordinates on the screen. Then, the mapping function is solved by using a least square curve fitting algorithm, and finally the solved displacement is mapped on a screen.
After the eye-movement feature information is obtained by the above method, the corresponding system message response is issued on the basis of the extracted eye-movement displacement and angle information, thereby accomplishing the intended eye-movement control task and realizing human-computer interaction based on eye movement control.
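As a minimal sketch of this response step, the displacement and angle information could be turned into system messages as follows; move_cursor and trigger_action are hypothetical platform callbacks, and the jitter threshold and direction-based dispatch policy are illustrative assumptions, since the patent does not fix a concrete policy.

```python
import math

# Minimal sketch of the system-message response step.
# Assumptions: hypothetical callbacks `move_cursor` / `trigger_action`;
# illustrative jitter threshold and direction-based dispatch.
def respond(screen_xy, displacement, theta, move_cursor, trigger_action,
            min_displacement=2.0):
    xd, yd = displacement
    if math.hypot(xd, yd) < min_displacement:
        return                                # ignore jitter below threshold
    sx, sy = screen_xy
    move_cursor(sx, sy)                       # follow the mapped gaze point
    # Dispatch a direction-dependent system message based on the angle theta
    # (horizontal vs. vertical eye movement), as an illustrative policy.
    if abs(math.cos(theta)) > abs(math.sin(theta)):
        trigger_action("horizontal_move", sx, sy)
    else:
        trigger_action("vertical_move", sx, sy)
```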
For eye-movement interaction behavior, first, eye-movement behavior data during human-computer interaction with the mobile device are collected by monitoring the user's eye movements; second, eye-movement interaction is realized by purposefully designing the forms of eye-movement behavior.
To achieve the best interaction experience and ensure the generality of the eye-movement interaction mode, the eye-movement-controlled human-computer interaction method adopts the first approach, namely the natural eye-movement behavior occurring during human-computer interaction: based on the motion and posture characteristics of this natural behavior, the visual feedback and prompts corresponding to the capture and matching of specific eye-movement postures and behaviors are studied. The key reason for choosing this approach is that it does not interrupt the user's natural eye-movement control process, and it improves the human-computer interaction experience by statistically exploiting the rich information contained in natural eye-movement behavior.
With the human-computer interaction method based on eye movement control, natural eye-movement behavior data during human-computer interaction with the mobile device are obtained by monitoring the user's eye movements; through effective analysis of a large amount of data, pupil localization and eye-movement feature extraction are achieved, and on this basis interaction behavior controlled by eye movement is realized. The method frees users from having to operate smart devices with both hands, reduces the user's effort, enables direct control, and achieves more efficient and more natural interaction.
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention.

Claims (6)

1. A human-computer interaction method based on eye movement control is characterized by comprising the following steps: positioning the pupils of the human eyes and extracting eye movement characteristics;
wherein, carry out pupil center location based on grey scale information, include:
the first stage is as follows: human eye positioning stage:
acquiring a face image, and converting the face image into a corresponding gray image;
extracting eye features;
processing the human face gray level image, and determining the position of human eyes by using the change of the extracted eye feature value;
and a second stage: pupil edge detection stage:
performing edge detection on the obtained human eye gray level region by using the characteristic that the edge brightness of the image changes remarkably to obtain a pupil region;
and a third stage: pupil center positioning stage:
based on the obtained pupil area, processing the original pupil area by adopting a method of combining neighborhood gray value comparison and a geometric center, thereby obtaining the pupil center;
the eye movement characteristic information extraction based on the visual point movement comprises the following steps:
calculating the fixation point offset:
setting the time threshold as T and the eye-movement time as t, then:
(the classification rule relating t and T is given as a formula image in the original document)
recording the current pupil center O1 from the pupil center obtained while the eye is still, with coordinates (x1, y1); obtaining the pupil center O2 of the second frame from two adjacent frames, with coordinates (x2, y2); and calculating, in a suitable coordinate system established from the two pupil centers, the displacement Δ(xd, yd) produced by one eye movement and the rotation angle θ, to obtain the gaze-point offset;
and (3) view offset mapping:
based on the obtained displacement difference between the front frame image and the rear frame image, the eyes move back and forth to calibrate the points marked by the coordinates on the screen, and a least square curve fitting algorithm is utilized to solve the mapping function;
and after the eye movement characteristic information is obtained, corresponding system message response is carried out based on the obtained eye movement control displacement and angle information.
2. The eye movement control-based human-computer interaction method according to claim 1, wherein the eye features are extracted in a Haar matrix-based manner.
3. The eye movement control-based human-computer interaction method according to claim 1, wherein the pupil edge detection stage specifically comprises:
defining a Gaussian weighted smoothing function, and performing convolution smoothing on the image, wherein the Gaussian weighted smoothing function is defined as:
h(xd, yd) = (1/(2πσ²)) · exp(-(xd² + yd²)/(2σ²))
where σ is the mean square deviation of the Gaussian distribution, and xd and yd represent the coordinate values of a pixel in the two-dimensional image;
calculating image gradient by using a gradient operator, obtaining partial derivative matrixes of the image after the convolution smoothing in the x and y directions, and performing non-maximum suppression on the matrixes to obtain a binary image;
and (4) performing threshold value screening on the binary image, wherein the screened image only comprises a pupil boundary with high brightness and an internal region thereof, and obtaining a pupil region.
4. The eye movement control-based human-computer interaction method according to claim 1, wherein in the pupil centering stage, in the gray value comparison stage, the entire pupil region is traversed by using N × N window regions to find a region in the entire pupil region where the sum of gray values is the largest.
5. The eye-movement-control-based human-computer interaction method according to claim 4, wherein in the stage of solving the central point, a point where diagonals of the window region intersect is used as a pupil central point by applying a geometric method based on an N x N region where a sum of obtained gray values is maximum.
6. The human-computer interaction method based on eye movement control as claimed in claim 1, wherein based on the obtained displacement difference between the front and back frames of images, the eyes move back and forth to calibrate the points calibrated by coordinates on the screen, and after the mapping function is solved by using a least square curve fitting algorithm, the solved displacement is mapped onto the screen.
CN201810390132.8A 2018-04-27 2018-04-27 Human-computer interaction method based on eye movement control Active CN108595008B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810390132.8A CN108595008B (en) 2018-04-27 2018-04-27 Human-computer interaction method based on eye movement control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810390132.8A CN108595008B (en) 2018-04-27 2018-04-27 Human-computer interaction method based on eye movement control

Publications (2)

Publication Number Publication Date
CN108595008A CN108595008A (en) 2018-09-28
CN108595008B (en) 2022-02-08

Family

ID=63610507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810390132.8A Active CN108595008B (en) 2018-04-27 2018-04-27 Human-computer interaction method based on eye movement control

Country Status (1)

Country Link
CN (1) CN108595008B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109614182A (en) * 2018-11-20 2019-04-12 维沃移动通信有限公司 A kind of display methods and terminal device
CN109613982A (en) * 2018-12-13 2019-04-12 叶成环 Wear-type AR shows the display exchange method of equipment
CN109766818B (en) * 2019-01-04 2021-01-26 京东方科技集团股份有限公司 Pupil center positioning method and system, computer device and readable storage medium
CN110119720B (en) * 2019-05-17 2023-04-28 南京邮电大学 Real-time blink detection and human eye pupil center positioning method
CN110472546B (en) * 2019-08-07 2024-01-12 南京大学 Infant non-contact eye movement feature extraction device and method
CN111078000B (en) * 2019-11-18 2023-04-28 中北大学 Method, device and system for performing eye machine interaction according to eye behavior characteristics
US10860098B1 (en) 2019-12-30 2020-12-08 Hulu, LLC Gesture-based eye tracking
CN111399658B (en) * 2020-04-24 2022-03-15 Oppo广东移动通信有限公司 Calibration method and device for eyeball fixation point, electronic equipment and storage medium
CN113050792A (en) * 2021-03-15 2021-06-29 广东小天才科技有限公司 Virtual object control method and device, terminal equipment and storage medium
CN113435357B (en) * 2021-06-30 2022-09-02 平安科技(深圳)有限公司 Voice broadcasting method, device, equipment and storage medium
CN113963416B (en) * 2021-11-05 2024-05-31 北京航空航天大学 Eye movement interaction method and system based on laser visual feedback
CN117912087B (en) * 2024-03-20 2024-05-31 杭州臻稀生物科技有限公司 Digital specific object detection method based on facial microexpressions and multi-stage classification

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102520796A (en) * 2011-12-08 2012-06-27 华南理工大学 Sight tracking method based on stepwise regression analysis mapping model
CN102830797A (en) * 2012-07-26 2012-12-19 深圳先进技术研究院 Man-machine interaction method and system based on sight judgment
CN102930252A (en) * 2012-10-26 2013-02-13 广东百泰科技有限公司 Sight tracking method based on neural network head movement compensation
CN103136512A (en) * 2013-02-04 2013-06-05 重庆市科学技术研究院 Pupil positioning method and system
CN103366157A (en) * 2013-05-03 2013-10-23 马建 Method for judging line-of-sight distance of human eye
CN105159460A (en) * 2015-09-10 2015-12-16 哈尔滨理工大学 Intelligent home controller based on eye-movement tracking and intelligent home control method based on eye-movement tracking
CN205181313U (en) * 2015-03-14 2016-04-27 中国科学院苏州生物医学工程技术研究所 Two mesh pupil light reflex tracker
CN107145226A (en) * 2017-04-20 2017-09-08 中国地质大学(武汉) Eye control man-machine interactive system and method
CN107656613A (en) * 2017-09-08 2018-02-02 国网山东省电力公司电力科学研究院 A kind of man-machine interactive system and its method of work based on the dynamic tracking of eye
CN107831900A (en) * 2017-11-22 2018-03-23 中国地质大学(武汉) The man-machine interaction method and system of a kind of eye-controlled mouse

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101788848B (en) * 2009-09-29 2012-05-23 北京科技大学 Eye characteristic parameter detecting method for sight line tracking system
CN101964111B (en) * 2010-09-27 2011-11-30 山东大学 Method for improving sight tracking accuracy based on super-resolution
CN103067662A (en) * 2013-01-21 2013-04-24 天津师范大学 Self-adapting sightline tracking system
US10016130B2 (en) * 2015-09-04 2018-07-10 University Of Massachusetts Eye tracker system and methods for detecting eye parameters
CN107784280A (en) * 2017-10-18 2018-03-09 张家港全智电子科技有限公司 A kind of dynamic pupil tracking method


Also Published As

Publication number Publication date
CN108595008A (en) 2018-09-28

Similar Documents

Publication Publication Date Title
CN108595008B (en) Human-computer interaction method based on eye movement control
Hasan et al. RETRACTED ARTICLE: Static hand gesture recognition using neural networks
US9891716B2 (en) Gesture recognition in vehicles
CN102830797B (en) A kind of man-machine interaction method based on sight line judgement and system
US10891473B2 (en) Method and device for use in hand gesture recognition
Qi et al. Computer vision-based hand gesture recognition for human-robot interaction: a review
US20130120250A1 (en) Gesture recognition system and method
CN110321795B (en) User gesture recognition method and device, computer device and computer storage medium
KR20200111617A (en) Gesture recognition method, device, electronic device, and storage medium
CN106569613A (en) Multi-modal man-machine interaction system and control method thereof
CN107357428A (en) Man-machine interaction method and device based on gesture identification, system
JP2014501011A (en) Method, circuit and system for human machine interface with hand gestures
Rautaray et al. Design of gesture recognition system for dynamic user interface
Vasisht et al. Human computer interaction based eye controlled mouse
Vivek Veeriah et al. Robust hand gesture recognition algorithm for simple mouse control
CN108256379A (en) A kind of eyes posture identification method based on Pupil diameter
CN112488059A (en) Spatial gesture control method based on deep learning model cascade
Khan et al. Computer vision based mouse control using object detection and marker motion tracking
CN115951783A (en) Computer man-machine interaction method based on gesture recognition
US20220050528A1 (en) Electronic device for simulating a mouse
KR20190132885A (en) Apparatus, method and computer program for detecting hand from video
Dhamanskar et al. Human computer interaction using hand gestures and voice
KR101909326B1 (en) User interface control method and system using triangular mesh model according to the change in facial motion
Khan et al. Gesture recognition using Open-CV
Deb et al. Designing an intelligent blink analyzer tool for effective human computer interaction through eye

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant