CN101311882A - Eye tracking human-machine interaction method and apparatus - Google Patents

Eye tracking human-machine interaction method and apparatus

Info

Publication number
CN101311882A
Authority
CN
China
Prior art keywords
action
eye tracking
sight line
facial image
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2007100995113A
Other languages
Chinese (zh)
Inventor
左坤隆 (Zuo Kunlong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CNA2007100995113A
Publication of CN101311882A
Legal status: Pending


Abstract

The invention relates to an eye-tracking human-machine interaction method, comprising: collecting eye-tracking information and obtaining a gaze focus position from the eye-tracking information; collecting facial image information and recognizing a facial action from the facial image information; and outputting, according to the gaze focus position, the control instruction corresponding to the facial action. The invention also relates to an eye-tracking human-machine interaction device, comprising: an eye-tracking processing unit for performing the eye-tracking operation; a facial-action recognition unit for collecting facial image information and recognizing facial actions from it; and a control-instruction output unit, connected to the eye-tracking processing unit and the facial-action recognition unit respectively, for outputting the control instruction corresponding to the facial action according to the gaze focus position. The embodiments of the invention provide a contactless eye-tracking human-machine interaction method and device with a simple interaction protocol.

Description

Eye tracking human-machine interaction method and device
Technical field
The present invention relates to a human-machine interaction method and device, and in particular to an eye-tracking human-machine interaction method and device, belonging to the field of human-computer interaction.
Background technology
Devices such as keyboards, mice, and joysticks provide a communication interface between machine and operator, thereby realizing human-machine interaction. In recent years, with the development of eye-tracking technology, mouse simulation and keyboard simulation based on eye tracking have been studied and applied in the field of human-computer interaction, for example to help disabled persons or aviation pilots operate computers. On a personal computer, the result of eye tracking can steer the mouse pointer, so eye tracking can substitute for the mouse in realizing human-machine interaction. In some industrial control or aerospace scenarios, or in emergencies, the user's hands may all be occupied operating control mechanisms other than the computer; at such times only the line of sight is available to operate the computer.
The prior art uses an auxiliary mouse device to assist the eye-tracking interaction process. To select a target, the user's gaze is brought onto the target to be selected and the mouse button is then pressed to complete the selection. To move a target, the user's gaze is brought onto the target and the mouse button is pressed to grab it; the user then moves the gaze to a new position, and when the mouse button is released the target rests at the new position. In the course of making the present invention, the inventor found at least the following problems in this technique:
(1) A mouse is used as an auxiliary input, so hand motion is still needed to press and release the mouse button during interaction. The system therefore cannot be used in special situations where the hands are unavailable, such as flying an aircraft, and cannot be used by disabled persons whose hands cannot move.
(2) When such a system is installed in public facilities, the need to touch the mouse increases the possibility of spreading disease among its users.
There is also a method that selects control options according to gaze dwell time or blinking. After the operator's fixation point dwells somewhere for a certain time, the current interaction mode switches to a fine-adjustment mode. In this mode, the program first records the current cursor position, then a pop-up menu appears on the screen containing options such as "click, double-click, left button, right button"; the gaze then moves the cursor onto one of the menu items, and dwelling for a certain time or blinking selects that item, thereby realizing actions such as "click" and "double-click". In the course of making the present invention, the inventor found at least the following problems in this method:
It is very difficult to choose a suitable dwell-time threshold. If a small value is chosen, misoperation occurs easily: whenever the gaze pauses slightly, the pop-up menu appears even though the operator intended no action, which is especially troublesome in certain tasks (such as reading an article or watching a video). If a large value is chosen, the execution time of each command is prolonged and interaction efficiency drops sharply. Moreover, in many situations the pop-up menu obstructs the operator's view of the screen content. In short, such an interaction protocol makes the interaction unnatural and inefficient and is prone to misoperation.
Summary of the invention
The object of the embodiments of the invention is to provide a contactless eye-tracking human-machine interaction method and device with a simple interaction protocol.
To achieve this object, an embodiment of the eye-tracking human-machine interaction method of the invention comprises: collecting eye-tracking information and obtaining a gaze focus position from the eye-tracking information; collecting facial image information and recognizing a facial action from the facial image information; and outputting, according to the gaze focus position, the control instruction corresponding to the facial action.
To achieve this object, an embodiment of the eye-tracking human-machine interaction device of the invention comprises: an eye-tracking processing unit for performing the eye-tracking operation, the operation comprising collecting eye-tracking information and obtaining a gaze focus position from it; a facial-action recognition unit for collecting facial image information and recognizing facial actions from it; and a control-instruction output unit, connected to the eye-tracking processing unit and the facial-action recognition unit respectively, for outputting the control instruction corresponding to the facial action according to the gaze focus position.
In the method embodiments of the invention, a gaze focus position is obtained from collected eye-tracking information, facial image information is collected and a facial action recognized from it, and finally the control instruction corresponding to the facial action is output according to the gaze focus position. Throughout the process the operator's hands need not touch a keyboard, mouse, joystick, or other peripheral, so the method can be used in special situations where the hands are unavailable, such as flying an aircraft, and is also convenient for disabled persons whose hands cannot move. Because the facial action and the gaze focus position are used jointly as the condition for outputting a control instruction, the interaction protocol is simple and the output of control instructions is more accurate.
The device embodiments of the invention add a facial-action recognition unit to an existing eye-tracking processing unit; a control-instruction output unit outputs the control instruction corresponding to the facial action according to the gaze focus position. This realizes a contactless control mode, improves user comfort, and, on shared equipment, avoids the transmission of disease caused by operators' hands touching keyboards, mice, joysticks, or other peripherals.
Description of drawings
Fig. 1 is a flowchart of embodiment one of the eye-tracking human-machine interaction method of the invention;
Fig. 2 is a schematic diagram of face action A;
Fig. 3 is a schematic diagram of face action B;
Fig. 4 is a schematic diagram of face action C;
Fig. 5 is a schematic diagram of face action D;
Fig. 6 is a flowchart of embodiment two of the eye-tracking human-machine interaction method of the invention;
Fig. 7 is a flowchart of embodiment three of the eye-tracking human-machine interaction method of the invention;
Fig. 8 is a flowchart of embodiment four of the eye-tracking human-machine interaction method of the invention;
Fig. 9 is a flowchart of embodiment five of the eye-tracking human-machine interaction method of the invention;
Fig. 10 is a flowchart of embodiment six of the eye-tracking human-machine interaction method of the invention;
Fig. 11 is a flowchart of embodiment seven of the eye-tracking human-machine interaction method of the invention;
Fig. 12 is a structural diagram of embodiment one of the eye-tracking human-machine interaction device of the invention;
Fig. 13 is a structural diagram of embodiment two of the eye-tracking human-machine interaction device of the invention;
Fig. 14 is a structural diagram of embodiment three of the eye-tracking human-machine interaction device of the invention;
Fig. 15 is a structural diagram of embodiment four of the eye-tracking human-machine interaction device of the invention;
Fig. 16 is a structural diagram of embodiment five of the eye-tracking human-machine interaction device of the invention;
Fig. 17 is a structural diagram of embodiment six of the eye-tracking human-machine interaction device of the invention;
Fig. 18 is a structural diagram of embodiment seven of the eye-tracking human-machine interaction device of the invention;
Fig. 19 is a structural diagram of embodiment eight of the eye-tracking human-machine interaction device of the invention.
Detailed description
The technical solution of the present invention is described in further detail below with reference to the drawings and embodiments.
Fig. 1 is a flowchart of embodiment one of the eye-tracking human-machine interaction method of the invention, comprising:
Step 1: collect eye-tracking information.
Step 2: obtain the gaze focus position from the eye-tracking information.
Step 3: collect facial image information. One way to do so is to photograph the operator's face with a vision device of sufficient resolution.
Step 4: recognize the facial action from the facial image information, for example from several information points in the image such as the movements of the eyes, mouth, and eyebrows and their combinations.
Step 5: output, according to the gaze focus position, the control instruction corresponding to the facial action. The gaze focus position corresponds to the target item to be controlled, such as a page, a file, or a folder. For example, if the gaze focus position corresponds to a page and the control instruction corresponding to the facial action is "scroll", the display will show the page scrolling, much as with a mouse scroll wheel.
The fundamental problem of eye-tracking technology is measuring the gaze direction of the eyes. There are many measuring methods, including the pupil-corneal reflection vector method, electro-oculography (EOG), the iris-sclera boundary method, the corneal reflection method, and the contact-lens method. Of these, the one best suited to human-machine interaction scenarios is the pupil-corneal reflection vector method, which is also the method most used in current eye-tracking products. Its basic principle is as follows: when an infrared auxiliary light source illuminates the face, a reflection image forms on the corneal surface of the eye; this reflection is called the Purkinje spot. As the eye fixates different positions on the computer screen, the eyeball rotates accordingly. Assuming the subject's head is stationary, then because the position of the infrared LED is fixed and the eyeball is approximately spherical while rotating, the absolute position of the Purkinje spot can be taken as constant, whereas the positions of the iris and pupil change. The relative position of the Purkinje spot with respect to the pupil and iris therefore changes as well, and this relative position can be determined by image processing. From the relative position the gaze direction is derived, and from the gaze direction the gaze focus position is obtained.
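The relative-position calculation above can be sketched in code. This is a minimal illustration under stated assumptions, not the patent's implementation: the per-axis linear calibration, the synthetic sample points, and all function names are assumptions; a real system uses image processing to locate the pupil and glint and typically a richer (e.g. polynomial) calibration model.

```python
import statistics

# Sketch of pupil-corneal reflection gaze estimation. The Purkinje spot
# (glint) stays roughly fixed while the pupil moves with eye rotation, so
# the pupil-minus-glint vector indexes gaze direction. A per-axis linear
# fit from calibration samples maps that vector to a screen position.

def gaze_vector(pupil_center, glint_center):
    """Pupil position relative to the (approximately fixed) corneal glint."""
    return (pupil_center[0] - glint_center[0], pupil_center[1] - glint_center[1])

def calibrate_linear(samples):
    """Fit screen = a*v + b per axis from (gaze_vector, screen_point) pairs."""
    def fit(axis):
        vs = [v[axis] for v, _ in samples]
        ss = [s[axis] for _, s in samples]
        vm, sm = statistics.mean(vs), statistics.mean(ss)
        a = sum((v - vm) * (s - sm) for v, s in zip(vs, ss)) / sum((v - vm) ** 2 for v in vs)
        return a, sm - a * vm
    return fit(0), fit(1)

def focus_position(pupil_center, glint_center, cal):
    """Map the current pupil-glint vector to a gaze focus position on screen."""
    (ax, bx), (ay, by) = cal
    vx, vy = gaze_vector(pupil_center, glint_center)
    return (ax * vx + bx, ay * vy + by)
```

In use, the calibration pairs would come from asking the operator to fixate known screen points while the head is held still, matching the stationary-head assumption stated above.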
Ekman's Facial Action Coding System (FACS) defines concrete forms of facial action, called Action Units (AUs). FACS defines 44 AUs, but many of them occur rarely in daily life, and some are difficult for ordinary people to perform. So that the operator can produce them easily during interaction, the embodiments of the invention select only a few AUs, or combinations of AUs, that are easy to make and that a computer recognizes with high accuracy; in particular, the operator may choose suitable AUs from among the computer-recognizable ones according to his or her own situation. To guarantee recognition accuracy, before the device is first used a face-action trainer may prompt the operator to practice a given facial action, such as raising the eyebrows and widening the eyes; the operator performs the action as prompted, and if it does not reach the trainer's required standard range the trainer signals the operator until it does. In this embodiment and the later ones, the AUs or AU combinations shown in Figs. 2, 3, 4, and 5 are selected as the preset facial actions corresponding to specific control instructions. The action of Fig. 2 is lowering the eyebrows and drawing them together (frowning), abbreviated face action A; Fig. 3 is raising the eyebrows and widening the eyes, abbreviated face action B; Fig. 4 is opening the mouth, abbreviated face action C. The control instruction represented by face action A is a left click, that of face action B a left double-click, and that of face action C a right click.
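The action-to-instruction correspondence just described can be written down directly. The table follows the source (face actions A, B, C from Figs. 2-4, and the switch action D from Fig. 5 introduced later); the dictionary shape and function name are illustrative assumptions.

```python
# Preset face actions and their control instructions, per Figs. 2-5.
FACE_ACTION_COMMANDS = {
    "A": "left_click",         # brows lowered and drawn together (frown), Fig. 2
    "B": "left_double_click",  # brows raised, eyes widened, Fig. 3
    "C": "right_click",        # mouth opened, Fig. 4
}
SWITCH_ACTION = "D"            # nose wrinkled, Fig. 5: toggles eye tracking

def command_for(action):
    """Return the control instruction for a recognized face action, or None."""
    if action == SWITCH_ACTION:
        return "toggle_tracking"
    return FACE_ACTION_COMMANDS.get(action)
```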
Before a control instruction is output, i.e. before step 5 is executed, steps 1, 2, 3, and 4 must all be carried out. Steps 1 and 2 are sequentially ordered in time, as are steps 3 and 4; but steps 1 and 3, and steps 2 and 4, have no required temporal order with respect to each other.
This embodiment obtains a gaze focus position from collected eye-tracking information, collects facial image information and recognizes a facial action from it, and finally outputs the control instruction corresponding to the facial action according to the gaze focus position. Throughout the process the operator's hands need not touch a keyboard, mouse, joystick, or other peripheral, so the method can be used where the hands are unavailable, such as when flying an aircraft, and is convenient for disabled persons whose hands cannot move. Because facial action and gaze focus position are used jointly as the condition for outputting control instructions, the interaction protocol is simple and the output more accurate.
Although the eye-tracking operation is convenient for the operator, gaze movement is partly involuntary, so if there is no command to start or stop the eye-tracking operation, misoperation may occur: the operator's gaze may rest on some position while the mouth happens to open, without any intention of issuing a right-click. A preset switch action can therefore be defined: when the detected facial action is the preset switch action, the eye-tracking operation is started or stopped. Fig. 5 shows the nose being wrinkled, abbreviated face action D. Embodiment two and the later embodiments preset face action D as the switch action: when face action D is detected, the eye-tracking operation is started or stopped; when it is not detected, detection of whether the facial action is the preset switch action continues. Taking computer operation as an example, embodiment two differs from embodiment one in that, at boot, the facial-action recognition unit is started first and facial image information is collected; when face action D is first recognized from the facial image information, the eye-tracking processing unit is started and the eye-tracking operation begins accordingly. Fig. 6 is a flowchart of embodiment two of the eye-tracking human-machine interaction method of the invention; compared with embodiment one it further comprises:
Step 1a: collect facial image information.
Step 2a: recognize the facial action from the facial image information.
Step 3a: detect whether the facial action is face action D. If it is, execute step 1; if not, return to step 1a and continue detecting whether the facial action is face action D.
In this embodiment, the facial-action recognition unit is started first at boot; when the facial action is first detected to be the preset switch action, the eye-tracking operation is started, the operation comprising collecting eye-tracking information and obtaining the gaze focus position from it. The possibility of misoperation is thereby reduced.
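The startup protocol of embodiment two can be sketched as a small loop: at boot only the face-action recognizer runs, and eye tracking starts at the first detected switch action. The function name and the list-of-actions input are illustrative assumptions; in a real device the actions would arrive as a stream from the recognizer.

```python
# Embodiment-two startup: consume recognized face actions until the first
# switch action "D"; eye tracking is started at that point.

def boot_sequence(actions):
    """Return the index of the first 'D' (when tracking starts), else None."""
    for i, action in enumerate(actions):
        if action == "D":
            return i
    return None  # switch action never seen: tracking never started
```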
After boot, the eye-tracking operation can be started (or stopped) by judging whether the facial action recognized from the facial image information is the preset switch action and whether its cumulative occurrence count is odd (or a non-zero even number). Fig. 7 is a flowchart of embodiment three of the eye-tracking human-machine interaction method of the invention; compared with embodiment one it further comprises:
Step 1b: collect facial image information.
Step 2b: recognize the facial action from the facial image information.
Step 3b: detect whether the facial action is face action D. If it is, execute step 4b; if not, return to step 1b and continue detecting whether the facial action is face action D.
Step 4b: judge whether the occurrence count of face action D is odd. If it is odd, execute step 1; if not, execute step 5b.
Step 5b: stop the eye-tracking operation and go to step 1b.
In this embodiment, after boot, whether to start or stop the eye-tracking operation is decided by detecting whether the facial action is the preset switch action and whether its current occurrence count is odd, which further reduces the possibility of misoperation.
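The odd/even toggle of embodiment three amounts to flipping a switch on each occurrence of face action D. A minimal sketch, with class and method names as assumptions:

```python
# Embodiment-three toggle: each detected switch action (face action D)
# increments a counter; an odd cumulative count means tracking is on.

class TrackingSwitch:
    def __init__(self):
        self.switch_count = 0

    def on_face_action(self, action):
        """Process one recognized action; return whether tracking is now on."""
        if action == "D":
            self.switch_count += 1
        return self.tracking_enabled()

    def tracking_enabled(self):
        return self.switch_count % 2 == 1
```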
The gaze focus position corresponds to the target item to be controlled. When the gaze in the eye-tracking information is outside a preset gaze position range, the gaze focus position correspondingly falls outside the preset focus range, so by setting a preset gaze position range one can judge whether the gaze in the eye-tracking information lies within it. Only when the gaze is confirmed to be within the preset range is the next step, recognizing the facial action, carried out. Specifically: detect whether the gaze in the eye-tracking information is within the preset gaze position range. If it is, the facial-action recognition unit works, collecting the current facial image information and recognizing the facial action from it, while the eye-tracking processing unit stops working and the eye-tracking operation is suspended. If the gaze is outside the preset range, the facial-action recognition unit stops working and the eye-tracking processing unit performs the eye-tracking operation.
In embodiment four, taking control of a target item on the screen as an example, the preset gaze position range is the extent of the screen. Fig. 8 is a flowchart of embodiment four of the eye-tracking human-machine interaction method of the invention; compared with embodiment three it further comprises:
Step 20: detect whether the gaze in the eye-tracking information is within the screen. If it is, execute step 3; otherwise execute step 1. In this embodiment, by adding a step that checks whether the gaze in the eye-tracking information is within the preset gaze position range, facial image collection and recognition can be stopped whenever the gaze is outside the range, further reducing the possibility of misoperation.
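The screen-range gate of embodiment four is a simple bounds check. A sketch under stated assumptions: the screen is modeled as a pixel rectangle (the 1920x1080 size, the tuple layout, and the function names are illustrative), and the step function only reports which branch of the flowchart would run.

```python
# Embodiment-four gating: facial-action recognition runs only while the
# gaze point lies inside the preset range (here, the screen rectangle).

def gaze_in_range(gaze, screen=(0, 0, 1920, 1080)):
    x0, y0, x1, y1 = screen
    x, y = gaze
    return x0 <= x < x1 and y0 <= y < y1

def step_20(gaze):
    """One loop iteration: step 3 (recognize) on-screen, else step 1 (track)."""
    return "recognize_face_action" if gaze_in_range(gaze) else "continue_tracking"
```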
Fig. 9 is a flowchart of embodiment five of the eye-tracking human-machine interaction method of the invention. It differs from the previous embodiment in that the step of detecting whether the gaze in the eye-tracking information is within the preset range is placed immediately after step 1, and a warning is issued to prompt the operator to adjust the gaze. Specifically:
Step 10: detect whether the gaze in the eye-tracking information is within the screen. If it is, execute step 2; otherwise execute step 10'.
Step 10': sound a prompt tone to prompt the operator to adjust the gaze.
In this embodiment, by checking whether the gaze is within the preset range immediately after the eye-tracking information is collected, the judgment can be made promptly; the prompt tone guides the operator to adjust the gaze in time, so that the target item can be controlled without delay.
The facial action corresponds to the control instruction to be issued, and when the facial rotation angle in the collected facial image information is outside a preset facial angle range, the recognized facial action may be inaccurate. A preset facial angle range can therefore be set and the facial rotation angle in the facial image information checked against it; only when the angle is confirmed to be within the preset range is the next step, recognizing the facial action, carried out.
Fig. 10 is a flowchart of embodiment six of the eye-tracking human-machine interaction method of the invention. It differs from embodiment four in that, after the gaze in the eye-tracking information is confirmed to be within the preset gaze position range and before the facial action is recognized, it further comprises:
Step 30: detect whether the facial rotation angle in the facial image information is within the preset facial angle range. If it is, execute step 4; otherwise execute step 10.
The facial rotation angle is the angle through which the face has turned horizontally relative to the screen. When it is outside the preset range, recognition of the facial image information stops and control returns to the eye-tracking operation, which detects whether the gaze in the eye-tracking information is within the preset gaze position range. Embodiment six also issues a warning to indicate that the interactive system has stopped working and to prompt the operator to adjust the gaze. The facial rotation angle is estimated as follows: a known human-eye localization technique automatically measures the distance between the eye centers in the image, and the ratio of this distance to the inter-eye-center distance in a frontal view is computed; this value measures the size of the facial rotation angle.
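The inter-eye-distance ratio just described can be turned into an angle under a simple geometric assumption: with the eyes at a fixed 3D separation, the projected distance shrinks roughly as the cosine of the horizontal rotation. The cosine model, the 20-degree threshold, and the function names are assumptions for illustration, not the patent's stated values.

```python
import math

# Facial-rotation estimate: the measured inter-eye distance divided by the
# frontal-view distance approximates cos(yaw), so the ratio indexes the
# horizontal rotation angle of the face.

def yaw_ratio(eye_dist_now, eye_dist_frontal):
    """Ratio of current to frontal inter-eye distance (1.0 = facing screen)."""
    return eye_dist_now / eye_dist_frontal

def within_angle_range(eye_dist_now, eye_dist_frontal, max_deg=20.0):
    """True if the implied head rotation is inside the preset angle range."""
    r = max(-1.0, min(1.0, yaw_ratio(eye_dist_now, eye_dist_frontal)))
    return math.degrees(math.acos(r)) <= max_deg
```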
In this embodiment, when the gaze is detected to be within the preset gaze position range, the facial rotation angle in the facial image information is further checked against the preset facial angle range. When the angle is outside the range, recognition of the facial image information stops and control returns to the eye-tracking operation without performing recognition on the facial image information, which guarantees the accuracy of recognition.
A specific facial action corresponds to a specific control instruction; in general, actions that the operator can make conveniently and that are easy to recognize are preset to correspond to specific control instructions. The preset facial actions comprise the preset switch action and the preset control actions. After the facial action is recognized and before the control instruction corresponding to it is output according to the gaze focus position, it is confirmed that the facial action is a preset facial action other than the preset switch action, i.e. that it is a preset control action. Fig. 11 is a flowchart of embodiment seven of the eye-tracking human-machine interaction method of the invention. It differs from the previous embodiment in that, after the facial action is recognized and before the corresponding control instruction is output according to the gaze focus position, it further comprises:
Step 40: detect whether the facial action is a preset facial action. If it is preset face action A, execute step 51; if preset face action B, execute step 52; if preset face action C, execute step 53; if preset face action D, execute step 5b. If it is not a preset facial action, execute step 10': a warning sound prompts the user to readjust the face image; since adjusting the face also changes the gaze in the eye-tracking information, it must again be detected whether the gaze is within the preset gaze position range.
Step 51: output, according to the gaze focus position, the left-click control instruction corresponding to face action A.
Step 52: output, according to the gaze focus position, the left-double-click control instruction corresponding to face action B.
Step 53: output, according to the gaze focus position, the right-click control instruction corresponding to face action C.
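The step-40 branching above can be sketched as one dispatch function: known control actions emit a command at the gaze focus (steps 51-53), the switch action stops tracking (step 5b), and anything else triggers the warning-and-recheck path (step 10'). The return-tuple shape and names are illustrative assumptions.

```python
# Embodiment-seven step-40 dispatch of a recognized face action.

def dispatch(face_action, focus):
    table = {"A": "left_click", "B": "left_double_click", "C": "right_click"}
    if face_action in table:
        return ("output", table[face_action], focus)   # steps 51-53
    if face_action == "D":
        return ("stop_tracking", None, None)           # step 5b
    return ("warn_and_recheck_gaze", None, None)       # step 10'
```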
In the above method embodiments, the control instruction corresponding to each facial action can be a system default or can be adjusted flexibly. When the default mapping between facial actions and control instructions does not suit the user, the user can choose facial actions and their mapped control instructions according to personal preference and habit; that is, the user can update the mapping between facial actions and control instructions, specifically by updating the control instruction corresponding to a facial action or updating the facial action corresponding to a control instruction. This makes the configuration of facial actions and control instructions flexible, lets it follow the user's personal circumstances rather than being limited to the default configuration, and makes the device more convenient to use.
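The two update directions named above (new instruction for an action, new action for an instruction) can be sketched with a small mapping store. The dictionary representation, default bindings, and method names are assumptions; the patent does not specify a data structure.

```python
# User-configurable mapping between face actions and control instructions.

class ActionMap:
    def __init__(self):
        self.mapping = {"A": "left_click", "B": "left_double_click", "C": "right_click"}

    def rebind_action(self, action, command):
        """Update the control instruction corresponding to a face action."""
        self.mapping[action] = command

    def rebind_command(self, command, new_action):
        """Update the face action corresponding to a control instruction."""
        for a, c in list(self.mapping.items()):
            if c == command:
                del self.mapping[a]
        self.mapping[new_action] = command
```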
Figure 12 is the structural drawing of the embodiment one of eye tracking human-machine interaction device of the present invention, comprise: eye tracking processing unit 100, be used to carry out the eye tracking operation, described eye tracking operation comprises gathers eye tracking information and obtains the sight line focal position according to described eye tracking information; Face action recognition unit 200 is used to gather facial image information and identifies face action according to described facial image information; And steering order output unit 300, be connected respectively with described eye tracking processing unit 100 and described face action recognition unit 200, be used for according to output of described sight line focal position and described face action control instruction corresponding.
By adding a facial action recognition unit on top of the existing eye-tracking processing unit, and having the control command output unit output the control command corresponding to the facial action according to the gaze focus position, this embodiment realizes a contact-free control mode. It improves user comfort and avoids the disease transmission that can occur in ordinary device use when an operator's hands touch peripherals such as a keyboard, mouse, or joystick.
Figure 13 is a structural diagram of embodiment two of the eye-tracking human-machine interaction device of the present invention. It differs from the previous embodiment in that the facial action recognition unit opens and closes the eye-tracking processing unit. Specifically, the facial action recognition unit 200 is connected to the eye-tracking processing unit 100 and, upon detecting that a facial action is the preset switch action, notifies the eye-tracking processing unit 100 to start or stop the eye-tracking operation accordingly; the eye-tracking operation includes collecting eye-tracking information and obtaining the gaze focus position from it. When the computer boots, the facial action recognition unit is started first. When it first detects that a facial action is the preset switch action, it notifies the eye-tracking processing unit 100 to start the eye-tracking operation. Specifically, it checks whether the detected facial action is the preset switch action: if so, the eye-tracking processing unit is turned on; if not, facial image information continues to be collected and facial actions recognized until a facial action is the preset switch action, at which point the eye-tracking processing unit is turned on.
In this embodiment, the facial action recognition unit runs first after the computer boots, and the eye-tracking operation is only started once the preset switch action is detected for the first time, thereby reducing the chance of accidental operation.
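This gating behavior, together with the odd/even toggling described in claim 2 (tracking is active while the count of switch actions is odd), can be sketched as follows; the choice of switch action is a hypothetical example, not taken from the patent:

```python
# Hypothetical sketch of gating eye tracking behind a preset switch
# action: tracking starts on the first switch action and thereafter
# toggles, so tracking is active while the switch-action count is odd.

class EyeTrackingGate:
    SWITCH_ACTION = "long_blink"  # illustrative choice of switch action

    def __init__(self):
        self.switch_count = 0

    def on_facial_action(self, action):
        if action == self.SWITCH_ACTION:
            self.switch_count += 1

    @property
    def tracking_enabled(self):
        # Per claim 2: tracking runs after the switch action has been
        # seen at least once and while its total count is odd.
        return self.switch_count % 2 == 1

gate = EyeTrackingGate()
assert not gate.tracking_enabled          # nothing detected yet
gate.on_facial_action("smile")            # non-switch action: ignored
assert not gate.tracking_enabled
gate.on_facial_action("long_blink")       # first switch action: start
assert gate.tracking_enabled
gate.on_facial_action("long_blink")       # second switch action: stop
assert not gate.tracking_enabled
```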
Figure 14 is a structural diagram of embodiment three of the eye-tracking human-machine interaction device of the present invention. It differs from embodiment one in that it further comprises a synchronization module 400. The eye-tracking processing unit 100 is connected to the facial action recognition unit 200 through the synchronization module 400 and, upon detecting that the gaze in the eye-tracking information is within a preset gaze position range, notifies the facial action recognition unit 200 through the synchronization module 400 to recognize the facial action from the current facial image information.
Figure 15 is a structural diagram of embodiment four of the eye-tracking human-machine interaction device of the present invention. It differs from the previous embodiment in that the facial action recognition unit 200 is also connected to the eye-tracking processing unit 100 through the synchronization module 400 and, upon detecting that the facial rotation angle in the facial image information is outside a preset facial angle range, notifies the eye-tracking processing unit 100 through the synchronization module 400 to detect whether the gaze in the eye-tracking information is within the preset gaze position range.
Figure 16 is a structural diagram of embodiment five of the eye-tracking human-machine interaction device of the present invention. The facial action recognition unit 200 is connected to the eye-tracking processing unit 100 and, upon detecting that a facial action is the preset switch action, starts or stops the eye-tracking operation accordingly; the eye-tracking operation includes collecting eye-tracking information and obtaining the gaze focus position from it. The synchronization module 400 is connected to both the facial action recognition unit 200 and the eye-tracking processing unit 100. The eye-tracking processing unit 100, connected to the facial action recognition unit 200 through the synchronization module 400, notifies the facial action recognition unit 200 through the synchronization module 400 to recognize the facial action from the current facial image information when it detects that the gaze in the eye-tracking information is within the preset gaze position range. Conversely, the facial action recognition unit 200 notifies the eye-tracking processing unit 100 through the synchronization module 400 to detect whether the gaze in the eye-tracking information is within the preset gaze position range when it detects that the facial rotation angle in the facial image information is outside the preset facial angle range.
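The hand-off between the two units via the synchronization module can be sketched as follows. This is illustrative pseudologic only, not the patent's implementation; the coordinate range, angle limit, and helper names are all assumptions:

```python
# Hypothetical sketch of the synchronization module's hand-off logic:
# facial actions are only recognized while the gaze is within a preset
# position range, and when the face is turned beyond the preset angle
# range, control falls back to re-checking the gaze position.

GAZE_RANGE = ((0, 1920), (0, 1080))   # assumed screen-coordinate range
MAX_FACE_ANGLE = 30.0                 # assumed angle limit, in degrees

def gaze_in_range(gaze):
    (xmin, xmax), (ymin, ymax) = GAZE_RANGE
    x, y = gaze
    return xmin <= x <= xmax and ymin <= y <= ymax

def sync_step(gaze, face_angle, recognize_action, check_gaze):
    # Face rotated beyond the preset angle range: notify the
    # eye-tracking side to re-check whether the gaze is in range.
    if abs(face_angle) > MAX_FACE_ANGLE:
        return check_gaze(gaze)
    # Gaze within the preset range: notify the facial action
    # recognition side to recognize the current facial action.
    if gaze_in_range(gaze):
        return recognize_action()
    return None  # neither condition met: no notification

result = sync_step((400, 300), 5.0,
                   recognize_action=lambda: "recognize",
                   check_gaze=lambda g: "check")
assert result == "recognize"
result = sync_step((400, 300), 45.0,
                   recognize_action=lambda: "recognize",
                   check_gaze=lambda g: "check")
assert result == "check"
```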
Figure 17 is a structural diagram of embodiment six of the eye-tracking human-machine interaction device of the present invention. In this embodiment, the eye-tracking processing unit 100 comprises: an eye-tracking collection module 101, configured to collect eye-tracking information; and an eye-tracking processing module 102, connected to the eye-tracking collection module 101, configured to obtain the gaze focus position from the eye-tracking information. The facial action recognition unit 200 comprises: a facial image collection module 201, configured to collect facial image information; and a facial action recognition module 202, connected to the facial image collection module 201, configured to recognize facial actions from the facial image information. The facial action recognition module 202 is connected to the eye-tracking collection module 101 and the eye-tracking processing module 102 and, upon detecting that a facial action is the preset switch action, starts or stops the eye-tracking operation accordingly, that is, turns the eye-tracking processing unit on or off; the eye-tracking operation includes collecting eye-tracking information and obtaining the gaze focus position from it. The synchronization module 400 is connected to both the facial image collection module 201 and the eye-tracking collection module 101. The eye-tracking collection module 101, connected to the facial image collection module 201 through the synchronization module 400, notifies the facial image collection module 201 through the synchronization module 400 to send the current facial image information to the facial action recognition module 202 when it detects that the gaze in the eye-tracking information is within the preset gaze position range, and the facial action recognition module 202 then recognizes the facial action from the current facial image information. Conversely, the facial image collection module 201 notifies the eye-tracking collection module 101 through the synchronization module 400 to detect whether the gaze in the eye-tracking information is within the preset gaze position range when it detects that the facial rotation angle in the facial image information is outside the preset facial angle range.
Figure 18 is a structural diagram of embodiment seven of the eye-tracking human-machine interaction device of the present invention. The mapping between facial actions and control commands stored in the control command output unit may be a system default and may also be adjusted flexibly; the device may therefore further comprise an update module 500, connected to the control command output unit 300 and configured to update the mapping relationship between facial actions and control commands in the control command output unit 300.
Figure 19 is a structural diagram of embodiment eight of the eye-tracking human-machine interaction device of the present invention. In this embodiment, the mapping between facial actions and control commands in the control command output unit 300 may be stored in a database; the control command output unit 300 may therefore further comprise a database 301, configured to store the mapping relationship between facial actions and control commands. The mapping stored in the database 301 may be a system default and may also be adjusted flexibly. The control command output unit 300 further comprises an update module 302, connected to the database 301 and configured to update the mapping relationship between facial actions and control commands in the database 301.
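A database-backed variant of the mapping, with an update path as in this embodiment, could be sketched as follows. The table schema, the SQLite choice, and all action names are assumptions for illustration, not taken from the patent:

```python
# Hypothetical sketch of storing the facial-action-to-command mapping
# in a database, with an update module as in embodiment eight.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB for illustration
conn.execute(
    "CREATE TABLE mapping (facial_action TEXT PRIMARY KEY, command TEXT)"
)
conn.executemany(
    "INSERT INTO mapping VALUES (?, ?)",
    [("blink_both", "left_double_click"), ("mouth_open", "right_click")],
)

def command_for(action):
    # Control command output unit: look up the command for an action.
    row = conn.execute(
        "SELECT command FROM mapping WHERE facial_action = ?", (action,)
    ).fetchone()
    return row[0] if row else None

def update_mapping(action, command):
    # Update module: overwrite the command mapped to a facial action.
    conn.execute(
        "INSERT INTO mapping VALUES (?, ?) "
        "ON CONFLICT(facial_action) DO UPDATE SET command = excluded.command",
        (action, command),
    )

assert command_for("mouth_open") == "right_click"
update_mapping("mouth_open", "scroll_down")
assert command_for("mouth_open") == "scroll_down"
```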
When the default mapping between facial actions and control commands does not suit a user, the user can freely choose facial actions, control commands, and the mapping between them according to personal preference and habit; in other words, the user can update the mapping relationship between facial actions and control commands, specifically by updating the control command corresponding to a given facial action or updating the facial action corresponding to a given control command. This allows facial actions and control commands to be configured flexibly and tailored to the individual user rather than being limited to the default configuration, making the device more convenient to use.
The above embodiments of the eye-tracking human-machine interaction method and device obtain the gaze focus position by collecting eye-tracking information, collect facial image information and recognize facial actions from it, and finally output, according to the gaze focus position, the control command corresponding to the facial action. Throughout the process the operator's hands never need to touch peripherals such as a keyboard, mouse, or joystick; by emulating peripheral input devices such as a mouse, the disease transmission of ordinary device use is avoided entirely. The method also enables convenient operation in special settings, such as piloting an aircraft, where the hands cannot be used for operation, and it makes computers easier to operate for disabled users whose hands cannot move.
Finally, it should be noted that the above embodiments merely illustrate, and do not limit, the technical solution of the present invention. Although the present invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that the technical solution of the present invention may be modified or equivalently substituted without departing from its spirit and scope.

Claims (16)

1. An eye-tracking human-machine interaction method, characterized by comprising:
collecting eye-tracking information, and obtaining a gaze focus position according to the eye-tracking information;
collecting facial image information, and recognizing a facial action according to the facial image information;
outputting, according to the gaze focus position, a control command corresponding to the facial action.
2. The method according to claim 1, characterized in that the facial action comprises a preset switch action, and the method specifically comprises:
collecting facial image information, and after the preset switch action is recognized from the facial image information for the first time, or after the number of times the preset switch action has been recognized is odd, performing the following steps:
collecting eye-tracking information, and obtaining a gaze focus position according to the eye-tracking information;
collecting facial image information, and recognizing a facial action according to the facial image information;
outputting, according to the gaze focus position, a control command corresponding to the facial action.
3. The method according to claim 1 or 2, characterized in that after recognizing the facial action and before outputting, according to the gaze focus position, the control command corresponding to the facial action, the method further comprises:
confirming that the recognized facial action is not the preset switch action.
4. The method according to claim 1 or 2, characterized in that after collecting the eye-tracking information and before recognizing the facial action, the method further comprises:
confirming that the gaze in the eye-tracking information is within a preset gaze position range.
5. The method according to claim 4, characterized in that after confirming that the gaze in the eye-tracking information is within the preset gaze position range and before recognizing the facial action, the method further comprises:
confirming that the facial rotation angle in the facial image information is within a preset facial angle range.
6. The method according to claim 1 or 2, characterized in that after recognizing the facial action and before outputting, according to the gaze focus position, the control command corresponding to the facial action, the method further comprises:
confirming that the facial action is a preset facial action and is not the preset switch action.
7. The method according to claim 1 or 2, characterized by further comprising: updating the mapping relationship between facial actions and control commands.
8. An eye-tracking human-machine interaction device, characterized by comprising:
an eye-tracking processing unit, configured to perform an eye-tracking operation, the eye-tracking operation comprising collecting eye-tracking information and obtaining a gaze focus position according to the eye-tracking information;
a facial action recognition unit, configured to collect facial image information and recognize a facial action according to the facial image information; and
a control command output unit, connected to the eye-tracking processing unit and the facial action recognition unit respectively, configured to output, according to the gaze focus position, a control command corresponding to the facial action.
9. The device according to claim 8, characterized in that the facial action recognition unit is connected to the eye-tracking processing unit and is configured, upon detecting that the facial action is a preset switch action, to notify the eye-tracking processing unit to start or stop the eye-tracking operation accordingly.
10. The device according to claim 8, characterized by further comprising a synchronization module, wherein the eye-tracking processing unit is connected to the facial action recognition unit through the synchronization module and is configured, upon detecting that the gaze in the eye-tracking information is within a preset gaze position range, to notify the facial action recognition unit through the synchronization module to recognize the current facial action according to the current facial image information.
11. The device according to claim 9, characterized in that the facial action recognition unit is connected to the eye-tracking processing unit through the synchronization module and is configured, upon detecting that the facial rotation angle in the facial image information is outside a preset facial angle range, to notify the eye-tracking processing unit through the synchronization module to detect whether the gaze in the eye-tracking information is within the preset gaze position range.
12. The device according to any one of claims 8 to 11, characterized by further comprising: an update module, connected to the control command output unit and configured to update the mapping relationship between facial actions and control commands in the control command output unit.
13. The device according to any one of claims 8 to 11, characterized in that the eye-tracking processing unit comprises:
an eye-tracking collection module, configured to collect eye-tracking information; and
an eye-tracking processing module, connected to the eye-tracking collection module and configured to obtain the gaze focus position according to the eye-tracking information.
14. The device according to any one of claims 8 to 11, characterized in that the facial action recognition unit comprises:
a facial image collection module, configured to collect facial image information; and
a facial action recognition module, connected to the facial image collection module and configured to recognize the facial action according to the facial image information.
15. The device according to any one of claims 8 to 11, characterized in that the control command output unit further comprises: a database, configured to store the mapping relationship between facial actions and control commands.
16. The device according to claim 15, characterized in that the control command output unit further comprises: an update module, connected to the database and configured to update the mapping relationship between facial actions and control commands in the database.
CNA2007100995113A 2007-05-23 2007-05-23 Eye tracking human-machine interaction method and apparatus Pending CN101311882A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2007100995113A CN101311882A (en) 2007-05-23 2007-05-23 Eye tracking human-machine interaction method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2007100995113A CN101311882A (en) 2007-05-23 2007-05-23 Eye tracking human-machine interaction method and apparatus

Publications (1)

Publication Number Publication Date
CN101311882A true CN101311882A (en) 2008-11-26

Family

ID=40100548

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2007100995113A Pending CN101311882A (en) 2007-05-23 2007-05-23 Eye tracking human-machine interaction method and apparatus

Country Status (1)

Country Link
CN (1) CN101311882A (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101576771B (en) * 2009-03-24 2010-12-01 山东大学 Scaling method for eye tracker based on nonuniform sample interpolation
WO2011038527A1 (en) * 2009-09-29 2011-04-07 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
CN102081503A (en) * 2011-01-25 2011-06-01 汉王科技股份有限公司 Electronic reader capable of automatically turning pages based on eye tracking and method thereof
CN102169348A (en) * 2011-01-22 2011-08-31 浙江大学 Method for controlling service robots with sight lines
CN102193621A (en) * 2010-03-17 2011-09-21 三星电子(中国)研发中心 Vision-based interactive electronic equipment control system and control method thereof
CN102307288A (en) * 2011-07-27 2012-01-04 中国计量学院 Projection system moving along with sightline of first person based on human face recognition
CN102551385A (en) * 2011-12-29 2012-07-11 广东工业大学 Automatically-paging reading chair based on sight tracking and control device and control method of automatically-paging reading chair
CN102572217A (en) * 2011-12-29 2012-07-11 华为技术有限公司 Visual-attention-based multimedia processing method and device
CN102945077A (en) * 2012-10-24 2013-02-27 广东欧珀移动通信有限公司 Image viewing method and device and intelligent terminal
CN103309450A (en) * 2013-06-09 2013-09-18 张家港市鸿嘉数字科技有限公司 Method for identifying facial expression of user to operate tablet personal computer
CN103336581A (en) * 2013-07-30 2013-10-02 黄通兵 Human eye movement characteristic design-based human-computer interaction method and system
CN103389798A (en) * 2013-07-23 2013-11-13 深圳市欧珀通信软件有限公司 Method and device for operating mobile terminal
CN103530623A (en) * 2013-09-16 2014-01-22 北京智谷睿拓技术服务有限公司 Information observation method and information observation device
CN103690146A (en) * 2013-12-13 2014-04-02 重庆大学 Novel eye tracker
CN103809737A (en) * 2012-11-13 2014-05-21 华为技术有限公司 Method and device for human-computer interaction
CN103853440A (en) * 2012-11-30 2014-06-11 北京三星通信技术研究有限公司 Mobile terminal and method for implementing shortcut operation on same
CN103974107A (en) * 2013-01-28 2014-08-06 海尔集团公司 Television eye movement control method and device and television
CN104134037A (en) * 2014-07-30 2014-11-05 京东方科技集团股份有限公司 Energy-saving control method and system for display device
CN104238726A (en) * 2013-06-17 2014-12-24 腾讯科技(深圳)有限公司 Intelligent glasses control method, intelligent glasses control device and intelligent glasses
CN104765442A (en) * 2014-01-08 2015-07-08 腾讯科技(深圳)有限公司 Automatic browsing method and device
CN104937531A (en) * 2013-01-05 2015-09-23 大众汽车有限公司 Operating method and operating system in a vehicle
CN105009032A (en) * 2013-09-11 2015-10-28 歌乐株式会社 Information processing device, gesture detection method, and gesture detection program
CN105068643A (en) * 2015-07-24 2015-11-18 广州视源电子科技股份有限公司 Method and device for regulating brightness of intelligent mirror
CN105630148A (en) * 2015-08-07 2016-06-01 宇龙计算机通信科技(深圳)有限公司 Terminal display method, terminal display apparatus and terminal
CN105765513A (en) * 2013-11-01 2016-07-13 索尼公司 Information processing device, information processing method, and program
CN105843401A (en) * 2016-05-12 2016-08-10 深圳市联谛信息无障碍有限责任公司 Screen reading instruction input method and device based on camera
CN105938390A (en) * 2015-03-03 2016-09-14 卡西欧计算机株式会社 Content output apparatus and content output method
WO2016161954A1 (en) * 2015-04-10 2016-10-13 Beijing Zhigu Rui Tuo Tech Co., Ltd. Information acquiring method, information acquiring apparatus, and user equipment
WO2016165052A1 (en) * 2015-04-13 2016-10-20 Empire Technology Development Llc Detecting facial expressions
CN106125940A (en) * 2016-07-05 2016-11-16 乐视控股(北京)有限公司 virtual reality interactive interface management method and device
CN106354263A (en) * 2016-09-09 2017-01-25 电子科技大学 Real-time man-machine interaction system based on facial feature tracking and working method of real-time man-machine interaction system
CN106462733A (en) * 2014-05-19 2017-02-22 微软技术许可有限责任公司 Gaze detection calibration
CN106575152A (en) * 2014-07-23 2017-04-19 微软技术许可有限责任公司 Alignable user interface
CN106648055A (en) * 2016-09-30 2017-05-10 珠海市魅族科技有限公司 Method of managing menu in virtual reality environment and virtual reality equipment
CN106873774A (en) * 2017-01-12 2017-06-20 北京奇虎科技有限公司 interaction control method, device and intelligent terminal based on eye tracking
WO2017152592A1 (en) * 2016-03-07 2017-09-14 乐视控股(北京)有限公司 Mobile terminal application operation method and mobile terminal
CN107678547A (en) * 2017-09-27 2018-02-09 维沃移动通信有限公司 A kind of processing method and mobile terminal of information notice
WO2018049747A1 (en) * 2016-09-14 2018-03-22 歌尔科技有限公司 Focus position determination method and device for virtual reality apparatus, and virtual reality apparatus
CN108206050A (en) * 2016-12-20 2018-06-26 德尔格制造股份两合公司 Device, method and computer program and the Medical Devices of Medical Devices are configured
CN108815845A (en) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN108920228A (en) * 2018-05-28 2018-11-30 云谷(固安)科技有限公司 A kind of control instruction input method and input unit
CN108985225A (en) * 2018-07-13 2018-12-11 北京猎户星空科技有限公司 Focus follower method, device, electronic equipment and storage medium
WO2019241920A1 (en) * 2018-06-20 2019-12-26 优视科技新加坡有限公司 Terminal control method and device
CN110896445A (en) * 2018-09-13 2020-03-20 网易(杭州)网络有限公司 Method and device for triggering photographing operation, storage medium and electronic device

Cited By (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101576771B (en) * 2009-03-24 2010-12-01 山东大学 Scaling method for eye tracker based on nonuniform sample interpolation
CN102473033B (en) * 2009-09-29 2015-05-27 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
WO2011038527A1 (en) * 2009-09-29 2011-04-07 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
CN102473033A (en) * 2009-09-29 2012-05-23 阿尔卡特朗讯 Method for viewing points detecting and apparatus thereof
CN102193621A (en) * 2010-03-17 2011-09-21 三星电子(中国)研发中心 Vision-based interactive electronic equipment control system and control method thereof
CN102169348A (en) * 2011-01-22 2011-08-31 浙江大学 Method for controlling service robots with sight lines
CN102169348B (en) * 2011-01-22 2012-07-04 浙江大学 Method for controlling service robots with sight lines
CN102081503A (en) * 2011-01-25 2011-06-01 汉王科技股份有限公司 Electronic reader capable of automatically turning pages based on eye tracking and method thereof
CN102307288A (en) * 2011-07-27 2012-01-04 中国计量学院 Projection system moving along with sightline of first person based on human face recognition
CN102551385A (en) * 2011-12-29 2012-07-11 广东工业大学 Automatically-paging reading chair based on sight tracking and control device and control method of automatically-paging reading chair
CN102572217B (en) * 2011-12-29 2014-08-20 华为技术有限公司 Visual-attention-based multimedia processing method and device
CN102572217A (en) * 2011-12-29 2012-07-11 华为技术有限公司 Visual-attention-based multimedia processing method and device
CN102945077B (en) * 2012-10-24 2015-12-16 广东欧珀移动通信有限公司 A kind of picture inspection method, device and intelligent terminal
CN102945077A (en) * 2012-10-24 2013-02-27 广东欧珀移动通信有限公司 Image viewing method and device and intelligent terminal
WO2014075418A1 (en) * 2012-11-13 2014-05-22 华为技术有限公司 Man-machine interaction method and device
CN103809737A (en) * 2012-11-13 2014-05-21 华为技术有限公司 Method and device for human-computer interaction
US9740281B2 (en) 2012-11-13 2017-08-22 Huawei Technologies Co., Ltd. Human-machine interaction method and apparatus
CN103853440A (en) * 2012-11-30 2014-06-11 北京三星通信技术研究有限公司 Mobile terminal and method for implementing shortcut operation on same
CN104937531A (en) * 2013-01-05 2015-09-23 大众汽车有限公司 Operating method and operating system in a vehicle
KR101928637B1 (en) 2013-01-05 2018-12-12 폭스바겐 악티엔 게젤샤프트 Operating method and operating system in a vehicle
CN103974107A (en) * 2013-01-28 2014-08-06 海尔集团公司 Television eye movement control method and device and television
CN103309450A (en) * 2013-06-09 2013-09-18 张家港市鸿嘉数字科技有限公司 Method for identifying facial expression of user to operate tablet personal computer
CN104238726A (en) * 2013-06-17 2014-12-24 腾讯科技(深圳)有限公司 Intelligent glasses control method, intelligent glasses control device and intelligent glasses
CN103389798A (en) * 2013-07-23 2013-11-13 深圳市欧珀通信软件有限公司 Method and device for operating mobile terminal
CN103336581A (en) * 2013-07-30 2013-10-02 黄通兵 Human eye movement characteristic design-based human-computer interaction method and system
CN105009032B (en) * 2013-09-11 2017-10-03 歌乐株式会社 Information processor and pose detection method
CN105009032A (en) * 2013-09-11 2015-10-28 歌乐株式会社 Information processing device, gesture detection method, and gesture detection program
CN103530623A (en) * 2013-09-16 2014-01-22 北京智谷睿拓技术服务有限公司 Information observation method and information observation device
US9946341B2 (en) 2013-09-16 2018-04-17 Beijing Zhigu Rui Tuo Tech Co., Ltd. Information observation method and information observation device
US10684680B2 (en) 2013-09-16 2020-06-16 Beijing Zhigu Rui Tuo Tech Co., Ltd. Information observation method and information observation device
CN103530623B (en) * 2013-09-16 2017-08-01 北京智谷睿拓技术服务有限公司 Information observational technique and information observation device
CN105765513B (en) * 2013-11-01 2020-02-21 索尼公司 Information processing apparatus, information processing method, and program
CN105765513A (en) * 2013-11-01 2016-07-13 索尼公司 Information processing device, information processing method, and program
CN103690146A (en) * 2013-12-13 2014-04-02 重庆大学 Novel eye tracker
CN104765442A (en) * 2014-01-08 2015-07-08 腾讯科技(深圳)有限公司 Automatic browsing method and device
US10248199B2 (en) 2014-05-19 2019-04-02 Microsoft Technology Licensing, Llc Gaze detection calibration
CN106462733B (en) * 2014-05-19 2019-09-20 微软技术许可有限责任公司 A kind of method and calculating equipment for line-of-sight detection calibration
CN106462733A (en) * 2014-05-19 2017-02-22 微软技术许可有限责任公司 Gaze detection calibration
CN106575152A (en) * 2014-07-23 2017-04-19 微软技术许可有限责任公司 Alignable user interface
CN106575152B (en) * 2014-07-23 2019-09-27 微软技术许可有限责任公司 The user interface that can be aligned
CN104134037A (en) * 2014-07-30 2014-11-05 京东方科技集团股份有限公司 Energy-saving control method and system for display device
CN104134037B (en) * 2014-07-30 2017-11-14 京东方科技集团股份有限公司 Display device energy-saving control method and system
CN105938390B (en) * 2015-03-03 2019-02-15 卡西欧计算机株式会社 Content output apparatus, content outputting method
CN105938390A (en) * 2015-03-03 2016-09-14 卡西欧计算机株式会社 Content output apparatus and content output method
CN106155288A (en) * 2015-04-10 2016-11-23 北京智谷睿拓技术服务有限公司 Information getting method, information acquisition device and user equipment
WO2016161954A1 (en) * 2015-04-10 2016-10-13 Beijing Zhigu Rui Tuo Tech Co., Ltd. Information acquiring method, information acquiring apparatus, and user equipment
US10394321B2 (en) 2015-04-10 2019-08-27 Beijing Zhigu Rui Tuo Tech Co., Ltd Information acquiring method, information acquiring apparatus, and user equipment
CN106155288B (en) * 2015-04-10 2019-02-12 北京智谷睿拓技术服务有限公司 Information acquisition method, information acquisition device and user equipment
WO2016165052A1 (en) * 2015-04-13 2016-10-20 Empire Technology Development Llc Detecting facial expressions
CN105068643B (en) * 2015-07-24 2018-02-27 广州视源电子科技股份有限公司 The method and apparatus for adjusting Intelligent mirror brightness
CN105068643A (en) * 2015-07-24 2015-11-18 广州视源电子科技股份有限公司 Method and device for regulating brightness of intelligent mirror
CN105630148A (en) * 2015-08-07 2016-06-01 宇龙计算机通信科技(深圳)有限公司 Terminal display method, terminal display apparatus and terminal
WO2017152592A1 (en) * 2016-03-07 2017-09-14 乐视控股(北京)有限公司 Mobile terminal application operation method and mobile terminal
CN105843401A (en) * 2016-05-12 2016-08-10 深圳市联谛信息无障碍有限责任公司 Screen reading instruction input method and device based on camera
CN106125940A (en) * 2016-07-05 2016-11-16 乐视控股(北京)有限公司 virtual reality interactive interface management method and device
CN106354263A (en) * 2016-09-09 2017-01-25 电子科技大学 Real-time man-machine interaction system based on facial feature tracking and working method of real-time man-machine interaction system
WO2018049747A1 (en) * 2016-09-14 2018-03-22 歌尔科技有限公司 Focus position determination method and device for virtual reality apparatus, and virtual reality apparatus
CN106648055A (en) * 2016-09-30 2017-05-10 珠海市魅族科技有限公司 Method of managing menu in virtual reality environment and virtual reality equipment
CN108206050A (en) * 2016-12-20 2018-06-26 德尔格制造股份两合公司 Device, method and computer program and the Medical Devices of Medical Devices are configured
CN106873774A (en) * 2017-01-12 2017-06-20 北京奇虎科技有限公司 interaction control method, device and intelligent terminal based on eye tracking
CN107678547A (en) * 2017-09-27 2018-02-09 维沃移动通信有限公司 A kind of processing method and mobile terminal of information notice
CN108815845B (en) * 2018-05-15 2019-11-26 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
CN108815845A (en) * 2018-05-15 2018-11-16 百度在线网络技术(北京)有限公司 The information processing method and device of human-computer interaction, computer equipment and readable medium
US10983606B2 (en) 2018-05-28 2021-04-20 Yungu (Gu'an) Technology Co., Ltd. Control instruction input methods and control instruction input devices
CN108920228A (en) * 2018-05-28 2018-11-30 云谷(固安)科技有限公司 A kind of control instruction input method and input unit
CN108920228B (en) * 2018-05-28 2021-01-15 云谷(固安)科技有限公司 Control instruction input method and input device
WO2019241920A1 (en) * 2018-06-20 2019-12-26 UCWeb Singapore Pte. Ltd. Terminal control method and device
CN108985225A (en) * 2018-07-13 2018-12-11 Beijing Orion Star Technology Co., Ltd. Focus following method and device, electronic equipment and storage medium
CN110896445A (en) * 2018-09-13 2020-03-20 NetEase (Hangzhou) Network Co., Ltd. Method and device for triggering photographing operation, storage medium and electronic device

Similar Documents

Publication Publication Date Title
CN101311882A (en) Eye tracking human-machine interaction method and apparatus
US11042205B2 (en) Intelligent user mode selection in an eye-tracking system
US10313587B2 (en) Power management in an eye-tracking system
US9039419B2 (en) Method and system for controlling skill acquisition interfaces
Argyros et al. Vision-based interpretation of hand gestures for remote control of a computer mouse
CN103336582A (en) Human-computer interaction method based on motion information control
CN103336581A (en) Human-computer interaction method and system based on human eye movement characteristics
WO2002001336A2 (en) Automated visual tracking for computer access
CN108681399B (en) Equipment control method, device, control equipment and storage medium
Arai et al. Eye-based HCI with full specification of mouse and keyboard using pupil knowledge in the gaze estimation
CN109145802B (en) Kinect-based multi-person gesture man-machine interaction method and device
Sivasangari et al. Eyeball based Cursor Movement Control
Parmar et al. Facial-feature based Human-Computer Interface for disabled people
US20170206508A1 (en) Method and arrangement for generating event data
CN106681509A (en) Interface operating method and system
JPH1039995A (en) Line-of-sight/voice input device
Shi et al. Helping people with ICT device control by eye gaze
Petrushin et al. Gaze-controlled Laser Pointer Platform for People with Severe Motor Impairments: Preliminary Test in Telepresence
Prabhakar et al. Comparison of three hand movement tracking sensors as cursor controllers
Bilal et al. Design a Real-Time Eye Tracker
KR20140132906A (en) Device and method for mobile tooltip using eye detecting
WO2019227734A1 (en) Control instruction input method and apparatus
CN108536285B (en) Mouse interaction method and system based on eye movement recognition and control
US20210064132A1 (en) Systems, methods, and interfaces for performing inputs based on neuromuscular control
Rochmad et al. IMPLEMENTATION EYES MOVEMENT TO HELP COMMUNICATION PERSONS DISABILITIES

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Open date: 20081126