CN103631365B - Terminal input control method and device - Google Patents

Terminal input control method and device

Info

Publication number
CN103631365B
CN103631365B CN201210301032.6A CN201210301032A CN103631365B CN 103631365 B CN103631365 B CN 103631365B CN 201210301032 A CN201210301032 A CN 201210301032A CN 103631365 B CN103631365 B CN 103631365B
Authority
CN
China
Prior art keywords
eye
region
white
nictation
core region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210301032.6A
Other languages
Chinese (zh)
Other versions
CN103631365A (en)
Inventor
柳阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN201210301032.6A priority Critical patent/CN103631365B/en
Priority to PCT/CN2013/079567 priority patent/WO2014029245A1/en
Publication of CN103631365A publication Critical patent/CN103631365A/en
Application granted granted Critical
Publication of CN103631365B publication Critical patent/CN103631365B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/193Preprocessing; Feature extraction

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a terminal input control method and device, intended to enrich terminal input control modes and to provide users with more natural and intuitive interactive operations. The terminal input control method includes: using a real-time image captured by a camera, identifying the eye region in the real-time image, where the eye region includes a sclera region and a pupil region; identifying a blink operation in the real-time image; determining position information of the pupil region relative to the sclera region; and determining a terminal input operation according to the blink operation, or according to the blink operation and the position information.

Description

Terminal input control method and device
Technical field
The present invention relates to the technical field of terminal operation and control, and in particular to a terminal input control method and device.
Background technology
With the development of terminal technology, and especially of mobile terminal technology, terminals have become increasingly powerful, and their input control modes increasingly convenient, natural, and friendly. At present, most terminals support the following three classes of input modes:
1) Keyboard input: the keyboard is the most common and most important input device. Through the keyboard, direction controls, English letters, digits, punctuation marks, and so on can be entered into the terminal, thereby sending commands to the terminal and inputting data.
2) Touch-screen input: with the development of large-screen mobile terminal technology, touch-screen technology is now fairly mature, supporting single-point and multi-point touch input; it is simple and convenient to use and delivers a good user experience.
3) Voice input: with the development and maturation of speech recognition technology, voice input provides mobile terminals with a more convenient and intelligent human-computer interaction approach.
Through the above input modes, users can perform various information input operations such as clicking, sliding up and down, zooming in, zooming out, and rotating. To facilitate interaction between users and terminals and to enrich terminal input control modes, providing a new terminal input control mode has become an urgent technical problem to be solved in the prior art.
Summary of the invention
Embodiments of the present invention provide a terminal input control method and device, in order to enrich terminal input control modes and provide users with more natural and intuitive interactive operations.
An embodiment of the present invention provides a terminal input control method, including:
using a real-time image captured by a camera, identifying the eye region in the real-time image, where the eye region includes a sclera region and a pupil region;
identifying a blink operation in the real-time image; and
determining position information of the pupil region relative to the sclera region;
determining a terminal input operation according to the blink operation, or according to the blink operation and the position information.
An embodiment of the present invention provides a terminal input control device, including:
a first recognition unit, configured to identify, using a real-time image captured by a camera, the eye region in the real-time image, where the eye region includes a sclera region and a pupil region;
a second recognition unit, configured to identify a blink operation in the real-time image;
a first determining unit, configured to determine position information of the pupil region relative to the sclera region;
a second determining unit, configured to determine a terminal input operation according to the blink operation, or according to the blink operation and the position information.
In the terminal input control method and device provided by the embodiments of the present invention, the eye region, which includes a sclera region and a pupil region, is identified through the camera, and a terminal input operation is determined according to the identified blink operation, optionally in combination with the position information of the pupil region relative to the sclera region. In the embodiments of the present invention, eye-region recognition technology based on traditional computer vision is applied to terminals, and terminal input is controlled based on recognized eye movements. This provides a new terminal input control method, enriches terminal input control modes, and offers users more natural and intuitive interactive operations.
Other features and advantages of the present invention will be set forth in the following description, and will in part become apparent from the description or be understood by practicing the present invention. The objectives and other advantages of the present invention can be realized and obtained through the structures particularly pointed out in the written description, the claims, and the accompanying drawings.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the terminal input control method in an embodiment of the present invention;
Fig. 2 is a schematic diagram of sclera and pupil color acquisition in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the division of the sclera region in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the position information of the pupil region relative to the sclera region in an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of the terminal input control device in an embodiment of the present invention.
Detailed description of the invention
In order to enrich terminal input control modes and provide users with more natural and intuitive interactive operations, embodiments of the present invention provide a terminal input control method and device.
The preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood that the preferred embodiments described herein are intended only to illustrate and explain the present invention, not to limit it; moreover, where no conflict arises, the embodiments of the present invention and the features therein may be combined with one another.
As shown in Fig. 1, the schematic flowchart of the terminal input control method provided by the embodiment of the present invention includes the following steps:
S101: using a real-time image captured by the camera, identify the eye region in the real-time image;
Here, the eye region includes a sclera region and a pupil region. In specific implementation, since the colors of a person's sclera and pupil differ, in order to improve eye-region recognition accuracy, when the user first uses eye movement to control terminal input, the colors of the sclera and the pupil need to be collected, and the corresponding sclera color information and pupil color information stored. Fig. 2 is a schematic diagram of sclera color information and pupil color information acquisition. Specifically, the mobile terminal camera is first started, with the face directly facing the camera; next, the sclera and the pupil are each aligned with the small boxes on the mobile terminal display interface so as to collect the sclera and pupil colors; after alignment, the mobile terminal records the sclera color and the pupil color. For ease of description, the sclera color information and pupil color information are denoted by Cr1 and Cr2 respectively.
Then, when the user controls terminal input through eye movement, candidate sclera regions and candidate pupil regions are identified from the images captured by the camera in real time, according to the recorded sclera color information and pupil color information: when a region in the image matching the recorded sclera color information is detected, that region is determined to be a candidate sclera region; when a region matching the recorded pupil color information is detected, that region is determined to be a candidate pupil region. Since a person has two eyes, generally two sclera regions and two pupil regions can be identified; however, because color recognition is strongly affected by lighting and recognition errors easily occur, the number of actually identified sclera and pupil regions may exceed two. Preferably, in order to solve this problem, in the embodiment of the present invention, whether the identified candidate sclera regions and candidate pupil regions are real sclera regions and pupil regions can be further determined as follows:
Define the bounding rectangle of each identified candidate sclera region as Rw_i (i = 1, 2, 3, ...). The four vertex coordinates of the bounding rectangle of any candidate sclera region are LTw_i(LTwx_i, LTwy_i), RTw_i(RTwx_i, RTwy_i), LBw_i(LBwx_i, LBwy_i), and RBw_i(RBwx_i, RBwy_i), where LTw_i, RTw_i, LBw_i, RBw_i are respectively the top-left, top-right, bottom-left, and bottom-right vertices of the bounding rectangle of the candidate sclera region. In general, LTwx_i = LBwx_i, RTwx_i = RBwx_i, LTwy_i = RTwy_i, and LBwy_i = RBwy_i. Based on this, the sclera region center point coordinates Ew_i(Ewx_i, Ewy_i) can be determined as:
$$Ewx_i = \frac{1}{2}\left(\frac{|LTwx_i + RTwx_i|}{2} + \frac{|LBwx_i + RBwx_i|}{2}\right);$$
$$Ewy_i = \frac{1}{2}\left(\frac{|LTwy_i + RTwy_i|}{2} + \frac{|LBwy_i + RBwy_i|}{2}\right).$$
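The center-point formulas above amount to averaging the averaged top/bottom vertex coordinates of the bounding rectangle. A minimal sketch, assuming each vertex is an (x, y) tuple (the function name is illustrative, not from the patent):

```python
def rect_center(lt, rt, lb, rb):
    """Center of a candidate region's bounding rectangle from its four
    vertices (top-left, top-right, bottom-left, bottom-right), following
    the averaging formulas for Ewx_i and Ewy_i above."""
    cx = (abs(lt[0] + rt[0]) / 2 + abs(lb[0] + rb[0]) / 2) / 2
    cy = (abs(lt[1] + rt[1]) / 2 + abs(lb[1] + rb[1]) / 2) / 2
    return cx, cy

# An axis-aligned rectangle from (10, 20) to (50, 40):
print(rect_center((10, 40), (50, 40), (10, 20), (50, 20)))  # → (30.0, 30.0)
```

For an axis-aligned rectangle this reduces to the ordinary midpoint of opposite corners.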
Define the bounding rectangle of each identified candidate pupil region as Rk_i (i = 1, 2, 3, ...). The four vertex coordinates of the bounding rectangle of any candidate pupil region are LTk_i(LTkx_i, LTky_i), RTk_i(RTkx_i, RTky_i), LBk_i(LBkx_i, LBky_i), and RBk_i(RBkx_i, RBky_i), where LTk_i, RTk_i, LBk_i, RBk_i are respectively the top-left, top-right, bottom-left, and bottom-right vertices of the bounding rectangle of the candidate pupil region. In general, LTkx_i = LBkx_i, RTkx_i = RBkx_i, LTky_i = RTky_i, and LBky_i = RBky_i. Based on this, the pupil region center point coordinates Ek_i(Ekx_i, Eky_i) can be determined as:
$$Ekx_i = \frac{1}{2}\left(\frac{|LTkx_i + RTkx_i|}{2} + \frac{|LBkx_i + RBkx_i|}{2}\right);$$
$$Eky_i = \frac{1}{2}\left(\frac{|LTky_i + RTky_i|}{2} + \frac{|LBky_i + RBky_i|}{2}\right).$$
Since the pupil region should be located inside the sclera region, for each identified candidate pupil region, the positional relationship between that pupil region and each sclera region is calculated. If the center point of the pupil region is located inside the bounding rectangle of a certain sclera region, that candidate pupil region and that candidate sclera region are paired, and a paired candidate sclera region and pupil region constitute the identified eye region of one eye. For example, when the center point coordinates of the candidate pupil region and the four vertex coordinates of the bounding rectangle of a certain candidate sclera region satisfy the following conditions, it can be determined that the candidate pupil region is located inside the candidate sclera region: LBwx_i < Ekx_i < RBwx_i and LBwy_i < Eky_i < LTwy_i.
Preferably, in specific implementation, in order to further improve recognition accuracy, the shape feature of the eye can also be used for judgment. Since the eye is elliptical in shape, and a face contains multiple elliptical features (for example, the lips), the image captured by the camera may contain multiple regions S_i (i = 1, 2, 3, ...) that match an elliptical shape. Specifically, each identified elliptical region S_i can be compared one by one with the stored eye (sclera) regions by calculating the distance between the center point Sp_i of S_i and the sclera region center point Ew_i. Assuming the coordinates of Sp_i are (Spx_i, Spy_i), the distance d between the two points is $d = \sqrt{(Spx_i - Ewx_i)^2 + (Spy_i - Ewy_i)^2}$. If d is less than or equal to a preset threshold, the elliptical region is determined to be an eye region.
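The pairing condition and the ellipse distance check above can be sketched as two small predicates; function names and the rectangle tuple layout (x_min, y_min, x_max, y_max) are illustrative assumptions:

```python
import math

def pupil_inside_sclera(ek, sclera_rect):
    """Pairing test above: the pupil center ek = (x, y) lies strictly
    inside the sclera bounding rectangle (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = sclera_rect
    return x_min < ek[0] < x_max and y_min < ek[1] < y_max

def is_eye_candidate(sp, ew, threshold):
    """Ellipse check above: accept an elliptical region whose center sp
    lies within `threshold` of the stored sclera center ew."""
    d = math.hypot(sp[0] - ew[0], sp[1] - ew[1])
    return d <= threshold

print(pupil_inside_sclera((30, 30), (10, 20, 50, 40)))  # → True
print(is_eye_candidate((32, 31), (30, 30), 5.0))        # → True
```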
S102: identify the blink operation of the eyes in the real-time image;
One blink is one closing and opening of the eyes. According to the images captured by the camera, when no pupil region can be detected in the image, an eye-closing operation can be determined; when a pupil region is subsequently detected again, this process can be determined to be a blink operation. In the embodiment of the present invention, the blink operation is used as a trigger condition; that is, each time a blink operation is identified, the terminal input operation is judged in combination with the eye movement.
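The blink rule above (pupil disappears, then reappears) can be sketched as a per-frame counter; the frame-based representation and the `min_closed_frames` parameter are illustrative assumptions, since the patent works with durations rather than frame counts:

```python
def detect_blinks(pupil_visible, min_closed_frames=1):
    """Count blink operations in a per-frame pupil-visibility sequence:
    a blink is a run of frames with no detected pupil region, followed
    by the pupil's reappearance."""
    blinks, closed = 0, 0
    for visible in pupil_visible:
        if not visible:
            closed += 1
        else:
            if closed >= min_closed_frames:
                blinks += 1
            closed = 0
    return blinks

print(detect_blinks([1, 1, 0, 0, 1, 1, 0, 1]))  # → 2
```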
S103: determine the position information of the pupil region relative to the sclera region;
In order to control terminal input operations by recognizing eye movement, as shown in Fig. 3, in the embodiment of the present invention the sclera region is divided into the following nine regions: a static region, an upper region, a lower region, a left region, a right region, an upper-left region, a lower-left region, an upper-right region, and a lower-right region. By judging which region the center point of the pupil region falls in, the position information of the pupil region relative to the sclera region can be determined. Specifically: when the center point of the pupil region is located in the static region of the sclera region, the position information of the pupil region relative to the sclera region is determined to be static; when the center point is located in the upper region, the position information is determined to be up; when it is located in the lower region, down; when it is located in the left region, left; when it is located in the right region, right; when it is located in the upper-left region, upper-left; when it is located in the lower-left region, lower-left; when it is located in the upper-right region, upper-right; and when it is located in the lower-right region, lower-right. Fig. 4 is a schematic diagram of the position information of the pupil region relative to the sclera region.
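The nine-region mapping above can be sketched by normalizing the pupil center into the sclera rectangle. The even split with a central static band covering one third of each axis is an illustrative assumption; the patent does not fix the region boundaries:

```python
def gaze_direction(pupil_center, sclera_rect, static_frac=1 / 3):
    """Map the pupil center to one of the nine sclera sub-regions above.
    sclera_rect is (x_min, y_min, x_max, y_max); the central static_frac
    band in each axis is treated as the static zone (assumed split)."""
    x_min, y_min, x_max, y_max = sclera_rect
    fx = (pupil_center[0] - x_min) / (x_max - x_min)
    fy = (pupil_center[1] - y_min) / (y_max - y_min)
    lo, hi = (1 - static_frac) / 2, (1 + static_frac) / 2
    col = "left" if fx < lo else "right" if fx > hi else ""
    row = "up" if fy < lo else "down" if fy > hi else ""
    if not row and not col:
        return "static"
    return (row + "-" + col) if row and col else (row or col)

print(gaze_direction((30, 30), (10, 20, 50, 40)))  # → static
print(gaze_direction((12, 21), (10, 20, 50, 40)))  # → up-left
```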
S104: determine the terminal input operation according to the blink operation, or according to the blink operation and the determined position information.
In specific implementation, in order to ensure the reliability of terminal input control, in the embodiment of the present invention a blink operation is judged to have been identified only when the eye-closing duration in the blink operation is greater than or equal to a preset duration threshold; if the eye-closing duration is less than the preset duration threshold, it is regarded as a misoperation and ignored. The preset duration threshold can be determined according to actual needs, the general principle being that it is longer than the duration of a normal human blink; for example, it can be set to 1 s.
Preferably, in the embodiment of the present invention, the following two terminal input operations can be realized according to the blink operation:
Operation 1: when a blink operation is identified for the first time, the terminal input operation is determined to be the triggering of a contact (touch-down) event similar to that of a touch screen; when a blink operation is identified again within a preset time interval, the terminal input operation is determined to be the triggering of a non-contact (touch-up) event similar to that of a touch screen;
Operation 2: when two blink operations are identified consecutively within a preset time interval, the terminal input operation is determined to be a click operation.
It should be noted that the preset time interval in Operation 2 can be set smaller than the preset time interval in Operation 1. For example, when two blink operations are identified within 4 s, the terminal input operation is determined to be a click operation; when a blink operation is identified again more than 4 s after the first blink operation was identified, the terminal input operation can be determined to be the triggering of a non-contact event similar to that of a touch screen.
For Operation 2, in the embodiment of the present invention, the following steps can be performed:
Step 1: when a blink operation is identified for the first time, record the time at which the first blink operation is identified and the eye-closing duration of the first blink operation;
Step 2: if the first eye-closing duration is greater than or equal to the preset duration threshold, and a blink operation is identified again within the preset time interval, record the eye-closing duration of the repeated blink operation;
Step 3: if the repeated eye-closing duration is greater than or equal to the preset duration threshold, determine that the terminal input operation is a click operation.
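The steps above, together with Operation 1, can be sketched as a small classifier over timestamped blinks. The 1 s closing threshold and 4 s click window follow the examples in the text; the function and event representation are illustrative assumptions:

```python
def classify_blink_input(blink_events, min_close=1.0, click_window=4.0):
    """Sketch of Operations 1 and 2: blink_events is a list of
    (start_time, eye_closed_duration) pairs. Blinks shorter than
    min_close are ignored as misoperations; two valid blinks within
    click_window seconds form a click, otherwise the pair maps to a
    contact then a non-contact event."""
    valid = [t for t, dur in blink_events if dur >= min_close]
    if len(valid) < 2:
        return "contact" if valid else None
    return "click" if valid[1] - valid[0] <= click_window else "non-contact"

print(classify_blink_input([(0.0, 1.2), (2.5, 1.1)]))  # → click
print(classify_blink_input([(0.0, 1.2), (6.0, 1.1)]))  # → non-contact
print(classify_blink_input([(0.0, 0.3)]))              # → None
```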
In specific implementation, the following three terminal input operations can be realized according to the blink operation and the position information of the pupil region relative to the sclera region:
Operation 3: slide operation
Within a certain time interval, when a blink operation is identified for the first time, a contact event similar to that of a touch screen is triggered; at the same time, according to the position information of the pupil region relative to the sclera region, a slide operation in one of the four directions (up, down, left, right), similar to that of a touch screen, begins to be executed. When a blink operation is identified again, a non-contact event similar to that of a touch screen is triggered, ending the up, down, left, or right slide operation. The time interval in Operation 3 can be set relatively long, because unintentional blinks within a short time are difficult to control; the longer the set time interval, the more unintentional blinks can be filtered out. For example, the time interval can be set to 4 seconds.
Based on this, in the embodiment of the present invention, the following steps can be performed:
Step 1: record the eye-closing duration of the first blink operation;
Step 2: if the first eye-closing duration is greater than or equal to the preset duration threshold, then: if the position information of the pupil region relative to the sclera region is up, determine that the terminal input operation is an upward slide operation; if the position information is down, a downward slide operation; if left, a leftward slide operation; if right, a rightward slide operation.
More preferably, the following steps can also be included:
Step 4: when a blink operation is identified for the first time, record the time at which the first blink operation is identified;
Step 5: when a blink operation is identified again within the preset time interval, record the eye-closing duration of the repeated blink operation;
Step 6: if the repeated eye-closing duration is greater than or equal to the preset duration threshold, end the current terminal input operation.
Operation 4: zoom-in and zoom-out operations
In specific implementation, when the user moves the mobile terminal close to the face, the sclera region in the image captured by the camera is larger than when the mobile terminal is far from the face; therefore, by detecting the size change of the sclera region, zoom-in and zoom-out operations similar to those of a touch screen can be realized. For example, within a certain time interval, when a blink operation is identified for the first time, a contact event similar to that of a touch screen is triggered and the size Aw1 of the sclera region begins to be recorded; when a blink operation is identified again, the size Aw2 of the sclera region is recorded. Aw1 and Aw2 are then compared: if Aw1 > Aw2, a zoom-in operation is performed; if Aw1 < Aw2, a zoom-out operation is performed. The setting of the time interval is similar to that in Operation 3 and is not repeated here.
Based on this, in the embodiment of the present invention, the following steps can be performed:
Step 1: when a blink operation is identified for the first time, record the time at which the first blink operation is identified and the eye-closing duration of the first blink operation;
Step 2: if the first eye-closing duration is greater than or equal to the preset duration threshold, record a first parameter value of the sclera region size;
Step 3: when a blink operation is identified again within the preset time interval, record the eye-closing duration of the repeated blink operation;
Step 4: if the repeated eye-closing duration is greater than or equal to the preset duration threshold, record a second parameter value of the sclera region size;
Step 5: compare the first parameter value and the second parameter value; if the first parameter value is greater than the second parameter value, determine that the terminal input operation is a zoom-in operation; if the first parameter value is less than the second parameter value, determine that the terminal input operation is a zoom-out operation.
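The comparison in Step 5 can be sketched directly; the function name and the choice to return None for equal sizes are illustrative assumptions:

```python
def zoom_from_sclera_size(aw1, aw2):
    """Operation 4 above: compare the sclera-region size recorded at the
    first blink (aw1) and at the second blink (aw2); per the patent's
    rule, aw1 > aw2 maps to zoom-in and aw1 < aw2 to zoom-out."""
    if aw1 > aw2:
        return "zoom-in"
    if aw1 < aw2:
        return "zoom-out"
    return None  # no size change: no zoom operation

print(zoom_from_sclera_size(1200, 900))  # → zoom-in
print(zoom_from_sclera_size(800, 1000))  # → zoom-out
```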
Operation 5: rotation operation
In specific implementation, when the user's head turns left or right, the major axis of the sclera region in the image captured by the camera also rotates by a certain angle; therefore, by detecting the change in the rotation angle of the major axis of the sclera region, a rotation operation similar to that of a touch screen can be realized. For example, within a preset time interval, when a blink operation is identified for the first time, a contact event similar to that of a touch screen is triggered and the major axis Line1 of the sclera region is recorded; when a blink operation is identified again, the major axis Line2 of the sclera region is recorded. If the rotation angle of Line2 relative to Line1 is greater than 0, a clockwise rotation operation is performed; if the rotation angle of Line2 relative to Line1 is less than 0, a counterclockwise rotation operation is performed.
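The angle test above can be sketched with each major axis represented as a pair of endpoints. The mapping of a positive signed angle to "clockwise" is an illustrative assumption, since the patent only specifies the sign of the rotation angle, not the coordinate convention:

```python
import math

def rotation_direction(line1, line2):
    """Operation 5 above: each line is ((x1, y1), (x2, y2)) giving the
    sclera-region major axis at the first and second blink; the sign of
    the angle from line1 to line2 selects the rotation direction."""
    a1 = math.atan2(line1[1][1] - line1[0][1], line1[1][0] - line1[0][0])
    a2 = math.atan2(line2[1][1] - line2[0][1], line2[1][0] - line2[0][0])
    angle = a2 - a1
    if angle > 0:
        return "clockwise"
    if angle < 0:
        return "counterclockwise"
    return None  # no rotation

print(rotation_direction(((0, 0), (10, 0)), ((0, 0), (10, 2))))  # → clockwise
```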
In specific implementation, since the movement trends and directions of the two eyes are consistent, the terminal input operation only needs to be determined by recognizing the movement of one eye. In addition, since the user will often use the mobile terminal while moving, shaking is inevitable, which affects the reliability of controlling terminal input by eye movement and easily causes misoperations. The main characteristics of shaking are a small displacement (that is, the distance between the two positions is small) and a fast movement speed (that is, the time to move from one position to the other is short). In view of these two characteristics of shaking, in the embodiment of the present invention, the following method can be used to detect shaking, and detected shakes are filtered out during eye-movement-controlled terminal input. For example, a shake can be identified when the following condition is satisfied: $\sqrt{(Ewx_1 - Ewx_2)^2 + (Ewy_1 - Ewy_2)^2} \times |Pi_1 - Pi_2| < Ts$, where (Ewx_1, Ewy_1) are the center point coordinates of the bounding rectangle of the sclera region before the movement, (Ewx_2, Ewy_2) are the center point coordinates of the bounding rectangle of the sclera region after the movement, Pi_1 and Pi_2 are the time values before and after the movement respectively, and Ts is a preset shake threshold. That is, when the two conditions of small displacement and fast movement speed are satisfied simultaneously, a shake is identified and the operation is ignored, so as to avoid misoperation.
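The shake condition above is a product of displacement and elapsed time compared against a threshold; both quantities being small makes the product small. A minimal sketch with an illustrative function name and units:

```python
import math

def is_shake(c1, t1, c2, t2, ts):
    """Shake filter above: c1/c2 are sclera-rectangle center coordinates
    before and after the movement, t1/t2 the corresponding time stamps,
    ts the preset shake threshold; a small displacement over a short
    time yields a small product, identified as a shake to be ignored."""
    displacement = math.hypot(c1[0] - c2[0], c1[1] - c2[1])
    return displacement * abs(t1 - t2) < ts

# A 1-pixel move in 0.05 s with threshold Ts = 1.0 counts as a shake:
print(is_shake((100, 100), 0.00, (101, 100), 0.05, 1.0))  # → True
```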
In the embodiment of the present invention, by identifying the blink operation in the real-time image captured by the camera and the position of the pupil region relative to the sclera region, the terminal is controlled to perform different input operations. This frees the user's hands, provides a new input control mode for terminal input, and realizes more natural and intuitive interactive operations.
Based on the same inventive concept, an embodiment of the present invention also provides a terminal input control device. Since the principle by which this device solves the problem is similar to that of the above terminal input control method, the implementation of this device may refer to the implementation of the terminal input control method, and repeated parts are not described again.
As shown in Fig. 5, the schematic structural diagram of the terminal input control device provided by the embodiment of the present invention includes:
a first recognition unit 501, configured to identify, using a real-time image captured by the camera, the eye region in the real-time image, where the eye region includes a sclera region and a pupil region;
a second recognition unit 502, configured to identify a blink operation in the real-time image;
a first determining unit 503, configured to determine the position information of the pupil region relative to the sclera region;
a second determining unit 504, configured to determine the terminal input operation according to the blink operation, or according to the blink operation and the position information.
In specific implementation, the first recognition unit 501 may include:
a first determining subunit, configured to determine, according to the stored sclera color information and pupil color information, that a region matching the stored sclera color information is a candidate sclera region and that a region matching the stored pupil color information is a candidate pupil region; and to determine the bounding rectangle of each candidate sclera region and the center point of each candidate pupil region respectively;
a pairing subunit, configured to, for each candidate pupil region, pair that candidate pupil region with a candidate sclera region if the center point of the candidate pupil region is located inside the bounding rectangle of that candidate sclera region;
a second determining subunit, configured to determine that a paired candidate sclera region and pupil region are the identified eye region of one eye.
The second determining subunit may further be configured to, before determining that the paired candidate sclera region and pupil region are the identified eye region of one eye, determine the center point of the candidate sclera region paired with the candidate pupil region, and determine that the distance between the center point of that candidate sclera region and the stored sclera region center point is less than or equal to a preset distance threshold.
In specific implementation, the sclera region includes a static region, an upper region, a lower region, a left region, a right region, an upper-left region, a lower-left region, an upper-right region, and a lower-right region. Based on this, the first determining unit 503 may include:
a third determining subunit, configured to determine the center point of the pupil region;
a fourth determining subunit, configured to: when the center point of the pupil region is located in the static region of the sclera region, determine that the position information of the pupil region relative to the sclera region is static; when the center point is located in the upper region, up; when it is located in the lower region, down; when it is located in the left region, left; when it is located in the right region, right; when it is located in the upper-left region, upper-left; when it is located in the lower-left region, lower-left; when it is located in the upper-right region, upper-right; and when it is located in the lower-right region, lower-right.
In a specific implementation, the second determining unit 504 may include:
A first recording subunit, configured to record the first eye-closing duration of the first blink operation;
A fifth determining subunit, configured to: if the first eye-closing duration is greater than or equal to a preset duration threshold, then, if the position of the eye core region relative to the white of the eye region is up, determine that the terminal input operation is an upward sliding operation; if the position is down, a downward sliding operation; if left, a leftward sliding operation; and if right, a rightward sliding operation.
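The fifth determining subunit's rule combines a deliberate long blink with the pupil's position to pick a sliding direction. A minimal sketch, assuming an illustrative threshold value (the patent leaves the preset duration threshold open) and hypothetical function and gesture names:

```python
# Sketch of the long-blink slide rule. The 0.5 s threshold and the
# gesture names are illustrative assumptions, not values from the patent.

PRESET_DURATION_THRESHOLD = 0.5  # seconds

def slide_operation(eye_closing_duration, position):
    """Return the terminal input operation, or None if no gesture fires."""
    if eye_closing_duration < PRESET_DURATION_THRESHOLD:
        return None  # a short, involuntary blink triggers nothing
    return {"up": "slide-up", "down": "slide-down",
            "left": "slide-left", "right": "slide-right"}.get(position)
```

A blink shorter than the threshold, or one made while the pupil sits in the static zone, maps to no operation.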
Alternatively, in a specific implementation, the second determining unit 504 may include:
A second recording subunit, configured to record the time at which the first blink operation is identified and, when a further blink operation is identified within a preset time interval, record the eye-closing duration of that further blink operation;
A terminating subunit, configured to terminate the current terminal input operation if the further eye-closing duration is greater than or equal to the preset duration threshold.
Alternatively, the second determining unit 504 may include:
A third recording subunit, configured to: when the first blink operation is identified, record the time at which it is identified and its first eye-closing duration; if the first eye-closing duration is greater than or equal to the preset duration threshold, record a first parameter value of the white of the eye region size; when a further blink operation is identified within a preset time interval, record its eye-closing duration; and if that eye-closing duration is greater than or equal to the preset duration threshold, record a second parameter value of the white of the eye region size;
A first comparing subunit, configured to compare the first parameter value with the second parameter value: if the first parameter value is greater than the second parameter value, determine that the terminal input operation is a zoom-in operation; if the first parameter value is less than the second parameter value, determine that it is a zoom-out operation.
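The zoom rule reduces to comparing the white-of-the-eye size sampled at two long blinks. A minimal sketch, assuming the "size parameter value" is a pixel area (the patent does not specify the unit) and that equal sizes trigger nothing:

```python
# Sketch of the zoom comparison. Treating the size parameter as a pixel
# area is an assumption; the patent only says "white of the eye region size".

def zoom_operation(first_area, second_area):
    """Compare sclera sizes recorded at the first and second long blinks."""
    if first_area > second_area:
        return "zoom-in"   # region shrank between blinks
    if first_area < second_area:
        return "zoom-out"  # region grew between blinks
    return None  # equal sizes: behavior unspecified in the text
```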
Alternatively, the second determining unit 504 may include:
A fourth recording subunit, configured to: when the first blink operation is identified, record the time at which it is identified and its first eye-closing duration; if the first eye-closing duration is greater than or equal to the preset duration threshold, record a first major-axis value of the white of the eye region; when a further blink operation is identified within a preset time interval, record its eye-closing duration; and if that eye-closing duration is greater than or equal to the preset duration threshold, record a second major-axis value of the white of the eye region;
A second comparing subunit, configured to compare the first major-axis value with the second major-axis value: if the rotation angle of the first major-axis value relative to the second major-axis value is greater than 0, determine that the terminal input operation is a clockwise rotation operation; if the rotation angle is less than 0, determine that it is a counterclockwise rotation operation.
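One way to read the second comparing subunit is as an angle difference between the sclera's major-axis directions at the two long blinks. The sketch below assumes each major-axis value is represented as a direction vector and that a positive angle difference means clockwise; the sign convention depends on the image coordinate system and is an assumption, as are the function names:

```python
import math

# Sketch of the rotation comparison. Representing each major-axis value
# as a (dx, dy) direction vector, and mapping a positive angle difference
# to "clockwise", are assumptions for illustration.

def rotate_operation(first_axis, second_axis):
    """Compare major-axis directions sampled at two long blinks."""
    a1 = math.atan2(first_axis[1], first_axis[0])
    a2 = math.atan2(second_axis[1], second_axis[0])
    angle = a1 - a2  # rotation of the first axis relative to the second
    if angle > 0:
        return "rotate-clockwise"
    if angle < 0:
        return "rotate-counterclockwise"
    return None  # no measurable rotation between the two blinks
```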
Alternatively, the second determining unit 504 may include:
A fifth recording subunit, configured to: when the first blink operation is identified, record the time at which it is identified and its first eye-closing duration; and, if the first eye-closing duration is greater than or equal to the preset duration threshold and a further blink operation is identified within a preset time interval, record the eye-closing duration of that further blink operation;
A sixth determining subunit, configured to determine that the terminal input operation is a click operation if the further eye-closing duration is greater than or equal to the preset duration threshold.
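The click gesture described by the fifth recording subunit and the sixth determining subunit is two long blinks within a preset time interval. A minimal sketch, assuming illustrative threshold and interval values (the patent leaves both open) and a hypothetical blink-event representation:

```python
# Sketch of the double-long-blink click rule. The 0.5 s duration threshold,
# the 2.0 s interval, and the (start_time, duration) event format are
# illustrative assumptions.

PRESET_DURATION_THRESHOLD = 0.5   # seconds a blink must last to count
PRESET_TIME_INTERVAL = 2.0        # window for the second long blink

def click_operation(blinks):
    """blinks: chronological list of (start_time, eye_closing_duration).
    Returns 'click' when two long blinks fall within the preset interval."""
    long_blinks = [(t, d) for (t, d) in blinks
                   if d >= PRESET_DURATION_THRESHOLD]
    # Check each pair of consecutive long blinks against the interval.
    for (t1, _), (t2, _) in zip(long_blinks, long_blinks[1:]):
        if t2 - t1 <= PRESET_TIME_INTERVAL:
            return "click"
    return None
```

Short blinks are filtered out first, so ordinary blinking between the two deliberate long blinks does not break the gesture.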
The terminal input control method and device provided by the embodiments of the present invention identify an eyeball through a camera, the eyeball comprising a white of the eye region and an eye core region, and determine a terminal input operation according to an identified blink operation combined with the position of the eye core region relative to the white of the eye region. The embodiments of the present invention apply conventional computer-vision-based eyeball recognition technology to a terminal and control terminal input based on eyeball actions, thereby providing a new terminal input control method, enriching terminal input control modes, and offering users more natural and intuitive interaction.
Those skilled in the art will appreciate that embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present application have been described, those skilled in the art, once aware of the basic inventive concept, may make additional changes and modifications to these embodiments. Accordingly, the appended claims are intended to be interpreted as covering the preferred embodiments and all changes and modifications that fall within the scope of the present application.
Obviously, those skilled in the art may make various changes and variations to the present invention without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to encompass them.

Claims (16)

1. A terminal input control method, comprising:
identifying an eyeball in a real-time image obtained by a camera, the eyeball comprising a white of the eye region and an eye core region, wherein the identifying specifically comprises: according to stored white-of-the-eye color information and eye-core color information, determining a region matching the stored white-of-the-eye color information as a candidate white of the eye region, and determining a region matching the stored eye-core color information as a candidate eye core region; determining a bounding rectangle of each candidate white of the eye region; determining a central point of each candidate eye core region; for each candidate eye core region, if the central point of the candidate eye core region lies inside the bounding rectangle of any candidate white of the eye region, forming a pair of that candidate eye core region and that candidate white of the eye region; and determining the paired candidate white of the eye region and candidate eye core region as the eyeball of an identified eye;
identifying a blink operation in the real-time image;
determining a position of the eye core region relative to the white of the eye region; and
determining a terminal input operation according to the blink operation, or according to the blink operation and the position information.
2. The method of claim 1, wherein before determining the paired candidate white of the eye region and candidate eye core region as the eyeball of the identified eye, the method further comprises:
determining a central point of the candidate white of the eye region that forms a pair with the candidate eye core region; and
determining that a distance between the central point of the candidate white of the eye region and a stored white-of-the-eye region central point is less than or equal to a preset distance threshold.
3. The method of claim 1, wherein the white of the eye region comprises a static zone, an upper region, a lower region, a left region, a right region, an upper-left region, a lower-left region, an upper-right region, and a lower-right region; and
determining the position of the eye core region relative to the white of the eye region specifically comprises:
determining a central point of the eye core region;
when the central point of the eye core region is located in the static zone of the white of the eye region, determining that the position of the eye core region relative to the white of the eye region is static;
when the central point is located in the upper region of the white of the eye region, determining that the position is up;
when the central point is located in the lower region, determining that the position is down;
when the central point is located in the left region, determining that the position is left;
when the central point is located in the right region, determining that the position is right;
when the central point is located in the upper-left region, determining that the position is upper left;
when the central point is located in the lower-left region, determining that the position is lower left;
when the central point is located in the upper-right region, determining that the position is upper right; and
when the central point is located in the lower-right region, determining that the position is lower right.
4. The method of claim 3, wherein determining the terminal input operation according to the blink operation and the position information specifically comprises:
recording a first eye-closing duration of a first blink operation; and
if the first eye-closing duration is greater than or equal to a preset duration threshold:
if the position of the eye core region relative to the white of the eye region is up, determining that the terminal input operation is an upward sliding operation;
if the position is down, determining that the terminal input operation is a downward sliding operation;
if the position is left, determining that the terminal input operation is a leftward sliding operation; and
if the position is right, determining that the terminal input operation is a rightward sliding operation.
5. The method of claim 4, further comprising:
when the first blink operation is identified, recording the time at which the first blink operation is identified;
when a further blink operation is identified within a preset time interval, recording an eye-closing duration of the further blink operation; and
if the further eye-closing duration is greater than or equal to the preset duration threshold, terminating the current terminal input operation.
6. The method of claim 1, wherein determining the terminal input operation according to the blink operation and the position information specifically comprises:
when a first blink operation is identified, recording the time at which the first blink operation is identified and a first eye-closing duration of the first blink operation;
if the first eye-closing duration is greater than or equal to a preset duration threshold, recording a first parameter value of the white of the eye region size;
when a further blink operation is identified within a preset time interval, recording an eye-closing duration of the further blink operation;
if the further eye-closing duration is greater than or equal to the preset duration threshold, recording a second parameter value of the white of the eye region size; and
comparing the first parameter value with the second parameter value: if the first parameter value is greater than the second parameter value, determining that the terminal input operation is a zoom-in operation; if the first parameter value is less than the second parameter value, determining that the terminal input operation is a zoom-out operation.
7. The method of claim 1, wherein determining the terminal input operation according to the blink operation and the position information specifically comprises:
when a first blink operation is identified, recording the time at which the first blink operation is identified and a first eye-closing duration of the first blink operation;
if the first eye-closing duration is greater than or equal to a preset duration threshold, recording a first major-axis value of the white of the eye region;
when a further blink operation is identified within a preset time interval, recording an eye-closing duration of the further blink operation;
if the further eye-closing duration is greater than or equal to the preset duration threshold, recording a second major-axis value of the white of the eye region; and
comparing the first major-axis value with the second major-axis value: if the rotation angle of the first major-axis value relative to the second major-axis value is greater than 0, determining that the terminal input operation is a clockwise rotation operation; if the rotation angle is less than 0, determining that the terminal input operation is a counterclockwise rotation operation.
8. The method of claim 1, wherein determining the terminal input operation according to the blink operation specifically comprises:
when a first blink operation is identified, recording the time at which the first blink operation is identified and a first eye-closing duration of the first blink operation;
if the first eye-closing duration is greater than or equal to a preset duration threshold, and a further blink operation is identified within a preset time interval, recording an eye-closing duration of the further blink operation; and
if the further eye-closing duration is greater than or equal to the preset duration threshold, determining that the terminal input operation is a click operation.
9. A terminal input control device, comprising:
a first recognition unit, configured to identify an eyeball in a real-time image obtained by a camera, the eyeball comprising a white of the eye region and an eye core region, wherein the first recognition unit comprises: a first determining subunit, configured to determine, according to stored white-of-the-eye color information and eye-core color information, a region matching the stored white-of-the-eye color information as a candidate white of the eye region and a region matching the stored eye-core color information as a candidate eye core region, and to determine a bounding rectangle of each candidate white of the eye region and a central point of each candidate eye core region; a pairing subunit, configured to, for each candidate eye core region, form a pair of that candidate eye core region and a candidate white of the eye region if the central point of the candidate eye core region lies inside the bounding rectangle of any candidate white of the eye region; and a second determining subunit, configured to determine the paired candidate white of the eye region and candidate eye core region as the eyeball region of an identified eye;
a second recognition unit, configured to identify a blink operation in the real-time image;
a first determining unit, configured to determine a position of the eye core region relative to the white of the eye region; and
a second determining unit, configured to determine a terminal input operation according to the blink operation, or according to the blink operation and the position information.
10. The device of claim 9, wherein:
the second determining subunit is further configured to, before determining the paired candidate white of the eye region and candidate eye core region as the eyeball of the identified eye, determine a central point of the candidate white of the eye region that forms a pair with the candidate eye core region, and determine that a distance between the central point of the candidate white of the eye region and a stored white-of-the-eye region central point is less than or equal to a preset distance threshold.
11. The device of claim 9, wherein the white of the eye region comprises a static zone, an upper region, a lower region, a left region, a right region, an upper-left region, a lower-left region, an upper-right region, and a lower-right region; and
the first determining unit comprises:
a third determining subunit, configured to determine a central point of the eye core region; and
a fourth determining subunit, configured to: when the central point of the eye core region is located in the static zone of the white of the eye region, determine that the position of the eye core region relative to the white of the eye region is static; when the central point is located in the upper region, determine that the position is up; when in the lower region, down; when in the left region, left; when in the right region, right; when in the upper-left region, upper left; when in the lower-left region, lower left; when in the upper-right region, upper right; and when in the lower-right region, lower right.
12. The device of claim 11, wherein the second determining unit comprises:
a first recording subunit, configured to record a first eye-closing duration of a first blink operation; and
a fifth determining subunit, configured to: if the first eye-closing duration is greater than or equal to a preset duration threshold, then, if the position of the eye core region relative to the white of the eye region is up, determine that the terminal input operation is an upward sliding operation; if the position is down, a downward sliding operation; if left, a leftward sliding operation; and if right, a rightward sliding operation.
13. The device of claim 12, wherein the second determining unit comprises:
a second recording subunit, configured to record the time at which the first blink operation is identified and, when a further blink operation is identified within a preset time interval, record an eye-closing duration of the further blink operation; and
a terminating subunit, configured to terminate the current terminal input operation if the further eye-closing duration is greater than or equal to the preset duration threshold.
14. The device of claim 9, wherein the second determining unit comprises:
a third recording subunit, configured to: when the first blink operation is identified, record the time at which it is identified and its first eye-closing duration; if the first eye-closing duration is greater than or equal to a preset duration threshold, record a first parameter value of the white of the eye region size; when a further blink operation is identified within a preset time interval, record its eye-closing duration; and if that eye-closing duration is greater than or equal to the preset duration threshold, record a second parameter value of the white of the eye region size; and
a first comparing subunit, configured to compare the first parameter value with the second parameter value: if the first parameter value is greater than the second parameter value, determine that the terminal input operation is a zoom-in operation; if the first parameter value is less than the second parameter value, determine that it is a zoom-out operation.
15. The device of claim 9, wherein the second determining unit comprises:
a fourth recording subunit, configured to: when the first blink operation is identified, record the time at which it is identified and its first eye-closing duration; if the first eye-closing duration is greater than or equal to a preset duration threshold, record a first major-axis value of the white of the eye region; when a further blink operation is identified within a preset time interval, record its eye-closing duration; and if that eye-closing duration is greater than or equal to the preset duration threshold, record a second major-axis value of the white of the eye region; and
a second comparing subunit, configured to compare the first major-axis value with the second major-axis value: if the rotation angle of the first major-axis value relative to the second major-axis value is greater than 0, determine that the terminal input operation is a clockwise rotation operation; if the rotation angle is less than 0, determine that it is a counterclockwise rotation operation.
16. The device of claim 9, wherein the second determining unit comprises:
a fifth recording subunit, configured to: when the first blink operation is identified, record the time at which it is identified and its first eye-closing duration; and, if the first eye-closing duration is greater than or equal to a preset duration threshold and a further blink operation is identified within a preset time interval, record an eye-closing duration of the further blink operation; and
a sixth determining subunit, configured to determine that the terminal input operation is a click operation if the further eye-closing duration is greater than or equal to the preset duration threshold.
CN201210301032.6A 2012-08-22 2012-08-22 A kind of terminal input control method and device Active CN103631365B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201210301032.6A CN103631365B (en) 2012-08-22 2012-08-22 A kind of terminal input control method and device
PCT/CN2013/079567 WO2014029245A1 (en) 2012-08-22 2013-07-18 Terminal input control method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210301032.6A CN103631365B (en) 2012-08-22 2012-08-22 A kind of terminal input control method and device

Publications (2)

Publication Number Publication Date
CN103631365A CN103631365A (en) 2014-03-12
CN103631365B true CN103631365B (en) 2016-12-21

Family

ID=50149399

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210301032.6A Active CN103631365B (en) 2012-08-22 2012-08-22 A kind of terminal input control method and device

Country Status (2)

Country Link
CN (1) CN103631365B (en)
WO (1) WO2014029245A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105242888B (en) * 2014-07-10 2018-10-12 联想(北京)有限公司 A kind of system control method and electronic equipment
CN106033253A (en) * 2015-03-12 2016-10-19 中国移动通信集团公司 A terminal control method and device
CN104951070A (en) * 2015-06-02 2015-09-30 无锡天脉聚源传媒科技有限公司 Method and device for manipulating device based on eyes
CN108351685B (en) * 2015-08-15 2022-07-08 谷歌有限责任公司 System and method for biomechanically based eye signals for interacting with real and virtual objects
CN105425971B (en) * 2016-01-15 2018-10-26 中意工业设计(湖南)有限责任公司 A kind of exchange method, device and the near-eye display at eye movement interface
CN106527705A (en) * 2016-10-28 2017-03-22 努比亚技术有限公司 Operation realization method and apparatus
CN106648175A (en) * 2016-12-30 2017-05-10 宇龙计算机通信科技(深圳)有限公司 Convenient fingerprint operation method and terminal
CN110162187A (en) * 2019-06-19 2019-08-23 重庆工商职业学院 Eyeball mobile identification method and device based on artificial intelligence
CN110297539A (en) * 2019-06-19 2019-10-01 重庆工商职业学院 A kind of eye movement recognition methods and device based on artificial intelligence

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101639939A (en) * 2008-07-29 2010-02-03 株式会社日立制作所 Image information processing method and apparatus
CN101813976A (en) * 2010-03-09 2010-08-25 华南理工大学 Sighting tracking man-computer interaction method and device based on SOC (System On Chip)
CN101950200A (en) * 2010-09-21 2011-01-19 浙江大学 Camera based method and device for controlling game map and role shift by eyeballs
CN102033696A (en) * 2009-09-24 2011-04-27 株式会社泛泰 Apparatus and method for controlling picture using image recognition
CN102193621A (en) * 2010-03-17 2011-09-21 三星电子(中国)研发中心 Vision-based interactive electronic equipment control system and control method thereof

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7736000B2 (en) * 2008-08-27 2010-06-15 Locarna Systems, Inc. Method and apparatus for tracking eye movement
ES2669058T3 (en) * 2009-07-16 2018-05-23 Tobii Ab Eye detection system and method that uses sequential data flow
US9910490B2 (en) * 2011-12-29 2018-03-06 Eyeguide, Inc. System and method of cursor position control based on the vestibulo-ocular reflex


Also Published As

Publication number Publication date
WO2014029245A1 (en) 2014-02-27
CN103631365A (en) 2014-03-12

Similar Documents

Publication Publication Date Title
CN103631365B (en) A kind of terminal input control method and device
JP6159323B2 (en) Information processing method and information processing apparatus
US9940754B2 (en) Head-mounted display system and method for presenting display on head-mounted display
US9377859B2 (en) Enhanced detection of circular engagement gesture
EP3293620A1 (en) Multi-screen control method and system for display screen based on eyeball tracing technology
CN105892642A (en) Method and device for controlling terminal according to eye movement
WO2017112099A1 (en) Text functions in augmented reality
CN109656354A (en) Information processing unit and information processing method
US9544556B2 (en) Projection control apparatus and projection control method
US20150124069A1 (en) Information processing device and information processing method
US11809635B2 (en) Computer system and method for human-machine interaction
CN106557672A (en) The solution lock control method of head mounted display and device
EP3410277B1 (en) Image projection device
CN107992262A (en) Split screen display available interface control method, mobile terminal and computer-readable recording medium
US10248307B2 (en) Virtual reality headset device with front touch screen
CN106774833A (en) Virtual reality exchange method and device
EP3974949A1 (en) Head-mounted display
JP5124694B1 (en) Information processing apparatus, electronic device, information processing method, and program
JP5368618B2 (en) Information processing apparatus, electronic device, information processing method, and program
US20150379754A1 (en) Image processing apparatus, animation creation method, and computer-readable medium
KR101680084B1 (en) Designing method for gesture interface and designing apparatus for gesture interface
JP2013228992A (en) Information processing apparatus, electronic apparatus, information processing method and program
KR101474873B1 (en) Control device based on non-motion signal and motion signal, and device control method thereof
KR20130128143A (en) Apparatus and method for controlling interface using hand gesture and computer-readable recording medium with program therefor
EP4258253A1 (en) Method and apparatus for adjusting display color, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant