CN106814854A - A method and device for preventing mis-operation - Google Patents
A method and device for preventing mis-operation
- Publication number: CN106814854A
- Application number: CN201611246265.5A (CN201611246265A)
- Authority: CN (China)
- Prior art keywords: user, eyes, screen, determined, eye motion
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present application discloses a method and device for preventing mis-operation, used to solve the problem in the prior art that, when a touch display screen is operated, mis-operations are caused by the user unintentionally touching the screen. The method includes: determining the position coordinates, on the screen, of the touch point of the gesture operation corresponding to a received operation instruction; obtaining the eye motion information of the user; determining, according to the obtained eye motion information, the screen region at which the user's eyes are gazing; and judging whether the position coordinates of the touch point fall within that screen region. When the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, the operation instruction corresponding to the gesture operation is determined to be a valid operation, and the operation corresponding to the operation instruction is performed; when they do not, the operation instruction corresponding to the gesture operation is determined to be a mis-operation, and execution of the corresponding operation is refused.
Description
Technical field
The present application relates to the field of computer technology, and in particular to a method and device for preventing mis-operation.
Background technology
With the continuous development of science and technology, more and more intelligent terminals are being developed, bringing convenience to daily life in many respects.
With the rapid development of all kinds of intelligent terminals, users' expectations for the experience of using them have also grown. As a result, the various physical buttons used for operations on intelligent terminals are gradually being replaced by touch screens, which are more convenient to operate.
A touch screen, also called a touch display, is an induction-type display that can receive user input through touch. By touching icons, interface elements, or graphic buttons on the display, a user can send various operation instructions to an intelligent terminal, so that traditional mechanical buttons can be replaced.
To provide a better user experience, the touch displays of current intelligent terminals are being made larger and larger. Although a larger display is convenient to use, because the display is large or its touch sensitivity is high, a user operating it by tapping can easily touch an application or virtual key that is not currently wanted, causing mis-operations such as opening the wrong application or jumping to the wrong page, which greatly inconveniences the user.
Therefore, how to avoid mis-operations caused by the user unintentionally touching the touch display has become a problem demanding prompt solution in the prior art.
The content of the invention
The embodiments of the present application provide a method for preventing mis-operation, used to solve the problem in the prior art that, when a touch display screen is operated, mis-operations are caused by the user unintentionally touching the screen.
The embodiments of the present application also provide a device for preventing mis-operation, used to solve the same problem in the prior art.
The embodiments of the present application adopt the following technical solutions:
A method for preventing mis-operation, including:
determining the position coordinates, on the screen, of the touch point of the gesture operation corresponding to a received operation instruction;
obtaining the eye motion information of the user;
determining, according to the obtained eye motion information, the screen region at which the user's eyes are gazing;
judging whether the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing;
when the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, determining that the operation instruction corresponding to the gesture operation is a valid operation, and performing the operation corresponding to the operation instruction;
when the position coordinates of the touch point do not fall within the screen region at which the user's eyes are gazing, determining that the operation instruction corresponding to the gesture operation is a mis-operation, and refusing to perform the operation corresponding to the operation instruction.
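As a concrete illustration, the validity judgment described in the steps above amounts to a point-in-region test. The sketch below is not part of the patent; the `Rect` type and the function names are illustrative assumptions, shown in Python for clarity.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned screen region in pixel coordinates."""
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        # True when the point (x, y) lies inside this region.
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def is_valid_operation(touch_x: float, touch_y: float, gaze_region: Rect) -> bool:
    """A touch is a valid operation only if its coordinates fall inside the
    screen region the user's eyes are gazing at; otherwise it is treated
    as a mis-operation and execution is refused."""
    return gaze_region.contains(touch_x, touch_y)
```

A caller would execute the received operation instruction when `is_valid_operation` returns `True` and refuse it otherwise.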
A device for preventing mis-operation, including:
a touch point position coordinate determining unit, for determining the position coordinates, on the screen, of the touch point of the gesture operation corresponding to a received operation instruction;
an eye motion information obtaining unit, for obtaining the eye motion information of the user;
a screen position region determining unit, for determining, according to the obtained eye motion information, the screen region at which the user's eyes are gazing;
a judging unit, for judging whether the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing;
an execution unit, for determining, when the judging unit judges that the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, that the operation instruction corresponding to the gesture operation is a valid operation, and performing the operation corresponding to the operation instruction; and for determining, when the judging unit judges that the position coordinates of the touch point do not fall within the screen region at which the user's eyes are gazing, that the operation instruction corresponding to the gesture operation is a mis-operation, and refusing to perform the operation corresponding to the operation instruction.
The above technical solutions adopted in the embodiments of the present application can achieve the following beneficial effects:
The user's eye motion information can be obtained, and the screen region at which the user's eyes are gazing can be determined from it. Because the gazed-at screen region reflects which part of the screen content the user is currently browsing, after an operation instruction triggered through the touch display is received, whether the received instruction is a valid operation can be determined by judging whether the position coordinates of the touch point on the screen fall within the gazed-at region. An operation instruction triggered by a gesture such as a tap on the screen region the user is currently browsing can generally be considered a valid operation; otherwise it is a mis-operation. Accordingly, when the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, the operation instruction corresponding to the gesture operation is determined to be a valid operation and the corresponding operation is performed; when they do not, the operation instruction is determined to be a mis-operation and execution of the corresponding operation is refused. Mis-operations caused by unintentional touches can therefore be reduced to a certain extent.
Brief description of the drawings
The accompanying drawings described here are provided for further understanding of the present application and constitute a part of it. The schematic embodiments of the present application and their descriptions are used to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a schematic flow chart of an implementation of the method for preventing mis-operation provided by an embodiment of the present application;
Fig. 2 is a trajectory diagram of the motion of a user's fixation points while reading a passage of text;
Fig. 3 shows the screen region at which the user's eyes are gazing, as determined by the method for preventing mis-operation provided by an embodiment of the present application;
Fig. 4 is a diagram of the case in which the position coordinates of the touch point on a smartphone fall within the screen region at which the user's eyes are gazing;
Fig. 5 is a diagram of the case in which the position coordinates of the touch point on a smartphone do not fall within the screen region at which the user's eyes are gazing;
Fig. 6 is a schematic structural diagram of a device for preventing mis-operation provided by an embodiment of the present application.
Specific embodiments
To make the purpose, technical solutions, and advantages of the present application clearer, the technical solutions are described clearly and completely below in conjunction with specific embodiments of the present application and the corresponding drawings. Obviously, the described embodiments are only some of the embodiments of the present application, rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in the present application without creative work fall within the scope of protection of the present application.
The technical solutions provided by each embodiment of the present application are described in detail below in conjunction with the drawings.
Embodiment 1
An embodiment of the present application provides a method for preventing mis-operation, used to solve the problem in the prior art that, when a touch display screen is operated, mis-operations are caused by the user unintentionally touching the screen.
The executing subject of the method provided by this embodiment may be, but is not limited to, an intelligent terminal such as a mobile phone, a tablet computer, or a PC (Personal Computer).
For ease of description, the implementation of the method is introduced below with a smartphone as the executing subject. It should be understood that taking a smartphone as the executing subject is only an exemplary illustration and should not be construed as a limitation of the method.
A schematic flow chart of an implementation of the method is shown in Fig. 1, and mainly includes the following steps:
Step 11: determining the position coordinates, on the screen, of the touch point of the gesture operation corresponding to a received operation instruction.
Here, the touch point of the gesture operation on the screen refers to the point at which the user's finger contacts the smartphone's touch screen when the user enters an operation instruction by touching it.
The user can trigger operation instructions such as "open", "close", or "select" by tapping specific keys displayed on the smartphone's touch screen. When the user triggers a specified operation instruction by a single tap, the touch point of the gesture operation corresponding to that instruction is the point at which the user's finger contacts the touch screen during that tap.
When the user triggers a specified operation instruction by a sliding gesture on the touch screen, several touch points are produced along the sliding trajectory of the user's finger as it moves across the screen. Those points may all be called touch points, on the screen, of the gesture operation corresponding to the operation instruction.
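The tap-versus-swipe distinction above can be modeled as extracting one or several touch points from raw touch events. This is a minimal sketch only; the event dictionaries and the function name are illustrative assumptions, not anything prescribed by the patent.

```python
def touch_points(events):
    """Extract the (x, y) touch points of a gesture from raw touch
    events. A single tap yields one point; a sliding gesture yields
    every point sampled along the finger's trajectory."""
    return [(e["x"], e["y"]) for e in events]

# A single tap produces one contact point.
tap = touch_points([{"x": 120, "y": 340}])

# A swipe produces several touch points along the sliding trajectory.
swipe = touch_points([{"x": 100, "y": 300}, {"x": 140, "y": 300},
                      {"x": 180, "y": 300}])
```

For a swipe, a downstream validity check might require all (or some) of these points to fall within the gazed-at region.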
Step 12: obtaining the eye motion information of the user.
Here, the eye motion information of the user refers to information reflecting the activity of the user's eyeballs, typically obtained by measuring the position of the fixation points of the eyes or the motion of the eyeballs relative to the head.
Usually, the eye activity of the user can be captured by the camera on the intelligent terminal, and an eye tracker analyzes and computes on the captured eye activity to obtain the user's eye motion information.
It should be noted that the eye tracker does not necessarily refer to an independent hardware device; it may also refer to a software program used to realize the acquisition of eye motion information. At present, an existing eye tracker generally comprises four systems: an optical system, a pupil-center coordinate extraction system, a scene and pupil coordinate superimposition system, and an image and data recording and analysis system. The eye tracker mentioned in the embodiments of the present application generally refers to an eye tracker application program (Application, APP) that can realize the functions of the aforementioned four systems: by installing the eye tracker APP on the mobile phone and capturing the user's eye activity through the phone's camera, the phone itself is given the functions of an eye tracker.
Eye movement mainly has three basic modes: fixation, saccades, and pursuit movement. The mode of eye movement is usually determined by the motion of the fixation point of the eyes (which may also be described in terms of the pupil). When analyzing eye movement, current eye trackers mainly use the following eye motion parameters: the fixation-point trajectory diagram, the eye motion time, the eye motion direction, and the eye motion distance (or amplitude). To analyze the user's line of sight with the eye tracker, step 12 is performed primarily to obtain eye motion information such as the eye motion direction, eye motion distance, and eye motion time during the eye movement.
During eye movement, eye motion information such as the direction and distance of motion can generally only be determined by analyzing the positions of the user's fixation points; therefore, to obtain the user's eye motion information, at least two fixation points during the user's eye movement usually need to be collected first. In one embodiment, the specific implementation of step 12 may include: using the camera, collecting the fixation points of the user's eyes at a preset time interval; determining at least two fixation points according to the chronological order of collection; and determining the user's eye motion information according to the at least two determined fixation points.
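Given timestamped fixation points sampled at a preset interval, the direction, distance, and time of an eye movement can be computed from the first and last points, as the text describes. The sketch below is one plausible realization under assumed conventions; the `(t, x, y)` tuple format and the function name are not part of the patent.

```python
import math

def eye_motion_info(fixations):
    """Compute eye motion information from timestamped fixation points.

    `fixations` is a chronologically ordered list of (t, x, y) samples,
    e.g. collected by the front camera at a preset interval. At least two
    points are required. Returns (direction_radians, distance, duration)."""
    if len(fixations) < 2:
        raise ValueError("at least two fixation points are required")
    (t0, x0, y0), (t1, x1, y1) = fixations[0], fixations[-1]
    dx, dy = x1 - x0, y1 - y0
    direction = math.atan2(dy, dx)   # eye motion direction
    distance = math.hypot(dx, dy)    # eye motion distance (amplitude)
    duration = t1 - t0               # eye motion time
    return direction, distance, duration
```

For the left-to-right reading trajectory of Fig. 2, `direction` would come out close to 0 radians.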
It should be noted that in everyday use, particularly while reading or browsing web pages, the user's eyes move rather frequently. To avoid the problem that an overly long collection interval misses some change in the position of the user's fixation point, making the analyzed eye motion information inaccurate, in one embodiment a fairly short fixation-point collection interval is set, so that the smartphone can capture through the camera as many of the fixation-point positions at different moments during the user's eye movement as possible, and more accurate eye motion information can then be determined from the collected fixation points.
It should also be noted that by combining, in chronological order, the fixation points collected at different moments during the user's eye movement, the fixation-point trajectory diagram of the movement can be obtained. For example, in Fig. 2 each black dot represents a fixation point of a user reading a passage of text; together, those fixation points compose the user's fixation-point trajectory diagram for that passage.
Similarly, the eye motion distance of each eye movement can be determined from the distance between collected fixation points, and the direction and distance of the user's eye movement over a period of time can be determined from the relative positions of the first and last fixation points recorded in that period. For example, from the fixation points collected in Fig. 2 it can be determined that the direction of the user's eye movement is from left to right, and the eye motion distance while reading the passage is given by the distance between fixation point 1 and fixation point 2 in Fig. 2. Likewise, the eye motion time of this movement can be determined from the interval between the collection time of fixation point 1 and that of fixation point 2.
Through the above method of step 12, the smartphone can use its built-in camera to collect at least two fixation points during the user's eye movement, compute the user's eye motion information from the positions and collection times of those points, and supply the computed eye motion information to the eye tracker APP on the smartphone for analysis.
Step 13: determining, according to the eye motion information obtained by performing step 12, the screen region at which the user's eyes are gazing.
In one embodiment, the smartphone can use the eye tracker APP to analyze the eye motion information obtained in step 12 to determine the phone-screen region at which the user's eyes are gazing. Specifically, the implementation of step 13 may be: analyzing and computing on the collected eye motion information with the eye tracker to determine the gaze direction of the user's eyes; and determining, according to the determined gaze direction, the screen region at which the user's eyes are gazing.
The eye tracker APP installed on the smartphone can determine the current gaze direction of the user's eyes from the eye movement mode and eye motion distance obtained in step 12, and then determine the gazed-at screen region from the distance between the user's eyes and the smartphone's display together with the determined gaze direction.
Specifically, the positions on the screen of at least two fixation points of the user's eyes can be determined from the gaze direction obtained by the analysis, and the screen region at which the user's eyes are gazing can then be determined from the on-screen positions of those at least two fixation points. For example, suppose that by performing step 12 the smartphone collects, through the camera, the motion of the user's fixation points over one minute. By analyzing the collected motion of the fixation points, it can be obtained that the direction of the user's eye movement within that minute is downward; then, from the on-screen positions of the first and last fixation points collected within that minute, the screen region at which the user's eyes were gazing during the minute can be determined.
For example, the two black dots from top to bottom in Fig. 3 represent the on-screen positions of the user's fixation points collected by the smartphone camera within 30 s: the dot near the top of the screen is the first fixation point collected, and the dot near the bottom is the fixation point collected afterwards. From the positions of these two fixation points it can be determined that the direction of the user's eye movement during the 30 s was from top to bottom, and that the screen region at which the user's eyes were gazing during the 30 s is the region enclosed by the dashed box in Fig. 3.
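One simple way to turn on-screen fixation points into a gaze region, like the dashed box of Fig. 3, is to take their bounding rectangle, optionally expanded by a margin. This is a sketch of one plausible realization; the patent does not specify the exact geometry, and the `margin` parameter is an assumption.

```python
def gaze_region(points, margin=0.0):
    """Bounding rectangle (left, top, right, bottom) of the on-screen
    fixation points, expanded by an optional margin in pixels."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)

# First and last fixation points of a top-to-bottom movement (cf. Fig. 3).
region = gaze_region([(200, 150), (220, 600)], margin=50)
```

A touch point would then be tested against this rectangle in step 14.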
Step 14: judging whether the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing; when the judgment is yes, performing step 15, and when the judgment is no, performing step 16.
Taking browsing a web page on a smartphone as an example: apart from inputting page-scrolling operations, the user usually only taps the smartphone's touch screen to input a specific operation (such as selecting content of interest) when browsing content he or she finds interesting. We can therefore generally consider that the user is usually only likely to operate on the region currently being browsed, and will generally not operate on a region not currently being browsed. That is, an operation instruction triggered by a gesture such as a tap on the screen region the user is currently browsing can be considered a valid operation, while an operation on a screen region not currently being browsed is a mis-operation. Accordingly, when the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, the operation instruction corresponding to the gesture operation is determined to be a valid operation, and the operation corresponding to the operation instruction is performed.
Step 15: when the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, determining that the operation instruction corresponding to the gesture operation is a valid operation, and performing the operation corresponding to the operation instruction.
For example, suppose the screen region at which the user's eyes are gazing, as determined by performing step 13, is the region enclosed by the dashed box in Fig. 4, and the user triggers an operation instruction by tapping the region indicated by the finger in Fig. 4. Since the position coordinates of the point at which the user touches the screen fall within the screen region determined in step 13, it can be determined through steps 14 and 15 that the operation instruction corresponding to the gesture operation is a valid operation, and the smartphone can then perform the operation specified by the received operation instruction.
Step 16: when the position coordinates of the touch point do not fall within the screen region at which the user's eyes are gazing, determining that the operation instruction corresponding to the gesture operation is a mis-operation, and refusing to perform the operation corresponding to the operation instruction.
For example, as shown in Fig. 5, suppose the screen region at which the user's eyes are gazing, as determined by performing step 13, is the region enclosed by the dashed box, and the user triggers an operation instruction by tapping the region indicated by the finger in Fig. 5. Since the position coordinates of the point at which the user touches the screen do not fall within the screen region determined in step 13, it can be determined through steps 14 and 16 that the operation instruction corresponding to the gesture operation is a mis-operation, and the smartphone will refuse to perform the corresponding operation. Mis-operations triggered by unintentional touches are thus prevented from being performed and from affecting the user's current operation.
With the method for preventing mis-operation provided by Embodiment 1 of the present application, the user's eye motion information can be obtained, and the screen region at which the user's eyes are gazing can be determined from it. Because the gazed-at screen region reflects which part of the screen content the user is currently browsing, after an operation instruction triggered through the touch display is received, whether the received instruction is a valid operation can be determined by judging whether the position coordinates of the touch point on the screen fall within the gazed-at region. An operation instruction triggered by a gesture such as a tap on the screen region the user is currently browsing can generally be considered a valid operation; otherwise it is a mis-operation. Accordingly, when the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, the operation instruction corresponding to the gesture operation is determined to be a valid operation and the corresponding operation is performed; and when they do not, the operation instruction is determined to be a mis-operation and execution of the corresponding operation is refused. Mis-operations caused by unintentional touches can therefore be reduced to a certain extent.
Embodiment 2
An embodiment of the present application provides a device for preventing mis-operation, used to solve the problem in the prior art that, when a touch display screen is operated, mis-operations are caused by the user unintentionally touching the screen. A schematic diagram of the specific structure of the device is shown in Fig. 6, and includes: a touch point position coordinate determining unit 21, an eye motion information obtaining unit 22, a screen position region determining unit 23, a judging unit 24, and an execution unit 25.
The touch point position coordinate determining unit 21 is used to determine the position coordinates, on the screen, of the touch point of the gesture operation corresponding to a received operation instruction.
The eye motion information obtaining unit 22 is used to obtain the eye motion information of the user.
The screen position region determining unit 23 is used to determine, according to the eye motion information obtained by the eye motion information obtaining unit 22, the screen region at which the user's eyes are gazing.
The judging unit 24 is used to judge whether the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing.
The execution unit 25 is used to determine, when the judging unit judges that the position coordinates of the touch point fall within the screen region at which the user's eyes are gazing, that the operation instruction corresponding to the gesture operation is a valid operation, and to perform the operation corresponding to the operation instruction; and to determine, when the judging unit judges that the position coordinates of the touch point do not fall within that region, that the operation instruction corresponding to the gesture operation is a mis-operation, and to refuse to perform the operation corresponding to the operation instruction.
In one embodiment, the eye-motion information obtaining unit 22 is specifically configured to: collect gaze points of the user's eyes with a camera at a preset time interval; determine at least two gaze points according to the order of their acquisition times; and determine the eye-motion information of the user from the at least two determined gaze points.
In one embodiment, the eye-motion information includes an eye-motion direction and an eye-motion distance.
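The eye-motion direction and distance can be derived from two gaze points taken in acquisition-time order. The following is a minimal illustrative sketch, not code from the patent; the function name and the screen-coordinate convention (0 degrees pointing rightward) are assumptions.

```python
import math


def eye_motion_info(first, second):
    """Derive (direction_degrees, distance) of the eye motion from two
    gaze points given in acquisition-time order."""
    dx = second[0] - first[0]
    dy = second[1] - first[1]
    distance = math.hypot(dx, dy)          # eye-motion distance
    direction = math.degrees(math.atan2(dy, dx))  # eye-motion direction
    return direction, distance
```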
In one embodiment, the screen-position-region determining unit 23 is specifically configured to: analyze the collected eye-motion information of the user with an eye tracker to determine the gaze direction of the user's eyes; and determine, according to the determined gaze direction, the screen region gazed at by the user's eyes.
In one embodiment, the screen-position-region determining unit 23 is specifically configured to: determine, according to the gaze direction, the positions on the screen of at least two gaze points of the user's eyes; and determine, from the positions of the at least two gaze points on the screen, the screen region gazed at by the user's eyes.
In addition, an embodiment of the present application further provides a mobile terminal with a maloperation-prevention function, used to solve the prior-art problem that, when a touch display screen is operated, maloperations are caused by the user unintentionally touching the touch display screen. The mobile terminal includes a memory, a processor, and a camera.
The memory is configured to store program instructions.
The processor is coupled to the memory and configured to read the program instructions stored in the memory and, in response, perform the following operations: determining the position coordinates, on the screen, of the touch point of a gesture operation corresponding to a received operation instruction; determining, according to obtained eye-motion information, the screen region gazed at by the user's eyes; judging whether the position coordinates of the touch point fall within the gazed screen region; when the position coordinates of the touch point fall within the gazed screen region, determining that the operation instruction corresponding to the gesture operation is an effective operation, and performing the operation corresponding to the operation instruction; and when the position coordinates of the touch point do not fall within the gazed screen region, determining that the operation instruction corresponding to the gesture operation is a maloperation, and refusing to perform the operation corresponding to the operation instruction.
The camera is configured to obtain the eye-motion information of the user under the control of the processor.
In one embodiment, the camera is specifically configured to: collect gaze points of the user's eyes at a preset time interval; determine at least two gaze points according to the order of their acquisition times; and determine the eye-motion information of the user from the at least two determined gaze points.
In one embodiment, the processor is specifically configured to: analyze the collected eye-motion information of the user with an eye tracker to determine the gaze direction of the user's eyes; and determine, according to the determined gaze direction, the screen region gazed at by the user's eyes.
In one embodiment, the processor is specifically configured to: determine, according to the gaze direction, the positions on the screen of at least two gaze points of the user's eyes; and determine, from the positions of the at least two gaze points on the screen, the screen region gazed at by the user's eyes.
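Putting the processor's operations together (order the camera's gaze samples by acquisition time, derive the gazed region, then accept or refuse the touch), the end-to-end flow might look like the sketch below. The `GazeSample` type, the margin, and the string results are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass


@dataclass
class GazeSample:
    t: float  # acquisition time
    x: float  # gaze x on the screen
    y: float  # gaze y on the screen


def process_touch(touch, samples, margin=20):
    """Accept or refuse a touch instruction based on the screen region
    the user's eyes gaze at, built from time-ordered gaze samples."""
    points = [(s.x, s.y) for s in sorted(samples, key=lambda s: s.t)]
    if len(points) < 2:
        return "refuse"  # not enough gaze data to locate the region
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    left, top = min(xs) - margin, min(ys) - margin
    right, bottom = max(xs) + margin, max(ys) + margin
    inside = left <= touch[0] <= right and top <= touch[1] <= bottom
    return "execute" if inside else "refuse"
```

With samples around (100, 100) to (120, 110), a touch near those points is executed while a touch far outside the gazed region is refused.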
With the maloperation-prevention device provided by Embodiment 2 of the present application, the eye-motion information of the user can be obtained, and the screen region gazed at by the user's eyes can be determined from the obtained eye-motion information. Because the gazed screen region reflects which part of the screen content the user is currently browsing, after an operation instruction triggered by the user through the touch display screen is received, whether the received operation instruction is an effective operation instruction can be determined by judging whether the position coordinates of the touch point on the screen fall within the gazed screen region. An operation instruction triggered by a gesture operation, such as a tap, on the screen area the user is currently browsing can generally be regarded as an effective operation, and otherwise as a maloperation. Accordingly, when the position coordinates of the touch point fall within the gazed screen region, the operation instruction corresponding to the gesture operation is determined to be an effective operation, and the corresponding operation is performed; when they do not, the operation instruction is determined to be a maloperation, and performing the corresponding operation is refused. Maloperations caused by the user's accidental touches can therefore be reduced to a certain extent.
It should be understood by those skilled in the art that embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical memory) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus, the instruction apparatus implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory, random-access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", and any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of additional identical elements in the process, method, article, or device that includes the element.
It will be understood by those skilled in the art that embodiments of the present application may be provided as a method, a system, or a computer program product. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical memory) containing computer-usable program code.
The foregoing descriptions are merely embodiments of the present application and are not intended to limit the present application. For those skilled in the art, the present application may have various modifications and variations. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall fall within the scope of the claims of the present application.
Claims (14)
1. A method for preventing maloperation, characterized by comprising:
determining the position coordinates, on a screen, of a touch point of a gesture operation corresponding to a received operation instruction;
obtaining eye-motion information of a user;
determining, according to the obtained eye-motion information, a screen region gazed at by the user's eyes;
judging whether the position coordinates of the touch point fall within the screen region gazed at by the user's eyes;
when the position coordinates of the touch point fall within the screen region gazed at by the user's eyes, determining that the operation instruction corresponding to the gesture operation is an effective operation, and performing an operation corresponding to the operation instruction; and
when the position coordinates of the touch point do not fall within the screen region gazed at by the user's eyes, determining that the operation instruction corresponding to the gesture operation is a maloperation, and refusing to perform the operation corresponding to the operation instruction.
2. The method according to claim 1, characterized in that obtaining the eye-motion information of the user specifically comprises:
collecting gaze points of the user's eyes with a camera at a preset time interval;
determining at least two gaze points according to the order of their acquisition times; and
determining the eye-motion information of the user from the at least two determined gaze points.
3. The method according to claim 2, characterized in that the eye-motion information comprises:
an eye-motion direction; and
an eye-motion distance.
4. The method according to claim 1, characterized in that determining, according to the obtained eye-motion information, the screen region gazed at by the user's eyes specifically comprises:
analyzing the collected eye-motion information of the user with an eye tracker to determine a gaze direction of the user's eyes; and
determining, according to the determined gaze direction, the screen region gazed at by the user's eyes.
5. The method according to claim 4, characterized in that determining, according to the determined gaze direction, the screen region gazed at by the user's eyes specifically comprises:
determining, according to the gaze direction, positions on the screen of at least two gaze points of the user's eyes; and
determining, from the positions of the at least two gaze points on the screen, the screen region gazed at by the user's eyes.
6. A device for preventing maloperation, characterized by comprising:
a touch-point position-coordinate determining unit, configured to determine the position coordinates, on a screen, of a touch point of a gesture operation corresponding to a received operation instruction;
an eye-motion information obtaining unit, configured to obtain eye-motion information of a user;
a screen-position-region determining unit, configured to determine, according to the obtained eye-motion information, a screen region gazed at by the user's eyes;
a judging unit, configured to judge whether the position coordinates of the touch point fall within the screen region gazed at by the user's eyes; and
an execution unit, configured to: when the judging unit judges that the position coordinates of the touch point fall within the screen region gazed at by the user's eyes, determine that the operation instruction corresponding to the gesture operation is an effective operation, and perform an operation corresponding to the operation instruction; and when the judging unit judges that the position coordinates of the touch point do not fall within the screen region gazed at by the user's eyes, determine that the operation instruction corresponding to the gesture operation is a maloperation, and refuse to perform the operation corresponding to the operation instruction.
7. The device according to claim 6, characterized in that the eye-motion information obtaining unit is specifically configured to:
collect gaze points of the user's eyes with a camera at a preset time interval;
determine at least two gaze points according to the order of their acquisition times; and
determine the eye-motion information of the user from the at least two determined gaze points.
8. The device according to claim 7, characterized in that the eye-motion information comprises:
an eye-motion direction; and
an eye-motion distance.
9. The device according to claim 6, characterized in that the screen-position-region determining unit is specifically configured to:
analyze the collected eye-motion information of the user with an eye tracker to determine a gaze direction of the user's eyes; and
determine, according to the determined gaze direction, the screen region gazed at by the user's eyes.
10. The device according to claim 9, characterized in that the screen-position-region determining unit is specifically configured to:
determine, according to the gaze direction, positions on the screen of at least two gaze points of the user's eyes; and
determine, from the positions of the at least two gaze points on the screen, the screen region gazed at by the user's eyes.
11. A mobile terminal with a maloperation-prevention function, characterized by comprising:
a memory, configured to store program instructions;
a processor, coupled to the memory and configured to read the program instructions stored in the memory and, in response, perform the following operations: determining the position coordinates, on a screen, of a touch point of a gesture operation corresponding to a received operation instruction; determining, according to obtained eye-motion information, a screen region gazed at by the user's eyes; judging whether the position coordinates of the touch point fall within the screen region gazed at by the user's eyes; when the position coordinates of the touch point fall within the screen region gazed at by the user's eyes, determining that the operation instruction corresponding to the gesture operation is an effective operation, and performing an operation corresponding to the operation instruction; and when the position coordinates of the touch point do not fall within the screen region gazed at by the user's eyes, determining that the operation instruction corresponding to the gesture operation is a maloperation, and refusing to perform the operation corresponding to the operation instruction; and
a camera, configured to obtain the eye-motion information of the user under the control of the processor.
12. The mobile terminal according to claim 11, characterized in that the camera is specifically configured to:
collect gaze points of the user's eyes at a preset time interval;
determine at least two gaze points according to the order of their acquisition times; and
determine the eye-motion information of the user from the at least two determined gaze points.
13. The mobile terminal according to claim 11, characterized in that the processor is specifically configured to:
analyze the collected eye-motion information of the user with an eye tracker to determine a gaze direction of the user's eyes; and
determine, according to the determined gaze direction, the screen region gazed at by the user's eyes.
14. The mobile terminal according to claim 13, characterized in that the processor is specifically configured to:
determine, according to the gaze direction, positions on the screen of at least two gaze points of the user's eyes; and
determine, from the positions of the at least two gaze points on the screen, the screen region gazed at by the user's eyes.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611246265.5A CN106814854A (en) | 2016-12-29 | 2016-12-29 | A kind of method and device for preventing maloperation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106814854A true CN106814854A (en) | 2017-06-09 |
Family
ID=59110225
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611246265.5A Pending CN106814854A (en) | 2016-12-29 | 2016-12-29 | A kind of method and device for preventing maloperation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106814854A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102841683A (en) * | 2012-07-24 | 2012-12-26 | 东莞宇龙通信科技有限公司 | Application starting method and communication terminal of application |
CN103336582A (en) * | 2013-07-30 | 2013-10-02 | 黄通兵 | Motion information control human-computer interaction method |
CN103631483A (en) * | 2013-11-27 | 2014-03-12 | 华为技术有限公司 | Positioning method and positioning device |
CN104349002A (en) * | 2013-07-30 | 2015-02-11 | 柯尼卡美能达株式会社 | Operating device and image processing apparatus |
Non-Patent Citations (3)
Title |
---|
Liu Fangping: "Adding foveated rendering and eye-control interaction to VR: 7invensun releases a VR eye-tracking module", 《HTTPS://WWW.LEIPHONE.COM/NEWS/201611/MHWSXI4MRSVDDVM4.HTML》 * |
Cheng Li et al.: "Application research of eye trackers in advertising psychology", 《Business Economy》 * |
Zheng Yuwei et al.: "Applications of eye-tracking technology in multimedia learning: a review of research from 2005 to 2015", 《E-Education Research》 * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019024056A1 (en) * | 2017-08-03 | 2019-02-07 | 华为技术有限公司 | Anti-misoperation method and terminal |
CN107908147A (en) * | 2017-09-25 | 2018-04-13 | 太原重型机械集团工程技术研发有限公司 | Anti-misoperation system and method |
US11055517B2 (en) | 2018-03-09 | 2021-07-06 | Qisda Corporation | Non-contact human input method and non-contact human input system |
CN109101110A (en) * | 2018-08-10 | 2018-12-28 | 北京七鑫易维信息技术有限公司 | A kind of method for executing operating instructions, device, user terminal and storage medium |
CN109343760A (en) * | 2018-10-25 | 2019-02-15 | 联想(北京)有限公司 | A kind of control method and device |
CN109597489A (en) * | 2018-12-27 | 2019-04-09 | 武汉市天蝎科技有限公司 | A kind of method and system of the eye movement tracking interaction of near-eye display device |
CN111435284A (en) * | 2019-01-15 | 2020-07-21 | 腾讯科技(深圳)有限公司 | Method and device for detecting mistaken clicking of media content display position and storage medium |
CN111435284B (en) * | 2019-01-15 | 2023-09-22 | 腾讯科技(深圳)有限公司 | Method, device and storage medium for detecting false click of media content display bit |
CN110244853A (en) * | 2019-06-21 | 2019-09-17 | 四川众信互联科技有限公司 | Gestural control method, device, intelligent display terminal and storage medium |
CN111142656A (en) * | 2019-07-29 | 2020-05-12 | 广东小天才科技有限公司 | Content positioning method, electronic equipment and storage medium |
CN111142656B (en) * | 2019-07-29 | 2024-03-19 | 广东小天才科技有限公司 | Content positioning method, electronic equipment and storage medium |
CN111158507A (en) * | 2019-10-18 | 2020-05-15 | 广东小天才科技有限公司 | Method for determining designated content and electronic equipment |
CN111158507B (en) * | 2019-10-18 | 2024-03-19 | 广东小天才科技有限公司 | Specified content determining method and electronic equipment |
CN110929241B (en) * | 2019-11-12 | 2023-05-16 | 北京字节跳动网络技术有限公司 | Method and device for quickly starting small program, medium and electronic equipment |
CN110929241A (en) * | 2019-11-12 | 2020-03-27 | 北京字节跳动网络技术有限公司 | Rapid start method, device, medium and electronic equipment of small program |
CN111432131A (en) * | 2020-04-30 | 2020-07-17 | 广东小天才科技有限公司 | Photographing frame selection method and device, electronic equipment and storage medium |
CN113885735A (en) * | 2021-11-03 | 2022-01-04 | 业成科技(成都)有限公司 | Touch display device, control method thereof and electronic device |
CN113885735B (en) * | 2021-11-03 | 2023-03-21 | 业成科技(成都)有限公司 | Touch display device, control method thereof and electronic device |
CN114115532A (en) * | 2021-11-11 | 2022-03-01 | 珊瑚石(上海)视讯科技有限公司 | AR labeling method and system based on display content |
CN114115532B (en) * | 2021-11-11 | 2023-09-29 | 珊瑚石(上海)视讯科技有限公司 | AR labeling method and system based on display content |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106814854A (en) | A kind of method and device for preventing maloperation | |
Oh et al. | The challenges and potential of end-user gesture customization | |
US20140049462A1 (en) | User interface element focus based on user's gaze | |
CN105446673B (en) | The method and terminal device of screen display | |
US10540083B2 (en) | Use of hand posture to improve text entry | |
US20140320434A1 (en) | Method for gesture control | |
DE112011101203T5 (en) | Portable electronic device and method for its control | |
CN107666987A (en) | Robotic process automates | |
CN103797481B (en) | Search based on gesture | |
CN104335561A (en) | Biometric initiated communication | |
Menges et al. | Improving user experience of eye tracking-based interaction: Introspecting and adapting interfaces | |
CN102087582A (en) | Automatic scrolling method and device | |
CN104035677A (en) | Displaying method and device of prompt information | |
CN106445972B (en) | Page display method and device | |
Kumar et al. | Eye-controlled interfaces for multimedia interaction | |
CN107194213A (en) | A kind of personal identification method and device | |
US9684445B2 (en) | Mobile gesture reporting and replay with unresponsive gestures identification and analysis | |
EP2458489A2 (en) | Portable device and method for operating portable device | |
CN105677194A (en) | Method and terminal for selecting objects | |
CN103530044B (en) | page gesture triggering method and device | |
Müller et al. | Designing for noticeability: Understanding the impact of visual importance on desktop notifications | |
CN110647268B (en) | Control method and control device for display window in game | |
CN105874411B (en) | A kind of processing operation method and terminal | |
He et al. | Mobile | |
Maleckar et al. | Evaluation of common input devices for web browsing: Mouse vs touchpad vs touchscreen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20170609 ||