CN107037951B - Automatic operation mode identification method and terminal - Google Patents

Automatic operation mode identification method and terminal

Info

Publication number
CN107037951B
CN107037951B
Authority
CN
China
Prior art keywords
operation mode
coordinate point
area
touch
contact
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610076799.1A
Other languages
Chinese (zh)
Other versions
CN107037951A (en)
Inventor
姚均营
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN201610076799.1A priority Critical patent/CN107037951B/en
Priority to PCT/CN2016/080055 priority patent/WO2016197714A1/en
Publication of CN107037951A publication Critical patent/CN107037951A/en
Application granted
Publication of CN107037951B publication Critical patent/CN107037951B/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an automatic operation mode identification method and a terminal. The method comprises the following steps: acquiring touch events acting on a touch screen and extracting the input events they generate; calling pre-established standard models of the operation modes, matching the standard models against the input events, and taking the operation mode corresponding to the matched standard model as the identified operation mode. The method can intelligently identify the user's operation mode on the basis of the terminal's existing input system, without adding any hardware module; it provides a basis for automatically adjusting the terminal settings and substantially improves the user experience of the product.

Description

Automatic operation mode identification method and terminal
Technical Field
The present invention relates to the field of communications, and in particular, to an automatic operation mode identification method and a terminal.
Background
The screens of intelligent terminals are becoming larger and larger, which causes great inconvenience for one-handed operation. The interface and keys of a mobile terminal do not change with the user's habits. For example, the return key is among the most frequently used keys; if it is placed at the far right of the screen, it is very inconvenient to reach when the user operates with the left hand.
Several proposals exist for intelligently distinguishing left-hand from right-hand operation, for example:
Scheme I: dynamically configuring the keys by judging the screen rotation state of the terminal;
Scheme II: automatically judging the user's hand-held state through light-sensing modules arranged on both sides of the terminal;
Scheme III: identifying left-handed or right-handed operation by detecting, with a deflection-angle sensor built into the terminal, the offset angle of the terminal during user operation;
Scheme IV: identifying left-handed or right-handed operation by acquiring the fingerprint pattern of the fingers.
These existing schemes solve the problem of intelligently judging left-hand or right-hand operation to a certain extent, but they still have disadvantages and room for improvement. Specifically:
Schemes I and III obtain the rotation angle of the terminal from a sensor and then judge the operating hand. Misjudgment may occur because the terminal can assume various rotation angles whether the user operates with the left hand or the right hand, so the rotation angle alone is an insufficient basis for judgment;
Scheme II requires additional light-sensing modules, which increases device cost and structural complexity, and its applicability is limited;
Scheme IV is based on pattern and fingerprint recognition. It needs an additional pattern-recognition module to identify the pattern with which the user presses the screen, plus a fingerprint-recognition function. These require extra hardware or module support, increasing terminal cost. Moreover, recognizing the pattern or fingerprint requires nontrivial algorithms and computation, consumes more power, and the image-recognition algorithm occupies the CPU and affects terminal performance. Finally, pattern or fingerprint recognition itself is not completely accurate, so misjudgment remains possible.
Disclosure of Invention
The invention provides an automatic operation mode identification method and a terminal, to solve the problems that operation mode identification in the prior art depends on extra hardware and has a high identification cost.
According to an aspect of the present invention, there is provided an operation mode automatic recognition method, including:
acquiring a touch event acting on a touch screen, and extracting the input event generated by the touch event; the input event includes: the position of the touch event's contact center coordinate point on the touch screen, and the orientation of the contact ellipse area at the position of the contact center coordinate point;
and calling a pre-established standard model of each operation mode, matching the standard models according to the input event, and taking the operation mode corresponding to the matched standard model as the identified operation mode.
In accordance with another aspect of the present invention, there is provided a terminal comprising:
the touch screen driver is used for acquiring a touch event acting on the touch screen;
the processor is used for extracting the input event generated by the touch event collected by the touch screen driver; calling the pre-established standard models of the operation modes, matching them against the input event, and taking the operation mode corresponding to the matched standard model as the identified operation mode; the input event includes: the position of the touch event's contact center coordinate point on the touch screen, and the orientation of the contact ellipse area at that coordinate point.
The invention has the following beneficial effects:
According to the scheme provided by the invention, the user's operation mode can be intelligently identified on the basis of the terminal's existing input system without adding any hardware module; the terminal settings can then be adjusted automatically according to the identified operation mode, which facilitates user operation and greatly improves the user experience of the product.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a method for automatically identifying a terminal operation mode according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating the meaning of a multi-touch protocol input event parameter according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating division of a terminal screen area according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating characteristics of an input event during a single-handed operation of a left hand according to an embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating features of an input event during a right-handed single-handed operation according to an embodiment of the present invention;
fig. 6 is a block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to overcome the defects of the operation mode identification approaches in the prior art, an embodiment of the present invention provides an automatic operation mode identification method. The terminal's touch screen input events follow the multi-touch protocol: while the user operates the screen, the touch screen driver reports input events to upper-layer programs according to the rules of the multi-touch protocol.
As shown in fig. 1, an embodiment of the present invention provides an automatic operation mode identification method, where the method includes the following steps:
Step S101: acquire a touch event acting on the touch screen and extract the input event generated by the touch event. The input event includes: the position of the touch event's contact center coordinate point, and the orientation of the contact ellipse area at that position; preferably, the input event further includes the size of the contact ellipse area at the position of the contact center coordinate point;
As shown in fig. 2, the input events defined by the multi-touch protocol that are used in the present invention are explained as follows:
ABS_MT_POSITION_X, representing the X-axis coordinate of the contact center of the touch event;
ABS_MT_POSITION_Y, representing the Y-axis coordinate of the contact center of the touch event;
ABS_MT_TOUCH_MAJOR, representing the major axis of the touch event's contact ellipse area;
ABS_MT_TOUCH_MINOR, representing the minor axis of the touch event's contact ellipse area;
ABS_MT_ORIENTATION, representing the orientation of the touch event's contact ellipse area; specifically, a value of 0 is reported when the major axis of the ellipse is aligned with the Y axis of the screen coordinate system, a negative value when the ellipse is rotated to the left, and a positive value when it is rotated to the right.
Step S102: call the pre-established standard models of the operation modes, match them against the input event, and take the operation mode corresponding to the matched standard model as the identified operation mode.
In this embodiment, the standard model of each operation mode may be an inherent configuration of the terminal (i.e. a factory default), but preferably it is established by collecting the user's own touch operations so that it conforms to the user's habits. In that case, the standard model is established as follows:
(1) provide an interactive interface and prompt the user to operate in one operation mode;
(2) collect a certain number of touch events as standard sample events, and record the input events they generate;
(3) build a standard model of the current operation mode from the input events generated by the recorded standard sample events;
(4) prompt the user to operate in the next operation mode, and repeat the collection and modeling process until all operation modes have been modeled.
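The modeling step (3) is not specified in detail in the text. One plausible sketch, assuming the model stores the average ellipse geometry per coarse screen cell (the grid size `CELL` is a hypothetical stand-in for "same or similar coordinate point positions"):

```python
from collections import defaultdict
from statistics import mean

CELL = 100  # hypothetical grid-cell size (pixels) grouping "similar" coordinates

def build_standard_model(samples):
    """Build one operation mode's standard model from standard sample events.

    samples: iterable of (x, y, touch_major, touch_minor, orientation) tuples.
    Returns {grid_cell: mean contact-ellipse geometry observed in that cell}."""
    cells = defaultdict(list)
    for x, y, major, minor, orient in samples:
        cells[(x // CELL, y // CELL)].append((major, minor, orient))
    return {
        cell: {
            "major": mean(v[0] for v in values),
            "minor": mean(v[1] for v in values),
            "orientation": mean(v[2] for v in values),
        }
        for cell, values in cells.items()
    }
```

Averaging per cell is just one way to summarize the recorded samples; the patent leaves the aggregation strategy open.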
In one embodiment of the present invention, the standard model of each operation mode comprises: the orientation of the contact ellipse area at a plurality of coordinate point positions within the operation area corresponding to each operation mode; preferably, it further comprises the size of the contact ellipse area at those coordinate point positions.
When matching standard models against the input event, the main considerations are the operation area to which the touch event's contact center coordinate point belongs, and the similarity between the orientation of the contact ellipse at that point and the orientation of the contact ellipse at the same or a nearby coordinate point in the standard model corresponding to that operation area;
preferably, to improve accuracy, the similarity between the size of the contact ellipse at the contact center coordinate point and the size of the contact ellipse at the same or a nearby coordinate point in the corresponding standard model is also considered.
Preferably, to improve recognition accuracy, the input events of multiple touch events are used for standard model matching in this embodiment. In that case the matching mainly considers (a) the proportion of contact center coordinate points falling within the operation area corresponding to each operation mode, and (b) the similarity between the orientation (or orientation and size) of the contact ellipse at each contact center coordinate point and that of the contact ellipse at the same or a nearby coordinate point in the standard model corresponding to the operation area to which the point belongs. For example, when both the proportion and the similarity reach their respective set thresholds, a given operation mode can be considered matched. Those skilled in the art can also customize the matching strategy within this inventive concept.
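A hedged sketch of such a two-threshold matching strategy follows; the area predicate, threshold values and orientation tolerance are illustrative assumptions, not figures from the patent:

```python
def matches_mode(samples, in_operation_area, model_orientation,
                 area_ratio_thresh=0.7, orient_tol=20):
    """Two-threshold check for one operation mode.

    samples:            list of (x, y, orientation) for a batch of touch events
    in_operation_area:  predicate (x, y) -> bool for the mode's operation area
    model_orientation:  expected contact-ellipse orientation in that area
    """
    if not samples:
        return False
    in_area = [(x, y, o) for x, y, o in samples if in_operation_area(x, y)]
    # (a) proportion of contact centers falling inside the operation area
    if len(in_area) / len(samples) < area_ratio_thresh:
        return False
    # (b) orientation similarity against the standard model
    close = sum(1 for _, _, o in in_area if abs(o - model_orientation) <= orient_tol)
    return close / len(in_area) >= area_ratio_thresh
```

A fuller implementation would also compare ellipse sizes, as the preferred embodiment suggests.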
In addition, after the operation mode is identified, the result can be reported to the user so that the user can confirm whether it is correct. If it is correct, the terminal judges whether the collected input events of the touch events are already in the standard model base of that operation mode; if so, no action is taken; otherwise, the input events not yet in the base are stored into it, realizing self-learning of the standard model.
Further, after the operation mode is identified, the method of this embodiment further includes: adjusting the layout of the terminal's operation interface according to the identified operation mode, i.e. switching the layout to the one corresponding to the current operation mode. The layout of the operation interface includes the layout of the virtual keyboard.
Thus, on the basis of the terminal's existing input system and without adding extra hardware or software modules, the method can intelligently identify the user's operation mode and automatically adjust the terminal's operation interface settings, facilitating user operation and substantially improving the user experience of the product.
To explain the invention more clearly, a preferred embodiment is given below with reference to figs. 3 to 5. It illustrates the method in more detail by disclosing further technical details, which serve to explain the invention and are not meant to limit it.
As shown in fig. 3, the terminal screen in this embodiment is divided into four areas A, B, C and D. When the user operates the terminal with the left hand alone, the coordinates of touch events are concentrated mainly in areas A and C; with the right hand alone, mainly in areas B and C. Area D is near the top of the screen, and few touch event coordinates fall there during single-handed operation.
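Since the text does not give the exact boundaries of areas A to D from fig. 3, the following sketch uses an assumed split purely for illustration:

```python
# The exact boundaries of areas A, B, C and D come from fig. 3 and are not
# stated in the text; this split is an illustrative assumption only.
WIDTH, HEIGHT = 1080, 1920   # hypothetical screen resolution
TOP_BAND = HEIGHT // 3       # area D: upper part, rarely reached one-handed

def screen_area(x, y):
    """Map a contact center coordinate point to area 'A', 'B', 'C' or 'D'."""
    if y < TOP_BAND:
        return "D"               # near the top of the screen
    if x < WIDTH // 3:
        return "A"               # lower-left: left-thumb territory
    if x > 2 * WIDTH // 3:
        return "B"               # lower-right: right-thumb territory
    return "C"                   # lower-center: reachable by either thumb
```

The patent describes A and C (and B and C) as sector-shaped regions around the thumb's reach; rectangular bands are used here only to keep the example short.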
As shown in fig. 4, in left-handed single-handed operation the touch event coordinates are concentrated mainly in areas A and C, and the size and orientation of the contact ellipse at each position in these areas are regular.
As shown in fig. 5, in right-handed single-handed operation the touch event coordinates are concentrated mainly in areas B and C, and the size and orientation of the contact ellipse at each position in these areas are regular.
As can be seen from figs. 4 and 5, touch event coordinates fall in area C under both left-handed and right-handed operation, but the orientation of the contact ellipse in area C differs between the two cases, generally by about 90 degrees; that is, the sign of the ABS_MT_ORIENTATION value differs.
As can be seen from figs. 3 to 5, when the user operates with the left hand alone, the touched positions are concentrated mainly in the sector at the lower-left corner of the screen, and the size and orientation of the contact ellipse are substantially consistent across operations at each position in that area; that is, at a fixed position the ABS_MT_TOUCH_MAJOR and ABS_MT_TOUCH_MINOR values of touch events remain substantially unchanged, and the sign of ABS_MT_ORIENTATION remains substantially unchanged. When the user operates with the right hand alone, the touched positions are concentrated mainly in the sector at the lower-right corner, and likewise the size and orientation of the contact ellipse are substantially consistent at each position in that area: the ABS_MT_TOUCH_MAJOR and ABS_MT_TOUCH_MINOR values remain substantially unchanged, and the sign of ABS_MT_ORIENTATION remains substantially unchanged.
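The sign regularity of ABS_MT_ORIENTATION suggests a simple heuristic for the shared area C. Which sign corresponds to which hand is a hypothetical convention in this sketch; the text only states that the signs differ between left-handed and right-handed operation:

```python
def likely_hand(area_c_orientations):
    """Guess the operating hand from ABS_MT_ORIENTATION values seen in area C.

    The sign-to-hand mapping below is an assumption for illustration; the
    patent only says the signs differ between the two single-handed modes."""
    pos = sum(1 for o in area_c_orientations if o > 0)
    neg = sum(1 for o in area_c_orientations if o < 0)
    if neg > pos:
        return "left"    # assumed: left thumb tilts the ellipse leftward
    if pos > neg:
        return "right"   # assumed: right thumb tilts the ellipse rightward
    return "unknown"
```

A real identifier would combine this with the area-distribution check rather than rely on orientation alone.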
Therefore, the invention can establish the standard model of each operation mode as the judgment basis for the automatic identification of the operation mode.
In this embodiment, the standard model modeling manner of each operation mode includes:
step 1, collecting input events when the left-hand single-hand operation is performed. In order to improve the identification accuracy, in specific implementation, the terminal can provide an interactive interface for the user, prompt the user to use left-handed single-handed operation, and collect events when a certain number of screens are operated by the left-handed single-handed operation as standard sample events.
The specific collected content comprises input events generated by touch operations such as clicking and sliding performed by a user in a range which can be reached by a left hand of the user by using a left-hand single-hand operation terminal.
Step 2: generate the standard input event model of the left-handed single-handed operation mode. After the input event collection is complete, standard modeling is performed on the collected event feature points to obtain the left-handed single-handed operation input event standard model.
The modeling content includes: the range of input coordinate points determined by ABS_MT_POSITION_X and ABS_MT_POSITION_Y, from which the areas A and C shown in fig. 3 are established; and the size and orientation of the contact ellipse at each contact center coordinate point, determined by ABS_MT_TOUCH_MAJOR, ABS_MT_TOUCH_MINOR and ABS_MT_ORIENTATION.
Step 3: collect input events during right-handed single-handed operation; the process is similar to step 1.
Step 4: generate the standard input event model of the right-handed single-handed operation mode; the process is similar to step 2.
A two-handed operation mode standard model could be obtained in the same way as the left-hand and right-hand models. However, considering that a user has no more than three operation modes (left-handed single-handed, right-handed single-handed and two-handed operation), the two-handed standard model need not be established: two-handed operation can be identified by excluding both single-handed modes.
Further, on the basis of the standard models established in steps 2 and 4, the ranges of areas A, B, C and D shown in fig. 3 are synthesized, and the difference in ellipse orientation within area C between left-handed and right-handed single-handed operation is obtained by analysis.
After the standard models are established, they can be used to identify the operation mode automatically. The identification process is as follows:
S1: initialize recognition by clearing the cached input events, then begin identifying the operation mode.
S2: collect touch events from a certain number of consecutive user operations on the touch screen, recording the sequences of ABS_MT_POSITION_X, ABS_MT_POSITION_Y, ABS_MT_TOUCH_MAJOR, ABS_MT_TOUCH_MINOR and ABS_MT_ORIENTATION events;
S3: compute the distribution of the coordinate positions determined by ABS_MT_POSITION_X and ABS_MT_POSITION_Y in the collected event sequence, and statistically analyze the size and orientation of the contact ellipse, determined by ABS_MT_TOUCH_MAJOR, ABS_MT_TOUCH_MINOR and ABS_MT_ORIENTATION, at each coordinate position.
S4: determine, according to the pre-established standard models of the operation modes, which operation mode the statistical result conforms to; specifically:
when the proportion of coordinates in the collected touch events distributed in areas A and C at the lower-left corner of the screen reaches a set threshold, and the similarity between the size and orientation of the contact ellipse at each contact center coordinate point and those of the ellipse at the same coordinate point (or, where the standard model holds no data for that exact point, at a nearby point) in the left-handed single-handed standard model reaches a set threshold, the user is considered to be currently operating left-handed with one hand.
When the proportion of coordinates in the collected touch events distributed in areas B and C at the lower-right corner of the screen reaches a set threshold, and the similarity between the size and orientation of the contact ellipse at each contact center coordinate point and those of the ellipse at the same or a nearby coordinate point in the right-handed single-handed standard model reaches a set threshold, the user is considered to be currently operating right-handed with one hand.
When the proportions of coordinates in the respective areas do not reach the thresholds, and the similarity of the contact ellipse size and orientation at each coordinate position does not reach the set threshold, the user is considered to be currently operating with both hands.
The thresholds can be set flexibly as required; their values may be the same or different.
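Steps S3 and S4 can be sketched as follows; the grid-cell lookup, thresholds and orientation tolerance are illustrative assumptions standing in for the patent's unspecified values:

```python
CELL = 200  # illustrative grid-cell size for "same or nearby coordinate point"

def identify_mode(samples, left_model, right_model,
                  ratio_thresh=0.6, orient_tol=15):
    """Classify a batch of (x, y, orientation) samples as 'left', 'right' or 'both'.

    Each model maps a grid cell (x // CELL, y // CELL) to the expected
    contact-ellipse orientation recorded for that operation mode."""
    def score(model):
        matched = 0
        for x, y, o in samples:
            expected = model.get((x // CELL, y // CELL))
            if expected is not None and abs(o - expected) <= orient_tol:
                matched += 1
        return matched / len(samples)

    left_score, right_score = score(left_model), score(right_model)
    if left_score >= ratio_thresh and left_score >= right_score:
        return "left"
    if right_score >= ratio_thresh:
        return "right"
    return "both"   # neither single-handed model matched: assume two-handed
```

The `"both"` branch mirrors the patent's exclusion logic: two-handed operation is inferred when neither single-handed model matches well enough.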
In a preferred embodiment, to improve recognition efficiency, only the distribution ratio of coordinate positions and the orientation of the contact ellipse in area C may be considered, supplemented by judging the size and orientation of the contact ellipse at each coordinate position in area A; whether the area-A judgment is used as a basis can be selected by a setting.
This preferred embodiment is illustrated below, taking the identification of left-handed single-handed operation as an example:
First, the areas containing the coordinate positions of the touch events are analyzed; when the proportion of touch event centers concentrated in areas A and C of fig. 3 reaches a certain threshold, one necessary condition for left-handed single-handed operation is judged to be met.
Second, the orientation of the contact ellipse of the touch events in area C is analyzed and compared with the orientation of the contact ellipse in area C of the left-handed single-handed standard model; when the similarity reaches a preset threshold, another necessary condition is judged to be met.
Third, when the similarity of the supplementary area-A judgment (the size and orientation of the contact ellipse at each coordinate position in area A) reaches a preset threshold, the supplementary condition for left-handed single-handed operation is judged to be met.
Finally, whether the user is operating left-handed with one hand is judged comprehensively from the above results. The identification of right-handed single-handed operation is similar and is not repeated here.
S5: after the operation mode is successfully identified, the relevant terminal settings, including the virtual keyboard layout and the operation interface layout, are adjusted automatically according to the result, and a message pops up to inform the user that the terminal has been set to left-hand or right-hand operation mode.
S6: if the automatic single-handed-operation identification switch remains on, the flow returns to S1 to start the next round of identification; when the switch is turned off, the flow ends.
An embodiment of the present invention further provides a terminal, as shown in fig. 6, including:
a touch screen driver 610 for acquiring a touch event acting on the touch screen;
a processor 620, configured to extract the input event generated by the touch event collected by the touch screen driver 610; to call the pre-established standard models of the operation modes, match them against the input event, and take the operation mode corresponding to the matched standard model as the identified operation mode; the input event includes: the position of the touch event's contact center coordinate point on the touch screen, and the orientation of the contact ellipse area at that coordinate point.
Based on the above structure and principle, several specific preferred embodiments are given below to refine and optimize the functions of the terminal, making the implementation of the scheme of the present invention more convenient and accurate. Specifically:
in this embodiment, the operation mode includes: a left-handed, one-handed operation mode, a right-handed, one-handed operation mode, and a two-handed operation mode.
Further, in this embodiment, the pre-established standard model of each operation mode includes: the direction of the contact elliptical area at a plurality of coordinate point positions in the operation area corresponding to each operation mode.
In an embodiment of the invention, the processor 620 performs standard model matching according to the operation area to which the contact center coordinate point of the touch event belongs, and the similarity between the direction of the contact elliptical area at the contact center coordinate point and the direction of the contact elliptical area at the same or a similar coordinate point in the standard model corresponding to that operation area.
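As a rough sketch of this matching step (the text does not specify the similarity measure, so the nearest-point lookup, the 0.7 threshold, and all function names below are assumptions): each standard model maps stored coordinate points to contact-ellipse orientations, and a touch is matched to the mode whose model holds the most similar orientation at the same or a nearby point.

```python
import math

def direction_similarity(angle_a, angle_b):
    """Similarity of two contact-ellipse orientations in degrees (1.0 = identical).
    An ellipse's orientation is ambiguous by 180 degrees, so fold the difference."""
    diff = abs(angle_a - angle_b) % 180.0
    diff = min(diff, 180.0 - diff)  # now in [0, 90]
    return 1.0 - diff / 90.0


def match_mode(models, point, angle, threshold=0.7):
    """Return the operation mode whose standard model best matches the contact
    ellipse orientation at the touch point, or None if nothing is close enough.

    models: {mode_name: {(x, y): orientation_degrees}} (an assumed layout)."""
    best_mode, best_sim = None, threshold
    for mode, model in models.items():
        # "same or similar coordinate point": use the nearest stored point
        nearest = min(model, key=lambda p: math.dist(p, point))
        sim = direction_similarity(angle, model[nearest])
        if sim > best_sim:
            best_mode, best_sim = mode, sim
    return best_mode
```

With a left-hand model storing a 45° ellipse near the lower-left corner and a right-hand model storing 135° near the lower-right, a touch at (110, 490) with a 50° ellipse would match the left-hand mode.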
In a preferred embodiment of the invention, the input event further includes: the size of the contact elliptical area at the position of the contact center coordinate point; and the pre-established standard model of each operation mode further includes: the size of the contact elliptical area at a plurality of coordinate point positions in the operation area corresponding to each operation mode.
In this case, the processor 620 performs standard model matching according to the operation area to which the contact center coordinate point of the touch event belongs, and the similarity between the direction and size of the contact elliptical area at the contact center coordinate point and the direction and size of the contact elliptical area at the same or a similar coordinate point in the standard model corresponding to that operation area.
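One way to fold size into the match (the 0.6/0.4 weights and the area-ratio measure below are assumptions, not values from the text) is a weighted combination of orientation similarity and area similarity:

```python
def size_similarity(area_a, area_b):
    """Similarity of two contact-ellipse areas (1.0 = identical), as a ratio."""
    if max(area_a, area_b) == 0:
        return 1.0
    return min(area_a, area_b) / max(area_a, area_b)


def combined_similarity(angle_a, angle_b, area_a, area_b, w_dir=0.6, w_size=0.4):
    """Weighted mix of direction and size similarity; the split is illustrative."""
    diff = abs(angle_a - angle_b) % 180.0  # ellipse orientation wraps at 180 degrees
    dir_sim = 1.0 - min(diff, 180.0 - diff) / 90.0
    return w_dir * dir_sim + w_size * size_similarity(area_a, area_b)
```

Weighting direction above size reflects that a thumb's reach changes the ellipse angle more reliably than its pressure changes the contact area, but the balance would need tuning against real samples.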
In another preferred embodiment of the present invention, the processor is further configured to provide an interactive interface for the user, collect touch sample events input by the user through the interactive interface in each operation mode, and establish a standard model of each operation mode from the collected touch sample events.
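A minimal sketch of building one such standard model from the collected samples (the 50-pixel grid quantization and plain averaging are assumptions; a robust implementation would average orientations circularly to handle the 0°/180° wrap):

```python
from collections import defaultdict

def build_standard_model(samples, cell=50):
    """Build one operation mode's standard model from touch sample events.

    samples: (x, y, orientation_degrees) tuples collected through the
    interactive interface.  Touch points are snapped to a `cell`-pixel grid,
    and the ellipse orientations falling in the same grid cell are averaged,
    yielding {(x, y): mean_orientation} as the stored model."""
    buckets = defaultdict(list)
    for x, y, angle in samples:
        key = (x // cell * cell, y // cell * cell)  # snap to grid cell origin
        buckets[key].append(angle)
    return {pt: sum(angles) / len(angles) for pt, angles in buckets.items()}
```

The resulting dictionary plays the role of the "standard model" that later touches are compared against at the same or a similar coordinate point.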
In another preferred embodiment of the present invention, the processor is further configured to adjust the layout of the terminal operation interface according to the identified operation mode.
According to the above embodiments, on the basis of the terminal's existing input system and without adding any hardware module, the terminal can intelligently identify the user's operation mode and automatically adjust its operation interface settings, which facilitates user operation and greatly improves the product's user experience.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to across one another, and each embodiment focuses on its differences from the others. The terminal embodiment in particular, being basically similar to the method embodiment, is described relatively briefly; for the relevant points, refer to the corresponding description of the method embodiment.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: ROM, RAM, magnetic or optical disks, and the like.
In summary, the above is only a preferred embodiment of the present invention and is not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (12)

1. An automatic operation mode identification method, characterized by comprising the following steps:
acquiring a touch event acting on a touch screen, and extracting an input event generated by the touch event; the input event includes: the position of the contact center coordinate point of the touch event on the touch screen and the direction of the contact elliptical area at that position;
calling a pre-established standard model of each operation mode, matching the standard models according to the input event, and taking the operation mode corresponding to the matched standard model as the identified operation mode;
the pre-established standard model of each operation mode comprises: the direction of the contact elliptical area at a plurality of coordinate point positions in the operation area corresponding to each operation mode;
performing standard model matching according to the input event specifically comprises:
performing standard model matching according to the operation area to which the contact center coordinate point of the touch event belongs, and the similarity between the direction of the contact elliptical area at the contact center coordinate point position and the direction of the contact elliptical area at the same or a similar coordinate point position in the standard model corresponding to the operation area to which the contact center coordinate point belongs.
2. The method of claim 1, wherein the input event further comprises: the size of the contact elliptical area at the position of the contact center coordinate point;
the pre-established standard model of each operation mode further comprises: the size of the contact elliptical area at a plurality of coordinate point positions in the operation area corresponding to each operation mode.
3. The method of claim 2, wherein performing standard model matching according to the input event specifically comprises:
performing standard model matching according to the operation area to which the contact center coordinate point of the touch event belongs, and the similarity between the direction and size of the contact elliptical area at the contact center coordinate point position and the direction and size of the contact elliptical area at the same or a similar coordinate point position in the standard model corresponding to the operation area to which the contact center coordinate point belongs.
4. A method according to any one of claims 1 to 3, wherein establishing the standard model of each operation mode comprises:
providing an interactive interface for a user;
collecting touch sample events input by the user through the interactive interface in each operation mode;
establishing a standard model of each operation mode according to the collected touch sample events.
5. The method of claim 1, wherein the operation modes comprise: a left-handed one-handed operation mode, a right-handed one-handed operation mode, and a two-handed operation mode.
6. The method of any one of claims 1 to 3 or 5, further comprising, after identifying the operation mode: adjusting the layout of the terminal operation interface according to the identified operation mode.
7. A terminal, comprising:
the touch screen driver is used for acquiring a touch event acting on the touch screen;
a processor, configured to extract an input event generated by the touch event collected by the touch screen driver; to call a pre-established standard model of each operation mode, perform standard model matching according to the input event, and take the operation mode corresponding to the matched standard model as the identified operation mode; the input event includes: the position of the contact center coordinate point of the touch event on the touch screen and the direction of the contact elliptical area at that position;
the standard model of each operation mode invoked by the processor comprises: the direction of the contact elliptical area at a plurality of coordinate point positions in the operation area corresponding to each operation mode;
the processor is specifically configured to perform standard model matching according to the operation area to which the contact center coordinate point of the touch event belongs, and the similarity between the direction of the contact elliptical area at the contact center coordinate point position and the direction of the contact elliptical area at the same or a similar coordinate point position in the standard model corresponding to the operation area to which the contact center coordinate point belongs.
8. The terminal of claim 7, wherein the input event further comprises: the size of the contact elliptical area at the position of the contact center coordinate point;
the standard model of each operation mode invoked by the processor further comprises: the size of the contact elliptical area at a plurality of coordinate point positions in the operation area corresponding to each operation mode.
9. The terminal of claim 8, wherein the processor is specifically configured to perform standard model matching according to the operation area to which the contact center coordinate point of the touch event belongs, and the similarity between the direction and size of the contact elliptical area at the contact center coordinate point position and the direction and size of the contact elliptical area at the same or a similar coordinate point position in the standard model corresponding to the operation area to which the contact center coordinate point belongs.
10. The terminal according to any one of claims 7 to 9, wherein the processor is further configured to provide an interactive interface for the user, collect touch sample events input by the user through the interactive interface in each operation mode, and establish a standard model of each operation mode from the collected touch sample events.
11. The terminal of claim 7, wherein the operation modes comprise: a left-handed one-handed operation mode, a right-handed one-handed operation mode, and a two-handed operation mode.
12. The terminal according to any one of claims 7 to 9 and 11, wherein the processor is further configured to adjust the layout of the terminal operation interface according to the identified operation mode.
CN201610076799.1A 2016-02-04 2016-02-04 Automatic operation mode identification method and terminal Active CN107037951B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610076799.1A CN107037951B (en) 2016-02-04 2016-02-04 Automatic operation mode identification method and terminal
PCT/CN2016/080055 WO2016197714A1 (en) 2016-02-04 2016-04-22 Method for automatically identifying operation mode, and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610076799.1A CN107037951B (en) 2016-02-04 2016-02-04 Automatic operation mode identification method and terminal

Publications (2)

Publication Number Publication Date
CN107037951A CN107037951A (en) 2017-08-11
CN107037951B true CN107037951B (en) 2020-02-21

Family

ID=57503126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610076799.1A Active CN107037951B (en) 2016-02-04 2016-02-04 Automatic operation mode identification method and terminal

Country Status (2)

Country Link
CN (1) CN107037951B (en)
WO (1) WO2016197714A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933271B (en) * 2017-12-18 2022-06-17 佳能株式会社 Data processing apparatus and method, user interface adjusting apparatus and method, and medium
CN110858120B (en) * 2018-08-24 2023-02-17 北京搜狗科技发展有限公司 Input keyboard recommendation method and device
CN113996058B (en) * 2021-11-01 2023-07-25 腾讯科技(深圳)有限公司 Information processing method, apparatus, electronic device, and computer-readable storage medium
CN114103845B (en) * 2022-01-25 2022-04-15 星河智联汽车科技有限公司 Vehicle central control screen operator identity recognition method and device and vehicle

Citations (3)

Publication number Priority date Publication date Assignee Title
CN103927105A (en) * 2013-01-11 2014-07-16 联想(北京)有限公司 User interface display method and electronic device
CN104281368A (en) * 2014-09-29 2015-01-14 小米科技有限责任公司 Interface display method and device and terminal device
CN104932825A (en) * 2015-06-15 2015-09-23 金陵科技学院 Method for automatically sensing left hand/right hand to operate mobile phone and determining moving thermal region of thumb

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US8154529B2 (en) * 2009-05-14 2012-04-10 Atmel Corporation Two-dimensional touch sensors
CN101916161B (en) * 2010-08-04 2012-10-10 宇龙计算机通信科技(深圳)有限公司 Interface model selection method based on image of region pressed by finger and mobile terminal
WO2012019350A1 (en) * 2010-08-12 2012-02-16 Google Inc. Finger identification on a touchscreen
CN103488406B (en) * 2012-06-11 2016-09-07 中兴通讯股份有限公司 Adjust the method for mobile terminal screen keyboard, device and mobile terminal


Also Published As

Publication number Publication date
WO2016197714A1 (en) 2016-12-15
CN107037951A (en) 2017-08-11

Similar Documents

Publication Publication Date Title
CN106775084B (en) A kind of false-touch prevention method, device and mobile terminal of touch screen
CN106681638B (en) A kind of touch screen control method, device and mobile terminal
CN106527818B (en) Control method, device and the mobile terminal of touch operation on a kind of mobile terminal
CN107037951B (en) Automatic operation mode identification method and terminal
CN107132986B (en) Method and device for intelligently adjusting touch response area through virtual keys
US20130222338A1 (en) Apparatus and method for processing a plurality of types of touch inputs
CN106681554B (en) A kind of control method of mobile terminal touch screen, device and mobile terminal
CN105487809A (en) Terminal control method and device
CN106598335A (en) Touch screen control method and apparatus for mobile terminal, and mobile terminal
CN106775087A (en) A kind of touch-screen control method of mobile terminal, device and mobile terminal
CN105739868B (en) A kind of method and device that identification terminal is accidentally touched
CN104765541A (en) Method and system for identifying whether left hand or right hand operates mobile phone
JP2017529582A (en) Touch classification
CN108874234B (en) Touch identification method and device and touch display device
CN101968714B (en) Method and system for identifying operation locus input on mobile terminal interface
CN104571882A (en) User operating mode judging method and device based on terminal and terminal
TWI528271B (en) Method, apparatus and computer program product for polygon gesture detection and interaction
CN106227520A (en) A kind of application interface changing method and device
CN103870071B (en) One kind touches source discrimination and system
CN105159446A (en) One-hand operation method and apparatus for terminal
CN105487784A (en) Control method and device of electronic terminal, terminal and system
CN105867821A (en) Icon arranging method and device and terminal
CN104866226A (en) Terminal device and method for controlling same
CN105630370A (en) Slide touch operation method and apparatus for terminal, and terminal
CN103294175A (en) Electronic device and method for electronic device to automatically switch input modes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant