WO2015149588A1 - Method for identifying a user operation mode on a handheld device, and handheld device
Method for identifying a user operation mode on a handheld device, and handheld device
- Publication number
- WO2015149588A1 (PCT/CN2015/072531, CN2015072531W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- operation mode
- finger
- sliding
- hand operation
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to the field of human-computer interaction technologies, and in particular, to a method for identifying a user operation mode on a handheld device and a handheld device.
- UI User Interface
- many applications need to manually set the user operation mode, such as the left-hand operation mode or the right-hand operation mode, and then determine the UI presentation mode according to the set operation mode.
- In the prior art, there are two ways for application software to obtain the user operation mode.
- One is manual setting in the application software: the application usually provides the two single-hand operation modes, and the user selects one manually before using the application.
- The other is to obtain the user's operation mode automatically, which is done in two ways: the first identifies the operation mode through a sensor in the mobile phone; the second identifies the left- or right-hand operation mode by calculating the slope of the user's slide on the screen.
- The inventor of the present application found in long-term research and development that using a dedicated sensor to identify the operation mode adds cost, and the accuracy of the judgment depends on the sensitivity of the sensor; the accuracy of calculating the slope of the user's slide on the screen is not high, and the influence of individual differences is also large.
- the technical problem to be solved by the present invention is to provide a method for identifying a user operation mode on a handheld device and a handheld device, which can enrich the way of identifying the user operation mode and increase the accuracy of recognition without additional cost.
- the present invention provides a method for identifying a user operation mode on a handheld device, including: when detecting that a user's finger is sliding on a screen of the handheld device, acquiring sliding information of the user's finger during the sliding process;
- the operation mode of the user is identified according to the sliding information of the user's finger during the sliding process, and the operation mode includes: a left-hand operation mode and a right-hand operation mode.
- The sliding information of the user's finger during the sliding process includes either or both of: a change in the contact area of the user's finger with the screen during sliding, and a change in the sliding acceleration.
- Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: if the contact area of the user's finger with the screen gradually decreases from left to right, identifying the operation mode of the user as the right-hand operation mode; if the contact area of the user's finger with the screen gradually increases from left to right, identifying the operation mode of the user as the left-hand operation mode.
- Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: if the sliding acceleration of the user's finger gradually increases from left to right, identifying the operation mode of the user as the right-hand operation mode; if the sliding acceleration of the user's finger gradually decreases from left to right, identifying the operation mode of the user as the left-hand operation mode.
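The two simple trend rules above (contact area shrinking or growing from left to right) could be sketched as follows. This is an illustrative reconstruction, not code from the patent: the sample format of (x coordinate, contact area) pairs is an assumption, and the same logic would apply to a sequence of sliding-acceleration samples.

```python
# Hypothetical sketch of the trend rule in the claims above; the function
# name and sample format are assumptions, not part of the patent text.
# Each sample is (x_coordinate, contact_area) taken while the finger slides.

def identify_by_contact_area(samples):
    """Return 'right' if contact area shrinks left-to-right, 'left' if it grows."""
    ordered = sorted(samples)                      # order samples by x coordinate
    areas = [area for _, area in ordered]
    if all(a >= b for a, b in zip(areas, areas[1:])):
        return "right"                             # area gradually decreases left to right
    if all(a <= b for a, b in zip(areas, areas[1:])):
        return "left"                              # area gradually increases left to right
    return None                                    # no clear monotonic trend

print(identify_by_contact_area([(10, 9.0), (40, 7.5), (80, 6.0)]))  # right
```

A real implementation would likely tolerate noise (e.g. fit a trend line) rather than require strict monotonicity, but the patent text only states the gradual-change condition.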
- Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: setting the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- if the contact area of the user's finger with the screen gradually decreases from left to right, determining that the user's right-hand operation mode probability increases by the weight value w2; if the contact area of the user's finger with the screen gradually increases from left to right, determining that the user's left-hand operation mode probability increases by the weight value w2; if the sliding acceleration of the user's finger gradually increases from left to right, determining that the user's right-hand operation mode probability increases by the weight value w3; if the sliding acceleration of the user's finger gradually decreases from left to right, determining that the user's left-hand operation mode probability increases by the weight value w3; the two probabilities are then compared to identify the operation mode of the user.
- the sliding information of the user's finger during the sliding process further includes a sliding direction of the user's finger;
- Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process comprises: setting the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- if the sliding direction of the user's finger is to the right, determining that the user's right-hand operation mode probability increases by the weight value w1; if the sliding direction of the user's finger is to the left, determining that the user's left-hand operation mode probability increases by the weight value w1; if the contact area of the user's finger with the screen gradually decreases from left to right, determining that the user's right-hand operation mode probability increases by the weight value w2; if it gradually increases from left to right, determining that the user's left-hand operation mode probability increases by the weight value w2; the corresponding determinations are made for the sliding acceleration with the weight value w3, and the probabilities are then compared to identify the operation mode of the user.
- The sliding information of the user's finger during the sliding process further includes the area into which the pixel points that the user's finger passes during the sliding process fall. Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: setting the weight value of the area into which the pixel points that the user's finger passes fall to w0, the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- if the pixel points that the user's finger passes fall into the right area of the screen, determining that the user's right-hand operation mode probability increases by the weight value w0; if they fall into the left area of the screen, determining that the user's left-hand operation mode probability increases by the weight value w0; if the contact area of the user's finger with the screen gradually decreases from left to right, determining that the user's right-hand operation mode probability increases by the weight value w2; if it gradually increases, determining that the user's left-hand operation mode probability increases by the weight value w2; the corresponding determinations are made for the sliding acceleration with the weight value w3, and the probabilities are then compared to identify the operation mode of the user.
- The weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the user's finger slide on the screen.
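As an illustrative sketch of the weighted-probability scheme in the claims above (not code from the patent), the voting logic might look like the following. The concrete weight values are assumptions: the patent only states that w0 through w3 are set according to the screen size and the shape of the slide.

```python
# Minimal sketch of the weighted decision described in the claims.
# Weight values below are illustrative assumptions, not from the patent.

def identify_mode(region_vote, area_vote, accel_vote, direction_vote,
                  w0=0.2, w1=0.2, w2=0.3, w3=0.3):
    """Each *_vote is 'left', 'right', or None; return the more probable mode."""
    p_left = p_right = 0.0
    for vote, weight in ((region_vote, w0), (direction_vote, w1),
                         (area_vote, w2), (accel_vote, w3)):
        if vote == "right":
            p_right += weight     # right-hand operation mode probability increases
        elif vote == "left":
            p_left += weight      # left-hand operation mode probability increases
    if p_right > p_left:
        return "right"
    if p_left > p_right:
        return "left"
    return None                   # probabilities equal: mode undecided

print(identify_mode("right", "right", "left", None))  # right (0.5 > 0.3)
```

The claims do not specify what happens when the two probabilities are equal; returning an undecided result here is one possible design choice.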
- With reference to any one of the first to sixth possible implementation manners of the first aspect, in an eighth possible implementation manner of the first aspect, the action of the user's finger sliding on the screen is the unlocking action of the screen.
- After the operation mode of the user is identified, the method further includes: automatically switching the display mode of the operation interface of the handheld device according to the operation mode of the user.
- The present invention provides a handheld device, including: a detection module, an acquisition module, and an identification module. The detection module is configured to detect whether a user's finger slides on the screen of the handheld device; the acquisition module is configured to acquire the sliding information of the user's finger during the sliding process when the detection module detects that the user's finger slides on the screen; the identification module is configured to identify the user's operation mode according to the sliding information obtained by the acquisition module, the operation modes including a left-hand operation mode and a right-hand operation mode.
- The sliding information of the user's finger during the sliding process includes either or both of: a change in the contact area of the user's finger with the screen during sliding, and a change in the sliding acceleration.
- The identifying module includes: a first determining unit and a first identifying unit. The first determining unit is configured to determine whether the contact area of the user's finger with the screen gradually decreases from left to right; the first identifying unit is configured to identify the user's operation mode as the right-hand operation mode when the first determining unit determines that the contact area of the user's finger with the screen gradually decreases from left to right, and is further configured to identify the user's operation mode as the left-hand operation mode when the first determining unit determines that the contact area gradually increases from left to right.
- The identifying module includes: a second determining unit and a second identifying unit. The second determining unit is configured to determine whether the sliding acceleration of the user's finger gradually increases from left to right; the second identifying unit is configured to identify the user's operation mode as the right-hand operation mode when the second determining unit determines that the sliding acceleration of the user's finger gradually increases from left to right, and is further configured to identify the user's operation mode as the left-hand operation mode when the second determining unit determines that the sliding acceleration gradually decreases from left to right.
- The identifying module includes: a first setting unit, a first determining unit, a first comparing unit, and a third identifying unit;
- the first setting unit is configured to set the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- the first determining unit is configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area of the user's finger with the screen gradually decreases from left to right, and that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right;
- the first determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right;
- the first comparing unit is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability; the third identifying unit is configured to identify the user's operation mode as the right-hand operation mode when the comparison result of the first comparing unit is that the user's right-hand operation mode probability is greater than the left-hand operation mode probability, and as the left-hand operation mode when the comparison result is that the user's right-hand operation mode probability is less than the left-hand operation mode probability.
- the sliding information of the user's finger during the sliding process further includes a sliding direction of the user's finger;
- the identification module includes: a second setting unit, a second determining unit, a second comparing unit, and a fourth identifying unit;
- the second setting unit is configured to set the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- the second determining unit is configured to determine that the user's right-hand operation mode probability increases by the weight value w1 when the sliding direction of the user's finger is to the right, and that the user's left-hand operation mode probability increases by the weight value w1 when the sliding direction of the user's finger is to the left;
- the second determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area of the user's finger with the screen gradually decreases from left to right, and that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right; the corresponding determinations are made for the sliding acceleration with the weight value w3, after which the probabilities are compared to identify the operation mode of the user.
- The sliding information of the user's finger during the sliding process further includes the area into which the pixel points that the user's finger passes during the sliding process fall;
- the recognition module includes: a third setting unit, a third determining unit, a third comparing unit, and a fifth identifying unit. The third setting unit is configured to set the weight value of the area into which the pixel points that the user's finger passes during the sliding process fall to w0, the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- the third determining unit is configured to determine that the user's right-hand operation mode probability increases by the weight value w0 when the pixel points that the user's finger passes fall into the right area of the screen, and that the user's left-hand operation mode probability increases by the weight value w0 when they fall into the left area of the screen; the third determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area of the user's finger with the screen gradually decreases from left to right, and that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right; the third determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right.
- The weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the user's finger slide on the screen.
- With reference to any one of the first to sixth possible implementation manners of the second aspect, in a seventh possible implementation manner of the second aspect, the action of the user's finger sliding on the screen is the unlocking action of the screen.
- The handheld device further includes a switching module, configured to automatically switch the display mode of the operation interface of the handheld device according to the operation mode of the user.
- The beneficial effects of the invention are: when the user's finger is detected sliding on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired, and the operation mode of the user is identified according to that sliding information, the operation modes including a left-hand operation mode and a right-hand operation mode. In this way, additional cost is avoided, the ways of identifying the user's operation mode are enriched, and the accuracy of recognition is increased.
- FIG. 1 is a schematic structural view of a handheld device according to the present invention.
- FIG. 2 is a flow chart of an embodiment of a method for identifying a user operation mode on a handheld device of the present invention
- FIG. 3 is a flow chart of another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
- FIG. 4 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention
- FIG. 5 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
- FIG. 6 is a schematic diagram showing a change in contact area between a user's right finger and a screen in a method for identifying a user operation mode on a handheld device of the present invention;
- FIG. 7 is a schematic diagram showing a change in contact area between a user's left finger and a screen in a method for identifying a user operation mode on the handheld device of the present invention
- FIG. 8 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
- FIG. 9 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
- FIG. 10 is a schematic diagram of sliding unlocking in a method for identifying a user operation mode on a handheld device of the present invention
- FIG. 11 is another schematic diagram of sliding unlocking in a method for identifying a user operation mode on a handheld device of the present invention.
- FIG. 12 is another schematic diagram of sliding unlocking in a method for identifying a user operation mode on a handheld device of the present invention
- FIG. 13 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
- FIG. 14 is a schematic structural diagram of an embodiment of a handheld device according to the present invention.
- FIG. 15 is a schematic structural view of another embodiment of a handheld device according to the present invention.
- FIG. 16 is a schematic structural view of still another embodiment of the handheld device of the present invention.
- FIG. 17 is a schematic structural view of still another embodiment of the handheld device of the present invention.
- FIG. 18 is a schematic structural view of still another embodiment of the handheld device of the present invention.
- FIG. 19 is a schematic structural view of still another embodiment of the handheld device of the present invention.
- FIG. 20 is a schematic structural view of still another embodiment of the handheld device of the present invention.
- FIG. 21 is a schematic diagram showing the physical structure of an embodiment of a handheld device according to the present invention.
- UI User Interface
- many applications need to determine how the UI is presented based on the user's operating mode when operating the handheld device with one hand.
- FIG. 1 is a schematic structural view of a handheld device according to the present invention.
- FIG. 1 is used as an example to describe the logical structure of the handheld device to which the method for identifying the user operation mode provided by the embodiment of the present invention applies.
- the handheld device can be specifically a smart phone.
- The hardware layer of the handheld device includes a CPU, a GPU, and the like, and may further include memory, an input/output device, a memory controller, a network interface, and the like.
- The input device may include a touch screen and the like, and the output device may include display devices such as an LCD, a CRT, a holographic display, a projector, and the like.
- the handheld device further includes a driving layer, a frame layer, and an application layer.
- the driver layer may include a CPU driver, a GPU driver, a display controller driver, and the like.
- The framework layer can include a system service (System Service), a web service (Web Service), a customer service (Customer Service), and so on.
- the application layer may include a desktop, a media player, a browser, and the like.
- FIG. 2 is a flowchart of an embodiment of a method for identifying a user operation mode on a handheld device according to the present invention, including:
- Step S101: When it is detected that the user's finger slides on the screen of the handheld device, acquire the sliding information of the user's finger during the sliding process.
- Step S102: Identify the user's operation mode according to the sliding information of the user's finger during the sliding process; the operation modes include a left-hand operation mode and a right-hand operation mode.
- The operation modes include a left-hand operation mode and a right-hand operation mode: the left-hand operation mode indicates that the user operates the handheld device with the left hand, and the right-hand operation mode indicates that the user operates the handheld device with the right hand.
- The sliding information is naturally generated when the user's finger slides on the screen of the handheld device; as long as this sliding information is captured or collected, the user's operation mode can be recognized from it without additional sensors.
- In the embodiment of the present invention, when the user's finger is detected sliding on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired. In this way, the cost of recognizing the user's operation mode is reduced and the accuracy of the recognition is improved.
- FIG. 3 is a flowchart of another embodiment of a method for identifying a user operation mode on a handheld device of the present invention, including:
- Step S201: When detecting that the user's finger slides on the screen, obtain either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration.
- While the finger slides, the contact area between the finger and the screen is constantly changing and the acceleration of the sliding finger is constantly changing; the change in contact area, the change in sliding acceleration, or both can therefore be acquired.
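As a hedged sketch of how the sliding acceleration mentioned above might be derived from raw touch data (this is not part of the patent), one can take finite differences over timestamped positions. The sample format of (timestamp, x position) pairs is an assumption; real platforms expose comparable fields on touch events.

```python
# Illustrative estimation of sliding acceleration from touch samples.
# Each sample is (timestamp, x_position); the format is an assumption.

def sliding_acceleration(samples):
    """Estimate acceleration along x between consecutive (t, x) samples."""
    velocities = []
    for (t0, x0), (t1, x1) in zip(samples, samples[1:]):
        # velocity over each interval, attributed to the interval midpoint
        velocities.append(((x1 - x0) / (t1 - t0), (t0 + t1) / 2))
    accels = []
    for (v0, t0), (v1, t1) in zip(velocities, velocities[1:]):
        accels.append((v1 - v0) / (t1 - t0))    # change in velocity over time
    return accels

# A uniformly accelerating slide x = t**2 has constant acceleration 2
print(sliding_acceleration([(t, t * t) for t in range(5)]))
```

Whether the acceleration "gradually increases from left to right" can then be checked with the same monotonic-trend test used for the contact area.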
- Step S202: Identify the operation mode of the user according to the change in the contact area of the user's finger with the screen during the sliding process, the change in the sliding acceleration, or both.
- Step S202 may include: step S202a and step S202b; or step S202 may include step S202c and step S202d; or step S202 may include step S202e, step S202f, step S202g, step S202h, and step S202i.
- step S202 includes: step S202a and step S202b.
- Step S202a: If the contact area of the user's finger with the screen gradually decreases from left to right, the user's operation mode is identified as the right-hand operation mode.
- Step S202b: If the contact area of the user's finger with the screen gradually increases from left to right, the user's operation mode is identified as the left-hand operation mode.
- step S202 includes: step S202c and step S202d.
- Step S202c: If the sliding acceleration of the user's finger gradually increases from left to right, the user's operation mode is identified as the right-hand operation mode.
- Step S202d: If the sliding acceleration of the user's finger gradually decreases from left to right, the user's operation mode is identified as the left-hand operation mode.
- Step S202 includes: step S202e, step S202f, step S202g, step S202h, and step S202i.
- Step S202e: Set the weight value of the change in the contact area of the user's finger with the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
- There is no fixed order between step S201 and step S202e.
- Step S202f: If the contact area of the user's finger with the screen gradually decreases from left to right, determine that the user's right-hand operation mode probability increases by the weight value w2; if the contact area gradually increases from left to right, determine that the user's left-hand operation mode probability increases by the weight value w2.
- Step S202g: If the sliding acceleration of the user's finger gradually increases from left to right, determine that the user's right-hand operation mode probability increases by the weight value w3; if the sliding acceleration gradually decreases from left to right, determine that the user's left-hand operation mode probability increases by the weight value w3.
- There is no fixed order between step S202f and step S202g.
- Step S202h: Compare the user's right-hand operation mode probability with the left-hand operation mode probability.
- Step S202i: If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the user's operation mode is identified as the right-hand operation mode; if the right-hand operation mode probability is less than the left-hand operation mode probability, the user's operation mode is identified as the left-hand operation mode.
- When the user operates with the thumb of the left hand on the screen (the left-hand operation mode), the thumb moves from a straight state to a bent state as it slides, so the contact area with the screen gradually decreases from right to left (that is, it gradually increases from left to right).
- The principle is shown in FIG. 7: the left-hand position is 21, the contact area of the straight left thumb with the screen is 22, and the contact area when the left thumb is bent is correspondingly smaller.
- If the contact area of the user's finger with the screen gradually decreases from left to right or the sliding acceleration of the user's finger gradually increases from left to right, the user's operation mode is identified as the right-hand operation mode; if the contact area gradually increases from left to right or the sliding acceleration gradually decreases from left to right, the user's operation mode is identified as the left-hand operation mode.
- The weight values of the two are set separately in advance. When the contact area of the user's finger with the screen gradually decreases from left to right, the user's right-hand operation mode probability increases by the weight value w2; if the contact area gradually increases from left to right, the user's left-hand operation mode probability increases by the weight value w2. If the sliding acceleration of the user's finger gradually increases from left to right, the user's right-hand operation mode probability increases by the weight value w3; if it gradually decreases from left to right, the user's left-hand operation mode probability increases by the weight value w3. The right-hand and left-hand operation mode probabilities are then compared, and the user's operation mode is identified according to the comparison result.
- Step S202 can thus identify the user's operation mode according to either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration; different recognition rules apply depending on which information is obtained.
- The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, and identifies the user's operation mode accordingly. In this way, on the one hand, no additional cost is required and the ways of recognizing which hand holds the device are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize the user's operation mode, the accuracy of recognition can be increased.
- When the user's finger slides to unlock the screen, either or both of the change in the contact area between the finger and the screen and the change in the sliding acceleration may be acquired, and the user's operation mode, that is, whether the user is in the left-hand operation mode or the right-hand operation mode, is identified accordingly.
- In this way, the user's operation mode can be recognized during unlocking, and the user interface can then be switched to the form that matches the user's operation mode (for example, a form convenient for left-hand operation, or a form convenient for right-hand operation) before the user performs the next operation, which further enhances the user experience.
- FIG. 8 and FIG. 9 are flowcharts of two further embodiments of a method for identifying a user operation mode on a handheld device according to the present invention, the specific contents of which are as follows:
- Step S301 detecting that the user's finger slides on the screen.
- Step S302 Acquire a change of the contact area of the finger of the user with the screen during the sliding process, a change of the sliding acceleration, and a sliding direction of the finger of the user.
- Acquiring the sliding direction of the user's finger specifically includes the following:
- The positions of the starting and ending points of the finger's sliding track help to determine the sliding direction of the finger. According to the sliding direction of the finger, it is also possible to roughly determine which hand is holding the device.
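A small helper, assumed for illustration rather than taken from the patent, shows how the direction can be derived from the start and end points of the track:

```python
def sliding_direction(track):
    """track: list of (x, y) touch points ordered from touch-down to touch-up."""
    start_x, end_x = track[0][0], track[-1][0]
    if end_x > start_x:
        return "right"   # rightward sweep: evidence for the right-hand mode
    if end_x < start_x:
        return "left"    # leftward sweep: evidence for the left-hand mode
    return "none"        # a purely vertical slide gives no horizontal cue

print(sliding_direction([(20, 300), (90, 280), (180, 260)]))  # right
```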
- Step S303 Set the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
- Step S304 If the sliding direction of the user's finger is rightward, it is determined that the user's right-hand operation mode probability increases by the weight value w1; if the sliding direction is leftward, it is determined that the user's left-hand operation mode probability increases by the weight value w1.
- Step S305 If the contact area between the user's finger and the screen gradually decreases from left to right, it is determined that the user's right-hand operation mode probability increases by the weight value w2; if the contact area gradually increases from left to right, it is determined that the user's left-hand operation mode probability increases by the weight value w2.
- Step S306 If the sliding acceleration of the user's finger gradually increases from left to right, it is determined that the user's right-hand operation mode probability increases by the weight value w3; if the sliding acceleration gradually decreases from left to right, it is determined that the user's left-hand operation mode probability increases by the weight value w3.
- Steps S304, S305, and S306 may be performed in any order.
- Step S307 Compare the magnitudes of the user's right hand operation mode probability and the left hand operation mode probability.
- Step S308 If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the user's operation mode is recognized as the right-hand operation mode; if the right-hand operation mode probability is less than the left-hand operation mode probability, the user's operation mode is recognized as the left-hand operation mode.
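Steps S303 to S308 can be sketched as a weighted vote: each observed cue adds its weight to the right- or left-hand probability, and the larger total wins. The weight values and the cue encoding ("right"/"left", "decreasing"/"increasing") are illustrative assumptions, not values from the patent.

```python
W1, W2, W3 = 0.3, 0.4, 0.3   # direction, area-trend, and acceleration-trend weights

def identify_mode(direction, area_trend, accel_trend):
    right = left = 0.0
    # S304: sliding direction of the finger
    if direction == "right":
        right += W1
    elif direction == "left":
        left += W1
    # S305: change in contact area from left to right
    if area_trend == "decreasing":
        right += W2
    elif area_trend == "increasing":
        left += W2
    # S306: change in sliding acceleration from left to right
    if accel_trend == "increasing":
        right += W3
    elif accel_trend == "decreasing":
        left += W3
    # S307-S308: compare the two accumulated probabilities
    return "right-hand" if right > left else "left-hand"

print(identify_mode("right", "decreasing", "increasing"))  # right-hand
```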
- The sliding information of the user's finger during the sliding process in the embodiment of FIG. 9 includes: the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the area where the pixel points passed by the user's finger during the sliding process fall.
- Step S401 detecting that the user's finger is sliding on the screen.
- Step S402 Acquire the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the area where the pixel points passed by the user's finger during the sliding process fall.
- Step S403 Set the weight value of the area where the pixel points passed by the user's finger during the sliding process fall to w0, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
- Step S404 If the pixel points that the user's finger passes fall into the right area of the screen, it is determined that the user's right-hand operation mode probability increases by the weight value w0; if the pixel points fall into the left area of the screen, it is determined that the user's left-hand operation mode probability increases by the weight value w0.
- When the user holds the device with one hand, the range of motion of the finger is limited, so whichever hand holds the device, the pixels passed during the sliding process are basically concentrated in the area close to that hand. Therefore, according to the area where the pixels passed by the user's finger during the sliding process fall, it is also possible to roughly determine which hand is holding the device.
- Step S405 If the contact area between the user's finger and the screen gradually decreases from left to right, it is determined that the user's right-hand operation mode probability increases by the weight value w2; if the contact area gradually increases from left to right, it is determined that the user's left-hand operation mode probability increases by the weight value w2;
- Step S406 If the sliding acceleration of the user's finger gradually increases from left to right, it is determined that the user's right-hand operation mode probability increases by the weight value w3; if the sliding acceleration gradually decreases from left to right, it is determined that the user's left-hand operation mode probability increases by the weight value w3.
- Steps S404, S405, and S406 may be performed in any order.
- Step S407 Compare the magnitudes of the user's right-hand operation mode probability and left-hand operation mode probability.
- Step S408 If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the user's operation mode is recognized as the right-hand operation mode; if the right-hand operation mode probability is less than the left-hand operation mode probability, the user's operation mode is recognized as the left-hand operation mode.
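The pixel-region cue used in step S404 can be sketched as a majority test over the sampled touch points. The function name and the half-screen split are assumptions made for the example; the patent leaves the exact region partition to the implementation.

```python
def touched_region(track, screen_width):
    """Return which half of the screen holds most of the sampled touch points.

    track: list of (x, y) points passed by the finger during the slide.
    """
    right_count = sum(1 for x, _ in track if x > screen_width / 2)
    # Majority vote between the right and left halves of the screen.
    return "right" if right_count > len(track) - right_count else "left"

print(touched_region([(400, 10), (420, 40), (100, 60)], 480))  # right
```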
- When it is detected that the user's finger slides on the screen, there are four parameters that can be obtained: the area where the pixels passed by the user's finger during the sliding process fall, the sliding direction of the user's finger, the change in the contact area between the user's finger and the screen during the sliding process, and the change in the sliding acceleration of the user's finger during the sliding process. According to the actual application situation, some of these parameters can be obtained to comprehensively estimate the probabilities of left- and right-hand holding; the more parameters are acquired, the higher the recognition accuracy. The above embodiments merely select several combinations, and other combinations are not described here.
- If the action of the user's finger sliding on the screen is the unlocking action of the screen, the left- and right-hand holding modes can be quickly and accurately identified without requiring the user to perform any additional action, and the result of the identification is applied throughout the unlocked period after unlocking.
- The weight value of the area where the pixels passed by the user's finger during the sliding process fall is set to w0, the weight value of the sliding direction of the user's finger is w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process is w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process is w3.
- If the pixel points that the user's finger passes fall into the right area of the screen, it is determined that the user's right-hand operation mode probability increases by the weight value w0; if they fall into the left area of the screen, it is determined that the user's left-hand operation mode probability increases by the weight value w0.
- If the sliding direction of the user's finger is rightward, it is determined that the user's right-hand operation mode probability increases by the weight value w1; if the sliding direction is leftward, it is determined that the user's left-hand operation mode probability increases by the weight value w1.
- If the contact area between the user's finger and the screen gradually decreases from left to right, it is determined that the user's right-hand operation mode probability increases by the weight value w2; if the contact area gradually increases from left to right, it is determined that the user's left-hand operation mode probability increases by the weight value w2.
- If the sliding acceleration of the user's finger gradually increases from left to right, it is determined that the user's right-hand operation mode probability increases by the weight value w3; if the sliding acceleration gradually decreases from left to right, it is determined that the user's left-hand operation mode probability increases by the weight value w3.
- If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the user's operation mode is recognized as the right-hand operation mode; if the right-hand operation mode probability is less than the left-hand operation mode probability, the user's operation mode is recognized as the left-hand operation mode.
- For example, if the pixel region, the contact-area change, and the sliding-acceleration change all indicate the right hand while the sliding direction indicates the left hand, the right-hand operation mode probability is w0+w2+w3 and the left-hand operation mode probability is w1; the magnitudes of w0+w2+w3 and w1 are compared, and if w0+w2+w3 is greater than w1, the user's operation mode is recognized as the right-hand operation mode.
- The weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the sliding track of the user's finger on the screen.
- The thumb has difficulty reaching the side of the frame opposite the holding hand (that is, the left thumb has difficulty touching the right border, and the right thumb the left border).
- If a method of sliding laterally a certain distance to unlock is designed, as shown in FIG. 10, the user draws a horizontal slide on the screen to unlock; it is then obviously more effective to judge by the area in which the pixels passed by the finger fall during the sliding process, so the weight value w0 needs to be enlarged, and the weight values may be set as w0>w2>w1>w3.
- If the unlocking mode shown in FIG. 11 is designed instead, the weight values need to be adjusted to w2>w1>w3>w0. Since the two ends of the unlocking pattern of FIG. 11 are completely symmetrical, the weight value w0 of the area in which the pixels passed by the finger fall during the sliding process can be completely ignored (that is, w0 is set to 0).
- The above four parameters do not necessarily need to be used at the same time; depending on the size of the screen and the design of the unlocking sliding shape, it is possible to judge using only some of the parameters.
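One way to capture this per-design tuning is a table of weight profiles keyed by the unlock pattern. The profile names and the concrete values are hypothetical; only the orderings (w0>w2>w1>w3 for the FIG. 10 horizontal slide, w2>w1>w3 with w0=0 for the FIG. 11 symmetric pattern) come from the text above.

```python
WEIGHT_PROFILES = {
    # FIG. 10 style horizontal slide: the pixel-region cue dominates.
    "horizontal_slide": {"w0": 0.40, "w2": 0.30, "w1": 0.20, "w3": 0.10},
    # FIG. 11 style symmetric pattern: the pixel-region cue carries no information.
    "symmetric_pattern": {"w2": 0.50, "w1": 0.30, "w3": 0.20, "w0": 0.00},
}

def weights_for(unlock_design):
    """Look up the weight profile matching the unlock pattern in use."""
    return WEIGHT_PROFILES[unlock_design]

print(weights_for("symmetric_pattern")["w0"])  # 0.0
```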
- In an extreme case, the unlocking sliding areas can be placed in the lower left and lower right corners when designing the unlocking interface, where a one-handed thumb cannot reach the opposite side; in this case an accurate judgment can be obtained using only the single weight value w0 of the area in which the finger falls on the screen when sliding.
- The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, and identifies the user's operation mode accordingly. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize the user's operation mode, the accuracy of recognition can be increased.
- The recognition accuracy can be further improved by additionally using the area in which the pixels passed by the user's finger fall during the sliding process and the sliding direction of the user's finger. If the action of the user's finger sliding on the screen is the unlocking action of the screen, the left- and right-hand holding modes can be quickly and accurately identified without requiring the user to perform any additional action, and the identified result is applied throughout the unlocked period after unlocking.
- FIG. 13 is a flowchart of still another embodiment of a method for identifying a user operation mode on a handheld device according to the present invention, including:
- Step S501 When it is detected that the user's finger slides on the screen of the handheld device, obtain either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration.
- Step S502 Identify the user's operation mode according to either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration.
- Step S503 Automatically switch the display mode of the operation interface of the handheld device according to the operation mode of the user.
- The display mode of the operation interface is switched automatically, wherein the display modes of the operation interface include a left-hand display mode and a right-hand display mode. In the left-hand display mode, the operation interface is usually displayed on the left side of the screen to facilitate left-hand operation; in the right-hand display mode, the operation interface is usually displayed on the right side of the screen to facilitate right-hand operation. That is, the left-hand display mode matches the left-hand operation mode, and the right-hand display mode matches the right-hand operation mode.
- If the sliding action is the screen-unlocking action, the switched display mode is applied throughout the unlocked period after unlocking.
- The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, identifies the user's operation mode accordingly, and automatically switches the operation interface to a display mode that matches the user's operation mode. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both parameters are used to recognize the user's operation mode, the recognition accuracy can be increased, so that the switching of the display mode of the operation interface is more accurate.
- FIG. 14 is a schematic structural diagram of an embodiment of a handheld device according to the present invention.
- the handheld device 10 includes a detection module 101, an acquisition module 102, and an identification module 103.
- the handheld device of the present embodiment can perform the steps in FIG. 2.
- the detecting module 101 is configured to detect whether the user's finger is sliding on the screen of the handheld device.
- the obtaining module 102 is configured to acquire sliding information of the user's finger during the sliding process when the detecting module 101 detects that the user's finger slides on the screen of the handheld device.
- The identification module 103 is configured to identify the user's operation mode according to the sliding information of the user's finger acquired during the sliding process. The operation mode includes a left-hand operation mode and a right-hand operation mode; the left-hand operation mode is a mode of operating the handheld device with the left hand, and the right-hand operation mode is a mode of operating the handheld device with the right hand.
- Since the sliding information is naturally generated when the user's finger slides on the screen of the handheld device, as long as the sliding information is captured or collected, the user's operation mode can be recognized from it without additional sensors.
- In the embodiment of the present invention, when it is detected that the user's finger slides on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired and used to identify the user's operation mode. In this way, the cost of the handheld device can be reduced when the user's operation mode is recognized, and the accuracy of recognition can be improved.
- FIG. 15 is a schematic structural diagram of another embodiment of a handheld device according to the present invention.
- the handheld device 20 includes a detection module 201, an acquisition module 202, and an identification module 203.
- the handheld device of the present embodiment can perform the steps in FIG. 3, FIG. 4, FIG. 5, FIG. 8, and FIG.
- the detecting module 201 is configured to detect whether the user's finger is sliding on the screen of the handheld device.
- the obtaining module 202 is configured to acquire sliding information of the user's finger during the sliding process when the detecting module 201 detects that the user's finger slides on the screen of the handheld device.
- the identification module 203 is configured to identify an operation mode of the user according to the sliding information of the user's finger acquired during the sliding process, and the operation mode includes: a left-hand operation mode and a right-hand operation mode.
- The sliding information of the user's finger during the sliding process includes either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration.
- During the sliding process, the contact area between the finger and the screen changes constantly, and so does the sliding acceleration; the change in the contact area, the change in the sliding acceleration, or both can therefore be obtained.
- the identification module 203 is specifically configured to identify the operation mode of the user according to the change of the contact area of the user's finger with the screen during the sliding process, the change of the sliding acceleration, or both.
- Because the change in the contact area between the finger and the screen and the change in the sliding acceleration differ between the left and right hands, the user's operation mode can be identified according to either or both of these changes.
- the identification module 203 includes a first determining unit 2031 and a first identifying unit 2032.
- the first determining unit 2031 is configured to determine whether the contact area of the user's finger and the screen is gradually decreasing from left to right.
- The first recognition unit 2032 is configured to recognize that the user's operation mode is the right-hand operation mode when the determination result of the first determination unit 2031 is that the contact area between the user's finger and the screen gradually decreases from left to right.
- the first identification unit 2032 is further configured to recognize that the operation mode of the user is the left-hand operation mode when the determination result of the first determination unit 2031 is that the contact area of the user's finger and the screen is gradually increased from left to right.
- the identification module 203 includes a second determining unit 2033 and a second identifying unit 2034.
- the second judging unit 2033 is for judging whether the sliding acceleration of the user's finger is gradually increasing from left to right.
- the second recognition unit 2034 is configured to recognize that the operation mode of the user is the right-hand operation mode when the determination result of the second determination unit 2033 is that the sliding acceleration of the user's finger is gradually increasing from left to right.
- The second identification unit 2034 is further configured to recognize that the user's operation mode is the left-hand operation mode when the determination result of the second determination unit 2033 is that the sliding acceleration of the user's finger gradually decreases from left to right.
- Alternatively, the identification module 203 includes a first setting unit 2035, a first determining unit 2036, a first comparison unit 2037, and a third identifying unit 2038.
- The first setting unit 2035 is configured to set a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process.
- The first determining unit 2036 is configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area between the user's finger and the screen gradually decreases from left to right, and to determine that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right.
- The first determining unit 2036 is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and to determine that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right.
- the first comparison unit 2037 is for comparing the magnitudes of the user's right hand operation mode probability and the left hand operation mode probability.
- The third identifying unit 2038 is configured to recognize that the user's operation mode is the right-hand operation mode when the comparison result of the first comparison unit is that the right-hand operation mode probability is greater than the left-hand operation mode probability, and to recognize that the user's operation mode is the left-hand operation mode when the comparison result is that the right-hand operation mode probability is less than the left-hand operation mode probability.
- The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, and identifies the user's operation mode accordingly. In this way, on the one hand, no additional cost is required and the ways of recognizing which hand holds the device are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize the user's operation mode, the accuracy of recognition can be increased.
- When the user's finger slides to unlock the screen, either or both of the change in the contact area between the finger and the screen and the change in the sliding acceleration may be acquired, and the user's operation mode, that is, whether the user is in the left-hand operation mode or the right-hand operation mode, is identified accordingly. In this way, the user's operation mode can be recognized during unlocking, and the user interface can then be switched to the form that matches the user's operation mode (for example, a form convenient for left-hand operation, or a form convenient for right-hand operation) before the user performs the next operation, which further enhances the user experience.
- If the sliding information of the user's finger during the sliding process includes the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the sliding direction of the user's finger, the identification module 203 includes a second setting unit 2039, a second determining unit 20310, a second comparison unit 20311, and a fourth identifying unit 20312.
- The second setting unit 2039 is configured to set the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
- The second determining unit 20310 is configured to determine that the user's right-hand operation mode probability increases by the weight value w1 when the sliding direction of the user's finger is rightward, and to determine that the user's left-hand operation mode probability increases by the weight value w1 when the sliding direction is leftward;
- The second determining unit 20310 is further configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area between the user's finger and the screen gradually decreases from left to right, and to determine that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right;
- The second determining unit 20310 is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and to determine that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right;
- The second comparison unit 20311 is configured to compare the magnitudes of the user's right-hand operation mode probability and left-hand operation mode probability.
- The fourth identification unit 20312 is configured to recognize that the user's operation mode is the right-hand operation mode when the comparison result of the second comparison unit 20311 is that the right-hand operation mode probability is greater than the left-hand operation mode probability, and to recognize that the user's operation mode is the left-hand operation mode when the comparison result is that the right-hand operation mode probability is less than the left-hand operation mode probability.
- the identification module 203 includes a third setting unit 20313, a third determining unit 20314, a third comparing unit 20315, and a fifth identifying unit 20316.
- The third setting unit 20313 is configured to set the weight value of the area where the pixel points passed by the user's finger during the sliding process fall to w0, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
- The third determining unit 20314 is configured to determine that the user's right-hand operation mode probability increases by the weight value w0 when the pixel points passed by the user's finger fall into the right area of the screen, and to determine that the user's left-hand operation mode probability increases by the weight value w0 when the pixel points fall into the left area of the screen.
- The third determining unit 20314 is further configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area between the user's finger and the screen gradually decreases from left to right, and to determine that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right.
- The third determining unit 20314 is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and to determine that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right.
- the third comparison unit 20315 is for comparing the magnitudes of the user's right hand operation mode probability and the left hand operation mode probability.
- The fifth identification unit 20316 is configured to recognize that the user's operation mode is the right-hand operation mode when the comparison result of the third comparison unit 20315 is that the right-hand operation mode probability is greater than the left-hand operation mode probability, and to recognize that the user's operation mode is the left-hand operation mode when the comparison result is that the right-hand operation mode probability is less than the left-hand operation mode probability.
- When it is detected that the user's finger slides on the screen, there are four parameters that can be obtained: the area where the pixels passed by the user's finger during the sliding process fall, the sliding direction of the user's finger, the change in the contact area between the user's finger and the screen during the sliding process, and the change in the sliding acceleration of the user's finger during the sliding process. According to the actual application situation, some of these parameters can be obtained to comprehensively estimate the probabilities of left- and right-hand holding; the more parameters are acquired, the higher the recognition accuracy. The above embodiments merely select several combinations, and other combinations are not described here.
- If the action of the user's finger sliding on the screen is the unlocking action of the screen, the left- and right-hand holding modes can be quickly and accurately identified without requiring the user to perform any additional action, and the result of the identification is applied throughout the unlocked period after unlocking.
- the identification module includes: a fourth setting unit, a fourth determining unit, a fourth comparing unit, and a sixth identifying unit.
- The fourth setting unit is configured to set the weight value of the area where the pixel points passed by the user's finger during the sliding process fall to w0, the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
- the fourth determining unit is configured to determine, when the pixel point that the user's finger passes, falls into the right area of the screen, the user right hand operation mode probability increase weight value w0, when the pixel point that the user's finger passes falls into the left area of the screen , determining that the user's left hand operation mode probability increases the weight value w0.
- the fourth determining unit is further configured to: when the sliding direction of the user's finger is to the right, determine that the sliding direction result is the user right hand operation mode holding probability increase weight value w1, and when the sliding direction of the user's finger is to the left, determine the sliding The result of the direction is that the user's left hand operation mode probability increases the weight value w1.
- the fourth determining unit is further configured to determine that the user's right hand operation mode probability increases the weight value w2 when the contact area of the user's finger and the screen is gradually changed from left to right, and the contact area between the user's finger and the screen is from left to right. When gradually increasing, it is determined that the user's left-hand operation mode probability increases the weight value w2.
- the fourth determining unit is further configured to: when the sliding acceleration of the user's finger gradually increases from left to right, determine the user right hand operation mode probability increase weight value w3, and determine that the sliding acceleration of the user's finger gradually decreases from left to right, and determine The user's left hand operation mode probability increases the weight value w3.
- the fourth comparison unit is for comparing the probability of the user's right handset and the probability of the left handset.
- the user's right-hand operation mode probability of the user is greater than the left-hand operation mode probability, the user's right-hand operation mode is recognized, and if the user's right-hand operation mode probability is less than the left-hand operation mode probability, the user's left-hand operation mode is recognized.
- the sixth identifying unit is configured to: when the comparison result of the fourth comparing unit is that the right-hand operating mode probability is greater than the left-hand operating mode probability, the operating mode of the recognized user is a right-hand operating mode, and the comparison result of the fourth comparing unit is a user right-hand operating mode. When the probability is less than the left-hand operation mode probability, the recognition user's operation mode is the left-hand operation mode.
- the weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the user's finger sliding on the screen.
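The four-weight vote described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name, the discretized inputs, and the default weights are hypothetical — the patent leaves parameter extraction and weight tuning (per screen size and stroke shape) to the implementation.

```python
def infer_hand_weighted(region, direction, area_trend, accel_trend,
                        w0=1.0, w1=1.0, w2=1.0, w3=1.0):
    """Weighted vote over the four sliding parameters.

    region:      'right' or 'left'  - screen half the stroke's pixels fall in
    direction:   'right' or 'left'  - sliding direction of the finger
    area_trend:  'shrinking' or 'growing' (contact area, left to right)
    accel_trend: 'growing' or 'shrinking' (sliding acceleration, left to right)
    """
    right = left = 0.0
    # w0: pixel region of the stroke
    right += w0 if region == 'right' else 0.0
    left  += w0 if region == 'left' else 0.0
    # w1: sliding direction
    right += w1 if direction == 'right' else 0.0
    left  += w1 if direction == 'left' else 0.0
    # w2: contact-area trend (shrinking left-to-right suggests right hand)
    right += w2 if area_trend == 'shrinking' else 0.0
    left  += w2 if area_trend == 'growing' else 0.0
    # w3: acceleration trend (growing left-to-right suggests right hand)
    right += w3 if accel_trend == 'growing' else 0.0
    left  += w3 if accel_trend == 'shrinking' else 0.0
    return 'right' if right > left else 'left'
```

Raising one weight (for example w0 on a very wide screen) lets a single strong cue outvote the others, which is how the per-screen tuning mentioned above would play out.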
- In this embodiment of the present invention, the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both are acquired, and the user's operation mode is identified accordingly. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in the contact area and the change in the sliding acceleration are used to estimate the operation mode probabilities, the recognition accuracy can be increased.
- The recognition accuracy can be further improved by also using the area into which the pixel points passed by the user's finger fall during the sliding process and the sliding direction of the user's finger. When the action of the user's finger sliding on the screen is the unlocking action of the screen, the left-hand or right-hand operation mode can be identified quickly and accurately without requiring any additional action from the user, and the identified result applies throughout the unlocked period after unlocking.
- FIG. 20 is a schematic structural diagram of still another embodiment of a handheld device according to the present invention.
- the handheld device 30 includes a detection module 301, an acquisition module 302, an identification module 303, and a switching module 304.
- the handheld device in this embodiment may perform the steps in FIG.
- the detecting module 301 is configured to detect whether the user's finger slides on the screen of the handheld device.
- the obtaining module 302 is configured to acquire, when the detecting module 301 detects that the user's finger slides on the screen of the handheld device, the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both.
- the identification module 303 is configured to identify the operation mode of the user according to the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both.
- the switching module 304 is configured to automatically switch the display mode of the operation interface according to the operation mode of the user identified by the identification module 303.
- In this embodiment of the present invention, the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both are acquired; the user's operation mode is identified accordingly; and the display mode of the operation interface is switched automatically according to the user's operation mode. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in the contact area and the change in the sliding acceleration are used to estimate the operation mode probabilities, the recognition accuracy can be increased, so that the display mode of the operation interface is switched more accurately.
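The switching module's role can be sketched as a mapping from the identified mode to an interface layout. This is a hypothetical illustration only — the patent does not specify what the switched display modes look like, so the layout keys and values here are assumptions; a real device would rearrange the whole operation interface so controls sit under the thumb of the holding hand.

```python
def layout_for(mode):
    """Pick an operation-interface layout for the identified mode.

    mode: 'right' or 'left', as produced by the identification module.
    Returns a small layout description (hypothetical fields).
    """
    layouts = {
        "right": {"dock": "right", "keyboard": "shift-right"},
        "left":  {"dock": "left",  "keyboard": "shift-left"},
    }
    return layouts[mode]
```

Because the identification happens during the unlock swipe, the chosen layout can be applied once at unlock time and kept for the whole unlocked session, matching the behavior described above.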
- FIG. 21 is a schematic diagram showing the physical structure of still another embodiment of the handheld device of the present invention.
- the handheld device includes a processor 31, a memory 32 coupled to the processor 31, a detector 33, and a collector 34.
- the detector 33 is for detecting whether the user's finger is sliding on the screen of the handheld device.
- the collector 34 is configured to acquire, when the detector 33 detects that the user's finger slides on the screen of the handheld device, the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both, and to store the acquired data in the memory 32.
- the processor 31 is configured to read from the memory 32 the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both, and to identify the operation mode of the user accordingly.
- the processor 31 is configured to judge whether the contact area between the user's finger and the screen gradually decreases from left to right, or whether the sliding acceleration of the user's finger gradually increases from left to right; to identify that the user's operation mode is the right-hand operation mode when the judging result is that the contact area gradually decreases from left to right or the sliding acceleration gradually increases from left to right; and to identify that the user's operation mode is the left-hand operation mode when the judging result is that the contact area gradually increases from left to right or the sliding acceleration gradually decreases from left to right.
- the processor 31 is configured to set a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; to increase the user's right-hand operation mode probability by the weight value w2 if the contact area gradually decreases from left to right, and the left-hand operation mode probability by the weight value w2 if the contact area gradually increases from left to right; to increase the right-hand operation mode probability by the weight value w3 if the sliding acceleration gradually increases from left to right, and the left-hand operation mode probability by the weight value w3 if the sliding acceleration gradually decreases from left to right; to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and to identify that the user's operation mode is the right-hand operation mode if the right-hand operation mode probability is greater than the left-hand operation mode probability, and the left-hand operation mode if it is less.
- the sliding information during the sliding process of the user's finger includes: a change in the contact area of the user's finger with the screen during the sliding process, a change in the sliding acceleration, and a sliding direction of the user's finger:
- the collector 34 is configured to acquire a change in the contact area of the finger of the user with the screen during the sliding process, a change in the sliding acceleration, and a sliding direction of the finger of the user.
- the processor 31 is configured to set a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; and to increase the user's right-hand operation mode probability by the weight value w1 if the sliding direction of the user's finger is to the right, and the left-hand operation mode probability by the weight value w1 if the sliding direction is to the left; the determinations for w2 and w3 and the probability comparison are performed as described above.
- the sliding information during the sliding process of the user's finger includes: the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the area into which the pixel points passed by the user's finger fall during the sliding process.
- the collector 34 is configured to acquire the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the area into which the pixel points passed by the user's finger fall during the sliding process.
- the processor 31 is configured to set a weight value w0 for the area into which the pixel points passed by the user's finger fall during the sliding process, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; to increase the user's right-hand operation mode probability by the weight value w0 if the pixel points passed by the user's finger fall into the right area of the screen, and the left-hand operation mode probability by the weight value w0 if they fall into the left area; to increase the right-hand operation mode probability by the weight value w2 if the contact area gradually decreases from left to right, and the left-hand operation mode probability by the weight value w2 if it gradually increases; to increase the right-hand operation mode probability by the weight value w3 if the sliding acceleration gradually increases from left to right, and the left-hand operation mode probability by the weight value w3 if it gradually decreases; to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and to identify that the user's operation mode is the right-hand operation mode if the right-hand operation mode probability is greater than the left-hand operation mode probability, and the left-hand operation mode if it is less.
- the processor 31 is configured to set a weight value w0 for the area into which the pixel points passed by the user's finger fall during the sliding process, a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; and to increase the user's right-hand operation mode probability by the weight value w0 when the pixel points acquired by the collector 34 fall into the right area of the screen, and the left-hand operation mode probability by the weight value w0 when the pixel points acquired by the collector 34 fall into the left area of the screen.
- the weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the user's finger sliding on the screen.
- the action of the user's finger sliding on the screen is an unlocking action of the screen.
- In this embodiment of the present invention, the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both are acquired, and the user's operation mode is identified accordingly. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in the contact area and the change in the sliding acceleration are used to estimate the left-hand and right-hand operation mode probabilities, the recognition accuracy can be increased.
- The recognition accuracy can be further improved by also using the area into which the pixel points passed by the user's finger fall during the sliding process and the sliding direction of the user's finger. When the action of the user's finger sliding on the screen is the unlocking action of the screen, the left-hand or right-hand operation mode can be identified quickly and accurately without requiring any additional action from the user, and the identified result applies throughout the unlocked period after unlocking.
- the disclosed system, apparatus, and method may be implemented in other manners.
- the device implementations described above are merely illustrative.
- the division of the modules or units is only a logical function division; in actual implementation there may be another division manner. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
- the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
- the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
- the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
- a number of instructions are included to cause a computer device (which may be a personal computer, server, or network device, etc.) or a processor to perform all or part of the steps of the methods of the various embodiments of the present invention.
- the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Claims (20)
- A method for identifying a user operation mode on a handheld device, comprising: when it is detected that a user's finger slides on a screen of the handheld device, acquiring sliding information of the user's finger during the sliding process; and identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process, where the operation mode includes a left-hand operation mode and a right-hand operation mode.
- The method according to claim 1, wherein the sliding information of the user's finger during the sliding process includes the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both.
- The method according to claim 2, wherein identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: if the contact area between the user's finger and the screen gradually decreases from left to right, identifying that the operation mode of the user is the right-hand operation mode; and if the contact area between the user's finger and the screen gradually increases from left to right, identifying that the operation mode of the user is the left-hand operation mode.
- The method according to claim 2, wherein identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: if the sliding acceleration of the user's finger gradually increases from left to right, identifying that the operation mode of the user is the right-hand operation mode; and if the sliding acceleration of the user's finger gradually decreases from left to right, identifying that the operation mode of the user is the left-hand operation mode.
- The method according to claim 2, wherein identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: setting a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; if the contact area between the user's finger and the screen gradually decreases from left to right, increasing the user's right-hand operation mode probability by the weight value w2, and if it gradually increases from left to right, increasing the user's left-hand operation mode probability by the weight value w2; if the sliding acceleration of the user's finger gradually increases from left to right, increasing the user's right-hand operation mode probability by the weight value w3, and if it gradually decreases from left to right, increasing the user's left-hand operation mode probability by the weight value w3; comparing the user's right-hand operation mode probability with the left-hand operation mode probability; and if the user's right-hand operation mode probability is greater than the left-hand operation mode probability, identifying that the operation mode of the user is the right-hand operation mode, and if the user's right-hand operation mode probability is less than the left-hand operation mode probability, identifying that the operation mode of the user is the left-hand operation mode.
- The method according to claim 2, wherein the sliding information of the user's finger during the sliding process further includes the sliding direction of the user's finger; and identifying the operation mode of the user according to the sliding information includes: setting a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; if the sliding direction of the user's finger is to the right, increasing the user's right-hand operation mode probability by the weight value w1, and if it is to the left, increasing the user's left-hand operation mode probability by the weight value w1; if the contact area between the user's finger and the screen gradually decreases from left to right, increasing the user's right-hand operation mode probability by the weight value w2, and if it gradually increases, increasing the user's left-hand operation mode probability by the weight value w2; if the sliding acceleration of the user's finger gradually increases from left to right, increasing the user's right-hand operation mode probability by the weight value w3, and if it gradually decreases, increasing the user's left-hand operation mode probability by the weight value w3; comparing the user's right-hand operation mode probability with the left-hand operation mode probability; and if the user's right-hand operation mode probability is greater than the left-hand operation mode probability, identifying that the operation mode of the user is the right-hand operation mode, and if it is less, identifying that the operation mode of the user is the left-hand operation mode.
- The method according to claim 2, wherein the sliding information of the user's finger during the sliding process further includes the area into which the pixel points passed by the user's finger fall during the sliding process; and identifying the operation mode of the user according to the sliding information includes: setting a weight value w0 for the area into which the pixel points passed by the user's finger fall during the sliding process, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; if the pixel points passed by the user's finger fall into the right area of the screen, increasing the user's right-hand operation mode probability by the weight value w0, and if they fall into the left area of the screen, increasing the user's left-hand operation mode probability by the weight value w0; if the contact area between the user's finger and the screen gradually decreases from left to right, increasing the user's right-hand operation mode probability by the weight value w2, and if it gradually increases, increasing the user's left-hand operation mode probability by the weight value w2; if the sliding acceleration of the user's finger gradually increases from left to right, increasing the user's right-hand operation mode probability by the weight value w3, and if it gradually decreases, increasing the user's left-hand operation mode probability by the weight value w3; comparing the user's right-hand operation mode probability with the left-hand operation mode probability; and if the user's right-hand operation mode probability is greater than the left-hand operation mode probability, identifying that the operation mode of the user is the right-hand operation mode, and if it is less, identifying that the operation mode of the user is the left-hand operation mode.
- The method according to any one of claims 5, 6, and 7, wherein the weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the user's finger slide on the screen.
- The method according to any one of claims 1 to 7, wherein the action of the user's finger sliding on the screen is an unlocking action of the screen.
- The method according to any one of claims 1 to 7, further comprising, after identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process: automatically switching the display mode of the operation interface of the handheld device according to the operation mode of the user.
- A handheld device, comprising a detection module, an acquisition module, and an identification module, wherein: the detection module is configured to detect whether a user's finger slides on a screen of the handheld device; the acquisition module is configured to, when the detection module detects that the user's finger slides on the screen of the handheld device, acquire sliding information of the user's finger during the sliding process; and the identification module is configured to identify the operation mode of the user according to the sliding information acquired by the acquisition module, where the operation mode includes a left-hand operation mode and a right-hand operation mode.
- The handheld device according to claim 11, wherein the sliding information of the user's finger during the sliding process includes the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both.
- The handheld device according to claim 12, wherein the identification module includes a first judging unit and a first identifying unit; the first judging unit is configured to judge whether the contact area between the user's finger and the screen gradually decreases from left to right; and the first identifying unit is configured to identify that the operation mode of the user is the right-hand operation mode when the judging result of the first judging unit is that the contact area gradually decreases from left to right, and to identify that the operation mode of the user is the left-hand operation mode when the judging result is that the contact area gradually increases from left to right.
- The handheld device according to claim 12, wherein the identification module includes a second judging unit and a second identifying unit; the second judging unit is configured to judge whether the sliding acceleration of the user's finger gradually increases from left to right; and the second identifying unit is configured to identify that the operation mode of the user is the right-hand operation mode when the judging result of the second judging unit is that the sliding acceleration gradually increases from left to right, and to identify that the operation mode of the user is the left-hand operation mode when the judging result is that the sliding acceleration gradually decreases from left to right.
- The handheld device according to claim 12, wherein the identification module includes a first setting unit, a first determining unit, a first comparing unit, and a third identifying unit; the first setting unit is configured to set a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; the first determining unit is configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area gradually decreases from left to right, and that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right; the first determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually increases from left to right, and that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right; the first comparing unit is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and the third identifying unit is configured to identify that the operation mode of the user is the right-hand operation mode when the comparison result of the first comparing unit is that the user's right-hand operation mode probability is greater than the left-hand operation mode probability, and to identify that the operation mode of the user is the left-hand operation mode when the comparison result is that the user's right-hand operation mode probability is less than the left-hand operation mode probability.
- The handheld device according to claim 12, wherein the sliding information of the user's finger during the sliding process further includes the sliding direction of the user's finger; the identification module includes a second setting unit, a second determining unit, a second comparing unit, and a fourth identifying unit; the second setting unit is configured to set a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; the second determining unit is configured to determine that the user's right-hand operation mode probability increases by the weight value w1 when the sliding direction of the user's finger is to the right, and that the user's left-hand operation mode probability increases by the weight value w1 when the sliding direction is to the left; the second determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area gradually decreases from left to right, and that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right; the second determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually increases from left to right, and that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right; the second comparing unit is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and the fourth identifying unit is configured to identify that the operation mode of the user is the right-hand operation mode when the comparison result of the second comparing unit is that the user's right-hand operation mode probability is greater than the left-hand operation mode probability, and to identify that the operation mode of the user is the left-hand operation mode when the comparison result is that the user's right-hand operation mode probability is less than the left-hand operation mode probability.
- The handheld device according to claim 12, wherein the sliding information of the user's finger during the sliding process further includes the area into which the pixel points passed by the user's finger fall during the sliding process; the identification module includes a third setting unit, a third determining unit, a third comparing unit, and a fifth identifying unit; the third setting unit is configured to set a weight value w0 for the area into which the pixel points passed by the user's finger fall during the sliding process, a weight value w2 for the change in the contact area between the user's finger and the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; the third determining unit is configured to determine that the user's right-hand operation mode probability increases by the weight value w0 when the pixel points passed by the user's finger fall into the right area of the screen, and that the user's left-hand operation mode probability increases by the weight value w0 when they fall into the left area of the screen; the third determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w2 when the contact area gradually decreases from left to right, and that the user's left-hand operation mode probability increases by the weight value w2 when the contact area gradually increases from left to right; the third determining unit is further configured to determine that the user's right-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually increases from left to right, and that the user's left-hand operation mode probability increases by the weight value w3 when the sliding acceleration gradually decreases from left to right; the third comparing unit is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and the fifth identifying unit is configured to identify that the operation mode of the user is the right-hand operation mode when the comparison result of the third comparing unit is that the user's right-hand operation mode probability is greater than the left-hand operation mode probability, and to identify that the operation mode of the user is the left-hand operation mode when the comparison result is that the user's right-hand operation mode probability is less than the left-hand operation mode probability.
- The handheld device according to any one of claims 15, 16, and 17, wherein the weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the user's finger slide on the screen.
- The handheld device according to any one of claims 11 to 17, wherein the action of the user's finger sliding on the screen is an unlocking action of the screen.
- The handheld device according to any one of claims 11 to 17, further comprising a switching module configured to automatically switch the display mode of the operation interface of the handheld device according to the operation mode of the user.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020197008553A KR20190035938A (ko) | 2014-03-31 | 2015-02-09 | Method for identifying a user operation mode on a handheld device, and handheld device
KR1020167030111A KR101963782B1 (ko) | 2014-03-31 | 2015-02-09 | Method for identifying a user operation mode on a handheld device, and handheld device
JP2016559832A JP6272502B2 (ja) | 2014-03-31 | 2015-02-09 | Method for identifying a user operation mode on a portable device, and portable device
EP15772243.0A EP3118733B1 (en) | 2014-03-31 | 2015-02-09 | Method for recognizing operation mode of user on handheld device, and handheld device |
US15/279,733 US10444951B2 (en) | 2014-03-31 | 2016-09-29 | Method and device for identifying a left-hand or a right-hand mode of operation on a user handheld device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410126867.1 | 2014-03-31 | ||
CN201410126867.1A CN103870199B (zh) | 2014-03-31 | 2014-03-31 | Method for identifying a user operation mode on a handheld device, and handheld device
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/279,733 Continuation US10444951B2 (en) | 2014-03-31 | 2016-09-29 | Method and device for identifying a left-hand or a right-hand mode of operation on a user handheld device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015149588A1 true WO2015149588A1 (zh) | 2015-10-08 |
Family
ID=50908787
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/072531 WO2015149588A1 (zh) | 2015-02-09 | Method for identifying a user operation mode on a handheld device, and handheld device
Country Status (7)
Country | Link |
---|---|
US (1) | US10444951B2 (zh) |
EP (1) | EP3118733B1 (zh) |
JP (1) | JP6272502B2 (zh) |
KR (2) | KR101963782B1 (zh) |
CN (1) | CN103870199B (zh) |
TW (1) | TW201602867A (zh) |
WO (1) | WO2015149588A1 (zh) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104281367A (zh) * | 2014-09-28 | 2015-01-14 | 北京数字天域科技股份有限公司 | Mobile terminal system management method and apparatus
CN105511774A (zh) * | 2014-10-17 | 2016-04-20 | 深圳Tcl新技术有限公司 | Display method and apparatus for a display terminal interface
CN105700782A (zh) * | 2014-11-25 | 2016-06-22 | 中兴通讯股份有限公司 | Method and apparatus for adjusting a virtual key layout, and mobile terminal
CN106155525A (zh) * | 2015-04-14 | 2016-11-23 | 中兴通讯股份有限公司 | Method and apparatus for identifying a terminal operation manner
CN106210242A (zh) * | 2015-04-29 | 2016-12-07 | 宇龙计算机通信科技(深圳)有限公司 | Antenna switching method, apparatus, and terminal
CN105260121A (zh) * | 2015-10-23 | 2016-01-20 | 东莞酷派软件技术有限公司 | Terminal control method, terminal control apparatus, and terminal
CN107346171A (zh) * | 2016-05-04 | 2017-11-14 | 深圳市中兴微电子技术有限公司 | Operation mode switching method and apparatus
WO2019056393A1 (zh) | 2017-09-25 | 2019-03-28 | 华为技术有限公司 | Terminal interface display method and terminal
CN107704190B (zh) * | 2017-11-06 | 2020-07-10 | Oppo广东移动通信有限公司 | Gesture recognition method and apparatus, terminal, and storage medium
CN110769096A (zh) * | 2019-10-21 | 2020-02-07 | Oppo(重庆)智能科技有限公司 | Motor vibration method, terminal, and storage medium
CN111078087A (zh) * | 2019-11-25 | 2020-04-28 | 深圳传音控股股份有限公司 | Mobile terminal, control mode switching method, and computer-readable storage medium
CN113553568B (zh) * | 2020-04-23 | 2024-06-18 | 京东科技控股股份有限公司 | Human-machine recognition method, slider verification method, apparatus, medium, and device
CN113204305B (zh) * | 2021-04-30 | 2023-06-09 | 网易(杭州)网络有限公司 | Grip mode detection method and apparatus for a mobile terminal, medium, and mobile terminal
TWI775474B (zh) * | 2021-06-07 | 2022-08-21 | 華碩電腦股份有限公司 | Portable electronic device and one-handed touch operation method thereof
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102624977A (zh) * | 2012-02-17 | 2012-08-01 | 深圳市金立通信设备有限公司 | System and method for switching a mobile phone interface according to the user's left-hand or right-hand usage habits
US20130212535A1 (en) * | 2012-02-13 | 2013-08-15 | Samsung Electronics Co., Ltd. | Tablet having user interface
CN103354581A (zh) * | 2013-06-14 | 2013-10-16 | 广东欧珀移动通信有限公司 | Method and system for automatically adjusting mobile phone controls according to left-hand or right-hand use
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008204402A (ja) * | 2007-02-22 | 2008-09-04 | Eastman Kodak Co | User interface device
US9134972B2 (en) * | 2008-04-02 | 2015-09-15 | Kyocera Corporation | User interface generation apparatus
JP2010277197A (ja) * | 2009-05-26 | 2010-12-09 | Sony Corp | Information processing apparatus, information processing method, and program
US20100310136A1 (en) | 2009-06-09 | 2010-12-09 | Sony Ericsson Mobile Communications Ab | Distinguishing right-hand input and left-hand input based on finger recognition
CN101599001B (zh) * | 2009-07-13 | 2012-11-14 | 青岛海信移动通信技术股份有限公司 | Touchscreen display interface updating method and multimedia electronic device
JP2011081646A (ja) * | 2009-10-08 | 2011-04-21 | Seiko Epson Corp | Information processing apparatus, information processing method, and program
WO2011158475A1 (ja) * | 2010-06-16 | 2011-12-22 | パナソニック株式会社 | Information input device, information input method, and program
KR101694787B1 (ko) * | 2010-06-30 | 2017-01-10 | 엘지전자 주식회사 | Mobile terminal and control method of mobile terminal
US8471869B1 (en) * | 2010-11-02 | 2013-06-25 | Google Inc. | Optimizing display orientation
CN102096513B (zh) * | 2011-02-23 | 2014-04-16 | 惠州Tcl移动通信有限公司 | Sliding method for a touchscreen and electronic device using the method
US20130100063A1 (en) * | 2011-04-20 | 2013-04-25 | Panasonic Corporation | Touch panel device
JP2013003949A (ja) * | 2011-06-20 | 2013-01-07 | Nec Casio Mobile Communications Ltd | Information terminal device, input method, and program
CN103176724A (zh) * | 2011-12-21 | 2013-06-26 | 富泰华工业(深圳)有限公司 | Electronic device and method with an operation interface switchable between left-hand and right-hand use modes
TWI493438B (zh) * | 2012-01-09 | 2015-07-21 | Amtran Technology Co Ltd | Touch control method
US8863042B2 (en) * | 2012-01-24 | 2014-10-14 | Charles J. Kulas | Handheld device with touch controls that reconfigure in response to the way a user operates the device
CN202475551U (zh) * | 2012-03-20 | 2012-10-03 | 深圳市金立通信设备有限公司 | System for adjusting a mobile phone display interface based on left-hand or right-hand usage habits
JP2013232118A (ja) * | 2012-04-27 | 2013-11-14 | Panasonic Corp | Operation processing device
CN102799268A (zh) | 2012-07-03 | 2012-11-28 | 广东欧珀移动通信有限公司 | Left-hand and right-hand recognition method for a handheld terminal
CN104520798B (zh) * | 2012-08-08 | 2018-07-03 | 日本电气株式会社 | Portable electronic device, and control method and program therefor
CN102830935B (zh) * | 2012-08-22 | 2015-05-06 | 上海华勤通讯技术有限公司 | Touch terminal and method for adjusting an operation interface
JP2014041498A (ja) * | 2012-08-23 | 2014-03-06 | Sanyo Electric Co Ltd | Communication terminal device
US8665238B1 (en) * | 2012-09-21 | 2014-03-04 | Google Inc. | Determining a dominant hand of a user of a computing device
US8782549B2 (en) * | 2012-10-05 | 2014-07-15 | Google Inc. | Incremental feature-based gesture-keyboard decoding
EP2720172A1 (en) * | 2012-10-12 | 2014-04-16 | Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO | Video access system and method based on action type detection
TWM461837U (zh) * | 2012-10-15 | 2013-09-11 | Guan-Ru Wang | Sliding unlock device for a mobile device
KR101995278B1 (ko) * | 2012-10-23 | 2019-07-02 | 삼성전자 주식회사 | UI display method and apparatus for a touch device
US8769431B1 (en) * | 2013-02-28 | 2014-07-01 | Roy Varada Prasad | Method of single-handed software operation of large form factor mobile electronic devices
DE102013011689A1 (de) * | 2013-07-12 | 2015-01-15 | e.solutions GmbH | Method and device for processing touch signals of a touchscreen
2014
- 2014-03-31 CN CN201410126867.1A patent/CN103870199B/zh active Active
2015
- 2015-02-09 WO PCT/CN2015/072531 patent/WO2015149588A1/zh active Application Filing
- 2015-02-09 KR KR1020167030111A patent/KR101963782B1/ko active IP Right Grant
- 2015-02-09 EP EP15772243.0A patent/EP3118733B1/en active Active
- 2015-02-09 KR KR1020197008553A patent/KR20190035938A/ko not_active Application Discontinuation
- 2015-02-09 JP JP2016559832A patent/JP6272502B2/ja active Active
- 2015-03-31 TW TW104110426A patent/TW201602867A/zh unknown
2016
- 2016-09-29 US US15/279,733 patent/US10444951B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130212535A1 (en) * | 2012-02-13 | 2013-08-15 | Samsung Electronics Co., Ltd. | Tablet having user interface |
CN102624977A (zh) * | 2012-02-17 | 2012-08-01 | 深圳市金立通信设备有限公司 | System and method for switching a mobile phone interface according to the user's left-hand or right-hand usage habits
CN103354581A (zh) * | 2013-06-14 | 2013-10-16 | 广东欧珀移动通信有限公司 | Method and system for automatically adjusting mobile phone controls according to left-hand or right-hand use
Non-Patent Citations (1)
Title |
---|
See also references of EP3118733A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR20160136446A (ko) | 2016-11-29 |
US20170017799A1 (en) | 2017-01-19 |
JP2017518553A (ja) | 2017-07-06 |
CN103870199B (zh) | 2017-09-29 |
CN103870199A (zh) | 2014-06-18 |
JP6272502B2 (ja) | 2018-01-31 |
TW201602867A (zh) | 2016-01-16 |
EP3118733A1 (en) | 2017-01-18 |
EP3118733A4 (en) | 2017-03-08 |
KR101963782B1 (ko) | 2019-03-29 |
TWI560595B (zh) | 2016-12-01 |
US10444951B2 (en) | 2019-10-15 |
KR20190035938A (ko) | 2019-04-03 |
EP3118733B1 (en) | 2018-10-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015149588A1 (en) | Method for identifying a user operation mode on a handheld device, and handheld device | |
WO2016104922A1 (en) | Wearable electronic device |
WO2016060397A1 (en) | Method and apparatus for processing screen using device | |
WO2018030594A1 (en) | Mobile terminal and method for controlling the same | |
WO2017034116A1 (en) | Mobile terminal and method for controlling the same | |
WO2016209020A1 (en) | Image processing apparatus and image processing method | |
WO2017043857A1 (en) | Method for providing an application, and electronic device therefor |
WO2016129784A1 (en) | Image display apparatus and method | |
WO2016072674A1 (en) | Electronic device and method of controlling the same | |
WO2012144666A1 (en) | Display device and control method thereof |
WO2014035113A1 (en) | Method of controlling touch function and an electronic device thereof | |
WO2014035054A1 (en) | Method and apparatus for controlling zoom function in an electronic device | |
WO2016074235A1 (en) | Control method and apparatus for a moving object, and mobile device |
WO2019168238A1 (en) | Mobile terminal and control method therefor |
WO2015180013A1 (en) | Touch operation method and apparatus for a terminal |
WO2014027818A2 (en) | Electronic device for displaying touch region to be shown and method thereof | |
WO2017007090A1 (en) | Display device and method of controlling therefor | |
WO2015012629A1 (en) | Method of processing input and electronic device thereof | |
WO2021025534A1 (en) | Electronic device for providing a camera preview image, and operating method thereof |
WO2016089074A1 (en) | Device and method for receiving character input through the same | |
WO2021054784A1 (en) | Electronic device and method for changing user interface according to user input | |
WO2017159931A1 (en) | Electronic device including touch panel and method of controlling the electronic device | |
WO2017206892A1 (en) | Sensor processing method and apparatus for a mobile terminal, storage medium, and electronic device |
WO2016195197A1 (en) | Pen terminal and method for controlling the same | |
WO2020149600A1 (en) | Electronic device and operation method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15772243; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2016559832; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
REEP | Request for entry into the european phase |
Ref document number: 2015772243; Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2015772243; Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 20167030111; Country of ref document: KR; Kind code of ref document: A |