WO2015149588A1 - Method for identifying a user operation mode on a handheld device, and handheld device - Google Patents

Method for identifying a user operation mode on a handheld device, and handheld device

Info

Publication number
WO2015149588A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
operation mode
finger
sliding
hand operation
Application number
PCT/CN2015/072531
Other languages
English (en)
French (fr)
Inventor
Fang Yuan (方元)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to KR1020197008553A (published as KR20190035938A)
Priority to KR1020167030111A (published as KR101963782B1)
Priority to JP2016559832A (published as JP6272502B2)
Priority to EP15772243.0A (published as EP3118733B1)
Publication of WO2015149588A1
Priority to US15/279,733 (published as US10444951B2)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • The present invention relates to the field of human-computer interaction technologies, and in particular, to a method for identifying a user operation mode on a handheld device, and a handheld device.
  • UI: User Interface.
  • Many applications need the user operation mode, such as the left-hand operation mode or the right-hand operation mode, to be set manually, and then determine the UI presentation mode according to the set operation mode.
  • In the prior art, there are two ways for application software to obtain the user operation mode.
  • One is manual setting in the application software: the application software usually provides the two single-hand operation modes, and the user sets one of them manually before using the application software.
  • The other is to obtain the user's operation mode automatically, which can in turn be done in two ways: the first is to identify the operation mode by means of a sensor in the mobile phone; the second is to identify the left-hand or right-hand operation mode by calculating the slope of the user's slide on the screen.
  • During long-term research and development, the inventor of the present application found that using a dedicated sensor to identify the operation mode adds extra cost, and the accuracy of the judgment depends on the sensitivity of the sensor, while the accuracy of calculating the slope of the user's slide on the screen is not high and is also strongly affected by individual differences.
  • The technical problem to be solved by the present invention is to provide a method for identifying a user operation mode on a handheld device, and a handheld device, which can enrich the ways of identifying the user operation mode and increase the accuracy of the recognition without additional cost.
  • The present invention provides a method for identifying a user operation mode on a handheld device, including: when it is detected that a user's finger slides on a screen of the handheld device, acquiring sliding information of the user's finger during the sliding process;
  • and identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process, the operation modes including a left-hand operation mode and a right-hand operation mode.
  • The sliding information of the user's finger during the sliding process includes either or both of: a change in the contact area between the user's finger and the screen during the sliding, and a change in the sliding acceleration.
  • Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: if the contact area between the user's finger and the screen gradually decreases from left to right, identifying the operation mode of the user as the right-hand operation mode; if the contact area between the user's finger and the screen gradually increases from left to right, identifying the operation mode of the user as the left-hand operation mode.
  • Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process may also include: if the sliding acceleration of the user's finger gradually increases from left to right, identifying the operation mode of the user as the right-hand operation mode; if the sliding acceleration of the user's finger gradually decreases from left to right, identifying the operation mode of the user as the left-hand operation mode.
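  • The two rules above can be checked directly per signal. The following is a minimal sketch of such a check (illustrative only, not taken from the patent); it assumes the touch samples of one slide have already been ordered from left to right across the screen, with `areas_left_to_right` and `accels_left_to_right` being the corresponding contact-area and sliding-acceleration sequences.

```python
def trend(values):
    """Return +1 if the values grow overall, -1 if they shrink, 0 otherwise.
    A simple first-vs-last comparison stands in for a proper trend fit."""
    if len(values) < 2:
        return 0
    delta = values[-1] - values[0]
    return (delta > 0) - (delta < 0)

def mode_from_contact_area(areas_left_to_right):
    # Contact area shrinking from left to right suggests the right hand.
    t = trend(areas_left_to_right)
    return "right" if t < 0 else "left" if t > 0 else "unknown"

def mode_from_acceleration(accels_left_to_right):
    # Sliding acceleration growing from left to right suggests the right hand.
    t = trend(accels_left_to_right)
    return "right" if t > 0 else "left" if t < 0 else "unknown"
```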
  • Identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process may further include: setting the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
  • if the contact area between the user's finger and the screen gradually decreases from left to right, determining that the user's right-hand operation mode probability is increased by the weight value w2, and if the contact area between the user's finger and the screen gradually increases from left to right, determining that the user's left-hand operation mode probability is increased by the weight value w2; if the sliding acceleration of the user's finger gradually increases from left to right, determining that the user's right-hand operation mode probability is increased by the weight value w3, and if the sliding acceleration of the user's finger gradually decreases from left to right, determining that the user's left-hand operation mode probability is increased by the weight value w3; the two probabilities are then compared, and the operation mode of the user is identified according to the comparison result.
  • The sliding information of the user's finger during the sliding process may further include the sliding direction of the user's finger.
  • In that case, identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: setting the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
  • if the sliding direction of the user's finger is to the right, determining that the user's right-hand operation mode probability is increased by the weight value w1, and if the sliding direction of the user's finger is to the left, determining that the user's left-hand operation mode probability is increased by the weight value w1;
  • if the contact area between the user's finger and the screen gradually decreases from left to right, determining that the user's right-hand operation mode probability is increased by the weight value w2, and if it gradually increases from left to right, determining that the user's left-hand operation mode probability is increased by the weight value w2; the weight value w3 is applied in the same way according to the change in the sliding acceleration, and the resulting probabilities are then compared to identify the operation mode of the user.
  • The sliding information of the user's finger during the sliding process may further include the area into which the pixel points passed by the user's finger during the sliding process fall.
  • In that case, identifying the operation mode of the user according to the sliding information of the user's finger during the sliding process includes: setting the weight value of the area into which the pixel points passed by the user's finger during the sliding process fall to w0, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
  • if the pixel points passed by the user's finger fall into the right area of the screen, determining that the user's right-hand operation mode probability is increased by the weight value w0, and if the pixel points passed by the user's finger fall into the left area of the screen, determining that the user's left-hand operation mode probability is increased by the weight value w0;
  • if the contact area between the user's finger and the screen gradually decreases from left to right, determining that the user's right-hand operation mode probability is increased by the weight value w2, and if it gradually increases from left to right, determining that the user's left-hand operation mode probability is increased by the weight value w2; the weight value w3 is applied in the same way according to the change in the sliding acceleration, and the resulting probabilities are then compared to identify the operation mode of the user.
  • The weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the slide that the user's finger makes on the screen.
  • With reference to any one of the first to sixth possible implementations of the first aspect, in the eighth possible implementation manner of the first aspect, the action of the user's finger sliding on the screen is the unlocking action of the screen.
  • After the operation mode of the user is identified, the method further includes: automatically switching the display mode of the operation interface of the handheld device according to the operation mode of the user.
  • The present invention further provides a handheld device, the handheld device including: a detection module, an acquisition module, and an identification module. The detection module is configured to detect whether a user's finger slides on a screen of the handheld device; the acquisition module is configured to acquire, when the detection module detects that the user's finger slides on the screen of the handheld device, sliding information of the user's finger during the sliding process; and the identification module is configured to identify the operation mode of the user according to the sliding information, acquired by the acquisition module, of the user's finger during the sliding process, the operation modes including a left-hand operation mode and a right-hand operation mode.
  • The sliding information of the user's finger during the sliding process includes either or both of: a change in the contact area between the user's finger and the screen during the sliding, and a change in the sliding acceleration.
  • The identification module includes: a first determining unit and a first identifying unit. The first determining unit is configured to determine whether the contact area between the user's finger and the screen gradually decreases from left to right; the first identifying unit is configured to identify the operation mode of the user as the right-hand operation mode when the determination result of the first determining unit is that the contact area between the user's finger and the screen gradually decreases from left to right, and to identify the operation mode of the user as the left-hand operation mode when the determination result of the first determining unit is that the contact area between the user's finger and the screen gradually increases from left to right.
  • Alternatively, the identification module includes: a second determining unit and a second identifying unit. The second determining unit is configured to determine whether the sliding acceleration of the user's finger gradually increases from left to right; the second identifying unit is configured to identify the operation mode of the user as the right-hand operation mode when the determination result of the second determining unit is that the sliding acceleration of the user's finger gradually increases from left to right, and to identify the operation mode of the user as the left-hand operation mode when the determination result of the second determining unit is that the sliding acceleration of the user's finger gradually decreases from left to right.
  • Alternatively, the identification module includes: a first setting unit, a first determining unit, a first comparing unit, and a third identifying unit;
  • the first setting unit is configured to set the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
  • the first determining unit is configured to determine that the user's right-hand operation mode probability is increased by the weight value w2 when the contact area between the user's finger and the screen gradually decreases from left to right, and that the user's left-hand operation mode probability is increased by the weight value w2 when the contact area gradually increases from left to right;
  • the first determining unit is further configured to determine that the user's right-hand operation mode probability is increased by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and that the user's left-hand operation mode probability is increased by the weight value w3 when the sliding acceleration of the user's finger gradually decreases from left to right;
  • the first comparing unit is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability; the third identifying unit is configured to identify the operation mode of the user as the right-hand operation mode when the comparison result of the first comparing unit is that the user's right-hand operation mode probability is greater than the left-hand operation mode probability, and to identify the operation mode of the user as the left-hand operation mode when the comparison result of the first comparing unit is that the user's right-hand operation mode probability is less than the left-hand operation mode probability.
  • If the sliding information of the user's finger during the sliding process further includes the sliding direction of the user's finger,
  • the identification module includes: a second setting unit, a second determining unit, a second comparing unit, and a fourth identifying unit;
  • the second setting unit is configured to set the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
  • the second determining unit is configured to determine that the user's right-hand operation mode probability is increased by the weight value w1 when the sliding direction of the user's finger is to the right, and that the user's left-hand operation mode probability is increased by the weight value w1 when the sliding direction of the user's finger is to the left;
  • the second determining unit is further configured to apply the weight value w2 according to the change in the contact area between the user's finger and the screen and the weight value w3 according to the change in the sliding acceleration in the same way; the second comparing unit and the fourth identifying unit then compare the two probabilities and identify the operation mode of the user according to the comparison result.
  • If the sliding information of the user's finger during the sliding process further includes the area into which the pixel points passed by the user's finger during the sliding process fall,
  • the identification module includes: a third setting unit, a third determining unit, a third comparing unit, and a fifth identifying unit; the third setting unit is configured to set the weight value of the area where the pixel points passed by the user's finger during the sliding process are located to w0, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
  • the third determining unit is configured to determine that the user's right-hand operation mode probability is increased by the weight value w0 when the pixel points passed by the user's finger fall into the right area of the screen, and that the user's left-hand operation mode probability is increased by the weight value w0 when the pixel points passed by the user's finger fall into the left area of the screen; the third determining unit is further configured to determine that the user's right-hand operation mode probability is increased by the weight value w2 when the contact area between the user's finger and the screen gradually decreases from left to right, and that the user's left-hand operation mode probability is increased by the weight value w2 when the contact area gradually increases from left to right; the third determining unit is further configured to determine that the user's right-hand operation mode probability is increased by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and that the user's left-hand operation mode probability is increased by the weight value w3 when the sliding acceleration gradually decreases from left to right; the third comparing unit and the fifth identifying unit then compare the two probabilities and identify the operation mode of the user according to the comparison result.
  • The weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the slide that the user's finger makes on the screen.
  • With reference to any one of the first to sixth possible implementation manners of the second aspect, in the seventh possible implementation manner of the second aspect, the action of the user's finger sliding on the screen is the unlocking action of the screen.
  • The handheld device further includes a switching module, configured to automatically switch the display mode of the operation interface of the handheld device according to the operation mode of the user.
  • The beneficial effects of the invention are as follows: when it is detected that the user's finger slides on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired;
  • the operation mode of the user is then identified according to the sliding information of the user's finger during the sliding process, the operation modes including a left-hand operation mode and a right-hand operation mode. In this way, no additional cost is incurred, the ways of identifying the user's operation mode are enriched, and the accuracy of the recognition is increased.
  • FIG. 1 is a schematic structural view of a handheld device according to the present invention.
  • FIG. 2 is a flow chart of an embodiment of a method for identifying a user operation mode on a handheld device of the present invention
  • FIG. 3 is a flow chart of another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
  • FIG. 4 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention
  • FIG. 5 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
  • FIG. 6 is a schematic diagram showing a change in contact area between a user's right finger and a screen in a method for recognizing a user operation mode on a handheld device;
  • FIG. 7 is a schematic diagram showing a change in contact area between a user's left finger and a screen in a method for identifying a user operation mode on the handheld device of the present invention
  • FIG. 8 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
  • FIG. 9 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
  • FIG. 10 is a schematic diagram of sliding unlocking in a method for identifying a user operation mode on a handheld device of the present invention
  • FIG. 11 is another schematic diagram of sliding unlocking in a method for identifying a user operation mode on a handheld device of the present invention.
  • FIG. 12 is another schematic diagram of sliding unlocking in a method for identifying a user operation mode on a handheld device of the present invention
  • FIG. 13 is a flow chart of still another embodiment of a method for identifying a user operation mode on a handheld device of the present invention.
  • FIG. 14 is a schematic structural diagram of an embodiment of a handheld device according to the present invention.
  • FIG. 15 is a schematic structural view of another embodiment of a handheld device according to the present invention.
  • FIG. 16 is a schematic structural view of still another embodiment of the handheld device of the present invention.
  • FIG. 17 is a schematic structural view of still another embodiment of the handheld device of the present invention.
  • FIG. 18 is a schematic structural view of still another embodiment of the handheld device of the present invention.
  • FIG. 19 is a schematic structural view of still another embodiment of the handheld device of the present invention.
  • FIG. 20 is a schematic structural view of still another embodiment of the handheld device of the present invention.
  • FIG. 21 is a schematic diagram showing the physical structure of an embodiment of a handheld device according to the present invention.
  • UI: User Interface.
  • Many applications need to determine how the UI is presented based on the user's operation mode when the handheld device is operated with one hand.
  • FIG. 1 is a schematic structural view of a handheld device according to the present invention.
  • FIG. 1 is taken as an example to describe the logical structure of the handheld device to which the method for identifying a user operation mode on a handheld device provided by the embodiments of the present invention is applied.
  • The handheld device may specifically be a smartphone.
  • The hardware layer of the handheld device includes a CPU and a GPU, and may further include a memory, an input/output device, a memory controller, a network interface, and the like.
  • The input device may include a touch screen and the like, and the output device may include display devices such as an LCD, a CRT, a holographic display, or a projector.
  • The handheld device further includes a driver layer, a framework layer, and an application layer.
  • the driver layer may include a CPU driver, a GPU driver, a display controller driver, and the like.
  • The framework layer may include system services (System Service), web services (Web Service), customer services (Customer Service), and so on.
  • the application layer may include a desktop, a media player, a browser, and the like.
  • FIG. 2 is a flowchart of an embodiment of a method for identifying a user operation mode on a handheld device according to the present invention, including:
  • Step S101: When it is detected that the user's finger slides on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired.
  • Step S102: Identify the user's operation mode according to the sliding information of the user's finger during the sliding process, the operation modes including: a left-hand operation mode and a right-hand operation mode.
  • the operation mode includes: a left-hand operation mode and a right-hand operation mode; the left-hand operation mode is a mode of operating the handheld device by the left hand, and the right-hand operation mode is a mode of operating the handheld device by the right hand, that is, the left-hand operation mode indicates that the user operates the handheld device with the left hand.
  • the right hand mode of operation indicates that the user is operating the handheld device with his right hand.
  • The sliding information is naturally generated when the user's finger slides on the screen of the handheld device; as long as this sliding information is captured or collected, the user's operation mode can be recognized from it without any additional sensor.
  • In the embodiment of the present invention, when it is detected that the user's finger slides on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired, and the operation mode of the user is identified according to that sliding information. In this way, the cost of the handheld device for recognizing the user's operation mode can be reduced, and the accuracy of the recognition can be improved.
  • FIG. 3 is a flowchart of another embodiment of a method for identifying a user operation mode on a handheld device of the present invention, including:
  • Step S201: When it is detected that the user's finger slides on the screen, acquire either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration.
  • During the sliding, the contact area between the finger and the screen is constantly changing, and the acceleration of the finger's slide is also constantly changing; the change in the contact area between the finger and the screen and/or the change in the sliding acceleration can therefore be acquired.
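  • Neither the contact area nor the acceleration is reported as a single number; both have to be assembled from the stream of touch samples produced during the slide. The sketch below is illustrative only (the `TouchSample` structure and its field names are assumptions, not part of the patent): it keeps the per-sample contact areas as reported and derives a rough acceleration sequence from positions and timestamps.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TouchSample:
    t: float     # timestamp in seconds
    x: float     # horizontal position on the screen, in pixels
    y: float     # vertical position on the screen, in pixels
    area: float  # contact area reported by the touch screen for this sample

def accelerations(samples: List[TouchSample]) -> List[float]:
    """Approximate the sliding acceleration along the track from successive
    positions and timestamps (finite differences of the speed)."""
    speeds = []
    for a, b in zip(samples, samples[1:]):
        dt = b.t - a.t
        dist = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
        speeds.append(dist / dt if dt > 0 else 0.0)
    accels = []
    for (a, b), v0, v1 in zip(zip(samples, samples[2:]), speeds, speeds[1:]):
        dt = b.t - a.t
        accels.append((v1 - v0) / dt if dt > 0 else 0.0)
    return accels
```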
  • Step S202: Identify the operation mode of the user according to the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both.
  • Step S202 may include: step S202a and step S202b; or step S202c and step S202d; or step S202e, step S202f, step S202g, step S202h, and step S202i.
  • In the first case, step S202 includes step S202a and step S202b.
  • Step S202a: If the contact area between the user's finger and the screen gradually decreases from left to right, the operation mode of the user is identified as the right-hand operation mode.
  • Step S202b: If the contact area between the user's finger and the screen gradually increases from left to right, the operation mode of the user is identified as the left-hand operation mode.
  • In the second case, step S202 includes step S202c and step S202d.
  • Step S202c: If the sliding acceleration of the user's finger gradually increases from left to right, the operation mode of the user is identified as the right-hand operation mode.
  • Step S202d: If the sliding acceleration of the user's finger gradually decreases from left to right, the operation mode of the user is identified as the left-hand operation mode.
  • In the third case, step S202 includes step S202e, step S202f, step S202g, step S202h, and step S202i.
  • Step S202e: Set the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
  • There is no particular order between step S201 and step S202e.
  • Step S202f: If the contact area between the user's finger and the screen gradually decreases from left to right, determine that the user's right-hand operation mode probability is increased by the weight value w2; if the contact area gradually increases from left to right, determine that the user's left-hand operation mode probability is increased by the weight value w2.
  • Step S202g: If the sliding acceleration of the user's finger gradually increases from left to right, determine that the user's right-hand operation mode probability is increased by the weight value w3; if the sliding acceleration gradually decreases from left to right, determine that the user's left-hand operation mode probability is increased by the weight value w3.
  • There is no particular order between step S202f and step S202g.
  • Step S202h: Compare the user's right-hand operation mode probability with the left-hand operation mode probability.
  • Step S202i: If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the operation mode of the user is identified as the right-hand operation mode; if it is less than the left-hand operation mode probability, the operation mode of the user is identified as the left-hand operation mode.
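  • Steps S202e to S202i amount to a small weighted vote between the two cues. The following sketch is one possible reading of these steps (the default weight values are placeholders chosen for illustration, not values given by the patent); the trend arguments are the ±1 trend indicators from the earlier sketch.

```python
def identify_mode_weighted(area_trend, accel_trend, w2=0.6, w3=0.4):
    """area_trend / accel_trend: +1 if the quantity grows from left to right,
    -1 if it shrinks, 0 if there is no clear trend.
    Returns 'right', 'left', or 'unknown' when the vote is tied."""
    right_p, left_p = 0.0, 0.0
    # S202f: contact area shrinking towards the right suggests the right hand.
    if area_trend < 0:
        right_p += w2
    elif area_trend > 0:
        left_p += w2
    # S202g: acceleration growing towards the right suggests the right hand.
    if accel_trend > 0:
        right_p += w3
    elif accel_trend < 0:
        left_p += w3
    # S202h / S202i: compare the two accumulated probabilities.
    if right_p > left_p:
        return "right"
    if right_p < left_p:
        return "left"
    return "unknown"
```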
  • When the device is held in the left hand (the left-hand operation mode) and the thumb slides on the screen, the thumb goes from a straightened state reaching towards the right to a curved state near the left,
  • so the contact area between the thumb and the screen gradually becomes smaller from right to left (in other words, it gradually increases from left to right).
  • The principle is shown in FIG. 7,
  • where 21 denotes the left hand, 22 denotes the contact area with the screen when the left thumb is straightened, and the contact area when the left thumb is bent is also shown.
  • Correspondingly, if the contact area between the user's finger and the screen gradually decreases from left to right or the sliding acceleration of the user's finger gradually increases from left to right, the operation mode of the user is identified as the right-hand operation mode; if the contact area between the user's finger and the screen gradually increases from left to right or the sliding acceleration of the user's finger gradually decreases from left to right, the operation mode of the user is identified as the left-hand operation mode.
  • When both cues are used, the weight values of the two are set separately in advance. If the contact area between the user's finger and the screen gradually decreases from left to right, the user's right-hand operation mode probability is increased by the weight value w2; if the contact area gradually increases from left to right, the user's left-hand operation mode probability is increased by the weight value w2.
  • If the sliding acceleration of the user's finger gradually increases from left to right, the user's right-hand operation mode probability is increased by the weight value w3; if the sliding acceleration gradually decreases from left to right, the user's left-hand operation mode probability is increased by the weight value w3.
  • The user's right-hand operation mode probability and left-hand operation mode probability are then compared, and the user's operation mode is identified according to the comparison result.
  • Step S202 can thus identify the user's operation mode according to either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration.
  • Since these changes follow different patterns for the left hand and the right hand, the user's operation mode can be recognized once the corresponding pattern is obtained.
  • The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, and identifies the user's operation mode according to the acquired change or changes.
  • In this way, on the one hand, no additional cost is required and the ways of recognizing the user's left-hand or right-hand operation are enriched; on the other hand, when both the change in the contact area and the change in the sliding acceleration are used to recognize the user's operation mode, the accuracy of the recognition can be increased.
  • When the user's finger slides to unlock the screen, the change in the contact area between the finger and the screen and the change in the sliding acceleration during that slide may be acquired, and either or both of them may then be used to identify the user's operation mode, that is, whether the user is in the left-hand operation mode or the right-hand operation mode.
  • In this way, the user's operation mode can be recognized during unlocking, and the user interface can be switched to the form that matches the user's operation mode (for example, a form convenient for left-hand operation, or a form convenient for right-hand operation) immediately, before the user performs the next operation, which further improves the user experience.
  • FIG. 8 and FIG. 9 are flowcharts of two further embodiments of the method for identifying a user operation mode on a handheld device according to the present invention; the specific contents are as follows:
  • Step S301: Detect that the user's finger slides on the screen.
  • Step S302: Acquire the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the sliding direction of the user's finger.
  • Regarding the sliding direction of the user's finger, the following applies:
  • the positions of the starting point and the end point of the finger's sliding track help to determine the sliding direction of the finger, and according to the sliding direction of the finger it is also possible to roughly determine which hand the user is using to hold the device.
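  • As a concrete illustration of this step, the sliding direction can be reduced to the sign of the horizontal displacement between the first and the last touch sample of the track. This is only a sketch under that assumption, reusing the hypothetical `TouchSample` structure introduced above.

```python
def sliding_direction(samples):
    """Return 'right' if the track ends to the right of where it started,
    'left' if it ends to the left, and 'none' for a (near-)vertical slide."""
    dx = samples[-1].x - samples[0].x
    if dx > 0:
        return "right"
    if dx < 0:
        return "left"
    return "none"
```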
  • Step S303: Set the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
  • Step S304: If the sliding direction of the user's finger is to the right, determine that the user's right-hand operation mode probability is increased by the weight value w1; if the sliding direction of the user's finger is to the left, determine that the user's left-hand operation mode probability is increased by the weight value w1.
  • Step S305: If the contact area between the user's finger and the screen gradually decreases from left to right, determine that the user's right-hand operation mode probability is increased by the weight value w2; if the contact area between the user's finger and the screen gradually increases from left to right, determine that the user's left-hand operation mode probability is increased by the weight value w2.
  • Step S306: If the sliding acceleration of the user's finger gradually increases from left to right, determine that the user's right-hand operation mode probability is increased by the weight value w3; if the sliding acceleration of the user's finger gradually decreases from left to right, determine that the user's left-hand operation mode probability is increased by the weight value w3.
  • There is no particular order among step S304, step S305, and step S306.
  • Step S307: Compare the user's right-hand operation mode probability with the left-hand operation mode probability.
  • Step S308: If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the operation mode of the user is identified as the right-hand operation mode; if the user's right-hand operation mode probability is less than the left-hand operation mode probability, the operation mode of the user is identified as the left-hand operation mode.
  • The sliding information of the user's finger during the sliding process in the embodiment of FIG. 9 includes: the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the area into which the pixel points passed by the user's finger during the sliding process fall.
  • Step S401: Detect that the user's finger slides on the screen.
  • Step S402: Acquire the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, and the area where the pixel points passed by the user's finger during the sliding process are located.
  • Step S403: Set the weight value of the area where the pixel points passed by the user's finger during the sliding process are located to w0, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
  • Step S404: If the pixel points passed by the user's finger fall into the right area of the screen, determine that the user's right-hand operation mode probability is increased by the weight value w0; if the pixel points passed by the user's finger fall into the left area of the screen, determine that the user's left-hand operation mode probability is increased by the weight value w0.
  • When the user holds the device with one hand, the range of motion of the finger is limited,
  • so the pixels passed during the sliding process are basically concentrated in a certain area: whichever hand holds the device, the pixels passed during the slide are concentrated in the area close to that hand. Therefore, according to the area where the pixel points passed by the user's finger during the sliding process are located, it is also possible to roughly determine which hand the user is using to hold the device.
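  • One simple way to turn this observation into the w0 vote is to compare the mean horizontal coordinate of the track with the vertical midline of the screen. This is a sketch under that assumption (the midline split and the `TouchSample` structure are illustrative choices, not prescribed by the patent):

```python
def passed_area(samples, screen_width):
    """Classify the slide as lying mostly in the 'left' or 'right' half of the screen."""
    mean_x = sum(s.x for s in samples) / len(samples)
    return "right" if mean_x > screen_width / 2 else "left"
```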
  • Step S405: If the contact area between the user's finger and the screen gradually decreases from left to right, determine that the user's right-hand operation mode probability is increased by the weight value w2; if the contact area between the user's finger and the screen gradually increases from left to right, determine that the user's left-hand operation mode probability is increased by the weight value w2.
  • Step S406: If the sliding acceleration of the user's finger gradually increases from left to right, determine that the user's right-hand operation mode probability is increased by the weight value w3; if the sliding acceleration of the user's finger gradually decreases from left to right, determine that the user's left-hand operation mode probability is increased by the weight value w3.
  • There is no particular order among step S404, step S405, and step S406.
  • Step S407: Compare the user's right-hand operation mode probability with the left-hand operation mode probability.
  • Step S408: If the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the operation mode of the user is identified as the right-hand operation mode; if the user's right-hand operation mode probability is less than the left-hand operation mode probability, the operation mode of the user is identified as the left-hand operation mode.
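  • Putting the embodiments of FIG. 8 and FIG. 9 together, all four cues (passed area, sliding direction, contact-area trend, and acceleration trend) can feed a single weighted comparison. The sketch below reuses the helper functions from the earlier sketches; the default weights and the tie handling are illustrative assumptions only.

```python
def identify_mode(samples, screen_width, w0=0.3, w1=0.2, w2=0.3, w3=0.2):
    """Weighted vote over the four cues; returns 'left', 'right', or 'unknown'."""
    direction = sliding_direction(samples)
    # Trends are computed in time order; if the finger moved right-to-left,
    # flip the sign so they read as "from left to right" across the screen.
    flip = -1 if direction == "left" else 1
    area_trend = flip * trend([s.area for s in samples])
    accel_trend = flip * trend(accelerations(samples))

    right_p = left_p = 0.0
    # w0: which half of the screen the track mostly lies in (FIG. 9, step S404)
    if passed_area(samples, screen_width) == "right":
        right_p += w0
    else:
        left_p += w0
    # w1: sliding direction (FIG. 8, step S304)
    if direction == "right":
        right_p += w1
    elif direction == "left":
        left_p += w1
    # w2: contact area shrinking from left to right suggests the right hand
    if area_trend < 0:
        right_p += w2
    elif area_trend > 0:
        left_p += w2
    # w3: sliding acceleration growing from left to right suggests the right hand
    if accel_trend > 0:
        right_p += w3
    elif accel_trend < 0:
        left_p += w3
    if right_p == left_p:
        return "unknown"
    return "right" if right_p > left_p else "left"
```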
  • When it is detected that the user's finger slides on the screen, there are four parameters that can be obtained: the area where the user's finger passes during the sliding process, the sliding direction of the user's finger, the change in the contact area between the user's finger and the screen during the sliding process, and the change in the sliding acceleration of the user's finger during the sliding process.
  • According to the actual application, some of these parameters can be obtained to comprehensively estimate the probabilities of left-hand and right-hand operation; the more parameters are acquired, the higher the accuracy of the recognition. The above embodiments only select several combinations, and other combinations are not described here.
  • The action of the user's finger sliding on the screen may be the unlocking action of the screen.
  • In this way, the left-hand or right-hand operation mode can be identified quickly and accurately without requiring any additional action from the user, and the result of the identification is applied throughout the unlocked period after unlocking.
  • For example, the weight value of the area where the pixel points passed by the user's finger during the sliding process are located is set to w0, the weight value of the sliding direction of the user's finger to w1, the weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and the weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3.
  • If the pixel points passed by the user's finger fall into the right area of the screen, the user's right-hand operation mode probability is increased by the weight value w0; if they fall into the left area of the screen, the user's left-hand operation mode probability is increased by the weight value w0.
  • If the sliding direction of the user's finger is to the right, the user's right-hand operation mode probability is increased by the weight value w1; if it is to the left, the user's left-hand operation mode probability is increased by the weight value w1.
  • If the contact area between the user's finger and the screen gradually decreases from left to right, the user's right-hand operation mode probability is increased by the weight value w2; if the contact area between the user's finger and the screen gradually increases from left to right, the user's left-hand operation mode probability is increased by the weight value w2.
  • If the sliding acceleration of the user's finger gradually increases from left to right, the user's right-hand operation mode probability is increased by the weight value w3; if the sliding acceleration of the user's finger gradually decreases from left to right, the user's left-hand operation mode probability is increased by the weight value w3.
  • After the probabilities are accumulated, if the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the operation mode of the user is identified as the right-hand operation mode; if the user's right-hand operation mode probability is less than the left-hand operation mode probability, the operation mode of the user is identified as the left-hand operation mode.
  • For instance, if the right-hand cues receive the weights w0, w2, and w3 while the sliding direction contributes w1 to the left hand, the right-hand operation mode probability is w0+w2+w3 and the left-hand operation mode probability is w1.
  • The sizes of w0+w2+w3 and w1 are then compared: if w0+w2+w3 is greater than w1, the operation mode of the user is identified as the right-hand operation mode.
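  • As a worked instance of this comparison (the numeric weights are purely illustrative, not values given by the patent):

```python
w0, w1, w2, w3 = 0.3, 0.2, 0.3, 0.2  # hypothetical weights
right_p = w0 + w2 + w3               # pixels on the right, area shrinking, acceleration growing
left_p = w1                          # sliding direction pointed left
assert right_p > left_p              # 0.8 > 0.2, so the right-hand operation mode is identified
```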
  • The weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the slide that the user's finger makes on the screen.
  • When the screen is large, the thumb of the hand holding the device has difficulty reaching the opposite side of the frame (that is, the left hand can hardly reach the right border and the right hand can hardly reach the left border).
  • If a method of unlocking by sliding laterally over a certain distance is designed, as shown in FIG. 10, the user can draw a horizontal slide on the screen to unlock; in that case it is clearly more effective to judge by the area into which the pixels passed by the finger during the slide fall,
  • so the weight value w0 needs to be enlarged.
  • For example, the weight values may be set such that w0 > w2 > w1 > w3.
  • If instead the unlocking pattern shown in FIG. 11 is designed,
  • the weight values need to be adjusted to w2 > w1 > w3 > w0. Since the two ends of the unlocking pattern of FIG. 11 are completely symmetrical, the weight value w0 of the area in which the pixels passed by the finger during the slide fall can even be ignored completely (that is, w0 is set to 0).
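  • These two designs can be captured as weight profiles chosen per unlock pattern. The numeric values below are invented for illustration; only the orderings (w0 > w2 > w1 > w3, and w2 > w1 > w3 > w0 with w0 = 0) come from the text.

```python
# Hypothetical weight profiles keyed by the unlock-pattern design.
WEIGHT_PROFILES = {
    # FIG. 10 style: free horizontal swipe anywhere, so the passed area is the
    # strongest cue and w0 dominates (w0 > w2 > w1 > w3).
    "horizontal_swipe": {"w0": 0.4, "w2": 0.3, "w1": 0.2, "w3": 0.1},
    # FIG. 11 style: symmetric pattern, so the passed area carries no
    # information and w0 is dropped (w2 > w1 > w3 > w0 = 0).
    "symmetric_pattern": {"w0": 0.0, "w2": 0.5, "w1": 0.3, "w3": 0.2},
}
```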
  • The above four parameters do not necessarily need to be used at the same time; depending on the size of the screen and the design of the unlocking slide shape, it is possible to judge by using only some of the parameters.
  • For example, when designing the unlocking interface, the unlocking sliding areas can be placed in the lower left and lower right corners, which the thumb of the hand holding the device cannot possibly reach on the opposite side;
  • in this extreme case, an accurate judgment can be obtained by using only the weight value w0 of the screen area in which the finger falls when sliding as the basis for the judgment.
  • The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, and identifies the user's operation mode according to the acquired change or changes.
  • In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in the contact area and the change in the sliding acceleration are used to recognize the user's operation mode, the accuracy of the recognition can be increased.
  • The recognition accuracy can be further improved by additionally combining the area where the pixel points passed by the user's finger during the sliding process are located and the sliding direction of the user's finger. When the action of the user's finger sliding on the screen is the unlocking action of the screen, the left-hand or right-hand operation mode can be identified quickly and accurately without requiring any additional action from the user, and the identified result is applied throughout the unlocked period after unlocking.
  • FIG. 13 is a flowchart of still another embodiment of a method for identifying a user operation mode on a handheld device according to the present invention, including:
  • Step S501: When it is detected that the user's finger slides on the screen of the handheld device, acquire either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration.
  • Step S502: Identify the operation mode of the user according to the change in the contact area between the user's finger and the screen during the sliding process, the change in the sliding acceleration, or both.
  • Step S503: Automatically switch the display mode of the operation interface of the handheld device according to the operation mode of the user.
  • After the operation mode of the user is identified, the display mode of the operation interface is switched automatically. The display modes of the operation interface include: a left-hand display mode and a right-hand display mode. In the left-hand display mode, the operation interface is usually displayed on the left side of the screen so as to facilitate operation with the user's left hand;
  • in the right-hand display mode, the operation interface is usually displayed on the right side of the screen so as to facilitate operation with the user's right hand. That is, the left-hand display mode matches the left-hand operation mode, and the right-hand display mode matches the right-hand operation mode.
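  • A minimal sketch of step S503, assuming a hypothetical `set_interface_alignment` call on the UI layer (the function name and its argument values are illustrative, not an actual platform API):

```python
def switch_display_mode(operation_mode, ui):
    """Align the operation interface with the hand that was identified."""
    if operation_mode == "left":
        ui.set_interface_alignment("left")   # left-hand display mode
    elif operation_mode == "right":
        ui.set_interface_alignment("right")  # right-hand display mode
    # If the mode is unknown, the current layout is left unchanged.
```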
  • The switched display mode is applied throughout the unlocked period after unlocking.
  • The embodiment of the present invention acquires either or both of the change in the contact area between the user's finger and the screen during the sliding process and the change in the sliding acceleration, identifies the user's operation mode according to the acquired change or changes,
  • and automatically switches the operation interface to the display mode that matches the user's operation mode. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in the contact area and the change in the sliding acceleration are used to recognize the user's operation mode, the accuracy of the recognition can be increased, so that the display mode of the operation interface is switched more accurately.
  • FIG. 14 is a schematic structural diagram of an embodiment of a handheld device according to the present invention.
  • the handheld device 10 includes a detection module 101, an acquisition module 102, and an identification module 103.
  • the handheld device of the present embodiment can perform the steps in FIG. 2.
  • the detecting module 101 is configured to detect whether the user's finger is sliding on the screen of the handheld device.
  • the obtaining module 102 is configured to acquire sliding information of the user's finger during the sliding process when the detecting module 101 detects that the user's finger slides on the screen of the handheld device.
  • the identification module 103 is configured to identify an operation mode of the user according to the sliding information of the user's finger acquired during the sliding process, and the operation mode includes: a left-hand operation mode and a right-hand operation mode.
  • the operation mode includes: a left-hand operation mode and a right-hand operation mode; the left-hand operation mode is a mode of operating the handheld device by the left hand, and the right-hand operation mode is a mode of operating the handheld device by the right hand.
  • The sliding information is naturally generated when the user's finger slides on the screen of the handheld device; as long as this sliding information is captured or collected, the user's operation mode can be recognized from it without any additional sensor.
  • In the embodiment of the present invention, when it is detected that the user's finger slides on the screen of the handheld device, the sliding information of the user's finger during the sliding process is acquired, and the operation mode of the user is identified according to that sliding information. In this way, the cost of the handheld device for recognizing the user's operation mode can be reduced, and the accuracy of the recognition can be improved.
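  • The detection / acquisition / identification split of FIG. 14 can be mirrored directly in code. The class names below simply restate the modules; the wiring is an illustrative sketch that reuses the helper functions from the earlier sketches, not the patent's implementation.

```python
class AcquisitionModule:
    def acquire(self, touch_samples):
        """Package the raw touch samples of one slide into sliding information."""
        return {
            "areas": [s.area for s in touch_samples],
            "accelerations": accelerations(touch_samples),
            "direction": sliding_direction(touch_samples),
            "samples": touch_samples,
        }

class IdentificationModule:
    def __init__(self, screen_width):
        self.screen_width = screen_width

    def identify(self, sliding_info):
        # Weighted vote over the cues, as sketched in identify_mode() above.
        return identify_mode(sliding_info["samples"], self.screen_width)

class HandheldDevice:
    """Detection is modelled as a callback fired when a slide on the screen
    completes; it hands the samples to acquisition, then to identification."""
    def __init__(self, screen_width):
        self.acquisition = AcquisitionModule()
        self.identification = IdentificationModule(screen_width)

    def on_slide_detected(self, touch_samples):
        info = self.acquisition.acquire(touch_samples)
        return self.identification.identify(info)
```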
  • FIG. 15 is a schematic structural diagram of another embodiment of a handheld device according to the present invention.
  • the handheld device 20 includes a detection module 201, an acquisition module 202, and an identification module 203.
  • The handheld device of the present embodiment can perform the steps in FIG. 3, FIG. 4, FIG. 5, FIG. 8, and FIG. 9.
  • the detecting module 201 is configured to detect whether the user's finger is sliding on the screen of the handheld device.
  • the obtaining module 202 is configured to acquire sliding information of the user's finger during the sliding process when the detecting module 201 detects that the user's finger slides on the screen of the handheld device.
  • the identification module 203 is configured to identify an operation mode of the user according to the sliding information of the user's finger acquired during the sliding process, and the operation mode includes: a left-hand operation mode and a right-hand operation mode.
  • the sliding information of the user's finger during the sliding process includes the change of the contact area of the user's finger with the screen during the sliding process, the change of the sliding acceleration, or both.
  • During the sliding, the contact area between the finger and the screen is constantly changing, and the acceleration of the finger's slide is also constantly changing; the change in the contact area between the finger and the screen and/or the change in the sliding acceleration can therefore be acquired.
  • the identification module 203 is specifically configured to identify the operation mode of the user according to the change of the contact area of the user's finger with the screen during the sliding process, the change of the sliding acceleration, or both.
  • The change in the contact area between the finger and the screen and the change in the acceleration of the finger's slide are different for the left and right hands. Therefore, the user's operation mode can be identified according to the change in the contact area between the finger and the screen, the change in the sliding acceleration, or both.
  • the identification module 203 includes a first determining unit 2031 and a first identifying unit 2032.
  • the first determining unit 2031 is configured to determine whether the contact area of the user's finger and the screen is gradually decreasing from left to right.
  • The first identifying unit 2032 is configured to identify the operation mode of the user as the right-hand operation mode when the determination result of the first determining unit 2031 is that the contact area between the user's finger and the screen gradually decreases from left to right.
  • the first identification unit 2032 is further configured to recognize that the operation mode of the user is the left-hand operation mode when the determination result of the first determination unit 2031 is that the contact area of the user's finger and the screen is gradually increased from left to right.
  • the identification module 203 includes a second determining unit 2033 and a second identifying unit 2034.
  • the second judging unit 2033 is for judging whether the sliding acceleration of the user's finger is gradually increasing from left to right.
  • the second recognition unit 2034 is configured to recognize that the operation mode of the user is the right-hand operation mode when the determination result of the second determination unit 2033 is that the sliding acceleration of the user's finger is gradually increasing from left to right.
  • The second identifying unit 2034 is further configured to identify the operation mode of the user as the left-hand operation mode when the determination result of the second determining unit 2033 is that the sliding acceleration of the user's finger gradually decreases from left to right.
  • Alternatively, the identification module 203 includes: a first setting unit 2035, a first determining unit 2036, a first comparing unit 2037, and a third identifying unit 2038.
  • the first setting unit 2035 is configured to set a weight value w2 of a change in the contact area of the finger of the user with the screen during the sliding process, and a weight value w3 of the change of the sliding acceleration of the user's finger during the sliding process.
  • The first determining unit 2036 is configured to determine that the user's right-hand operation mode probability is increased by the weight value w2 when the contact area between the user's finger and the screen gradually decreases from left to right, and that the user's left-hand operation mode probability is increased by the weight value w2 when the contact area between the user's finger and the screen gradually increases from left to right.
  • The first determining unit 2036 is further configured to determine that the user's right-hand operation mode probability is increased by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and that the user's left-hand operation mode probability is increased by the weight value w3 when the sliding acceleration of the user's finger gradually decreases from left to right.
  • the first comparison unit 2037 is for comparing the magnitudes of the user's right hand operation mode probability and the left hand operation mode probability.
  • The third identifying unit 2038 is configured to identify the operation mode of the user as the right-hand operation mode when the comparison result of the first comparing unit is that the user's right-hand operation mode probability is greater than the left-hand operation mode probability, and to identify the operation mode of the user as the left-hand operation mode when the comparison result of the first comparing unit is that the user's right-hand operation mode probability is less than the left-hand operation mode probability.
  • in this embodiment of the present invention, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration are acquired, and the user's operation mode is identified from them. In this way, on the one hand, no additional cost is required and the ways of recognizing whether the user is holding the device with the left or the right hand are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize the operation mode, the recognition accuracy can be increased.
  • further, in a preferred embodiment, while the user slides to unlock (for example, slider unlocking or pattern unlocking), either or both of the change in the contact area between the finger and the screen and the change in the sliding speed may be acquired, and the user's operation mode, that is, whether the user is in the left-hand or the right-hand operation mode, is identified from them.
  • in this way, the operation mode is recognized as soon as the unlock operation is completed, and the user interface can be switched to the form that matches the operation mode (for example, a form convenient for left-hand operation or a form convenient for right-hand operation) before the user performs the next operation, which further improves the user experience. A sketch of extracting these trends from raw touch samples is given below.
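  • the following sketch illustrates one way the two trends could be estimated from raw touch samples (timestamp, x coordinate, contact size) captured during a slide-to-unlock gesture. The sample format, the least-squares slope test, and the four-sample minimum are assumptions made for illustration; the disclosure itself only names the qualitative trends.

```python
# Illustrative extraction of the two trends from raw touch samples taken while
# the user slides to unlock. Each sample is (t_seconds, x_pixels, contact_area).

def _slope(xs, ys):
    """Least-squares slope of ys against xs (no external dependencies)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs) or 1e-9
    return num / den

def trends_from_samples(samples):
    """samples: iterable of (t_seconds, x_pixels, contact_area); needs >= 4 samples."""
    samples = sorted(samples, key=lambda s: s[0])
    if len(samples) < 4:
        raise ValueError("need at least four touch samples")
    ts = [s[0] for s in samples]
    xs = [s[1] for s in samples]
    areas = [s[2] for s in samples]

    # Contact area versus horizontal position: a negative slope means the area
    # gradually decreases from left to right.
    area_decreasing_l2r = _slope(xs, areas) < 0

    # Velocities between consecutive samples, then accelerations between
    # consecutive velocities; a positive slope against x means the sliding
    # acceleration grows from left to right.
    vs = [(xs[i + 1] - xs[i]) / max(ts[i + 1] - ts[i], 1e-9) for i in range(len(xs) - 1)]
    accs = [(vs[i + 1] - vs[i]) / max(ts[i + 2] - ts[i], 1e-9) for i in range(len(vs) - 1)]
    accel_increasing_l2r = _slope(xs[1:-1], accs) > 0

    return area_decreasing_l2r, accel_increasing_l2r
```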
  • if the sliding information of the user's finger during the sliding process includes the change in the contact area of the user's finger with the screen, the change in the sliding acceleration, and the sliding direction of the user's finger, the identification module 203 includes a second setting unit 2039, a second determining unit 20310, a second comparison unit 20311, and a fourth recognition unit 20312.
  • the second setting unit 2039 is configured to set a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process;
  • the second determining unit 20310 is configured to increase the user's right-hand operation mode probability by the weight value w1 when the sliding direction of the user's finger is to the right, and to increase the user's left-hand operation mode probability by the weight value w1 when the sliding direction is to the left;
  • the second determining unit 20310 is further configured to increase the user's right-hand operation mode probability by the weight value w2 when the contact area of the user's finger with the screen gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 when the contact area gradually increases from left to right;
  • the second determining unit 20310 is further configured to increase the user's right-hand operation mode probability by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 when the sliding acceleration gradually decreases from left to right;
  • the second comparison unit 20311 is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability;
  • the fourth recognition unit 20312 is configured to recognize that the user's operation mode is the right-hand operation mode when the comparison result of the second comparison unit 20311 is that the right-hand operation mode probability is greater than the left-hand operation mode probability, and to recognize that the user's operation mode is the left-hand operation mode when the comparison result is that the right-hand operation mode probability is less than the left-hand operation mode probability. A sketch of this three-weight vote follows below.
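  • the three-weight vote including the sliding-direction weight w1 can be sketched as follows; the function and parameter names and the default weights are illustrative assumptions.

```python
# Illustrative extension of the weighted vote with the sliding-direction weight w1.
def identify_mode_with_direction(slide_right, area_decreasing_l2r, accel_increasing_l2r,
                                 w1=1.0, w2=1.0, w3=1.0):
    # Booleans are treated as 0/1 so each observation adds its weight to one side.
    right_prob = w1 * slide_right + w2 * area_decreasing_l2r + w3 * accel_increasing_l2r
    left_prob = (w1 * (not slide_right) + w2 * (not area_decreasing_l2r)
                 + w3 * (not accel_increasing_l2r))
    if right_prob > left_prob:
        return "right"
    if right_prob < left_prob:
        return "left"
    return None  # tie not addressed in the description
```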
  • if the sliding information of the user's finger during the sliding process includes the change in the contact area of the user's finger with the screen, the change in the sliding acceleration, and the region into which the pixel points passed by the user's finger fall, the identification module 203 includes a third setting unit 20313, a third determining unit 20314, a third comparison unit 20315, and a fifth recognition unit 20316.
  • the third setting unit 20313 is configured to set a weight value w0 for the region into which the pixel points passed by the user's finger fall during the sliding process, a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process.
  • the third determining unit 20314 is configured to increase the user's right-hand operation mode probability by the weight value w0 when the pixel points passed by the user's finger fall into the right area of the screen, and to increase the user's left-hand operation mode probability by the weight value w0 when the pixel points fall into the left area of the screen.
  • the third determining unit 20314 is further configured to increase the user's right-hand operation mode probability by the weight value w2 when the contact area of the user's finger with the screen gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 when the contact area gradually increases from left to right.
  • the third determining unit 20314 is further configured to increase the user's right-hand operation mode probability by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 when the sliding acceleration gradually decreases from left to right.
  • the third comparison unit 20315 is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability.
  • the fifth recognition unit 20316 is configured to recognize that the user's operation mode is the right-hand operation mode when the comparison result of the third comparison unit 20315 is that the right-hand operation mode probability is greater than the left-hand operation mode probability, and to recognize that the user's operation mode is the left-hand operation mode when the comparison result is that the right-hand operation mode probability is less than the left-hand operation mode probability. A sketch of the pixel-region test used for the w0 weight is given below.
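  • one possible test for the pixel-region observation weighted by w0 is sketched below. Splitting the screen at the horizontal midline and using a simple majority of the traversed points are assumptions; the disclosure only distinguishes the left and right regions of the screen.

```python
# Illustrative test of which side of the screen the traversed points fall into.
def points_fall_right(points_x, screen_width):
    """points_x: x coordinates of the pixel points passed by the finger."""
    midline = screen_width / 2.0
    right_count = sum(1 for x in points_x if x > midline)
    # Majority of traversed points on the right half -> treat as the right region.
    return right_count > len(points_x) / 2.0
```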
  • when it is detected that the user's finger slides on the screen, four parameters are available for acquisition: the region into which the pixel points passed by the user's finger fall during the sliding process, the sliding direction of the user's finger, the change in the contact area of the user's finger with the screen during the sliding process, and the change in the sliding acceleration of the user's finger during the sliding process. Depending on the actual application, some of these parameters may be acquired and combined to identify the probabilities of left-hand and right-hand holding; the more parameters are acquired, the higher the recognition accuracy. The foregoing embodiments select only several of the combinations, and the other combinations are not described here. A sketch of combining whichever parameters are available is given below.
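  • a sketch of combining whichever of the four observations happen to be acquired is given below; the data structure and the treatment of missing parameters are illustrative assumptions.

```python
# Illustrative combination of whichever of the four observations are available.
# Each entry pairs a weight with an optional boolean: True when the observation
# points to the right hand, False when it points to the left hand, and None when
# that parameter was not acquired for this slide.
def identify_mode(observations):
    right_prob = 0.0
    left_prob = 0.0
    for weight, points_right in observations:
        if points_right is None:
            continue  # parameter not acquired
        if points_right:
            right_prob += weight
        else:
            left_prob += weight
    if right_prob > left_prob:
        return "right"
    if right_prob < left_prob:
        return "left"
    return None

# Example: only the pixel-region (w0) and contact-area (w2) observations are used.
mode = identify_mode([(2.0, True),   # w0: traversed points fall in the right region
                      (1.0, None),   # w1: sliding direction not evaluated
                      (1.5, True),   # w2: contact area shrinks from left to right
                      (0.5, None)])  # w3: acceleration trend not evaluated
print(mode)  # right
```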
  • in the foregoing embodiments, the action of the user's finger sliding on the screen is the unlocking action of the screen. In this way, the left-hand or right-hand operation mode can be identified quickly and fairly accurately without requiring any additional action from the user, and the recognition result is applied throughout the unlock period after unlocking.
  • in addition, in practical application, if all four parameters are acquired, the identification module includes a fourth setting unit, a fourth determining unit, a fourth comparison unit, and a sixth recognition unit.
  • the fourth setting unit is configured to set a weight value w0 for the region into which the pixel points passed by the user's finger fall during the sliding process, a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process.
  • the fourth determining unit is configured to increase the user's right-hand operation mode probability by the weight value w0 when the pixel points passed by the user's finger fall into the right area of the screen, and to increase the user's left-hand operation mode probability by the weight value w0 when the pixel points fall into the left area of the screen.
  • the fourth determining unit is further configured to increase the user's right-hand operation mode probability by the weight value w1 when the sliding direction of the user's finger is to the right, and to increase the user's left-hand operation mode probability by the weight value w1 when the sliding direction is to the left.
  • the fourth determining unit is further configured to increase the user's right-hand operation mode probability by the weight value w2 when the contact area of the user's finger with the screen gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 when the contact area gradually increases from left to right.
  • the fourth determining unit is further configured to increase the user's right-hand operation mode probability by the weight value w3 when the sliding acceleration of the user's finger gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 when the sliding acceleration gradually decreases from left to right.
  • the fourth comparison unit is configured to compare the user's right-hand operation mode probability with the left-hand operation mode probability.
  • if the user's right-hand operation mode probability is greater than the left-hand operation mode probability, the right-hand operation mode is recognized; if the right-hand operation mode probability is less than the left-hand operation mode probability, the left-hand operation mode is recognized.
  • the sixth recognition unit is configured to recognize that the user's operation mode is the right-hand operation mode when the comparison result of the fourth comparison unit is that the right-hand operation mode probability is greater than the left-hand operation mode probability, and to recognize that the user's operation mode is the left-hand operation mode when the comparison result is that the right-hand operation mode probability is less than the left-hand operation mode probability.
  • the weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the slide that the user's finger makes on the screen; illustrative presets for two of the unlock designs mentioned in this disclosure are sketched below.
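  • the orderings in the two presets below follow the examples given in this disclosure (a large screen with a long horizontal circular unlock slide, where w0 dominates, and a medium screen with a left/right symmetric unlock pattern, where w0 is effectively zero); the numeric values themselves are illustrative assumptions.

```python
# Illustrative weight presets; the orderings follow the description, the numbers do not.
WEIGHT_PRESETS = {
    # Large screen, long horizontal circular unlock slide: the region the finger
    # covers is the strongest cue, so w0 dominates (w0 > w2 > w1 > w3).
    "large_screen_horizontal_circle": {"w0": 4.0, "w2": 3.0, "w1": 2.0, "w3": 1.0},
    # Medium screen with a left/right symmetric unlock pattern: the traversed
    # region is uninformative, so w0 is dropped (w2 > w1 > w3 > w0 = 0).
    "medium_screen_symmetric_pattern": {"w2": 3.0, "w1": 2.0, "w3": 1.0, "w0": 0.0},
}
```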
  • in this embodiment of the present invention, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration are acquired, and the user's operation mode is identified from them. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize the operation mode, the recognition accuracy can be increased.
  • the recognition accuracy can be further improved by also using the region into which the pixel points passed by the user's finger fall during the sliding process and the sliding direction of the user's finger. Because the sliding action of the user's finger on the screen is the unlocking action of the screen, the left-hand or right-hand operation mode can be identified quickly and fairly accurately without requiring any additional action from the user, and the recognition result is applied throughout the unlock period after unlocking.
  • FIG. 20 is a schematic structural diagram of still another embodiment of a handheld device according to the present invention.
  • the handheld device 30 includes a detection module 301, an acquisition module 302, an identification module 303, and a switching module 304.
  • the handheld device in this embodiment may perform the steps in FIG. 13.
  • the detecting module 301 is configured to detect whether the user's finger slides on the screen of the handheld device.
  • the obtaining module 302 is configured to acquire, when the detecting module 301 detects that the user's finger slides on the screen of the handheld device, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration.
  • the identification module 303 is configured to identify the user's operation mode according to either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration.
  • the switching module 304 is configured to automatically switch the display mode of the operation interface according to the operation mode of the user identified by the identification module 303.
  • in this embodiment of the present invention, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration are acquired; the user's operation mode is identified from them; and the display mode of the operation interface is switched automatically according to the identified operation mode. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize the operation mode, the recognition accuracy can be increased, so that the display mode of the operation interface is more accurate. A sketch of the switching step is given below.
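  • a sketch of the switching step performed by the switching module 304 is given below; the layout callbacks are placeholders for illustration, not APIs from the disclosure.

```python
# Illustrative switch of the operation interface once the mode is recognized.
# apply_left_hand_layout / apply_right_hand_layout are placeholder callables.
def switch_display_mode(operation_mode, apply_left_hand_layout, apply_right_hand_layout):
    if operation_mode == "left":
        apply_left_hand_layout()   # controls shifted toward the left edge
    elif operation_mode == "right":
        apply_right_hand_layout()  # controls shifted toward the right edge
    # The chosen layout is kept for the rest of the unlock period.
```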
  • FIG. 21 is a schematic diagram showing the physical structure of still another embodiment of the handheld device of the present invention.
  • the handheld device includes a processor 31, a memory 32 coupled to the processor 31, a detector 33, and a collector 34.
  • the detector 33 is for detecting whether the user's finger is sliding on the screen of the handheld device.
  • the collector 34 is configured to acquire, when the detector 33 detects that the user's finger slides on the screen of the handheld device, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration, and to store either or both of them in the memory 32.
  • the processor 31 is configured to extract, from the memory 32, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration, and to identify the user's operation mode according to either or both of them.
  • the processor 31 is configured to determine whether the contact area of the user's finger with the screen gradually decreases from left to right, or whether the sliding acceleration of the user's finger gradually increases from left to right; to recognize that the user's operation mode is the right-hand operation mode when the determination result is that the contact area gradually decreases from left to right or the sliding acceleration gradually increases from left to right; and to recognize that the user's operation mode is the left-hand operation mode when the determination result is that the contact area gradually increases from left to right or the sliding acceleration gradually decreases from left to right.
  • the processor 31 is configured to set a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; to increase the user's right-hand operation mode probability by the weight value w2 if the contact area gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 if the contact area gradually increases from left to right; to increase the user's right-hand operation mode probability by the weight value w3 if the sliding acceleration gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 if the sliding acceleration gradually decreases from left to right; to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and to recognize that the user's operation mode is the right-hand operation mode if the right-hand operation mode probability is greater than the left-hand operation mode probability, or the left-hand operation mode if the right-hand operation mode probability is less than the left-hand operation mode probability.
  • when the sliding information of the user's finger during the sliding process includes the change in the contact area of the user's finger with the screen, the change in the sliding acceleration, and the sliding direction of the user's finger:
  • the collector 34 is configured to acquire a change in the contact area of the finger of the user with the screen during the sliding process, a change in the sliding acceleration, and a sliding direction of the finger of the user.
  • the processor 31 is configured to set a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; to increase the user's right-hand operation mode probability by the weight value w1 if the sliding direction of the user's finger is to the right, and to increase the user's left-hand operation mode probability by the weight value w1 if the sliding direction is to the left; to increase the user's right-hand operation mode probability by the weight value w2 if the contact area gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 if the contact area gradually increases from left to right; to increase the user's right-hand operation mode probability by the weight value w3 if the sliding acceleration gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 if the sliding acceleration gradually decreases from left to right; to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and to recognize that the user's operation mode is the right-hand operation mode if the right-hand operation mode probability is greater than the left-hand operation mode probability, or the left-hand operation mode if the right-hand operation mode probability is less than the left-hand operation mode probability.
  • when the sliding information of the user's finger during the sliding process includes the change in the contact area of the user's finger with the screen, the change in the sliding acceleration, and the region into which the pixel points passed by the user's finger fall during the sliding process:
  • the collector 34 is configured to acquire the change in the contact area of the user's finger with the screen during the sliding process, the change in the sliding acceleration, and the region into which the pixel points passed by the user's finger fall during the sliding process.
  • the processor 31 is configured to set a weight value w0 for the region into which the pixel points passed by the user's finger fall during the sliding process, a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; to increase the user's right-hand operation mode probability by the weight value w0 if the pixel points passed by the user's finger fall into the right area of the screen, and to increase the user's left-hand operation mode probability by the weight value w0 if they fall into the left area of the screen; to increase the user's right-hand operation mode probability by the weight value w2 if the contact area gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 if the contact area gradually increases from left to right; to increase the user's right-hand operation mode probability by the weight value w3 if the sliding acceleration gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 if the sliding acceleration gradually decreases from left to right; to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and to recognize that the user's operation mode is the right-hand operation mode if the right-hand operation mode probability is greater than the left-hand operation mode probability, or the left-hand operation mode if the right-hand operation mode probability is less than the left-hand operation mode probability.
  • the processor 31 is configured to set a weight value w0 for the region into which the pixel points passed by the user's finger fall during the sliding process, a weight value w1 for the sliding direction of the user's finger, a weight value w2 for the change in the contact area of the user's finger with the screen during the sliding process, and a weight value w3 for the change in the sliding acceleration of the user's finger during the sliding process; to increase the user's right-hand operation mode probability by the weight value w0 when the pixel points passed by the user's finger, as acquired by the collector 34, fall into the right area of the screen, and to increase the user's left-hand operation mode probability by the weight value w0 when they fall into the left area of the screen; to increase the user's right-hand operation mode probability by the weight value w1 when the sliding direction acquired by the collector 34 is to the right, and to increase the user's left-hand operation mode probability by the weight value w1 when the sliding direction is to the left; to increase the user's right-hand operation mode probability by the weight value w2 when the contact area gradually decreases from left to right, and to increase the user's left-hand operation mode probability by the weight value w2 when the contact area gradually increases from left to right; to increase the user's right-hand operation mode probability by the weight value w3 when the sliding acceleration gradually increases from left to right, and to increase the user's left-hand operation mode probability by the weight value w3 when the sliding acceleration gradually decreases; to compare the user's right-hand operation mode probability with the left-hand operation mode probability; and to recognize that the user's operation mode is the right-hand operation mode when the right-hand operation mode probability is greater than the left-hand operation mode probability, or the left-hand operation mode when the right-hand operation mode probability is less than the left-hand operation mode probability.
  • the weight values w0, w1, w2, and w3 are set according to the size of the screen and the length and shape of the slide that the user's finger makes on the screen.
  • the action of the user's finger sliding on the screen is an unlocking action of the screen.
  • in this embodiment of the present invention, either or both of the change in the contact area of the user's finger with the screen during the sliding process and the change in the sliding acceleration are acquired, and the user's operation mode is identified from them. In this way, on the one hand, no additional cost is required and the ways of recognizing the user's operation mode are enriched; on the other hand, when both the change in contact area and the change in sliding acceleration are used to recognize whether the user holds the device with the left or the right hand, the recognition accuracy can be increased.
  • the recognition accuracy can be further improved by also using the region into which the pixel points passed by the user's finger fall during the sliding process and the sliding direction of the user's finger. Because the sliding action of the user's finger on the screen is the unlocking action of the screen, the left-hand or right-hand operation mode can be identified quickly and fairly accurately without requiring any additional action from the user, and the recognition result is applied throughout the unlock period after unlocking.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the device implementations described above are merely illustrative.
  • the division of the modules or units is only a logical function division.
  • in actual implementation there may be another division manner; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium.
  • a number of instructions are included to cause a computer device (which may be a personal computer, server, or network device, etc.) or a processor to perform all or part of the steps of the methods of the various embodiments of the present invention.
  • the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)

Abstract

The present invention discloses a method for identifying a user operation mode on a handheld device, and a handheld device. The method includes: when it is detected that a user's finger slides on the screen of the handheld device, acquiring sliding information of the user's finger during the sliding process; and identifying the user's operation mode according to the sliding information of the user's finger during the sliding process, where the operation mode includes a left-hand operation mode and a right-hand operation mode. In this manner, the present invention can identify the user's operation mode without additional cost and can increase recognition accuracy.

Description

手持设备上用户操作模式的识别方法及手持设备
【技术领域】
本发明涉及人机交互技术领域,特别是涉及一种手持设备上用户操作模式的识别方法及手持设备。
【背景技术】
用户单手操作手持设备的情况在现实生活中比较常见,例如,在用户乘坐公交车、用餐等情形时。目前市场上的手持设备例如手机,屏幕越来越大,如果用户界面(UI,User Interface)设计不合理,那么用户单手操作手机较为困难。针对这种情况,很多应用软件需要通过人工的方式,设置用户操作模式,比如左手操作模式或右手操作模式,然后根据设置的操作模式确定UI的呈现方式。
现有技术中,获取用户操作模式的方式有两种,一种是通过手工设置,应用软件通常提供左右手两种单手操作模式,用户在使用应用软件前手工设置。但是,在某些情况下,例如,用户站立在公交车上,经常需要左右手切换持机,在这种情况下,手工设置的方式显然很不方便。另一种是自动获取用户的操作模式,自动获取用户的操作模式有两种方式,第一种是通过手机内的传感器来识别操作模式。第二种是通过计算用户在屏幕上滑动的斜率的方式来识别左右手操作模式。
但是,本申请的发明人在长期的研发中发现:专门使用传感器来识别操作模式,额外增加成本且判断的精度依赖传感器的敏感度;计算用户在屏幕上滑动的斜率的方式,判断的精度不高,用户个体差异的影响也大。
【发明内容】
本发明主要解决的技术问题是提供一种手持设备上用户操作模式的识别方法及手持设备,能够不需要额外的成本,丰富了识别用户操作模式的方式,且增加识别的精度。
第一方面,本发明提供一种手持设备上用户操作模式的识别方法,包括:当检测到用户的手指在手持设备的屏幕上滑动时,获取所述用户的手指在滑动过程中的滑动信息;根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,所述操作模式包括:左手操作模式和右手操作模式。
在第一方面的第一种可能的实现方式中,所述用户的手指在滑动过程中的滑动信息包括所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
结合第一方面的第一种可能的实现方式,在第一方面的第二种可能的实现方式中,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则识别所述用户的操作模式是右手操作模式;若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则识别所述用户的操作模式是左手操作模式。
结合第一方面的第一种可能的实现方式,在第一方面的第三种可能的实现方式中,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:若所述用户的手指的滑动加速度从左至右是逐渐变大,则识别所述用户的操作模式是右手操作模式;若所述用户的手指的滑动加速度从左至右是逐渐变小,则识别所述用户的操作模式是左手操作模式。
结合第一方面的第一种可能的实现方式,在第一方面的第四种可能的实现方式中,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:设置所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则确定所述用户右手操作模式概率增加权重值w2,若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则确定所述用户左手操作模式概率增加权重值w2;若所述用户的手指的滑动加速度从左至右逐渐变大,则确定所述用户右手操作模式概率增加权重值w3,若所述用户的手指的滑动加速度从左至右逐渐变小,则确定所述用户左手操作模式概率增加权重值w3;比较所述用户右手操作模式概率和左手操作模式概率的大小;若所述用户右手操作模式概率大于左手操作模式概率,则识别所述用户的操作模式是右手操作模式,若所述用户右手操作模式概率小于左手操作模式概率,则识别所述用户的操作模式是左手操作模式。
结合第一方面的第一种可能的实现方式,在第一方面的第五种可能的实现方式中,所述用户的手指在滑动过程中的滑动信息还包括所述用户的手指的滑动方向;所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:设置所述用户的手指的滑动方向的权重值为w1,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;若所述用户的手指的滑动方向是向右,则确定所述用户右手操作模式概率增加权重值w1,若所述用户的手指的滑动方向是向左,则确定所述用户左手操作模式概率增加权重值w1;若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则确定所述用户右手操作模式概率增加权重值w2,若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则确定所述用户左手操作模式概率增加权重值w2;若所述用户的手指的滑动加速度从左至右逐渐变大,则确定所述用户右手操作模式概率增加权重值w3,若所述用户的手指的滑动加速度从左至右逐渐变小,则确定所述用户左手操作模式概率增加权重值w3;比较所述用户右手操作模式概率和左手操作模式概率的大小;若所述用户右手操作模式概率大于左手操作模式概率,则识别所述用户的操作模式是右手操作模式,若所述用户右手操作模式概率小于左手操作模式概率,则识别所述用户的操作模式是左手操作模式。
结合第一方面的第一种可能的实现方式,在第一方面的第六种可能的实现方式中,所述用户的手指在滑动过程中的滑动信息还包括所述用户的手指在滑动过程中经过的像素点落入的区域;所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:设置所述用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;若所述用户的手指经过的像素点落入所述屏幕的右方区域,则确定所述用户右手操作模式概率增加权重值w0,若所述用户的手指经过的像素点落入所述屏幕的左方区域,则确定所述用户左手操作模式概率增加权重值w0;若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则确定所述用户右手操作模式概率增加权重值w2,若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则确定所述用户左手操作模式概率增加权重值w2;若所述用户的手指的滑动加速度从左至右逐渐变大,则确定所述用户右手操作模式概率增加权重值w3,若所述用户的手指的滑动加速度从左至右逐渐变小,则确定所述用户左手操作模式概率增加权重值w3;比较所述用户右手操作模式概率和左手操作模式概率的大小;若所述用户右手操作模式概率大于左手操作模式概率,则识别所述用户的操作模式是右手操作模式,若所述用户右手操作模式概率小于左手操作模式概率,则识别所述用户的操作模式是左手操作模式。
结合第一方面的第四、五、六种中任意一种可能的实现方式,在第一方面的第七种可能的实现方式中,所述权重值w0、w1、w2以及w3是根据所述屏幕的大小和所述用户的手指在所述屏幕上滑动的长度和形状进行设置的。
结合第一方面、第一方面的第一至第六种中任意一种可能的实现方式,在第一方面的第八种可能的实现方式中,所述用户的手指在屏幕上滑动的动作是所述屏幕的解锁动作。
结合第一方面、第一方面的第一至第六种中任意一种可能的实现方式,在第一方面的第九种可能的实现方式中,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式之后,还包括:根据所述用户的操作模式,自动切换所述手持设备操作界面的显示模式。
第二方面,本发明提供一种手持设备,所述手持设备包括:检测模块、获取模块以及识别模块;所述检测模块用于检测用户的手指是否在所述手持设备的屏幕上滑动;所述获取模块用于当所述检测模块检测到用户的手指在所述手持设备的屏幕上滑动时,获取所述用户的手指在滑动过程中的滑动信息;所述识别模块用于根据所述获取模块获取的用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,所述操作模式包括:左手操作模式和右手操作模式。
在第二方面的第一种可能的实现方式中,所述用户的手指在滑动过程中的滑动信息包括所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
结合第二方面的第一种可能的实现方式,在第二方面的第二种可能的实现方式中,所述识别模块包括:第一判断单元和第一识别单元;所述第一判断单元用于判断所述用户的手指与所述屏幕的接触面积从左至右是否是逐渐变小;所述第一识别单元用于在所述第一判断单元的判断结果为所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小时,识别所述用户的操作模式是右手操作模式;所述第一识别单元还用于在所述第一判断单元的判断结果为所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大时,识别所述用户的操作模式是左手操作模式。
结合第二方面的第一种可能的实现方式,在第二方面的第三种可能的实现方式中,所述识别模块包括:第二判断单元和第二识别单元;所述第二判断单元用于判断所述用户的手指的滑动加速度从左至右是否是逐渐变大;所述第二识别单元用于在所述第二判断单元的判断结果为所述用户的手指的滑动加速度从左至右是逐渐变大时,识别所述用户的操作模式是右手操作模式;所述第二识别单元还用于在所述第二判断单元的判断结果为所述用户的手指的滑动加速度从左至右是逐渐变小时,识别所述用户的操作模式是左手操作模式。
结合第二方面的第一种可能的实现方式,在第二方面的第四种可能的实现方式中,所述识别模块包括:第一设置单元、第一确定单元、第一比较单元、第三识别单元;所述第一设置单元用于设置所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;所述第一确定单元用于在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小时,确定所述用户右手操作模式概率增加权重值w2,在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大时,确定所述用户左手操作模式概率增加权重值w2;所述第一确定单元还用于在所述用户的手指的滑动加速度从左至右逐渐变大时,确定所述用户右手操作模式概率增加权重值w3,在所述用户的手指的滑动加速度从左至右逐渐变小时,确定所述用户左手操作模式概率增加权重值w3;所述第一比较单元用于比较所述用户右手操作模式概率和左手操作模式概率的大小;所述第三识别单元用于在所述第一比较单元的比较结果为所述用户右手操作模式概率大于左手操作模式概率时,识别所述用户的操作模式是右手操作模式,在所述第一比较单元的比较结果为所述用户右手操作模式概率小于左手操作模式概率时,识别所述用户的操作模式是左手操作模式。
结合第二方面的第一种可能的实现方式,在第二方面的第五种可能的实现方式中,所述用户的手指在滑动过程中的滑动信息还包括所述用户的手指的滑动方向;所述识别模块包括:第二设置单元、第二确定单元、第二比较单元、第四识别单元;所述第二设置单元用于设置所述用户的手指的滑动方向的权重值为w1,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;所述第二确定单元用于在所述用户的手指的滑动方向是向右时,确定所述用户右手操作模式概率增加权重值w1,在所述用户的手指的滑动方向是向左时,确定所述用户左手操作模式概率增加权重值w1;所述第二确定单元还用于在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小时,确定所述用户右手操作模式概率增加权重值w2,在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大时,确定所述用户左手操作模式概率增加权重值w2;所述第二确定单元还用于在所述用户的手指的滑动加速度从左至右逐渐变大时,确定所述用户右手操作模式概率增加权重值w3,在所述用户的手指的滑动加速度从左至右逐渐变小时,确定所述用户左手操作模式概率增加权重值w3;所述第二比较单元用于比较所述用户右手操作模式概率和左手操作模式概率的大小;所述第四识别单元用于在所述第二比较单元的比较结果为所述用户右手操作模式概率大于左手操作模式概率时,识别所述用户的操作模式是右手操作模式,在所述第二比较单元的比较结果为所述用户右手操作模式概率小于左手操作模式概率时,识别所述用户的操作模式是左手操作模式。
结合第二方面的第一种可能的实现方式,在第二方面的第六种可能的实现方式中,所述用户的手指在滑动过程中的滑动信息还包括所述用户的手指在滑动过程中经过的像素点落入的区域;所述识别模块包括:第三设置单元、第三确定单元、第三比较单元、第五识别单元;所述第三设置单元用于设置所述用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;所述第三确定单元用于在所述用户的手指经过的像素点落入所述屏幕的右方区域时,确定所述用户右手操作模式概率增加权重值w0,在所述用户的手指经过的像素点落入所述屏幕的左方区域时,确定所述用户左手操作模式概率增加权重值w0;所述第三确定单元还用于在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小时,确定所述用户右手操作模式概率增加权重值w2,在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大时,确定所述用户左手操作模式概率增加权重值w2;所述第三确定单元还用于在所述用户的手指的滑动加速度从左至右逐渐变大时,确定所述用户右手操作模式概率增加权重值w3,在所述用户的手指的滑动加速度从左至右逐渐变小时,确定所述用户左手操作模式概率增加权重值w3;所述第三比较单元用于比较所述用户右手操作模式概率和左手操作模式概率的大小;所述第五识别单元用于在所述第三比较单元的比较结果为所述用户右手操作模式概率大于左手操作模式概率时,识别所述用户的操作模式是右手操作模式,在所述第三比较单元的比较结果为所述用户右手操作模式概率小于左手操作模式概率时,识别所述用户的操作模式是左手操作模式。
结合第二方面的第四、五、六种中任意一种可能的实现方式,在第二方面的第七种可能的实现方式中,所述权重值w0、w1、w2以及w3是根据所述屏幕的大小和所述用户的手指在所述屏幕上滑动的长度和形状进行设置的。
结合第二方面、第二方面的第一至第六种中任意一种可能的实现方式,在第二方面的第七种可能的实现方式中,所述用户的手指在屏幕上滑动的动作是所述屏幕的解锁动作。
结合第二方面、第二方面的第一至第六种中任意一种可能的实现方式,所述手持设备还包括切换模块,所述切换模块用于根据所述用户的操作模式,自动切换所述手持设备操作界面的显示模式。
本发明的有益效果是:区别于现有技术的情况,本发明当检测到用户的手指在手持设备的屏幕上滑动时,获取所述用户的手指在滑动过程中的滑动信息;根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,所述操作模式包括:左手操作模式和右手操作模式。通过这种方式,能够不需要额外的成本,丰富了识别用户操作模式的方式,且增加识别的精度。
【附图说明】
图1是本发明一种手持设备的结构示意图;
图2是本发明手持设备上用户操作模式的识别方法一实施方式的流程图;
图3是本发明手持设备上用户操作模式的识别方法另一实施方式的流程图;
图4是本发明手持设备上用户操作模式的识别方法又一实施方式的流程图;
图5是本发明手持设备上用户操作模式的识别方法又一实施方式的流程图;
图6是手持设备上用户操作模式的识别方法中用户右手指与屏幕接触面积变化原理图;
图7是本发明手持设备上用户操作模式的识别方法中用户左手指与屏幕接触面积变化原理图;
图8是本发明手持设备上用户操作模式的识别方法又一实施方式的流程图;
图9是本发明手持设备上用户操作模式的识别方法又一实施方式的流程图;
图10是本发明手持设备上用户操作模式的识别方法中一滑动解锁示意图;
图11是本发明手持设备上用户操作模式的识别方法中另一滑动解锁示意图;
图12是本发明手持设备上用户操作模式的识别方法中又一滑动解锁示意图;
图13是本发明手持设备上用户操作模式的识别方法又一实施方式的流程图;
图14是本发明手持设备一实施方式的结构示意图;
图15是本发明手持设备另一实施方式的结构示意图;
图16是本发明手持设备又一实施方式的结构示意图;
图17是本发明手持设备又一实施方式的结构示意图;
图18是本发明手持设备又一实施方式的结构示意图;
图19是本发明手持设备又一实施方式的结构示意图;
图20是本发明手持设备又一实施方式的结构示意图;
图21是本发明手持设备一实施方式的实体结构示意图。
【具体实施方式】
下面先简单介绍下本发明的应用场景和硬件环境。
用户单手操作手持设备的情况在现实生活中比较常见,例如,在用户乘坐公交车、用餐等情形时。目前市场上的手持设备例如手机,屏幕越来越大,如果用户界面(UI,User Interface)设计不合理,那么用户单手操作手机较为困难。针对这种情况,很多应用软件需要根据用户的在单手操作手持设备时的操作模式确定UI的呈现方式。
参阅图1,图1是本发明一种手持设备的结构示意图。以图1为例介绍本发明实施方式提供的手持设备上用户操作模式的识别方法应用的手持设备的逻辑结构。该手持设备具体可以为一智能手机。如图1所示,该手持设备的硬件层包括CPU、GPU等,当然还可以包括存储器、输入/输出设备、内存、内存控制器、网络接口等,输入设备可包括触摸屏等,输出设备可包括显示设备如LCD、CRT、全息成像(Holographic)、投影(Projector)等。在硬件层之上可运行有操作系统(如Android等)以及一些应用程序。核心库是操作系统的核心部分,包括输入/输出服务、核心服务、图形设备接口以及实现CPU、GPU图形处理的图形引擎(Graphics Engine)等。除此之外,该手持设备还包括驱动层、框架层和应用层。驱动层可包括CPU驱动、GPU驱动、显示控制器驱动等。框架层可包括系统服务(System service)、网页服务(Web Service)和用户服务(Customer Service)等。应用层可包括桌面(launcher)、媒体播放器(Media Player)、浏览器(Browser)等。
下面结合附图和实施方式对本发明进行详细说明。
参阅图2,图2是本发明手持设备上用户操作模式的识别方法一实施方式的流程图,包括:
步骤S101:当检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息。
步骤S102:根据用户的手指在滑动过程中的滑动信息,识别用户的操作模式,操作模式包括:左手操作模式和右手操作模式。
用户的手指在手持设备的屏幕上滑动时,会产生许多滑动信息,获取这些滑动信息,有助于识别用户的操作模式,因此,根据用户的手指在滑动过程中的滑动信息,即可识别用户的操作模式。其中,操作模式包括:左手操作模式和右手操作模式;左手操作模式即为左手操作手持设备的模式,右手操作模式即为右手操作手持设备的模式,即左手操作模式表示用户用左手操作手持设备,右手操作模式表示用户用右手操作手持设备。
由于滑动信息是在用户的手指在手持的屏幕上滑动时,自然而然的产生的,只要捕获或采集这些滑动信息,不需要额外通过其它的传感器,即可通过这些滑动信息识别用户的操作模式,降低了手持设备的成本;另外,用户的手指在滑动过程中产生的滑动信息通常包活很多相关的滑动信息,综合考虑这些滑动信息,有助于提高识别的精度。
本发明实施方式当检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息;当检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息。通过这种方式,能够在识别用户的操作模式时降低手持设备的成本,提高识别的精度。
参阅图2,图2是本发明手持设备上用户操作模式的识别方法另一实施方式的流程图,包括:
步骤S201:当检测到用户的手指在屏幕上滑动时,获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
用户的手指在屏幕上滑动时,手指与屏幕的接触面积是不断发生变化的,手指滑动的加速度也是在不断发生变化的,获取手指与屏幕的接触面积的变化情况、或获取手指滑动的加速度的变化情况、或获取手指与屏幕的接触面积的变化情况和手指滑动的加速度的变化情况。
步骤S202:根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。
左右手指在屏幕上滑动时,手指与屏幕的接触面积的变化情况和手指滑动的加速度的变化情况,对于左右手来说变化的规律是不一样的,因此,根据手指与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,可以识别用户的操作模式。具体来说,参阅图3至图5,步骤S202可以包括:步骤S202a和步骤S202b;或者步骤S202可以包括步骤S202c和步骤S202d;或者步骤S202可以包括步骤S202e、步骤S202f、步骤S202g、步骤S202h以及步骤S202i。
参阅图3,如果用户的手指在滑动过程中的滑动信息是用户的手指在滑动过程中与屏幕的接触面积的变化情况,那么步骤S202包括:步骤S202a和步骤S202b。
步骤S202a:若用户的手指与屏幕的接触面积从左至右是逐渐变小,则识别用户的操作模式是右手操作模式。
步骤S202b:若用户的手指与屏幕的接触面积从左至右是逐渐变大,则识别用户的操作模式是左手操作模式。
参阅图4,如果用户的手指在滑动过程中的滑动信息是用户的手指在滑动过程中的滑动加速度的变化情况,那么步骤S202包括:步骤S202c和步骤S202d。
步骤S202c:若用户的手指的滑动加速度从左至右是逐渐变大,则识别用户的操作模式是右手操作模式。
步骤S202d:若用户的手指的滑动加速度从左至右是逐渐变小,则识别用户的操作模式是左手操作模式。
参阅图5,如果用户的手指在滑动过程中的滑动信息包括用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者,那么步骤S202包括:步骤S202e、步骤S202f、步骤S202g、步骤S202h以及步骤S202i。
步骤S202e:设置用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
其中,步骤S201和步骤S202e没有明显的先后顺序。
步骤S202f:若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2。
步骤S202g:若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3。
其中,步骤S202f和步骤S202g没有先后顺序。
步骤S202h:比较用户右手操作模式概率和左手操作模式概率的大小。
步骤S202i:若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
当右边单手持机(即右手操作模式),大拇指在屏幕上滑动时,大拇指由远端的伸直状态变为弯曲状态的过程中,手指与屏幕的接触面积从左至右是逐渐变小,即越是趋向伸直状态,手指与屏幕的接触面积越大,越是趋向弯曲状态,手指与屏幕的接触面积越小,原理如图,6所示,右手持机位置在11,右手拇指伸直状态下与屏幕的接触面积为12,右手拇指弯曲状态下与屏幕的接触面积为13;在常规情况下,用户在滑动屏幕时对于手指的用力F(即手指对屏幕的按压力度和手指的滑动力度)几乎一致的,但由于手指与屏幕的接触面积从左至右是逐渐变小,使得手指与屏幕之间的摩擦力F’也随着手指与屏幕的面积的变化而从左至右是逐渐变小,根据加速度a的物理公式(F- F’)=ma,可获知加速度a从左至右是逐渐变大的,其中,m是手指的质量。
同样的道理,当左边单手持机(即左手操作模式),大拇指在屏幕上滑动时,大拇指由远端的伸直状态变为弯曲状态的过程中,手指与屏幕的接触面积从右至左是逐渐变小(或从左至右是逐渐变大),原理如图7所示,左手持机位置在21,左手拇指伸直状态下与屏幕的接触面积为22,左手拇指弯曲状态下与屏幕的接触面积为23;根据加速度a的物理公式(F- F’)=ma,可获知加速度a从左至右是逐渐变小的,其中,m是手指的质量。
因此,如果用户的手指与屏幕的接触面积从左至右是逐渐变小或用户的手指的滑动加速度从左至右是逐渐变大,则识别用户的操作模式是右手操作模式。如果用户的手指与屏幕的接触面积从左至右是逐渐变大或用户的手指的滑动加速度从左至右是逐渐变小,则识别用户的操作模式是左手操作模式。如果用户的手指在滑动过程中的滑动信息包括用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者,则预先分别设置两者的权重值;当用户的手指与屏幕的接触面积从左至右是逐渐变小时,确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2;若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3;然后比较用户右手操作模式概率和左手操作模式概率的大小,根据比较结果,识别用户的操作模式。
当然,在实际应用中,步骤S202根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式的方式还可以是其它的方式,例如:如果手指滑动的过程中,手指与屏幕的接触面积的变化情况、滑动加速度的变化情况与上述的规律不一样,在获得不一样的规律后,即可以识别用户的操作模式。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户左右手持机的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户的操作模式时,能够增加识别的精度。
进一步地,在一个更优的实施例中,可以在用户滑动解锁(比如滑块解锁或者图案解锁)的过程中,获取用户的手指在滑动解锁时手指与屏幕的接触面积的变化情况、滑动速度的变化情况两者或两者之一,并进一步用户的手指在滑动解锁时手指与屏幕的接触面积的变化情况、滑动速度的变化情况两者或两者之一,识别出用户的操作模式,即用户是左手操作模式还是右手操作模式。这样在用户执行完解锁操作之后,即可识别出用户的操作模式,进而可以在用户进行下一步操作前,第一时间将用户界面切换到与用户的操作模式相匹配的形式(例如,方便左手操作的形式、或方便右手操作的形式),进一步提升了用户体验。
参阅图8至图9,图8至图9是本发明手持设备上用户操作模式的识别方法两个实施方式的流程图,具体内容如下:
步骤S301:检测到用户的手指在屏幕上滑动。
步骤S302:获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指的滑动方向。
其中,获取用户的手指的滑动方向具体包括以下内容:
A.获取用户的手指在滑动轨迹中的起点和终点的位置。
B.若用户的手指在滑动轨迹中终点的位置位于起点的位置的右边,则用户的手指向右滑动。
C.若用户的手指在滑动轨迹中终点的位置位于起点的位置的左边,则用户的手指向左滑动。
用户单手持机,手指在滑动轨迹中起点和终点的位置也有助于判断手指的滑动方向,根据手指的滑动方向,也可以大致确定用户是哪只手在持机。
步骤S303:设置用户的手指的滑动方向的权重值为w1,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
步骤S304:若用户的手指的滑动方向是向右,则确定用户右手操作模式概率增加权重值w1,若用户的手指的滑动方向是向左,则确定用户左手操作模式概率增加权重值w1。
步骤S305:若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2。
步骤S306:若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3。
其中,步骤S304、步骤S305以及步骤S306没有先后顺序。
步骤S307:比较用户右手操作模式概率和左手操作模式概率的大小。
步骤S308:若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
参阅图9,图9的实施方式中用户的手指在滑动过程中的滑动信息包括:用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指在滑动过程中经过的像素点落入的区域。具体内容如下:
步骤S401:检测到用户的手指在屏幕上滑动。
步骤S402:获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指在滑动过程中经过的像素点落入的区域。
步骤S403:设置用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
步骤S404:若用户的手指经过的像素点落入屏幕的右方区域,则确定用户右手操作模式概率增加权重值w0,若用户的手指经过的像素点落入屏幕的左方区域,则确定用户左手操作模式概率增加权重值w0。
用户单手持机,手指的活动范围有限,在滑动过程中经过的像素点基本集中在某一个区域,哪只手持机,在滑动过程中经过的像素点基本集中在靠近那只手的区域,因此,根据用户的手指在滑动过程中经过的像素点落入的区域,也可以大致确定用户是哪只手在持机。
步骤S405:若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2;
步骤S406:若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3。
其中,步骤S404、步骤S405以及步骤S406没有先后顺序。
步骤S407:比较用户右手操作模式概率和左手操作模式概率的大小。
步骤S408:若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
检测到用户的手指在屏幕上滑动,有四个参数可供获取:用户的手指在滑动过程中经过的像素点落入的区域、用户的手指的滑动方向、用户的手指在滑动过程中与所述屏幕的接触面积的变化情况以及用户的手指在滑动过程中滑动加速度的变化情况,根据实际应用情况,可以获取其中的某几个参数综合识别左右手持机概率,获取的参数越多,识别的精度越高,上述实施方式仅仅只是选取其中的几个组合,其它的组合在此不再赘叙。
其中,在上述实施方式中,用户的手指在屏幕上滑动的动作是屏幕的解锁动作。通过这种方式,能够在不需要用户增加任何额外的动作的情况下,迅速而较为准确的识别左右手持机模式;而且识别后的结果,应用于解锁之后的整个解锁周期内。
另外,上述实施方式在实际应用中,如果4个参数都获取,则具体应用过程如下:
首先,设置用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,用户的手指的滑动方向的权重值为w1,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
其次,在获取四个参数(用户的手指在滑动过程中经过的像素点落入的区域、用户的手指的滑动方向、用户的手指在滑动过程中与所述屏幕的接触面积的变化情况以及用户的手指在滑动过程中滑动加速度的变化情况)时:
若用户的手指经过的像素点落入屏幕的右方区域,则确定用户右手操作模式概率增加权重值w0,若用户的手指经过的像素点落入屏幕的左方区域,则确定用户左手操作模式概率增加权重值w0;
若用户的手指的滑动方向是向右,则确定用户右手操作模式概率增加权重值w1,若用户的手指的滑动方向是向左,则确定用户左手操作模式概率增加权重值w1;
若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2;
若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3。
再次,比较用户右手操作模式概率和左手操作模式概率的大小。
若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
例如,右手操作模式概率为w0+w2+w3,左手操作模式概率为w1,比较w0+w2+w3和w1的大小,如果w0+w2+w3大于w1,则识别用户的操作模式是右手操作模式。
其中,权重值w0、w1、w2以及w3是根据屏幕的大小和用户的手指在屏幕上滑动的长度和形状进行设置的。
例如:当手机的屏幕较大,用户单手持机时,拇指已经很难接近到对侧的边框位置(即左手触及右边框,右手触及左边框)。如果设计一种横向滑动一定距离去解锁的方法,如图10所示,使得用户在屏幕上横向画圆滑动解锁,那么用手指在滑动过程中经过的像素点落入的区域去判断显然更加有效,这时权重值w0需要放大,在这种情况下,权重值可能是w0>w2>w1>w3。
如果屏幕大小一般(即双手拇指均可触碰到对侧边框),例如设计如图11所示的解锁方式,这时,权重值需要调整为:w2>w1>w3>w0。由于图11的解锁图形两端完全对称,所以判断手指在滑动过程中经过的像素点落入的区域的权重值w0可以完全忽略不用(即w0权重值为0)。
所以,以上四个参数并不一定都需要同时利用,根据屏幕的大小和解锁滑动形状的设计,可能只利用其中几个参数进行判断。
再例如,对于更大屏幕的手机或者平板电脑,可能设计一种解锁方法:如图12所示,可以在设计解锁界面的时候将解锁滑动区域设计在左右两个下角,且单手拇指不可能碰到对侧的区域,在这种较为极端的情况下,可以只用滑动时手指落在屏幕的区域这一个权重值w0作为判断依据,即可获得准确的判断。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户操作模式的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户操作模式的概率时,能够增加识别的精度。另外,结合参数用户的手指在滑动过程中经过的像素点落入的区域和用户的手指的滑动方向,能够进一步提高识别精度;用户的手指在屏幕上滑动的动作是屏幕的解锁动作,能够在不需要用户增加任何额外的动作的情况下,迅速而较为准确的识别左右手持机模式;而且识别后的结果,应用于解锁之后的整个解锁周期内。
参阅图13,图13是本发明手持设备上用户操作模式的识别方法又一实施方式的流程图,包括:
步骤S501:当检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
步骤S502:根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。
步骤S503:根据用户的操作模式,自动切换所述手持设备操作界面的显示模式。
哪只手持机概率大,判断是哪只手持机,获得判断结果后,自动切换操作界面的显示模式,其中操作界面的显示模式包括:左手显示模式和右手显示模式;左手显示模式下,操作界面通常显示在屏幕偏左的区域,以便于用户左手操作,右手显示模式下,操作界面通常显示在屏幕偏右的区域,以便于用户右手操作。即左手显示模式与左手操作模式相匹配,右手显示模式与右手操作模式相匹配。另外,如果用户滑动屏幕的动作是解锁动作,切换后的显示模式应用于解锁后的整个解锁周期内。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式;根据用户的操作模式,将操作界面自动切换为与用户的操作模式匹配的显示模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户的操作模式的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户的操作模式时,能够增加识别的精度,从而使得操作界面的显示模式更加准确。
参阅图14,图14是本发明手持设备一实施方式的结构示意图,该手持设备10包括:检测模块101、获取模块102以及识别模块103。
需要说明的是,本实施方式的手持设备可以执行图2中的步骤。
检测模块101用于检测用户的手指是否在手持设备的屏幕上滑动。
获取模块102用于当检测模块101检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息。
识别模块103用于根据获取模块102获取的用户的手指在滑动过程中的滑动信息,识别用户的操作模式,操作模式包括:左手操作模式和右手操作模式。
用户的手指在手持设备的屏幕上滑动时,会产生许多滑动信息,获取这些滑动信息,有助于识别用户的操作模式,因此,根据用户的手指在滑动过程中的滑动信息,即可识别用户的操作模式。其中,操作模式包括:左手操作模式和右手操作模式;左手操作模式即为左手操作手持设备的模式,右手操作模式即为右手操作手持设备的模式。
由于滑动信息是在用户的手指在手持的屏幕上滑动时,自然而然的产生的,只要捕获或采集这些滑动信息,不需要额外通过其它的传感器,即可通过这些滑动信息识别用户的操作模式,降低了手持设备的成本;另外,用户的手指在滑动过程中产生的滑动信息通常包活很多相关的滑动信息,综合考虑这些滑动信息,有助于提高识别的精度。
本发明实施方式当检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息;当检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息。通过这种方式,能够在识别用户的操作模式时降低手持设备的成本,提高识别的精度。
参阅图15至图19,图15至图19是本发明手持设备五个实施方式的结构示意图,该手持设备20包括:检测模块201、获取模块202以及识别模块203。
需要说明的是,本实施方式的手持设备可以执行图3、图4、图5、图8、图9中的步骤。
检测模块201用于检测用户的手指是否在手持设备的屏幕上滑动。
获取模块202用于当检测模块201检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中的滑动信息。
识别模块203用于根据获取模块202获取的用户的手指在滑动过程中的滑动信息,识别用户的操作模式,操作模式包括:左手操作模式和右手操作模式。
其中,用户的手指在滑动过程中的滑动信息包括用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
用户的手指在屏幕上滑动时,手指与屏幕的接触面积是不断发生变化的,手指滑动的加速度也是在不断发生变化的,获取手指与屏幕的接触面积的变化情况、或获取手指滑动的加速度的变化情况、或获取手指与屏幕的接触面积的变化情况和手指滑动的加速度的变化情况。
识别模块203具体用于根据获取模块202获取的用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。
左右手指在屏幕上滑动时,手指与屏幕的接触面积的变化情况和手指滑动的加速度的变化情况,对于左右手来说变化的规律是不一样的,因此,根据手指与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,可以识别用户的操作模式。
参阅图15,如果用户的手指在滑动过程中的滑动信息是用户的手指在滑动过程中与屏幕的接触面积的变化情况,那么识别模块203包括:第一判断单元2031和第一识别单元2032。
第一判断单元2031用于判断用户的手指与屏幕的接触面积从左至右是否是逐渐变小。
第一识别单元2032用于在第一判断单元2031的判断结果为用户的手指与屏幕的接触面积从左至右是逐渐变小时,识别用户的操作模式是右手操作模式。
第一识别单元2032还用于在第一判断单元2031的判断结果为用户的手指与屏幕的接触面积从左至右是逐渐变大时,识别用户的操作模式是左手操作模式。
参阅图16,如果用户的手指在滑动过程中的滑动信息是用户的手指在滑动过程中的滑动加速度的变化情况,那么识别模块203包括:第二判断单元2033和第二识别单元2034。
第二判断单元2033用于判断用户的手指的滑动加速度从左至右是否是逐渐变大。
第二识别单元2034用于在第二判断单元2033的判断结果为用户的手指的滑动加速度从左至右是逐渐变大时,识别用户的操作模式是右手操作模式。
第二识别单元2034还用于在第二判断单元2033的判断结果为用户的手指的滑动加速度从左至右是逐渐变小时,识别用户的操作模式是左手操作模式。
参阅图17,如果用户的手指在滑动过程中的滑动信息包括用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者,那么识别模块203包括:第一设置单元2035、第一确定单元2036、第一比较单元2037、第三识别单元2038。
第一设置单元2035用于设置用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
第一确定单元2036用于在用户的手指与屏幕的接触面积从左至右是逐渐变小时,确定用户右手操作模式概率增加权重值w2,在用户的手指与屏幕的接触面积从左至右是逐渐变大时,确定用户左手操作模式概率增加权重值w2。
第一确定单元2036还用于在用户的手指的滑动加速度从左至右逐渐变大时,确定用户右手操作模式概率增加权重值w3,在用户的手指的滑动加速度从左至右逐渐变小时,确定用户左手操作模式概率增加权重值w3。
第一比较单元2037用于比较用户右手操作模式概率和左手操作模式概率的大小。
第三识别单元2038用于在第一比较单元的比较结果为用户右手操作模式概率大于左手操作模式概率时,识别用户的操作模式是右手操作模式,在第一比较单元的比较结果为用户右手操作模式概率小于左手操作模式概率时,识别用户的操作模式是左手操作模式。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户左右手持机的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户的操作模式时,能够增加识别的精度。
进一步地,在一个更优的实施例中,可以在用户滑动解锁(比如滑块解锁或者图案解锁)的过程中,获取用户的手指在滑动解锁时手指与屏幕的接触面积的变化情况、滑动速度的变化情况两者或两者之一,并进一步用户的手指在滑动解锁时手指与屏幕的接触面积的变化情况、滑动速度的变化情况两者或两者之一,识别出用户的操作模式,即用户是左手操作模式还是右手操作模式。这样在用户执行完解锁操作之后,即可识别出用户的操作模式,进而可以在用户进行下一步操作前,第一时间将用户界面切换到与用户的操作模式相匹配的形式(例如,方便左手操作的形式、或方便右手操作的形式),进一步提升了用户体验。
参阅图18,如果用户的手指在滑动过程中的滑动信息包括用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况、以及用户的手指的滑动方向,那么识别模块203包括:第二设置单元2039、第二确定单元20310、第二比较单元20311、第四识别单元20312。
第二设置单元2039用于设置用户的手指的滑动方向的权重值为w1,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;
第二确定单元20310用于在用户的手指的滑动方向是向右时,确定用户右手操作模式概率增加权重值w1,在用户的手指的滑动方向是向左时,确定用户左手操作模式概率增加权重值w1;
第二确定单元20310还用于在用户的手指与屏幕的接触面积从左至右是逐渐变小时,确定用户右手操作模式概率增加权重值w2,在用户的手指与屏幕的接触面积从左至右是逐渐变大时,确定用户左手操作模式概率增加权重值w2;
第二确定单元20310还用于在用户的手指的滑动加速度从左至右逐渐变大时,确定用户右手操作模式概率增加权重值w3,在用户的手指的滑动加速度从左至右逐渐变小时,确定用户左手操作模式概率增加权重值w3;
第二比较单元20311用于比较用户右手操作模式概率和左手操作模式概率的大小;
第四识别单元20312用于在第二比较单元20311的比较结果为用户右手操作模式概率大于左手操作模式概率时,识别用户的操作模式是右手操作模式,在第二比较单元20311的比较结果为用户右手操作模式概率小于左手操作模式概率时,识别用户的操作模式是左手操作模式。
参阅图19,如果用户的手指在滑动过程中的滑动信息包括用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况、以及用户的手指在滑动过程中经过的像素点落入的区域,那么识别模块203包括:第三设置单元20313、第三确定单元20314、第三比较单元20315、第五识别单元20316。
第三设置单元20313用于设置用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
第三确定单元20314用于在用户的手指经过的像素点落入屏幕的右方区域时,确定用户右手操作模式概率增加权重值w0,在用户的手指经过的像素点落入屏幕的左方区域时,确定用户左手操作模式概率增加权重值w0。
第三确定单元20314还用于在用户的手指与屏幕的接触面积从左至右是逐渐变小时,确定用户右手操作模式概率增加权重值w2,在用户的手指与屏幕的接触面积从左至右是逐渐变大时,确定用户左手操作模式概率增加权重值w2。
第三确定单元20314还用于在用户的手指的滑动加速度从左至右逐渐变大时,确定用户右手操作模式概率增加权重值w3,在用户的手指的滑动加速度从左至右逐渐变小时,确定用户左手操作模式概率增加权重值w3。
第三比较单元20315用于比较用户右手操作模式概率和左手操作模式概率的大小。
第五识别单元20316用于在第三比较单元20315的比较结果为用户右手操作模式概率大于左手操作模式概率时,识别用户的操作模式是右手操作模式,在第三比较单元20315的比较结果为用户右手操作模式概率小于左手操作模式概率时,识别用户的操作模式是左手操作模式。
检测到用户的手指在屏幕上滑动,有四个参数可供获取:用户的手指在滑动过程中经过的像素点落入的区域、用户的手指的滑动方向、用户的手指在滑动过程中与所述屏幕的接触面积的变化情况以及用户的手指在滑动过程中滑动加速度的变化情况,根据实际应用情况,可以获取其中的某几个参数综合识别左右手持机概率,获取的参数越多,识别的精度越高,上述实施方式仅仅只是选取其中的几个组合,其它的组合在此不再赘叙。
其中,在上述实施方式中,用户的手指在屏幕上滑动的动作是屏幕的解锁动作。通过这种方式,能够在不需要用户增加任何额外的动作的情况下,迅速而较为准确的识别左右手持机模式;而且识别后的结果,应用于解锁之后的整个解锁周期内。
另外,上述实施方式在实际应用中,如果4个参数都获取,则识别模块包括:第四设置单元、第四确定单元、第四比较单元以及第六识别单元。
第四设置单元用于设置用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,用户的手指的滑动方向的权重值为w1,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3。
第四确定单元用于在用户的手指经过的像素点落入屏幕的右方区域时,确定用户右手操作模式概率增加权重值w0,在用户的手指经过的像素点落入屏幕的左方区域时,确定用户左手操作模式概率增加权重值w0。
第四确定单元还用于在用户的手指的滑动方向是向右时,确定滑动方向结果为用户右手操作模式持机概率增加权重值w1,在用户的手指的滑动方向是向左时,确定滑动方向结果为用户左手操作模式概率增加权重值w1。
第四确定单元还用于在用户的手指与屏幕的接触面积从左至右是逐渐变小时,确定用户右手操作模式概率增加权重值w2,在用户的手指与屏幕的接触面积从左至右是逐渐变大时,确定用户左手操作模式概率增加权重值w2。
第四确定单元还用于在用户的手指的滑动加速度从左至右逐渐变大时,确定用户右手操作模式概率增加权重值w3,在用户的手指的滑动加速度从左至右逐渐变小时,确定用户左手操作模式概率增加权重值w3。
第四比较单元用于比较用户右手持机概率和左手持机概率的大小。
若用户右手操作模式概率大于左手操作模式概率,则识别用户右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户左手操作模式。
第六识别单元用于在第四比较单元的比较结果为用户右手操作模式概率大于左手操作模式概率时,识别用户的操作模式是右手操作模式,在第四比较单元的比较结果为用户右手操作模式概率小于左手操作模式概率时,识别用户的操作模式是左手操作模式。
其中,权重值w0、w1、w2以及w3是根据屏幕的大小和用户的手指在屏幕上滑动的长度和形状进行设置的。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户的操作模式的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户的操作模式概率时,能够增加识别的精度。另外,结合参数用户的手指在滑动过程中经过的像素点落入的区域和用户的手指的滑动方向,能够进一步提高识别精度;用户的手指在屏幕上滑动的动作是屏幕的解锁动作,能够在不需要用户增加任何额外的动作的情况下,迅速而较为准确的识别左右手持机模式;而且识别后的结果,应用于解锁之后的整个解锁周期内。
参阅图20,图20是本发明手持设备又一实施方式的结构示意图,该手持设备30包括:检测模块301、获取模块302、识别模块303以及切换模块304。
需要说明的是,本实施方式中的手持设备可以执行图13中的步骤。
检测模块301用于检测用户的手指是否在手持设备的屏幕上滑动。
获取模块302用于当检测模块301检测到用户的手指在手持设备的屏幕上滑动时,获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
识别模块303用于根据获取模块302获取的用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。
切换模块304用于根据识别模块303识别的用户的操作模式,自动切换操作界面的显示模式。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式;根据用户的操作模式,自动切换操作界面的显示模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户的操作模式的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户的操作模式概率时,能够增加识别的精度,从而使得操作界面的显示模式更加准确。
参阅图21,图21是本发明手持设备又一实施方式的实体结构示意图,该手持设备包括:处理器31、与处理器31耦合的存储器32、检测器33以及采集器34。
所述检测器33用于检测用户的手指是否在手持设备的屏幕上滑动。
所述采集器34用于当所述检测器33检测到用户的手指在手持设备的屏幕上滑动时,获取所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,并将所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一存储在所述存储器32中。
所述处理器31用于提取所述存储器32中存储的所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,并根据所述采集器34获取的用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别所述用户的操作模式。
其中,所述处理器31用于判断所述用户的手指与所述屏幕的接触面积从左至右是否是逐渐变小、或所述用户的手指的滑动加速度从左至右是否是逐渐变大;在判断结果为所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小、或所述用户的手指的滑动加速度从左至右是逐渐变大时,识别用户的操作模式为右手操作模式;在判断结果为所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大、或所述用户的手指的滑动加速度从左至右是逐渐变小时,识别用户的操作模式为左手操作模式。
其中,所述处理器31用于设置用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2;若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3;比较用户右手操作模式概率和左手操作模式概率的大小;若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
其中,在用户的手指在滑动过程中的滑动信息包括:用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指的滑动方向时:
所述采集器34用于获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指的滑动方向。
所述处理器31用于设置用户的手指的滑动方向的权重值为w1,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;若用户的手指的滑动方向是向右,则确定用户右手操作模式概率增加权重值w1,若用户的手指的滑动方向是向左,则确定用户左手操作模式概率增加权重值w1;若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2;若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3;比较用户右手操作模式概率和左手操作模式概率的大小;若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
其中,在用户的手指在滑动过程中的滑动信息包括:用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指在滑动过程中经过的像素点落入的区域时:
所述采集器34用于获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况以及用户的手指在滑动过程中经过的像素点落入的区域。
所述处理器31用于设置用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,用户的手指在滑动过程中与屏幕的接触面积的变化情况的权重值为w2,用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;若用户的手指经过的像素点落入屏幕的右方区域,则确定用户右手操作模式概率增加权重值w0,若用户的手指经过的像素点落入屏幕的左方区域,则确定用户左手操作模式概率增加权重值w0;若用户的手指与屏幕的接触面积从左至右是逐渐变小,则确定用户右手操作模式概率增加权重值w2,若用户的手指与屏幕的接触面积从左至右是逐渐变大,则确定用户左手操作模式概率增加权重值w2;若用户的手指的滑动加速度从左至右逐渐变大,则确定用户右手操作模式概率增加权重值w3,若用户的手指的滑动加速度从左至右逐渐变小,则确定用户左手操作模式概率增加权重值w3;比较用户右手操作模式概率和左手操作模式概率的大小;若用户右手操作模式概率大于左手操作模式概率,则识别用户的操作模式是右手操作模式,若用户右手操作模式概率小于左手操作模式概率,则识别用户的操作模式是左手操作模式。
其中,所述处理器31用于设置所述用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,所述用户的手指的滑动方向的权重值为w1,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;在所述采集器34获取的所述用户的手指经过的像素点落入所述屏幕的右方区域时,确定所述用户右手操作模式概率增加权重值w0,在所述采集器34获取的所述用户的手指经过的像素点落入所述屏幕的左方区域时,确定所述用户左手操作模式概率增加权重值w0;在所述采集器34获取的所述用户的手指的滑动方向是向右时,确定所述用户右手操作模式概率增加权重值w1,在所述采集器34获取的所述用户的手指的滑动方向是向左时,确定所述用户左手操作模式概率增加权重值w1;在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小时,确定所述用户右手操作模式概率增加权重值w2,在所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大时,确定所述用户左手操作模式概率增加权重值w2;在所述用户的手指的滑动加速度从左至右逐渐变大时,确定所述用户右手操作模式概率增加权重值w3,在所述用户的手指的滑动加速度逐渐变小时,确定所述用户左手操作模式概率增加权重值w3;比较所述用户右手操作模式概率和左手操作模式概率的大小;在比较结果为所述用户右手操作模式概率大于左手操作模式概率时,识别所述用户的操作模式为右手操作模式,在比较结果为所述用户右手操作模式概率小于左手操作模式概率时,识别所述用户的操作模式为左手操作模式。
其中,所述权重值w0、w1、w2以及w3是根据所述屏幕的大小和所述用户的手指在所述屏幕上滑动的长度和形状进行设置的。
其中,所述用户的手指在屏幕上滑动的动作是所述屏幕的解锁动作。
本发明实施方式获取用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一;根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一,识别用户的操作模式。通过这种方式,一方面不需要额外的成本且丰富了识别用户的操作模式的方式,另一方面,当根据用户的手指在滑动过程中与屏幕的接触面积的变化情况、滑动加速度的变化情况两者识别用户左右手持机概率时,能够增加识别的精度。另外,结合参数用户的手指在滑动过程中经过的像素点落入的区域和用户的手指的滑动方向,能够进一步提高识别精度;用户的手指在屏幕上滑动的动作是屏幕的解锁动作,能够在不需要用户增加任何额外的动作的情况下,迅速而较为准确的识别左右手持机模式;而且识别后的结果,应用于解锁之后的整个解锁周期内。
在本发明所提供的几个实施方式中,应该理解到,所揭露的系统,装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施方式仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施方式方案的目的。
另外,在本发明各个实施方式中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本发明各个实施方式所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述仅为本发明的实施方式,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。

Claims (20)

  1. 一种手持设备上用户操作模式的识别方法,其特征在于,包括:
    当检测到用户的手指在手持设备的屏幕上滑动时,获取所述用户的手指在滑动过程中的滑动信息;
    根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,所述操作模式包括:左手操作模式和右手操作模式。
  2. 根据权利要求1所述的方法,其特征在于,所述用户的手指在滑动过程中的滑动信息包括所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况、滑动加速度的变化情况两者或两者之一。
  3. 根据权利要求2所述的方法,其特征在于,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:
    若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则识别所述用户的操作模式是右手操作模式;
    若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则识别所述用户的操作模式是左手操作模式。
  4. 根据权利要求2所述的方法,其特征在于,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:
    若所述用户的手指的滑动加速度从左至右是逐渐变大,则识别所述用户的操作模式是右手操作模式;
    若所述用户的手指的滑动加速度从左至右是逐渐变小,则识别所述用户的操作模式是左手操作模式。
  5. 根据权利要求2所述的方法,其特征在于,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:
    设置所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;
    若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则确定所述用户右手操作模式概率增加权重值w2,若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则确定所述用户左手操作模式概率增加权重值w2;
    若所述用户的手指的滑动加速度从左至右逐渐变大,则确定所述用户右手操作模式概率增加权重值w3,若所述用户的手指的滑动加速度从左至右逐渐变小,则确定所述用户左手操作模式概率增加权重值w3;
    比较所述用户右手操作模式概率和左手操作模式概率的大小;
    若所述用户右手操作模式概率大于左手操作模式概率,则识别所述用户的操作模式是右手操作模式,若所述用户右手操作模式概率小于左手操作模式概率,则识别所述用户的操作模式是左手操作模式。
  6. 根据权利要求2所述的方法,其特征在于,所述用户的手指在滑动过程中的滑动信息还包括所述用户的手指的滑动方向;
    所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:
    设置所述用户的手指的滑动方向的权重值为w1,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;
    若所述用户的手指的滑动方向是向右,则确定所述用户右手操作模式概率增加权重值w1,若所述用户的手指的滑动方向是向左,则确定所述用户左手操作模式概率增加权重值w1;
    若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则确定所述用户右手操作模式概率增加权重值w2,若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则确定所述用户左手操作模式概率增加权重值w2;
    若所述用户的手指的滑动加速度从左至右逐渐变大,则确定所述用户右手操作模式概率增加权重值w3,若所述用户的手指的滑动加速度从左至右逐渐变小,则确定所述用户左手操作模式概率增加权重值w3;
    比较所述用户右手操作模式概率和左手操作模式概率的大小;
    若所述用户右手操作模式概率大于左手操作模式概率,则识别所述用户的操作模式是右手操作模式,若所述用户右手操作模式概率小于左手操作模式概率,则识别所述用户的操作模式是左手操作模式。
  7. 根据权利要求2所述的方法,其特征在于,所述用户的手指在滑动过程中的滑动信息还包括所述用户的手指在滑动过程中经过的像素点落入的区域;
    所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式,包括:
    设置所述用户的手指在滑动过程中经过的像素点落入的区域的权重值为w0,所述用户的手指在滑动过程中与所述屏幕的接触面积的变化情况的权重值为w2,所述用户的手指在滑动过程中滑动加速度的变化情况的权重值为w3;
    若所述用户的手指经过的像素点落入所述屏幕的右方区域,则确定所述用户右手操作模式概率增加权重值w0,若所述用户的手指经过的像素点落入所述屏幕的左方区域,则确定所述用户左手操作模式概率增加权重值w0;
    若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变小,则确定所述用户右手操作模式概率增加权重值w2,若所述用户的手指与所述屏幕的接触面积从左至右是逐渐变大,则确定所述用户左手操作模式概率增加权重值w2;
    若所述用户的手指的滑动加速度从左至右逐渐变大,则确定所述用户右手操作模式概率增加权重值w3,若所述用户的手指的滑动加速度从左至右逐渐变小,则确定所述用户左手操作模式概率增加权重值w3;
    比较所述用户右手操作模式概率和左手操作模式概率的大小;
    若所述用户右手操作模式概率大于左手操作模式概率,则识别所述用户的操作模式是右手操作模式,若所述用户右手操作模式概率小于左手操作模式概率,则识别所述用户的操作模式是左手操作模式。
  8. 根据权利要求5、6、7任一项所述的方法,其特征在于,所述权重值w0、w1、w2以及w3是根据所述屏幕的大小和所述用户的手指在所述屏幕上滑动的长度和形状进行设置的。
  9. 根据权利要求1-7任一项所述的方法,其特征在于,所述用户的手指在屏幕上滑动的动作是所述屏幕的解锁动作。
  10. 根据权利要求1-7任一项所述的方法,其特征在于,所述根据所述用户的手指在滑动过程中的滑动信息,识别所述用户的操作模式之后,还包括:
    根据所述用户的操作模式,自动切换所述手持设备操作界面的显示模式。
  11. A handheld device, comprising: a detection module, an acquisition module and a recognition module, wherein
    the detection module is configured to detect whether a finger of a user slides on a screen of the handheld device;
    the acquisition module is configured to: when the detection module detects that the user's finger slides on the screen of the handheld device, acquire sliding information of the user's finger during the sliding process;
    the recognition module is configured to recognize an operation mode of the user according to the sliding information, acquired by the acquisition module, of the user's finger during the sliding process, wherein the operation mode comprises a left-hand operation mode and a right-hand operation mode.
  12. The handheld device according to claim 11, wherein the sliding information of the user's finger during the sliding process comprises either or both of a change in a contact area between the user's finger and the screen during the sliding process and a change in a sliding acceleration during the sliding process.
  13. The handheld device according to claim 12, wherein the recognition module comprises a first judging unit and a first recognition unit;
    the first judging unit is configured to judge whether the contact area between the user's finger and the screen gradually decreases from left to right;
    the first recognition unit is configured to recognize that the operation mode of the user is the right-hand operation mode when the judgment result of the first judging unit is that the contact area between the user's finger and the screen gradually decreases from left to right;
    the first recognition unit is further configured to recognize that the operation mode of the user is the left-hand operation mode when the judgment result of the first judging unit is that the contact area between the user's finger and the screen gradually increases from left to right.
  14. The handheld device according to claim 12, wherein the recognition module comprises a second judging unit and a second recognition unit;
    the second judging unit is configured to judge whether the sliding acceleration of the user's finger gradually increases from left to right;
    the second recognition unit is configured to recognize that the operation mode of the user is the right-hand operation mode when the judgment result of the second judging unit is that the sliding acceleration of the user's finger gradually increases from left to right;
    the second recognition unit is further configured to recognize that the operation mode of the user is the left-hand operation mode when the judgment result of the second judging unit is that the sliding acceleration of the user's finger gradually decreases from left to right.
  15. The handheld device according to claim 12, wherein the recognition module comprises a first setting unit, a first determining unit, a first comparing unit and a third recognition unit;
    the first setting unit is configured to set a weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and set a weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
    the first determining unit is configured to: when the contact area between the user's finger and the screen gradually decreases from left to right, increase a right-hand operation mode probability of the user by the weight value w2, and when the contact area between the user's finger and the screen gradually increases from left to right, increase a left-hand operation mode probability of the user by the weight value w2;
    the first determining unit is further configured to: when the sliding acceleration of the user's finger gradually increases from left to right, increase the right-hand operation mode probability of the user by the weight value w3, and when the sliding acceleration of the user's finger gradually decreases from left to right, increase the left-hand operation mode probability of the user by the weight value w3;
    the first comparing unit is configured to compare the right-hand operation mode probability of the user with the left-hand operation mode probability of the user;
    the third recognition unit is configured to recognize that the operation mode of the user is the right-hand operation mode when the comparison result of the first comparing unit is that the right-hand operation mode probability of the user is greater than the left-hand operation mode probability, and recognize that the operation mode of the user is the left-hand operation mode when the comparison result of the first comparing unit is that the right-hand operation mode probability of the user is less than the left-hand operation mode probability.
  16. The handheld device according to claim 12, wherein the sliding information of the user's finger during the sliding process further comprises a sliding direction of the user's finger;
    the recognition module comprises a second setting unit, a second determining unit, a second comparing unit and a fourth recognition unit;
    the second setting unit is configured to set a weight value of the sliding direction of the user's finger to w1, set a weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and set a weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
    the second determining unit is configured to: when the sliding direction of the user's finger is rightward, increase a right-hand operation mode probability of the user by the weight value w1, and when the sliding direction of the user's finger is leftward, increase a left-hand operation mode probability of the user by the weight value w1;
    the second determining unit is further configured to: when the contact area between the user's finger and the screen gradually decreases from left to right, increase the right-hand operation mode probability of the user by the weight value w2, and when the contact area between the user's finger and the screen gradually increases from left to right, increase the left-hand operation mode probability of the user by the weight value w2;
    the second determining unit is further configured to: when the sliding acceleration of the user's finger gradually increases from left to right, increase the right-hand operation mode probability of the user by the weight value w3, and when the sliding acceleration of the user's finger gradually decreases from left to right, increase the left-hand operation mode probability of the user by the weight value w3;
    the second comparing unit is configured to compare the right-hand operation mode probability of the user with the left-hand operation mode probability of the user;
    the fourth recognition unit is configured to recognize that the operation mode of the user is the right-hand operation mode when the comparison result of the second comparing unit is that the right-hand operation mode probability of the user is greater than the left-hand operation mode probability, and recognize that the operation mode of the user is the left-hand operation mode when the comparison result of the second comparing unit is that the right-hand operation mode probability of the user is less than the left-hand operation mode probability.
  17. The handheld device according to claim 12, wherein the sliding information of the user's finger during the sliding process further comprises a region into which the pixels passed by the user's finger during the sliding process fall;
    the recognition module comprises a third setting unit, a third determining unit, a third comparing unit and a fifth recognition unit;
    the third setting unit is configured to set a weight value of the region into which the pixels passed by the user's finger during the sliding process fall to w0, set a weight value of the change in the contact area between the user's finger and the screen during the sliding process to w2, and set a weight value of the change in the sliding acceleration of the user's finger during the sliding process to w3;
    the third determining unit is configured to: when the pixels passed by the user's finger fall into a right region of the screen, increase a right-hand operation mode probability of the user by the weight value w0, and when the pixels passed by the user's finger fall into a left region of the screen, increase a left-hand operation mode probability of the user by the weight value w0;
    the third determining unit is further configured to: when the contact area between the user's finger and the screen gradually decreases from left to right, increase the right-hand operation mode probability of the user by the weight value w2, and when the contact area between the user's finger and the screen gradually increases from left to right, increase the left-hand operation mode probability of the user by the weight value w2;
    the third determining unit is further configured to: when the sliding acceleration of the user's finger gradually increases from left to right, increase the right-hand operation mode probability of the user by the weight value w3, and when the sliding acceleration of the user's finger gradually decreases from left to right, increase the left-hand operation mode probability of the user by the weight value w3;
    the third comparing unit is configured to compare the right-hand operation mode probability of the user with the left-hand operation mode probability of the user;
    the fifth recognition unit is configured to recognize that the operation mode of the user is the right-hand operation mode when the comparison result of the third comparing unit is that the right-hand operation mode probability of the user is greater than the left-hand operation mode probability, and recognize that the operation mode of the user is the left-hand operation mode when the comparison result of the third comparing unit is that the right-hand operation mode probability of the user is less than the left-hand operation mode probability.
  18. The handheld device according to any one of claims 15, 16 and 17, wherein the weight values w0, w1, w2 and w3 are set according to the size of the screen and the length and shape of the slide of the user's finger on the screen.
  19. The handheld device according to any one of claims 11 to 17, wherein the sliding action of the user's finger on the screen is an unlock action for the screen.
  20. The handheld device according to any one of claims 11 to 17, wherein the handheld device further comprises a switching module, and the switching module is configured to automatically switch a display mode of an operation interface of the handheld device according to the operation mode of the user.
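
The weighted decision recited in claims 5 to 7 can be pictured with a minimal sketch. The Python fragment below only illustrates that logic under stated assumptions: it presumes the touch framework already delivers per-sample x-coordinates, contact areas and sliding accelerations, and every name in it (SlideSample, recognize_operation_mode, the default weight values) is hypothetical rather than taken from the disclosure, which does not prescribe any particular implementation.

    # Minimal sketch of the weighted recognition described in claims 5-7.
    # All names and default weights below are illustrative assumptions,
    # not part of the patent disclosure.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class SlideSample:
        x: float             # horizontal position of the touch point (pixels)
        contact_area: float  # finger-screen contact area reported by the panel
        acceleration: float  # sliding acceleration at this sample

    def _trend_left_to_right(values: List[float], xs: List[float]) -> float:
        """Positive if `values` grow from left to right, negative if they shrink."""
        ordered = [v for _, v in sorted(zip(xs, values))]
        return ordered[-1] - ordered[0]

    def recognize_operation_mode(samples: List[SlideSample], screen_width: float,
                                 w0: float = 1.0, w1: float = 1.0,
                                 w2: float = 2.0, w3: float = 2.0) -> Optional[str]:
        if len(samples) < 2:
            return None  # not enough data to evaluate a slide
        xs = [s.x for s in samples]
        right_prob = left_prob = 0.0

        # w0: region of the screen the traversed pixels fall into (claim 7)
        if sum(xs) / len(xs) > screen_width / 2:
            right_prob += w0
        else:
            left_prob += w0

        # w1: sliding direction (claim 6)
        if xs[-1] > xs[0]:
            right_prob += w1
        else:
            left_prob += w1

        # w2: change of the contact area from left to right (claims 3 and 5)
        area_trend = _trend_left_to_right([s.contact_area for s in samples], xs)
        if area_trend < 0:       # area shrinks towards the right -> right hand
            right_prob += w2
        elif area_trend > 0:     # area grows towards the right -> left hand
            left_prob += w2

        # w3: change of the sliding acceleration from left to right (claims 4 and 5)
        acc_trend = _trend_left_to_right([s.acceleration for s in samples], xs)
        if acc_trend > 0:
            right_prob += w3
        elif acc_trend < 0:
            left_prob += w3

        if right_prob > left_prob:
            return "right-hand operation mode"
        if left_prob > right_prob:
            return "left-hand operation mode"
        return None  # undecided

As claims 8 and 18 note, the weight values w0 to w3 would in practice be chosen according to the size of the screen and the length and shape of the slide; the defaults above are placeholders only.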
PCT/CN2015/072531 2014-03-31 2015-02-09 Method for recognizing user operation mode on handheld device, and handheld device WO2015149588A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020197008553A KR20190035938A (ko) 2014-03-31 2015-02-09 Method for identifying user operation mode on handheld device, and handheld device
KR1020167030111A KR101963782B1 (ko) 2014-03-31 2015-02-09 Method for identifying user operation mode on handheld device, and handheld device
JP2016559832A JP6272502B2 (ja) 2014-03-31 2015-02-09 Method for identifying user operation mode on portable device, and portable device
EP15772243.0A EP3118733B1 (en) 2014-03-31 2015-02-09 Method for recognizing operation mode of user on handheld device, and handheld device
US15/279,733 US10444951B2 (en) 2014-03-31 2016-09-29 Method and device for identifying a left-hand or a right-hand mode of operation on a user handheld device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410126867.1 2014-03-31
CN201410126867.1A CN103870199B (zh) 2014-03-31 2014-03-31 Method for recognizing user operation mode on handheld device, and handheld device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/279,733 Continuation US10444951B2 (en) 2014-03-31 2016-09-29 Method and device for identifying a left-hand or a right-hand mode of operation on a user handheld device

Publications (1)

Publication Number Publication Date
WO2015149588A1 true WO2015149588A1 (zh) 2015-10-08

Family

ID=50908787

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/072531 WO2015149588A1 (zh) 2014-03-31 2015-02-09 Method for recognizing user operation mode on handheld device, and handheld device

Country Status (7)

Country Link
US (1) US10444951B2 (zh)
EP (1) EP3118733B1 (zh)
JP (1) JP6272502B2 (zh)
KR (2) KR101963782B1 (zh)
CN (1) CN103870199B (zh)
TW (1) TW201602867A (zh)
WO (1) WO2015149588A1 (zh)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104281367A (zh) * 2014-09-28 2015-01-14 北京数字天域科技股份有限公司 Mobile terminal system management method and apparatus
CN105511774A (zh) * 2014-10-17 2016-04-20 深圳Tcl新技术有限公司 Display terminal interface display method and apparatus
CN105700782A (zh) * 2014-11-25 2016-06-22 中兴通讯股份有限公司 Method and apparatus for adjusting virtual key layout, and mobile terminal
CN106155525A (zh) * 2015-04-14 2016-11-23 中兴通讯股份有限公司 Method and apparatus for recognizing terminal operation manner
CN106210242A (zh) * 2015-04-29 2016-12-07 宇龙计算机通信科技(深圳)有限公司 Antenna switching method and apparatus, and terminal
CN105260121A (zh) * 2015-10-23 2016-01-20 东莞酷派软件技术有限公司 Terminal control method, terminal control apparatus and terminal
CN107346171A (zh) * 2016-05-04 2017-11-14 深圳市中兴微电子技术有限公司 Operation mode switching method and apparatus
WO2019056393A1 (zh) 2017-09-25 2019-03-28 华为技术有限公司 Terminal interface display method and terminal
CN107704190B (zh) * 2017-11-06 2020-07-10 Oppo广东移动通信有限公司 Gesture recognition method and apparatus, terminal, and storage medium
CN110769096A (zh) * 2019-10-21 2020-02-07 Oppo(重庆)智能科技有限公司 Motor vibration method, terminal, and storage medium
CN111078087A (zh) * 2019-11-25 2020-04-28 深圳传音控股股份有限公司 Mobile terminal, control mode switching method, and computer-readable storage medium
CN113553568B (zh) * 2020-04-23 2024-06-18 京东科技控股股份有限公司 Human-machine recognition method, slider verification method, apparatus, medium and device
CN113204305B (zh) * 2021-04-30 2023-06-09 网易(杭州)网络有限公司 Grip mode detection method and apparatus for mobile terminal, medium, and mobile terminal
TWI775474B (zh) * 2021-06-07 2022-08-21 華碩電腦股份有限公司 Portable electronic device and one-handed touch operation method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102624977A (zh) * 2012-02-17 2012-08-01 深圳市金立通信设备有限公司 System and method for switching mobile phone interface according to user's left-hand or right-hand usage habit
US20130212535A1 (en) * 2012-02-13 2013-08-15 Samsung Electronics Co., Ltd. Tablet having user interface
CN103354581A (zh) * 2013-06-14 2013-10-16 广东欧珀移动通信有限公司 Method and system for automatically adjusting mobile phone controls according to left or right hand

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008204402A (ja) * 2007-02-22 2008-09-04 Eastman Kodak Co User interface device
US9134972B2 (en) * 2008-04-02 2015-09-15 Kyocera Corporation User interface generation apparatus
JP2010277197A (ja) * 2009-05-26 2010-12-09 Sony Corp Information processing apparatus, information processing method, and program
US20100310136A1 (en) 2009-06-09 2010-12-09 Sony Ericsson Mobile Communications Ab Distinguishing right-hand input and left-hand input based on finger recognition
CN101599001B (zh) * 2009-07-13 2012-11-14 青岛海信移动通信技术股份有限公司 Touchscreen display interface updating method and multimedia electronic device
JP2011081646A (ja) * 2009-10-08 2011-04-21 Seiko Epson Corp Information processing apparatus, information processing method, and program
WO2011158475A1 (ja) * 2010-06-16 2011-12-22 パナソニック株式会社 Information input device, information input method, and program
KR101694787B1 (ko) * 2010-06-30 2017-01-10 엘지전자 주식회사 Mobile terminal and control method of mobile terminal
US8471869B1 (en) * 2010-11-02 2013-06-25 Google Inc. Optimizing display orientation
CN102096513B (zh) * 2011-02-23 2014-04-16 惠州Tcl移动通信有限公司 Sliding solution method for a touchscreen and electronic device using the method
US20130100063A1 (en) * 2011-04-20 2013-04-25 Panasonic Corporation Touch panel device
JP2013003949A (ja) * 2011-06-20 2013-01-07 Nec Casio Mobile Communications Ltd Information terminal device, input method, and program
CN103176724A (zh) * 2011-12-21 2013-06-26 富泰华工业(深圳)有限公司 Electronic device and method with an operation interface switchable between left-hand and right-hand use modes
TWI493438B (zh) * 2012-01-09 2015-07-21 Amtran Technology Co Ltd Touch control method
US8863042B2 (en) * 2012-01-24 2014-10-14 Charles J. Kulas Handheld device with touch controls that reconfigure in response to the way a user operates the device
CN202475551U (zh) * 2012-03-20 2012-10-03 深圳市金立通信设备有限公司 System for adjusting mobile phone display interface based on left-hand or right-hand usage habit
JP2013232118A (ja) * 2012-04-27 2013-11-14 Panasonic Corp Operation processing apparatus
CN102799268A (zh) 2012-07-03 2012-11-28 广东欧珀移动通信有限公司 Left-hand/right-hand recognition method for handheld terminal
CN104520798B (zh) * 2012-08-08 2018-07-03 日本电气株式会社 Portable electronic device, and control method and program thereof
CN102830935B (zh) * 2012-08-22 2015-05-06 上海华勤通讯技术有限公司 Touch terminal and operation interface adjustment method
JP2014041498A (ja) * 2012-08-23 2014-03-06 Sanyo Electric Co Ltd Communication terminal device
US8665238B1 (en) * 2012-09-21 2014-03-04 Google Inc. Determining a dominant hand of a user of a computing device
US8782549B2 (en) * 2012-10-05 2014-07-15 Google Inc. Incremental feature-based gesture-keyboard decoding
EP2720172A1 (en) * 2012-10-12 2014-04-16 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Video access system and method based on action type detection
TWM461837U (zh) * 2012-10-15 2013-09-11 Guan-Ru Wang Sliding unlock device for mobile device
KR101995278B1 (ko) * 2012-10-23 2019-07-02 삼성전자 주식회사 UI display method and apparatus for touch device
US8769431B1 (en) * 2013-02-28 2014-07-01 Roy Varada Prasad Method of single-handed software operation of large form factor mobile electronic devices
DE102013011689A1 (de) * 2013-07-12 2015-01-15 e.solutions GmbH Method and device for processing touch signals of a touchscreen

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130212535A1 (en) * 2012-02-13 2013-08-15 Samsung Electronics Co., Ltd. Tablet having user interface
CN102624977A (zh) * 2012-02-17 2012-08-01 深圳市金立通信设备有限公司 System and method for switching mobile phone interface according to user's left-hand or right-hand usage habit
CN103354581A (zh) * 2013-06-14 2013-10-16 广东欧珀移动通信有限公司 Method and system for automatically adjusting mobile phone controls according to left or right hand

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3118733A4 *

Also Published As

Publication number Publication date
KR20160136446A (ko) 2016-11-29
US20170017799A1 (en) 2017-01-19
JP2017518553A (ja) 2017-07-06
CN103870199B (zh) 2017-09-29
CN103870199A (zh) 2014-06-18
JP6272502B2 (ja) 2018-01-31
TW201602867A (zh) 2016-01-16
EP3118733A1 (en) 2017-01-18
EP3118733A4 (en) 2017-03-08
KR101963782B1 (ko) 2019-03-29
TWI560595B (zh) 2016-12-01
US10444951B2 (en) 2019-10-15
KR20190035938A (ko) 2019-04-03
EP3118733B1 (en) 2018-10-17

Similar Documents

Publication Publication Date Title
WO2015149588A1 (zh) Method for recognizing user operation mode on handheld device, and handheld device
WO2016104922A1 (ko) Wearable electronic device
WO2016060397A1 (en) Method and apparatus for processing screen using device
WO2018030594A1 (en) Mobile terminal and method for controlling the same
WO2017034116A1 (en) Mobile terminal and method for controlling the same
WO2016209020A1 (en) Image processing apparatus and image processing method
WO2017043857A1 (ko) Application providing method and electronic device therefor
WO2016129784A1 (en) Image display apparatus and method
WO2016072674A1 (en) Electronic device and method of controlling the same
WO2012144666A1 (en) Display device and control method thereof
WO2014035113A1 (en) Method of controlling touch function and an electronic device thereof
WO2014035054A1 (en) Method and apparatus for controlling zoom function in an electronic device
WO2016074235A1 (zh) Method and apparatus for controlling moving object, and mobile device
WO2019168238A1 (ko) Mobile terminal and control method therefor
WO2015180013A1 (zh) Touch operation method and apparatus for terminal
WO2014027818A2 (en) Electronic device for displaying touch region to be shown and method thereof
WO2017007090A1 (en) Display device and method of controlling therefor
WO2015012629A1 (en) Method of processing input and electronic device thereof
WO2021025534A1 (ko) Electronic device for providing camera preview image and operating method thereof
WO2016089074A1 (en) Device and method for receiving character input through the same
WO2021054784A1 (en) Electronic device and method for changing user interface according to user input
WO2017159931A1 (en) Electronic device including touch panel and method of controlling the electronic device
WO2017206892A1 (zh) Sensor processing method and apparatus for mobile terminal, storage medium, and electronic device
WO2016195197A1 (en) Pen terminal and method for controlling the same
WO2020149600A1 (en) Electronic device and operation method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15772243

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016559832

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015772243

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015772243

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20167030111

Country of ref document: KR

Kind code of ref document: A