CN115525182A - Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment


Info

Publication number: CN115525182A
Application number: CN202110704135.6A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: finger, mobile phone, activity area, user, fingers
Inventors: 海庆, 刘喜龙, 闫昊, 窦峥, 曾文科
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202110704135.6A
Publication of CN115525182A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The application relates to an electronic device and a method for adjusting the finger activity area of a virtual keyboard of the electronic device. The finger activity area adjusting method is applied to an electronic device and includes the following steps: a user triggers keys of a virtual keyboard displayed on a second electronic device through gesture changes in a first finger activity area; when the position of the input gesture changes, a first position of the first finger activity area is acquired; and after position change information between the first position and a second position after the change is determined, the first finger activity area is adjusted to a second finger activity area. With this method, if the user's fingers deviate from the previous activity area while the user performs gesture operations to control the virtual keyboard, the fingers' activity areas are calibrated from the position difference before and after the deviation, without having to determine the activity areas again, which improves the performance of the electronic device and the user experience.

Description

Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment
Technical Field
The present application relates to communication technology, and in particular to an electronic device and a method for adjusting the finger activity area of a virtual keyboard of the electronic device.
Background
With the development of mobile office technology, a mobile phone can extend the content it displays to the screens of other intelligent devices. For example, as shown in fig. 1, the mobile phone 100 extends the display of the opened document application 101 to the screen of the smart television 200. A virtual keyboard is displayed on the screen of the smart television 200, and the user performs gesture operations on the screen of the mobile phone 100 to input characters and thereby modify the content of the document application 101. Generally, before the user operates the virtual keyboard through the mobile phone 100, the mobile phone 100 needs to determine the activity areas of the fingers, so as to establish a mapping relationship between the fingers' gesture operations in the activity areas and the keys of the virtual keyboard.
Disclosure of Invention
The application aims to provide an electronic device and a method for adjusting the finger activity area of a virtual keyboard of the electronic device. With this method, if the user's finger deviates from its previous activity area while the user performs gesture operations to control the virtual keyboard, the finger's activity area is calibrated from the position difference before and after the deviation, without having to determine the activity area again, which improves the performance of the electronic device and the user experience.
A first aspect of the present application provides a method for adjusting a finger activity area of a virtual keyboard, which is applied to an electronic device, and the method includes:
coordinate information of a first finger activity area on a screen of a first electronic device is stored on the first electronic device, where a gesture change by a user of the first electronic device in the first finger activity area can trigger a key of a virtual keyboard displayed on a second electronic device;
when detecting that the position of the user's input gesture in the first activity area has changed, the first electronic device acquires a first position of at least one finger of the user at the time the first finger activity area was determined;
the first electronic device calculates position change information between the acquired first position of the finger and a second position of the finger after the position change;
and the first electronic device adjusts the first finger activity area to a second finger activity area according to the position change information.
That is, in the embodiment of the present application, the first electronic device may be a mobile phone and the second electronic device may be a smart television. The first finger activity area may be an activity area of a finger on the screen of the first electronic device, and its coordinate information may be coordinates in a rectangular coordinate system established on the screen of the first electronic device. The first finger activity area can be determined from the stroke position of the finger gesture, i.e. the first position, and the second finger activity area can be determined from the stroke position of the changed finger gesture, i.e. the second position. After the finger's input gesture has changed, the first electronic device may adjust the first finger activity area to the second finger activity area according to the difference between the first position and the second position before and after the gesture change, without recalculating the second finger activity area from scratch.
In one possible implementation of the first aspect, the gesture change in the first finger activity area includes click, swipe and fold-back gesture operations of at least one of the user's fingers.
That is, in the embodiment of the present application, taking the index finger as an example, the index finger's gesture changes in the first finger activity area include: clicking, swiping up, swiping down, folding back up-then-down, and folding back down-then-up. Mapping relationships can be established between different gesture changes and the keys of the virtual keyboard, so that the virtual keyboard can be controlled through gestures.
In one possible implementation of the first aspect, the change in the position of the input gesture includes:
the finger performing the input gesture contacting a different location on the screen of the first electronic device, and the direction of the trajectory generated by the input gesture on the screen of the first electronic device changing.
That is, in the embodiment of the present application, taking the index finger as an example, a change in the position of the index finger's input gesture includes: the starting position of the input gesture remaining unchanged while the final position changes, or both the starting position and the final position changing simultaneously.
In one possible implementation of the first aspect, the position change information includes an angular difference of the finger before and after the position change.
That is, in the embodiment of the present application, taking the index finger as an example, the angle difference before and after the index finger's position change may be the included angle between the vector from the initial position to the final position before the change and the corresponding vector after the change.
In a possible implementation of the first aspect, the first electronic device obtains the second finger activity area by rotating the first finger activity area by the angle difference.
That is, in the embodiment of the present application, taking the index finger as an example, after the angle difference before and after the index finger's position change is determined, the index finger's first finger activity area can be adjusted to the second finger activity area by rotating it by that angle difference.
in one possible implementation of the first aspect, the position change information includes a translation distance and a translation direction of the finger before and after the position change.
That is, in the embodiment of the present application, taking the index finger as an example, when the vectors from the initial position to the final position before and after the index finger's position change are parallel, the position change information may be the coordinate difference between the positions before and after the change.
In a possible implementation of the first aspect, the first electronic device obtains the second finger activity area by translating the first finger activity area by the translation distance according to the translation direction.
That is, in the embodiment of the present application, taking the index finger as an example, the first finger activity area is translated according to the coordinate difference before and after the index finger's position change to obtain the second finger activity area.
In one possible implementation of the first aspect, the position change information includes a translation distance, a translation direction, and an angle difference of the finger before and after the position change.
That is, in the embodiment of the present application, taking the index finger as an example, there is an angle difference between the vectors from the starting position to the final position before and after the index finger's position change, as well as a coordinate difference between the starting and final positions.
In a possible implementation manner of the first aspect, the first electronic device obtains the second finger activity area by rotating the first finger activity area by the angle difference and translating the first finger activity area by the translation distance according to the translation direction.
That is, in the embodiment of the present application, taking the index finger as an example, the first finger activity area is adjusted to the second finger activity area by rotation and translation according to the coordinate difference and the angle difference before and after the index finger's position change.
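As a rough illustration of the three implementations above, the adjustment can be sketched as rotating and/or translating the stored coordinates of the activity area (a minimal sketch, assuming the activity area is represented by corner points in the screen's rectangular coordinate system; the function and variable names are illustrative and not from the patent):

```python
import math

def adjust_activity_area(corners, angle_rad=0.0, pivot=(0.0, 0.0), shift=(0.0, 0.0)):
    """Rotate an activity area by angle_rad about pivot, then translate by shift.

    corners: (x, y) points describing the first finger activity area.
    Returns the corresponding points of the second finger activity area.
    """
    px, py = pivot
    dx, dy = shift
    cos_t, sin_t = math.cos(angle_rad), math.sin(angle_rad)
    adjusted = []
    for x, y in corners:
        # Rotate about the pivot (e.g. the finger's first position) ...
        rx = (x - px) * cos_t - (y - py) * sin_t + px
        ry = (x - px) * sin_t + (y - py) * cos_t + py
        # ... then translate by the measured displacement.
        adjusted.append((rx + dx, ry + dy))
    return adjusted

area = [(100, 200), (140, 200), (140, 400), (100, 400)]
# Pure rotation, pure translation, or both fall out of the same routine:
print(adjust_activity_area(area, angle_rad=math.radians(10), pivot=(120, 300)))
print(adjust_activity_area(area, shift=(15, -5)))
```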
In one possible implementation of the first aspect described above, in the first finger activity area, there are a plurality of locations corresponding to keys of a virtual keyboard displayed on the second electronic device.
That is, in the embodiment of the present application, taking the index finger as an example, the index finger's first activity area may establish a mapping relationship with keys of the virtual keyboard such as "4", "5", "r", "t", "f", "g", "c" and "v". A mapping relationship can thus be established between the index finger's click operations in the first activity area and the keys of the virtual keyboard, and the virtual keyboard can be controlled through gestures.
A second aspect of the present application provides an electronic device, comprising: a memory in which instructions are stored, and a processor configured to read and execute the instructions in the memory, so that the electronic device performs the finger activity area adjustment method of the virtual keyboard provided in the first aspect above.
A third aspect of the present application provides a computer-readable storage medium, which contains instructions, and when the instructions are executed by a controller of an electronic device, the instructions cause the electronic device to implement the finger activity area adjustment method for a virtual keyboard provided in the foregoing first aspect.
Drawings
Fig. 1 illustrates a scenario in which a mobile phone receives gesture operations to control content extended for display on the smart television 200, according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a mobile phone receiving a five-finger press gesture operation performed by a user, according to an embodiment of the present application;
fig. 3 is a schematic flowchart illustrating a process in which a mobile phone establishes mapping relationships between gesture operations and the keys of a virtual keyboard, according to an embodiment of the present application;
FIG. 4 illustrates a schematic diagram of a mobile phone performing left- and right-hand recognition, in accordance with an embodiment of the present application;
figs. 5 (a) and 5 (b) are schematic diagrams illustrating a mobile phone recognizing a four-finger inward-stroke operation, according to an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a mobile phone determining the stroke positions corresponding to a four-finger inward-stroke operation, according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a mobile phone determining the directions corresponding to a four-finger inward-stroke operation, according to an embodiment of the present application;
FIGS. 8 (a) and 8 (b) are schematic diagrams illustrating a mobile phone determining the length of the four fingers' activity areas from the four fingers' stroke positions, according to embodiments of the present application;
FIGS. 9 (a) and 9 (b) show schematic diagrams of a mobile phone generating the four fingers' activity areas, according to embodiments of the present application;
FIG. 10 is a diagram illustrating the states of a user's four fingers, according to an embodiment of the present application;
fig. 11 is a schematic flowchart illustrating a process in which a user controls a virtual keyboard displayed on a smart television screen by performing gesture operations on a mobile phone, according to an embodiment of the present application;
fig. 12 is a schematic diagram illustrating an extended display established between a mobile phone and a smart television, according to an embodiment of the present application;
FIG. 13 illustrates a schematic diagram of a user performing a gesture operation on the screen of a mobile phone, in accordance with an embodiment of the present application;
FIG. 14 shows a schematic flowchart of a mobile phone calibrating a finger's activity area, according to an embodiment of the present application;
FIG. 15 shows a schematic diagram of a user's palm tilting on the screen of a mobile phone, according to an embodiment of the present application;
FIG. 16 shows a schematic diagram of a user's palm being displaced on the screen of a mobile phone, according to an embodiment of the present application;
FIG. 17 is a schematic flowchart illustrating a mobile phone calibrating a user's gesture operations, according to an embodiment of the present application;
FIG. 18 illustrates a schematic diagram of a user's finger stroke positions becoming tilted, in accordance with an embodiment of the present application;
FIG. 19 is a schematic flowchart illustrating another way in which a mobile phone calibrates a user's gesture operations, according to an embodiment of the present application;
FIG. 20 illustrates a schematic diagram of a user's finger stroke positions being displaced, in accordance with an embodiment of the present application;
FIG. 21 illustrates a schematic diagram of a mobile phone determining the activity areas of the fingers of both of a user's hands, according to an embodiment of the present application;
FIG. 22 illustrates a mobile phone determining a mapping between the activity areas of the fingers of both of a user's hands and the keys of a virtual keyboard, according to an embodiment of the present application;
FIG. 23 is a flowchart illustrating a method for a mobile phone to switch from gesture operation to touch operation, according to an embodiment of the present application;
FIG. 24 is a flowchart illustrating a method for a mobile phone to switch from touch operation to gesture operation, according to an embodiment of the present application;
FIG. 25 shows a schematic diagram of a mobile phone, according to an embodiment of the present application;
fig. 26 shows a block diagram of the software architecture of a mobile phone, according to an embodiment of the present application.
Detailed Description
Embodiments of the present application include, but are not limited to, an electronic device and a method for adjusting a finger activity area of a virtual keyboard thereof. To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The embodiments of the present application are described by taking the mobile phone 100 and the smart television 200 in fig. 1 as an example, where the user performs gesture operations on the screen of the mobile phone 100 to operate a virtual keyboard and input text.
The embodiments of the present application are not limited to the mobile phone 100 and the smart television 200; they may also be applied to terminal devices providing voice and/or data connectivity to users. Common terminal devices include, for example: in-vehicle devices, mobile phones, tablet computers, laptop computers, palmtop computers, mobile Internet devices (MIDs), wearable devices (for example smartwatches, smart bands and pedometers), personal digital assistants, portable media players, navigation devices, video game devices, set-top boxes, virtual reality and/or augmented reality devices, Internet of Things devices, industrial control devices, streaming media client devices, electronic books, reading devices, POS terminals, and other devices. In another embodiment of the present application, the terminal device may implement the method of the present application by connecting an external display or an intelligent display.
The gesture operations performed by the user on the screen of the mobile phone 100 may include: a swipe operation or a click operation. Before the user performs the gesture operation through the screen of the mobile phone 100, the mobile phone 100 needs to determine an active area of the finger of the user on the screen of the mobile phone 100, and further establish a mapping relationship between the gesture operation performed by the finger in the active area and the key position of the virtual keyboard.
The method by which the mobile phone 100 determines the activity areas of the user's fingers on the screen of the mobile phone 100 will now be described. In some embodiments of the present application, as shown in fig. 2, taking the user's right hand as an example: before performing gesture operations on the screen of the mobile phone 100, the user places the five fingers of the right hand on the screen of the mobile phone 100 and performs a five-finger press gesture, where the thumb (T) of the right hand is to be placed within an edge preset area 1001 of the screen of the mobile phone 100 and the remaining fingers are to be placed on the screen outside the edge preset area 1001; from this, it can be determined that the user's right hand is in the posture for gesture operation. The mobile phone 100 may then establish a rectangular coordinate system on the screen; for example, as shown in fig. 2, the lower left corner of the screen of the mobile phone 100 is the origin of the rectangular coordinate system, and the initial positions of the five fingers on the screen are determined from the coordinates of the five fingers of the right hand in that coordinate system. Next, the index finger (I), middle finger (M), ring finger (R) and little finger (L) of the user's right hand perform the four-finger inward-stroke operation; the mobile phone 100 records the final positions of the four fingers after the stroke, and obtains the first stroke position of each finger from the coordinates of the final position and the initial position. Taking the right index finger as an example, as shown in fig. 2, the first stroke position between the final position and the initial position of the right index finger is marked as segment A; the track is then extended by the length of segment A in the direction of the inward-stroke operation to form segment B, and finally extended by the same length in the direction opposite to the inward-stroke operation to form segment C. Segment A + segment B + segment C is the central track of the first activity area of the right index finger, and the central track is expanded outward to form the first activity area of the right index finger. It will be appreciated that the first stroke position of the right index finger here may be a stroke vector between the initial position and the final position of the right index finger.
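As a rough sketch of this construction (the function name and point-based representation are illustrative assumptions, not from the patent), the central track can be derived from the stroke vector alone:

```python
def center_track(psi, pei):
    """Build the A + B + C central track of a finger's activity area.

    psi, pei: (x, y) initial and final positions of the inward stroke.
    Segment A runs from psi to pei; B extends it by |A| in the stroke
    direction; C extends it by |A| in the opposite direction.
    Returns the two end points of the full track (end of C, end of B).
    """
    ax, ay = pei[0] - psi[0], pei[1] - psi[1]  # stroke vector, length |A|
    b_end = (pei[0] + ax, pei[1] + ay)          # A extended beyond Pei
    c_end = (psi[0] - ax, psi[1] - ay)          # A extended beyond Psi
    return c_end, b_end

# A downward inward stroke of 200 px yields a 600 px central track:
print(center_track((300, 900), (300, 700)))  # ((300, 1100), (300, 500))
```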
Next, the method by which the mobile phone 100 determines the activity areas of the user's fingers and establishes the mapping relationships between the fingers' gesture operations in their activity areas and the keys of the virtual keyboard will be described with reference to fig. 3. The scheme shown in fig. 3 may be implemented by the processor 110 of the mobile phone 100 invoking the associated program.
Specifically, as shown in fig. 3, the scheme of the method includes:
s301: it is determined whether the user holds the five-finger pressing operation. If yes, the process goes to S302, and the mobile phone 100 acquires the positions of the five fingers in the screen. If not, returning to S301, a prompt is prompted and determination is continued as to whether the user performs a five-finger press operation.
Here, taking the user's right hand as an example, the five-finger press operation that the mobile phone 100 detects requires the thumb (T) of the right hand to be placed within the edge preset area 1001 of the screen of the mobile phone 100 and the remaining fingers to be placed on the screen outside the edge preset area 1001. In other embodiments of the present application, if, for example, the user performs a five-finger press gesture but the thumb falls outside the edge preset area 1001 of the screen of the mobile phone 100, or the user performs a gesture with only four fingers, the mobile phone 100 determines that no five-finger press operation has been detected.
S302: and acquiring initial positions corresponding to the fingers after the five-finger pressing operation is performed.
For example, as shown in fig. 2, the mobile phone 100 may establish a rectangular coordinate system in the screen, and determine the initial positions of the five fingers on the screen through the coordinates of the five fingers of the right hand in the rectangular coordinate system.
S303: left-right hand recognition and five-finger recognition are performed.
As shown in FIG. 4, the thumb, index finger and little finger in FIG. 4 are denoted by T, I and L, respectively. The mobile phone 100 performs left-right hand recognition through the following steps. The mobile phone 100 first determines the finger pressed within the edge preset area 1001 of the screen to be the thumb, and then calculates the distances between the initial positions of the remaining four fingers and the initial position of the thumb: the finger nearest to the thumb is the index finger, and the finger farthest from the thumb is the little finger. A perpendicular line D is then generated through the center point between the index finger and the little finger; if the thumb is located on the right side of the perpendicular line D, the hand is the left hand, and if it is on the left side of the perpendicular line D, the hand is the right hand. Finally, of the two remaining fingers, the mobile phone 100 determines the one adjacent to the index finger to be the middle finger and the one adjacent to the little finger to be the ring finger.
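A hedged sketch of this classification follows. The perpendicular-line test is realized here by comparing the thumb's projection onto the index-to-little-finger axis against the axis midpoint, which is one way of deciding on which side of the perpendicular D the thumb lies; all names are illustrative:

```python
def classify_hand(thumb, others):
    """Identify the pressing hand from five touch points.

    thumb: (x, y) of the finger pressed in the edge preset area 1001.
    others: (x, y) points of the remaining four fingers.
    """
    # Nearest to the thumb is the index finger, farthest the little finger;
    # of the two in between, the middle finger adjoins the index finger and
    # the ring finger adjoins the little finger.
    ranked = sorted(others,
                    key=lambda p: (p[0] - thumb[0]) ** 2 + (p[1] - thumb[1]) ** 2)
    index, little = ranked[0], ranked[-1]
    # Decide on which side of the perpendicular D the thumb lies by projecting
    # it onto the index->little axis relative to the axis midpoint.
    mid = ((index[0] + little[0]) / 2, (index[1] + little[1]) / 2)
    axis = (little[0] - index[0], little[1] - index[1])
    side = (thumb[0] - mid[0]) * axis[0] + (thumb[1] - mid[1]) * axis[1]
    return 'left' if side > 0 else 'right'

print(classify_hand((80, 1000), [(200, 760), (320, 700), (440, 720), (560, 780)]))  # right
```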
After the mobile phone 100 determines that the user presses the screen of the mobile phone 100 with the right hand, the mobile phone 100 may acquire initial positions of five fingers and store the initial positions. For example, the mobile phone 100 may use Pst, psi, psm, psr, psl to respectively represent the initial positions of five fingers, i.e. the thumb, the index finger, the middle finger, the ring finger, and the little finger, on the screen through the rectangular coordinate system of the screen of the mobile phone 100 shown in fig. 2.
S304: and prompting and determining whether the user performs the four-finger stroke operation. If the operation is a four-finger stroke operation, the step S305 is entered; if no or wrong gesture operation is detected, the process returns to S304 to continue to detect the next operation performed by the user.
Here, after the user performs the four-finger stroke operation, the mobile phone 100 may determine whether the user performs the stroke operation according to whether the distance between any one of the four fingers and the thumb is shortened, for example, the distance between the index finger I and the thumb T as shown in fig. 5 (a). In another embodiment of the present application, the mobile phone 100 determines whether the user performs the inner stroke operation according to the distance between the index finger I and the thumb T as shown in fig. 5 (a). The mobile phone 100 may further determine whether the user has performed the inner stroke operation according to whether the distance H between the center point G between the index finger I, the middle finger M, the ring finger R, and the small finger L and the edge preset region 1001 of the screen of the mobile phone 100 becomes short as shown in fig. 5 (b).
It is understood that after the mobile phone 100 detects that the user performs the non-four-finger stroke operation, for example, after the mobile phone 100 detects that the user performs the gesture operation of sliding down two fingers, the mobile phone 100 may prompt the user to perform the four-finger stroke operation again.
S305: and acquiring the stroking position corresponding to the four-finger inward stroking operation.
Here, after the mobile phone 100 detects the four-finger inward-stroke operation, it can obtain each finger's stroke position from the coordinates of the finger's final position after the stroke and its initial position.
For example, as shown in fig. 6, Pei, Pem, Per and Pel denote the final positions on the screen of the four fingers: the index finger I, middle finger M, ring finger R and little finger L, respectively. For the index finger, the stroke distance can be expressed as |Pei - Psi|. Similarly, the mobile phone 100 may acquire and save the final positions of the fingers of the user's right hand.
In the embodiment of the application, after obtaining the stroke positions of the four fingers, the mobile phone 100 may further determine whether the four-finger inward-stroke operation is a same-direction four-finger inward stroke by judging whether the included angle between the stroke positions of each pair of adjacent fingers is smaller than an included-angle threshold. For example, as shown in fig. 7, α, β and γ are the included angles between the stroke positions of the index finger I and the middle finger M, of the middle finger M and the ring finger R, and of the ring finger R and the little finger L, respectively; if all three included angles are smaller than the included-angle threshold of 30 degrees, the mobile phone 100 determines that the four-finger inward-stroke operation is a four-finger same-direction stroke.
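The same-direction test can be sketched as follows (a minimal illustration; the 30-degree threshold is the one named above, everything else is assumed):

```python
import math

def same_direction_stroke(strokes, threshold_deg=30.0):
    """Check that adjacent fingers' stroke vectors differ by < threshold_deg.

    strokes: stroke vectors (Pe - Ps) for index, middle, ring, little fingers;
    the pairwise angles correspond to alpha, beta and gamma in the text.
    """
    def angle_deg(a, b):
        dot = a[0] * b[0] + a[1] * b[1]
        norm = math.hypot(a[0], a[1]) * math.hypot(b[0], b[1])
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

    return all(angle_deg(strokes[i], strokes[i + 1]) < threshold_deg
               for i in range(len(strokes) - 1))

print(same_direction_stroke([(0, -180), (5, -200), (-5, -190), (10, -170)]))  # True
```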
S306: and determining and storing the activity areas of the four fingers according to the stroking positions of the four fingers.
In the embodiment of the present application, taking the index finger as an example, as shown in fig. 8 (a), the mobile phone 100 records the length of the stroke position of the index finger I as a segment a, then extends the length of the stroke position again in the direction of the inward stroke operation as a segment B, and finally extends the length of the stroke position again in the direction opposite to the direction of the inward stroke operation as a segment C, where the segment a + segment B + segment C is the central track of the active region of the index finger of the right hand. Here, the length of the a section can be calculated by | Pei-Psi |, where the operation symbol | | | denotes the length of the vector for calculating the final position to the initial position of the index finger, that is, the modulus of the calculated vector.
Here, when the user performs the four-finger stroke operation, the four fingers are not parallel-lined in the same direction, but the index finger I and the little finger L are inclined to the middle finger M and the ring finger R, respectively, and therefore, the mobile phone 100 needs to adjust the directions of the B segments corresponding to the index finger I and the little finger L.
For example, the mobile phone 100 may obtain the directions X of the inner stroke operations of the middle finger M and the ring finger R, respectively, where X may be the average of the directions of the inner stroke operations of the middle finger M and the ring finger R. Then, the mobile phone 100 determines the direction of the segment B of the index finger I by the direction of the stroke operation of the index finger I and the direction X, and the perpendicular distance between the end of the segment B of the index finger I and the segment B of the middle finger M is greater than the preset minimum distance value. Similarly, the mobile phone 100 determines the direction of the section B of the little finger L by the direction of the stroke operation of the little finger L and the direction X, and the distance between the end of the section B of the little finger L and the section B of the ring finger R is greater than the preset minimum distance value. In this way, as shown in fig. 8 (B), it can be avoided that the distance between the ends of the segment B corresponding to the index finger I and the small finger L and the ends of the segment B corresponding to the index finger I and the small finger L are too close, so that the mobile phone 100 finally determines that the active areas of the index finger I and the small finger L are too small.
Next, as shown in fig. 9 (a), the mobile phone 100 calculates an active area of the index finger, which may be composed of four areas I1 to I4, where I1 may be the left side of the middle line between the index finger a + C segment and the middle finger a + C segment; i2 can be the left side of the middle line between the index finger section B and the middle finger section B; i3 can be the left extending distance of the index finger A + C section is consistent with the I1 area; i4 may be the left side of the index B segment extending for a distance consistent with the size of the top of I2. The widths of the four regions I1 to I4 may be half of the distance between the a-segment of the index finger to the a-segment of the middle finger.
Similarly, the mobile phone 100 can calculate the activity areas of the middle finger, ring finger and little finger in the same way as the activity area of the index finger.
The activity area of the middle finger may be composed of four regions M1 to M4: M3 lies to the right of the midline between the index finger's A + C segment and the middle finger's A + C segment; M4 lies to the right of the midline between the index finger's B segment and the middle finger's B segment; M1 lies to the left of the midline between the middle finger's A + C segment and the ring finger's A + C segment; and M2 lies to the left of the midline between the middle finger's B segment and the ring finger's B segment.
The activity area of the ring finger may be composed of four regions R1 to R4: R3 lies to the right of the midline between the middle finger's A + C segment and the ring finger's A + C segment; R4 lies to the right of the midline between the middle finger's B segment and the ring finger's B segment; R1 lies to the left of the midline between the ring finger's A + C segment and the little finger's A + C segment; and R2 lies to the left of the midline between the ring finger's B segment and the little finger's B segment.
The activity area of the little finger may be composed of four regions L1 to L4: L3 lies to the right of the midline between the ring finger's A + C segment and the little finger's A + C segment; L4 lies to the right of the midline between the ring finger's B segment and the little finger's B segment; L1 extends to the right of the little finger's A + C segment by a distance consistent with L3; and L2 extends to the right of the little finger's B segment by a distance consistent with the top of L4.
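The width rule shared by these regions can be sketched as below (summarizing each A segment by a single representative point is our simplification):

```python
def lateral_half_width(a_start, neighbour_a_start):
    """Half the distance between a finger's A segment and its neighbour's,
    used as the lateral half-width of the finger's activity-area regions."""
    dx = neighbour_a_start[0] - a_start[0]
    dy = neighbour_a_start[1] - a_start[1]
    return 0.5 * (dx * dx + dy * dy) ** 0.5

# e.g. index finger's A segment at x = 300, middle finger's at x = 380:
print(lateral_half_width((300, 900), (380, 900)))  # 40.0
```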
It is understood that, in another embodiment of the present application, the B segments of the four fingers shown in fig. 9 (b) may also be located within the edge preset area 1001. In step S301, when the mobile phone 100 receives the user's five-finger press operation, the positions where the four fingers other than the thumb fall cannot be within the edge preset area 1001, because that would interfere with the mobile phone 100's identification of the thumb. Once the mobile phone 100 has identified the position of the thumb, it may allow the four fingers' activity areas to extend into the edge preset area 1001 when it determines those areas from the four fingers' stroke positions.
S307: and establishing a mapping relation between the gesture operation of the four fingers in the corresponding active area and the key positions of the virtual keyboard.
First, the mobile phone 100 acquires a gesture operation performed by the user on the screen.
Here, the user's four fingers can be in one of three states, shown in fig. 10: the four-finger default state, the four-finger contracted state, and the four-finger relaxed state.
Taking the index finger as an example: if the index finger swipes down from the default-state position to the contracted-state position, the gesture operation is a short downward swipe of the index finger. If the index finger swipes down starting from the relaxed-state position, then swiping down to the default-state position is a short downward swipe of the index finger, while swiping down to the contracted-state position is a long downward swipe of the index finger.
If the index finger swipes up from the default-state position to the relaxed-state position, the gesture operation is a short upward swipe of the index finger. If the index finger swipes up starting from the contracted-state position, then swiping up to the default-state position is a short upward swipe of the index finger, while swiping up to the relaxed-state position is a long upward swipe of the index finger.
If the index finger swipes up from the default-state position to the relaxed-state position and then returns to the default-state position, the gesture operation is an up-then-down fold-back of the index finger; if the index finger swipes down from the default-state position to the contracted-state position and then returns to the default-state position, the gesture operation is a down-then-up fold-back of the index finger.
It is understood that, in addition to the six gesture operations above, the index finger's gesture operations may also include a click operation. Besides the index finger, the user's middle finger, ring finger and little finger also support these seven gesture operations.
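A sketch of how these seven gestures might be classified from a finger's start, end and turning positions (assuming, per the fig. 2 coordinate system with the origin at the lower-left corner, that contracted_y < default_y < relaxed_y; the tolerance and all names are assumptions, not from the patent):

```python
def classify_gesture(start_y, end_y, turn_y, default_y, relaxed_y, contracted_y, tol=20):
    """Classify a one-finger gesture against the three finger states.

    turn_y is the trajectory's turning point for fold-back gestures,
    or None for monotonic gestures; tol is a matching tolerance in px.
    """
    def near(a, b):
        return abs(a - b) <= tol

    if near(start_y, end_y):
        if turn_y is not None and near(turn_y, relaxed_y):
            return 'up-then-down fold-back'
        if turn_y is not None and near(turn_y, contracted_y):
            return 'down-then-up fold-back'
        return 'click'
    if end_y < start_y:  # downward swipe
        return ('long downward swipe'
                if near(start_y, relaxed_y) and near(end_y, contracted_y)
                else 'short downward swipe')
    return ('long upward swipe'
            if near(start_y, contracted_y) and near(end_y, relaxed_y)
            else 'short upward swipe')

# Default-state position down to the contracted-state position:
print(classify_gesture(500, 400, None, 500, 600, 400))  # short downward swipe
```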
Then, the mapping relationships between the gesture operations and the keys of the virtual keyboard are configured.
Here, the mobile phone 100 may determine the mapping relationships between gesture operations and the keys of the virtual keyboard through combinations of the four fingers paired with the four fingers' seven gesture operations: click, short upward swipe, short downward swipe, long upward swipe, long downward swipe, up-then-down fold-back, and down-then-up fold-back. The index, middle, ring and little fingers are denoted I, M, R and L, respectively.
As shown in table 1, the mobile phone 100 can establish mapping relationships between the keys of the virtual keyboard and the gesture operations of ten finger combinations, covering single-finger and two-finger combinations: I, R, M, L, IM, MR, RL, IR, ML and IL. Here IM, MR, RL, IR, ML and IL represent the combinations of index and middle fingers, middle and ring fingers, ring and little fingers, index and ring fingers, middle and little fingers, and index and little fingers, respectively. It will be appreciated that the above finger combinations are exemplary, and the mobile phone 100 may prompt the user to select finger combinations that are easy to operate.
(Table 1 is reproduced as an image in the original publication.)
TABLE 1
In another embodiment of the present application, as shown in table 2, the mobile phone 100 can establish mapping relationships between the keys of the virtual keyboard and the gesture operations of seven finger combinations, covering single-finger and two-finger combinations: I, R, M, L, IM, MR and RL.
(Table 2 is reproduced as an image in the original publication.)
TABLE 2
It is to be understood that the pairings of the above single-finger and two-finger combinations with the gesture operations may be arbitrary, and the larger the screen of the mobile phone 100, the greater the number of available pairings of finger combinations and gesture operations. For example, from the ten finger combinations covering single-finger and two-finger combinations (I, R, M, L, IM, MR, RL, IR, ML and IL) and the seven gesture operations, the mobile phone 100 may determine seventy mapping relationships with the keys of the virtual keyboard. The mapping relationships between gesture operations and keys of the virtual keyboard in tables 1 and 2 may be stored in the internal memory of the mobile phone 100.
Here, the mobile phone 100 establishes the mapping relationships between the four fingers' gesture operations in their activity areas and the keys of the virtual keyboard, so that the mobile phone 100 can receive the four-finger gesture operations performed by the user and thereby operate the keys of the virtual keyboard. It can be understood that, during the user's four-finger gesture operations, the thumb needs to remain pressed within the edge preset area 1001 of the screen of the mobile phone 100, so that the thumb can act as a support point for the four-finger gesture operations. The virtual keyboard may be the virtual keyboard displayed on the screen of the smart television 200 as in fig. 1.
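A hedged sketch of how such a mapping might be stored and queried follows. The actual assignments appear in tables 1 and 2, which are reproduced as images in the original publication; apart from the IM entry, which fig. 13 associates with the "x" key (the swipe direction is our assumption), the entries below are invented placeholders:

```python
# (finger combination, gesture operation) -> virtual-keyboard key.
GESTURE_KEYMAP = {
    ('IM', 'short downward swipe'): 'x',  # fig. 13 example; direction assumed
    ('I', 'click'): 'f',                  # placeholder entry
    ('MR', 'long upward swipe'): 'g',     # placeholder entry
}

def lookup_key(fingers, gesture):
    """Return the mapped virtual-keyboard key, or None if unmapped."""
    return GESTURE_KEYMAP.get((fingers, gesture))

print(lookup_key('IM', 'short downward swipe'))  # x
```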
Next, with reference to fig. 11, a method for the user to operate the virtual keyboard through the mobile phone 100 is described, taking as an example the mobile phone 100 extending the display of the user's document application 101 to the screen of the smart television 200 and the user performing gesture operations on the mobile phone 100 to input text into the document application 101 on the screen of the smart television 200. The scheme shown in fig. 11 may be implemented by the processor 110 of the mobile phone 100 invoking the associated program.
Specifically, as shown in fig. 11, the scheme of the method includes:
s1101: the cell phone 100 detects an operation of opening an application by the user.
The mobile phone 100 may respond to a user clicking an icon of the document application 101 on the desktop of the mobile phone 100, that is, a "document" icon in a User Interface (UI) of the mobile phone 100, and after the mobile phone 100 receives an instruction of a user click operation, the mobile phone 100 starts the document application 101.
S1102: the mobile phone 100 establishes a communication connection with the smart television 200 and sends an extended display request to the smart television 200.
For example, the mobile phone 100 may establish a communication connection with the smart television 200 through a wireless communication mode such as Bluetooth, Wi-Fi or NFC. In some embodiments, the mobile phone 100 may also connect to the smart television 200 through wired communication, for example through a data line and a Universal Serial Bus (USB) interface.
Before or after the mobile phone 100 establishes the communication connection with the smart television 200, the user may turn on the extended display function of the mobile phone 100. As shown in fig. 12, after starting the document application 101, the user may click the extended display key 1002 on the screen of the mobile phone 100 and select sending an extended display request to the smart television 200 that is communicatively connected to the mobile phone 100.
S1103: after receiving the extended display request, the smart tv 200 displays the document application 101 on the screen.
Here, after the smart tv 200 receives the extended display request, the document application 101 may be displayed in a partial area within the screen as shown in fig. 1. In other embodiments of the present application, the smart tv 200 may also display the document application 101 in a full-screen display manner.
S1104: the cellular phone 100 determines whether or not the start operation of the gesture operation is detected. If so, go to S1105; if no or an erroneous startup operation is detected, the process returns to S1104 to continue to detect the next startup operation of the user.
Here, the start operation of the gesture operation may be a five-finger pressing operation, which may be the same as in S301 described above, and it is ensured that the Thumb (T, thumb) of the user is to be dropped on the edge preset region 1001 of the screen of the mobile phone 100, and the remaining fingers are to be dropped on the screen of the mobile phone 100 outside the edge preset region 1001. In another embodiment of the present application, the starting operation of the gesture operation may also be a single-finger pressing operation performed by a thumb in the edge preset area 1001 of the screen of the mobile phone 100.
It is to be appreciated that the handset 100 can also determine whether the user has performed an initial gesture operation. If yes, the mobile phone 100 executes the above steps S301 to S307, the mobile phone 100 configures the activity area of the fingers of the user, and establishes a mapping relationship between the gesture operation of the four fingers in the activity area and the key positions of the virtual keyboard.
S1105: the mobile phone 100 determines the activity areas of the user's fingers and configures the mapping relationships between the four fingers' gesture operations in their activity areas and the keys of the virtual keyboard.
Here, determining the activity areas of the user's fingers and configuring the mapping relationships between the four fingers' gesture operations in their activity areas and the keys of the virtual keyboard may be implemented by the processor 110 of the mobile phone 100 executing steps S301 to S307 described above.
S1106: the mobile phone 100 acquires gesture operations of the user in the four-finger activity area, and outputs content corresponding to the gesture operations on the screen of the smart television 100.
Here, as shown in fig. 13, with the configuration of table 1, the user's index finger I and middle finger M simultaneously perform a gesture operation of a short stroke on the screen of the mobile phone 100, the gesture operation corresponds to the "x" key on the virtual keyboard, and the text "x" is input in the document application 101 on the screen of the smart tv 200.
It can be understood that, in the process of the gesture operation performed by the user, in the case that the finger of the user deviates from the previous active area, for example, the hand of the user leaves the screen of the mobile phone 100 and returns, so that the mobile phone 100 needs to re-acquire the active area of the finger of the user, at this time, the mobile phone 10 needs to perform the steps S1104 to S1106 again, the process is time consuming, and if the finger of the user deviates from the stroking position many times, the process needs to be repeated many times, which results in poor user experience.
To solve this problem, in the technical solution of the present application, when the user first performs gesture operations with the mobile phone 100, the mobile phone 100 may determine a first stroke position and a first activity area of the user's finger on the screen. When the user's finger deviates from the first stroke position, the mobile phone 100 may obtain a second stroke position of the finger on the screen of the mobile phone 100, determine the tilt angle and/or displacement distance of the user's hand from the first stroke position and the second stroke position, rotate the first activity area by the tilt angle and/or translate it by the displacement distance to obtain a second activity area for the finger, and thus calibrate the finger's activity area from the first activity area to the second activity area.
Steps S1104 to S1106 above describe how the user starts gesture operation on the mobile phone 100 by performing a five-finger press gesture on its screen: before the user performs gesture operations with the mobile phone 100 for the first time, the mobile phone 100 determines the positions and activity areas of the user's fingers and establishes the mapping relationships between the fingers' gesture operations in their activity areas and the keys of the virtual keyboard; then, when the mobile phone 100 receives an input gesture performed by a finger in its corresponding activity area, the content corresponding to the input gesture is displayed on the screen of the smart television 200. In the embodiment of the present application, while the user performs gesture operations on the screen of the mobile phone 100, the positions of the user's fingers on the screen may differ from the finger positions already stored in steps S301 to S307. For example, after the mapping relationship between the activity areas and the virtual keyboard has been confirmed, if the user's hand returns to the screen of the mobile phone 100, it is likely to be tilted relative to the stored finger positions. In that case, the mobile phone 100 can calibrate the fingers' activity areas after the return according to the tilt angle between the finger positions before and after the return, and the processes of steps S301 to S307 need not be performed again. It can be understood that, while the user performs gesture operations, or after the user's hand leaves the screen of the mobile phone 100 and returns, the user needs to keep the thumb pressed in the edge preset area 1001 of the screen throughout the process in which the mobile phone 100 calibrates the finger activity areas.
Fig. 14 shows a scheme of the gesture calibration method, which can be implemented by the processor 110 of the mobile phone 100 invoking the associated program. The scheme includes:
s1401: it is determined whether the user performs a five-finger pressing operation. If yes, the process goes to S1402, and the mobile phone 100 acquires the position of the five fingers in the screen. If not, returning to S1401, a prompt is prompted and determination is continued as to whether the user performs a five-finger press operation.
In the embodiment of the present application, the mobile phone 100 may determine whether the user performs the five-finger pressing operation in the same manner as in S301.
S1402: and acquiring the current position corresponding to each finger after the five-finger pressing operation is executed.
For example, the mobile phone 100 may determine the initial positions of the five fingers on the screen by the coordinates of the five fingers of the right hand in the rectangular coordinate system in the same manner as in S302.
S1403: and calculating the angle difference between the current vector between the two fingers and the saved vector between the two fingers, and further obtaining the inclination angle of the palm of the user.
For example, as shown in fig. 15, taking the index finger I and the little finger L as an example, the mobile phone 100 first obtains the saved vector between the initial positions Psi and Psl of the index finger I and the little finger L, a = Psi - Psl = <a_x, a_y>, where a_x and a_y are the coordinates of vector a, and the vector between the current positions Psi1 and Psl1 of the index finger and the little finger, b = Psi1 - Psl1 = <b_x, b_y>, where b_x and b_y are the coordinates of vector b.
Next, the mobile phone 100 calculates the tilt angle θ of the user's palm using the following formulas:
a · b = |a||b| · cos θ
θ = arccos(a · b / (|a||b|))
Expanded in coordinates: θ = arccos((a_x·b_x + a_y·b_y) / (sqrt(a_x^2 + a_y^2) · sqrt(b_x^2 + b_y^2))).
That is, the dot product of the two vectors equals the product of their lengths multiplied by cos θ, from which the included angle θ, i.e. the tilt angle θ of the user's palm, is obtained.
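In code, the same computation can be sketched as follows (clamping guards against floating-point overshoot; note that arccos yields an unsigned angle, and the sign of the 2-D cross product a_x*b_y - a_y*b_x can distinguish the rotation direction, which the formula leaves implicit):

```python
import math

def palm_tilt_angle(psi, psl, psi1, psl1):
    """Angle between saved vector a = Psi - Psl and current vector b = Psi1 - Psl1."""
    a = (psi[0] - psl[0], psi[1] - psl[1])
    b = (psi1[0] - psl1[0], psi1[1] - psl1[1])
    dot = a[0] * b[0] + a[1] * b[1]
    norm = math.hypot(a[0], a[1]) * math.hypot(b[0], b[1])
    return math.acos(max(-1.0, min(1.0, dot / norm)))  # theta, in radians

print(palm_tilt_angle((200, 760), (560, 780), (210, 740), (565, 810)))
```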
S1404: updating the stroke positions of the four fingers based on the inclined angle of the palm.
For example, taking the index finger as an example: the current position of the index finger is Psi1; the pre-saved initial and final positions of the index finger are Psi and Pei, respectively; and the pre-determined stroke vector of the index finger is Pei - Psi = <c_x, c_y>. After the palm tilts by the angle θ, the current final position of the index finger becomes Pei1 = <c_x·cos θ - c_y·sin θ, c_x·sin θ + c_y·cos θ> + Psi. Similarly, the mobile phone 100 may determine the current final positions of the middle finger, the ring finger and the little finger after the palm tilts by the angle θ.
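A sketch of this update (illustrative names; theta is the tilt angle obtained in S1403):

```python
import math

def rotate_final_position(psi, pei, theta):
    """Update a finger's saved final position after the palm tilts by theta.

    c = Pei - Psi is the saved stroke vector; it is rotated by theta and
    re-anchored at the initial position: Pei1 = R(theta) * c + Psi.
    """
    cx, cy = pei[0] - psi[0], pei[1] - psi[1]
    return (cx * math.cos(theta) - cy * math.sin(theta) + psi[0],
            cx * math.sin(theta) + cy * math.cos(theta) + psi[1])

print(rotate_final_position((300, 900), (300, 700), math.radians(10)))
```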
S1405: and calculating and storing the activity areas of the four fingers according to the four-finger stroke positions.
In the embodiment of the present application, the process of S1405 is the same as that of step S306 described above, and is not described in detail here. After the mobile phone 100 recalculates the activity area of the four fingers, the mobile phone 100 updates the mapping relationship between the gesture operation of the four fingers in the activity area and the key positions of the virtual keyboard.
It will be appreciated that, in some embodiments of the present application, the user's fingers may only be displaced, without tilting, between their positions on the screen of the mobile phone 100 and the stored finger positions, i.e. the hand is panned when it leaves the screen of the mobile phone 100 and returns. In this case, the mobile phone 100 can calculate the position difference between the current thumb position and the stored initial thumb position, and then update the initial positions and final positions of the four fingers accordingly.
For example, as shown in fig. 16, the initial position of the thumb is Pst = <tx, ty>, and the current thumb position is Pst1 = <tx1, ty1>. The difference between the current position of the thumb and the stored initial position of the thumb is Pst1 − Pst = <Δx, Δy>.
After the position difference between the current position of the thumb and the saved initial position of the thumb is acquired, the initial positions and the final positions of the remaining four fingers other than the thumb are updated using the position difference. Taking the index finger as an example, if the stored initial position of the index finger is Psi = <ix, iy>, the updated initial position may be Psi1 = Psi + <Δx, Δy> = <ix + Δx, iy + Δy>. Similarly, the updated final position may be Pei1 = Pei + <Δx, Δy>.
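For the pure-translation case, the update reduces to adding the thumb's displacement to each saved position; a sketch under the same illustrative naming:

def translate_positions(pst, pst1, saved_positions):
    # pst: saved initial thumb position Pst; pst1: current thumb position Pst1
    dx, dy = pst1[0] - pst[0], pst1[1] - pst[1]   # position difference <Δx, Δy>
    # shift each saved initial/final position of the four fingers by <Δx, Δy>
    return [(x + dx, y + dy) for (x, y) in saved_positions]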
Figs. 14 to 16 describe the case where the user's hand leaves the screen of the mobile phone 100 and returns, so that the finger positions on the screen after the return may differ from the finger positions already stored in the mobile phone 100, for example by a tilt or a displacement; the mobile phone 100 may then calibrate the finger activity areas after the return according to the tilt angle or displacement between the finger positions before and after the return. The gesture calibration method of the present application will be described below with reference to fig. 17, taking as an example the case where the current finger positions are unchanged from the saved initial positions, but when a finger performs a gesture operation in the saved activity area, an angular error arises between the gesture operation and the saved stroking position of the finger.
Fig. 17 shows a scheme of the gesture calibration method, and the scheme shown in fig. 17 can be implemented by the processor 110 of the mobile phone 100 calling a relevant program, and the scheme includes:
s1701: and acquiring the included angle between the current scratching positions of the fingers and the stored scratching positions.
Here, as shown in fig. 18, taking the index finger I as an example, the mobile phone 100 obtains the current stroke position of the index finger I in the active area, and calculates the angle Ω between the stored stroke position and the current stroke position. Here, the mobile phone 100 may calculate the angle Ω between the current stroke position of the index finger I and the saved stroke position in the same manner as in step S1403.
In the embodiment of the present application, when a plurality of fingers swipe in the active area, the mobile phone 100 may further calculate an included angle between the current swiping position of the plurality of fingers in the active area and the stored swiping position, and obtain an average direction difference α' of the included angles corresponding to the plurality of fingers.
In the embodiment of the present application, the mobile phone 100 may further store an update coefficient μ, where μ is an update coefficient in the interval (0,1). μ may be a fixed value, or may be adjusted according to the number of stroking fingers, with μ increasing as the number of fingers increases; μ may also be adjusted according to the length of the stroking positions of the fingers, with μ increasing as the stroking length increases. The mobile phone 100 then obtains from α′ the included angle ε = α′·μ corresponding to the plurality of fingers.
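How μ is chosen is left open above (fixed, or growing with the number of fingers or the stroke length); the following sketch therefore treats μ as a parameter and only fixes the relation ε = α′·μ, with illustrative names:

def calibration_angle(included_angles, mu=0.5):
    # included_angles: the angle between each finger's current and saved
    # stroking positions; mu: update coefficient in the interval (0, 1),
    # fixed here for illustration but adjustable as described above
    alpha = sum(included_angles) / len(included_angles)  # average difference α'
    return alpha * mu                                     # ε = α' · μ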
S1702: and acquiring the current position corresponding to the stroked finger.
For example, the mobile phone 100 may obtain the current position corresponding to the stroked finger by the same method as in step S302.
S1703: and calculating the stroke position corresponding to the finger based on the included angle, the current position and the final position corresponding to the finger.
For example, taking the index finger I as an example, the current position of the index finger I is the same as the pre-saved initial position Psi, the pre-saved final position of the index finger I is Pei, and the pre-saved stroking vector of the index finger is Pei − Psi = <dx, dy>. According to the included angle ε corresponding to the index finger I calculated in step S1701, the stroking position of the index finger is changed, by the same rotation as in S1404, so that the final position becomes Pei1 = <dx·cosε − dy·sinε, dx·sinε + dy·cosε> + Psi.
Similarly, the mobile phone 100 may further obtain the changed positions of the middle finger, the ring finger, and the little finger based on the included angle ε.
S1704: and calculating and saving the activity area of the finger according to the stroke position.
In the embodiment of the present application, the process of S1704 is the same as that of step S306 described above, and is not described in detail here. After the mobile phone 100 recalculates the activity areas of the plurality of fingers, the mobile phone 100 updates the mapping relationship between the gesture operations of the plurality of fingers in the activity areas and the key positions of the virtual keyboard.
Fig. 17 illustrates the case where the current finger positions are unchanged from the stored initial positions, but when a finger performs a gesture operation in the stored activity area, an angular error arises between the gesture operation and the stored stroking position; the mobile phone 100 may then calibrate the finger activity area according to the included angle formed by the stroking operation. The gesture calibration method of the present application will be described below with reference to fig. 19, taking as an example the case where the palm of the user is not displaced and only the current position of a single stroking finger, or of multiple stroking fingers, differs from the stored initial position; the mobile phone 100 then calibrates the activity areas of the fingers according to the current positions of the fingers and the stored initial positions.
Fig. 19 shows a scheme of the gesture calibration method, and the scheme shown in fig. 19 can be implemented by the processor 110 of the mobile phone 100 calling a relevant program, and the scheme includes:
s1901: and calculating a default slope from the initial position to the final position according to the saved initial position and the final position of the stroked finger.
For example, as shown in fig. 20, the initial position of the stroked finger is A1(x0, y0), and the final position of the stroked finger is A2(x1, y1). The mobile phone 100 calculates the default slope k of the line segment between the initial position A1 and the final position A2 as k = (y1 − y0)/(x1 − x0).
S1902: and acquiring the current position corresponding to the stroked finger.
Here, the mobile phone 100 may obtain the current position corresponding to the stroked finger by the same method as in step S302.
For example, as shown in fig. 20, the current starting position of the stroked finger is B1(x1′, y1′).
S1903: the projected position of the initial position of the stroked finger to a line passing through the current position with a default slope is determined.
For example, as shown in fig. 20, the projected position of the stroked finger is A3(x0′, y0′). Since the line segment A1A3 is perpendicular to the line R1, the slope of A1A3 is the negative reciprocal of k, i.e. −1/k = (y0′ − y0)/(x0′ − x0). Combining this with k = (y1′ − y0′)/(x1′ − x0′), the projected position A3(x0′, y0′) is obtained:
x0′ = (k·k·x1′ − k·(y1′ − y0) + x0)/(k·k + 1)
y0′ = (k·k·y0 + k·(x0 − x1′) + y1′)/(k·k + 1)
S1904: a first intermediate position between the current starting position and the projected position of a stroked finger is determined.
For example, assume a first intermediate position M(xM, yM), where |MA3| = w·|B1A3|, that is, the length of MA3 equals the length of B1A3 multiplied by w, where w is an update coefficient in the interval (0,1). The coordinates (xM, yM) of the first intermediate position M are obtained by the following formulas:
xM = x0′ + w·(x1′ − x0′)
yM = y0′ + w·(y1′ − y0′)
S1905: a second intermediate position between the initial position and the first intermediate position of a stroked finger is determined.
For example, assume a second intermediate position C(xC, yC), where |CA1| = z·|MA1|, that is, the length of CA1 equals the length of MA1 multiplied by z, where z, like w described above, is an update coefficient in the interval (0,1). The coordinates (xC, yC) of the second intermediate position C are obtained by the following formulas:
xC = x0 + z·(xM − x0)
yC = y0 + z·(yM − y0)
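Steps S1901 to S1905 can be combined into one routine; the sketch below uses the foot-of-perpendicular formulas given above, assumes a non-vertical stroke (x1 ≠ x0), and uses illustrative names throughout:

def calibrate_initial_position(a1, a2, b1, w, z):
    # a1 = A1(x0, y0): saved initial position; a2 = A2(x1, y1): saved final
    # position; b1 = B1(x1', y1'): current starting position;
    # w, z: update coefficients in the interval (0, 1)
    x0, y0 = a1
    k = (a2[1] - y0) / (a2[0] - x0)                    # S1901: default slope
    x1p, y1p = b1
    # S1903: project A1 onto the line R1 through B1 with slope k
    x0p = (k * k * x1p - k * (y1p - y0) + x0) / (k * k + 1)
    y0p = (k * k * y0 + k * (x0 - x1p) + y1p) / (k * k + 1)
    # S1904: first intermediate position M between A3 and B1
    xm = x0p + w * (x1p - x0p)
    ym = y0p + w * (y1p - y0p)
    # S1905: second intermediate position C between A1 and M
    xc = x0 + z * (xm - x0)
    yc = y0 + z * (ym - y0)
    return (xc, yc)   # used in S1906 as the new initial position of the finger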
S1906: and recalculating the activity area of the finger by taking the second intermediate position as the initial position of the finger.
In the embodiment of the present application, the process of S1906 is the same as that of step S306 described above, and is not described in detail here. After the mobile phone 100 recalculates the activity areas of the plurality of fingers, the mobile phone 100 updates the mapping relationship between the gesture operations of the plurality of fingers in the activity areas and the key positions of the virtual keyboard.
In step S307, after the mobile phone 100 in the embodiment of the present application determines the activity areas of the four fingers and establishes the mapping relationship between the gesture operations of the four fingers in the corresponding activity areas and the key positions of the virtual keyboard, the mobile phone 100 receives the gesture operation performed by the user on its screen and outputs the content corresponding to the gesture operation on the screen of the smart television 200. In another embodiment of the present application, the user may also operate the virtual keyboard by performing gesture operations on the screen of the tablet pc 300. The tablet pc 300 is in communication connection with the smart television 200. Since the tablet pc 300 is a large-screen smart device, as shown in fig. 21, the tablet pc 300 may determine, in the manner of the above steps S301 to S306, the activity areas of the four fingers of each of the user's left and right hands on the screen of the tablet pc 300, and establish a mapping relationship between these activity areas and the key positions of the virtual keyboard, so that the activity areas of the four fingers of the left and right hands each cover a part of the key positions of the virtual keyboard, and the user may operate the virtual keyboard through the four fingers of both hands. For example, as shown in fig. 22, the activity area of the left index finger can establish a mapping relationship with keys of the virtual keyboard such as "4", "5", "r", "t", "f", "g", "c", and "v". It will be appreciated that the activity areas of the left index finger and the right index finger may each cover two columns of keys of the virtual keyboard.
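For illustration only, the mapping of fig. 22 between one finger's activity area and its group of key positions could be held in a simple lookup structure; only the left-index key group is taken from the example above, and the representation itself is an assumption of this sketch:

# illustrative sketch: the key group for the left index finger follows fig. 22;
# the remaining fingers would each map to their own columns of keys
KEY_GROUPS = {
    "left_index": ["4", "5", "r", "t", "f", "g", "c", "v"],
}

def keys_for_finger(finger):
    # resolve the group of virtual-keyboard keys reachable from a finger's
    # activity area (empty if no mapping has been established yet)
    return KEY_GROUPS.get(finger, [])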
It can be understood that, in the process of the user controlling the virtual keyboard through gesture operations on the screen of the mobile phone 100, as well as in the process of the mobile phone 100 calibrating the activity areas of the user's fingers, when the mobile phone 100 receives a gesture operation of the user's finger that crosses the activity areas of two or more fingers, the mobile phone 100 may prompt that the gesture operation is erroneous, and the user may perform the gesture operation again.
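The cross-area check can be sketched as follows; rectangular activity areas are assumed here purely for brevity (the application does not restrict their shape), and all names are illustrative:

def crossed_activity_areas(track_points, activity_areas):
    # track_points: sampled (x, y) points of the gesture track
    # activity_areas: finger name -> bounding box (xmin, ymin, xmax, ymax)
    touched = set()
    for (x, y) in track_points:
        for name, (xmin, ymin, xmax, ymax) in activity_areas.items():
            if xmin <= x <= xmax and ymin <= y <= ymax:
                touched.add(name)
    # two or more areas touched -> prompt the user to redo the gesture
    return touched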
The foregoing has introduced the method by which the mobile phone 100 determines the mapping relationship between the user's gesture operations and the key positions of the virtual keyboard, and operates the virtual keyboard upon receiving gesture operations performed by the user on the mobile phone 100. In the embodiment of the present application, the mobile phone 100 may further allow the user to perform touch operations using the screen of the mobile phone 100 as a touch pad, and can switch between the user's gesture operations and touch operations. Next, a method for switching between the gesture operation and the touch operation of the mobile phone 100 will be described with reference to fig. 23. The scheme shown in fig. 23 may be implemented by the processor 110 of the mobile phone 100 calling a relevant program.
Specifically, as shown in fig. 23, the mobile phone 100 is currently in a state of receiving a gesture operation performed by a user, and the scheme of the method includes:
s2301: and determining whether the user performs switching operation corresponding to the touch operation. If yes, the process goes to S2302, and the mobile phone 100 obtains the mapping relationship between the key positions of the virtual keyboard corresponding to the touch operation. If not, the process goes to S2304, and the mobile phone 100 saves the state of receiving the gesture operation performed by the user.
It can be understood that the switching operation corresponding to the touch operation here may be the same five-finger pressing operation as that in step S301, or may be another type of gesture operation, for example a three-finger pressing operation, the user tapping a switching key on the screen of the mobile phone 100, or an operation of a hard key provided on the mobile phone 100, which is not described herein again.
S2302: and acquiring the mapping relation of the key positions of the virtual keyboard corresponding to the touch operation.
It can be understood that the mobile phone 100 may load, from the internal memory, the pre-stored mapping relationship of the key positions of the virtual keyboard corresponding to the touch operation.
S2303: and switching to a state of receiving the touch operation performed by the user.
Here, in the touch operation performed by the user with the screen of the mobile phone 100 as a touch pad, the user may control, through the touch operation, a cursor displayed on the screen of the smart tv 200.
S2304: and keeping receiving the state of the gesture operation performed by the user.
It is understood that, when the mobile phone 100 is currently in the state of receiving touch operations performed by the user, the mobile phone 100 may also switch from the state of receiving touch operations to the state of receiving gesture operations performed by the user. Specifically, as shown in fig. 24, the scheme of the method includes:
s2401: and determining whether the user performs switching operation corresponding to the gesture operation. If yes, the process goes to S2402, and the mobile phone 100 obtains the mapping relationship between the key positions of the virtual keyboard corresponding to the gesture operation. If not, returning to S2404, the cellular phone 100 saves the state of receiving the touch operation performed by the user.
It can be understood that the switching operation corresponding to the gesture operation here may also be any of various gesture operations supported by the mobile phone 100, or an operation of a hard key provided on the mobile phone 100, which is not described herein again.
S2402: and acquiring the mapping relation of the key positions of the virtual keyboard corresponding to the gesture operation.
It can be understood that the mobile phone 100 may load, from the internal memory, the pre-stored mapping relationship of the key positions of the virtual keyboard corresponding to the gesture operation.
S2403: and switching to a state of receiving gesture operation performed by the user.
Here, the user can operate the virtual keyboard through gesture operations via the mapping relationship, set in steps S301 to S307, between the gesture operations in the activity areas of the user's fingers and the key positions of the virtual keyboard.
S2404: and keeping receiving the state that the user performs the touch operation.
A schematic structural diagram of the mobile phone 100 in the embodiment of the present application is described below.
As shown in fig. 25, the mobile phone 100 may include a processor 110, a screen 111, an internal memory 120, an interface module 130, a power supply module 140, a wireless communication module 150, a mobile communication module 160, and a touch sensor 170.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data.
The screen 111 is used to display images, videos, and the like.
Internal memory 120 may be used to store computer-executable program code, which includes instructions. The internal memory 120 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In an embodiment of the present application, a mapping relationship between the touch operation and the key position of the virtual keyboard and a mapping relationship between the gesture operation and the key position of the virtual keyboard may be stored in the internal memory 120.
The interface module 130 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the interface module 130 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The power module 140 receives input from the battery and supplies power to the processor 110, the internal memory 120, the screen 111, and the like.
The wireless communication module 150 may provide a solution for wireless communication applied to the mobile phone 100, including Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and so on.
The mobile communication module 160 may provide a solution including 2G/3G/4G/5G wireless communication applied to the handset 100.
The touch sensor 170 is also referred to as a "touch device". The touch sensor 170 may be disposed on the screen 111, and the touch sensor 170 and the screen 111 form a touch screen, which is also called a "touch screen". In an embodiment of the present application, the touch sensor 170 is used to determine the position of the user's finger on the screen 111 and the stroking position after the finger performs a stroking operation.
Fig. 26 is a block diagram of the software configuration of the mobile phone 100 according to the embodiment of the present application.
As shown in fig. 26, the mobile phone 100 may be divided into an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer.
Wherein the application layer may include a series of application packages.
As shown in fig. 26, the application package may include camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications. In an embodiment of the present application, the application package may include a document application 101 or the like.
The application framework layer may include a view system, a gesture recognition system, and the like.
In an embodiment of the present application, the gesture recognition system is configured to recognize operations, such as the gesture operation and the touch operation, performed by the user on the document application 101 through the screen of the mobile phone 100.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build a display interface for an application. The display interface may be composed of one or more display elements, where a display element refers to an element in the display interface of an application in the screen of the electronic device. For example, the display elements may include buttons, text, pictures, popups, menus, title bars, lists, or search boxes, among others. The display interface of the application may include at least one display element.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part comprises functional interfaces to be called by the java language, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
It will be understood that, although the terms "first", "second", etc. may be used herein to describe various features, these features should not be limited by these terms. These terms are used merely for distinguishing and are not intended to indicate or imply relative importance. For example, a first feature may be termed a second feature, and, similarly, a second feature may be termed a first feature, without departing from the scope of example embodiments.
Moreover, various operations will be described as multiple operations separate from one another in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed as to imply that these operations are necessarily order dependent, and that many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when the described operations are completed, but may have additional operations not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
References in the specification to "one embodiment," "an illustrative embodiment," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may or may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature is described in connection with a particular embodiment, it is within the knowledge of one skilled in the art to effect such feature in combination with other embodiments, whether or not such embodiments are explicitly described.
The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A), (B) or (A and B)".
As used herein, the term "module" may refer to, be a part of, or include: memory (shared, dedicated, or group) for executing one or more software or firmware programs, an Application Specific Integrated Circuit (ASIC), an electronic circuit and/or processor (shared, dedicated, or group), a combinational logic circuit, and/or other suitable components that provide the described functionality.
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it should be understood that such specific arrangement and/or ordering is not required. Rather, in some embodiments, these features may be described in a manner and/or order different from that shown in the illustrative figures. Additionally, the inclusion of a structural or methodical feature in a particular figure does not imply that all embodiments need to include such feature, and in some embodiments may not include such feature, or may be combined with other features.
While the embodiments of the present application have been described in detail with reference to the accompanying drawings, the application of the present application is not limited to the various applications mentioned in the embodiments of the present application, and various structures and modifications can be easily implemented with reference to the present application to achieve various advantageous effects mentioned herein. Variations that do not depart from the gist of the invention are intended to be within the scope of the invention.

Claims (12)

1. A finger activity area adjusting method of a virtual keyboard is applied to electronic equipment, and is characterized by comprising the following steps:
coordinate information of a first finger activity area on a screen of first electronic equipment is stored on the first electronic equipment, wherein a gesture change performed by a user of the first electronic equipment in the first finger activity area can trigger a key of a virtual keyboard displayed on second electronic equipment;
when it is detected that the position of an input gesture of the user in the first finger activity area has changed, the first electronic equipment acquires a first position of at least one finger of the user at the time the first finger activity area was determined;
the first electronic equipment calculates position change information between the first position of the finger and the second position of the finger after the position of the finger is changed;
and the first electronic equipment adjusts the first finger activity area into a second finger activity area according to the position change information.
2. The method of claim 1, wherein the gesture change in the first finger activity zone comprises a gesture operation of a tap, a swipe and a fold back of at least one finger of the user.
3. The method of claim 1, wherein the change in the position of the input gesture comprises:
the position on the first electronic device screen contacted by the finger executing the input gesture and the direction of the track generated on the first electronic device screen by the input gesture are changed.
4. The method of claim 1, wherein the position change information includes an angular difference of the finger before and after the position change.
5. The method according to claim 4, wherein the first electronic device obtains the second finger activity area by rotating the first finger activity area by the angular difference.
6. The method of claim 1, wherein the position change information comprises a translation distance and a translation direction of the finger before and after the position change.
7. The method of claim 6, wherein the first electronic device obtains the second finger activity area by translating the first finger activity area by the translation distance in the translation direction.
8. The method of claim 1, wherein the position change information comprises a translation distance, a translation direction, and an angle difference of the finger before and after the position change.
9. The method of claim 8, wherein the first electronic device obtains the second finger activity area by rotating the first finger activity area by the angular difference and translating the first finger activity area by the translation distance according to the translation direction.
10. The method of claim 1, wherein in the first finger activity area, there are a plurality of locations corresponding to keys of a virtual keyboard displayed on a second electronic device.
11. An electronic device, comprising:
a memory having instructions stored therein, an
A processor configured to read and execute the instructions in the memory, so as to cause the electronic device to execute the finger activity area adjustment method of the virtual keyboard according to any one of claims 1 to 10.
12. A computer-readable storage medium containing instructions that, when executed by a controller of an electronic device, cause the electronic device to implement the method for adjusting finger-active areas of a virtual keyboard according to any one of claims 1 to 10.
CN202110704135.6A 2021-06-24 2021-06-24 Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment Pending CN115525182A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110704135.6A CN115525182A (en) 2021-06-24 2021-06-24 Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110704135.6A CN115525182A (en) 2021-06-24 2021-06-24 Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment

Publications (1)

Publication Number Publication Date
CN115525182A true CN115525182A (en) 2022-12-27

Family

ID=84694151

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110704135.6A Pending CN115525182A (en) 2021-06-24 2021-06-24 Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment

Country Status (1)

Country Link
CN (1) CN115525182A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117752477A (en) * 2024-02-22 2024-03-26 浙江强脑科技有限公司 Method, device, terminal and medium for controlling gesture locking of bionic hand


Similar Documents

Publication Publication Date Title
US11461004B2 (en) User interface supporting one-handed operation and terminal supporting the same
US20230161417A1 (en) Sharing Across Environments
US10482573B2 (en) Method and mobile device for displaying image
CN111651116B (en) Split screen interaction method, electronic equipment and computer storage medium
US9286895B2 (en) Method and apparatus for processing multiple inputs
CN108139778B (en) Portable device and screen display method of portable device
CN108845782B (en) Method for connecting mobile terminal and external display and apparatus for implementing the same
US9921737B2 (en) Flexible apparatus and control method thereof
CN110347317B (en) Window switching method and device, storage medium and interactive intelligent panel
US9880697B2 (en) Remote multi-touch control
EP4024186A1 (en) Screenshot method and terminal device
EP2575009A2 (en) User interface method for a portable terminal
EP4057137A1 (en) Display control method and terminal device
CN115525182A (en) Electronic equipment and finger activity area adjusting method of virtual keyboard of electronic equipment
US20190332237A1 (en) Method Of Navigating Panels Of Displayed Content
CN114115691B (en) Electronic equipment and interaction method and medium thereof
WO2023020541A1 (en) Electronic device and human-computer interaction method
CN114461312B (en) Display method, electronic device and storage medium
US20220180582A1 (en) Electronic device and method for controlling application thereof
KR20190064633A (en) Method and apparatus for displaying pages, graphical user interface, and mobile terminal
CN116700914B (en) Task circulation method and electronic equipment
US20190034069A1 (en) Programmable Multi-touch On-screen Keyboard
CN117093290A (en) Window size adjustment method, related device and communication system
JP2014053746A (en) Character input device, method of controlling character input device, control program, and computer-readable recording medium with control program recorded

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination