WO2022267760A1 - Key function execution method, apparatus, device, and storage medium - Google Patents
Key function execution method, apparatus, device, and storage medium
- Publication number
- WO2022267760A1 (PCT application PCT/CN2022/093610)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- human hand
- area
- target
- virtual keyboard
- Prior art date
Classifications
- All classifications fall under G—PHYSICS; G06—COMPUTING, CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING:
- G06F3/04892—Arrangements for controlling cursor position based on codes indicative of cursor displacements from one discrete location to another, e.g. using cursor control keys associated to different directions or using the tab key
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
- G06F3/018—Input/output arrangements for oriental characters
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Definitions
- The present application relates to the field of computer technology, and in particular to a key function execution method, apparatus, device, and storage medium.
- Keys are displayed on electronic devices, and users operate these keys to trigger the devices to automatically execute corresponding functions, which greatly reduces labor costs and enables intelligent operation.
- At present, a key function is usually executed by displaying a virtual keyboard on the screen and, in response to detecting a user's touch operation on the screen, executing the function of the key corresponding to the touch point of the touch operation.
- Embodiments of the present application provide a key function execution method, apparatus, device, and storage medium, which improve the granularity and accuracy of key triggering and offer better applicability. The technical solution is as follows:
- In one aspect, a key function execution method is provided, the method comprising:
- displaying a virtual keyboard, the virtual keyboard including a first area and a second area, each of which includes at least two keys, the first area corresponding to the left hand and the second area corresponding to the right hand;
- in response to detecting that a human hand is in a target gesture, displaying a cursor at a first position based on a biometric feature of the target gesture, the first position being within the area of the virtual keyboard corresponding to the biometric feature;
- in response to the human hand maintaining the target gesture and moving, displaying the cursor moving to follow the movement of the human hand;
- in response to a change in the gesture of the human hand, executing the function of the key corresponding to a second position of the virtual keyboard, the second position being the position of the cursor when the gesture of the human hand changes.
- In one aspect, a key function execution system is provided, including an electronic device, a gesture tracking sensor, and a display device, the gesture tracking sensor and the display device each being connected to the electronic device; wherein:
- the display device is configured to display a virtual keyboard;
- the virtual keyboard includes a first area and a second area, each of which includes at least two keys, the first area corresponding to the left hand and the second area corresponding to the right hand;
- the gesture tracking sensor is configured to detect that a human hand is in a target gesture;
- the display device is further configured to, in response to detecting that the human hand is in the target gesture, display a cursor at a first position according to the biometric feature of the target gesture, the first position being located in the area of the virtual keyboard corresponding to the biometric feature;
- the gesture tracking sensor is further configured to detect that the human hand maintains the target gesture while moving;
- the display device is further configured to display the cursor moving to follow the movement of the human hand in response to the human hand maintaining the target gesture and moving;
- the electronic device is configured to, in response to a change in the gesture of the human hand, execute the function of the key corresponding to a second position of the virtual keyboard, the second position being the position of the cursor when the gesture changes.
- In one aspect, a key function execution apparatus is provided, comprising:
- a display module configured to display a virtual keyboard;
- the virtual keyboard includes a first area and a second area, each of which includes at least two keys, the first area corresponding to the left hand and the second area corresponding to the right hand;
- the display module is further configured to, in response to detecting that a human hand is in a target gesture, display a cursor at a first position according to the biometric feature of the target gesture, the first position being located in the area of the virtual keyboard corresponding to the biometric feature;
- the display module is further configured to display the cursor moving to follow the movement of the human hand in response to the human hand maintaining the target gesture and moving;
- an execution module configured to, in response to a change in the gesture of the human hand, execute the function of the key corresponding to a second position of the virtual keyboard, the second position being the position of the cursor when the gesture changes.
- In some embodiments, the display module is configured to: in response to the biometric feature indicating that the human hand is the left hand, display the cursor at a target position within the first area of the virtual keyboard; and in response to the biometric feature indicating that the human hand is the right hand, display the cursor at a target position within the second area of the virtual keyboard.
- In some embodiments, the display module is configured to: determine the sub-area corresponding to the first finger within the first area of the virtual keyboard and display the cursor at the target position within that sub-area; or determine the sub-area corresponding to the first finger within the second area of the virtual keyboard and display the cursor at the target position within that sub-area.
- In some embodiments, the target position is the center position of the area.
- In some embodiments, the execution module is configured to perform any of the following: taking the input content in the input box of the virtual keyboard as the target content, and canceling the display of the virtual keyboard.
- In some embodiments, the display module is further configured to, in response to the inputted character combination having corresponding candidate Chinese characters, display the inputted character combination and the candidate Chinese characters corresponding to the character combination within a target range of the character display area of the virtual keyboard.
- the virtual keyboard further includes at least one virtual control
- the display module is further configured to display that the cursor moves as the human hand moves in response to detecting that the human hand is not in the target gesture and the human hand moves;
- the execution module is further configured to execute a function corresponding to the target virtual control in response to detecting that the human hand makes a target action and the cursor is located on the target virtual control in the at least one virtual control.
- In some embodiments, the execution module is configured to perform any of the following:
- switching the characters displayed in the virtual keyboard between letters and symbols;
- switching the character input mode of the virtual keyboard between Chinese input and English input.
- In some embodiments, the execution module is further configured to update the display content of the target virtual control, the updated display content being consistent with the switched characters in the virtual keyboard or the switched character input mode.
- In one aspect, an electronic device is provided, including one or more processors and one or more memories, at least one computer program being stored in the one or more memories, and the at least one computer program being loaded and executed by the one or more processors to implement the optional implementations of the above key function execution method.
- In one aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, the at least one computer program being loaded and executed by a processor to implement the optional implementations of the above key function execution method.
- In one aspect, a computer program product or computer program is provided, comprising one or more pieces of program code stored in a computer-readable storage medium.
- One or more processors of an electronic device read the one or more pieces of program code from the computer-readable storage medium and execute them, so that the electronic device performs the optional implementations of the above key function execution method.
- By combining the two operations of gesture and movement, the embodiments of the present application provide a flexible and easy-to-operate way of triggering keys in a virtual keyboard.
- With this method, the user only needs to make the target gesture with the hand to trigger the display of the cursor, and then move the hand while maintaining the target gesture to control the movement of the cursor.
- This operation is simple and convenient, and it also allows precise control of the cursor's movement.
- In addition, the display position of the cursor is determined based on the biometric feature of the target gesture, so the user can flexibly use different hands for gesture operations, minimizing the moving distance of the cursor, reducing operation complexity, and improving operation efficiency.
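- The interaction described above can be summarized as a small state machine: making the pinch shows the cursor, moving while pinched moves the cursor, and releasing the pinch triggers the key under the cursor. The following is a minimal sketch of that loop in Python; it assumes a hypothetical tracking API that reports one `GestureFrame` per frame, and the names `GestureFrame`, `region_center`, and `trigger_key_at` are illustrative, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class GestureFrame:
    is_target_gesture: bool   # e.g. a thumb-finger pinch was detected
    hand: str                 # "left" or "right" (the biometric feature)
    position: tuple           # (x, y) hand position from the tracking sensor

class KeyCursorStateMachine:
    """Pinch -> show cursor; move while pinched -> move cursor;
    release the pinch -> execute the key at the cursor's position."""

    def __init__(self, keyboard):
        self.keyboard = keyboard   # assumed to expose region_center / trigger_key_at
        self.cursor = None         # None means no cursor is shown
        self.last_hand_pos = None

    def on_frame(self, frame: GestureFrame):
        if frame.is_target_gesture and self.cursor is None:
            # First position: a target position in the area matching the hand.
            self.cursor = self.keyboard.region_center(frame.hand)
            self.last_hand_pos = frame.position
        elif frame.is_target_gesture:
            # Hand keeps the gesture and moves: the cursor follows.
            dx = frame.position[0] - self.last_hand_pos[0]
            dy = frame.position[1] - self.last_hand_pos[1]
            self.cursor = (self.cursor[0] + dx, self.cursor[1] + dy)
            self.last_hand_pos = frame.position
        elif self.cursor is not None:
            # Gesture changed: execute the key at the second position.
            self.keyboard.trigger_key_at(self.cursor)
            self.cursor = None
```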
- FIG. 1 is a schematic diagram of an implementation environment of a key function execution method provided by an embodiment of the present application;
- FIG. 2 is a schematic diagram of an implementation environment of another key function execution method provided by an embodiment of the present application;
- FIG. 3 is a schematic diagram of an implementation environment of another key function execution method provided by an embodiment of the present application;
- FIG. 4 is a schematic diagram of an implementation environment of another key function execution method provided by an embodiment of the present application;
- FIG. 5 is a schematic diagram of an implementation environment of another key function execution method provided by an embodiment of the present application;
- FIG. 6 is a flow chart of a key function execution method provided by an embodiment of the present application;
- FIG. 7 is a flow chart of a key function execution method provided by an embodiment of the present application;
- FIG. 8 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 9 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 10 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 11 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 12 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 13 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 14 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 15 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 16 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 17 is a schematic diagram of a terminal interface provided by an embodiment of the present application;
- FIG. 18 is a schematic diagram of a key function execution system provided by an embodiment of the present application;
- FIG. 19 is a schematic structural diagram of a key function execution apparatus provided by an embodiment of the present application;
- FIG. 20 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
- FIG. 21 is a structural block diagram of a terminal provided by an embodiment of the present application;
- FIG. 22 is a schematic structural diagram of a server provided by an embodiment of the present application.
- The terms "first" and "second" are used to distinguish identical or similar items having basically the same role and function. It should be understood that "first", "second", and "nth" have no logical or temporal dependency on one another and do not limit quantity or order of execution. It should also be understood that although the following description uses the terms first, second, etc. to describe various elements, these elements should not be limited by the terms; the terms are only used to distinguish one element from another. For example, a first image could be termed a second image and, similarly, a second image could be termed a first image without departing from the scope of the various examples. The first image and the second image are both images and, in some cases, separate and distinct images.
- FIG. 1, FIG. 2, FIG. 3, and FIG. 4 are schematic diagrams of implementation environments of the key function execution method provided by the embodiments of the present application.
- In some embodiments, the implementation environment includes a terminal 101.
- In some embodiments, the implementation environment includes a terminal 101 and a gesture tracking sensor 102.
- In some embodiments, the implementation environment includes a terminal 101, a gesture tracking sensor 102, and a display device 103.
- In some embodiments, the implementation environment includes a terminal 101 and a display device 103.
- The terminal 101 is at least one of a smart phone, a game console, a desktop computer, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, or a laptop computer.
- The terminal 101 installs and runs an application program that supports key function execution; for example, the application program is a system application, a VR (Virtual Reality) application, an instant messaging application, a news push application, a shopping application, an online video application, or a social application.
- In some embodiments, the implementation environment includes a terminal 101 equipped with a gesture tracking sensor and a screen display, so that the terminal can track the user's hand through the gesture tracking sensor, recognize the gesture or action made by the hand, and trigger the command corresponding to that gesture or action.
- The terminal also has a display function: it can display a virtual keyboard through the screen display and, upon recognizing a gesture or movement of the human hand, display the corresponding changes through the screen display.
- In some embodiments, the gesture tracking sensor is a camera that captures images of the human hand to collect its gestures or movements.
- In some embodiments, the implementation environment includes a terminal 101 and a gesture tracking sensor 102.
- the terminal 101 and the gesture tracking sensor 102 are connected through a wired or wireless network.
- the terminal 101 has a display function and can display a virtual keyboard on a screen display.
- The gesture tracking sensor 102 can track the user's hand and recognize the gesture or action made by the hand, and then sends the recognition result to the terminal 101, which displays the corresponding content in the virtual keyboard according to the recognition result.
- the gesture tracking sensor is any kind of gesture sensor, such as a Leap Motion sensor, or a finger tracking glove, etc., which is not limited in this embodiment of the present application.
- In some embodiments, the implementation environment includes a terminal 101, a gesture tracking sensor 102, and a display device 103.
- The terminal 101 is connected to the gesture tracking sensor 102 and to the display device 103 through a wired or wireless network.
- the terminal 101 has a data processing function.
- the terminal 101 controls the display device 103 to display the virtual keyboard.
- The gesture tracking sensor 102 can track the user's hand and recognize the gesture or action made by the hand, and then sends the recognition result to the terminal 101, which controls the display device 103 to display the corresponding content in the virtual keyboard according to the recognition result.
- The display device is any kind of display device, such as a screen display or a VR head-mounted display device (a VR headset for short), which is not limited in this embodiment of the present application.
- In some embodiments, the implementation environment includes a terminal 101 and a display device 103.
- the terminal 101 is connected to the display device 103 through a wired or wireless network.
- the terminal 101 has a data processing function.
- the terminal 101 controls the display device 103 to display the virtual keyboard.
- The terminal 101 is equipped with a gesture tracking sensor, through which it can track the user's hand and recognize the gesture or action made by the hand; the terminal 101 then determines the change of the displayed content according to the gesture or action and displays it on the display device 103.
- As shown in FIG. 5, the implementation environment includes a gesture tracking sensor 501, a device system 502, and a display device 503.
- The gesture tracking sensor 501 is connected with the device system 502.
- the gesture tracking sensor 501 is used to detect and track the user's finger operation.
- The gesture tracking sensor 501 is any of various types of devices, such as a Leap Motion sensor, a camera, or a finger tracking glove.
- the gesture tracking sensor 501 can obtain the rotation and coordinate information of each finger joint.
- The device system 502 can run game applications or other application programs, and such an application includes an input method program module.
- The device system runs the input method program module within the application.
- the display device 503 is capable of displaying a screen.
- The display device 503 is a VR (Virtual Reality) head-mounted display or a screen display.
- FIG. 6 is a flow chart of a key function execution method provided by an embodiment of the present application. The method is applied to an electronic device, which is a terminal or a server. Referring to FIG. 6 and taking the method applied to a terminal as an example, the method includes the following steps:
- 601. The terminal displays a virtual keyboard, where the virtual keyboard includes a first area and a second area, each of which includes at least two keys; the first area and the second area correspond to the left hand and the right hand respectively, that is, the first area corresponds to the left hand and the second area corresponds to the right hand.
- The virtual keyboard is a virtual keyboard displayed on the screen and includes at least two keys, where "at least two" means two or more, that is, multiple.
- the keys in the virtual keyboard include various types of keys, for example, a plurality of letter keys, number keys or symbol keys are displayed in the virtual keyboard.
- the virtual keyboard is divided into two areas: a first area and a second area.
- the first area corresponds to the left hand
- the second area corresponds to the right hand.
- Each area includes at least two keys, and when any key is operated, the function of that key can be triggered. In this way, the user can use different hands to make the target gesture to control the triggering of keys in different areas of the virtual keyboard, as shown in the sketch below.
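- As a concrete illustration of this layout, the sketch below models a split virtual keyboard as plain data, with the left half as the first area and the right half as the second area; the exact key grouping is an assumption based on the halved QWERTY layout shown in the figures.

```python
# Illustrative data structure: a virtual keyboard split into a first
# area (operated by the left hand) and a second area (right hand),
# each containing at least two keys.
VIRTUAL_KEYBOARD = {
    "first_area":  ["Q", "W", "E", "R", "T",
                    "A", "S", "D", "F", "G",
                    "Z", "X", "C", "V", "B"],
    "second_area": ["Y", "U", "I", "O", "P",
                    "H", "J", "K", "L",
                    "N", "M"],
}

def area_for_hand(hand):
    """Map the hand making the gesture to its keyboard area."""
    return "first_area" if hand == "left" else "second_area"
```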
- 602. In response to detecting that the human hand is in the target gesture, the terminal displays the cursor at a first position according to the biometric feature of the target gesture; the first position is located in the area of the virtual keyboard corresponding to the biometric feature.
- In the embodiment of the present application, the terminal detects the gesture of the human hand, determines the corresponding operation position on the virtual keyboard based on the gesture, and then further determines the key the user actually wants to trigger, so as to execute the function of that key.
- the virtual keyboard is associated with the biometric feature of the human hand.
- the biometric feature is used to indicate which hand the gesture is performed by, or the biometric feature is also used to indicate which fingers are used to perform the gesture.
- The terminal detects the gesture of the human hand and, upon determining that the human hand is in the target gesture, determines the first position in the area corresponding to the biometric feature according to the biometric feature of the target gesture; the first position is the initial display position of the cursor.
- The terminal displays a cursor at the first position to indicate that the current operation position corresponding to the position of the human hand is the first position. In this way, the human hand only needs to make the target gesture, without touching the screen, to trigger the terminal to display the cursor at the first position.
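- The following sketch makes this step concrete: it maps the biometric feature (here, simply which hand forms the gesture) to the corresponding area and returns the initial cursor position as that area's center. The geometry and function names are assumptions for illustration, not mandated by the patent.

```python
def first_cursor_position(keyboard_rect, hand):
    """keyboard_rect = (x, y, width, height); hand = 'left' or 'right'.
    Returns the first position: the center of the area corresponding
    to the biometric feature of the target gesture."""
    x, y, w, h = keyboard_rect
    if hand == "left":
        area = (x, y, w / 2, h)            # first area: left half
    else:
        area = (x + w / 2, y, w / 2, h)    # second area: right half
    ax, ay, aw, ah = area
    # Center of the matched area: minimizes the expected distance the
    # cursor must later travel to reach any key within the area.
    return (ax + aw / 2, ay + ah / 2)
```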
- In some embodiments, the virtual keyboard is divided into two areas. If the biometric feature indicates that the gesture is made with the left hand, the area corresponding to the biometric feature is the first area; if the biometric feature indicates that the gesture is made with the right hand, the area corresponding to the biometric feature is the second area.
- 603. In response to the human hand maintaining the target gesture and moving, the terminal displays the cursor moving to follow the movement of the human hand.
- The embodiment of the present application combines the two operations of gesture and movement: after making the target gesture, if the user does not intend to trigger the key at the first position, the user keeps the target gesture and moves the hand, and the terminal tracks the human hand to control the cursor to move with it. In this way, the user only needs to move the hand to easily move the cursor and accurately select the key to be triggered.
- This simple operation enables precise triggering of keys and improves the accuracy and flexibility of key function execution.
- 604. In response to a change in the gesture of the human hand, the terminal executes the function of the key corresponding to the second position of the virtual keyboard, where the second position is the position of the cursor when the gesture of the human hand changes.
- After the movement, the cursor may be at another position or may have moved back to the first position.
- If the cursor has reached the key the user wants to trigger, the user changes the gesture, that is, no longer maintains the target gesture, so the terminal can determine that the user wants to trigger the key at the current position of the cursor and therefore executes the function of the key at the cursor's current position (that is, the second position).
- the second position is the position where the cursor is located when the human hand maintains the target gesture and no longer maintains the target gesture after moving.
- the second position is the same position as the first position, or the second position is a different position from the first position.
- By combining the two operations of gesture and movement, the embodiments of the present application provide a flexible and easy-to-operate way of triggering keys in a virtual keyboard.
- With this method, the user only needs to make the target gesture with the hand to trigger the display of the cursor, and then move the hand while maintaining the target gesture to control the movement of the cursor.
- This operation is simple and convenient, and it also allows precise control of the cursor's movement.
- In addition, the display position of the cursor is determined based on the biometric feature of the target gesture, so the user can flexibly use different hands for gesture operations, minimizing the moving distance of the cursor, reducing operation complexity, and improving operation efficiency.
- FIG. 7 is a flow chart of a key function execution method provided by an embodiment of the present application. Referring to FIG. 7, the method includes the following steps:
- 701. The terminal displays a virtual keyboard, where the virtual keyboard includes a first area and a second area, each of which includes at least two keys; the first area corresponds to the left hand, and the second area corresponds to the right hand.
- multiple letter keys, number keys or symbol keys are displayed in the virtual keyboard.
- the virtual keyboard includes a plurality of letter keys, such as Q, A, Z, W, etc., which are not listed here.
- The virtual keyboard also includes symbol keys, such as a key displayed for a space, ",", ".", and the like, as well as function keys, such as a delete key.
- In some embodiments, the display of the virtual keyboard is triggered by a trigger operation on an input box.
- For example, the terminal displays a first page on which an input box is displayed.
- The user moves the hand to control the cursor to follow the hand into the input box, and then performs a click or slap action with the hand.
- In response to detecting the click or slap action of the human hand while the cursor is located in the input box, the terminal displays the virtual keyboard on the page.
- In some embodiments, the display of the virtual keyboard is triggered by a trigger operation on a virtual keyboard control.
- For example, the terminal displays a first page on which a virtual keyboard control is displayed.
- The user moves the hand to control the cursor to follow the hand to the position of the virtual keyboard control, and then performs a click or slap action with the hand.
- In response to detecting the click or slap action of the human hand while the cursor is located at the position of the virtual keyboard control, the terminal displays the virtual keyboard on the first page.
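- Both triggers reduce to the same check: a click or slap action while the cursor is inside a target rectangle. Below is a minimal sketch, assuming simple axis-aligned rectangles and an externally supplied `show_keyboard` callback (both illustrative):

```python
def contains(rect, point):
    """rect = (x, y, width, height); point = (px, py)."""
    x, y, w, h = rect
    px, py = point
    return x <= px <= x + w and y <= py <= y + h

def on_hand_action(action, cursor_pos, input_box_rect,
                   keyboard_control_rect, show_keyboard):
    """Display the virtual keyboard when a click/slap lands on the
    input box or on the virtual keyboard control."""
    if action in ("click", "slap"):
        if contains(input_box_rect, cursor_pos) or \
           contains(keyboard_control_rect, cursor_pos):
            show_keyboard()
```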
- 702. In response to detecting that the human hand is in the target gesture, the terminal displays the cursor at a first position according to the biometric feature of the target gesture, where the first position is located in the area of the virtual keyboard corresponding to the biometric feature.
- In the embodiment of the present application, the user can, without touching the display screen, trigger the selection of a key in the virtual keyboard and the execution of its function by making the target gesture, or by moving the hand after making the target gesture.
- the terminal collects the gesture, action or position of the user's hand through the gesture tracking sensor.
- the gesture tracking sensor is a sensor installed on the terminal, for example, the gesture tracking sensor may be a camera installed on the terminal.
- The terminal shoots the user with the camera and then analyzes the captured image through computer vision detection to determine the gesture, action, or position of the user's hand. For example, in computer vision detection, the position of the human hand is determined by detecting the hand in the image; the shape of each finger is then determined and matched against the shape of each finger in the candidate gestures to determine the current gesture of the human hand.
- the gesture tracking sensor is a gesture tracking sensor outside the terminal.
- the gesture tracking sensor is a Leap Motion sensor, a finger tracking glove, or a camera device.
- The finger tracking glove can collect the rotation and position of each knuckle of the human hand, analyze the shape of each finger from the rotation and position of each knuckle, and then match against the shape of each finger in the candidate gestures to obtain the current gesture of the human hand.
- the target gesture is a pinching gesture, that is, the thumb of one hand is pressed together with the pulp of another finger.
- In some embodiments, the terminal collects an image of the human hand, or obtains shape parameters of the human hand through the gesture-sensing glove, and then through gesture recognition determines the candidate gesture whose matching degree is both higher than the matching-degree threshold and the highest; if that candidate gesture is the target gesture, the terminal performs the cursor display step based on the determined target gesture.
- In some embodiments, the target gesture can also be another gesture; for example, the target gesture is a gesture with five fingers together, or a fist gesture, which can be set by relevant technical personnel according to requirements or by the user according to operating habits, and is not limited in this embodiment of the present application.
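- For the pinch gesture specifically, recognition can be reduced to a fingertip-distance test. Below is a minimal sketch assuming the tracking sensor exposes 3D fingertip positions; the 15 mm threshold and the dictionary layout are illustrative assumptions.

```python
import math

def detect_pinch(fingertips, threshold_mm=15.0):
    """fingertips: dict such as {'thumb': (x, y, z), 'index': ..., ...}.
    Returns the name of the finger pinched against the thumb, or None
    if no fingertip is close enough to the thumb tip."""
    thumb = fingertips["thumb"]
    for name in ("index", "middle", "ring", "little"):
        # math.dist: Euclidean distance (Python 3.8+).
        if math.dist(thumb, fingertips[name]) < threshold_mm:
            return name   # this finger and the thumb form the target gesture
    return None
```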
- Gesture recognition includes three parts: gesture segmentation, gesture analysis and recognition.
- gesture segmentation methods include gesture segmentation based on monocular vision and gesture segmentation based on stereo vision.
- Monocular vision uses a single image acquisition device to collect images of a gesture, establishes a planar model of the gesture, and matches that model against the gestures in a gesture shape database to determine the gesture type.
- Building the gesture shape database requires enumerating all gestures to be considered, which benefits template matching but is computationally expensive and therefore not conducive to fast recognition.
- Stereo vision uses multiple image acquisition devices to obtain different images of a gesture and converts the different images of the same gesture into a three-dimensional model of that gesture.
- The method of stereo matching is similar to template matching in monocular vision: it also needs a large gesture library, and each gesture in the library must be three-dimensionally reconstructed, which requires building a three-dimensional model of the gesture and increases the amount of calculation, but the segmentation effect is better.
- Gesture analysis is one of the key technologies for gesture recognition; through it, the shape features or motion trajectory of a gesture can be obtained. Gesture shape features and motion trajectories are important features in dynamic gesture recognition and are directly related to the meaning the gesture expresses. Gesture analysis methods include the following categories: the edge contour extraction method, multi-feature combination methods such as centroid and fingers, and the knuckle tracking method.
- The edge contour extraction method is a gesture analysis method in which the hand shape is distinguished from other objects by its unique contour; alternatively, a gesture recognition algorithm combining geometric moments and edge detection computes the distance between images by weighting the two features, thereby recognizing letter gestures.
- The multi-feature combination method analyzes the posture or trajectory of a gesture according to the physical characteristics of the hand; Meenakshi Panwar combined the gesture shape with fingertip features to realize gesture recognition.
- the knuckle tracking method is mainly to build a two-dimensional or three-dimensional model of the hand, and then track it according to the position changes of the joint points of the human hand, which is mainly used in dynamic trajectory tracking.
- Gesture recognition is the process of classifying trajectories in the model parameter space into a certain subset in the space, which includes static gesture recognition and dynamic gesture recognition, and dynamic gesture recognition can eventually be transformed into static gesture recognition.
- In some embodiments, gesture recognition methods mainly include: the template matching method, neural network methods, and hidden Markov model methods.
- The template matching method regards the gesture action as a sequence composed of static gesture images, and then compares the gesture sequence to be recognized with known gesture template sequences to recognize the gesture.
- The hidden Markov model is a statistical model; a system modeled with a hidden Markov model has a doubly stochastic process, which comprises the random process of state transitions and that of observation output.
- The stochastic process of state transitions is hidden and is manifested through the random process of the observation sequence.
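- To make the template-matching idea concrete, here is a minimal sketch that reduces each gesture to a fixed-length feature vector (for example, a normalized bend value per finger) and picks the candidate whose similarity exceeds the matching-degree threshold. The feature encoding, the library contents, and the threshold are all illustrative assumptions.

```python
import math

# Candidate gesture library: gesture name -> feature vector
# (one normalized bend value per finger; values are illustrative).
GESTURE_TEMPLATES = {
    "pinch": [0.8, 0.7, 0.1, 0.1, 0.1],
    "fist":  [0.9, 0.9, 0.9, 0.9, 0.9],
    "open":  [0.0, 0.0, 0.0, 0.0, 0.0],
}

def match_gesture(features, threshold=0.85):
    """Return the candidate gesture whose matching degree is highest
    and above the threshold, or None if no candidate qualifies."""
    best_name, best_score = None, 0.0
    for name, template in GESTURE_TEMPLATES.items():
        # Map Euclidean distance to a similarity score in (0, 1].
        score = 1.0 / (1.0 + math.dist(features, template))
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```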
- the virtual keyboard is associated with the biometric features of the human hand, and when the biometric features are different, the corresponding positions of the target gesture in the virtual keyboard are different.
- The biometric feature is used to indicate which hand makes the gesture, or to indicate which fingers make the gesture.
- In some embodiments, in step 702, the terminal first determines the first position and then displays the cursor at the first position.
- In some embodiments, the terminal determines, within the virtual keyboard, the first position in the area corresponding to the biometric feature according to the biometric feature of the target gesture, and then displays the cursor at the first position of the virtual keyboard.
- In response to the biometric feature indicating that the human hand is the left hand, the terminal displays the cursor at the target position in the first area of the virtual keyboard; in response to the biometric feature indicating that the human hand is the right hand, the terminal displays the cursor at the target position in the second area of the virtual keyboard.
- That is, in response to the biometric feature indicating that the human hand is the left hand, the terminal determines that the target position in the first area of the virtual keyboard is the first position and then displays the cursor at the first position.
- In response to the biometric feature indicating that the human hand is the right hand, the terminal determines that the target position in the second area of the virtual keyboard is the first position and then displays the cursor at the first position.
- As shown in FIG. 9, the virtual keyboard is equally divided into a left area 901 and a right area 902, the left area 901 being the first area and the right area 902 being the second area.
- When the target gesture is made with the left hand, the corresponding first position is within the left area 901.
- When the target gesture is made with the right hand, the corresponding first position is within the right area 902.
- In this way, the user can select a smaller area of the virtual keyboard simply by choosing the left or right hand.
- The user chooses the left hand or the right hand according to the position of the key to be triggered, so that the subsequent precise selection takes place within a smaller area, narrowing the selection range; compared with not dividing the virtual keyboard, this reduces the user's cursor-adjustment operations as much as possible and thus improves operation efficiency.
- the above-mentioned first area and second area are further divided, the first area corresponds to the left hand, and the second area corresponds to the right hand.
- the first area and the second area are further divided into a plurality of sub-areas, and different fingers of a human hand correspond to different sub-areas.
- In some embodiments, the target gesture is formed by a finger and the thumb of the human hand, and each finger other than the thumb corresponds to a sub-area. The user can control the cursor to be displayed in different sub-areas by using different fingers together with the thumb to form the target gesture.
- In some embodiments, in step 702, the terminal determines the sub-area corresponding to the first finger within the first area of the virtual keyboard and displays the cursor at the target position within that sub-area.
- Alternatively, the terminal determines the sub-area corresponding to the first finger within the second area of the virtual keyboard and displays the cursor at the target position within that sub-area.
- the first area is divided into two sub-areas
- the second area is also divided into two sub-areas.
- the first sub-area of the first area corresponds to the second finger of the left hand
- the second sub-area of the first area corresponds to the third finger of the left hand
- the third sub-area of the second area corresponds to the fourth finger of the right hand
- the fourth sub-area of the second area corresponds to the fifth finger of the right hand.
- a hand includes four fingers in addition to the thumb: index finger, middle finger, ring finger and little finger. Which of the four fingers are the second finger and the third finger is set by the relevant technical personnel according to the requirements, or set by the user according to the operating habits. The same is true for the fourth finger and the fifth finger, which is not limited in the embodiment of the present application.
- As shown in FIG. 10, the left hand serves as the input for the left half of the keyboard (the first area 1001), and the right hand serves as the input for the right half of the keyboard (the second area 1002).
- Take as an example that the target gesture is a pinch gesture, the second finger is the left middle finger, the third finger is the left index finger, the fourth finger is the right index finger, and the fifth finger is the right middle finger.
- When the user pinches with the index finger and thumb of the left hand, the cursor appears in the second sub-area 1003 of the first area 1001 and is used to control the triggering of the E, R, T, D, F, G, C, V, and B keys.
- When the user pinches with the middle finger and thumb of the left hand, the cursor appears in the first sub-area 1004 of the first area 1001 and is used to control the triggering of the Q, W, A, S, Z, and X keys.
- In some embodiments, the target position is the center position of the area. With the cursor displayed at the center, if the user later moves the cursor to select a key, the sum of the distances to the other keys is the smallest; this greatly reduces the distance the user must move the cursor, making the operation more convenient and efficient.
- For example, when the user pinches with the index finger and thumb of the left hand, the cursor appears on the F key at the center position in the second sub-area 1003 of the first area 1001.
- When the user pinches with the middle finger and thumb of the left hand, the cursor appears at the central position, between the A and S keys, in the first sub-area 1004 of the first area 1001.
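- The finger-to-sub-area correspondence above amounts to a simple lookup. The sketch below encodes the FIG. 10 example; the left-hand key groups come from the description, while the right-hand groups are assumed mirrors and the helper name is illustrative.

```python
# (hand, pinching finger) -> keys of the corresponding sub-area.
# The cursor's first position is the sub-area's center (e.g. the F key
# when the left index finger and thumb pinch).
SUB_AREAS = {
    ("left", "middle"):  ["Q", "W", "A", "S", "Z", "X"],                 # sub-area 1004
    ("left", "index"):   ["E", "R", "T", "D", "F", "G", "C", "V", "B"],  # sub-area 1003
    ("right", "index"):  ["Y", "U", "H", "J", "N", "M"],                 # assumed mirror
    ("right", "middle"): ["I", "O", "P", "K", "L"],                      # assumed mirror
}

def sub_area_keys(hand, finger):
    """Key group of the sub-area the cursor should start in, or None
    when the pinching finger has no corresponding sub-area (in which
    case the operation is ignored, as the description notes)."""
    return SUB_AREAS.get((hand, finger))
```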
- the first area is divided into three sub-areas
- the second area is also divided into three sub-areas.
- the first sub-area of the first area corresponds to the second finger of the left hand
- the second sub-area of the first area corresponds to the third finger of the left hand.
- the third sub-area of the first area corresponds to the fourth finger of the left hand.
- the fourth sub-area of the second area corresponds to the fifth finger of the right hand
- the fifth sub-area of the second area corresponds to the sixth finger of the right hand
- the sixth sub-area of the second area corresponds to the seventh finger of the right hand.
- Which of the four fingers are the second finger, the third finger, and the fourth finger is set by the relevant technical personnel according to the requirements, or set by the user according to the operating habits.
- the same is true for the fifth finger, the sixth finger, and the seventh finger.
- the embodiment of the application does not limit this.
- As shown in FIG. 11, the left hand serves as the input for the left half of the keyboard (the first area 1101), and the right hand serves as the input for the right half of the keyboard (the second area 1102).
- Take the target gesture as a pinch gesture, where the second finger, third finger, and fourth finger are respectively the ring finger, middle finger, and index finger of the left hand, and the fifth finger, sixth finger, and seventh finger are respectively the index finger, middle finger, and ring finger of the right hand.
- When the user pinches with the index finger and thumb of the left hand, the cursor appears in the third sub-area 1103 of the first area 1101 for controlling the triggering of the R, T, F, G, V, and B keys.
- When the user pinches with the middle finger and thumb of the left hand, the cursor appears in the second sub-area 1104 of the first area 1101 for controlling the triggering of the W, E, S, D, X, and C keys.
- When the user pinches with the ring finger and thumb of the left hand, the cursor appears in the first sub-area 1105 of the first area 1101 for controlling the triggering of the Q, A, and Z keys. The same is true for the right hand, which will not be repeated here.
- In some embodiments, the target position is the center of the area.
- For example, when the user pinches with the index finger and thumb of the left hand, the cursor appears at the center location within the third sub-area 1103 of the first area 1101, between the F and G keys.
- the first area is divided into four sub-areas, and the second area is also divided into four sub-areas.
- the first sub-area of the first area corresponds to the second finger of the left hand, and the second sub-area of the first area corresponds to the third finger of the left hand.
- the third sub-area of the first area corresponds to the fourth finger of the left hand, and the fourth sub-area of the first area corresponds to the fifth finger of the left hand.
- the fifth sub-area of the second area corresponds to the sixth finger of the right hand;
- the sixth sub-area of the second area corresponds to the seventh finger of the right hand;
- the seventh sub-area of the second area corresponds to the eighth finger of the right hand;
- the eighth sub-area of the second area corresponds to the ninth finger of the right hand.
- As shown in FIG. 12, the left hand serves as the input for the left half of the keyboard (the first area 1201), and the right hand serves as the input for the right half of the keyboard (the second area 1202).
- Take the target gesture as a pinch gesture, where the second finger, third finger, fourth finger, and fifth finger are respectively the little finger, ring finger, middle finger, and index finger of the left hand, and the sixth finger, seventh finger, eighth finger, and ninth finger are respectively the index finger, middle finger, ring finger, and little finger of the right hand.
- When the user pinches with the index finger and thumb of the left hand, the cursor appears in the fourth sub-area 1203 of the first area 1201 for controlling the triggering of the R, T, F, G, V, and B keys.
- When the user pinches with the middle finger and thumb of the left hand, the cursor appears in the third sub-area 1204 of the first area 1201 for controlling the triggering of the E, D, and C keys.
- When the user pinches with the ring finger and thumb of the left hand, the cursor appears in the second sub-area 1205 of the first area 1201 for controlling the triggering of the W, S, and X keys.
- When the user pinches with the little finger and thumb of the left hand, the cursor appears in the first sub-area 1206 of the first area 1201 for controlling the triggering of the Q, A, and Z keys. The same is true for the right hand, which will not be repeated here.
- In some embodiments, the target position is the center of the area. For example, when the user pinches with the index finger and thumb of the left hand, the cursor appears at the center of the fourth sub-area 1203 of the first area 1201, between the F and G keys; when the user pinches with the middle finger and thumb of the left hand, the cursor appears on the D key at the center of the third sub-area 1204 of the first area 1201.
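- To make the sub-area layout above concrete, the following minimal Python sketch (not part of the original disclosure; all names are illustrative) encodes the left-hand columns of the eight-sub-area layout and derives the key at the center of each column; with an even number of keys, the cursor lands between the two middle keys, matching the F/G example above.

```python
# Hypothetical encoding of the left-hand columns described above; the
# right-hand columns mirror them and are omitted here.
LEFT_SUBAREA_KEYS = {
    "little": ["Q", "A", "Z"],                 # first sub-area 1206
    "ring":   ["W", "S", "X"],                 # second sub-area 1205
    "middle": ["E", "D", "C"],                 # third sub-area 1204
    "index":  ["R", "T", "F", "G", "V", "B"],  # fourth sub-area 1203
}

def center_key(keys):
    """Return the key at the center of a column; with an even count the
    cursor sits between the two middle keys (e.g. between F and G)."""
    n = len(keys)
    return keys[n // 2] if n % 2 else (keys[n // 2 - 1], keys[n // 2])

print(center_key(LEFT_SUBAREA_KEYS["middle"]))  # D
print(center_key(LEFT_SUBAREA_KEYS["index"]))   # ('F', 'G')
```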
- In some embodiments, when the terminal detects that the human hand is in the target gesture, it may also determine the biometric feature of the target gesture. If the biometric feature indicates that the target gesture is formed by a first finger and the thumb of the human hand, but there is no first position corresponding to that first finger, the operation is ignored.
- the terminal acquires the displacement of the human hand in response to the human hand maintaining the target gesture and moving.
- the above-mentioned first position is the initial display position of the cursor.
- the initial display position may be on the key or between the keys. The user can move the hand to control the movement of the cursor.
- the terminal acquires the displacement of the human hand.
- the process of obtaining the displacement of the human hand can adopt the same steps as the above-mentioned gesture recognition.
- the terminal collects two images of the human hand, and recognizes the position of the human hand to determine the position change of the human hand in the two images, that is, the displacement.
- the features such as the rotation determined by the above-mentioned gesture tracking sensor are used to describe the shape of the fingers, and are also used to determine the position of each knuckle of the human hand (the position can be represented by coordinates). Then, while the human hand is moving, the terminal may continue to acquire the positions of the knuckles of the human hand and compare them with the positions acquired in step 702 above to determine the displacement of the human hand.
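- As an illustration of the displacement calculation just described, the following Python sketch (illustrative only; the (N, 3) knuckle-coordinate arrays are an assumed sensor output format) averages the per-knuckle movement between two frames:

```python
import numpy as np

def hand_displacement(prev_knuckles, curr_knuckles):
    """Average per-axis movement of tracked knuckle positions between two
    frames; each argument is an (N, 3) array of knuckle coordinates."""
    prev = np.asarray(prev_knuckles, dtype=float)
    curr = np.asarray(curr_knuckles, dtype=float)
    return (curr - prev).mean(axis=0)

# Usage with two hypothetical frames of three knuckles each:
frame_a = [[0.00, 0.0, 0.3], [0.00, 0.1, 0.3], [0.00, 0.2, 0.3]]
frame_b = [[0.02, 0.0, 0.3], [0.02, 0.1, 0.3], [0.02, 0.2, 0.3]]
print(hand_displacement(frame_a, frame_b))  # [0.02 0.   0.  ]
```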
- the terminal determines a third position according to the first position and the displacement of the human hand.
- the terminal determines the displacement of the cursor from the distance moved by the human hand, and then, from the first position and the displacement of the cursor, determines the third position, which is the position to which the cursor should move.
- When the terminal determines the displacement of the cursor according to the displacement of the human hand, this may be realized in various manners. In some embodiments, the terminal uses the displacement of the human hand as the displacement of the cursor. In other embodiments, the terminal determines the displacement of the cursor according to the displacement of the human hand and a sensitivity factor.
- the sensitivity is set by relevant technicians according to requirements, or by users according to their own operating habits, which is not limited in this embodiment of the present application.
- the terminal displays that the cursor moves from the first position to the third position.
- After the terminal determines the third position, it can control the cursor to move from the first position to the third position, thereby reflecting the effect that the cursor follows the movement of the human hand.
- Steps 703 to 705 above are a possible implementation of displaying that the cursor moves following the movement of the human hand in response to the human hand maintaining the target gesture and moving.
- the above method of controlling cursor movement by determining the cursor position from the displacement of the human hand is only described as an example, and this process can also be implemented in other ways.
- For example, the terminal does not need to determine the third position to which the cursor is to be moved, but directly obtains the displacement of the cursor and then controls the movement of the cursor according to that displacement, which is not limited in this embodiment of the present application.
- For example, assume that the finger displacement maps one-to-one onto the cursor displacement, that is, the sensitivity is 1. The first position is represented by InitCursorPos, the third position is represented by CursorPos, and the displacement of the human hand is represented by TipMovement; after the displacement of the human hand is obtained, the third position is given by CursorPos = InitCursorPos + TipMovement.
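- The relation above reduces to a one-line update. The following sketch (illustrative, using hypothetical 2D screen coordinates) also exposes the sensitivity factor mentioned earlier; sensitivity 1.0 reproduces the one-to-one mapping:

```python
def cursor_position(init_cursor_pos, tip_movement, sensitivity=1.0):
    """CursorPos = InitCursorPos + sensitivity * TipMovement."""
    ix, iy = init_cursor_pos
    dx, dy = tip_movement
    return (ix + sensitivity * dx, iy + sensitivity * dy)

# Usage: the cursor starts between the F and G keys (hypothetical
# coordinates) and follows a small hand movement.
init_cursor_pos = (120.0, 80.0)  # InitCursorPos
tip_movement = (14.0, -6.0)      # TipMovement
print(cursor_position(init_cursor_pos, tip_movement))  # (134.0, 74.0)
```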
- the terminal executes the function of the key corresponding to the second position of the virtual keyboard, where the second position is the position of the cursor when the gesture of the human hand changes.
- That is, the function of the key at the second position can be executed.
- the virtual keyboard includes various types of keys: character keys, function keys, and the like.
- Depending on the type of the key corresponding to the second position, the terminal performs different functions.
- this step 706 includes the following situations.
- Case 1: In response to the key corresponding to the second position being a character key, the character represented by that key is input in the input box or character display area of the virtual keyboard.
- the terminal can input the character represented by the key corresponding to the second position in the input box of the virtual keyboard. If the current Chinese input mode is used and the character key is a letter key, the terminal can input the character represented by the key corresponding to the second position in the character display area.
- Case 2: In response to the key corresponding to the second position being a delete key, the last character in the input box or character display area of the virtual keyboard is deleted.
- the delete button is a function button, not for character input, but for deleting the input characters. If the current input mode is English, or the input mode is Chinese and there is no input character in the character display area, the terminal can delete the last character in the input box of the virtual keyboard. If the current input mode is Chinese and there are already input characters in the character display area, the terminal can delete the last character in the character display area.
- Case 3: In response to the key corresponding to the second position being a newline key, the input cursor in the input box of the virtual keyboard is displayed as moving to the next line.
- Case 4: In response to the key corresponding to the second position being a confirmation key, the content already input in the input box of the virtual keyboard is taken as the target content, and the display of the virtual keyboard is canceled.
- In some embodiments, the virtual keyboard also includes an exit button or a return button, and the terminal cancels the display of the virtual keyboard in response to the key corresponding to the second position being an exit button or a return button.
- In some embodiments, the virtual keyboard also includes a parameter adjustment key, and the terminal adjusts a parameter of the current display interface in response to the key corresponding to the second position being a parameter adjustment key, where the parameter includes at least one of a display parameter, a volume parameter, and a control parameter. This embodiment of the present application does not limit this.
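- The case analysis of step 706 amounts to a dispatch on the key type. The following sketch (illustrative only; the key and ui dictionaries are hypothetical stand-ins for the terminal's real structures) shows one way to organize it:

```python
def execute_key(key, ui):
    """Dispatch on the type of the key at the second position."""
    kind = key["kind"]
    if kind == "character":                  # Case 1: input the character
        ui["buffer"].append(key["char"])
    elif kind == "delete" and ui["buffer"]:  # Case 2: delete last character
        ui["buffer"].pop()
    elif kind == "newline":                  # Case 3: move to the next line
        ui["buffer"].append("\n")
    elif kind in ("exit", "return"):         # exit/return: hide the keyboard
        ui["keyboard_visible"] = False
    elif kind == "param_adjust":             # adjust display/volume/control
        ui["params"][key["param"]] = key["value"]

ui = {"buffer": [], "keyboard_visible": True, "params": {}}
execute_key({"kind": "character", "char": "f"}, ui)
execute_key({"kind": "param_adjust", "param": "volume", "value": 7}, ui)
print("".join(ui["buffer"]), ui["params"])  # f {'volume': 7}
```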
- the virtual keyboard further includes at least one virtual control.
- the terminal may also, in response to detecting that the human hand is not in the target gesture and the human hand moves, display that the cursor moves as the human hand moves, and then, in response to detecting that the human hand makes a target action while the cursor is located on a target virtual control among the at least one virtual control, execute the function corresponding to the target virtual control.
- the process of displaying that the cursor moves along with the movement of the human hand is the same as the above steps 303 to 305, and will not be repeated here.
- the target action is likewise set by relevant technical personnel according to requirements, or set by the user according to operating habits. For example, the target action is a tap action or a slap action, which is not limited in the embodiment of the present application.
- the virtual controls include the following types: a switch control for uppercase input and lowercase input, a switch control for symbol input and letter input, and a switch control for Chinese input and English input. These types of virtual controls are described below.
- In response to the target virtual control being a switching control for uppercase input and lowercase input, the terminal switches the characters displayed in the virtual keyboard between uppercase input characters and lowercase input characters.
- In some embodiments, the uppercase input may further include normal uppercase input and locked uppercase input.
- the terminal updates the display content of the target virtual control, and the updated display content is consistent with the switching of characters in the virtual keyboard or the switching of character input modes.
- the terminal switches the characters displayed in the virtual keyboard between letters and symbols in response to the target virtual control being a switching control for symbol input and letter input.
- In response to the target virtual control being a switching control for Chinese input and English input, the terminal switches the character input mode of the virtual keyboard between Chinese input and English input.
- For example, in the English input mode, the switching control 1501 displays "EN", and in the Chinese input mode, the switching control 1501 displays "Pinyin".
- Here, the display of the input pinyin "fa" is taken as an example, that is, the character display area is the area where the switching control 1501 is located.
- In some embodiments, the character display area can also be another area that is not the area where the switching control 1501 is located.
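- The three switching controls described above are simple mode toggles that must also keep their own label consistent with the new mode. A minimal sketch (all names hypothetical) follows:

```python
def apply_switch_control(control, kb):
    """Toggle the keyboard state for one of the three switching controls
    and keep the control's displayed label consistent."""
    if control == "case":          # uppercase <-> lowercase
        kb["uppercase"] = not kb["uppercase"]
    elif control == "symbols":     # letters <-> symbols
        kb["symbol_mode"] = not kb["symbol_mode"]
    elif control == "language":    # Chinese <-> English input mode
        kb["chinese"] = not kb["chinese"]
        kb["language_label"] = "Pinyin" if kb["chinese"] else "EN"
    return kb

kb = {"uppercase": False, "symbol_mode": False,
      "chinese": False, "language_label": "EN"}
print(apply_switch_control("language", kb)["language_label"])  # Pinyin
```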
- the terminal responds to the presence of corresponding candidate Chinese characters for the input character combination in the character display area of the virtual keyboard, and displays the candidate Chinese characters corresponding to the input character combination within the target range of the character display area.
- the character display area is a pinyin display area
- the input characters in the character display area are pinyin characters for representing the pronunciation of Chinese characters
- the input character combination represents the pinyin of the candidate Chinese characters.
- the character display area includes a plurality of input characters, and when the plurality of characters correspond to candidate Chinese characters, the candidate Chinese characters may be displayed on the right.
- For example, "fa" corresponds to a plurality of candidate Chinese characters, such as 发 (hair), 罚 (punishment), 筏 (raft), 伐 (felling), 乏 (lacking), 阀 (valve), and 法 (law); only some of them are displayed within the target range, and the displayed candidate Chinese characters can be scrolled to reveal the other candidates.
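- Candidate lookup and scrolling can be sketched as a paged query against a pinyin-to-candidates table; the tiny table below is a hypothetical stand-in for a real input-method dictionary:

```python
# Hypothetical candidate table keyed by the full input character combination.
CANDIDATES = {"fa": ["发", "罚", "筏", "伐", "乏", "阀", "法"]}

def visible_candidates(pinyin, page=0, page_size=6):
    """Return the slice of candidates shown within the target range;
    scrolling to the next page reveals the remaining candidates."""
    all_candidates = CANDIDATES.get(pinyin, [])
    start = page * page_size
    return all_candidates[start:start + page_size]

print(visible_candidates("fa"))          # first six candidates
print(visible_candidates("fa", page=1))  # the rest, after scrolling
```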
- If it is determined that the pinch gesture was not being made before, that is, the pinch gesture has just been detected, the terminal determines whether the hand is the left hand; if so, the operation applies to the left keyboard part, and if not, the operation applies to the right keyboard part. The terminal then determines which finger forms the pinch gesture with the thumb, and determines, according to that finger, whether there is a corresponding keyboard operation area. If there is, a cursor is displayed at the center of the corresponding keyboard operation area; the position of the cursor is the initial cursor position InitCursorPos. The terminal may also record the FingerIndex of the finger (the index finger to the little finger are numbered 1 to 4, respectively) and the spatial coordinate InitTipPos of the finger.
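- The pinch-start flow just described can be summarized in a short sketch (illustrative; the area_centers mapping and coordinate convention are assumptions, with FingerIndex 1-4 corresponding to the index through little finger as above):

```python
def on_pinch_start(is_left_hand, finger_index, tip_pos, area_centers):
    """Handle a newly detected pinch: pick the keyboard half, look up the
    operation area for the pinching finger, and record the initial state."""
    hand = "left" if is_left_hand else "right"
    center = area_centers.get((hand, finger_index))
    if center is None:
        return None  # no corresponding operation area: ignore the pinch
    return {
        "InitCursorPos": center,      # cursor shown at the area center
        "FingerIndex": finger_index,  # 1..4 = index..little finger
        "InitTipPos": tip_pos,        # spatial coordinates of the finger
    }

areas = {("left", 1): (120.0, 80.0)}  # hypothetical single mapping
print(on_pinch_start(True, 1, (0.10, 0.20, 0.30), areas))
```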
- This application requires no controller: pinch-and-move gestures are used to trigger the virtual keys of the keyboard. After the pinch gesture, selection starts from the center of the control area, which shortens the distance to every key in that area. The left and right hands screen out a large area (the left or right keyboard part), and each finger further screens out a small area (one or two columns of keys). After the final key area has been screened out by the pinch, the displacement of the gesture is used for precise key selection (upper, middle, or lower key), so that the gesture improves selection efficiency while the displacement operation supplies the final degree of freedom for selecting the specific key.
- the embodiment of the present application provides a flexible and easy-to-operate method for triggering keys in a virtual keyboard by combining the two operations of gesture and movement.
- In this method, the user only needs to make the target gesture with the hand to trigger the display of the cursor, and then move the hand while maintaining the target gesture to control the movement of the cursor; this operation is simple and convenient while still giving full control over the cursor's movement.
- the display position of the cursor can also be determined based on the biometric features of the target gesture, so that the user can flexibly use different hands for gesture operations, so as to minimize the moving distance of the cursor, reduce operation complexity, and improve operational efficiency.
- FIG. 18 is a schematic diagram of a key function execution system provided by an embodiment of the present application.
- the key function execution system includes an electronic device 1801, a gesture tracking sensor 1802, and a display device 1803; wherein the gesture tracking sensor 1802 and the display device 1803 are respectively connected to the electronic device 1801.
- the display device 1803 is used to display a virtual keyboard, the virtual keyboard includes a first area and a second area, each of the first area and the second area includes at least two keys, the first area corresponds to the left hand, and the second area corresponds to the right hand;
- the gesture tracking sensor 1802 is used to detect that the human hand is in a target gesture
- the display device 1803 is further configured to display a cursor at the first position according to the biometric feature of the target gesture in response to detecting that the human hand is in the target gesture, where the first position is located in the area corresponding to the biometric feature within the virtual keyboard, and the first position is the position corresponding to the biometric feature;
- the gesture tracking sensor 1802 is also used to detect that the human hand maintains the target gesture and moves;
- the display device 1803 is also used to display that the cursor moves following the movement of the human hand in response to the human hand maintaining the target gesture and moving;
- the electronic device 1801 is configured to execute the function of the key corresponding to the second position of the virtual keyboard in response to the change of the gesture of the human hand, and the second position is the position of the cursor when the gesture of the human hand changes.
- the display device 1803 is a virtual reality (VR) display device or a screen display.
- In some embodiments, the display device 1803 is used to: in response to the biometric feature indicating that the human hand is the left hand, display a cursor at the target position within the first area of the virtual keyboard; and in response to the biometric feature indicating that the human hand is the right hand, display a cursor at the target position within the second area of the virtual keyboard.
- In some embodiments, the display device 1803 is used to: in response to the biometric feature indicating that the human hand is the left hand and the target gesture being formed by the first finger and thumb of the human hand, display a cursor at the target position in the sub-area corresponding to the first finger in the first area of the virtual keyboard; and in response to the biometric feature indicating that the human hand is the right hand and the target gesture being formed by the first finger and thumb of the human hand, display a cursor at the target position in the sub-area corresponding to the first finger in the second area of the virtual keyboard.
- the target location is the center location of the area.
- the electronic device 1801 is configured to acquire the displacement of the human hand in response to the human hand maintaining the target gesture and moving; determining a third position according to the first position and the displacement of the human hand;
- the display device 1803 is used for displaying that the cursor moves from the first position to the third position.
- the display device 1803 is configured to perform any of the following: in response to the key corresponding to the second position being a character key, input the character represented by that key in the input box or character display area of the virtual keyboard; in response to the key being a delete key, delete the last character in the input box or character display area; in response to the key being a newline key, display that the input cursor in the input box moves to the next line; and in response to the key being a confirmation key, take the input content in the input box of the virtual keyboard as the target content and cancel the display of the virtual keyboard.
- the display device 1803 is also used for displaying, in response to an input character combination in the character display area of the virtual keyboard having corresponding candidate Chinese characters, the candidate Chinese characters corresponding to the input character combination within the target range of the character display area.
- the virtual keyboard also includes at least one virtual control
- the display device 1803 is also used to display that the cursor moves as the human hand moves, in response to detecting that the human hand is not in the target gesture and the human hand moves;
- the electronic device 1801 is further configured to execute a function corresponding to the target virtual control in response to detecting that the human hand makes a target action and the cursor is located on the target virtual control in the at least one virtual control.
- the electronic device 1801 is used to:
- the characters displayed in the virtual keyboard are switched between uppercase input characters and lowercase input characters;
- the characters displayed in the virtual keyboard are switched between letters and symbols;
- the character input mode of the virtual keyboard is switched between Chinese input and English input.
- the display device 1803 is further configured to update the display content of the target virtual control, and the updated display content is consistent with the switching of characters in the virtual keyboard or the switching of character input modes.
- Fig. 19 is a schematic structural diagram of a key function execution device provided by an embodiment of the present application. Referring to Fig. 19, the device includes:
- the display module 1901 is configured to display a virtual keyboard, the virtual keyboard includes a first area and a second area, each of the first area and the second area includes at least two keys, the first area corresponds to the left hand, the The second area corresponds to the right hand;
- the display module 1901 is further configured to, in response to detecting that the human hand is in the target gesture, display the cursor at a first position according to the biometric feature of the target gesture, where the first position is located in the area corresponding to the biometric feature within the virtual keyboard, and the first position is the position corresponding to the biometric feature;
- the display module 1901 is further configured to display that the cursor moves following the movement of the human hand in response to the human hand maintaining the target gesture and moving;
- the execution module 1902 is configured to execute the function of the key corresponding to the second position of the virtual keyboard in response to the change of the gesture of the human hand, where the second position is the position of the cursor when the gesture of the human hand changes.
- In some embodiments, the display module 1901 is used for: in response to the biometric feature indicating that the human hand is the left hand, displaying a cursor at the target position within the first area of the virtual keyboard; and in response to the biometric feature indicating that the human hand is the right hand, displaying a cursor at the target position within the second area of the virtual keyboard.
- In some embodiments, the display module 1901 is used for: in response to the biometric feature indicating that the human hand is the left hand and the target gesture being formed by the first finger and thumb of the human hand, displaying a cursor at the target position in the sub-area corresponding to the first finger in the first area of the virtual keyboard; and in response to the biometric feature indicating that the human hand is the right hand and the target gesture being formed by the first finger and thumb of the human hand, displaying a cursor at the target position in the sub-area corresponding to the first finger in the second area of the virtual keyboard.
- the target location is the center location of the area.
- In some embodiments, the display module 1901 is used for: acquiring the displacement of the human hand in response to the human hand maintaining the target gesture and moving; determining a third position according to the first position and the displacement of the human hand; and displaying that the cursor moves from the first position to the third position.
- the execution module 1902 is used for any of the following: in response to the key corresponding to the second position being a character key, inputting the character represented by that key in the input box or character display area of the virtual keyboard; in response to the key being a delete key, deleting the last character in the input box or character display area; in response to the key being a newline key, displaying that the input cursor in the input box moves to the next line; and in response to the key being a confirmation key, taking the input content in the input box of the virtual keyboard as the target content and canceling the display of the virtual keyboard.
- the display module 1901 is further configured to display, in response to an input character combination in the character display area of the virtual keyboard having corresponding candidate Chinese characters, the candidate Chinese characters corresponding to the input character combination within the target range of the character display area.
- the virtual keyboard also includes at least one virtual control
- the display module 1901 is also used for displaying that the cursor moves with the movement of the human hand, in response to detecting that the human hand is not in the target gesture and the human hand moves;
- the execution module 1902 is further configured to execute a function corresponding to the target virtual control in response to detecting that the human hand makes a target action and the cursor is located on the target virtual control in the at least one virtual control.
- the execution module 1902 is used for:
- the characters displayed in the virtual keyboard are switched between uppercase input characters and lowercase input characters;
- the characters displayed in the virtual keyboard are switched between letters and symbols;
- the character input mode of the virtual keyboard is switched between Chinese input and English input.
- the executing module 1902 is further configured to update the display content of the target virtual control, and the updated display content is consistent with the switching of characters in the virtual keyboard or the switching of character input modes.
- the apparatus shown in the embodiments of the present application may be configured on an electronic device or on other devices, which is not limited in the embodiments of the present application.
- the embodiment of the present application provides a flexible and easy-to-operate method for triggering keys in a virtual keyboard by combining the two operations of gesture and movement.
- In this method, the user only needs to make the target gesture with the hand to trigger the display of the cursor, and then move the hand while maintaining the target gesture to control the movement of the cursor; this operation is simple and convenient while still giving full control over the cursor's movement.
- the display position of the cursor can also be determined based on the biometric features of the target gesture, so that the user can flexibly use different hands for gesture operations, so as to minimize the moving distance of the cursor, reduce operation complexity, and improve operational efficiency.
- When the key function execution device provided by the above embodiments executes a key function, the division into the above functional modules is only used as an example for illustration; in practical applications, these functions may be allocated to different functional modules as required, that is, the internal structure of the key function execution device is divided into different functional modules to complete all or part of the functions described above.
- the key function execution device provided by the above embodiments belongs to the same concept as the embodiments of the key function execution method; its specific implementation process is detailed in the method embodiments and is not repeated here.
- FIG. 20 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- the electronic device 2000 may vary greatly in configuration or performance, and includes one or more processors (Central Processing Units, CPU) 2001 and one or more memories 2002, wherein at least one computer program is stored in the memory 2002, and the at least one computer program is loaded and executed by the processor 2001 to implement the key function execution methods provided by the above method embodiments.
- the electronic device also includes other components for implementing device functions.
- the electronic device also has components such as a wired or wireless network interface and an input/output interface for input and output.
- the embodiment of the present application will not be described in detail here.
- FIG. 21 is a structural block diagram of a terminal provided in an embodiment of the present application.
- the terminal 2100 includes: a processor 2101 and a memory 2102 .
- the processor 2101 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 2101 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). In some embodiments, the processor 2101 may also include an AI (Artificial Intelligence) processor, where the AI processor is used to handle computing operations related to machine learning.
- Memory 2102 may include one or more computer-readable storage media, which may be non-transitory.
- the non-transitory computer-readable storage medium in the memory 2102 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 2101 to implement the key function execution method provided by the method embodiments in this application.
- the terminal 2100 may optionally further include: a peripheral device interface 2103 and at least one peripheral device.
- the processor 2101, the memory 2102, and the peripheral device interface 2103 may be connected through buses or signal lines.
- Each peripheral device can be connected to the peripheral device interface 2103 through a bus, a signal line or a circuit board.
- the peripheral device includes: at least one of a display screen 2104 or a camera assembly 2105 .
- the peripheral device interface 2103 may be used to connect at least one peripheral device related to I/O (Input/Output, input/output) to the processor 2101 and the memory 2102 .
- In some embodiments, the processor 2101, the memory 2102, and the peripheral device interface 2103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 2101, the memory 2102, and the peripheral device interface 2103 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the display screen 2104 is used to display a UI (User Interface, user interface).
- the UI can include graphics, text, icons, video, and any combination thereof.
- the display screen 2104 also has the ability to collect touch signals on or above the surface of the display screen 2104 .
- the touch signal can be input to the processor 2101 as a control signal for processing.
- the display screen 2104 can also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
- the camera assembly 2105 is used to capture images or videos.
- the camera component 2105 includes a front camera and a rear camera.
- the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
- In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, or a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize the background blur function, or the main camera and the wide-angle camera can be fused to achieve panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions.
- terminal 2100 further includes one or more sensors.
- the one or more sensors include, but are not limited to: acceleration sensors, gyro sensors, pressure sensors, optical sensors, and proximity sensors.
- The structure shown in FIG. 21 does not constitute a limitation on the terminal 2100, which may include more or fewer components than shown, combine some components, or adopt a different component arrangement.
- FIG. 22 is a schematic structural diagram of a server provided by an embodiment of the present application.
- the server 2200 may vary greatly in configuration or performance, and includes one or more processors (Central Processing Units, CPU) 2201 and one or more memories 2202, wherein at least one computer program is stored in the memory 2202, and the at least one computer program is loaded and executed by the processor 2201 to implement the key function execution methods provided by the above method embodiments.
- the server also has components such as wired or wireless network interfaces and input and output interfaces for input and output, and the server also includes other components for realizing device functions, which will not be repeated here.
- a computer-readable storage medium such as a memory including at least one computer program, and the at least one computer program is executable by a processor to complete the key function execution method in the above-mentioned embodiment.
- the computer-readable storage medium is a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a compact disc read-only memory (Compact Disc Read-Only Memory, CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
- a computer program product or a computer program is also provided, the computer program product or the computer program comprising one or more pieces of program code stored in a computer-readable storage medium.
- One or more processors of the electronic device read the one or more pieces of program code from the computer-readable storage medium, and the one or more processors execute the one or more pieces of program code, so that the electronic device performs the above-mentioned button function execution method.
- the computer programs involved in the embodiments of the present application can be deployed and executed on one computer device, or executed on multiple computer devices at one location, or executed on multiple computer devices distributed in multiple locations and interconnected by a communication network; the multiple computer devices distributed in multiple locations and interconnected by a communication network can form a blockchain system.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (16)
- A key function execution method, the method comprising: displaying, by a key function execution system, a virtual keyboard, the virtual keyboard comprising a first area and a second area, each of the first area and the second area comprising at least two keys, the first area corresponding to the left hand, and the second area corresponding to the right hand; in response to detecting that a human hand is in a target gesture, displaying, by the system, a cursor at a first position according to a biometric feature of the target gesture, the first position being located in an area corresponding to the biometric feature within the virtual keyboard, and the first position being a position corresponding to the biometric feature; in response to the human hand maintaining the target gesture and moving, displaying, by the system, that the cursor moves following the movement of the human hand; and in response to a change in the gesture of the human hand, executing, by the system, a function of a key corresponding to a second position of the virtual keyboard, the second position being the position of the cursor when the gesture of the human hand changes.
- The method according to claim 1, wherein displaying, by the system, the cursor at the first position according to the biometric feature of the target gesture, the first position being located in the area corresponding to the biometric feature within the virtual keyboard, comprises: in response to the biometric feature indicating that the human hand is the left hand, displaying, by the system, the cursor at a target position within the first area of the virtual keyboard; and in response to the biometric feature indicating that the human hand is the right hand, displaying, by the system, the cursor at a target position within the second area of the virtual keyboard.
- The method according to claim 1, wherein displaying, by the system, the cursor at the first position according to the biometric feature of the target gesture comprises: in response to the biometric feature indicating that the human hand is the left hand and the target gesture being formed by a first finger and the thumb of the human hand, displaying, by the system, the cursor at a target position within the sub-area corresponding to the first finger in the first area; and in response to the biometric feature indicating that the human hand is the right hand and the target gesture being formed by a first finger and the thumb of the human hand, displaying, by the system, the cursor at a target position within the sub-area corresponding to the first finger in the second area.
- The method according to claim 2 or 3, wherein the target position is the center position of the area.
- The method according to claim 1, wherein in response to the human hand maintaining the target gesture and moving, displaying, by the system, that the cursor moves following the movement of the human hand comprises: in response to the human hand maintaining the target gesture and moving, acquiring, by the system, the displacement of the human hand; determining, by the system, a third position according to the first position and the displacement of the human hand; and displaying, by the system, that the cursor moves from the first position to the third position.
- The method according to claim 1, wherein executing, by the system, the function of the key corresponding to the second position of the virtual keyboard comprises any one of the following: in response to the key corresponding to the second position being a character key, inputting, by the system, the character represented by the key in an input box or character display area of the virtual keyboard; in response to the key corresponding to the second position being a delete key, deleting, by the system, the last character in the input box or character display area of the virtual keyboard; in response to the key corresponding to the second position being a newline key, displaying, by the system, that the input cursor in the input box of the virtual keyboard moves to the next line; and in response to the key corresponding to the second position being a confirmation key, taking, by the system, the content already input in the input box of the virtual keyboard as target content, and canceling the display of the virtual keyboard.
- The method according to claim 1, wherein the method further comprises: in response to an input character combination in a character display area of the virtual keyboard having corresponding candidate Chinese characters, displaying, by the system, the candidate Chinese characters within a target range of the character display area.
- The method according to claim 1, wherein the virtual keyboard further comprises at least one virtual control; and the method further comprises: in response to detecting that the human hand is not in the target gesture and the human hand moves, displaying, by the system, that the cursor moves as the human hand moves; and in response to detecting that the human hand makes a target action and the cursor is located on a target virtual control among the at least one virtual control, executing, by the system, the function corresponding to the target virtual control.
- The method according to claim 8, wherein executing, by the system, the function corresponding to the virtual control comprises: in response to the target virtual control being a switching control for uppercase input and lowercase input, switching, by the system, the characters displayed in the virtual keyboard between uppercase input characters and lowercase input characters; in response to the target virtual control being a switching control for symbol input and letter input, switching, by the system, the characters displayed in the virtual keyboard between letters and symbols; and in response to the target virtual control being a switching control for Chinese input and English input, switching, by the system, the character input mode of the virtual keyboard between Chinese input and English input.
- The method according to claim 9, wherein executing, by the system, the function corresponding to the virtual control further comprises: updating, by the system, the display content of the target virtual control, the updated display content being consistent with the switching of the characters in the virtual keyboard or the switching of the character input mode.
- A key function execution system, the system comprising an electronic device, a gesture tracking sensor, and a display device, wherein the gesture tracking sensor and the display device are respectively connected to the electronic device; the display device is configured to display a virtual keyboard, the virtual keyboard comprising a first area and a second area, each of the first area and the second area comprising at least two keys, the first area corresponding to the left hand, and the second area corresponding to the right hand; the gesture tracking sensor is configured to detect that a human hand is in a target gesture; the display device is further configured to, in response to detecting that the human hand is in the target gesture, display a cursor at a first position according to a biometric feature of the target gesture, the first position being located in an area corresponding to the biometric feature within the virtual keyboard, and the first position being a position corresponding to the biometric feature; the gesture tracking sensor is further configured to detect that the human hand maintains the target gesture and moves; the display device is further configured to, in response to the human hand maintaining the target gesture and moving, display that the cursor moves following the movement of the human hand; and the electronic device is configured to, in response to a change in the gesture of the human hand, execute a function of a key corresponding to a second position of the virtual keyboard, the second position being the position of the cursor when the gesture of the human hand changes.
- The system according to claim 11, wherein the display device is a virtual reality display device or a screen display.
- A key function execution apparatus, the apparatus comprising: a display module configured to display a virtual keyboard, the virtual keyboard comprising a first area and a second area, each of the first area and the second area comprising at least two keys, the first area corresponding to the left hand, and the second area corresponding to the right hand; the display module being further configured to, in response to detecting that a human hand is in a target gesture, display a cursor at a first position according to a biometric feature of the target gesture, the first position being located in an area corresponding to the biometric feature within the virtual keyboard, and the first position being a position corresponding to the biometric feature; the display module being further configured to, in response to the human hand maintaining the target gesture and moving, display that the cursor moves following the movement of the human hand; and an execution module configured to, in response to a change in the gesture of the human hand, execute a function of a key corresponding to a second position of the virtual keyboard, the second position being the position of the cursor when the gesture of the human hand changes.
- An electronic device, comprising one or more processors and one or more memories, the one or more memories storing at least one computer program, the at least one computer program being loaded and executed by the one or more processors to implement the key function execution method according to any one of claims 1 to 10.
- A computer-readable storage medium, storing at least one computer program, the at least one computer program being loaded and executed by a processor to implement the key function execution method according to any one of claims 1 to 10.
- A computer program product, comprising one or more pieces of program code stored in a computer-readable storage medium, wherein one or more processors of an electronic device read the one or more pieces of program code from the computer-readable storage medium, and the one or more processors execute the one or more pieces of program code, so that the electronic device executes the key function execution method according to any one of claims 1 to 10.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023571382A JP2024520943A (ja) | 2021-06-22 | 2022-05-18 | キー機能実行方法、キー機能実行システム、キー機能実行装置、電子機器、及びコンピュータプログラム |
EP22827258.9A EP4307096A1 (en) | 2021-06-22 | 2022-05-18 | Key function execution method, apparatus and device, and storage medium |
US18/295,623 US20230244379A1 (en) | 2021-06-22 | 2023-04-04 | Key function execution method and apparatus, device, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110693794.4A CN113253908B (zh) | 2021-06-22 | 2021-06-22 | 按键功能执行方法、装置、设备及存储介质 |
CN202110693794.4 | 2021-06-22 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/295,623 Continuation US20230244379A1 (en) | 2021-06-22 | 2023-04-04 | Key function execution method and apparatus, device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022267760A1 true WO2022267760A1 (zh) | 2022-12-29 |
Family
ID=77189245
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/093610 WO2022267760A1 (zh) | 2021-06-22 | 2022-05-18 | 按键功能执行方法、装置、设备及存储介质 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230244379A1 (zh) |
EP (1) | EP4307096A1 (zh) |
JP (1) | JP2024520943A (zh) |
CN (1) | CN113253908B (zh) |
WO (1) | WO2022267760A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116048374A (zh) * | 2023-03-05 | 2023-05-02 | 广州网才信息技术有限公司 | 虚拟隐形键盘的在线考试方法及系统 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113253908B (zh) * | 2021-06-22 | 2023-04-25 | 腾讯科技(深圳)有限公司 | 按键功能执行方法、装置、设备及存储介质 |
CN117193540B (zh) * | 2023-11-06 | 2024-03-12 | 南方科技大学 | 虚拟键盘的控制方法和系统 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102880304A (zh) * | 2012-09-06 | 2013-01-16 | 天津大学 | 用于便携设备的字符输入方法及装置 |
CN103324271A (zh) * | 2012-03-19 | 2013-09-25 | 联想(北京)有限公司 | 一种基于手势的输入方法和电子设备 |
CN109542239A (zh) * | 2019-01-16 | 2019-03-29 | 张斌 | 一种手势控制方法、手势设备及系统 |
CN111596757A (zh) * | 2020-04-02 | 2020-08-28 | 林宗宇 | 一种基于指尖交互的手势控制方法和装置 |
US20200387229A1 (en) * | 2019-06-07 | 2020-12-10 | Facebook Technologies, Llc | Artificial reality system having a digit-mapped self-haptic input method |
US20200387214A1 (en) * | 2019-06-07 | 2020-12-10 | Facebook Technologies, Llc | Artificial reality system having a self-haptic virtual keyboard |
CN113253908A (zh) * | 2021-06-22 | 2021-08-13 | 腾讯科技(深圳)有限公司 | 按键功能执行方法、装置、设备及存储介质 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018103040A1 (zh) * | 2016-12-08 | 2018-06-14 | 深圳市柔宇科技有限公司 | 头戴式显示设备及其内容输入方法 |
CN109828672B (zh) * | 2019-02-14 | 2022-05-27 | 亮风台(上海)信息科技有限公司 | 一种用于确定智能设备的人机交互信息的方法与设备 |
CN111142674B (zh) * | 2019-12-31 | 2021-09-14 | 联想(北京)有限公司 | 一种控制方法及电子设备 |
CN111142675A (zh) * | 2019-12-31 | 2020-05-12 | 维沃移动通信有限公司 | 输入方法及头戴式电子设备 |
CN111443831A (zh) * | 2020-03-30 | 2020-07-24 | 北京嘉楠捷思信息技术有限公司 | 一种手势识别方法及装置 |
-
2021
- 2021-06-22 CN CN202110693794.4A patent/CN113253908B/zh active Active
-
2022
- 2022-05-18 WO PCT/CN2022/093610 patent/WO2022267760A1/zh active Application Filing
- 2022-05-18 JP JP2023571382A patent/JP2024520943A/ja active Pending
- 2022-05-18 EP EP22827258.9A patent/EP4307096A1/en active Pending
-
2023
- 2023-04-04 US US18/295,623 patent/US20230244379A1/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103324271A (zh) * | 2012-03-19 | 2013-09-25 | 联想(北京)有限公司 | 一种基于手势的输入方法和电子设备 |
CN102880304A (zh) * | 2012-09-06 | 2013-01-16 | 天津大学 | 用于便携设备的字符输入方法及装置 |
CN109542239A (zh) * | 2019-01-16 | 2019-03-29 | 张斌 | 一种手势控制方法、手势设备及系统 |
US20200387229A1 (en) * | 2019-06-07 | 2020-12-10 | Facebook Technologies, Llc | Artificial reality system having a digit-mapped self-haptic input method |
US20200387214A1 (en) * | 2019-06-07 | 2020-12-10 | Facebook Technologies, Llc | Artificial reality system having a self-haptic virtual keyboard |
CN111596757A (zh) * | 2020-04-02 | 2020-08-28 | 林宗宇 | 一种基于指尖交互的手势控制方法和装置 |
CN113253908A (zh) * | 2021-06-22 | 2021-08-13 | 腾讯科技(深圳)有限公司 | 按键功能执行方法、装置、设备及存储介质 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116048374A (zh) * | 2023-03-05 | 2023-05-02 | 广州网才信息技术有限公司 | 虚拟隐形键盘的在线考试方法及系统 |
CN116048374B (zh) * | 2023-03-05 | 2023-08-29 | 广州网才信息技术有限公司 | 虚拟隐形键盘的在线考试方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
JP2024520943A (ja) | 2024-05-27 |
US20230244379A1 (en) | 2023-08-03 |
CN113253908B (zh) | 2023-04-25 |
EP4307096A1 (en) | 2024-01-17 |
CN113253908A (zh) | 2021-08-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022267760A1 (zh) | 按键功能执行方法、装置、设备及存储介质 | |
JP5702296B2 (ja) | ソフトウェアキーボード制御方法 | |
KR101270847B1 (ko) | 터치 감지 입력 장치용 제스처 | |
EP2717120B1 (en) | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications | |
JP2020052991A (ja) | ジェスチャ認識に基づく対話型ディスプレイの方法及び装置 | |
US8390577B2 (en) | Continuous recognition of multi-touch gestures | |
US20060026521A1 (en) | Gestures for touch sensitive input devices | |
US9063573B2 (en) | Method and system for touch-free control of devices | |
JP7233109B2 (ja) | タッチ感知面-ディスプレイによる入力方法、電子装置、触覚-視覚技術による入力制御方法及びシステム | |
Kolsch et al. | Multimodal interaction with a wearable augmented reality system | |
CN111596757A (zh) | 一种基于指尖交互的手势控制方法和装置 | |
Yin et al. | CamK: Camera-based keystroke detection and localization for small mobile devices | |
Zhang et al. | A novel human-3DTV interaction system based on free hand gestures and a touch-based virtual interface | |
WO2016018518A1 (en) | Optical tracking of a user-guided object for mobile platform user input | |
US10860120B2 (en) | Method and system to automatically map physical objects into input devices in real time | |
JP6232694B2 (ja) | 情報処理装置、その制御方法及びプログラム | |
US20190377423A1 (en) | User interface controlled by hand gestures above a desktop or a keyboard | |
Athira | Touchless technology | |
Chen | Universal Motion-based control and motion recognition | |
Lee et al. | Embodied interaction on constrained interfaces for augmented reality | |
KR20140086805A (ko) | 전자 장치, 그 제어 방법 및 컴퓨터 판독가능 기록매체 | |
HANI | Detection of Midair Finger Tapping Gestures and Their Applications | |
Xu et al. | Plug&touch: a mobile interaction solution for large display via vision-based hand gesture detection | |
李雲奭 | Finger identification-based hand gestures and point cloud-based 3D motion gestures for Natural User Interfaces | |
Padliya | Gesture Recognition and Recommendations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22827258 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022827258 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022827258 Country of ref document: EP Effective date: 20231012 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023571382 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |