WO2015110063A1 - Information processing method, apparatus and device - Google Patents

Information processing method, apparatus and device

Info

Publication number
WO2015110063A1 (PCT/CN2015/071424)
Authority
WO
WIPO (PCT)
Prior art keywords: finger, user, gesture information, information, gesture
Application number: PCT/CN2015/071424
Other languages: English (en), French (fr)
Inventor: 陈磊 (Chen Lei)
Original Assignee: 华为终端有限公司 (Huawei Device Co., Ltd.)
Application filed by 华为终端有限公司 (Huawei Device Co., Ltd.)
Priority to EP15740661.2A (granted as EP3089018B1)
Priority to BR112016017206A (granted as BR112016017206B1)
Priority to RU2016134720A (granted as RU2662408C2)
Priority to US15/114,033 (granted as US9965044B2)
Priority to KR1020167022600A (granted as KR101877823B1)
Priority to JP2016548222A (granted as JP6249316B2)
Publication of WO2015110063A1

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/163 Wearable computers, e.g. on a belt (under G06F 1/16 Constructional details or arrangements; G06F 1/1613 for portable computers)
    • G06F 3/014 Hand-worn input/output arrangements, e.g. data gloves (under G06F 3/01 Input arrangements for interaction between user and computer; G06F 3/011 arrangements for interaction with the human body)
    • G06F 3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0219 Special purpose keyboards (under G06F 3/02 Input arrangements using manually operated switches, e.g. keyboards or dials; G06F 3/0202 constructional details of the input device)
    • G06F 3/038 Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry (under G06F 3/03 converting position or displacement into a coded form; G06F 3/033 pointing devices such as mice)
    • G06F 3/0414 Digitisers, e.g. for touch screens or touch pads, using force sensing means to determine a position (under G06F 3/041 characterised by the transducing means)
    • G06F 3/04883 Inputting data by handwriting on a touch-screen or digitiser, e.g. gesture or text (under G06F 3/048 interaction techniques based on graphical user interfaces [GUI]; G06F 3/0487; G06F 3/0488 input of commands through traced gestures)
    • G06F 3/04886 Touch-screen or digitiser input by partitioning the display area or digitising-tablet surface into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present invention relates to the field of information processing technologies, and more particularly to an information processing method, apparatus and device.
  • In the prior art, in order to facilitate user interaction with smart devices, a gesture control arm ring has been developed.
  • the gesture control arm ring is worn on the user's wrist and realizes human-computer interaction by detecting the bioelectrical changes generated by the arm or wrist muscles when the user moves and by monitoring the physical movement of the arm.
  • the gesture control arm ring can trigger a certain operation through a specific action. For example, the user can clench a fist and then shake the wrist up and down three times to start the corresponding action of the device.
  • the present invention provides an information processing method, apparatus, and device to overcome the problem in the prior art that, because a specific triggering operation must be associated with a specific gesture action, the user needs to memorize the correspondence between multiple sets of gesture actions and operations.
  • the present invention provides the following technical solutions:
  • the present application discloses an information processing method, including:
  • the input mode includes a keyboard input mode and a mouse input mode;
  • the gesture information includes click gesture information and/or swipe gesture information
  • the input mode is a keyboard input mode
  • the 12 finger phalanges or finger joints on the user's index finger, middle finger, ring finger and little finger have a one-to-one correspondence with the 12 buttons on the 12-key dial keyboard
  • the acquiring the gesture information of the user in the input mode includes: acquiring, in the keyboard input mode, click gesture information of the user clicking any finger phalanx or finger joint of the index finger, middle finger, ring finger, and little finger;
  • the generating according to the corresponding relationship between the preset gesture information and the operation instruction, generating an operation instruction corresponding to the gesture information, including:
  • the acquiring the gesture information of the user in the input mode includes:
  • the generating according to the corresponding relationship between the preset gesture information and the operation instruction, generating an operation instruction corresponding to the gesture information, including:
  • the sliding gesture information corresponds to the movement track information of the mouse pointer; the click gesture information corresponds to the trigger information of the left or right mouse button.
  • the enabling an input mode corresponding to the mode start gesture information includes:
  • the keyboard input mode is activated when the mode start gesture information corresponds to a gesture in which the user's index finger, middle finger, ring finger, and little finger are spread apart;
  • the mouse input mode is activated when the mode start gesture information corresponds to a gesture in which the user's index finger, middle finger, ring finger, and little finger are close together.
  • the mode activation gesture information or gesture information is detected by a sensor disposed on a wrist or a palm of a user.
  • the acquiring the mode startup gesture information of the user or the gesture information of the user includes:
  • the mode start gesture information input by the user is determined according to the displacement amount.
  • the method further includes:
  • the operation instruction is sent to the terminal to facilitate the terminal to respond to the operation instruction.
  • an information processing method including:
  • the sliding gesture information corresponds to the movement track information of the mouse pointer; the click gesture information corresponds to the trigger information of the left or right mouse button.
  • an information processing apparatus which includes:
  • a first acquiring module configured to acquire mode startup gesture information of the user
  • a mode startup module configured to start an input mode corresponding to the mode startup gesture information;
  • the input mode includes a keyboard input mode and a mouse input mode;
  • a second acquiring module configured to acquire gesture information of the user in the input mode;
  • the gesture information includes click gesture information and sliding gesture information;
  • the command generating module is configured to generate an operation instruction corresponding to the gesture information according to a preset correspondence between the gesture information and the operation instruction.
  • when the input mode is a keyboard input mode, the 12 finger phalanges or finger joints on the user's index finger, middle finger, ring finger and little finger and the 12 buttons on the 12-key dial keyboard have a one-to-one correspondence;
  • the second acquiring module is specifically configured to:
  • acquire click gesture information of the user clicking any finger phalanx or finger joint of the index finger, the middle finger, the ring finger and the little finger.
  • the second acquiring module is specifically configured to:
  • acquire the user's sliding gesture information on the index finger, middle finger, ring finger and little finger and/or click gesture information of any finger phalanx or finger joint.
  • the sliding gesture information corresponds to the movement track information of the mouse pointer; the click gesture information corresponds to the trigger information of the left or right mouse button.
  • the mode starting module includes:
  • a first mode starting module configured to start a keyboard input mode when the mode start gesture information corresponds to a gesture of the user's index finger, middle finger, ring finger, and little finger spreading apart;
  • the second mode startup module is configured to start a mouse input mode when the mode start gesture information corresponds to a gesture of the user's index finger, middle finger, ring finger, and little finger closing together.
  • the mode activation gesture information and the gesture information are detected by a sensor disposed on a wrist or a palm of a user.
  • the first acquiring module or the second obtaining module includes:
  • an acquiring sub-module configured to acquire, through a sensor disposed on the user's wrist or palm, the pressure value of each part of the wrist or the palm when the user inputs a mode start gesture or another gesture;
  • a calculation module configured to determine a displacement amount of each part of the user's wrist or the palm according to the pressure value
  • a gesture determining module configured to determine a mode start gesture information input by the user according to the displacement amount.
  • the apparatus further includes:
  • an instruction transmission module configured to send the operation instruction to the terminal, so that the terminal responds to the operation instruction.
  • an information processing apparatus including:
  • a first input activation module for confirming that the system enters a keyboard input mode; wherein, in the keyboard input mode, the 12 finger phalanges or finger joints on the user's index finger, middle finger, ring finger and little finger and the 12 buttons on the 12-key dial keypad have a one-to-one correspondence;
  • a first gesture acquiring module configured to acquire click gesture information of the user clicking any finger phalanx or finger joint of the index finger, middle finger, ring finger, and little finger;
  • the first instruction generating module is configured to generate an operation instruction corresponding to the click gesture information according to a preset correspondence between the click gesture information and the operation instruction.
  • an information processing apparatus including:
  • a second input activation module for confirming that the system enters a mouse input mode
  • a second gesture acquiring module configured to acquire sliding gesture information of the user on the index finger, the middle finger, the ring finger, and the little finger and/or click gesture information of any finger phalanx or finger joint;
  • the second instruction generating module is configured to generate an operation instruction corresponding to the sliding gesture information and/or the click gesture information according to the preset sliding gesture information and/or the correspondence between the click gesture information and the operation instruction.
  • a data processing device comprising any one of the above information processing devices.
  • an intelligent terminal comprising any one of the above information processing devices.
  • the embodiment of the present invention discloses an information processing method, apparatus, and device, which first acquire the mode start gesture information input by the user, then enter the corresponding input mode according to that mode start gesture information, identify the user's gesture information in the determined input mode, and then identify the user's intention according to the correspondence between preset gesture information and operation instructions.
  • the above input modes include a keyboard input mode and a mouse input mode, so that, in an already familiar input mode environment, the user can conveniently control the terminal through input operations long used in the past, such as click operations and touch slide operations.
  • the method, apparatus and device do not require the user to memorize the correspondence between a plurality of specific gesture actions and specific operations; only the correspondence between the basic input operations that the user is accustomed to and standard keyboard and/or mouse operation events needs to be pre-implanted into the execution body of the information processing method, so that the user can control the terminal through habitual operation modes.
  • FIG. 1 is a flowchart of an information processing method according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of the correspondence between a 12-key dial keyboard and the 12 finger phalanges according to an embodiment of the present invention;
  • FIG. 3 is a flowchart of acquiring mode startup gesture information or gesture information of a user according to an embodiment of the present disclosure
  • FIG. 4 is a schematic view showing the position of a sensor disposed in a wristband according to an embodiment of the present invention
  • FIG. 5 is a flowchart of another information processing method according to an embodiment of the present invention.
  • FIG. 6 is a flowchart of a third information processing method according to an embodiment of the present invention.
  • FIG. 7 is a flowchart of a fourth information processing method according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of a first acquiring module according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of another information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic structural diagram of a third information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 12 is a schematic structural diagram of a fourth information processing apparatus according to an embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram of communication between an information processing device and a terminal according to an embodiment of the present invention.
  • FIG. 14 is a schematic structural diagram of an information processing device according to an embodiment of the present disclosure.
  • FIG. 15 is a schematic structural diagram of another information processing device according to an embodiment of the present invention.
  • FIG. 1 is a flowchart of an information processing method according to an embodiment of the present invention. Referring to FIG. 1, the method may include:
  • Step 101 Acquire mode start gesture information of the user.
  • each input mode corresponds to a mode start gesture.
  • the execution body of the information processing method disclosed in the embodiment of the present invention can automatically recognize the input mode the user currently wants to use according to the mode start gesture information input by the user, which is convenient for the user.
  • Step 102 Start an input mode corresponding to the mode start gesture information.
  • the input mode may include a keyboard input mode and a mouse input mode.
  • the keyboard input mode described here means that in this mode, when the user clicks on a certain part of a finger, a character on a known keyboard can be directly triggered; the character can be a number, an English letter, or a fixed-function symbol, for example, the "#" key. Of course, this requires pre-configuring the correspondence between the gesture information of the user clicking or triggering a certain part of the finger and the various key positions on the known keyboard.
  • the above mouse input mode means that in this mode, the user can perform corresponding operations on the fingers or the palm as if using a mouse; the operations may include sliding operations and click operations, for example, performed by the user's thumb on the fingers or the palm.
  • Step 103 Acquire gesture information of the user in the input mode.
  • the gesture information may include click gesture information and/or swipe gesture information.
  • the same gesture information may have different processing responses in different input modes: gesture information that receives a processing response in one input mode may receive none in another.
  • for example, in the keyboard input mode, the click gesture information of the user clicking a certain part of the hand determines the character the user wants to trigger; in this mode, if the user inputs sliding gesture information, the system does not respond to it. If the user inputs sliding gesture information in the mouse input mode, however, the sliding gesture information receives a processing response, because it corresponds to the mouse arrow moving operation. This correspondence conforms to the user's existing input habits with a mouse on a computer interface or directly on a touch display interface: for example, the user can slide the thumb up, down, left and right on the palm to get the effect of turning pages up, down, left and right on a touch screen.
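  • as a minimal illustration of this mode-dependent handling, the following Python sketch (not part of the patent; the event names and the mapping excerpts are assumptions) returns a response only when the active input mode defines one for the gesture:

      # Hypothetical gesture events: kind is "click" or "slide"; position names a
      # phalanx or fingertip; track is a sampled thumb trajectory for slides.
      KEY_FOR_POSITION = {("index", 1): "1", ("index", 2): "2"}  # excerpt of the keypad table
      BUTTON_FOR_POSITION = {"index_fingertip": "left", "middle_fingertip": "right"}

      def respond(mode, kind, position=None, track=None):
          """Return the response for a gesture under the active input mode, or None."""
          if mode == "keyboard" and kind == "click":
              return ("key", KEY_FOR_POSITION.get(position))  # trigger a keypad character
          if mode == "mouse" and kind == "slide":
              return ("pointer_move", track)                  # slide moves the mouse arrow
          if mode == "mouse" and kind == "click":
              return ("button", BUTTON_FOR_POSITION.get(position))
          return None  # e.g. a slide in keyboard mode draws no response

      print(respond("keyboard", "slide", track=[(0, 0), (5, 2)]))  # None: ignored
      print(respond("mouse", "slide", track=[(0, 0), (5, 2)]))     # pointer movement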
  • Step 104 Generate an operation instruction corresponding to the gesture information according to a preset correspondence between the gesture information and the operation instruction.
  • the operation instruction generated according to the gesture information of the user may be transmitted to the terminal, so that the terminal performs a response operation according to the operation instruction.
  • the information processing method first acquires the mode start gesture information input by the user, then enters the corresponding input mode according to that mode start gesture information, identifies the user's gesture information in the determined input mode, and further identifies the user's intention according to the correspondence between preset gesture information and operation instructions.
  • the above input modes include a keyboard input mode and a mouse input mode, so that, in an already familiar input mode environment, the user can conveniently control the terminal through input operations long used in the past, such as click operations and touch slide operations.
  • the method does not require the user to memorize the correspondence between a plurality of specific gesture actions and specific operations; only the correspondence between the user's accustomed basic input operations and standard keyboard and mouse operation events needs to be pre-implanted into the execution body of the information processing method, making it possible for the user to perform input operations conveniently and control the terminal through habitual operation modes.
  • when the mode start gesture information acquired by the system is the mode start gesture corresponding to the keyboard input mode, the system starts the keyboard input mode, in which the 12 finger phalanges or finger joints on the user's index finger, middle finger, ring finger and little finger can have a one-to-one correspondence with the 12 keys on the 12-key dial pad.
  • the 12-key dial keypad includes the 1, 2 (abc), 3 (def), 4 (ghi), 5 (jkl), 6 (mno), 7 (pqrs), 8 (tuv), 9 (wxyz), *, 0 and # keys. In the structure of the human hand, the index finger, middle finger, ring finger and little finger each include 3 phalanges and 3 joints, for a total of 12 phalanges and 12 finger joints; the 4*3 array of the 12 buttons on the 12-key dial pad is therefore the same as the 4*3 array formed by the phalanges or finger joints of the index finger, middle finger, ring finger and little finger.
  • FIG. 2 is a schematic diagram of the correspondence between the 12-key dial keyboard and the 12 finger phalanges disclosed in the embodiment of the present invention. Referring to FIG. 2, the first phalanx of the index finger corresponds to the "1" button, the second phalanx of the index finger corresponds to the "2" button, and so on, until the third phalanx of the little finger corresponds to the "#" button.
  • the correspondence between the buttons on the 12-key dial keyboard and the 12 phalanges or finger joints is not limited to the above manner; for example, the three phalanges of the little finger may instead correspond to the "1, 2, 3" buttons and the three phalanges of the index finger to the "*, 0, #" buttons. The mapping can be set according to the user's preferences and habits, as the sketch below illustrates.
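  • a sketch of one such pre-configured correspondence, following the FIG. 2 layout described above (the (finger, phalanx) tuple encoding is illustrative, and the middle rows are inferred from the 4*3 row order):

      # Rows are fingers, columns are the three phalanges of each finger,
      # mirroring the 4*3 layout of the 12-key dial keyboard.
      KEYPAD = {
          ("index", 1): "1",  ("index", 2): "2",  ("index", 3): "3",
          ("middle", 1): "4", ("middle", 2): "5", ("middle", 3): "6",
          ("ring", 1): "7",   ("ring", 2): "8",   ("ring", 3): "9",
          ("little", 1): "*", ("little", 2): "0", ("little", 3): "#",
      }

      def key_for_click(finger: str, phalanx: int) -> str:
          """Translate a click on a phalanx into the corresponding keypad character."""
          return KEYPAD[(finger, phalanx)]

      assert key_for_click("index", 1) == "1"
      assert key_for_click("little", 3) == "#"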
  • acquiring the gesture information of the user in the input mode described in step 103 may include: acquiring, in the keyboard input mode, click gesture information of the user clicking any finger phalanx or finger joint of the index finger, middle finger, ring finger, and little finger.
  • generating an operation instruction corresponding to the gesture information according to the preset correspondence between gesture information and operation instructions may include: generating, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
  • each button on the 12-key dial keyboard is associated with one of the 12 finger phalanges or finger joints, and the correspondence is pre-configured in the system. Since the user is already familiar with the use of the 12-key dial keyboard and can accurately sense the position of each phalanx and finger joint, there is no need to remember the correspondence between each button and a fixed gesture action, and information can be input easily and quickly.
  • in the mouse input mode, the acquiring the gesture information of the user in the input mode described in step 103 may include: acquiring sliding gesture information of the user on the index finger, the middle finger, the ring finger, and the little finger and/or click gesture information of any finger phalanx or finger joint.
  • generating an operation instruction corresponding to the gesture information according to the preset correspondence between gesture information and operation instructions may include: generating, according to the preset correspondence between sliding gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the sliding gesture information and/or the click gesture information.
  • the sliding gesture information of the user may be pre-configured to correspond to the movement track information of the mouse pointer; the click gesture information of the user corresponds to the trigger information of the left or right mouse button.
  • for example, the user can close the four fingers other than the thumb to form a "plane", and move the thumb on this "plane" to simulate the movement of a mouse on a display screen.
  • this also requires that the correspondence between the user's gesture information and standard mouse operation events be configured in advance. Considering the user's existing habits of operating a pad or a mouse, the user's leftward sliding gesture information can correspond directly to leftward movement of the mouse arrow, the upward sliding gesture information to upward movement of the mouse arrow, and so on.
  • in effect, the user's thumb acts as a mouse on the "plane" formed by the four fingers, and the "plane" acts as a display. It is also possible to associate the gesture information of the user's thumb on the "plane" with the operation events of a physical touch screen, so that the user experiences operating a touch screen on the "plane" formed by the four fingers.
  • the mouse also includes a left button and a right button.
  • in the mouse input mode, gesture actions of the user clicking certain parts of the index finger, the middle finger, the ring finger and the little finger may be defined as left or right mouse button trigger operations.
  • for example, the gesture action of the user clicking the fingertip of the index finger may be defined as a left mouse button trigger operation, and the gesture action of clicking the fingertip of the middle finger as a right mouse button trigger operation. There is no limit here.
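  • such a binding could be sketched as follows, using the example assignment above (the position names are illustrative):

      # Illustrative fingertip-to-button binding; the patent leaves the exact
      # assignment open ("there is no limit here").
      MOUSE_BUTTONS = {"index_fingertip": "left", "middle_fingertip": "right"}

      def mouse_click(position: str):
          """Translate a fingertip tap into a mouse button trigger, if one is bound."""
          button = MOUSE_BUTTONS.get(position)
          return ("button_down", button) if button else None

      assert mouse_click("index_fingertip") == ("button_down", "left")
      assert mouse_click("ring_fingertip") is None  # unbound position: no response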
  • several fixed pieces of the user's gesture information are associated with several basic mouse operations, so that the user can interpret the gesture actions performed on the hand directly in terms of the well-known mouse or touch-screen operation modes; the input operation is convenient and fast, and does not require the user to remember the correspondence between multiple sets of gesture actions and operations.
  • the mode start gesture corresponding to different input modes is not fixed.
  • the starting an input mode corresponding to the mode start gesture information described in step 102 of the first embodiment may specifically be: when the mode start gesture information corresponds to a gesture of the user spreading the index finger, middle finger, ring finger and little finger apart, the keyboard input mode is activated; when the mode start gesture information corresponds to a gesture of the user closing the index finger, middle finger, ring finger and little finger together, the mouse input mode is activated.
  • corresponding the gesture of spreading the user's four fingers apart to the keyboard input mode exposes the individual phalanges for clicking, while corresponding the gesture of bringing the user's index finger, middle finger, ring finger and little finger together to the mouse input mode lets the "plane" formed by the four closed fingers simulate a display screen. This is closer to the user's operating habits.
  • the mode start gestures of the keyboard input mode and the mouse input mode are not fixedly defined.
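  • a minimal sketch of this embodiment's mode selection (spread fingers start the keyboard input mode, closed fingers the mouse input mode); the gap measurements and the threshold are assumptions:

      def select_input_mode(finger_gaps_mm, spread_threshold=8.0):
          """Classify the mode start gesture from the gaps between adjacent fingers:
          fingers spread apart -> keyboard input mode, held together -> mouse input mode."""
          spread = all(gap >= spread_threshold for gap in finger_gaps_mm)
          return "keyboard" if spread else "mouse"

      print(select_input_mode([12.0, 10.5, 11.2]))  # keyboard: all gaps wide
      print(select_input_mode([1.0, 0.5, 1.2]))     # mouse: fingers close together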
  • the mode activation gesture information or gesture information may be detected by a sensor disposed on a wrist or a palm of a user.
  • the sensor disposed on the wrist or the palm of the user can detect a mode start gesture action or a gesture action on the hand with the sensor.
  • FIG. 3 is a flowchart of acquiring the mode start gesture information or gesture information of a user according to an embodiment of the present disclosure. Referring to FIG. 3, the process of acquiring the mode start gesture information of the user, and likewise the process of acquiring the gesture information of the user, may include:
  • Step 301 Acquire, through a sensor disposed on the user's wrist or palm, the pressure value of each part of the wrist or the palm when the user performs a mode start gesture action or another gesture action;
  • the sensor disposed on the wrist of the user may be disposed in a wristband, and the sensor disposed on the palm of the user may be disposed in a glove or a mitt.
  • when the sensor is disposed on the user's wrist, the user's gesture action can be determined from the differences in pressure caused by the muscle changes and motion changes of the various parts of the wrist as the user performs different gestures.
  • when the sensor is disposed on the user's palm, the gesture action can likewise be determined from the pressure differences caused by muscle and motion changes in the various parts of the palm. If the sensor is disposed on the user's finger, it can directly detect the contact information of the user's thumb, thereby determining the gesture information of the user.
  • the sensor can be used not only to detect the sensing information of the mode start gesture, but also to detect, after the system enters the keyboard input mode or the mouse input mode, the sensing information of the user's gestures related to slide or click operations on the palm or fingers.
  • Step 302 Determine, according to the pressure value, a displacement amount of a sensor on a wrist or a palm of the user;
  • the greater the pressure change, the greater the displacement. Since the user wears the device carrying the sensor, the contact pressure between the device and the skin of the user's hand has a substantially stable initial value while the user performs no gesture action. During a gesture movement, the pressure value at a certain part of the hand may become larger or smaller, and the displacement of the hand muscle may also differ in vector direction; the greater the pressure change, the greater the displacement of the sensor from its original position. In this embodiment, multiple sensors may be disposed on the user's wrist or palm, at different locations as needed, to improve the accuracy of the detection result.
  • Step 303 Determine mode start gesture information input by the user according to the displacement amount.
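  • steps 301 to 303 can be sketched as below; the linear pressure-to-displacement scaling and the nearest-template matching are assumptions standing in for whatever model the device actually uses:

      import math

      def displacements(pressures, baselines, scale=0.1):
          """Step 302: estimate each sensor's displacement from its pressure change
          (the larger the pressure change, the larger the displacement)."""
          return [scale * (p - b) for p, b in zip(pressures, baselines)]

      def match_gesture(disp, templates):
          """Step 303: pick the stored gesture whose displacement pattern is nearest."""
          return min(templates, key=lambda t: math.dist(disp, t["pattern"]))["name"]

      # Hypothetical readings from four wristband sensors.
      baselines = [10.0, 10.0, 10.0, 10.0]
      templates = [
          {"name": "keyboard_start", "pattern": [0.2, -0.1, 0.15, 0.05]},
          {"name": "mouse_start",    "pattern": [-0.15, 0.2, -0.05, 0.1]},
      ]
      disp = displacements([12.1, 9.0, 11.4, 10.6], baselines)
      print(match_gesture(disp, templates))  # "keyboard_start" for these values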
  • FIG. 4 is a schematic diagram showing the position of a sensor disposed in a wristband according to an embodiment of the present invention.
  • multiple pressure sensors can be disposed in the wristband, and the reading of each sensor changes as the user's gesture changes.
  • the positional offset between a sensor and its initial position (the position of the sensor in the natural state, when the user does nothing) is obtained based on the measured pressure change value.
  • the user's gesture action can then be determined from the data detected by the sensors. Because the changes in the muscle groups are more obvious closer to the fingers, the sensors in the wristband can be placed close to the fingers.
  • because the position at which the wristband is worn may differ from one use to the next, the data detected by the sensors in the wristband may differ each time, which affects the accuracy of the detection result.
  • to address this, the following two solutions are given.
  • the first way is to build a database that is as comprehensive as possible, and to preserve the correspondence between sensor data and gestures when the wristband is in different positions.
  • the second way is to set multiple rings of sensors in the wristband.
  • in the first use, the data detected by the ring of sensors located in the middle position can be used, and the relevant calibration work completed, to ensure that this middle ring of sensors can accurately identify the user's gestures.
  • in subsequent uses, each ring of sensors on the wristband can directly judge the current position of the wristband according to the relevant displacement data (the thickness of the wrist differs at different positions, so the cross-sectional shape at each ring of sensors can be calculated from the detected displacement data to estimate the position of the wristband on the wrist), and the offset between the current sensor positions and the positions used in the first use can be calculated, thus completing the deviation correction.
  • alternatively, during use the wristband can, according to the current position information of each ring of sensors, automatically find the ring of sensors nearest to the middle position used in the first use, and use the data detected by that ring of sensors.
  • the calibration process may be: 1. the user puts on the wristband; 2. the wristband and the terminal establish communication; 3. the calibration mode is initiated; 4. the calibration interface is displayed on the display screen of the terminal; 5. the wristband notifies the terminal to display the "1" button; 6. the user touches the first phalanx of the index finger with the thumb of the hand wearing the wristband, indicating that the button "1" is pressed, and holds it until the terminal recognizes it; 7. the wristband records the current position information of the sensors as a reference, completing the calibration of the button "1". The remaining buttons are calibrated in the same way, as sketched below.
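  • the per-button calibration loop might look like the following sketch; wristband and terminal are hypothetical interfaces standing in for the communication link described above:

      DIAL_KEYS = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#"]

      def calibrate(wristband, terminal):
          """Record reference sensor positions for each dial key (steps 3-7 above)."""
          references = {}
          for key in DIAL_KEYS:
              terminal.show_key(key)                # step 5: terminal displays the key
              reading = wristband.wait_for_touch()  # step 6: user holds the matching phalanx
              references[key] = reading             # step 7: store as this key's reference
              terminal.confirm_recognized(key)
          return references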
  • the method for acquiring the mode startup gesture information of the user is also applicable to acquiring other gesture information of the user.
  • a specific method for acquiring mode start gesture information is given, but this is not the only method for obtaining user gesture information.
  • for example, the method of recognizing user gesture information through bioelectricity can also be used, and the sensors can likewise be provided in gloves.
  • FIG. 5 is a flowchart of another information processing method according to an embodiment of the present invention. As shown in FIG. 5, the information processing method may include:
  • Step 501 Acquire mode start gesture information of the user.
  • Step 502 Start an input mode corresponding to the mode start gesture information.
  • the input mode includes a keyboard input mode and a mouse input mode.
  • Step 503 Acquire gesture information of the user in the input mode.
  • the gesture information includes click gesture information and/or swipe gesture information
  • Step 504 Generate an operation instruction corresponding to the gesture information according to a preset correspondence between the gesture information and the operation instruction.
  • Step 505 Send the operation instruction to the terminal.
  • Step 505 sends the operation instruction to the terminal, so that the terminal responds to the operation instruction to implement human-computer interaction.
  • the user can implement convenient control of the terminal through input operation modes that are conventionally used, such as a click operation and a touch slide operation.
  • the method does not require the user to memorize the correspondence between a plurality of specific gesture actions and specific operations; only the correspondence between the user's accustomed basic input operations and standard keyboard and mouse operation events needs to be pre-implanted into the execution body of the information processing method, making convenient input operation and control of the terminal possible.
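  • step 505 might be sketched as follows; JSON over a plain TCP socket is an assumption, since the patent only requires some link (for example Bluetooth) between the device and the terminal:

      import json
      import socket

      def send_instruction(instruction, address=("192.168.0.10", 9000)):
          """Serialize an operation instruction and push it to the terminal,
          which then performs the response operation."""
          payload = json.dumps(instruction).encode("utf-8")
          with socket.create_connection(address, timeout=2.0) as conn:
              conn.sendall(payload)

      send_instruction({"type": "key_press", "key": "5"})  # e.g. a tap on the middle finger's second phalanx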
  • FIG. 6 is a flowchart of a third information processing method according to an embodiment of the present invention. As shown in FIG. 6, the information processing method may include:
  • Step 601 Confirm entering the keyboard input mode
  • the 12 finger phalanges or finger joints on the index finger, the middle finger, the ring finger and the little finger of the user have a one-to-one correspondence with the 12 buttons on the 12-key dial keypad.
  • Step 602 Acquire click gesture information of the user clicking any finger phalanx or finger joint of the index finger, the middle finger, the ring finger and the little finger.
  • Step 603 Generate an operation instruction corresponding to the click gesture information according to a preset correspondence between the click gesture information and the operation instruction.
  • each button on the 12-key dial keyboard is associated with one of the 12 finger phalanges or finger joints, and the correspondence is pre-configured in the system. Since the user is already familiar with the use of the 12-key dial keyboard and can accurately sense the position of each phalanx and finger joint, there is no need to remember the correspondence between each button and a fixed gesture action, and information can be input easily and quickly.
  • FIG. 7 is a flowchart of a fourth information processing method according to an embodiment of the present invention. As shown in FIG. 7, the method may include:
  • Step 701 Confirm entering the mouse input mode
  • Step 702 Acquire sliding gesture information of the user on the index finger, the middle finger, the ring finger and the little finger and/or click gesture information of any finger phalanx or finger joint;
  • Step 703 Generate an operation instruction corresponding to the sliding gesture information and/or the click gesture information according to the preset sliding gesture information and/or the corresponding relationship between the click gesture information and the operation instruction.
  • the sliding gesture information corresponds to the movement track information of the mouse pointer; the click gesture information corresponds to the trigger information of the left or right mouse button.
  • several fixed pieces of the user's gesture information are associated with several basic mouse operations, so that the user can interpret the gesture actions performed on the hand directly in terms of the well-known mouse or touch-screen operation modes; the input operation is convenient and fast, and does not require the user to remember the correspondence between multiple sets of gesture actions and operations.
  • FIG. 8 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention. As shown in FIG. 8, the information processing apparatus 80 may include:
  • the first obtaining module 801 is configured to acquire mode startup gesture information of the user
  • each input mode corresponds to a mode start gesture.
  • a mode starting module 802 configured to start an input mode corresponding to the mode start gesture information
  • the input mode includes a keyboard input mode and a mouse input mode.
  • the keyboard input mode described here means that in this mode, when the user clicks on a certain part of the palm, a character on a known keyboard can be directly triggered; the character can be a number, an English letter, or a fixed-function symbol.
  • the above mouse input mode means that in this mode, the user can operate on the palm like a mouse, and the operation can include a slide operation and a click operation. Of course, the above two situations require pre-configured correspondence between user gesture information and standard keyboard or mouse operation events.
  • a second obtaining module 803, configured to acquire gesture information of the user in the input mode
  • the gesture information includes click gesture information and swipe gesture information.
  • the same gesture information may have different processing responses in different input modes: gesture information that receives a processing response in one input mode may receive none in another.
  • the command generating module 804 is configured to generate an operation instruction corresponding to the gesture information according to a preset correspondence between the gesture information and the operation instruction.
  • the information processing apparatus first acquires the mode start gesture information input by the user, then enters the corresponding input mode according to that mode start gesture information, identifies the user's gesture information in the determined input mode, and further identifies the user's intention according to the correspondence between preset gesture information and operation instructions.
  • the above input modes include a keyboard input mode and a mouse input mode, so that, in an already familiar input mode environment, the user can conveniently control the terminal through input operations long used in the past, such as click operations and touch slide operations.
  • the apparatus does not require the user to memorize the correspondence between a plurality of specific gesture actions and specific operations; only the correspondence between the user's accustomed basic input operations and standard keyboard and mouse operation events needs to be pre-implanted into the apparatus, making it possible for the user to perform input operations conveniently and control the terminal through habitual operation modes.
  • the 12 finger phalanges or finger joints on the index finger, the middle finger, the ring finger and the little finger of the user have a one-to-one correspondence with the 12 buttons on the 12-key dial keypad;
  • the second acquiring module may be specifically configured to: acquire, in the keyboard input mode, click gesture information of the user clicking any finger phalanx or finger joint of the index finger, middle finger, ring finger, and little finger.
  • the 4*3 array of the 12 buttons on the 12-key dial keyboard is the same as the 4*3 array formed by the phalanges or finger joints of the index finger, the middle finger, the ring finger and the little finger in the human hand structure, so the 12 buttons on the dial pad can correspond one-to-one with the 12 phalanges or finger joints.
  • the specific order is not limited and may be set according to user preferences and habits.
  • each button on the 12-key dial keyboard is associated with one of the 12 finger phalanges or finger joints, and the correspondence is pre-configured in the system. Since the user is already familiar with the use of the 12-key dial keyboard and can accurately sense the position of each phalanx and finger joint, there is no need to remember the correspondence between each button and a fixed gesture action, and information can be input easily and quickly.
  • the second acquiring module may be specifically configured to: acquire, in the mouse input mode, the sliding gesture information of the user on the index finger, the middle finger, the ring finger, and the little finger and/or click gesture information of any finger phalanx or finger joint.
  • the sliding gesture information of the user may be pre-configured to correspond to the movement track information of the mouse pointer; the click gesture information of the user corresponds to the trigger information of the left or right mouse button.
  • for example, the gesture action of the user clicking the fingertip of the index finger may be defined as a left mouse button trigger operation, and the gesture action of clicking the fingertip of the middle finger as a right mouse button trigger operation.
  • several fixed pieces of the user's gesture information are associated with several basic mouse operations, so that the user can interpret the gesture actions performed on the hand directly in terms of the well-known mouse or touch-screen operation modes; the input operation is convenient and fast, and does not require the user to remember the correspondence between multiple sets of gesture actions and operations.
  • the mode startup module may include a first mode startup module and a second mode startup module.
  • the first mode starting module may be configured to: when the mode start gesture information corresponds to a gesture of the user spreading the index finger, middle finger, ring finger and little finger apart, start the keyboard input mode.
  • the second mode starting module may be configured to: when the mode start gesture information corresponds to a gesture of the user closing the index finger, middle finger, ring finger and little finger together, start the mouse input mode.
  • corresponding the gesture of spreading the user's four fingers apart to the keyboard input mode exposes the individual phalanges for clicking, while corresponding the gesture of bringing the user's index finger, middle finger, ring finger and little finger together to the mouse input mode lets the "plane" formed by the four closed fingers simulate a display screen. This is closer to the user's operating habits.
  • the mode start gestures of the keyboard input mode and the mouse input mode are not fixedly defined.
  • the first obtaining module 801 may be specifically configured to: acquire mode startup gesture information of a user detected by a sensor disposed on a wrist or a palm of a user.
  • the second acquiring module may be specifically configured to: acquire gesture information of the user detected by a sensor disposed on a wrist or a palm of the user.
  • the sensor disposed on the wrist or palm of the user can detect gesture motion information on the hand with the sensor.
  • FIG. 9 is a schematic structural diagram of a first acquiring module according to an embodiment of the present invention.
  • the first acquiring module 801 may specifically include:
  • the acquiring sub-module 901 is configured to acquire, through a sensor disposed on the user's wrist or palm, the pressure value of each part of the wrist or the palm when the user inputs a mode start gesture or another gesture;
  • the sensor disposed on the wrist of the user may be disposed in a wristband, and the sensor disposed on the palm of the user may be disposed in a glove or a mitt.
  • when the sensor is disposed on the user's wrist, the user's gesture action can be determined from the differences in pressure caused by the muscle changes and motion changes of the various parts of the wrist as the user performs different gestures.
  • when the sensor is disposed on the user's palm, the gesture action can likewise be determined from the pressure differences caused by muscle and motion changes in the various parts of the palm. If the sensor is disposed on the user's finger, it can directly detect the contact information of the user's thumb, thereby determining the gesture information of the user.
  • the calculating module 902 is configured to determine, according to the pressure value, a displacement amount of each part of the wrist or the palm of the user;
  • the displacement may also differ in vector direction; the greater the pressure change, the greater the displacement of the sensor from its original position.
  • the gesture determining module 903 is configured to determine mode start gesture information input by the user according to the displacement amount.
  • the sensor in the wristband can be placed close to the finger.
  • a specific method for acquiring mode start gesture information is given, but this is not the only method for obtaining user gesture information.
  • for example, the method of recognizing user gesture information through bioelectricity can also be used, and the sensors can likewise be provided in gloves.
  • FIG. 10 is a schematic structural diagram of another information processing apparatus according to an embodiment of the present invention.
  • the information processing apparatus 100 may include:
  • the first obtaining module 801 is configured to acquire mode startup gesture information of the user
  • a mode starting module 802 configured to start an input mode corresponding to the mode start gesture information
  • the input mode includes a keyboard input mode and a mouse input mode
  • the second obtaining module 803 is configured to acquire gesture information of the user in the input mode.
  • the gesture information includes click gesture information and swipe gesture information.
  • the command generating module 804 is configured to generate an operation instruction corresponding to the gesture information according to a preset correspondence between the gesture information and the operation instruction;
  • the instruction transmission module 1001 is configured to send the operation instruction to the terminal, so that the terminal responds to the operation instruction.
  • the instruction transmission module 1001 sends the operation instruction to the terminal, so that the terminal responds to the operation instruction to implement human-computer interaction.
  • the user can implement convenient control of the terminal through input operation modes that are conventionally used, such as a click operation and a touch slide operation.
  • the method does not require the user to memorize the correspondence between a plurality of specific gesture actions and specific operations; only the correspondence between the user's accustomed basic input operations and standard keyboard and mouse operation events needs to be pre-implanted into the execution body of the information processing method, so that the human-computer interaction process is realized conveniently.
  • FIG. 11 is a schematic structural diagram of a third information processing apparatus according to an embodiment of the present invention.
  • the information processing apparatus 110 may include:
  • the first input activation module 1101 is configured to confirm that the system enters a keyboard input mode; in the keyboard input mode, the 12 finger phalanges or finger joints on the user's index finger, middle finger, ring finger and little finger and the 12 buttons on the 12-key dial keyboard have a one-to-one correspondence;
  • the first gesture acquiring module 1102 is configured to acquire click gesture information of the user clicking any finger phalanx or finger joint of the index finger, middle finger, ring finger, and little finger;
  • the first instruction generating module 1103 is configured to generate an operation instruction corresponding to the click gesture information according to a preset correspondence between the click gesture information and the operation instruction.
  • each button on the 12-key dial keyboard is associated with one of the 12 finger phalanges or finger joints, and the correspondence is pre-configured in the system. Since the user is already familiar with the use of the 12-key dial keyboard and can accurately sense the position of each phalanx and finger joint, there is no need to remember the correspondence between each button and a fixed gesture action, and information can be input easily and quickly.
  • FIG. 12 is a schematic structural diagram of a fourth information processing apparatus according to an embodiment of the present invention. As shown in FIG. 12, the information processing apparatus 120 may include:
  • a second input activation module 1201, configured to confirm that the system enters a mouse input mode
  • the second gesture acquiring module 1202 is configured to acquire slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange;
  • the second instruction generating module 1203 is configured to generate, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
  • the slide gesture information may correspond to movement track information of the mouse pointer; the click gesture information may correspond to trigger information of the left or right mouse button.
  • several fixed gestures of the user are associated with the basic mouse operations, so that the user can perform input directly through gesture actions on the palm in the familiar mouse or touchscreen operation style;
  • this makes input convenient and fast, without requiring the user to remember correspondences between many groups of gesture actions and operations; a sketch of one possible mapping follows.
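As a hedged illustration of the mouse input mode, the sketch below maps a slide gesture to pointer-movement track information and a click gesture to a button trigger; the function names, the instruction strings, and the fingertip-for-left / finger-root-for-right button assignment are one possible configuration drawn from the embodiments, not a fixed rule.

```python
from typing import Optional

def on_slide(dx: float, dy: float) -> str:
    # Slide of the thumb across the four closed fingers -> pointer movement track.
    return f"MOUSE_MOVE:{dx:+.1f},{dy:+.1f}"

def on_click(part: str) -> Optional[str]:
    if part == "fingertip":      # tap on any fingertip -> left-button trigger
        return "MOUSE_LEFT_CLICK"
    if part == "finger_root":    # tap on any finger root -> right-button trigger
        return "MOUSE_RIGHT_CLICK"
    return None                  # other positions: no instruction in this sketch

print(on_slide(-4.0, 0.0))       # thumb slides left -> pointer moves left
print(on_click("fingertip"))     # -> MOUSE_LEFT_CLICK
```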
  • an embodiment of the present invention further discloses an information processing device, which includes any one of the information processing apparatuses disclosed in the foregoing embodiments. Since it includes any one of those apparatuses, the information processing device can likewise first acquire the mode start gesture information input by the user, then enter the corresponding input mode according to that information, recognize the user's gesture information in the determined input mode, and identify the user's intention according to the preset correspondence between gesture information and operation instructions.
  • the input modes include a keyboard input mode and a mouse input mode, so that in an already familiar input-mode environment the user can conveniently control the terminal using only habitual input operations such as click operations and touch-slide operations.
  • the device does not require the user to memorize correspondences between many specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and mouse operation events only needs to be preconfigured in the execution body of the information processing method for the user to control the terminal through habitual operations.
  • the information processing device may be any device that has a processing function and can be worn on the user's hand.
  • the information processing device can perform information interaction with the terminal by using a wireless communication method or a Bluetooth method.
  • the information processing device may transmit the operation instruction generated according to the gesture information of the user to the terminal, so that the terminal performs a response operation according to the operation instruction.
  • FIG. 13 is a schematic diagram of communication between an information processing device and a terminal according to an embodiment of the present invention.
  • the information processing device may also be an intelligent terminal that integrates the above information processing device and terminal functions, such as a portable smart terminal.
  • taking a smart watch as an example of the portable smart terminal:
  • the sensors can be disposed on the watch strap, and the sensors on the strap can determine the user's gesture actions from the differences in pressure caused by muscle changes and motion changes of the wrist when the user performs different gestures. In this way, as long as the muscle groups of the user's wrist change with the movement of the fingers, the user's gesture can be determined from the data detected by these sensors, and the sensors can then transmit the detected gesture information to the information processing apparatus; one plausible classification scheme is sketched below.
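One plausible way for strap sensors to turn pressure readings into a recognized gesture is nearest-neighbour matching against per-gesture pressure profiles captured during calibration. The sketch below assumes four sensors and invented template values; the publication itself leaves the classification method open.

```python
# Hedged sketch: classify a gesture by comparing the current strap-sensor
# pressure readings against per-gesture templates recorded during calibration.
import math

TEMPLATES = {                      # gesture -> expected pressure profile (4 sensors)
    "fingers_spread":   [1.2, 0.8, 1.5, 0.9],
    "fingers_together": [0.7, 1.1, 0.6, 1.4],
}

def classify(readings, templates=TEMPLATES):
    def dist(a, b):                # Euclidean distance between profiles
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda g: dist(readings, templates[g]))

print(classify([1.1, 0.9, 1.4, 1.0]))   # -> fingers_spread
```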
  • the information processing apparatus does not need to transmit the operation instruction to the smart watch through a wireless communication module or a Bluetooth module; it only needs to transmit the operation instruction over the communication line inside the smart watch, and after receiving the operation instruction the smart watch can respond to it normally.
  • when the information processing device is an intelligent terminal that integrates the above information processing apparatus and terminal functions, such as a portable smart terminal, the accuracy of input can also be improved.
  • because of its portability, such a terminal has a small input interface; taking entering a phone number as an example, when the finger inputs the digits 1, 2, ..., 9, 0, the small input interface often prevents the digits from being entered accurately.
  • with the information processing method of the embodiments of the present invention, each digit has fixed gesture information corresponding to it, so the digit the user wants to input can be determined accurately by recognizing the user's gesture information, which greatly reduces or even avoids situations in which the user cannot input information accurately.
  • FIG. 14 is a schematic structural diagram of an information processing device according to an embodiment of the present invention.
  • the information processing device 140 may include: a sensor 1401, a processor 1402, a communication device 1403, a memory 1404, and a bus 1405.
  • the sensor 1401, the processor 1402, the communication device 1403, and the memory 1404 complete communication with each other via the bus 1405.
  • the sensor 1401 is configured to collect gesture information of a user.
  • the sensor 1401 may be a contact sensor, a pressure sensor, a bioelectrostatic sensor, or the like; any sensor capable of detecting the user's different mode start gesture information and gesture information can be applied to this embodiment.
  • the memory 1404 is configured to store a set of program instructions.
  • the memory may be a high-speed RAM memory, or may be a non-volatile memory, for example, at least one magnetic disk memory.
  • the processor 1402 is configured to invoke the program instructions stored in the memory 1404 to perform the following operations: acquiring mode start gesture information of the user, where the mode start gesture information may be detected by the sensor and then transmitted to the processor 1402; starting an input mode corresponding to the mode start gesture information; acquiring gesture information of the user in the input mode; and generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information;
  • the input modes include a keyboard input mode and a mouse input mode;
  • the gesture information includes click gesture information and/or slide gesture information;
  • each input mode corresponds to one mode start gesture, so the processor 1402 can automatically recognize, according to the mode start gesture information input by the user, the input mode the user currently wants, which is convenient for the user.
  • the keyboard input mode here means that, in this mode, when the user clicks a certain part of a finger, a character on a known keyboard can be triggered directly; the character may be a digit, an English letter, or a symbol with a fixed function, for example, the "#" key. Of course, this requires preconfiguring the correspondence between the gesture information of the user clicking or triggering a certain part of a finger and the key positions on the known keyboard.
  • the same gesture information may obtain different processing responses in different input modes, and the same gesture information may receive a processing response in one input mode but none in another.
  • the processor 1402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
  • the communication device 1403 is configured to receive data in a service operation task;
  • the communication device 1403 may specifically be a wireless communication device or a Bluetooth device, so that the operation instruction generated by the processor 1402 is transmitted to the terminal through the wireless communication device or the Bluetooth device, enabling the terminal to respond to the operation instruction; a toy transport sketch follows.
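For illustration only: the publication leaves the radio layer abstract beyond "wireless communication device or Bluetooth device", so this sketch stands in a plain TCP socket and a one-line JSON wire format. The host name, port, and payload shape are assumptions.

```python
# Toy sketch of forwarding an operation instruction to the terminal; a real
# device would replace the socket with its wireless or Bluetooth channel.
import json, socket

def send_instruction(instruction: str, host: str = "terminal.local", port: int = 9000) -> None:
    payload = json.dumps({"op": instruction}).encode("utf-8") + b"\n"
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(payload)      # the terminal side parses and responds to the op

# send_instruction("KEY_PRESS:5")  # e.g. forward a keyboard-mode click
```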
  • FIG. 15 is a schematic structural diagram of an intelligent terminal according to an embodiment of the present invention.
  • the smart terminal 150 may include: a sensor 1501, a processor 1502, a memory 1503, and a bus 1504.
  • the sensor 1501, the processor 1502, and the memory 1503 complete communication with each other via the bus 1504.
  • the sensor 1501 is configured to collect gesture information of the user.
  • the sensor 1501 may be a contact sensor, a pressure sensor, a bioelectrostatic sensor, or the like; any sensor capable of detecting the user's different mode start gesture information and gesture information can be applied to this embodiment.
  • the memory 1503 is configured to store a set of program instructions.
  • the memory may be a high-speed RAM memory, or may be a non-volatile memory, for example, at least one magnetic disk memory.
  • the processor 1502 is configured to invoke the program instructions stored in the memory 1503 to perform the following operations: acquiring mode start gesture information of the user, where the mode start gesture information may be detected by the sensor; starting an input mode corresponding to the mode start gesture information; acquiring gesture information of the user in the input mode; generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information; and responding to the operation instruction;
  • the input modes include a keyboard input mode and a mouse input mode;
  • the gesture information includes click gesture information and/or slide gesture information;
  • each input mode corresponds to one mode start gesture, so the processor 1502 can automatically recognize, according to the mode start gesture information input by the user, the input mode the user currently wants, which is convenient for the user.
  • the keyboard input mode here means that, in this mode, when the user clicks a certain part of a finger, a character on a known keyboard can be triggered directly; the character may be a digit, an English letter, or a symbol with a fixed function, for example, the "#" key. Of course, this requires preconfiguring the correspondence between the gesture information of the user clicking or triggering a certain part of a finger and the key positions on the known keyboard.
  • the same gesture information may obtain different processing responses in different input modes, and the same gesture information may receive a processing response in one input mode but none in another.
  • the processor 1502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
  • since the information processing apparatus is integrated in the terminal, after the processor acquires the user's gesture information and generates the corresponding operation instruction, the operation instruction does not need to be transmitted to the portable intelligent terminal through a wireless communication module or a Bluetooth module; it only needs to be transmitted over the communication line inside the portable intelligent terminal, and the processor of the portable intelligent terminal can respond to the operation instruction normally after receiving it. Further, when the information processing device integrates the above information processing apparatus and terminal functions, as in a portable smart terminal, the accuracy of input can also be improved.
  • the steps of a method or algorithm described in connection with the embodiments disclosed herein can be implemented directly in hardware, a software module executed by a processor, or a combination of both.
  • the software module can be placed in a random access memory (RAM), a memory, a read-only memory (ROM), an electrically programmable ROM, an electrically erasable programmable ROM, a register, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention discloses an information processing method, apparatus, and device. After acquiring mode start gesture information input by a user, the information processing method enters the corresponding input mode according to the mode start gesture information, recognizes the user's gesture information in the determined input mode, and then identifies the user's intention according to a preset correspondence between gesture information and operation instructions. The input modes include a keyboard input mode and a mouse input mode, so that in an already familiar input-mode environment the user can conveniently control a terminal using only habitual input operations such as click operations and touch-slide operations. The method, apparatus, and device do not require the user to memorize correspondences between many groups of gesture actions and operations; the correspondence between the user's habitual basic operations and standard keyboard and/or mouse operation events only needs to be preconfigured in the system for the user to control the terminal through habitual operations.

Description

Information processing method, apparatus and device — Technical Field
The present invention relates to the field of information processing technologies, and more specifically, to an information processing method, apparatus, and device.
Background
In recent years, with the development of science and technology, smart devices have been developed and applied ever more widely. As users demand richer functions and more diverse forms from smart devices, the input manners of smart devices have also become more intelligent and varied.
In the prior art, a gesture control armband has been developed to facilitate interaction between a user and a smart device. In use, the gesture control armband is worn on the user's wrist; it implements human-computer interaction by detecting the bioelectric changes produced by the muscles of the arm or wrist when the user moves, in combination with monitoring the physical motion of the arm. The gesture control armband can trigger an operation through a specific action; for example, the user making a fist and then shaking the wrist up and down three times may be set as the action corresponding to starting the device.
However, an especially large number of operations can be performed on a smart device. If a user wants to operate a smart device with the prior-art gesture control armband, the user must remember the correspondences between many groups of gesture actions and operations, which is very inconvenient to use.
Summary of the Invention
In view of this, the present invention provides an information processing method, apparatus, and device, to overcome the prior-art problem that the user needs to memorize the correspondences between many groups of gesture actions and operations because a specific trigger operation must correspond to a specific gesture action.
To achieve the above objective, the present invention provides the following technical solutions:
According to a first aspect, this application discloses an information processing method, including:
acquiring mode start gesture information of a user;
starting an input mode corresponding to the mode start gesture information, where the input modes include a keyboard input mode and a mouse input mode;
acquiring gesture information of the user in the input mode, where the gesture information includes click gesture information and/or slide gesture information; and
generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
In a first possible implementation of the first aspect, when the input mode is the keyboard input mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
in this case, the acquiring gesture information of the user in the input mode includes: acquiring, in the keyboard input mode, click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger;
correspondingly, the generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information includes:
generating, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
In a second possible implementation of the first aspect, when the input mode is the mouse input mode, the acquiring gesture information of the user in the input mode includes:
acquiring, in the mouse input mode, slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange;
correspondingly, the generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information includes:
generating, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
The slide gesture information corresponds to movement track information of a mouse pointer, and the click gesture information corresponds to trigger information of the left or right mouse button.
In a third possible implementation of the first aspect, the starting an input mode corresponding to the mode start gesture information includes:
starting the keyboard input mode when the mode start gesture information corresponds to a gesture of the user spreading apart the index finger, middle finger, ring finger, and little finger; and
starting the mouse input mode when the mode start gesture information corresponds to a gesture of the user holding the index finger, middle finger, ring finger, and little finger together.
The mode start gesture information or the gesture information is detected by a sensor disposed on the user's wrist or palm.
In a fourth possible implementation of the first aspect, the acquiring mode start gesture information of the user or gesture information of the user includes:
acquiring pressure values of parts of the wrist or palm that are detected by the sensor disposed on the user's wrist or palm when the user performs an input-mode start gesture action or a gesture action;
determining, according to the pressure values, displacements of the sensor on the user's wrist or palm; and
determining, according to the displacements, the mode start gesture information input by the user.
In any one of the foregoing implementations of the first aspect, after the generating an operation instruction corresponding to the gesture information, the method further includes:
sending the operation instruction to a terminal, so that the terminal responds to the operation instruction.
According to a second aspect, an information processing method is disclosed, including:
confirming entry into a keyboard input mode, where in the keyboard input mode the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
acquiring click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger; and
generating, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
According to a third aspect, an information processing method is disclosed, including:
confirming entry into a mouse input mode;
acquiring slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange; and
generating, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
The slide gesture information corresponds to movement track information of a mouse pointer, and the click gesture information corresponds to trigger information of the left or right mouse button.
According to a fourth aspect, an information processing apparatus is disclosed, including:
a first acquiring module, configured to acquire mode start gesture information of a user;
a mode starting module, configured to start an input mode corresponding to the mode start gesture information, where the input modes include a keyboard input mode and a mouse input mode;
a second acquiring module, configured to acquire gesture information of the user in the input mode, where the gesture information includes click gesture information and slide gesture information; and
an instruction generating module, configured to generate, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
In a first possible implementation of the fourth aspect, when the input mode is the keyboard input mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad; the second acquiring module is then specifically configured to:
acquire, in the keyboard input mode, click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger.
In a second possible implementation of the fourth aspect, when the input mode is the mouse input mode, the second acquiring module is specifically configured to:
acquire, in the mouse input mode, slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange.
The slide gesture information corresponds to movement track information of a mouse pointer, and the click gesture information corresponds to trigger information of the left or right mouse button.
In a third possible implementation of the fourth aspect, the mode starting module includes:
a first mode starting module, configured to start the keyboard input mode when the mode start gesture information corresponds to a gesture of the user spreading apart the index finger, middle finger, ring finger, and little finger; and
a second mode starting module, configured to start the mouse input mode when the mode start gesture information corresponds to a gesture of the user holding the index finger, middle finger, ring finger, and little finger together.
In a fourth possible implementation of the fourth aspect, the mode start gesture information and the gesture information are detected by a sensor disposed on the user's wrist or palm.
In a fifth possible implementation of the fourth aspect, the first acquiring module or the second acquiring module includes:
an acquiring submodule, configured to acquire pressure values of parts of the wrist or palm that are detected by the sensor disposed on the user's wrist or palm when the user performs an input-mode start gesture action or a gesture action;
a computing module, configured to determine, according to the pressure values, displacements of parts of the user's wrist or palm; and
a gesture determining module, configured to determine, according to the displacements, the mode start gesture information input by the user.
In any one of the foregoing possible implementations of the fourth aspect, the apparatus further includes:
an instruction transmission module, configured to send the operation instruction to a terminal, so that the terminal responds to the operation instruction.
According to a fifth aspect, an information processing apparatus is disclosed, including:
a first input starting module, configured to confirm that the system enters a keyboard input mode, where in the keyboard input mode the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
a first gesture acquiring module, configured to acquire click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger; and
a first instruction generating module, configured to generate, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
According to a sixth aspect, an information processing apparatus is disclosed, including:
a second input starting module, configured to confirm that the system enters a mouse input mode;
a second gesture acquiring module, configured to acquire slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange; and
a second instruction generating module, configured to generate, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
According to a seventh aspect, a data processing device is disclosed, where the data processing device includes any one of the above information processing apparatuses.
According to an eighth aspect, an intelligent terminal is disclosed, where the intelligent terminal includes any one of the above information processing apparatuses.
It can be seen from the above technical solutions that, compared with the prior art, the embodiments of the present invention disclose an information processing method, apparatus, and device. The information processing method first acquires mode start gesture information input by a user, then enters the corresponding input mode according to the mode start gesture information input by the user, recognizes the user's gesture information in the determined input mode, and further identifies the user's intention according to a preset correspondence between gesture information and operation instructions. The input modes include a keyboard input mode and a mouse input mode, so that in an already familiar input-mode environment the user can conveniently control a terminal using only habitual input operations such as click operations and touch-slide operations. The method, apparatus, and device do not require the user to memorize correspondences between many groups of specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and/or mouse operation events only needs to be preconfigured in the execution body of the information processing method for the user to control the terminal through habitual operations.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are introduced briefly below. Apparently, the drawings described below are merely embodiments of the present invention, and a person of ordinary skill in the art may further obtain other drawings from the provided drawings without creative effort.
FIG. 1 is a flowchart of an information processing method disclosed in an embodiment of the present invention;
FIG. 2 is a schematic diagram of the correspondence between a 12-key dial pad and the 12 finger phalanges disclosed in an embodiment of the present invention;
FIG. 3 is a flowchart of acquiring mode start gesture information or gesture information of a user disclosed in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the positions of sensors disposed in a wristband disclosed in an embodiment of the present invention;
FIG. 5 is a flowchart of another information processing method disclosed in an embodiment of the present invention;
FIG. 6 is a flowchart of a third information processing method disclosed in an embodiment of the present invention;
FIG. 7 is a flowchart of a fourth information processing method disclosed in an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an information processing apparatus disclosed in an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a first acquiring module disclosed in an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of another information processing apparatus disclosed in an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a third information processing apparatus disclosed in an embodiment of the present invention;
FIG. 12 is a schematic structural diagram of a fourth information processing apparatus disclosed in an embodiment of the present invention;
FIG. 13 is a schematic diagram of communication between an information processing device and a terminal disclosed in an embodiment of the present invention;
FIG. 14 is a schematic structural diagram of an information processing device disclosed in an embodiment of the present invention;
FIG. 15 is a schematic structural diagram of another information processing device disclosed in an embodiment of the present invention.
Description of Embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
FIG. 1 is a flowchart of an information processing method disclosed in an embodiment of the present invention. Referring to FIG. 1, the method may include:
Step 101: Acquire mode start gesture information of a user.
In this embodiment of the present invention, there may be multiple modes in which the user performs input, and each input mode corresponds to one mode start gesture. In this way, the execution body of the information processing method disclosed in this embodiment of the present invention can automatically recognize, according to the mode start gesture information input by the user, the input mode the user currently wants, which is convenient for the user.
Step 102: Start an input mode corresponding to the mode start gesture information.
The input modes may include a keyboard input mode and a mouse input mode. The keyboard input mode here means that, in this mode, when the user clicks a certain part of a finger, a character on a known keyboard can be triggered directly; the character may be a digit, an English letter, or a symbol with a fixed function, for example, the "#" key. Of course, this requires preconfiguring the correspondence between the gesture information of the user clicking or triggering a certain part of a finger and the key positions on the known keyboard. The mouse input mode means that, in this mode, the user can perform corresponding operations on a finger or the palm as if operating a mouse; the operations may include slide operations and click operations. For example, when the user slides the thumb leftward on a finger or the palm, the mouse cursor correspondingly slides leftward; when the user clicks a certain part of a finger or the palm, this corresponds to clicking the left mouse button. Of course, this also requires preconfiguring the correspondence between the user's gesture information and standard mouse operation events.
Step 103: Acquire gesture information of the user in the input mode.
The gesture information may include click gesture information and/or slide gesture information.
It should be noted that the same gesture information may obtain different processing responses in different input modes, and the same gesture information may receive a processing response in one input mode but none in another. As described above, in the keyboard input mode, the character the user wants to trigger can be determined by acquiring the gesture information of the user clicking a certain part of the palm; in this mode, if the user inputs slide gesture information, the system will not respond to it. In the mouse input mode, by contrast, slide gesture information input by the user will be responded to, because slide gesture information can correspond to the movement of the mouse pointer, and this correspondence conforms to the user's prior-art habits of operating a computer interface with a mouse or operating a touch display directly; for example, the user can move the thumb up, down, left, and right on the palm to experience the corresponding up, down, left, and right page-turning operations on a touchscreen. A compact dispatch sketch of this mode-dependent behavior follows.
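A minimal sketch of that mode-dependent dispatch, with invented instruction strings and field names: the same swipe is answered in the mouse input mode but deliberately ignored in the keyboard input mode.

```python
def handle(mode: str, gesture: dict):
    """Return the operation instruction for a gesture, or None when the
    current input mode has no processing response for it."""
    if mode == "keyboard" and gesture["type"] == "click":
        return f"KEY_PRESS:{gesture['target']}"
    if mode == "mouse":
        if gesture["type"] == "swipe":
            return f"MOUSE_MOVE:{gesture['dx']},{gesture['dy']}"
        if gesture["type"] == "click":
            return "MOUSE_LEFT_CLICK"
    return None

swipe = {"type": "swipe", "dx": 3, "dy": 0}
assert handle("keyboard", swipe) is None           # no response in keyboard mode
assert handle("mouse", swipe) == "MOUSE_MOVE:3,0"  # answered in mouse mode
```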
Step 104: Generate, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
Pre-embedding the correspondence between gesture information and operation instructions in the system program can be achieved with the prior art; in this embodiment of the present invention, however, because multiple input modes are introduced, each input mode has its own separate correspondence between gesture information and operation instructions, since, as described above, the same gesture information may obtain different processing responses in different input modes.
Further, the operation instruction generated according to the user's gesture information may be transmitted to a terminal, so that the terminal performs a response operation according to the operation instruction.
In this embodiment, the information processing method can first acquire the mode start gesture information input by the user, then enter the corresponding input mode according to that information, recognize the user's gesture information in the determined input mode, and further identify the user's intention according to the preset correspondence between gesture information and operation instructions. The input modes include a keyboard input mode and a mouse input mode, so that in an already familiar input-mode environment the user can conveniently control the terminal using only habitual input operations such as click operations and touch-slide operations. The method does not require the user to memorize correspondences between many groups of specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and mouse operation events only needs to be preconfigured in the execution body of the information processing method for the user to perform input operations conveniently and control the terminal in a habitual manner.
In the above embodiment, when the acquired mode start gesture information is the mode start gesture corresponding to the keyboard input mode, the system starts the keyboard input mode. In this mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger may be in one-to-one correspondence with the 12 keys of a 12-key dial pad. As is well known, a 12-key dial pad includes the keys 1, 2 (abc), 3 (def), 4 (ghi), 5 (jkl), 6 (mno), 7 (pqrs), 8 (tuv), 9 (wxyz), *, 0, and #, while in the structure of the human hand the index finger, middle finger, ring finger, and little finger each include 3 finger joints and 3 finger phalanges, 12 finger joints and 12 finger phalanges in total, and the 4x3 array of the 12 keys of the dial pad is the same as the 4x3 array of these 12 finger joints or finger phalanges. Therefore, referring to FIG. 2, a schematic diagram of the correspondence between the 12-key dial pad and the 12 finger phalanges disclosed in an embodiment of the present invention, the first phalange of the index finger may be mapped to the "1" key, the second phalange of the index finger to the "2" key, ..., and the third phalange of the little finger to the "#" key.
Of course, the correspondence between the keys of the 12-key dial pad and the 12 finger joints or finger phalanges is not fixed to the above manner; the three finger joints of the little finger may instead be mapped to the "1", "2", and "3" keys and the three finger joints of the index finger to the "*", "0", and "#" keys. It may be set according to the user's preference and habit.
Thus, in the above embodiment, acquiring the gesture information of the user in the input mode in step 103 may include: acquiring, in the keyboard input mode, click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger.
Correspondingly, generating, according to the preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information in step 104 may include: generating, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
In this embodiment, the keys of the 12-key dial pad are mapped one-to-one to the 12 finger joints or finger phalanges, and this correspondence is preconfigured in the system. Since users are generally already very familiar with using a 12-key dial pad and can accurately sense the positions of their finger joints and finger phalanges, they can input information conveniently and quickly without remembering a correspondence between each key and some fixed gesture action.
In the first embodiment, when the acquired mode start gesture information is the mode start gesture corresponding to the mouse input mode, the system starts the mouse input mode. In this mode, acquiring the gesture information of the user in the input mode in step 103 may include: acquiring, in the mouse input mode, slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange.
Correspondingly, generating an operation instruction corresponding to the gesture information in step 104 may include: generating, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
In the mouse input mode, the user's slide gesture information may be preconfigured to correspond to movement track information of the mouse pointer, and the user's click gesture information to trigger information of the left or right mouse button. Specifically, the user may hold the four fingers other than the thumb together to form a "panel"; moving the thumb on this "panel" can then simulate the movement of a mouse on a display. Of course, this also requires preconfiguring the correspondence between the user's gesture information and standard mouse operation events. Considering the user's existing habits of operating a pad or a mouse, the user's leftward slide gesture information may directly correspond to the operation of moving the mouse pointer leftward, the upward slide gesture information to moving it upward, and so on; in this way, the user's thumb is equivalent to the mouse on the "panel" formed by the four fingers, and the "panel" is equivalent to the display. The gesture information of the user's thumb on the "panel" may also be mapped to the operation events of a physical touchscreen, so that the user can experience touchscreen operations on the "panel" formed by the four fingers.
Because in practice a mouse also includes a left button and a right button, for ease of use, in this embodiment, in the mouse input mode, the gesture action of the user clicking the fingertip of any one of the index finger, middle finger, ring finger, or little finger may be defined as the left-button trigger operation, and the gesture action of clicking the root of any one of these fingers as the right-button trigger operation. Alternatively, clicking the fingertip of the index finger may be defined as the left-button trigger operation and clicking the fingertip of the middle finger as the right-button trigger operation, and so on. This is not limited here.
In this embodiment, several fixed gestures of the user are mapped to several basic mouse operations, so that the user can perform input directly through gesture actions on the palm using the already familiar mouse or touchscreen operation style, which is convenient and fast and does not require the user to remember correspondences between many groups of gesture actions and operations.
It can be understood that, in the above embodiments, the mode start gestures corresponding to different input modes are not fixedly limited. For ease of use, in this embodiment, starting the input mode corresponding to the mode start gesture information in step 102 of the first embodiment is specifically: starting the keyboard input mode when the mode start gesture information corresponds to a gesture of the user spreading apart the index finger, middle finger, ring finger, and little finger; and starting the mouse input mode when the mode start gesture information corresponds to a gesture of the user holding these four fingers together.
Because the user will instinctively spread the fingers apart when clicking the different finger joints or finger phalanges of the index finger, middle finger, ring finger, and little finger, the gesture of spreading these fingers apart is mapped to the keyboard input mode; the gesture of holding them together is mapped to the mouse input mode, in which the "panel" formed by the four closed fingers can simulate a display. This is close to the user's operating habits. Of course, the mode start gestures of the keyboard input mode and the mouse input mode are not fixedly limited; a toy sketch of this mode selection follows.
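As a toy illustration of this mode selection (the spread-angle feature and its threshold are invented placeholders; a real implementation would derive the spread/closed posture from the sensor data described below):

```python
def start_mode(finger_spread_deg: float) -> str:
    """Map the mode start gesture to an input mode: spread fingers -> keyboard,
    fingers held together -> mouse. The 10-degree threshold is a placeholder."""
    return "keyboard" if finger_spread_deg > 10.0 else "mouse"

assert start_mode(18.0) == "keyboard"   # index/middle/ring/little spread apart
assert start_mode(2.0) == "mouse"       # four fingers held together
```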
In the above embodiments, the mode start gesture information or the gesture information may be detected by a sensor disposed on the user's wrist or palm.
It should be noted that the sensor disposed on the user's wrist or palm detects the mode start gesture actions or gesture actions of the hand that carries the sensor.
FIG. 3 is a flowchart of acquiring mode start gesture information or gesture information of a user disclosed in an embodiment of the present invention. Referring to FIG. 3, the specific process of acquiring the user's mode start gesture information or the user's gesture information may include:
Step 301: Acquire pressure values of parts of the wrist or palm that are detected by the sensor disposed on the user's wrist or palm when the user performs an input-mode start gesture action or a gesture action.
The sensor disposed on the user's wrist may be disposed in a wristband, and the sensor disposed on the user's palm may be disposed in a glove or fingerless glove. If the sensor is disposed on the user's wrist, the user's gesture action can be determined from the differences in pressure caused by the muscle changes and motion changes of the parts of the wrist when the user makes different gestures; if disposed on the user's palm, the gesture action can likewise be determined from the pressure differences of the parts of the palm. If the sensor is disposed on the user's finger, it can directly detect contact information of the user's thumb and thereby determine the user's gesture information. It should be noted that, wherever the sensor is disposed on the user's hand, it can be used not only to detect sensing information related to the mode start gesture, but also to detect sensing information of the slide or click gestures the user performs on the palm or fingers after the system enters the keyboard input mode or the mouse input mode.
Step 302: Determine, according to the pressure values, displacements of the sensor on the user's wrist or palm.
Generally, the larger the pressure change, the larger the displacement. After the user puts on the sensor-equipped device, the contact pressure between the device and the skin of the user's hand has a basically stable initial value while the user makes no gesture; when the user performs a gesture action, the pressure value at a part of the hand may increase or decrease, so the displacement of the hand muscles also differs in vector direction. The larger the pressure, the larger the displacement of the sensor from its original position. In this embodiment, there may be multiple sensors disposed on the user's wrist or palm, and they may be disposed at different parts as needed, to improve the accuracy of the detection result.
Step 303: Determine, according to the displacements, the mode start gesture information input by the user.
FIG. 4 is a schematic diagram of the positions of sensors disposed in a wristband disclosed in an embodiment of the present invention. As shown in FIG. 4, a ring of pressure sensors may be disposed in the wristband; when the user's gesture changes, each sensor can obtain, from the measured pressure change, its positional offset from its initial position (the position of the sensor when the user makes no action and the hand is in its natural state). In this way, as long as the muscle groups of the user's wrist change due to finger movement, the user's gesture action can finally be determined from the data detected by these sensors. Because muscle-group changes are more pronounced closer to the fingers, the sensors in the wristband may be disposed at positions close to the fingers. The matching idea is sketched below.
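Steps 301 to 303 can be pictured with the following sketch: each sensor's pressure change against its no-gesture baseline is converted into a signed displacement estimate, and the displacement vector is matched against stored gesture templates. The linear pressure-to-displacement coefficient and the least-squares matching rule are assumptions for illustration, not the embodiment's exact computation.

```python
def displacements(pressures, baselines, k=0.5):
    """Step 302 (assumed linear model): signed displacement per sensor, where
    positive means the muscle pushes the sensor outward and negative inward."""
    return [k * (p - b) for p, b in zip(pressures, baselines)]

def match_gesture(disp, templates):
    """Step 303: pick the template with the smallest squared-error distance."""
    score = lambda t: sum((d - e) ** 2 for d, e in zip(disp, templates[t]))
    return min(templates, key=score)

base = [1.0, 1.0, 1.0, 1.0]                         # no-gesture baseline pressures
templates = {"tap_index_joint_1": [0.2, -0.1, 0.0, 0.0],
             "tap_little_joint_3": [0.0, 0.0, -0.1, 0.3]}
disp = displacements([1.5, 0.8, 1.0, 1.0], base)    # -> [0.25, -0.1, 0.0, 0.0]
print(match_gesture(disp, templates))               # -> tap_index_joint_1
```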
In practice, the position at which the user wears the wristband may differ each time, so the data detected by the sensors in the wristband will also differ each time, which affects the accuracy of the detection result. In this embodiment, to overcome the impact of this problem on detection accuracy, the following two solutions are given.
The first is to build as comprehensive a database as possible, storing the correspondences between sensor data and gesture actions for the wristband at different positions.
The second is to dispose multiple rings of sensors in the wristband. When the user uses it for the first time, the data detected by the middle ring of sensors can be used and the related calibration completed, to ensure that this middle ring can accurately recognize the user's gesture actions. When the user uses the wristband later, its front-back position may differ; in this case the rings of sensors on the wristband can determine the current position of the wristband directly from the related displacement data (the wrist has different thicknesses at different positions, so the cross-sectional shape formed by the current ring of sensors can be computed from the displacement data detected by the sensors, and the position of the wristband on the wrist estimated from it) and compute the positional difference between where the sensors are now and where they were at first use, thereby completing deviation correction. Alternatively, during use, based on the current position information of each ring of sensors, the ring closest to the ring that was in the middle position at first use can be found automatically, and the data detected by that ring used.
The calibration procedure may be: 1. The user puts on the wristband. 2. The wristband establishes communication with the terminal. 3. The calibration mode is started. 4. A calibration interface is displayed on the terminal's display. 5. The wristband instructs the terminal to display the "1" key. 6. With the thumb of the hand wearing the wristband, the user touches the first finger joint of the index finger, indicating that key "1" is pressed, and holds until the terminal finishes recognition. 7. The wristband records the current position information of its sensors and uses it as the reference, completing the calibration of key "1". The calibration of the remaining keys may follow the calibration procedure of key "1", as in the sketch below.
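The calibration loop over the 12 keys might look like the sketch below, assuming a prompt/read_sensors I/O layer that a real wristband-and-terminal pairing would provide; both helpers here are stand-ins, not part of the disclosed design.

```python
def prompt(msg: str) -> None:            # stand-in for steps 4-5: terminal UI hook
    print(msg)

def read_sensors():                      # stand-in sensor read; placeholder values
    return [0.0, 0.0, 0.0, 0.0]

DIAL_KEYS = ("1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#")

def calibrate(keys=DIAL_KEYS):
    """Record a per-key sensor baseline, mirroring steps 5-7 for every key."""
    reference = {}
    for key in keys:
        prompt(f"Press and hold the joint mapped to key '{key}'...")  # steps 5-6
        reference[key] = read_sensors()  # step 7: record positions as the baseline
    return reference                     # later inputs are matched against this
```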
Of course, the above method of acquiring the user's mode start gesture information is also applicable to acquiring the user's other gesture information.
This embodiment gives a specific method of acquiring mode start gesture information, but it is not the only method of acquiring user gesture information; for example, the user's gesture information may also be recognized through bioelectricity, or acquired by mounting contact sensors at the finger positions of a glove, and so on. It should be noted that, wherever the sensor is disposed on the user's hand, it can be used not only to detect sensing information related to the mode start gesture, but also to detect sensing information of the slide or click gestures the user performs on the palm or fingers after the system enters the keyboard input mode or the mouse input mode.
FIG. 5 is a flowchart of another information processing method disclosed in an embodiment of the present invention. As shown in FIG. 5, the information processing method may include:
Step 501: Acquire mode start gesture information of a user.
Step 502: Start an input mode corresponding to the mode start gesture information.
The input modes include a keyboard input mode and a mouse input mode.
Step 503: Acquire gesture information of the user in the input mode.
The gesture information includes click gesture information and/or slide gesture information.
Step 504: Generate, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
Step 505: Send the operation instruction to a terminal.
Step 505 sends the operation instruction to the terminal so that the terminal responds to the operation instruction, implementing human-computer interaction.
In this embodiment, the user can conveniently control the terminal through habitual input operations such as click operations and touch-slide operations. The method does not require the user to memorize correspondences between many groups of specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and mouse operation events only needs to be preconfigured in the execution body of the information processing method to enable convenient input operations and control of the terminal.
FIG. 6 is a flowchart of a third information processing method disclosed in an embodiment of the present invention. Referring to FIG. 6, the information processing method may include:
Step 601: Confirm entry into a keyboard input mode.
In the keyboard input mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad.
Step 602: Acquire click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger.
Step 603: Generate, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
In this embodiment, the keys of the 12-key dial pad are mapped one-to-one to the 12 finger joints or finger phalanges, and this correspondence is preconfigured in the system. Since users are generally already very familiar with using a 12-key dial pad and can accurately sense the positions of their finger joints and finger phalanges, they can input information conveniently and quickly without remembering a correspondence between each key and some fixed gesture action.
FIG. 7 is a flowchart of a fourth information processing method disclosed in an embodiment of the present invention. As shown in FIG. 7, the method may include:
Step 701: Confirm entry into a mouse input mode.
Step 702: Acquire slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange.
Step 703: Generate, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
The slide gesture information corresponds to movement track information of the mouse pointer; the click gesture information corresponds to trigger information of the left or right mouse button.
In this embodiment, several fixed gestures of the user are mapped to several basic mouse operations, so that the user can perform input directly through gesture actions on the palm using the already familiar mouse or touchscreen operation style, which is convenient and fast and does not require the user to remember correspondences between many groups of gesture actions and operations.
The above embodiments of the present invention describe the methods in detail; the methods of the present invention can be implemented by apparatuses of multiple forms, so the present invention further discloses apparatuses, described in detail in the following specific embodiments.
FIG. 8 is a schematic structural diagram of an information processing apparatus disclosed in an embodiment of the present invention. Referring to FIG. 8, the information processing apparatus 80 may include:
a first acquiring module 801, configured to acquire mode start gesture information of a user;
In this embodiment of the present invention, there may be multiple modes in which the user performs input, and each input mode corresponds to one mode start gesture.
a mode starting module 802, configured to start an input mode corresponding to the mode start gesture information;
The input modes include a keyboard input mode and a mouse input mode. The keyboard input mode here means that, in this mode, when the user clicks a certain part of the palm, a character on a known keyboard can be triggered directly; the character may be a digit, an English letter, or a symbol with a fixed function. The mouse input mode means that, in this mode, the user can operate on the palm as if operating a mouse; the operations may include slide operations and click operations. Of course, both cases require preconfiguring the correspondence between the user's gesture information and standard keyboard or mouse operation events.
a second acquiring module 803, configured to acquire gesture information of the user in the input mode;
The gesture information includes click gesture information and slide gesture information.
It should be noted that the same gesture information may obtain different processing responses in different input modes, and the same gesture information may receive a processing response in one input mode but none in another.
an instruction generating module 804, configured to generate, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
Pre-embedding the correspondence between gesture information and operation instructions in the system program can be achieved with the prior art; in this embodiment of the present invention, however, because multiple input modes are introduced, each input mode has its own separate correspondence between gesture information and operation instructions, since, as described above, the same gesture information may obtain different processing responses in different input modes.
In this embodiment, the information processing apparatus can first acquire the mode start gesture information input by the user, then enter the corresponding input mode according to that information, recognize the user's gesture information in the determined input mode, and further identify the user's intention according to the preset correspondence between gesture information and operation instructions. The input modes include a keyboard input mode and a mouse input mode, so that in an already familiar input-mode environment the user can conveniently control the terminal using only habitual input operations such as click operations and touch-slide operations. The apparatus does not require the user to memorize correspondences between many groups of specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and mouse operation events only needs to be preconfigured in the execution body for the user to perform input operations conveniently and control the terminal in a habitual manner.
In the above embodiment, when the input mode is the keyboard input mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad; the second acquiring module may then be specifically configured to: acquire, in the keyboard input mode, click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger.
It can be understood that the 4x3 array of the 12 keys of the 12-key dial pad is the same as the 4x3 array of the 12 finger joints or finger phalanges of the index finger, middle finger, ring finger, and little finger in the structure of the human hand, so the 12 keys of the 12-key dial pad can be mapped one-to-one to the 12 finger joints or finger phalanges. In this embodiment, however, the specific order of the correspondence is not limited and may be set according to the user's preference and habit.
In this embodiment, the keys of the 12-key dial pad are mapped one-to-one to the 12 finger joints or finger phalanges, and this correspondence is preconfigured in the system. Since users are generally already very familiar with using a 12-key dial pad and can also accurately sense the positions of their finger joints and finger phalanges, they can input digits conveniently and quickly without remembering a correspondence between each key of the 12-key dial pad and some fixed gesture action.
In the first embodiment, when the input mode is the mouse input mode, the second acquiring module may be specifically configured to: acquire, in the mouse input mode, slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange.
In the mouse input mode, the user's slide gesture information may be preconfigured to correspond to movement track information of the mouse pointer, and the user's click gesture information to trigger information of the left or right mouse button.
Specifically, for ease of use, in this embodiment, in the mouse input mode, the gesture action of the user clicking the fingertip of any one of the index finger, middle finger, ring finger, or little finger may be defined as the left-button trigger operation, and the gesture action of clicking the root of any one of these fingers as the right-button trigger operation.
In this embodiment, several fixed gestures of the user are mapped to several basic mouse operations, so that the user can perform input directly through gesture actions on the palm using the already familiar mouse or touchscreen operation style, which is convenient and fast and does not require the user to remember correspondences between many groups of gesture actions and operations.
It can be understood that, in the above embodiments, the mode start gestures corresponding to different input modes are not fixedly limited. For ease of use, in this embodiment, the mode starting module may include a first mode starting module and a second mode starting module. The first mode starting module may be configured to start the keyboard input mode when the mode start gesture information corresponds to a gesture of the user spreading apart the index finger, middle finger, ring finger, and little finger. The second mode starting module may be configured to start the mouse input mode when the mode start gesture information corresponds to a gesture of the user holding the index finger, middle finger, ring finger, and little finger together.
Because the user will instinctively spread the fingers apart when clicking the different finger joints or finger phalanges of the index finger, middle finger, ring finger, and little finger, the gesture of spreading these fingers apart is mapped to the keyboard input mode, and the gesture of holding them together is mapped to the mouse input mode, in which the "panel" formed by the four closed fingers can simulate a display. This is close to the user's operating habits. Of course, the mode start gestures of the keyboard input mode and the mouse input mode are not fixedly limited.
In the above embodiment, the first acquiring module 801 may be specifically configured to acquire the user's mode start gesture information detected by a sensor disposed on the user's wrist or palm; correspondingly, the second acquiring module may be specifically configured to acquire the user's gesture information detected by the sensor disposed on the user's wrist or palm. The sensor disposed on the user's wrist or palm detects the gesture action information of the hand that carries the sensor.
FIG. 9 is a schematic structural diagram of the first acquiring module disclosed in an embodiment of the present invention. Referring to FIG. 9, the first acquiring module 801 may specifically include:
an acquiring submodule 901, configured to acquire pressure values of parts of the wrist or palm that are detected by the sensor disposed on the user's wrist or palm when the user performs an input-mode start gesture action or a gesture action;
The sensor disposed on the user's wrist may be disposed in a wristband, and the sensor disposed on the user's palm may be disposed in a glove or fingerless glove. If the sensor is disposed on the user's wrist, the user's gesture action can be determined from the differences in pressure caused by the muscle changes and motion changes of the parts of the wrist when the user makes different gestures; if disposed on the user's palm, the gesture action can likewise be determined from the pressure differences caused by the muscle changes and motion changes of the parts of the palm. If the sensor is disposed on the user's finger, it can directly detect contact information of the user's thumb and thereby determine the user's gesture information.
a computing module 902, configured to determine, according to the pressure values, displacements of parts of the user's wrist or palm;
Generally, the larger the pressure change, the larger the displacement; because the pressure may either increase or decrease, the displacement also differs in direction. The larger the pressure, the larger the displacement of the sensor from its original position. In this embodiment, there may be multiple sensors disposed on the user's wrist or palm, and they may be disposed at different parts as needed, to improve the accuracy of the detection result.
a gesture determining module 903, configured to determine, according to the displacements, the mode start gesture information input by the user.
Because muscle-group changes are more pronounced closer to the fingers, the sensors in the wristband may be disposed at positions close to the fingers.
This embodiment gives a specific method of acquiring mode start gesture information, but it is not the only method of acquiring user gesture information; for example, the user's gesture information may also be recognized through bioelectricity, or acquired by mounting contact sensors at the finger positions of a glove, and so on.
FIG. 10 is a schematic structural diagram of another information processing apparatus disclosed in an embodiment of the present invention. Referring to FIG. 10, the data processing apparatus 100 may include:
a first acquiring module 801, configured to acquire mode start gesture information of a user;
a mode starting module 802, configured to start an input mode corresponding to the mode start gesture information;
The input modes include a keyboard input mode and a mouse input mode.
a second acquiring module 803, configured to acquire gesture information of the user in the input mode;
The gesture information includes click gesture information and slide gesture information.
an instruction generating module 804, configured to generate, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information; and
an instruction transmission module 1001, configured to send the operation instruction to a terminal, so that the terminal responds to the operation instruction.
The instruction transmission module 1001 sends the operation instruction to the terminal so that the terminal responds to the operation instruction, implementing human-computer interaction.
In this embodiment, the user can conveniently control the terminal through habitual input operations such as click operations and touch-slide operations. The apparatus does not require the user to memorize correspondences between many groups of specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and mouse operation events only needs to be preconfigured in the execution body of the information processing method to implement the human-computer interaction process conveniently.
FIG. 11 is a schematic structural diagram of a third information processing apparatus disclosed in an embodiment of the present invention. Referring to FIG. 11, the information processing apparatus 110 may include:
a first input starting module 1101, configured to confirm that the system enters a keyboard input mode, where in the keyboard input mode the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger, and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
a first gesture acquiring module 1102, configured to acquire click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger, or little finger; and
a first instruction generating module 1103, configured to generate, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
In this embodiment, the keys of the 12-key dial pad are mapped one-to-one to the 12 finger joints or finger phalanges, and this correspondence is preconfigured in the system. Since users are generally already very familiar with using a 12-key dial pad and can accurately sense the positions of their finger joints and finger phalanges, they can input information conveniently and quickly without remembering a correspondence between each key and some fixed gesture action.
FIG. 12 is a schematic structural diagram of a fourth information processing apparatus disclosed in an embodiment of the present invention. Referring to FIG. 12, the information processing apparatus 120 may include:
a second input starting module 1201, configured to confirm that the system enters a mouse input mode;
a second gesture acquiring module 1202, configured to acquire slide gesture information of the user on the index finger, middle finger, ring finger, and little finger and/or click gesture information of clicking a finger joint or finger phalange; and
a second instruction generating module 1203, configured to generate, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
The slide gesture information may correspond to movement track information of the mouse pointer, and the click gesture information may correspond to trigger information of the left or right mouse button.
In this embodiment, several fixed gestures of the user are mapped to several basic mouse operations, so that the user can perform input directly through gesture actions on the palm using the already familiar mouse or touchscreen operation style, which is convenient and fast and does not require the user to remember correspondences between many groups of gesture actions and operations.
Further, an embodiment of the present invention also discloses an information processing device, which includes any one of the information processing apparatuses disclosed in the above embodiments. Because it includes any one of those apparatuses, the information processing device can likewise first acquire the mode start gesture information input by the user, then enter the corresponding input mode according to that information, recognize the user's gesture information in the determined input mode, and identify the user's intention according to the preset correspondence between gesture information and operation instructions. The input modes include a keyboard input mode and a mouse input mode, so that in an already familiar input-mode environment the user can conveniently control the terminal using only habitual input operations such as click operations and touch-slide operations. The device does not require the user to memorize correspondences between many groups of specific gesture actions and specific operations; the correspondence between the user's habitual basic input operations and standard keyboard and mouse operation events only needs to be preconfigured in the execution body of the information processing method for the user to control the terminal through habitual operations.
Preferably, the information processing device may be any device that has a processing function and can be worn on the user's hand. The information processing device may exchange information with the terminal through wireless communication or Bluetooth. Specifically, the information processing device may transmit the operation instruction generated according to the user's gesture information to the terminal, so that the terminal performs a response operation according to the operation instruction. FIG. 13 is a schematic diagram of communication between an information processing device and a terminal disclosed in an embodiment of the present invention.
Preferably, the information processing device may also be an intelligent terminal that integrates the above information processing apparatus and terminal functions, for example, a portable smart terminal. Taking a smart watch as an example of the portable smart terminal, the sensors may be disposed on the watch strap, and the sensors on the strap can determine the user's gesture actions from the differences in pressure caused by the muscle changes and motion changes of the parts of the wrist when the user makes different gestures. In this way, as long as the muscle groups of the user's wrist change with the movement of the fingers, the user's gesture action can finally be determined from the data detected by these sensors, and the sensors can then transmit the detected gesture information to the information processing apparatus. It should be noted that, after the information processing apparatus acquires the user's gesture information and generates the corresponding operation instruction, it does not need to transmit the operation instruction to the smart watch through a wireless communication module or a Bluetooth module; it only needs to transmit the operation instruction over the communication line inside the smart watch, and the smart watch can respond to the operation instruction normally after receiving it. In this way, besides the above advantage of easy memorization, when the information processing device is an intelligent terminal that integrates the above information processing apparatus and terminal functions, such as a portable smart terminal, the accuracy of input can also be improved. Taking a smart watch as the portable smart terminal as an example, its portability makes the input interface small; taking entering a phone number as an example, when the finger inputs the digits 1, 2, ..., 9, 0, the small input interface often prevents accurate input. After the information processing method of the embodiments of the present invention is adopted, each digit has fixed gesture information corresponding to it, so the digit the user wants to input can be determined accurately by recognizing the user's gesture information, which greatly reduces or even avoids situations in which the user cannot input information accurately.
FIG. 14 is a schematic structural diagram of an information processing device disclosed in an embodiment of the present invention. As shown in FIG. 14, the information processing device 140 may include a sensor 1401, a processor 1402, a communication device 1403, a memory 1404, and a bus 1405.
The sensor 1401, the processor 1402, the communication device 1403, and the memory 1404 communicate with one another through the bus 1405.
The sensor 1401 is configured to collect gesture information of a user. The sensor 1401 may be a contact sensor, a pressure sensor, a bioelectrostatic sensor, or the like; any sensor capable of detecting the user's different mode start gesture information and gesture information can be applied to this embodiment.
The memory 1404 is configured to store a set of program instructions. The memory may be a high-speed RAM memory, or may be a non-volatile memory, for example, at least one magnetic disk memory.
The processor 1402 is configured to invoke the program instructions stored in the memory 1404 to perform the following operations:
acquiring mode start gesture information of the user, where the mode start gesture information may be detected by the sensor and then transmitted to the processor 1402;
starting an input mode corresponding to the mode start gesture information, where the input modes include a keyboard input mode and a mouse input mode;
acquiring gesture information of the user in the input mode, where the gesture information includes click gesture information and/or slide gesture information; and
generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
In this embodiment, there may be multiple modes in which the user performs input, and each input mode corresponds to one mode start gesture, so the processor 1402 can automatically recognize, according to the mode start gesture information input by the user, the input mode the user currently wants, which is convenient for the user.
The keyboard input mode here means that, in this mode, when the user clicks a certain part of a finger, a character on a known keyboard can be triggered directly; the character may be a digit, an English letter, or a symbol with a fixed function, for example, the "#" key. Of course, this requires preconfiguring the correspondence between the gesture information of the user clicking or triggering a certain part of a finger and the key positions on the known keyboard.
It should be noted that the same gesture information may obtain different processing responses in different input modes, and the same gesture information may receive a processing response in one input mode but none in another.
The processor 1402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
The communication device 1403 is configured to receive data in a service operation task. The communication device 1403 may specifically be a wireless communication device or a Bluetooth device, so that the operation instruction generated by the processor 1402 is transmitted to the terminal through the wireless communication device or the Bluetooth device, enabling the terminal to respond to the operation instruction.
FIG. 15 is a schematic structural diagram of an intelligent terminal disclosed in an embodiment of the present invention. As shown in FIG. 15, the smart terminal 150 may include a sensor 1501, a processor 1502, a memory 1503, and a bus 1504.
The sensor 1501, the processor 1502, and the memory 1503 communicate with one another through the bus 1504.
The sensor 1501 is configured to collect gesture information of the user. The sensor 1501 may be a contact sensor, a pressure sensor, a bioelectrostatic sensor, or the like; any sensor capable of detecting the user's different mode start gesture information and gesture information can be applied to this embodiment.
The memory 1503 is configured to store a set of program instructions. The memory may be a high-speed RAM memory, or may be a non-volatile memory, for example, at least one magnetic disk memory.
The processor 1502 is configured to invoke the program instructions stored in the memory 1503 to perform the following operations:
acquiring mode start gesture information of the user, where the mode start gesture information may be detected by the sensor;
starting an input mode corresponding to the mode start gesture information, where the input modes include a keyboard input mode and a mouse input mode;
acquiring gesture information of the user in the input mode, where the gesture information includes click gesture information and/or slide gesture information;
generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information; and
responding to the operation instruction.
In this embodiment, there may be multiple modes in which the user performs input, and each input mode corresponds to one mode start gesture, so the processor 1502 can automatically recognize, according to the mode start gesture information input by the user, the input mode the user currently wants, which is convenient for the user.
The keyboard input mode here means that, in this mode, when the user clicks a certain part of a finger, a character on a known keyboard can be triggered directly; the character may be a digit, an English letter, or a symbol with a fixed function, for example, the "#" key. Of course, this requires preconfiguring the correspondence between the gesture information of the user clicking or triggering a certain part of a finger and the key positions on the known keyboard.
It should be noted that the same gesture information may obtain different processing responses in different input modes, and the same gesture information may receive a processing response in one input mode but none in another.
The processor 1502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
Because the information processing apparatus is integrated in the terminal, after the processor acquires the user's gesture information and generates the corresponding operation instruction, the operation instruction does not need to be transmitted to the portable intelligent terminal through a wireless communication module or a Bluetooth module; it only needs to be transmitted over the communication line inside the portable intelligent terminal, and the processor of the portable intelligent terminal can respond to the operation instruction normally after receiving it. In addition, when the information processing device integrates the above information processing apparatus and terminal functions, as in a portable smart terminal, the accuracy of input can also be improved.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may refer to one another. The apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, so its description is relatively simple; for relevant parts, refer to the description of the method.
It should further be noted that, in this document, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", and any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or further includes elements inherent to such a process, method, article, or device. Without further limitation, an element preceded by "includes a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may be placed in a random access memory (RAM), a memory, a read-only memory (ROM), an electrically programmable ROM, an electrically erasable programmable ROM, a register, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the technical field.
The above description of the disclosed embodiments enables a person skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to a person skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention will not be limited to the embodiments shown herein, but shall conform to the widest scope consistent with the principles and novel features disclosed herein.

Claims (23)

  1. An information processing method, comprising:
    acquiring mode start gesture information of a user;
    starting an input mode corresponding to the mode start gesture information, wherein the input modes comprise a keyboard input mode and a mouse input mode;
    acquiring gesture information of the user in the input mode, wherein the gesture information comprises click gesture information and/or slide gesture information; and
    generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
  2. The information processing method according to claim 1, wherein when the input mode is the keyboard input mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
    the acquiring gesture information of the user in the input mode comprises: acquiring, in the keyboard input mode, click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger or little finger; and
    correspondingly, the generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information comprises:
    generating, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
  3. The information processing method according to claim 1, wherein when the input mode is the mouse input mode, the acquiring gesture information of the user in the input mode comprises:
    acquiring, in the mouse input mode, slide gesture information of the user on the index finger, middle finger, ring finger and little finger and/or click gesture information of clicking a finger joint or finger phalange; and
    correspondingly, the generating, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information comprises:
    generating, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
  4. The information processing method according to claim 3, wherein the slide gesture information corresponds to movement track information of a mouse pointer, and the click gesture information corresponds to trigger information of a left or right mouse button.
  5. The information processing method according to claim 1, wherein the starting an input mode corresponding to the mode start gesture information comprises:
    starting the keyboard input mode when the mode start gesture information corresponds to a gesture of the user spreading apart the index finger, middle finger, ring finger and little finger; and
    starting the mouse input mode when the mode start gesture information corresponds to a gesture of the user holding the index finger, middle finger, ring finger and little finger together.
  6. The information processing method according to claim 1, wherein the mode start gesture information or the gesture information is detected by a sensor disposed on the user's wrist or palm.
  7. The information processing method according to claim 6, wherein the acquiring mode start gesture information of the user or gesture information of the user comprises:
    acquiring pressure values of parts of the wrist or palm that are detected by the sensor disposed on the user's wrist or palm when the user performs an input-mode start gesture action or a gesture action;
    determining, according to the pressure values, displacements of the sensor on the user's wrist or palm; and
    determining, according to the displacements, the mode start gesture information input by the user.
  8. The information processing method according to any one of claims 1 to 7, further comprising, after the generating an operation instruction corresponding to the gesture information:
    sending the operation instruction to a terminal, so that the terminal responds to the operation instruction.
  9. An information processing method, comprising:
    confirming entry into a keyboard input mode, wherein in the keyboard input mode the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
    acquiring click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger or little finger; and
    generating, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
  10. An information processing method, comprising:
    confirming entry into a mouse input mode;
    acquiring slide gesture information of the user on the index finger, middle finger, ring finger and little finger and/or click gesture information of clicking a finger joint or finger phalange; and
    generating, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
  11. The information processing method according to claim 10, wherein the slide gesture information corresponds to movement track information of a mouse pointer, and the click gesture information corresponds to trigger information of a left or right mouse button.
  12. An information processing apparatus, comprising:
    a first acquiring module, configured to acquire mode start gesture information of a user;
    a mode starting module, configured to start an input mode corresponding to the mode start gesture information, wherein the input modes comprise a keyboard input mode and a mouse input mode;
    a second acquiring module, configured to acquire gesture information of the user in the input mode, wherein the gesture information comprises click gesture information and slide gesture information; and
    an instruction generating module, configured to generate, according to a preset correspondence between gesture information and operation instructions, an operation instruction corresponding to the gesture information.
  13. The information processing apparatus according to claim 12, wherein when the input mode is the keyboard input mode, the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad, and the second acquiring module is specifically configured to:
    acquire, in the keyboard input mode, click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger or little finger.
  14. The information processing apparatus according to claim 12, wherein when the input mode is the mouse input mode, the second acquiring module is specifically configured to:
    acquire, in the mouse input mode, slide gesture information of the user on the index finger, middle finger, ring finger and little finger and/or click gesture information of clicking a finger joint or finger phalange.
  15. The information processing apparatus according to claim 14, wherein the slide gesture information corresponds to movement track information of a mouse pointer, and the click gesture information corresponds to trigger information of a left or right mouse button.
  16. The information processing apparatus according to claim 12, wherein the mode starting module comprises:
    a first mode starting module, configured to start the keyboard input mode when the mode start gesture information corresponds to a gesture of the user spreading apart the index finger, middle finger, ring finger and little finger; and
    a second mode starting module, configured to start the mouse input mode when the mode start gesture information corresponds to a gesture of the user holding the index finger, middle finger, ring finger and little finger together.
  17. The information processing apparatus according to claim 12, wherein the mode start gesture information and the gesture information are detected by a sensor disposed on the user's wrist or palm.
  18. The information processing apparatus according to claim 17, wherein the first acquiring module or the second acquiring module comprises:
    an acquiring submodule, configured to acquire pressure values of parts of the wrist or palm that are detected by the sensor disposed on the user's wrist or palm when the user performs an input-mode start gesture action or a gesture action;
    a computing module, configured to determine, according to the pressure values, displacements of parts of the user's wrist or palm; and
    a gesture determining module, configured to determine, according to the displacements, the mode start gesture information input by the user.
  19. The information processing apparatus according to any one of claims 12 to 18, further comprising:
    an instruction transmission module, configured to send the operation instruction to a terminal, so that the terminal responds to the operation instruction.
  20. An information processing apparatus, comprising:
    a first input starting module, configured to confirm that the system enters a keyboard input mode, wherein in the keyboard input mode the 12 finger joints or finger phalanges of the user's index finger, middle finger, ring finger and little finger are in one-to-one correspondence with the 12 keys of a 12-key dial pad;
    a first gesture acquiring module, configured to acquire click gesture information of the user clicking any finger joint or finger phalange of the index finger, middle finger, ring finger or little finger; and
    a first instruction generating module, configured to generate, according to a preset correspondence between click gesture information and operation instructions, an operation instruction corresponding to the click gesture information.
  21. An information processing apparatus, comprising:
    a second input starting module, configured to confirm that the system enters a mouse input mode;
    a second gesture acquiring module, configured to acquire slide gesture information of the user on the index finger, middle finger, ring finger and little finger and/or click gesture information of clicking a finger joint or finger phalange; and
    a second instruction generating module, configured to generate, according to a preset correspondence between slide gesture information and/or click gesture information and operation instructions, an operation instruction corresponding to the slide gesture information and/or click gesture information.
  22. A data processing device, comprising the information processing apparatus according to any one of claims 12 to 19; or comprising the information processing apparatus according to claim 20; or comprising the information processing apparatus according to claim 21.
  23. An intelligent terminal, comprising the information processing apparatus according to any one of claims 12 to 21.
PCT/CN2015/071424 2014-01-26 2015-01-23 Information processing method, apparatus and device WO2015110063A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP15740661.2A EP3089018B1 (en) 2014-01-26 2015-01-23 Method, apparatus, and device for information processing
BR112016017206A BR112016017206B1 (pt) 2014-01-26 2015-01-23 Information processing method, apparatus and device
RU2016134720A RU2662408C2 (ru) 2014-01-26 2015-01-23 Method, apparatus and device for information processing
US15/114,033 US9965044B2 (en) 2014-01-26 2015-01-23 Information processing method, apparatus, and device
KR1020167022600A KR101877823B1 (ko) 2014-01-26 2015-01-23 Method, apparatus, and device for information processing
JP2016548222A JP6249316B2 (ja) 2014-01-26 2015-01-23 Information processing method, apparatus, and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410038752.7 2014-01-26
CN201410038752.7A CN103793057B (zh) Information processing method, apparatus and device

Publications (1)

Publication Number Publication Date
WO2015110063A1 true WO2015110063A1 (zh) 2015-07-30

Family

ID=50668815

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/071424 WO2015110063A1 (zh) Information processing method, apparatus and device

Country Status (8)

Country Link
US (1) US9965044B2 (zh)
EP (1) EP3089018B1 (zh)
JP (1) JP6249316B2 (zh)
KR (1) KR101877823B1 (zh)
CN (1) CN103793057B (zh)
BR (1) BR112016017206B1 (zh)
RU (1) RU2662408C2 (zh)
WO (1) WO2015110063A1 (zh)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793057B (zh) * 2014-01-26 2017-02-01 Huawei Device Co., Ltd. Information processing method, apparatus and device
CN105373321A (zh) * 2014-08-13 2016-03-02 ZTE Corporation Method and apparatus for controlling a function object of a mobile terminal, and mobile terminal
CN105138136A (zh) * 2014-09-15 2015-12-09 Beijing Zhigan Sensor Technology Research Institute Co., Ltd. Gesture recognition apparatus, gesture recognition method and gesture recognition system
CN105278699A (zh) * 2014-09-29 2016-01-27 Beijing Zhigan Sensor Technology Research Institute Co., Ltd. Easy-to-wear gesture recognition apparatus
CN105204645A (zh) * 2014-10-02 2015-12-30 Beijing Zhigan Sensor Technology Research Institute Co., Ltd. Easy-to-wear gesture recognition apparatus
CN104461365A (zh) * 2014-12-11 2015-03-25 Samsung Electronics (China) R&D Center Touch control method and apparatus for a terminal
CN104750253B (zh) * 2015-03-11 2018-10-12 Qisda (Suzhou) Co., Ltd. Electronic apparatus for somatosensory input by a user
CN106073126B (zh) * 2015-04-27 2018-05-18 Chen Mingzhu LED safety backpack
US10638316B2 (en) * 2016-05-25 2020-04-28 Intel Corporation Wearable computer apparatus with same hand user authentication
CN107490370B (zh) * 2016-06-13 2020-01-10 PixArt Imaging Inc. Measurement apparatus and operating method thereof, and trajectory sensing system and trajectory sensing method thereof
CN106569610A (zh) * 2016-11-09 2017-04-19 Li Feiyang Method for implementing an input function of an electronic device through thumb-bend detection
CN107329574A (zh) * 2017-06-30 2017-11-07 Lenovo (Beijing) Co., Ltd. Input method and system for an electronic device
CN107301415B (zh) * 2017-08-08 2024-03-22 Fang Chao Gesture acquisition system
EP3667564A4 (en) * 2017-08-08 2021-04-07 Fang, Chao GESTURE ACQUISITION SYSTEM
CN107403178A (zh) * 2017-08-08 2017-11-28 Fang Chao Gesture acquisition system
CN107632716B (zh) * 2017-11-07 2024-03-08 Wang Kepan Input information processing apparatus and method for processing input information
CN208752575U (zh) * 2018-07-27 2019-04-16 Shenzhen Zhihaihe Technology Co., Ltd. Keyboard
CN209879763U (zh) * 2018-10-26 2019-12-31 Zhuhai Zhongdian Digital Technology Co., Ltd. Combined intelligent full-screen blackboard
US10902250B2 (en) * 2018-12-21 2021-01-26 Microsoft Technology Licensing, Llc Mode-changeable augmented reality interface
CN109859452A (zh) * 2019-03-11 2019-06-07 Shantou University Wearable controller and method of using the same
CN109947245A (zh) * 2019-03-12 2019-06-28 Hefei University of Technology Finger-action instruction set for operating a wristwatch-type device and method for generating the same
CN110209265A (zh) * 2019-04-04 2019-09-06 Beijing Institute of Technology Input system based on finger-touch detection
CN110162183B (zh) * 2019-05-30 2022-11-01 Nubia Technology Co., Ltd. Mid-air gesture operation method, wearable device and computer-readable storage medium
CN110780732A (zh) * 2019-09-06 2020-02-11 Beijing Institute of Technology Input system based on spatial positioning and finger clicking
CN111897477B (zh) * 2020-08-04 2022-06-17 Shanghai Transsion Information Technology Co., Ltd. Mobile terminal control method, mobile terminal and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101142616A (zh) * 2004-12-31 2008-03-12 Senseboard Co. Data input apparatus
CN201638148U (zh) * 2009-09-10 2010-11-17 Shenzhen Estar Display Technology Co., Ltd. Glove-type virtual input apparatus
US8436821B1 (en) * 2009-11-20 2013-05-07 Adobe Systems Incorporated System and method for developing and classifying touch gestures
CN103793057A (zh) * 2014-01-26 2014-05-14 Huawei Device Co., Ltd. Information processing method, apparatus and device

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7109970B1 (en) 2000-07-01 2006-09-19 Miller Stephen S Apparatus for remotely controlling computers and other electronic appliances/devices using a combination of voice commands and finger movements
US20030056278A1 (en) * 2001-09-26 2003-03-27 Lung Kuo Structure of finger keyboard
CN1482527A (zh) * 2002-09-15 2004-03-17 Liao Huayong Finger-motion input technology and device therefor
JP4379214B2 (ja) * 2004-06-10 2009-12-09 NEC Corporation Portable terminal apparatus
KR20060022984A (ko) 2004-09-08 2006-03-13 Hong Kwang-seok Keypad glove apparatus
US20090091530A1 (en) 2006-03-10 2009-04-09 Kenji Yoshida System for input to information processing device
RU2457532C2 (ru) * 2006-03-10 2012-07-27 Kenji Yoshida Input processing system for an information processing device
CN200944218Y (zh) * 2006-06-30 2007-09-05 Shandong University Finger-sleeve numeric keypad apparatus
US20080129694A1 (en) * 2006-11-30 2008-06-05 Liberty Reach Inc. Keyless user interface device
JP2008135033A (ja) 2007-11-26 2008-06-12 Olympus Corp Hand posture and motion detection apparatus
CN101226438A (zh) * 2008-01-18 2008-07-23 Yu Fei Glove-shaped computer mouse
US8421634B2 (en) 2009-12-04 2013-04-16 Microsoft Corporation Sensing mechanical energy to appropriate the body for data input
CN201780561U (zh) * 2010-07-29 2011-03-30 Chen Yuhang Finger-sleeve mouse
US20120242584A1 (en) * 2011-03-22 2012-09-27 Nokia Corporation Method and apparatus for providing sight independent activity reports responsive to a touch gesture
US9218058B2 (en) * 2011-06-16 2015-12-22 Daniel Bress Wearable digital input device for multipoint free space data collection and analysis
EP2760363A4 (en) 2011-09-29 2015-06-24 Magic Leap Inc TACTILE GLOVE FOR HUMAN COMPUTER INTERACTION
CA3051912C (en) 2012-02-24 2023-03-07 Thomas J. Moscarillo Gesture recognition devices and methods
US8743052B1 (en) * 2012-11-24 2014-06-03 Eric Jeffrey Keller Computing interface system
JP2014115688A (ja) 2012-12-06 2014-06-26 Sharp Corp Character input system and character input method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101142616A (zh) * 2004-12-31 2008-03-12 Senseboard Co. Data input apparatus
CN201638148U (zh) * 2009-09-10 2010-11-17 Shenzhen Estar Display Technology Co., Ltd. Glove-type virtual input apparatus
US8436821B1 (en) * 2009-11-20 2013-05-07 Adobe Systems Incorporated System and method for developing and classifying touch gestures
CN103793057A (zh) * 2014-01-26 2014-05-14 Huawei Device Co., Ltd. Information processing method, apparatus and device

Also Published As

Publication number Publication date
KR20160110992A (ko) 2016-09-23
RU2016134720A3 (zh) 2018-02-28
KR101877823B1 (ko) 2018-07-12
RU2662408C2 (ru) 2018-07-25
CN103793057B (zh) 2017-02-01
JP6249316B2 (ja) 2017-12-20
RU2016134720A (ru) 2018-02-28
EP3089018B1 (en) 2019-01-09
EP3089018A1 (en) 2016-11-02
BR112016017206B1 (pt) 2018-08-28
EP3089018A4 (en) 2017-01-18
CN103793057A (zh) 2014-05-14
JP2017509957A (ja) 2017-04-06
US20160342217A1 (en) 2016-11-24
US9965044B2 (en) 2018-05-08

Similar Documents

Publication Publication Date Title
WO2015110063A1 (zh) Information processing method, apparatus and device
Whitmire et al. Digitouch: Reconfigurable thumb-to-finger input and text entry on head-mounted displays
CN105824559B (zh) False-touch recognition and processing method, and electronic device
EP2805220B1 (en) Skinnable touch device grip patterns
US10282090B2 (en) Systems and methods for disambiguating intended user input at an onscreen keyboard using dual strike zones
US8816964B2 (en) Sensor-augmented, gesture-enabled keyboard and associated apparatus and computer-readable storage medium
TW201237735A (en) Event recognition
JPWO2013094371A1 (ja) Display control apparatus, display control method, and computer program
CN105159539A (zh) Touch response method and apparatus for a wearable device, and wearable device
WO2015123971A1 (zh) Input method and apparatus
US8959620B2 (en) System and method for composing an authentication password associated with an electronic device
US20240077948A1 (en) Gesture-based display interface control method and apparatus, device and storage medium
TWI659353B (zh) Electronic device and operating method thereof
TW200844818A (en) Method for browsing a user interface for an electronic device and the software thereof
RU2689430C1 (ru) System and method for controlling a touchscreen by means of two finger knuckles
WO2017113365A1 (zh) Method and terminal for responding to a gesture acting on a touchscreen
KR20150065336A (ko) Gesture recognition method for an electronic device, apparatus, and computer-readable recording medium
WO2018112803A1 (zh) Method and apparatus for touchscreen gesture recognition
US10564719B1 (en) Augmenting the functionality of user input devices using a digital glove
JP2005303870A (ja) Terminal apparatus
US20160004384A1 (en) Method of universal multi-touch input
KR20110075700A (ko) Touch interface apparatus and method using Z value
JP2016066254A (ja) Electronic apparatus provided with a touch detection device
TWI425397B (zh) Touch module and control method thereof
US9720513B2 (en) Apparatus and method for receiving a key input

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15740661

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016548222

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15114033

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2015740661

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2015740661

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20167022600

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2016134720

Country of ref document: RU

Kind code of ref document: A

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112016017206

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112016017206

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20160725