CN103324271B - Gesture-based input method and electronic device - Google Patents

Gesture-based input method and electronic device

Info

Publication number
CN103324271B
CN103324271B (application CN201210073206.8A)
Authority
CN
China
Prior art keywords
keyboard
finger
operated
user
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210073206.8A
Other languages
Chinese (zh)
Other versions
CN103324271A (en)
Inventor
谢晓辉
阳光
李志刚
杨锦平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201210073206.8A priority Critical patent/CN103324271B/en
Publication of CN103324271A publication Critical patent/CN103324271A/en
Application granted granted Critical
Publication of CN103324271B publication Critical patent/CN103324271B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Position Input By Displaying (AREA)
  • Input From Keyboards Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture-based input method and an electronic device. The method is applied to an electronic device that includes an information acquisition unit and a keyboard, and comprises: acquiring, through the information acquisition unit, a user's input gesture on the keyboard of the electronic device; when the projection point of the user's finger on the plane of the keyboard is detected to fall on a specified key, acquiring the movement trajectory of the finger within a preset area corresponding to the specified key; determining an object to be operated according to the movement trajectory; and, when the finger is detected to leave the preset area, performing an input operation on the object to be operated according to the input operation type corresponding to the specified key. The method reduces operational complexity during input and improves input efficiency.

Description

Gesture-based input method and electronic device
Technical field
The present invention relates to the field of input technology, and in particular to a gesture-based input method and an electronic device.
Background technology
Electronic devices are used ever more widely. To better support input operations on electronic devices, the number of available input modes keeps growing.
However, existing input modes are often cumbersome to operate. Taking a computer as an example, input operations such as entering characters, deleting characters, or modifying text are generally performed by tapping the keyboard. If a user finds, while entering a document, that several consecutive characters were typed incorrectly, the user must hold down the Delete key until the deletion is finished and then release it. This mode of operation is complicated, holding the key down takes time, and failing to release the Delete key promptly easily causes unintended content to be deleted. Likewise, switching between multiple documents or programs is a common input operation: if a user editing a document needs to switch to another document, the user must first select the target document manually with the mouse and then move the hand back to the keyboard area to continue typing, which is cumbersome, time-consuming, and reduces the user's input efficiency.
Summary of the invention
In view of this, the present invention provides a gesture-based input method and an electronic device, so as to reduce operational complexity during input and improve input efficiency.
To achieve the above object, the present invention provides the following technical solution: a gesture-based input method, applied to an electronic device that includes an information acquisition unit and a keyboard, the method comprising:
acquiring, through the information acquisition unit, a user's input gesture on the keyboard of the electronic device;
when the projection point of the user's finger on the plane of the keyboard is detected to fall on a specified key, acquiring the movement trajectory of the finger within a preset area corresponding to the specified key;
determining an object to be operated according to the movement trajectory;
when the finger is detected to leave the preset area, performing an input operation on the object to be operated according to the input operation type corresponding to the specified key.
In another aspect, the invention also discloses an electronic device, including:
an information acquisition unit, configured to acquire a user's input gesture on the keyboard of the electronic device;
a gesture trajectory acquisition unit, configured to acquire the movement trajectory of the user's finger within a preset area corresponding to a specified key when the projection point of the finger on the plane of the keyboard is detected to fall on the specified key;
an operation object determination unit, configured to determine an object to be operated according to the movement trajectory;
a processing unit, configured to perform an input operation on the object to be operated, according to the input operation type corresponding to the specified key, when the finger is detected to leave the preset area.
As can be seen from the above technical solution, compared with the prior art, the present disclosure provides a gesture-based input method and an electronic device. The method is applied to an electronic device that includes an information acquisition unit and a keyboard. When the projection point of the user's finger on the plane of the keyboard is detected to fall on a specified key, the method triggers acquisition of the finger's movement trajectory and determines the object to be operated from that trajectory; when the finger is detected to leave the preset area, the corresponding processing is performed on the object to be operated according to the input operation type of the specified key on the keyboard. The user can both select the object to be operated and have it processed simply by performing the corresponding gesture within the keyboard area. The operation is simple and convenient and improves input efficiency.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Apparently, the drawings described below are only embodiments of the present invention, and persons of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of an embodiment of a gesture-based input method according to the present invention;
Fig. 2 is a schematic flowchart of another embodiment of a gesture-based input method according to the present invention;
Fig. 3 is a schematic flowchart of yet another embodiment of a gesture-based input method according to the present invention;
Fig. 4 is a schematic structural diagram of an embodiment of an electronic device according to the present invention;
Fig. 5 is a schematic structural diagram of another embodiment of an electronic device according to the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
An embodiment of the invention discloses a gesture-based input method applied to an electronic device that includes an information acquisition unit and a keyboard. The method acquires the user's input gesture on the keyboard through the information acquisition unit; when the projection point of the user's finger on the plane of the keyboard is detected to fall on a specified key, it acquires the finger's movement trajectory within a preset area corresponding to the specified key; it determines the object to be operated according to the movement trajectory; and, when the finger is detected to leave the preset area, it performs an input operation on the determined object according to the input operation type corresponding to the specified key. The method reduces the complexity of input operations and allows the corresponding input operation to be performed quickly and conveniently.
The gesture-based input method of the present invention is described in detail below. Referring to Fig. 1, which shows a schematic flowchart of an embodiment of a gesture-based input method of the present invention, the method of this embodiment is applied to an electronic device that includes an information acquisition unit and a keyboard. The electronic device may be a palmtop computer, a notebook computer, a tablet computer, or any other device provided with a keyboard. The keyboard may be a physical keyboard, a touch keyboard, or a virtual keyboard produced by light projection. The method of this embodiment includes:
Step 101: acquire, through the information acquisition unit, a user's input gesture on the keyboard of the electronic device.
The information acquisition unit captures the user's input gesture on the keyboard of the electronic device. Specifically, the information acquisition unit may be an image capture unit, such as a camera provided on the electronic device, which captures the user's input gesture. The information acquisition unit may also be an electric-field sensing unit, which senses the change in the electric-field intensity corresponding to each key on the keyboard in order to determine whether the user's gesture slides across a key, or whether the user's finger enters the area corresponding to a key. Specifically, when the keyboard is a physical keyboard, a sensor that senses electric-field changes may be arranged below each key — for example a capacitor below the key with a sensor above it. When a human body approaches a key, the capacitance below that key changes, which in turn changes the electric-field intensity value obtained by the sensor; from the change in the electric-field intensity corresponding to the key, it can be determined whether the user's finger is on the key or above it.
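The electric-field sensing variant described above can be sketched as a simple threshold classifier on a per-key sensor reading. Everything here — the baseline value, the thresholds, and the `classify_key_state` helper — is an illustrative assumption, not part of the patent:

```python
# Sketch of the capacitive (electric-field) sensing described above.
# Baseline and thresholds are arbitrary illustrative units.

BASELINE = 100.0     # nominal reading with no finger near the key
TOUCH_DELTA = 40.0   # change large enough to indicate contact with the key
HOVER_DELTA = 10.0   # smaller change indicating a finger hovering above it

def classify_key_state(reading: float) -> str:
    """Classify one key's sensor reading as 'touch', 'hover', or 'idle'."""
    delta = reading - BASELINE
    if delta >= TOUCH_DELTA:
        return "touch"   # finger resting on the key
    if delta >= HOVER_DELTA:
        return "hover"   # finger above the key; its projection falls on it
    return "idle"

print(classify_key_state(150.0))  # touch
print(classify_key_state(115.0))  # hover
print(classify_key_state(101.0))  # idle
```

In both the "touch" and the "hover" states the finger's projection point falls on the key, which is the trigger condition used throughout the embodiments.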
Step 102: when the projection point of the user's finger on the plane of the keyboard is detected to fall on a specified key, acquire the movement trajectory of the finger within a preset area corresponding to the specified key.
When the information acquisition unit detects that the projection point of the user's finger on the plane of the keyboard falls on a specified key, this indicates that the user wants to perform, by gesture, the input operation corresponding to the operation type of that key; the electronic device is then triggered to acquire the movement trajectory of the finger within the preset area corresponding to the specified key.
The projection point of the user's finger may fall on the specified key in several ways. In one case, the finger rests on the specified key and is in contact with it. In another case, the finger is above the specified key: the foot of the perpendicular from the finger to the keyboard falls on the specified key, while the finger does not touch the key.
Corresponding to these cases, the finger's movement trajectory within the preset area of the specified key can likewise be acquired in several ways: either as the trajectory of a sliding operation performed on the preset area, or as the trajectory of the finger moving, within the extent of the preset area, in a plane parallel to the keyboard; in the latter case the finger does not touch the keyboard, but the projection of the finger's motion region onto the keyboard falls within the preset area.
The specified key may be one or more keys designated by the user as needed, or one or more keys preset by the system. For example, the specified key may be the Tab key, the Delete key (Backspace key), or a cursor movement key on the keyboard.
The preset area corresponding to the specified key may be the region formed by the keys in the same row as the specified key, or the region formed by the keys in the same column. For example, when the specified key is the Delete key, the designated area can be understood as the region formed by the keys in the same row as the Delete key; a sliding gesture on the keys of that row, or a translation of the finger above that row, will then be captured.
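The row-shaped preset area can be modeled as the bounding box of the key rectangles in the row containing the specified key, and the projection-point condition as a point-in-box test. The geometry below is entirely hypothetical:

```python
# Minimal sketch: the preset area of a specified key, modeled as the
# bounding box of its keyboard row. Key rectangles are (x0, y0, x1, y1)
# tuples with made-up coordinates.

def row_region(keys):
    """Bounding box (x0, y0, x1, y1) covering a row of key rectangles."""
    x0 = min(k[0] for k in keys)
    y0 = min(k[1] for k in keys)
    x1 = max(k[2] for k in keys)
    y1 = max(k[3] for k in keys)
    return (x0, y0, x1, y1)

def in_region(point, region):
    """True if the finger's projection point lies inside the preset area."""
    x, y = point
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

# A row of three keys, with a Delete key at its right end (invented layout).
top_row = [(0, 0, 10, 10), (10, 0, 20, 10), (20, 0, 30, 10)]
region = row_region(top_row)
print(in_region((25, 5), region))   # True: projection inside the row
print(in_region((25, 15), region))  # False: projection below the row
```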
Step 103: determine the object to be operated according to the movement trajectory.
Determining the object to be operated according to the movement trajectory can be understood as determining, from the finger's trajectory, which object the user selects. Specifically, while acquiring the trajectory the electronic device obtains the direction of motion and the displacement of the finger, and then determines the selected content, i.e. the object to be operated, according to the acquired direction and/or displacement. For example, during document input, the direction of motion determines the direction of cursor movement and thereby the selected object or text content.
Step 104: when the finger is detected to leave the preset area, perform an input operation on the object to be operated according to the input operation type corresponding to the specified key.
In this embodiment, the input operation on the object to be operated is triggered when the finger is detected to leave the preset area. Of course, the input operation on the determined object may also be triggered in other ways, for example by directly tapping the designated area with the finger after the object has been determined.
The input operation performed on the object depends on the input operation type corresponding to the specified key; for example, if the specified key is the Delete key, a deletion is performed on the object to be operated.
With the gesture-based input method of this embodiment, while the user is entering a document or editing in another editing window with the keyboard, whenever the input operation of a specified key is needed, the projection point of the user's finger only has to fall on that key. When the electronic device detects that the projection point falls on the specified key, it starts monitoring the user's input motion, acquires the gesture motion — the finger's trajectory within the area of the specified key — determines the object to be operated from that trajectory, and performs the corresponding input operation on it according to the input operation type of the specified key. Input operations are thus completed from the user's gesture alone, which simplifies input and reduces its complexity.
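The flow of steps 101–104 can be condensed into a small state machine: tracking starts when the projection falls on the specified key, the trajectory accumulates while the finger stays in the preset area, and the key's operation fires when the finger leaves. The class below and its left/right decision rule are a sketch under assumed conventions, not the patent's implementation:

```python
# State-machine sketch of the four steps above. Names and the simple
# leftward/rightward rule for choosing the object are assumptions.

from dataclasses import dataclass, field

@dataclass
class GestureInput:
    specified_key: str
    tracking: bool = False
    trajectory: list = field(default_factory=list)
    performed: list = field(default_factory=list)

    def on_projection(self, key, point):
        # Step 102: projection falls on the specified key -> start tracking.
        if key == self.specified_key and not self.tracking:
            self.tracking = True
            self.trajectory = [point]
        elif self.tracking:
            self.trajectory.append(point)

    def on_leave_region(self):
        # Step 104: finger leaves the preset area -> perform the operation.
        if self.tracking:
            dx = self.trajectory[-1][0] - self.trajectory[0][0]
            # Step 103: direction of motion picks the object to be operated.
            target = "left-of-cursor" if dx < 0 else "right-of-cursor"
            self.performed.append((self.specified_key, target))
            self.tracking = False

g = GestureInput("Delete")
g.on_projection("Delete", (30, 5))   # projection lands on the Delete key
g.on_projection("Delete", (20, 5))   # finger slides leftward
g.on_leave_region()                  # finger leaves the preset area
print(g.performed)  # [('Delete', 'left-of-cursor')]
```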
To describe the gesture-based input method of the invention more clearly, the following description takes the information acquisition unit as an image capture unit and a deletion operation as an example. Referring to Fig. 2, which shows a schematic flowchart of another embodiment of a gesture-based input method of the present invention, the method of this embodiment is applied to an electronic device with an image capture unit and a keyboard. The electronic device may be a notebook computer, a tablet computer, a mobile phone, and so on, and its keyboard may be a physical keyboard, a touch keyboard, or a virtual keyboard. The method of this embodiment includes:
Step 201: capture, through the image capture unit, the user's input gesture on the keyboard.
The image capture unit may be a camera device, i.e. a camera, on the electronic device.
In this embodiment the image capture unit only needs to capture the user's sliding gestures on the keyboard, or gestures made in a plane parallel to the keyboard (with the user not touching the keyboard).
For example, when the electronic device is a notebook computer, the camera on the notebook can capture the user's gesture motion over the keyboard area once the lid is open. As another example, when the electronic device is a mobile phone or a tablet computer, the device projects a virtual keyboard outward and can capture the user's gestures over the projected virtual keyboard area, such as tapping a virtual key, or translating or sliding the finger above a row of virtual keys.
Step 202: when the projection point of the user's finger on the plane of the keyboard is captured to fall on the Delete key, track the movement trajectory of the finger within the preset area corresponding to the Delete key.
This embodiment takes the Delete key as the specified key. Therefore, when the image capture unit detects that the finger touches the Delete key, or that the finger has moved directly above the Delete key so that its projection point on the keyboard plane falls on the Delete key, tracking of the finger's motion is triggered. Specifically, this may mean tracking the sliding touch trajectory of the finger over the region formed by the keys in the same row as the Delete key — for example, the user sliding leftward along that row starting from the Delete key. Alternatively, when the projection point is detected to fall on the Delete key, the translation of the finger above the row of keys containing the Delete key may be tracked; in that case the finger does not touch the keyboard, but its motion is parallel to that row of keys.
Step 203: determine the object to be operated according to the direction of motion and/or displacement of the movement trajectory.
Step 204: when the finger is captured to leave the preset area corresponding to the Delete key, trigger the operation of deleting the object to be operated.
After tracking of the motion starting from the projection point is triggered, the electronic device determines the object to be operated from the direction of motion or the displacement of the trajectory; the direction and the displacement may also be used together. When the finger is captured to leave the preset area corresponding to the Delete key, the object determined in step 203 is deleted.
When determining the object to be operated, the cursor displacement may be derived from the finger displacement in order to determine the selected content. For example, during document input, the user's finger moves onto the Delete key and then slides leftward along the row of keys containing the Delete key (in the same direction as the arrow marked on the key). The electronic device tracks the motion, obtains the sliding displacement of the finger, determines from it how far the cursor moves to the left, and takes the characters the cursor passes over as the object to be deleted.
Of course, the object to be deleted may also be determined from the direction of motion alone. If, while the user is entering a document, the projection point of the finger is captured on the Delete key, tracking of the finger motion is triggered: when the direction of motion is leftward, the content on the same line as the cursor and to the left of the cursor is taken as the object to be operated; when the direction of motion is rightward, the characters on the same line as the cursor and to the right of the cursor are taken as the object to be deleted.
Optionally, the object to be operated may be determined from the direction of motion and the displacement of the finger together. The trajectory can be tracked in real time, and the object to be operated updated in real time according to the direction and displacement.
Specifically, when the projection point of the user's finger on the keyboard plane is captured on the Delete key, the first position of the cursor at that moment is determined. The finger motion is then tracked: the direction of cursor movement is determined from the direction of the finger's motion, and the cursor displacement corresponding to the finger's displacement is determined according to a preset correspondence between displacement and cursor displacement. The cursor is then moved by that displacement, along the determined direction, to a second position; the content between the first position and the second position is taken as the object to be operated; and when the finger is detected to leave the preset area corresponding to the Delete key, the deletion of the object is performed.
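This displacement mapping can be sketched in a few lines, assuming a made-up ratio of five pixels of finger travel per character; the patent does not specify the preset correspondence, so both the ratio and the sample text are illustrative:

```python
# Sketch of the preset correspondence between finger displacement and
# cursor displacement: the characters between the first and second cursor
# positions become the object to be deleted. The ratio is an assumption.

PIXELS_PER_CHAR = 5  # hypothetical preset correspondence

def select_for_deletion(text, cursor, finger_dx):
    """Return (second_position, selected_text) for a horizontal slide.

    cursor is the first position; a negative finger_dx is a leftward slide.
    """
    chars = round(abs(finger_dx) / PIXELS_PER_CHAR)
    if finger_dx < 0:                        # slide left: select left of cursor
        second = max(0, cursor - chars)
        return second, text[second:cursor]
    second = min(len(text), cursor + chars)  # slide right: select right of cursor
    return second, text[cursor:second]

text = "moonlight flows quietly over the leaves"
pos, selected = select_for_deletion(text, len(text), -50)  # 10 chars left
print(selected)     # 'the leaves'
print(text[:pos])   # remaining text after the deletion fires
```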
Here, the displacement of the finger is the distance from the position of the finger at the moment its projection point was detected to fall on the Delete key to its position at the current moment.
Note that the first position of the cursor is the position of the cursor at the moment the projection point of the finger is detected to fall on the Delete key, while the second position is the position the cursor is moved to following the finger's trajectory; as the finger moves, the second position may change as well.
So that the user can see the object to be operated at a glance, the object may also be set to a selected state while it is being determined. For example, the text to be deleted may be shown in a selected state, similar to the way a passage of text looks when selected with the mouse.
To describe the solution of this embodiment clearly, a specific example follows, with entering a passage in a document as the scenario. Suppose the user has typed a passage such as "... the moonlight, like flowing water, quietly pours over these leaves and flowers|", and the cursor sits at the end of the passage, after "flowers". The user now wants to delete "these leaves and flowers". The user's finger slides onto the Delete key (it may also simply hover above the Delete key without touching it) and then slides leftward along the row of keys containing the Delete key. The electronic device detects the sliding gesture, and as the finger moves, the cursor moves leftward so that a growing portion of text enters the selected state. When the cursor reaches the position before "these", so that "these leaves and flowers" is selected, the user lifts the finger away from the row of keys containing the Delete key, which triggers the electronic device to delete the object to be operated, "these leaves and flowers".
If, while sliding leftward from the Delete key, the user slides too fast, so that the cursor moves to before "pours" and "pours over these leaves and flowers" enters the selected state, the user can slide back in the opposite direction, i.e. rightward along the same row of keys. From the sliding direction and distance, the electronic device moves the cursor rightward by the corresponding amount; when the user sees the cursor positioned before "these", the sliding stops. At that point the second position of the cursor is before "these" and the selected content is "these leaves and flowers"; the user then lifts the finger away from the row of keys containing the Delete key, triggering the electronic device to delete the object. The outcome is the same as selecting the object with the mouse and then pressing the Delete key, but with the method of this embodiment a user entering text — especially when many characters must be deleted — avoids the cumbersome sequence of moving the hand off the keyboard, dragging the mouse to select the object, and then pressing the Delete key; the deletion is completed by a gesture performed directly in the keyboard area, which improves input speed.
This embodiment has been described with the capture of the user's finger motion as the example; however, those skilled in the art will appreciate that it applies equally when other input bodies are used, such as a dedicated stylus or touch pen. The process of determining the object to be operated, and of triggering the operation on it, from the motion of another input body is similar to this embodiment and is not repeated here.
The method of this embodiment combines vision-based gesture input with keyboard input. When the projection point of the user's finger on the plane of the keyboard falls on a specified key, tracking of the finger motion is triggered, and the object to be operated is determined quickly and conveniently from the direction and distance of that motion. When the finger is detected to leave the preset area, the corresponding processing is performed on the object according to the input operation type of the specified key on the keyboard. The user can both select the object to be operated and have it processed simply by making the corresponding gesture within the keyboard area; the operation is simple and convenient and improves input efficiency.
The specified key may also be another key, with different keys triggering different operations; the processes of determining the object to be operated and performing the corresponding input operation on it are broadly similar. The following description takes the Tab key as the specified key. Referring to Fig. 3, which shows a schematic flowchart of yet another embodiment of a gesture-based input method of the present invention, the method of this embodiment is applied to an electronic device with an information acquisition unit and a keyboard; the keyboard may be a physical keyboard, a touch-screen keyboard, or a virtual keyboard. The method of this embodiment includes:
Step 301: acquire, through the information acquisition unit, the user's input gesture on the keyboard of the electronic device.
Step 302: when the projection point of the user's finger on the plane of the keyboard is detected to fall on the Tab key, acquire the movement trajectory of the finger within the preset area corresponding to the Tab key, the trajectory including a direction of motion and a displacement.
Here, the preset area corresponding to the Tab key may be the region formed by the Tab key together with the keys in the same row, or the same column, as the Tab key.
Steps 301 and 302 are similar to the corresponding steps in the two embodiments above and are not repeated here.
Step 303: determine the first operation object currently chosen by the selection icon in the display interface.
This embodiment applies to switching between operation objects, for example switching the operation object from the current program to the next program, or switching among multiple documents and windows shown on the display interface.
To accomplish switching among multiple operation objects, when the projection point of the user's finger on the keyboard plane is detected on the Tab key, tracking of the user's motion within the preset area corresponding to the Tab key is triggered, and the first operation object chosen by the selection icon in the display interface at that moment is determined.
Step 304: take the distance between the finger's current position and the projection point as the displacement, and determine the first displacement corresponding to that amount according to a preset correspondence between displacement and movement position.
Step 305: move the selection icon by the first displacement along the direction of motion so that it chooses a second operation object, and take the second operation object as the object to be operated.
The direction and distance of the selection icon's movement are determined from the direction of motion and the displacement, and the selection icon is changed from choosing the first operation object to choosing the second operation object. For example, if the first operation object is Document 1, then after the selection icon is moved according to the user's direction of motion and displacement, Document 2 is chosen by the icon, i.e. the second operation object is Document 2.
Step 306: When the finger is detected to leave the preset area corresponding to the Tab key, switch from the first operation object to the second operation object.
That is, the switching operation is triggered when the finger is detected to leave the preset area corresponding to the Tab key. For example, the document displayed on the top layer of the display interface is switched from document 1 to document 2.
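The flow of steps 303 through 306 can be illustrated with a minimal sketch (in Python; the displacement-to-distance mapping, the wrap-around behavior, and all names are illustrative assumptions, not the patent's implementation):

```python
# Hypothetical sketch of the Tab-key switching flow (steps 303-306).
# The mm_per_slot threshold and the object list are assumptions.

def slots_moved(displacement_mm, mm_per_slot=10.0):
    """Map the finger's displacement to how many positions the selection icon moves."""
    return int(displacement_mm // mm_per_slot)

def switch_with_tab_gesture(objects, current_index, displacement_mm, direction):
    """Return the index selected once the finger leaves the Tab preset area.

    direction: +1 for motion along the row away from the Tab key, -1 for the reverse.
    """
    step = slots_moved(displacement_mm) * direction
    # Wrap around the list of open documents/windows, as Alt+Tab switching does.
    return (current_index + step) % len(objects)

docs = ["document 1", "document 2", "document 3"]
first = 0                                                # step 303: initially selected object
second = switch_with_tab_gesture(docs, first, 12.0, +1)  # steps 304-305: icon moves to docs[1]
# step 306: when the finger leaves the preset area, docs[second] is brought to the top
```

Compared with the Tab+Alt key combination, the gesture needs only one starting key, which is the simplification the embodiment claims.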
The method of this embodiment is described by taking as an example detecting a corresponding motion of the user, starting from the Tab key, to trigger switching of the object to be operated. The operation realized by this embodiment is similar to the switching performed with the Tab and Alt key combination, but the method of this embodiment avoids the key combination, is simpler to operate, and improves the switching speed.
In addition, this embodiment is described with the example that, when the user's projection point on the keyboard is detected on the Tab key, tracking of the user's gesture motion is triggered, the switching object is determined according to the user's motion trajectory, and the switching operation is then performed. However, the present invention is not limited thereto: since a single press of the Tab key triggers a tabbing operation (jumping a length of 8 spaces), when the user's projection point on the keyboard is detected on the Tab key, the user's motion trajectory in the designated area corresponding to the Tab key may instead be obtained, the number of tab stops determined according to the motion direction and/or movement displacement of the trajectory, and the Tab operation then performed.
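This alternative tabbing behavior can be sketched as follows (a hypothetical Python sketch; the `mm_per_stop` threshold is an illustrative assumption, while the 8-space tab width comes from the description above):

```python
# Hypothetical sketch of the alternative Tab behavior: displacement along the
# Tab row maps to a number of 8-space tab stops instead of an object switch.

TAB_WIDTH = 8  # spaces per tab stop, as stated in the description

def tab_stops_for_gesture(displacement_mm, mm_per_stop=15.0):
    """Number of tab stops for a given finger displacement.

    mm_per_stop is an illustrative threshold, not taken from the patent;
    even a very short gesture yields at least one tab stop, matching a
    single press of the Tab key.
    """
    return max(1, int(displacement_mm // mm_per_stop))

def apply_tabbing(text, displacement_mm):
    """Prepend the spaces corresponding to the gesture's tab stops."""
    return " " * (TAB_WIDTH * tab_stops_for_gesture(displacement_mm)) + text
```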
The descriptions in the several embodiments above mainly take the information acquisition unit being an image capture unit as an example, but the information acquisition unit of the present invention is not limited to an image capture unit; it may also be an electric field sensing unit. When the information acquisition unit is an electric field sensing unit, the keyboard of the electronic device may be a physical keyboard or a keyboard with a touch screen. An electric field sensing unit is provided below the keyboard of the electronic device, with a capacitor arranged below each key; the electric field sensing unit senses the change of the electric field below a key to determine whether the user's finger touches the key or approaches the key. The detailed process is similar to the operating processes of the embodiments above and is not repeated here.
Corresponding to the method of the present invention, the present invention also provides an electronic device. Referring to Fig. 4, which shows a schematic structural diagram of one embodiment of an electronic device of the present invention, the electronic device of this embodiment may be a device with a gesture input system, such as a notebook computer or a tablet computer, and the keyboard of the electronic device may be a physical keyboard, a keyboard with a touch screen, or a virtual keyboard. The electronic device of this embodiment includes: an information acquisition unit 410, a gesture trajectory acquisition unit 420, an operation object determination unit 430, and a processing unit 440.
The information acquisition unit 410 is configured to acquire the user's input gesture on the keyboard of the electronic device.
The gesture trajectory acquisition unit 420 is configured to, when the projection point of the user's finger on the plane of the keyboard is detected to be located on a specified key, acquire the motion trajectory of the finger in the preset area corresponding to the specified key.
There are several situations in which the projection point of the user's finger is located on the specified key. In one situation, the user's finger is on the specified key and in contact with it. In another situation, the user's finger is above the specified key: the vertical line from the finger to the keyboard has its foot on the specified key, and the user's finger is not in contact with the specified key.
Corresponding to these situations, acquiring the motion trajectory of the finger in the preset area corresponding to the specified key also takes several forms. Specifically, it may be acquiring the motion trajectory of a sliding operation performed by the finger on the preset area corresponding to the specified key, or acquiring the motion trajectory of the finger in a plane parallel to the plane of the keyboard and within the range of the preset area; in the latter case the finger is not in contact with the keyboard, but the projection of the finger's motion region on the keyboard belongs to the preset area.
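The two projection-point situations can be sketched geometrically (a hypothetical Python sketch; the coordinate convention and the contact/hover thresholds are illustrative assumptions):

```python
# Hypothetical sketch of the two "projection point on the key" situations:
# the finger either touches the key, or hovers above it so that the foot of
# the vertical line from fingertip to keyboard plane lies inside the key.

def projection_on_key(finger, key_rect, hover_limit_mm=30.0):
    """Classify a fingertip relative to one key.

    finger: (x, y, z), with z the height above the keyboard plane in mm.
    key_rect: (x_min, y_min, x_max, y_max) of the key on that plane.
    Returns 'touch', 'hover', or None. Thresholds are assumptions.
    """
    x, y, z = finger
    x0, y0, x1, y1 = key_rect
    inside = x0 <= x <= x1 and y0 <= y <= y1  # foot of the vertical line on the key
    if not inside:
        return None
    if z <= 0.5:             # effectively in contact with the key
        return "touch"
    if z <= hover_limit_mm:  # close above the key: trajectory is still tracked
        return "hover"
    return None
```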
The operation object determination unit 430 is configured to determine the object to be operated according to the motion trajectory.
The processing unit 440 is configured to, when the finger is detected to leave the preset area, perform an input operation on the object to be operated according to the input operation type corresponding to the specified key.
The preset area corresponding to the specified key can be set as needed. Correspondingly, the gesture trajectory acquisition unit 420 is specifically configured to: when the projection point of the user's finger on the plane of the keyboard is detected to be located on the specified key, acquire the motion trajectory of the finger on the region formed by the keys in the keyboard that are in the same row or the same column as the specified key.
The gesture trajectory acquisition unit 420 may acquire the user's gesture in several ways; it is specifically configured to: when the projection point of the user's finger on the plane of the keyboard is detected to be located on the specified key, acquire the motion trajectory of the finger sliding on the preset area; or, acquire the motion trajectory of the finger in a plane parallel to the keyboard and within the range of the preset area.
Further, the operation object determination unit is specifically configured to determine the object to be operated according to the motion direction and/or the movement displacement.
In the electronic device of this embodiment, when the information acquisition unit detects the user's input gesture on the keyboard and the gesture trajectory acquisition unit detects that the projection point of the user's finger falls on the specified key, monitoring of the user's input motion is triggered and the user's gesture motion is acquired; the operation object determination unit determines the object to be operated according to the motion trajectory of the user's finger in the area of the specified key, and the processing unit performs the corresponding input operation on the object to be operated according to the input operation type corresponding to the specified key. An input operation is thus completed according to the user's input gesture, which simplifies input operations and reduces input complexity.
The information acquisition unit may acquire the user's input gesture on the keyboard in several ways. Referring to Fig. 5, which shows a schematic structural diagram of another embodiment of an electronic device of the present invention, the electronic device of this embodiment differs from the previous embodiment as follows:
The information acquisition unit 410 in this embodiment may include: an image capture unit 411, configured to capture the user's input gesture on the keyboard.
Of course, the information acquisition unit may also include an electric field sensing unit, configured to sense the electric field intensity change value corresponding to each key on the keyboard and to acquire the user's input gesture according to the change value of the electric field intensity.
This embodiment takes the specified key being the Delete key as an example; correspondingly, the operation object determination unit 430 includes:
a cursor position determination unit 431, configured to determine the first position where the cursor is currently located;
a first distance determination unit 432, configured to determine the cursor movement direction according to the motion direction, take the distance between the finger's current position and the projection point as the movement displacement, and determine the cursor movement distance according to a preset correspondence between displacement and cursor movement distance;
a cursor movement unit 433, configured to move the cursor to a second position along the cursor movement direction according to the cursor movement distance;
a first object determination unit 434, configured to take the operation object between the first position and the second position as the object to be operated.
Correspondingly, when the specified key is the Delete key, the object deletion unit 440 is configured to delete the object to be operated when the operation object determination unit determines the object to be operated.
In order that the user can know the determined object to be operated more clearly and intuitively, this embodiment also includes a state selection unit 450, configured to set the determined object to be operated to a selected state.
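The Delete-key embodiment (units 431 through 434, plus the state selection unit 450) can be sketched as follows (a hypothetical Python sketch; the displacement-to-characters mapping is an illustrative assumption):

```python
# Hypothetical sketch of the Delete-key embodiment: the finger's displacement
# moves the cursor, and the text between the first and second cursor positions
# becomes the selected object to be deleted.

def cursor_distance(displacement_mm, mm_per_char=3.0):
    """Preset correspondence between finger displacement and cursor travel (unit 432)."""
    return int(displacement_mm // mm_per_char)

def select_for_delete(text, first_pos, displacement_mm, direction):
    """Units 433-434 and 450: return (selection, second_pos).

    direction: -1 moves the cursor left (toward the start), +1 moves it right.
    The returned selection is the span the state unit would highlight.
    """
    travel = direction * cursor_distance(displacement_mm)
    second = max(0, min(len(text), first_pos + travel))
    lo, hi = sorted((first_pos, second))
    return text[lo:hi], second

def delete_selected(text, first_pos, displacement_mm, direction):
    """Object deletion unit 440: remove the selection once the finger leaves the area."""
    _, second = select_for_delete(text, first_pos, displacement_mm, direction)
    lo, hi = sorted((first_pos, second))
    return text[:lo] + text[hi:]
```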
The specified key in the present invention may also be another key; for example, the specified key may be the Tab key on the keyboard. When the specified key is the Tab key, the operation object determination unit includes:
an icon determination unit, configured to determine the first operation object currently selected by the selection icon;
a first distance determination unit, configured to take the distance between the finger's current position and the projection point as the movement displacement, and determine the first movement distance corresponding to the movement displacement according to a preset correspondence between displacement and movement position;
a second object determination unit, configured to move the selection icon by the first movement distance along the motion direction, so that the selection icon selects a second operation object, and take the second operation object as the object to be operated.
Of course, the corresponding operations performed by the operation object determination unit may differ depending on the specified key.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may refer to one another. As for the device disclosed in the embodiments, since it corresponds to the method disclosed in the embodiments, its description is relatively simple; for relevant parts, refer to the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to realize or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be realized in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (17)

1. A gesture-based input method, applied to an electronic device, the electronic device comprising an information acquisition unit and a keyboard, characterized by comprising:
acquiring, by the information acquisition unit, a user's input gesture on the keyboard of the electronic device;
when a projection point of the user's finger on the plane of the keyboard is detected to be located on a specified key, triggering acquisition of a motion trajectory of the finger in a preset area corresponding to the specified key, the preset area being a region formed by keys in the keyboard that are in the same row as the specified key, or a region formed by keys in the keyboard that are in the same column as the specified key;
determining, according to the motion trajectory, an object to be operated from among operation objects shown on a display interface, wherein the operation object is distinct from the keyboard;
when the finger is detected to leave the preset area, performing an input operation on the object to be operated according to an input operation type corresponding to the specified key.
2. The method according to claim 1, characterized in that the keyboard comprises:
a physical keyboard or a virtual keyboard.
3. The method according to claim 1, characterized in that the information acquisition unit comprises: an image capture unit;
acquiring the user's input gesture on the keyboard of the electronic device by the information acquisition unit comprises:
capturing the user's input gesture on the keyboard by the image capture unit.
4. The method according to any one of claims 1 to 3, characterized in that acquiring the motion trajectory of the finger in the preset area corresponding to the specified key comprises one or more of the following:
acquiring a motion trajectory of the finger sliding on the preset area;
or, acquiring a motion trajectory of the finger in a plane parallel to the keyboard and within the range of the preset area.
5. The method according to claim 1, characterized in that the motion trajectory comprises a motion direction and/or a movement displacement of the finger;
determining the object to be operated according to the motion trajectory comprises:
determining the object to be operated according to the motion direction and/or the movement displacement.
6. The method according to claim 5, characterized in that, when the specified key is a Delete key, determining the object to be operated according to the motion direction and/or the movement displacement of the finger comprises:
determining a first position where the cursor is currently located;
determining a cursor movement direction according to the motion direction, taking the distance between the finger's current position and the projection point as the movement displacement, and determining a cursor movement distance according to a preset correspondence between displacement and cursor movement distance;
moving the cursor to a second position along the cursor movement direction according to the cursor movement distance;
taking an operation object between the first position and the second position as the object to be operated.
7. The method according to claim 6, characterized in that determining the object to be operated further comprises:
setting the determined object to be operated to a selected state.
8. The method according to any one of claims 1, 6 or 7, characterized in that, when the specified key is a Delete key, performing the input operation on the object to be operated according to the input operation type corresponding to the specified key comprises:
deleting the object to be operated.
9. The method according to claim 5, characterized in that, when the specified key is a Tab key on the keyboard, determining the object to be operated according to the motion direction and/or the movement displacement of the finger comprises:
determining a first operation object currently selected by a selection icon;
taking the distance between the finger's current position and the projection point as the movement displacement, and determining a first movement distance corresponding to the movement displacement according to a preset correspondence between displacement and movement position;
moving the selection icon by the first movement distance along the motion direction, so that the selection icon selects a second operation object, and taking the second operation object as the object to be operated.
10. An electronic device, characterized by comprising:
an information acquisition unit, configured to acquire a user's input gesture on a keyboard of the electronic device;
a gesture trajectory acquisition unit, configured to, when a projection point of the user's finger on the plane of the keyboard is detected to be located on a specified key, trigger acquisition of a motion trajectory of the finger in a preset area corresponding to the specified key, the preset area being a region formed by keys in the keyboard that are in the same row as the specified key, or a region formed by keys in the keyboard that are in the same column as the specified key;
an operation object determination unit, configured to determine, according to the motion trajectory, an object to be operated from among operation objects shown on a display interface, wherein the operation object is distinct from the keyboard;
a processing unit, configured to, when the finger is detected to leave the preset area, perform an input operation on the object to be operated according to an input operation type corresponding to the specified key.
11. The electronic device according to claim 10, characterized in that the information acquisition unit comprises: an image capture unit, configured to capture the user's input gesture on the keyboard.
12. The electronic device according to claim 10 or 11, characterized in that the gesture trajectory acquisition unit is specifically configured to: when the projection point of the user's finger on the plane of the keyboard is detected to be located on the specified key, acquire a motion trajectory of the finger sliding on the preset area; or, acquire a motion trajectory of the finger in a plane parallel to the keyboard and within the range of the preset area.
13. The electronic device according to claim 10, characterized in that the operation object determination unit is specifically configured to determine the object to be operated according to a motion direction and/or a movement displacement.
14. The electronic device according to claim 13, characterized in that, when the specified key is a Delete key, the operation object determination unit comprises:
a cursor position determination unit, configured to determine a first position where the cursor is currently located;
a first distance determination unit, configured to determine a cursor movement direction according to the motion direction, take the distance between the finger's current position and the projection point as the movement displacement, and determine a cursor movement distance according to a preset correspondence between displacement and cursor movement distance;
a cursor movement unit, configured to move the cursor to a second position along the cursor movement direction according to the cursor movement distance;
a first object determination unit, configured to take an operation object between the first position and the second position as the object to be operated.
15. The electronic device according to claim 14, characterized by further comprising:
a state selection unit, configured to set the determined object to be operated to a selected state.
16. The electronic device according to any one of claims 10, 14 or 15, characterized in that, when the specified key is a Delete key, the processing unit comprises:
an object deletion unit, configured to delete the object to be operated when the operation object determination unit determines the object to be operated.
17. The electronic device according to claim 13, characterized in that, when the specified key is a Tab key on the keyboard, the operation object determination unit comprises:
an icon determination unit, configured to determine a first operation object currently selected by a selection icon;
a first distance determination unit, configured to take the distance between the finger's current position and the projection point as the movement displacement, and determine a first movement distance corresponding to the movement displacement according to a preset correspondence between displacement and movement position;
a second object determination unit, configured to move the selection icon by the first movement distance along the motion direction, so that the selection icon selects a second operation object, and take the second operation object as the object to be operated.
CN201210073206.8A 2012-03-19 2012-03-19 A kind of input method and electronic equipment based on gesture Active CN103324271B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210073206.8A CN103324271B (en) 2012-03-19 2012-03-19 A kind of input method and electronic equipment based on gesture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210073206.8A CN103324271B (en) 2012-03-19 2012-03-19 A kind of input method and electronic equipment based on gesture

Publications (2)

Publication Number Publication Date
CN103324271A CN103324271A (en) 2013-09-25
CN103324271B true CN103324271B (en) 2017-07-25

Family

ID=49193072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210073206.8A Active CN103324271B (en) 2012-03-19 2012-03-19 A kind of input method and electronic equipment based on gesture

Country Status (1)

Country Link
CN (1) CN103324271B (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103645829A (en) * 2013-12-18 2014-03-19 天津三星通信技术研究有限公司 Character deletion method and portable terminal utilizing same
CN104866070A (en) * 2014-02-20 2015-08-26 联想(北京)有限公司 Method for information processing and electronic equipment
WO2015123835A1 (en) * 2014-02-20 2015-08-27 Nokia Technologies Oy Cursor placement
US9612664B2 (en) * 2014-12-01 2017-04-04 Logitech Europe S.A. Keyboard with touch sensitive element
CN104503573A (en) * 2014-12-16 2015-04-08 苏州佳世达电通有限公司 Gesture operating method and gesture operating device
CN104699805A (en) * 2015-03-20 2015-06-10 努比亚技术有限公司 Music search method and music search device
CN105116770A (en) * 2015-07-13 2015-12-02 小米科技有限责任公司 Control method and device of intelligent socket
CN105183227A (en) * 2015-09-09 2015-12-23 魅族科技(中国)有限公司 Character deletion method and device and electronic equipment
CN106951165A (en) * 2017-03-30 2017-07-14 维沃移动通信有限公司 A kind of word editing method and mobile terminal
CN109062871B (en) * 2018-07-03 2022-05-13 北京明略软件系统有限公司 Text labeling method and device and computer readable storage medium
CN111078102B (en) * 2019-06-09 2021-07-23 广东小天才科技有限公司 Method for determining point reading area through projection and terminal equipment
CN112486366A (en) * 2020-11-27 2021-03-12 维沃移动通信有限公司 Control display method and device and electronic equipment
CN113253908B (en) * 2021-06-22 2023-04-25 腾讯科技(深圳)有限公司 Key function execution method, device, equipment and storage medium
CN117193540B (en) * 2023-11-06 2024-03-12 南方科技大学 Control method and system of virtual keyboard

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101393506A (en) * 2007-09-13 2009-03-25 苹果公司 Input methods for device having multi-language environment
CN101694650A (en) * 2009-10-10 2010-04-14 宇龙计算机通信科技(深圳)有限公司 Method, device and mobile terminal for copying and pasting data
CN101963887A (en) * 2010-09-26 2011-02-02 百度在线网络技术(北京)有限公司 Method and equipment for changing display object in mobile equipment based on sliding operation
CN101996049A (en) * 2010-11-24 2011-03-30 广州市久邦数码科技有限公司 Virtual keyboard input method applied to embedded touch screen equipment
CN102075713A (en) * 2011-01-10 2011-05-25 深圳创维-Rgb电子有限公司 Television character input method and television using same and remote controller

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101393506A (en) * 2007-09-13 2009-03-25 苹果公司 Input methods for device having multi-language environment
CN101694650A (en) * 2009-10-10 2010-04-14 宇龙计算机通信科技(深圳)有限公司 Method, device and mobile terminal for copying and pasting data
CN101963887A (en) * 2010-09-26 2011-02-02 百度在线网络技术(北京)有限公司 Method and equipment for changing display object in mobile equipment based on sliding operation
CN101996049A (en) * 2010-11-24 2011-03-30 广州市久邦数码科技有限公司 Virtual keyboard input method applied to embedded touch screen equipment
CN102075713A (en) * 2011-01-10 2011-05-25 深圳创维-Rgb电子有限公司 Television character input method and television using same and remote controller

Also Published As

Publication number Publication date
CN103324271A (en) 2013-09-25

Similar Documents

Publication Publication Date Title
CN103324271B (en) A kind of input method and electronic equipment based on gesture
CN102830819B (en) The method and apparatus of analog mouse input
KR101861395B1 (en) Detecting gestures involving intentional movement of a computing device
US9292111B2 (en) Gesturing with a multipoint sensing device
CN103034427B (en) A kind of touch-screen page turning method, device and a kind of touch panel device
CN103197885B (en) The control method and its mobile terminal of mobile terminal
CN102902481B (en) Terminal and terminal operation method
EP2485138A1 (en) Gesturing with a multipoint sensing device
CN105117056B (en) A kind of method and apparatus of operation touch-screen
CN103246382B (en) Control method and electronic equipment
CN103218044B (en) A kind of touching device of physically based deformation feedback and processing method of touch thereof
CN105144033A (en) Extending interactive inputs via sensor fusion
CN104965669A (en) Physical button touch method and apparatus and mobile terminal
WO2014075612A1 (en) Man-machine interaction method and interface
EP3477457B1 (en) Touchpad-based rapid information input and interaction method and input and interaction system
WO2011088793A1 (en) Terminal and input method thereof
CN105589636A (en) Method and mobile terminal used for realizing virtual pointer control on touch screen
CN107450820A (en) Interface control method and mobile terminal
CN103324430B (en) The objects operating method and device of many fingers
US20100289751A1 (en) Operation method for a trackpad equipped with pushbutton function
CN104423657B (en) The method and electronic equipment of information processing
CN104714726A (en) Control device capable of operating touch screen of mobile terminal by one hand, and control method of control device
CN105955570A (en) System and method for controlling cursor moving
CN103870144B (en) Method for controlling electronic device and electronic device
CN103885713B (en) A kind of information processing method and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant