CN104731439A - Gesture packaging and task executing method and device - Google Patents


Info

Publication number: CN104731439A
Application number: CN201310705689.3A
Authority: CN (China)
Prior art keywords: gesture, task, corresponding relation, selection, input
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 覃迪
Current Assignee: Hisense Mobile Communications Technology Co Ltd
Original Assignee: Hisense Mobile Communications Technology Co Ltd
Application filed by Hisense Mobile Communications Technology Co Ltd
Priority: CN201310705689.3A
Publication: CN104731439A

Abstract

An embodiment of the invention discloses a gesture packaging and task executing method and device, relating to the field of communications, and addresses the complicated operation and poor user experience of task execution in the prior art. The gesture packaging process comprises: collecting a gesture, the gesture being the motion track of a finger or touch device on a touch screen; obtaining one or more selected tasks; and establishing a correspondence between the gesture and the selected tasks. The task executing process comprises: receiving and recognizing an input gesture; looking up the task corresponding to the input gesture according to the correspondence between gestures and tasks; and, when the corresponding task is found, invoking and executing it. The problems of complicated operation and poor user experience are thereby solved.

Description

Gesture packaging and task executing method and device
Technical field
The present invention relates to the communications field, and in particular to a gesture packaging and task executing method and device.
Background technology
As the functions of intelligent terminals become richer, operating them grows increasingly complex, and user operations are far from concise. To execute a single task, a user must descend through multiple menu levels; to execute several different tasks, the user must descend through the menu levels of each menu in turn.
Take a smartphone as an example. To perform several deeply nested operations, the user must enter the multi-level submenus of one menu, select the desired function, then enter the submenus of other menus and select the next function, and so on, before all operations are complete. For example, to enable the WLAN (Wireless Local Area Network) function and the GPS (Global Positioning System) function and then open a microblog application, the user first enters the "Settings" menu, selects the "Connections" submenu, then the "WLAN" submenu, and selects "On" to enable WLAN; next the user returns to the "Settings" menu, selects the "GPS" submenu, and selects "On" to enable GPS; finally the user opens the "Applications" menu and selects "Microblog" to launch the microblog application. Thus, to perform these three operations the user must jump between multiple menus and enter multiple submenus before the functions are all set. The operation is complicated and the user experience is poor.
Summary of the invention
Embodiments of the present invention provide a gesture packaging and task executing method and device, in order to simplify the user's operation of invoking tasks.
A gesture packaging method comprises:
collecting a gesture, the gesture being the motion track of a finger or a touch device on a touch screen;
obtaining one or more selected tasks, and establishing a correspondence between the gesture and the selected tasks;
saving the correspondence between the gesture and the selected tasks.
As can be seen from the above scheme, the embodiment of the present invention collects a gesture, obtains one or more selected tasks, establishes a correspondence between the gesture and the selected tasks, and saves that correspondence, realizing the gesture packaging process. Because one gesture can correspond to multiple tasks and the correspondence can later be invoked, user operations are simplified and the user experience improved.
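The packaging scheme above — one gesture mapped to one or more selected tasks, with the correspondence saved for later invocation — can be sketched as follows. This is a minimal illustration only; the class name `GestureStore` and the task identifiers are assumptions, not part of the patent.

```python
# Minimal sketch of gesture packaging: one gesture is associated with one or
# more selected tasks, and the correspondence is saved so it can be invoked
# later. All names here (GestureStore, the task IDs) are illustrative.

class GestureStore:
    def __init__(self):
        self.correspondence = {}  # gesture id -> list of task ids

    def package(self, gesture_id, task_ids):
        """Establish and save the gesture-task correspondence."""
        if not task_ids:
            raise ValueError("at least one task must be selected")
        self.correspondence[gesture_id] = list(task_ids)

    def lookup(self, gesture_id):
        """Return the tasks packaged under this gesture, or None."""
        return self.correspondence.get(gesture_id)

store = GestureStore()
store.package("W", ["wlan_on", "gps_on", "microblog"])  # one gesture, many tasks
```

Looking up `"W"` then returns the full task list, which is what lets a single gesture replace several menu operations.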
Preferably, after the gesture is collected, its track feature value is extracted; one or more tasks are selected from the task list provided by a task list interface, and a correspondence is established between the track feature value of the gesture and the identifiers of the selected tasks. In this way, a gesture is associated with multiple tasks through the task list provided by the task list interface.
Preferably, after the gesture is collected, an identifier is allocated to it; a gesture deletion instruction carrying the identifier of the gesture to be deleted is obtained; and the corresponding gesture, together with its correspondence with the selected tasks, is deleted according to the instruction. This realizes the gesture deletion process.
A task executing method comprises:
receiving an input gesture, the gesture being the motion track obtained by collecting the movement of a finger or touch device on a touch screen; recognizing the input gesture; looking up the task corresponding to the input gesture according to a pre-stored correspondence between gestures and tasks; and, when the corresponding task is found, invoking and executing it. The pre-stored correspondence is established by associating a collected gesture with one or more selected tasks.
As can be seen from the above scheme, the embodiment of the present invention recognizes the input gesture, looks up the corresponding task according to the correspondence between gestures and tasks, and, when it is found, invokes it. This realizes the process of invoking tasks by gesture, simplifies the user's operation of invoking tasks, and improves the user experience.
Preferably, the track feature value of the input gesture is extracted, and the task corresponding to the extracted track feature value is looked up according to the pre-stored correspondence between gesture track feature values and tasks. In this way, the embodiment of the present invention can recognize the gesture input by the user and find the corresponding task.
A gesture packaging device comprises:
a gesture collecting unit, configured to collect a gesture, the gesture being the motion track of a finger or a touch device on a touch screen;
a gesture packaging unit, configured to obtain one or more selected tasks and establish a correspondence between the gesture and the selected tasks;
a gesture storage unit, configured to save the correspondence between the gesture and the selected tasks.
As can be seen from the above scheme, the embodiment of the present invention collects a gesture, obtains one or more selected tasks, establishes a correspondence between the gesture and the selected tasks, and saves that correspondence, realizing the gesture packaging process. Because one gesture can correspond to multiple tasks and the correspondence can later be invoked, user operations are simplified and the user experience improved.
Preferably, the gesture packaging unit is specifically configured to: extract the track feature value of the gesture after it is collected; select one or more tasks from the task list provided by a task list interface; and establish a correspondence between the track feature value of the gesture and the identifiers of the selected tasks. In this way, a gesture is associated with multiple tasks through the task list provided by the task list interface.
Preferably, the gesture collecting unit is further configured to allocate an identifier to the gesture after collecting it. A gesture deletion unit is configured to obtain a gesture deletion instruction carrying the identifier of the gesture to be deleted, and to delete the corresponding gesture, together with its correspondence with the selected tasks, according to the instruction. This realizes the gesture deletion process.
A task execution device comprises:
a gesture recognition unit, configured to receive an input gesture, the gesture being the motion track obtained by collecting the movement of a finger or touch device on a touch screen, and to recognize the input gesture;
an invoking unit, configured to look up, after the gesture recognition unit has recognized the input gesture, the task corresponding to the input gesture according to the correspondence between gestures and tasks, and, when the corresponding task is found, to invoke and execute it. The pre-stored correspondence between gestures and tasks is established by associating a collected gesture with one or more selected tasks.
As can be seen from the above scheme, the embodiment of the present invention recognizes the input gesture, looks up the corresponding task according to the correspondence between gestures and tasks, and, when it is found, invokes it. This realizes the process of invoking tasks by gesture, simplifies the user's operation of invoking tasks, and improves the user experience.
Preferably, the invoking unit is specifically configured to obtain the input gesture, extract its track feature value, and look up the task corresponding to the extracted track feature value according to the correspondence between gesture track feature values and tasks. In this way, the embodiment of the present invention can recognize the gesture input by the user and find the corresponding task.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a gesture packaging method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the smart gesture interface of the smart gesture plug-in provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of the new gesture interface of the smart gesture plug-in provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the add task interface of the smart gesture plug-in provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the gesture list interface of the smart gesture plug-in provided by an embodiment of the present invention;
Fig. 6 is a schematic flowchart of a task executing method provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of the new gesture success interface of the smart gesture plug-in provided by an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a gesture packaging device provided by an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a task execution device provided by an embodiment of the present invention;
Fig. 10 is a schematic structural diagram of a physical device provided by an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of another physical device provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art, based on the embodiments of the present invention and without creative effort, fall within the protection scope of the present invention.
The embodiments of the present invention are applicable to various terminals equipped with a touch screen, especially intelligent terminals, including but not limited to smartphones.
By establishing a correspondence between gestures and tasks, the embodiments of the present invention can, once a gesture has been recognized, invoke the corresponding tasks directly through that correspondence, thereby simplifying the user's operation steps and replacing many manual operations with a single gesture. Note that a gesture in the embodiments of the present invention is the motion track on the touch screen of a body part such as a finger, or of a touch device such as a stylus.
The embodiments of the present invention are described in detail below with reference to the drawings.
The task executing method provided by the embodiment of the present invention comprises a gesture packaging process and a gesture invoking process. Fig. 1 shows a schematic flowchart of the gesture packaging method, and Fig. 6 shows a schematic flowchart of the gesture invoking process.
Referring to Fig. 1, the gesture packaging process provided by the embodiment of the present invention establishes the correspondence between a gesture and tasks. The process can be realized by the following steps:
Step 11: collect the gesture input on the terminal touch screen, then obtain and save the track feature value of this gesture.
In a specific implementation, when the user inputs a gesture on the terminal's touch screen, the terminal collects the track points of the gesture and saves the track-point information as the gesture's track feature value. The user inputs the gesture on the touch screen with a body part such as a finger, or with a touch device such as a stylus. The gesture is thus a motion track on the touch screen, stored in the terminal in track form.
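The notion of a track feature value in step 11 can be illustrated with a small sketch: the sampled touch points are normalized and resampled to a fixed length, so gestures drawn at different sizes and speeds compare alike. The patent does not specify the actual feature; this normalization is one plausible, hypothetical choice, and `track_feature` is an invented name.

```python
# Hypothetical sketch of a track feature value: the touch points of the
# gesture are scaled into a unit box and resampled to a fixed number of
# points. The real feature used by the terminal is not specified.

def track_feature(points, n=16):
    """Turn a list of (x, y) touch points into a fixed-length feature."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    w = (max(xs) - min(xs)) or 1.0   # avoid division by zero for flat tracks
    h = (max(ys) - min(ys)) or 1.0
    norm = [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]
    # resample by index to exactly n points (crude, but enough for a sketch)
    step = max(len(norm) - 1, 1) / (n - 1)
    return [norm[min(round(i * step), len(norm) - 1)] for i in range(n)]
```

The fixed length makes two recordings of the same shape directly comparable point by point, which the later matching step relies on.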
Step 12: obtain the tasks selected by the user.
In a specific implementation, a task list can be provided through a task list interface. After inputting a gesture on the terminal's touch screen, the user selects tasks from this list; when the user finishes selecting and taps the "Done" or "Confirm" button, the terminal treats this as a confirmation instruction and obtains the information of the selected tasks, for example their IDs (identifiers). The task list enumerates the tasks available for selection and can be generated by calling the application management interface of the terminal's operating system.
Step 13: establish the correspondence between the gesture and the tasks.
In a specific implementation, the terminal maps the gesture collected in step 11 to the user-selected tasks obtained in step 12 by building a correspondence (i.e. a gesture-task mapping table). The correspondence may be one-to-one, with one gesture corresponding to one task, or one-to-many, with one gesture corresponding to multiple tasks.
In a preferred implementation, the process of establishing the gesture-task relation is as follows:
the gesture is collected and its track feature value extracted, then compared with the track feature values of previously collected gestures. If it differs from all of them, an identifier (gesture ID) is allocated to the newly collected gesture; the gesture ID uniquely identifies the gesture. The gesture ID is then mapped to the task IDs selected by the user, and the mapping relation is stored in the correspondence. If the user selected n tasks (n > 1) in step 12 for the current gesture, then n task IDs are added to the entry corresponding to this gesture's ID.
Alternatively, the correspondence can be created directly between the gesture and the user-selected tasks, with the gesture and the task IDs stored in the gesture-task mapping table.
Note that the gestures in the embodiment of the present invention may be preset inside the terminal by the system, and so may the correspondence itself; the user can then modify, delete, or add to this correspondence during operation.
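The preferred implementation of step 13 — compare the new gesture's feature against known ones, allocate a gesture ID if it is new, and map that ID to the selected task IDs — can be sketched as follows. The class name `GestureTable` is illustrative, and plain equality stands in for whatever comparison of track feature values the terminal actually uses.

```python
# Sketch of step 13's preferred implementation: deduplicate gestures by
# their track feature value, allocate sequential gesture IDs, and store the
# gesture-ID -> task-IDs mapping. Names and the equality test are assumed.

import itertools

class GestureTable:
    def __init__(self):
        self._ids = itertools.count(1)
        self.features = {}        # gesture id -> track feature value
        self.correspondence = {}  # gesture id -> list of task ids

    def register(self, feature, task_ids):
        # reuse the ID of an identical gesture, otherwise allocate a new one
        for gid, known in self.features.items():
            if known == feature:
                gesture_id = gid
                break
        else:
            gesture_id = next(self._ids)
            self.features[gesture_id] = feature
        self.correspondence[gesture_id] = list(task_ids)
        return gesture_id
```

Registering a gesture with n selected tasks places all n task IDs in the entry for that gesture ID, matching the one-to-many case described above.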
The gesture packaging process provided by the above embodiment of the present invention is described in detail below for a specific application scenario, with reference to Figs. 2 to 6.
In the embodiment of the present invention, a smart gesture plug-in can be installed in the intelligent terminal in advance; this plug-in realizes the gesture packaging function described above and invokes the corresponding tasks by gesture. Fig. 2 shows the smart gesture interface of the smart gesture plug-in provided by the embodiment of the present invention; it comprises at least an interface name area, a function button region, and a gesture input area, and may also comprise a quick gesture region. Fig. 3 shows the new gesture interface of the plug-in, which comprises at least an interface name area and a gesture input area. Fig. 4 shows the add task interface of the plug-in, which comprises at least an interface name area, a task list area, and a done/cancel key area. Fig. 5 shows the gesture list interface of the plug-in, which comprises at least an interface name area, a function button region, and a gesture selection area. Fig. 6 shows a schematic flowchart of a task executing method provided by the embodiment of the present invention. Fig. 7 shows the new gesture success interface of the plug-in.
When the user wants to create a gesture that invokes the GPRS (General Packet Radio Service), GPS, vibration, auto-rotate screen, auto-brightness, and microblog tasks, the user first starts the smart gesture plug-in by sliding on any operation interface of the terminal, entering the "smart gesture interface" shown in Fig. 2.
The user taps the "New gesture" button in the function button region of the smart gesture interface to enter the "new gesture interface" shown in Fig. 3.
The new gesture interface contains a gesture input area in which the user inputs the gesture. After the user inputs the gesture "W" in the gesture input area, the smart gesture plug-in obtains and records the track feature value (i.e. the track data) of this gesture.
After the user has input the "W"-shaped gesture three times in succession in the gesture input area, the smart gesture plug-in prompts, in the form of a message box, that the gesture has been recorded successfully, and automatically jumps to the add task interface shown in Fig. 4.
The add task interface contains a task list showing the task names available for selection, each accompanied by a single-choice or multiple-choice box. Note that the tasks include both the terminal's system setting tasks and the terminal's application tasks.
Continuing with Fig. 4, the user selects the required system settings in the add task interface, for example the "enable mobile data", "enable GPS", "enable vibration", "enable auto-rotate screen", "enable auto-brightness", and "launch microblog" functions, then taps the "Done" button and enters the name of the gesture "W" in the pop-up "gesture name" dialog. The correspondence between the gesture "W" and the tasks is now established. The tasks in the task list are distinguished by their task IDs. Each time the smart gesture plug-in is opened, the task list can be refreshed, i.e. the tasks installed on the intelligent terminal (such as microblog, WeChat, or QQ) and the terminal's system setting programs (such as connections, sound, or display) are obtained through the system task interface.
Table 1 shows a task list in the smart gesture plug-in. The list contains task IDs and task names; the task names are displayed in the task list of the add task interface.
Table 1
Table 2 shows, for the above application scenario, the mapping table between the "W"-shaped gesture and the tasks.
Table 2
The user taps the "Gesture list" button in the function button region of the smart gesture interface to enter the "gesture list interface" shown in Fig. 5, which displays the saved gestures.
After the gesture packaging process above is complete, the packaged gesture can be called through the gesture invoking process.
Referring to Fig. 6, the task executing process provided by the embodiment of the present invention is realized by the following steps:
Step 61: receive the input gesture, recognize it, and obtain its track data.
In a specific implementation, the user inputs a gesture on the terminal's touch screen; the smart gesture plug-in recognizes the gesture input by the user and obtains its track data.
Step 62: look in the smart gesture plug-in for a stored gesture whose track key points match the track key points of this gesture. If such a gesture is found, perform step 63; if not, an error message can be prompted in the form of a message box. In a specific implementation, a Toast message box can tell the user that the gesture cannot be recognized and ask the user to input the gesture again.
Step 63: look up the task corresponding to the gesture in the correspondence obtained in step 13.
In a specific implementation, the smart gesture plug-in searches the above correspondence for the tasks corresponding to the gesture. Using fuzzy matching, the plug-in finds in its database a gesture whose track key points reach a certain similarity with those of the input, and treats that gesture's track key points as the match. Once the matching gesture is found, the correspondence from step 13 is consulted, and the tasks corresponding to the gesture (for example the corresponding task IDs) are looked up by the gesture's relevant information, such as its track key points or gesture ID.
Step 64: execute the found tasks. That is, if corresponding tasks are found in step 63, invoke and execute them.
Further, if tasks corresponding to the input gesture are found in step 63, a message box can prompt that recognition succeeded and display the names of all started tasks; if a task fails to start, a message box can prompt the name of the failed task, and the remaining tasks continue to start.
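Step 64's behaviour — invoke every task found for the gesture, report any that fails to start, and continue with the rest — can be sketched as follows. The launcher callables are hypothetical stand-ins for starting real system settings or applications.

```python
# Sketch of step 64: invoke each task found for the gesture; if one fails to
# start, record its name (for the failure prompt) and keep going with the
# remaining tasks, as described above. Launchers are illustrative stand-ins.

def execute_tasks(task_ids, launchers):
    """Run each task; return (started, failed) lists of task names."""
    started, failed = [], []
    for task_id in task_ids:
        try:
            launchers[task_id]()    # invoke the found task
            started.append(task_id)
        except Exception:
            failed.append(task_id)  # would be prompted in a message box
    return started, failed
```

A missing or crashing launcher only costs its own task; the others still start, matching the "continue to start other tasks" behaviour.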
To describe the above flow more clearly, taking the "W"-shaped gesture established above as an example, the process of invoking the corresponding tasks through this gesture can be realized by the following steps:
First, the smart gesture plug-in is started by sliding on any operation interface of the terminal.
Then the user enters the smart gesture interface of the plug-in and inputs the "W"-shaped gesture in its gesture input area.
The plug-in finds the stored gesture whose track key points fully match those of the "W"-shaped gesture and prompts the user, in the form of a message box, that gesture recognition succeeded. The plug-in then looks up the tasks corresponding to this gesture in the above correspondence and starts them.
Referring to Fig. 7, the smart gesture plug-in starts the mobile data, GPS, vibration, auto-rotate screen, auto-brightness, and microblog functions, and prompts the user with a message box that they started successfully.
As a preferred embodiment, the user can set a newly created gesture as a shortcut and number it, so that it is displayed in the shortcut bar of the smart gesture interface. When the user taps the shortcut, the system finds the gesture corresponding to the shortcut in the background database, directly invokes the tasks corresponding to that gesture, and performs the corresponding operations. Tapping the shortcut is thus equivalent to the system replaying the gesture stored under it. The benefit is that when a gesture has been forgotten or is recognized inaccurately, the shortcut provides an alternative way to start the tasks, with one hundred percent accuracy.
Based on the same technical concept, an embodiment of the present invention further provides a gesture packaging device applicable to the above flow.
Fig. 8 shows a schematic structural diagram of a gesture packaging device provided by an embodiment of the present invention.
As shown in Fig. 8, the gesture packaging device comprises:
a gesture collecting unit 81, configured to collect a gesture, the gesture being the motion track of a finger or stylus on a touch screen;
a gesture packaging unit 82, configured to obtain one or more selected tasks and establish a correspondence between the gesture collected by the gesture collecting unit 81 and the selected tasks;
a gesture storage unit 83, configured to save the correspondence between the gesture and the selected tasks.
Preferably, the gesture packaging unit 82 is specifically configured to: extract the track feature value of the gesture after it is collected; select one or more tasks from the task list provided by a task list interface; and establish a correspondence between the track feature value of the gesture and the identifiers of the selected tasks.
Preferably, the gesture collecting unit 81 is further configured to allocate an identifier to the gesture. The device also comprises a gesture deletion unit, configured to obtain a gesture deletion instruction carrying the identifier of the gesture to be deleted, and to delete the corresponding gesture, together with its correspondence with the selected tasks, according to the instruction.
Based on identical technical conceive, the embodiment of the present invention additionally provides a kind of task execution device that can be applicable to above-mentioned flow process.
Fig. 9 shows the structural representation of a kind of task execution device that the embodiment of the present invention provides.
As shown in Figure 9, a kind of task execution device, comprising:
Gesture identification unit 91, for receiving the gesture of input, described gesture is finger or touching device motion track on the touchscreen; Identify the gesture of described input;
Call unit 92, for after described gesture identification unit 91 identifies the gesture of input, according to the corresponding relation of gesture and task, search the task corresponding with the gesture of described input, and when finding the task corresponding with the gesture of described input, calling the task of finding and performing.
Preferably, the calling unit 92 is specifically configured to: obtain the input gesture and extract the trajectory feature value of the input gesture; and look up, according to the correspondence between trajectory feature values of gestures and tasks and the extracted trajectory feature value, the task corresponding to the extracted trajectory feature value.
Preferably, the calling unit 92 is further configured to: if no task corresponding to the input gesture is found in the correspondence, prompt a lookup-error message in the form of a message notification box.
Preferably, the calling unit 92 is further configured to: if the found task fails to execute, prompt a call-error message in the form of a message notification box, and call the task whose call failed again.
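The lookup-and-call behavior described above — dispatch on a match, prompt on a miss, and prompt plus re-call on a failed execution — can be sketched as follows. The feature value is taken as an already-extracted opaque key, the task table maps identifiers to callables, and `notify` stands in for the message notification box; all of these are assumptions for illustration:

```python
def execute_for_gesture(feature, correspondence, tasks, notify=print):
    """Look up the task(s) bound to the input gesture's trajectory feature
    value; prompt a lookup error on a miss, and on a failed call prompt a
    call error and call the failed task again (names are assumptions)."""
    task_ids = correspondence.get(feature)
    if task_ids is None:
        notify("lookup error: no task is bound to this gesture")
        return False
    for task_id in task_ids:
        try:
            tasks[task_id]()
        except Exception:
            notify("call error: task %r failed, calling it again" % (task_id,))
            tasks[task_id]()  # re-call the task whose call failed
    return True

# Usage: one bound task that runs successfully.
log = []
ok = execute_for_gesture((0,), {(0,): ["t"]}, {"t": lambda: log.append("ran")})
```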
Based on the same technical concept, an embodiment of the present invention further provides a physical device applicable to the above flow.
Fig. 10 shows a physical device provided by an embodiment of the present invention. As shown in Fig. 10, the physical device comprises:
a touchscreen 101, configured to collect a gesture, where the gesture is the motion trajectory of a finger or a touch device;
a processor 102, configured to obtain one or more selected tasks and establish a correspondence between the gesture and the selected tasks; and
a memory 103, configured to store the correspondence between the gesture and the selected tasks.
Preferably, the processor 102 is specifically configured to: after a gesture is collected, extract the trajectory feature value of the gesture; and select one or more tasks from the task list provided by a task-list interface, and establish a correspondence between the trajectory feature value of the gesture and the identifiers of the selected tasks.
Preferably, the processor 102 is further configured to: assign an identifier to the gesture; receive a gesture deletion instruction, where the gesture deletion instruction includes the identifier of the gesture to be deleted; and delete, according to the identifier of the gesture, the corresponding gesture and the correspondence between that gesture and its tasks.
Based on the same technical concept, an embodiment of the present invention further provides another physical device applicable to the above flow.
Fig. 11 shows another physical device provided by an embodiment of the present invention. As shown in Fig. 11, the physical device comprises:
a touchscreen 111, configured to receive an input gesture, where the gesture is the motion trajectory obtained by collecting the movement of a finger or a touch device;
a memory 112, configured to store the correspondence between gestures and tasks, where the stored correspondence specifically comprises a correspondence established between a collected gesture and one or more selected tasks; and
a processor 113, configured to: recognize the input gesture; look up, according to the correspondence between gestures and tasks stored in the memory 112, the task corresponding to the input gesture; and, when the task corresponding to the input gesture is found, call and execute the found task.
Preferably, the processor 113 is specifically configured to: extract the trajectory feature value of the input gesture; and look up, according to the correspondence between trajectory feature values of gestures and tasks stored in the memory 112 and the extracted trajectory feature value, the task corresponding to the extracted trajectory feature value.
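The two physical devices above share one data flow: the packaging device (Fig. 10) writes gesture-to-task correspondences into memory, and the execution device (Fig. 11) reads them back to match an input gesture. A self-contained end-to-end sketch, in which the deliberately crude feature scheme (sign of net displacement), the class, and all names are assumptions rather than anything the patent specifies:

```python
class GestureTaskStore:
    """Memory holding the gesture->task correspondence (a plain dict here)."""

    def __init__(self):
        self.table = {}

    @staticmethod
    def feature(points):
        # Crude trajectory feature: sign of the net horizontal and
        # vertical displacement between the first and last sample.
        (x0, y0), (x1, y1) = points[0], points[-1]
        sign = lambda v: (v > 0) - (v < 0)
        return (sign(x1 - x0), sign(y1 - y0))

    def bind(self, points, task_ids):   # packaging side (Fig. 10)
        self.table[self.feature(points)] = list(task_ids)

    def lookup(self, points):           # execution side (Fig. 11)
        return self.table.get(self.feature(points))

store = GestureTaskStore()
store.bind([(0, 0), (50, 0)], ["open_browser"])    # bind a rightward swipe
matched = store.lookup([(5, 2), (80, 2)])          # another rightward swipe
```

Because matching goes through the feature value rather than the raw samples, two swipes that differ in position and speed but share the same overall direction resolve to the same task list.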
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device, so that the instructions executed by the processor of the computer or other programmable data processing device produce a means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction means that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, can make other changes and modifications to these embodiments. Therefore, the appended claims are intended to be interpreted as covering the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various changes and variations to the present invention without departing from its spirit and scope. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.

Claims (10)

1. A gesture packaging method, characterized by comprising:
collecting a gesture, where the gesture is the motion trajectory of a finger or a touch device on a touchscreen;
obtaining one or more selected tasks, and establishing a correspondence between the gesture and the selected tasks; and
storing the correspondence between the gesture and the selected tasks.
2. The method of claim 1, characterized in that obtaining one or more selected tasks and establishing the correspondence between the gesture and the selected tasks specifically comprises:
after the gesture is collected, extracting the trajectory feature value of the gesture; and
selecting one or more tasks from the task list provided by a task-list interface, and establishing a correspondence between the trajectory feature value of the gesture and the identifiers of the selected tasks.
3. The method of claim 1, characterized by further comprising, after the gesture is collected, assigning an identifier to the gesture;
the method further comprising:
receiving a gesture deletion instruction, where the gesture deletion instruction includes the identifier of the gesture to be deleted; and
deleting, according to the identifier of the gesture, the corresponding gesture and the correspondence between the corresponding gesture and its tasks.
4. A task execution method, characterized by comprising:
receiving an input gesture, where the gesture is the motion trajectory obtained by collecting the movement of a finger or a touch device on a touchscreen;
recognizing the input gesture; and
looking up, according to a pre-stored correspondence between gestures and tasks, the task corresponding to the input gesture, and, when the task corresponding to the input gesture is found, calling and executing the found task; where the pre-stored correspondence between gestures and tasks specifically comprises a correspondence established between a collected gesture and one or more selected tasks.
5. The method of claim 4, characterized in that looking up, according to the pre-stored correspondence between gestures and tasks, the task corresponding to the input gesture specifically comprises:
obtaining the input gesture, and extracting the trajectory feature value of the input gesture; and
looking up, according to the pre-stored correspondence between trajectory feature values of gestures and tasks and the extracted trajectory feature value, the task corresponding to the extracted trajectory feature value.
6. A gesture packaging device, characterized by comprising:
a gesture collection unit, configured to collect a gesture, where the gesture is the motion trajectory of a finger or a touch device on a touchscreen;
a gesture encapsulation unit, configured to obtain one or more selected tasks and establish a correspondence between the gesture and the selected tasks; and
a gesture storage unit, configured to store the correspondence between the gesture and the selected tasks.
7. The device of claim 6, characterized in that the gesture encapsulation unit is specifically configured to: after a gesture is collected, extract the trajectory feature value of the gesture; and select one or more tasks from the task list provided by a task-list interface, and establish a correspondence between the trajectory feature value of the gesture and the identifiers of the selected tasks.
8. The device of claim 6, characterized in that the gesture collection unit is further configured to assign an identifier to the gesture; the device further comprising:
a gesture deletion unit, configured to obtain a gesture deletion instruction, where the gesture deletion instruction includes the identifier of the gesture to be deleted, and to delete, according to the gesture deletion instruction, the corresponding gesture and the correspondence between that gesture and the selected tasks.
9. A task execution device, characterized by comprising:
a gesture recognition unit, configured to receive an input gesture, where the gesture is the motion trajectory obtained by collecting the movement of a finger or a touch device on a touchscreen, and to recognize the input gesture; and
a calling unit, configured to: after the gesture recognition unit recognizes the input gesture, look up, according to a pre-stored correspondence between gestures and tasks, the task corresponding to the input gesture, and, when the task corresponding to the input gesture is found, call and execute the found task; where the pre-stored correspondence between gestures and tasks specifically comprises a correspondence established between a collected gesture and one or more selected tasks.
10. The device of claim 9, characterized in that the calling unit is specifically configured to: obtain the input gesture and extract the trajectory feature value of the input gesture; and look up, according to the correspondence between trajectory feature values of gestures and tasks and the extracted trajectory feature value, the task corresponding to the extracted trajectory feature value.
CN201310705689.3A 2013-12-19 2013-12-19 Gesture packaging and task executing method and device Pending CN104731439A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310705689.3A CN104731439A (en) 2013-12-19 2013-12-19 Gesture packaging and task executing method and device

Publications (1)

Publication Number Publication Date
CN104731439A true CN104731439A (en) 2015-06-24


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106453823A (en) * 2016-08-31 2017-02-22 腾讯科技(深圳)有限公司 Method and device for sending messages rapidly, and terminal
WO2018023574A1 (en) * 2016-08-04 2018-02-08 薄冰 Adjustment method for using gesture to open file folder, and file system
CN109325904A (en) * 2018-08-28 2019-02-12 百度在线网络技术(北京)有限公司 Image filters treating method and apparatus
CN109558060A (en) * 2018-11-29 2019-04-02 深圳市车联天下信息科技有限公司 Operating method, device and the vehicle-mounted ancillary equipment of vehicle-mounted ancillary equipment
CN111324762A (en) * 2018-12-17 2020-06-23 珠海格力电器股份有限公司 Picture display method and device, storage medium and terminal
CN112416236A (en) * 2020-03-23 2021-02-26 上海幻电信息科技有限公司 Gesture packaging and interaction method and device based on web page and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7532196B2 (en) * 2003-10-30 2009-05-12 Microsoft Corporation Distributed sensing techniques for mobile devices
CN101916166A (en) * 2010-08-19 2010-12-15 中兴通讯股份有限公司 Method for starting application program and mobile terminal
CN102662462A (en) * 2012-03-12 2012-09-12 中兴通讯股份有限公司 Electronic device, gesture recognition method and gesture application method
CN102841682A (en) * 2012-07-12 2012-12-26 宇龙计算机通信科技(深圳)有限公司 Terminal and gesture manipulation method
CN103064620A (en) * 2012-12-24 2013-04-24 华为终端有限公司 Touch screen operation method and touch screen terminal
CN103164064A (en) * 2011-12-15 2013-06-19 英顺源(上海)科技有限公司 System and method enabling corresponding control to be carried out when target gesture is input at any position



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150624