CN103955274A - Application control method and device

Info

Publication number: CN103955274A
Authority: CN (China)
Prior art keywords: gesture, application, preset range, corresponding relation, feature point
Legal status: Granted
Application number: CN201410160815.6A
Other languages: Chinese (zh)
Other versions: CN103955274B (en)
Inventors: 王川 (Wang Chuan), 李创奇 (Li Chuangqi), 刘小鹤 (Liu Xiaohe)
Current Assignee: Beijing Xiaomi Technology Co Ltd; Xiaomi Inc
Original Assignee: Xiaomi Inc
Application filed by Xiaomi Inc
Priority to CN201410160815.6A
Publication of CN103955274A
Application granted; publication of CN103955274B
Legal status: Active

Landscapes: User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an application control method and device, and belongs to the field of terminals. The application control method comprises the steps of: capturing a gesture and a reference object within a first preset range; detecting the application running in the foreground; acquiring, according to the type of the application, an operation instruction corresponding to the gesture and the reference object; and executing the operation instruction during the running of the application. The user only needs to make a gesture near the reference object for the operation instruction corresponding to that gesture to be executed during the running of the application, so the operation is simple, convenient and fast.

Description

Application control method and apparatus
Technical field
The present disclosure relates to the field of terminals, and in particular to an application control method and apparatus.
Background
With the development of smart-TV technology, a plurality of applications can be installed on a smart TV to provide different functions. However, the usage scenarios of smart TVs and users' operating habits mean that a smart TV still has to be operated through a remote control, which offers only a few buttons such as direction keys and a confirmation key. When many applications are installed on the smart TV, the user has to press buttons repeatedly just to start a particular application, and then press buttons repeatedly again to control that application while it runs. Taking a video application as an example, the user must press buttons several times to find the video application, several more times to find the desired video within it, and more again to start playback. Such control is too cumbersome and time-consuming.
Summary of the invention
To solve the problems in the related art, the present disclosure provides an application control method and apparatus. The technical solutions are as follows:
According to a first aspect of the embodiments of the present disclosure, an application control method is provided, the method comprising:
capturing a gesture and a reference object within a first preset range;
detecting the application running in the foreground;
acquiring, according to the type of the application, an operation instruction corresponding to the gesture and the reference object;
executing the operation instruction during the running of the application.
The acquiring, according to the type of the application, an operation instruction corresponding to the gesture and the reference object comprises:
when the application is a main desktop application, acquiring, according to a first preset correspondence, an application identifier corresponding to the gesture and the reference object, the first preset correspondence comprising correspondences between gestures, reference objects and application identifiers.
The executing the operation instruction during the running of the application comprises:
during the running of the main desktop application, starting the application indicated by the application identifier.
The acquiring, according to the type of the application, an operation instruction corresponding to the gesture and the reference object comprises:
when the application is an application other than a main desktop application, acquiring, according to a second preset correspondence, a control operation corresponding to the gesture and the reference object, the second preset correspondence comprising correspondences between gestures, reference objects and control operations of the application.
The executing the operation instruction during the running of the application comprises:
during the running of the application, performing the control operation on the application.
The capturing a gesture and a reference object within a first preset range comprises:
capturing a static gesture within the first preset range;
capturing a human body feature within a second preset range of the static gesture, and determining the reference object according to the human body feature; or,
capturing a dynamic gesture within the first preset range;
capturing a human body feature within a second preset range of the dynamic gesture, and determining the reference object according to the human body feature.
The capturing a static gesture within the first preset range comprises:
capturing at least one gesture feature point within the first preset range;
acquiring position information of the at least one gesture feature point;
generating the static gesture according to the position information of the at least one gesture feature point.
The capturing a dynamic gesture within the first preset range comprises:
capturing at least one gesture feature point within the first preset range;
acquiring position information of the at least one gesture feature point in each sampling period of a motion period, the motion period comprising a plurality of sampling periods;
generating a plurality of static gestures according to the position information of each sampling period, and acquiring a movement locus of the at least one gesture feature point;
generating the dynamic gesture according to at least one of the plurality of static gestures and the movement locus of the at least one gesture feature point.
The capturing a human body feature within the second preset range of the static gesture and determining the reference object according to the human body feature comprises:
capturing a facial feature within the second preset range of the static gesture, and using the facial feature as the reference object;
correspondingly, the capturing a human body feature within the second preset range of the dynamic gesture and determining the reference object according to the human body feature comprises:
capturing a facial feature within the second preset range of the dynamic gesture, and using the facial feature as the reference object.
The acquiring, according to the type of the application, an operation instruction corresponding to the gesture and the reference object comprises:
acquiring the relative position relation between the gesture and the reference object;
when the application is a main desktop application, acquiring, according to the first preset correspondence, an application identifier corresponding to the gesture, the reference object and the relative position relation, the first preset correspondence comprising correspondences between gestures, reference objects, relative position relations and application identifiers.
The acquiring, according to the type of the application, an operation instruction corresponding to the gesture and the reference object comprises:
acquiring the relative position relation between the gesture and the reference object;
when the application is an application other than a main desktop application, acquiring, according to the second preset correspondence, a control operation corresponding to the gesture, the reference object and the relative position relation, the second preset correspondence comprising correspondences between gestures, reference objects, relative position relations and control operations of the application.
The gesture feature point comprises at least one of a finger joint, a finger segment and a palm.
According to a second aspect of the embodiments of the present disclosure, an application control apparatus is provided, the apparatus comprising:
a capture module, configured to capture a gesture and a reference object within a first preset range;
a detection module, configured to detect the application running in the foreground;
an instruction acquisition module, configured to acquire, according to the type of the application, an operation instruction corresponding to the gesture and the reference object;
an instruction execution module, configured to execute the operation instruction during the running of the application.
The instruction acquisition module comprises:
an application identifier acquiring unit, configured to, when the application is a main desktop application, acquire, according to a first preset correspondence, an application identifier corresponding to the gesture and the reference object, the first preset correspondence comprising correspondences between gestures, reference objects and application identifiers.
The instruction execution module is configured to start, during the running of the main desktop application, the application indicated by the application identifier.
The instruction acquisition module comprises:
a control operation acquiring unit, configured to, when the application is an application other than a main desktop application, acquire, according to a second preset correspondence, a control operation corresponding to the gesture and the reference object, the second preset correspondence comprising correspondences between gestures, reference objects and control operations of the application.
The instruction execution module is configured to perform, during the running of the application, the control operation on the application.
The capture module comprises:
a static capturing unit, configured to capture a static gesture within the first preset range;
a first reference object determining unit, configured to capture a human body feature within a second preset range of the static gesture and determine the reference object according to the human body feature; or,
a dynamic capturing unit, configured to capture a dynamic gesture within the first preset range;
a second reference object determining unit, configured to capture a human body feature within a second preset range of the dynamic gesture and determine the reference object according to the human body feature.
The static capturing unit is configured to capture at least one gesture feature point within the first preset range; acquire the position information of the at least one gesture feature point; and generate the static gesture according to the position information of the at least one gesture feature point.
The dynamic capturing unit is configured to capture at least one gesture feature point within the first preset range; acquire the position information of the at least one gesture feature point in each sampling period of a motion period, the motion period comprising a plurality of sampling periods; generate a plurality of static gestures according to the position information of each sampling period and acquire the movement locus of the at least one gesture feature point; and generate the dynamic gesture according to at least one of the plurality of static gestures and the movement locus of the at least one gesture feature point.
The first reference object determining unit is configured to capture a facial feature within the second preset range of the static gesture and use the facial feature as the reference object;
correspondingly, the second reference object determining unit is configured to capture a facial feature within the second preset range of the dynamic gesture and use the facial feature as the reference object.
The instruction acquisition module further comprises:
a first relative position acquiring unit, configured to acquire the relative position relation between the gesture and the reference object;
an identifier acquiring unit, configured to, when the application is a main desktop application, acquire, according to the first preset correspondence, an application identifier corresponding to the gesture, the reference object and the relative position relation, the first preset correspondence comprising correspondences between gestures, reference objects, relative position relations and application identifiers.
The instruction acquisition module further comprises:
a second relative position acquiring unit, configured to acquire the relative position relation between the gesture and the reference object;
an operation acquiring unit, configured to, when the application is an application other than a main desktop application, acquire, according to the second preset correspondence, a control operation corresponding to the gesture, the reference object and the relative position relation, the second preset correspondence comprising correspondences between gestures, reference objects, relative position relations and control operations of the application.
The gesture feature point comprises at least one of a finger joint, a finger segment and a palm.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
The method and apparatus provided by this embodiment capture a gesture and a reference object within a first preset range of the terminal; detect the application running in the foreground; acquire, according to the type of the application, the operation instruction corresponding to the gesture and the reference object; and execute the operation instruction during the running of the application. The user only needs to make a gesture near the reference object for the operation instruction corresponding to that gesture to be executed during the running of the application, so the operation is simple, convenient and fast.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the specification, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an application control method according to an exemplary embodiment;
Fig. 2 is a flowchart of an application control method according to an exemplary embodiment;
Fig. 3a is a schematic diagram of a static gesture according to an exemplary embodiment;
Fig. 3b is a schematic diagram of a playing gesture and a reference object according to an exemplary embodiment;
Fig. 3c is a schematic diagram of a telescope gesture and a reference object according to an exemplary embodiment;
Fig. 4 is a flowchart of an application control method according to an exemplary embodiment;
Fig. 5a is a schematic diagram of a waving gesture and a reference object according to an exemplary embodiment;
Fig. 5b is a schematic diagram of a palm-spread gesture and a reference object according to an exemplary embodiment;
Fig. 5c is a schematic diagram of an index-finger-up gesture and a reference object according to an exemplary embodiment;
Fig. 5d is a schematic diagram of a closing gesture and a reference object according to an exemplary embodiment;
Fig. 5e is a schematic diagram of a heart-shaped gesture and a reference object according to an exemplary embodiment;
Fig. 5f is a schematic diagram of an index-finger-up gesture and a reference object according to an exemplary embodiment;
Fig. 6 is a schematic diagram of an application control apparatus according to an exemplary embodiment;
Fig. 7 is a block diagram of a device according to an exemplary embodiment.
Detailed description
To make the objectives, technical solutions and advantages of the present disclosure clearer, the present disclosure is described in further detail below with reference to the embodiments and the accompanying drawings. The exemplary embodiments of the present disclosure and their descriptions are used here to explain the present disclosure, not to limit it.
The embodiments of the present disclosure provide an application control method and apparatus, which are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an application control method according to an exemplary embodiment. As shown in Fig. 1, the application control method is used in a terminal and comprises the following steps:
In step 101, a gesture and a reference object within a first preset range are captured.
In step 102, the application running in the foreground is detected.
In step 103, an operation instruction corresponding to the gesture and the reference object is acquired according to the type of the application.
In step 104, the operation instruction is executed during the running of the application.
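As a concrete illustration of these four steps, the following sketch wires them into a single control loop. It is a hypothetical skeleton under assumed type and method names (GestureCapture, CorrespondenceTable, OperationInstruction and so on); the patent does not prescribe any API.

```java
// Hypothetical skeleton of steps 101-104; all names are assumptions.
interface GestureCapture {
    Gesture captureGesture();                    // step 101: first preset range
    ReferenceObject captureReference(Gesture g); // step 101: second preset range
}

record Gesture(String name) {}
record ReferenceObject(String name) {}

interface OperationInstruction { void execute(); }

interface CorrespondenceTable {
    // Resolves an instruction from the foreground app type plus the captured pair.
    OperationInstruction lookup(String foregroundApp, Gesture g, ReferenceObject r);
}

final class ApplicationController {
    private final GestureCapture capture;
    private final CorrespondenceTable table;

    ApplicationController(GestureCapture capture, CorrespondenceTable table) {
        this.capture = capture;
        this.table = table;
    }

    void onFrame() {
        Gesture gesture = capture.captureGesture();
        if (gesture == null) return;                          // nothing in range
        ReferenceObject reference = capture.captureReference(gesture);
        if (reference == null) return;
        String foregroundApp = detectForegroundApplication();    // step 102
        OperationInstruction instruction =
                table.lookup(foregroundApp, gesture, reference); // step 103
        if (instruction != null) instruction.execute();          // step 104
    }

    private String detectForegroundApplication() {
        return "com.example.home"; // platform-specific; stubbed here
    }
}
```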
The method provided by this embodiment captures a gesture and a reference object within a first preset range; detects the application running in the foreground; acquires, according to the type of the application, the operation instruction corresponding to the gesture and the reference object; and executes the operation instruction during the running of the application. The user only needs to make a gesture near the reference object for the operation instruction corresponding to that gesture to be executed during the running of the application, so the operation is simple, convenient and fast.
The acquiring, according to the type of the application, the operation instruction corresponding to the gesture and the reference object comprises:
when the application is a main desktop application, acquiring, according to a first preset correspondence, the application identifier corresponding to the gesture and the reference object, the first preset correspondence comprising correspondences between gestures, reference objects and application identifiers.
The executing the operation instruction during the running of the application comprises:
during the running of the main desktop application, starting the application indicated by the application identifier.
The acquiring, according to the type of the application, the operation instruction corresponding to the gesture and the reference object comprises:
when the application is an application other than a main desktop application, acquiring, according to a second preset correspondence, the control operation corresponding to the gesture and the reference object, the second preset correspondence comprising correspondences between gestures, reference objects and control operations of the application.
The executing the operation instruction during the running of the application comprises:
during the running of the application, performing the control operation on the application.
The capturing a gesture and a reference object within the first preset range comprises:
capturing a static gesture within the first preset range;
capturing a human body feature within a second preset range of the static gesture, and determining the reference object according to the human body feature; or,
capturing a dynamic gesture within the first preset range;
capturing a human body feature within a second preset range of the dynamic gesture, and determining the reference object according to the human body feature.
The capturing a static gesture within the first preset range comprises:
capturing at least one gesture feature point within the first preset range;
acquiring position information of the at least one gesture feature point;
generating the static gesture according to the position information of the at least one gesture feature point.
The capturing a dynamic gesture within the first preset range comprises:
capturing at least one gesture feature point within the first preset range;
acquiring position information of the at least one gesture feature point in each sampling period of a motion period, the motion period comprising a plurality of sampling periods;
generating a plurality of static gestures according to the position information of each sampling period, and acquiring the movement locus of the at least one gesture feature point;
generating the dynamic gesture according to at least one of the plurality of static gestures and the movement locus of the at least one gesture feature point.
The capturing a human body feature within the second preset range of the static gesture and determining the reference object according to the human body feature comprises:
capturing a facial feature within the second preset range of the static gesture, and using the facial feature as the reference object;
correspondingly, the capturing a human body feature within the second preset range of the dynamic gesture and determining the reference object according to the human body feature comprises:
capturing a facial feature within the second preset range of the dynamic gesture, and using the facial feature as the reference object.
The acquiring, according to the type of the application, the operation instruction corresponding to the gesture and the reference object comprises:
acquiring the relative position relation between the gesture and the reference object;
when the application is a main desktop application, acquiring, according to the first preset correspondence, the application identifier corresponding to the gesture, the reference object and the relative position relation, the first preset correspondence comprising correspondences between gestures, reference objects, relative position relations and application identifiers.
The acquiring, according to the type of the application, the operation instruction corresponding to the gesture and the reference object comprises:
acquiring the relative position relation between the gesture and the reference object;
when the application is an application other than a main desktop application, acquiring, according to the second preset correspondence, the control operation corresponding to the gesture, the reference object and the relative position relation, the second preset correspondence comprising correspondences between gestures, reference objects, relative position relations and control operations of the application.
The gesture feature point comprises at least one of a finger joint, a finger segment and a palm.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present disclosure, which will not be described one by one here.
In one embodiment, Fig. 2 is an exemplary flowchart of the application control method, executed by a terminal. Referring to Fig. 2, the method comprises the following steps:
In step 201, the terminal captures a gesture within a first preset range of the terminal.
The terminal may be a device such as a TV, a computer or a mobile phone, and is equipped with a motion-sensing capture device through which gestures made by the user can be captured. The motion-sensing capture device may be a motion-sensing camera, a data glove or the like, and the first preset range is determined by the capture range of the motion-sensing capture device; none of this is limited in this embodiment. Taking a TV as an example, a motion-sensing camera is arranged on and electrically connected to the TV; the first preset range can be determined from the shooting range of the camera, and the camera can capture gestures the user makes within that range.
A human hand can be divided into the palm, thumb, index finger, middle finger, ring finger and little finger, and each finger is composed of segments and joints; as the finger joints move, different gestures are formed. In this embodiment, in order to capture a gesture within the first preset range, the terminal monitors the first preset range; when it detects that an object has appeared in the range, it identifies the object and judges whether it is a human hand, and when the object is determined to be a human hand, the terminal acquires the posture of the object, which is the gesture. Thus, when the user wishes to start an application, the user stretches a hand into the first preset range of the terminal and makes, within that range, the gesture corresponding to the application identifier.
Gestures made by the user can be divided into static gestures and dynamic gestures. Correspondingly, step 201 ("the terminal captures a gesture within the first preset range") may comprise the following step 201a or 201b:
201a. The terminal captures a static gesture within the first preset range.
A static gesture is a gesture the user makes and holds still. When the user puts a hand into the first preset range of the terminal and makes a gesture while keeping the hand motionless, the terminal can capture a static gesture.
In this embodiment, the terminal may use at least one of the finger joints, finger segments and palm of the hand as gesture feature points. The terminal detects whether any gesture feature point appears within the preset range; when the terminal captures a gesture feature point within the first preset range, it can capture the gesture according to that feature point. Step 201a may comprise the following steps 201a-1 to 201a-3:
201a-1. The terminal captures at least one gesture feature point within the first preset range.
The terminal may extract the features of each gesture feature point in advance. When an object is detected within the first preset range, the terminal extracts the features of the object and judges whether they match the features of any gesture feature point; when the features of the object match those of a gesture feature point, the terminal can determine that it has captured that feature point. For example, when the terminal determines that the features of the object match those of a finger joint, it determines that a finger joint has been captured.
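One way to realize this matching step, sketched under the assumption that the extracted features are fixed-length descriptor vectors (a representation the patent does not specify), is a nearest-template comparison against per-feature-point templates:

```java
// Sketch of the matching in step 201a-1; the descriptor representation and
// the cosine-similarity criterion are assumptions for illustration.
import java.util.Map;

final class FeaturePointMatcher {
    private final Map<String, float[]> templates; // e.g. "finger-joint" -> descriptor
    private final float threshold;                // minimum similarity to accept

    FeaturePointMatcher(Map<String, float[]> templates, float threshold) {
        this.templates = templates;
        this.threshold = threshold;
    }

    /** Returns the matched feature-point type, or null if nothing matches. */
    String match(float[] observed) {
        String best = null;
        float bestScore = threshold;
        for (Map.Entry<String, float[]> e : templates.entrySet()) {
            float score = cosine(observed, e.getValue());
            if (score > bestScore) {
                bestScore = score;
                best = e.getKey();
            }
        }
        return best;
    }

    private static float cosine(float[] a, float[] b) {
        float dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return (float) (dot / (Math.sqrt(na) * Math.sqrt(nb) + 1e-9));
    }
}
```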
201a-2. The terminal acquires the position information of the at least one gesture feature point.
The terminal may set up a three-dimensional coordinate system. When the terminal captures the at least one gesture feature point, it determines the position of the feature point and represents it as coordinate values in the three-dimensional coordinate system, thereby acquiring the position information of the at least one gesture feature point.
201a-3. The terminal generates the static gesture according to the position information of the at least one gesture feature point.
The terminal may perform curve fitting on the position information of the at least one gesture feature point to obtain the static gesture. Taking finger joints as an example, the terminal captures all the finger joints within the first preset range, acquires the position information of each finger joint, performs curve fitting on those positions, and generates the static gesture.
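The patent does not fix a fitting scheme, so the sketch below stands in for step 201a-3 with simple linear interpolation between consecutive feature-point positions; a production system might use splines instead.

```java
// Minimal stand-in for the curve fitting of step 201a-3: captured joint
// positions are connected into a dense polyline approximating the gesture.
import java.util.ArrayList;
import java.util.List;

final class StaticGestureBuilder {

    record Point3(double x, double y, double z) {}

    static List<Point3> fitCurve(List<Point3> featurePoints) {
        List<Point3> curve = new ArrayList<>();
        if (featurePoints.isEmpty()) return curve;
        for (int i = 0; i + 1 < featurePoints.size(); i++) {
            Point3 a = featurePoints.get(i);
            Point3 b = featurePoints.get(i + 1);
            for (int s = 0; s < 10; s++) {   // 10 interpolated samples per segment
                double t = s / 10.0;
                curve.add(new Point3(
                        a.x() + t * (b.x() - a.x()),
                        a.y() + t * (b.y() - a.y()),
                        a.z() + t * (b.z() - a.z())));
            }
        }
        curve.add(featurePoints.get(featurePoints.size() - 1));
        return curve;
    }
}
```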
201b. The terminal captures a dynamic gesture within the first preset range.
A dynamic gesture is a gesture the user makes while the hand is in motion. When the user puts a hand into the first preset range of the terminal and makes a gesture while moving, the terminal can capture a dynamic gesture.
Based on the gesture feature points described above, step 201b may comprise the following steps 201b-1 to 201b-4:
201b-1. The terminal captures at least one gesture feature point within the first preset range.
201b-2. The terminal acquires the position information of the at least one gesture feature point in each sampling period of a motion period, the motion period comprising a plurality of sampling periods.
The terminal may preset the duration of the motion period; the motion period comprises a plurality of sampling periods, a sampling period being the sampling interval of the motion-sensing capture device configured on the terminal. When the terminal captures the at least one gesture feature point, the motion period starts and the terminal acquires the feature point's current position information; thereafter, the terminal acquires the position information once per sampling period, so that when the motion period ends, the terminal has obtained a plurality of position records for the at least one gesture feature point.
For example, suppose the terminal sets the duration of the motion period to 1 s and the duration of the sampling period to 0.1 s, and captures the at least one gesture feature point at 0 s, acquiring its current position; it then acquires the position every 0.1 s until 1 s is reached, obtaining 11 position records for the at least one gesture feature point.
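The arithmetic of this example, under the stated 1 s motion period and 0.1 s sampling period:

```java
// A 1 s motion period sampled every 0.1 s yields 11 positions
// (t = 0.0 s through 1.0 s inclusive).
final class SamplingExample {
    public static void main(String[] args) {
        double motionPeriod = 1.0;   // seconds
        double samplingPeriod = 0.1; // seconds
        int samples = (int) Math.round(motionPeriod / samplingPeriod) + 1;
        System.out.println(samples); // prints 11
    }
}
```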
201b-3. The terminal generates a plurality of static gestures according to the position information of each sampling period, and acquires the movement locus of the at least one gesture feature point.
For each sampling period, the terminal performs curve fitting on the position information of the at least one gesture feature point to obtain one static gesture. For each gesture feature point, the terminal performs curve fitting on that point's position information across the sampling periods to obtain the movement locus of that feature point.
201b-4. The terminal generates the dynamic gesture according to at least one of the plurality of static gestures and the movement locus of the at least one gesture feature point.
In this embodiment, the terminal may simulate, from the plurality of static gestures, the motion the hand performs when switching from one static gesture to the next, thereby obtaining the dynamic gesture; or it may simulate the movement locus of the hand from the movement locus of the at least one gesture feature point to generate the dynamic gesture; or it may generate the dynamic gesture from both the plurality of static gestures and the movement locus of the at least one gesture feature point.
Referring to Fig. 3a, the terminal obtains a plurality of static gestures and simulates the motion the hand performs when switching from one static gesture to the next, obtaining a palm-swinging gesture.
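A minimal data-structure sketch for the result of step 201b-4, assuming (the patent does not specify a representation) that a dynamic gesture keeps its per-period static gestures as keyframes and its per-feature-point positions as trajectories:

```java
// Hypothetical container for a dynamic gesture; names are assumptions.
import java.util.List;
import java.util.Map;

final class DynamicGesture {
    final List<StaticGesture> keyframes;            // one per sampling period
    final Map<String, List<double[]>> trajectories; // feature point -> positions

    DynamicGesture(List<StaticGesture> keyframes,
                   Map<String, List<double[]>> trajectories) {
        this.keyframes = keyframes;
        this.trajectories = trajectories;
    }

    /** True if the gesture can be generated from at least one of the two sources. */
    boolean isWellFormed() {
        return (keyframes != null && keyframes.size() > 1)
                || (trajectories != null && !trajectories.isEmpty());
    }
}

final class StaticGesture { /* fitted curve, see step 201a-3 */ }
```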
In step 202, the terminal captures a facial feature within a second preset range of the gesture, and uses the facial feature as the reference object.
In this embodiment, when the terminal captures the gesture, it detects facial features within the second preset range of the gesture: when an object is detected within the second preset range of the gesture, the terminal identifies the object and judges whether it is a facial feature; when the object is determined to be a facial feature, the terminal uses the captured facial feature as the reference object. The second preset range is defined relative to the gesture and may be determined by the technician during development, which is not limited in this embodiment.
The terminal may capture any human body feature within the second preset range of the gesture and determine the reference object from it, such as the eyes, ears, mouth, nose and eyebrows, or the limbs and trunk; this embodiment is described only with the example of the terminal capturing facial features.
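One plausible reading of the second preset range, sketched here as a Euclidean-distance threshold around the gesture (an assumption; the patent leaves the range to the developer):

```java
// Sketch of step 202: a body feature counts as the reference object only if
// it lies within the second preset range of the gesture, modeled here as a
// distance threshold.
final class ReferenceResolver {

    static String resolve(double[] gestureCenter,
                          java.util.Map<String, double[]> bodyFeatures,
                          double secondPresetRange) {
        String nearest = null;
        double best = secondPresetRange;
        for (var e : bodyFeatures.entrySet()) {   // e.g. "eyes", "ear", "nose"
            double d = distance(gestureCenter, e.getValue());
            if (d <= best) {
                best = d;
                nearest = e.getKey();
            }
        }
        return nearest; // null if no feature is in range
    }

    private static double distance(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }
}
```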
In step 203, the terminal detects the application running in the foreground and judges whether it is a main desktop application.
In this embodiment, the application running in the foreground may be any application installed on the terminal, either a main desktop application or another application. The terminal can detect the application running in the foreground and judge whether it is a main desktop application: when the application is a main desktop application, the terminal determines that the gesture and the reference object are used to start the application corresponding to them; when the application is an application other than the main desktop application, the terminal determines that the gesture and the reference object are used to make that application perform the corresponding control operation.
In step 204, when the application is a main desktop application, the terminal acquires, according to a first preset correspondence, the application identifier corresponding to the gesture and the reference object.
This embodiment is described only with the example that the foreground application is a main desktop application. The first preset correspondence comprises correspondences between gestures, reference objects and application identifiers; an application identifier may be an application name, an application number and so on, which is not limited in this embodiment.
In this embodiment, the gesture may be above, below, to the left of, to the right of, in front of or behind the reference object, and different relative positions between the gesture and the reference object may correspond to different application identifiers. To this end, the first preset correspondence may comprise correspondences between gestures, reference objects, relative position relations and application identifiers; accordingly, when the terminal captures the gesture and the reference object, it acquires the relative position relation between them and, according to the first preset correspondence, acquires the application identifier corresponding to the gesture, the reference object and the relative position relation, so as to start the application indicated by that identifier.
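A minimal sketch of such a first preset correspondence, keyed on the gesture, the reference object and the relative position relation; all keys and application identifiers are invented examples, not values from the patent:

```java
// Sketch of the first preset correspondence of step 204.
import java.util.HashMap;
import java.util.Map;

final class FirstPresetCorrespondence {
    private final Map<String, String> table = new HashMap<>();

    private static String key(String gesture, String reference, String relPos) {
        return gesture + "|" + reference + "|" + relPos;
    }

    void put(String gesture, String reference, String relPos, String appId) {
        table.put(key(gesture, reference, relPos), appId);
    }

    /** Returns the application identifier, or null if no entry matches. */
    String lookup(String gesture, String reference, String relPos) {
        return table.get(key(gesture, reference, relPos));
    }
}

// Usage (entries mirror examples (2-1) and (2-2) below):
// FirstPresetCorrespondence c = new FirstPresetCorrespondence();
// c.put("playing", "trunk", "front", "com.example.music");
// c.put("telescope", "eyes", "front", "com.example.query");
// String appId = c.lookup("telescope", "eyes", "front");
```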
In addition, according to the first preset correspondence, the terminal may show the user in advance images of the gesture and reference object corresponding to each application identifier, so that the user knows which gesture to make, and near which reference object, to start each application.
In step 205, the terminal starts the application indicated by the application identifier. Each application identifier indicates one application; when the terminal obtains the application identifier, it can start the application the identifier indicates.
In this embodiment, a plurality of applications are installed on the terminal, and gestures and reference objects are determined for a plurality of application identifiers. When the terminal captures any such gesture and a reference object within the second preset range of that gesture, it can acquire, according to the first preset correspondence, the application identifier corresponding to the gesture and the reference object, and thereby start the indicated application, which is simple, convenient and fast.
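On an Android-based terminal (an assumption; the patent names no platform), step 205 could resolve an application identifier that is a package name into a launch intent:

```java
// Sketch of step 205 assuming the application identifier is an Android
// package name.
import android.content.Context;
import android.content.Intent;

final class AppLauncher {
    static void start(Context context, String packageName) {
        Intent intent = context.getPackageManager()
                               .getLaunchIntentForPackage(packageName);
        if (intent != null) {
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent);
        }
    }
}
```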
For instance, the method comprises the following step (2-1) or (2-2):
(2-1) When the terminal captures a playing gesture and the reference object is the trunk, it starts a music application.
Referring to Fig. 3b, when the terminal detects the five fingers bending and moving back and forth, it determines that a playing gesture has been captured; when the human body feature within the second preset range of the playing gesture is the trunk, it determines that the reference object is the trunk and starts the music application.
(2-2) When the terminal captures a telescope gesture and the reference object is the eyes, it starts a query application.
The query application may be an application for querying local files, an application for querying network files, and so on, which is not limited in this embodiment. Referring to Fig. 3c, when the terminal detects that the thumb and the other four fingers of each hand form a circle, with the two eyes facing the terminal through the two rings thus formed, it determines that the telescope gesture has been captured and that the reference object is the eyes, and starts the query application.
It should be noted that the above are only a few examples of gestures and reference objects for starting applications, and do not constitute a limitation on gestures or reference objects.
The method provided by this embodiment captures a gesture and a reference object within the first preset range of the terminal; detects the application running in the foreground; acquires, according to the type of the application, the operation instruction corresponding to the gesture and the reference object; and executes the operation instruction during the running of the application. The user only needs to make a gesture near a reference object to start the application corresponding to that gesture from among a plurality of applications, which is simple, convenient and fast. Further, the terminal captures both static and dynamic gestures within the preset range, which improves flexibility.
Fig. 4 is a flowchart of an application control method according to an exemplary embodiment. As shown in Fig. 4, the application control method is used in a terminal and comprises the following steps:
In step 401, the terminal captures a gesture within the first preset range of the terminal.
Step 401 is similar to step 201 and is not repeated here.
In step 402, the terminal captures a facial feature within a second preset range of the gesture, and uses the facial feature as the reference object.
In step 403, the terminal detects the application running in the foreground and judges whether it is a main desktop application.
In step 404, when the application is an application other than a main desktop application, the terminal acquires, according to a second preset correspondence, the control operation corresponding to the gesture and the reference object.
This embodiment is described with the example that the foreground application is not a main desktop application. The second preset correspondence comprises correspondences between gestures, reference objects and control operations of the application; a control operation of the application may be an operation of clicking any button in the application, an operation of closing the application, and so on, which is not limited in this embodiment.
In this embodiment, the same gesture and reference object may correspond to different control operations in different applications. For ease of distinction, the terminal may determine a second preset correspondence for each application separately; for each application, the terminal obtains the second preset correspondence of that application, so that when the gesture and the reference object are captured, the corresponding control operation is acquired according to the second preset correspondence of that application.
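A sketch of such per-application second preset correspondences, with one table per application so that the same gesture/reference pair can map to different control operations; all identifiers are invented for illustration:

```java
// Sketch of the second preset correspondence of step 404.
import java.util.HashMap;
import java.util.Map;

final class SecondPresetCorrespondence {
    // appId -> ("gesture|reference" -> control operation)
    private final Map<String, Map<String, String>> perApp = new HashMap<>();

    void put(String appId, String gesture, String reference, String operation) {
        perApp.computeIfAbsent(appId, k -> new HashMap<>())
              .put(gesture + "|" + reference, operation);
    }

    /** Returns the control operation, or null if no entry matches. */
    String lookup(String appId, String gesture, String reference) {
        Map<String, String> table = perApp.get(appId);
        return table == null ? null : table.get(gesture + "|" + reference);
    }
}

// Usage (mirrors example (4-2) below):
// SecondPresetCorrespondence c = new SecondPresetCorrespondence();
// c.put("com.example.video", "palm-spread", "ear", "VOLUME_UP");
// c.put("com.example.music", "palm-spread", "ear", "VOLUME_UP");
```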
In this embodiment, the gesture may be above, below, to the left of, to the right of, in front of or behind the reference object, and different relative positions between the gesture and the reference object may correspond to different control operations. To this end, the second preset correspondence may comprise correspondences between gestures, reference objects, relative position relations and control operations of the application; accordingly, when the terminal captures the gesture and the reference object, it acquires the relative position relation between them and, according to the second preset correspondence, acquires the control operation corresponding to the gesture, the reference object and the relative position relation, so as to perform that control operation on the application.
In addition, according to the second preset correspondence, the terminal may show the user in advance images of the gesture and reference object corresponding to each control operation, so that the user knows which gesture to make near which reference object to control the application.
In step 405, the terminal performs the control operation on the application.
In this embodiment, when the terminal captures any gesture during the running of the application and captures a facial feature within the second preset range of the gesture, it can acquire, according to the second preset correspondence, the control operation corresponding to the gesture and the reference object, and then perform that control operation on the application, which is simple, convenient and fast.
For instance, the method comprises any one of the following steps (4-1) to (4-6):
(4-1) When the terminal captures a waving gesture and the reference object is the arm, it activates the currently selected function.
During the running of the application, the terminal may provide several functions, and the user may activate or close any function in the application's function setting menu.
Referring to Fig. 5a, when the terminal detects a palm facing the terminal and swinging, it confirms that a waving gesture has been captured; when the only human body feature within the second preset range of the waving gesture is the arm, it determines that the reference object is the arm, determines the currently selected function, and activates it.
(4-2) When the terminal captures a palm-spread gesture and the reference object is the ear, it raises the volume.
Referring to Fig. 5b, when the terminal detects that the palm is spread with the five fingers together, it determines that the palm-spread gesture has been captured; when the human body feature within the second preset range of the palm-spread gesture is the ear, it determines that the reference object is the ear and raises the volume.
(4-3) When the terminal captures an index-finger-up gesture and the reference object is the mouth, it lowers the volume.
Referring to Fig. 5c, when the terminal detects that the index finger is extended vertically upward while the other four fingers are clenched, it determines that the index-finger-up gesture has been captured; when the human body feature within the second preset range of the index-finger-up gesture is the mouth, it determines that the reference object is the mouth and lowers the volume.
(4-4) When the terminal captures a closing gesture and the reference object is the trunk, it closes the application.
Referring to Fig. 5d, when the terminal detects the two hands crossed in front of the chest, it determines that the closing gesture has been captured and the reference object is the trunk, and closes the application.
(4-5) When the terminal captures a heart-shaped gesture and the reference object is the head, it performs a like (upvote) operation on the currently displayed information.
During the running of the application, the terminal may display information published on an information display platform, which the user can comment on, for example by liking (upvoting) or disliking (downvoting) it. When the terminal captures the heart-shaped gesture, it performs a like operation on the currently displayed information.
Referring to Fig. 5e, when the terminal detects that the fingers of both hands join together, it determines that the heart-shaped gesture has been captured; when the human body feature within the second preset range of the heart-shaped gesture is the head, it determines that the reference object is the head and performs a like operation on the currently displayed information.
(4-6) When the terminal captures an index-finger-up gesture and the reference object is the nose, it performs a dislike (downvote) operation on the currently displayed information.
Referring to Fig. 5f, when the terminal detects that the index finger is extended vertically upward while the other fingers are clenched, it confirms that the index-finger-up gesture has been captured; when the human body feature within the second preset range of the gesture is the nose, it determines that the reference object is the nose and performs a dislike operation on the currently displayed information.
It should be noted that the above are only a few examples of gestures and reference objects for controlling the application, and do not constitute a limitation on gestures or reference objects.
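To make the dispatch concrete, the sketch below executes a few of the looked-up operations on an Android-style terminal; the platform, the operation names and the stubbed close operation are assumptions, not part of the patent:

```java
// Sketch of step 405 dispatching control operations such as (4-2)-(4-4).
import android.content.Context;
import android.media.AudioManager;

final class ControlOperationExecutor {
    private final AudioManager audio;

    ControlOperationExecutor(Context context) {
        this.audio = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    }

    void execute(String operation) {
        switch (operation) {
            case "VOLUME_UP":   // (4-2) palm-spread gesture near the ear
                audio.adjustStreamVolume(AudioManager.STREAM_MUSIC,
                        AudioManager.ADJUST_RAISE, 0);
                break;
            case "VOLUME_DOWN": // (4-3) index-finger-up gesture near the mouth
                audio.adjustStreamVolume(AudioManager.STREAM_MUSIC,
                        AudioManager.ADJUST_LOWER, 0);
                break;
            case "CLOSE_APP":   // (4-4) closing gesture in front of the trunk
                closeForegroundApplication();
                break;
            default:
                // like/dislike operations (4-5), (4-6) are forwarded to the
                // information display platform by the hosting application
                break;
        }
    }

    private void closeForegroundApplication() {
        // Platform- and application-specific; stubbed in this sketch.
    }
}
```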
The method provided by this embodiment captures a gesture and a reference object within the first preset range of the terminal; detects the application running in the foreground; acquires, according to the type of the application, the operation instruction corresponding to the gesture and the reference object; and executes the operation instruction during the running of the application. The user only needs to make a gesture near a reference object for the corresponding control operation to be performed on the application, which is simple, convenient and fast.
Fig. 6 is a schematic diagram of an application control apparatus according to an exemplary embodiment. Referring to Fig. 6, the apparatus comprises a capture module 601, a detection module 602, an instruction acquisition module 603 and an instruction execution module 604.
The capture module 601 is configured to capture a gesture and a reference object within a first preset range;
the detection module 602 is configured to detect the application running in the foreground;
the instruction acquisition module 603 is configured to acquire, according to the type of the application, an operation instruction corresponding to the gesture and the reference object;
the instruction execution module 604 is configured to execute the operation instruction during the running of the application.
The apparatus provided by this embodiment captures a gesture and a reference object within the first preset range of the terminal; detects the application running in the foreground; acquires, according to the type of the application, the operation instruction corresponding to the gesture and the reference object; and executes the operation instruction during the running of the application. The user only needs to make a gesture near the reference object for the operation instruction corresponding to that gesture to be executed during the running of the application, which is simple, convenient and fast.
The instruction acquisition module 603 comprises:
an application identifier acquiring unit, configured to, when the application is a main desktop application, acquire, according to a first preset correspondence, the application identifier corresponding to the gesture and the reference object, the first preset correspondence comprising correspondences between gestures, reference objects and application identifiers.
The instruction execution module 604 is configured to start, during the running of the main desktop application, the application indicated by the application identifier.
The instruction acquisition module 603 comprises:
a control operation acquiring unit, configured to, when the application is an application other than a main desktop application, acquire, according to a second preset correspondence, the control operation corresponding to the gesture and the reference object, the second preset correspondence comprising correspondences between gestures, reference objects and control operations of the application.
The instruction execution module 604 is configured to perform, during the running of the application, the control operation on the application.
The capture module 601 comprises:
a static capturing unit, configured to capture a static gesture within the first preset range;
a first reference object determining unit, configured to capture a human body feature within a second preset range of the static gesture and determine the reference object according to the human body feature; or,
a dynamic capturing unit, configured to capture a dynamic gesture within the first preset range;
a second reference object determining unit, configured to capture a human body feature within a second preset range of the dynamic gesture and determine the reference object according to the human body feature.
The static capturing unit is configured to capture at least one gesture feature point within the first preset range; acquire the position information of the at least one gesture feature point; and generate the static gesture according to the position information of the at least one gesture feature point.
The dynamic capturing unit is configured to capture at least one gesture feature point within the first preset range; acquire the position information of the at least one gesture feature point in each sampling period of a motion period, the motion period comprising a plurality of sampling periods; generate a plurality of static gestures according to the position information of each sampling period and acquire the movement locus of the at least one gesture feature point; and generate the dynamic gesture according to at least one of the plurality of static gestures and the movement locus of the at least one gesture feature point.
The first reference object determining unit is configured to capture a facial feature within the second preset range of the static gesture and use the facial feature as the reference object;
correspondingly, the second reference object determining unit is configured to capture a facial feature within the second preset range of the dynamic gesture and use the facial feature as the reference object.
The instruction acquisition module 603 further comprises:
a first relative position acquiring unit, configured to acquire the relative position relation between the gesture and the reference object;
an identifier acquiring unit, configured to, when the application is a main desktop application, acquire, according to the first preset correspondence, the application identifier corresponding to the gesture, the reference object and the relative position relation, the first preset correspondence comprising correspondences between gestures, reference objects, relative position relations and application identifiers.
The instruction acquisition module 603 further comprises:
a second relative position acquiring unit, configured to acquire the relative position relation between the gesture and the reference object;
an operation acquiring unit, configured to, when the application is an application other than a main desktop application, acquire, according to the second preset correspondence, the control operation corresponding to the gesture, the reference object and the relative position relation, the second preset correspondence comprising correspondences between gestures, reference objects, relative position relations and control operations of the application.
The gesture feature point comprises at least one of a finger joint, a finger segment and a palm.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present disclosure, which will not be described one by one here.
With regard to the apparatus in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method, and will not be elaborated here.
It should be noted that the application control apparatus provided by the above embodiment is illustrated, when controlling an application, only by the division into the above functional modules; in practical applications, the above functions may be assigned to different functional modules as required, that is, the internal structure of the terminal may be divided into different functional modules to complete all or part of the functions described above. In addition, the application control apparatus provided by the above embodiment belongs to the same concept as the embodiments of the application control method; its specific implementation process is described in the method embodiments and is not repeated here.
Fig. 7 is a block diagram of a device 700 according to an exemplary embodiment. The device 700 may be used to start an application or to control an application. For example, the device 700 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.
Referring to Fig. 7, the device 700 may comprise one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 typically controls the overall operations of the device 700, such as operations associated with display, telephone calls, data communication, camera operation and recording operation. The processing component 702 may comprise one or more processors 720 to execute instructions so as to complete all or part of the steps of the above method. In addition, the processing component 702 may comprise one or more modules to facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may comprise a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support the operation of the device 700. Examples of such data comprise instructions for any application or method operated on the device 700, contact data, phonebook data, messages, pictures, video, and so on. The memory 704 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk or an optical disk.
The power component 706 provides power to the various components of the device 700. The power component 706 may comprise a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the device 700.
The multimedia component 708 comprises a screen providing an output interface between the device 700 and the user. In some embodiments, the screen may comprise a liquid crystal display (LCD) and a touch panel (TP). If the screen comprises a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel comprises one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 708 comprises a front camera and/or a rear camera. When the device 700 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 comprises a microphone (MIC) configured to receive external audio signals when the device 700 is in an operation mode, such as a call mode, a recording mode or a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further comprises a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, such as a keyboard, a click wheel or buttons. These buttons may include, but are not limited to, a home button, a volume button, a start button and a lock button.
The sensor component 714 comprises one or more sensors to provide status assessments of various aspects of the device 700. For example, the sensor component 714 may detect the open/closed state of the device 700 and the relative positioning of components, such as the display and the keypad of the device 700. The sensor component 714 may also detect a change in position of the device 700 or a component of the device 700, the presence or absence of user contact with the device 700, the orientation or acceleration/deceleration of the device 700, and a change in temperature of the device 700. The sensor component 714 may comprise a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 714 may also comprise a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 714 may also comprise an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the device 700 and other devices. The device 700 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 716 further comprises a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the device 700 may be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field programmable gate arrays (FPGA), controllers, microcontrollers, microprocessors or other electronic components, for performing the above method.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 704 comprising instructions, the instructions being executable by the processor 720 of the device 700 to perform the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
There is provided a non-transitory computer-readable storage medium; when the instructions in the storage medium are executed by the processor of a mobile terminal, the mobile terminal is enabled to perform an application control method, the method comprising:
capturing a gesture and a reference object within a first preset range;
detecting the application running in the foreground;
acquiring, according to the type of the application, the operation command corresponding to the gesture and the reference object;
executing the operation command during the running of the application.
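Read as a whole, the four steps amount to a capture-detect-lookup-execute loop. The following sketch wires them together; the capture and detection functions are stub placeholders invented for illustration, since the disclosure does not specify these interfaces.

```python
def capture_gesture_and_reference():
    # Stub for step 1: in a real system this would come from the camera
    # pipeline that recognizes the gesture and the reference object.
    return "open_palm", "face", "left_of"

def detect_foreground_app():
    # Stub for step 2: report the type of the foreground application.
    return "main_desktop"

def application_control_method():
    gesture, reference, rel_pos = capture_gesture_and_reference()   # step 1
    app_type = detect_foreground_app()                              # step 2
    if app_type == "main_desktop":                                  # step 3
        command = ("start_app", (gesture, reference, rel_pos))
    else:
        command = ("control_app", (gesture, reference, rel_pos))
    print("executing:", command)                                    # step 4

application_control_method()
```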
The acquiring, according to the type of the application, the operation command corresponding to the gesture and the reference object comprises:
when the application is the main desktop application, acquiring, according to a first preset correspondence relationship, the application identifier corresponding to the gesture and the reference object, where the first preset correspondence relationship comprises correspondences among gestures, reference objects and application identifiers.
The executing the operation command during the running of the application comprises:
during the running of the main desktop application, starting the application indicated by the application identifier.
The acquiring, according to the type of the application, the operation command corresponding to the gesture and the reference object comprises:
when the application is an application other than the main desktop application, acquiring, according to a second preset correspondence relationship, the control operation corresponding to the gesture and the reference object, where the second preset correspondence relationship comprises correspondences among gestures, reference objects and control operations of the application.
The executing the operation command during the running of the application comprises:
during the running of the application, performing the control operation on the application.
The capturing a gesture and a reference object within a first preset range comprises:
capturing a static gesture within the first preset range;
capturing a human body feature within a second preset range of the static gesture, and determining the reference object according to the human body feature; or,
capturing a dynamic gesture within the first preset range;
capturing a human body feature within a second preset range of the dynamic gesture, and determining the reference object according to the human body feature.
The capturing a static gesture within the first preset range comprises:
capturing at least one gesture feature point within the first preset range;
acquiring position information of the at least one gesture feature point;
generating the static gesture according to the position information of the at least one gesture feature point.
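One way to make this concrete: treat each captured feature point as a named 2D position and derive a gesture label from the geometry. The sketch below is a toy illustration under that assumption; the feature names, coordinates and the "open palm" rule are all invented and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Point = Tuple[float, float]  # (x, y) position reported by the capture device

@dataclass
class StaticGesture:
    label: str
    feature_points: Dict[str, Point]

def generate_static_gesture(feature_points: Dict[str, Point]) -> StaticGesture:
    """Generate a static gesture from the position information of the captured
    gesture feature points (finger joints, finger segments, palm). The
    classification rule is a toy example: call the hand 'open_palm' when the
    finger points spread far from the palm."""
    palm = feature_points["palm"]
    spread = sum(abs(x - palm[0]) + abs(y - palm[1])
                 for name, (x, y) in feature_points.items()
                 if name != "palm")
    label = "open_palm" if spread > 4.0 else "fist"
    return StaticGesture(label=label, feature_points=feature_points)

# Example with three captured feature points:
gesture = generate_static_gesture({
    "palm": (0.0, 0.0),
    "index_joint": (1.5, 2.0),
    "thumb_joint": (-1.0, 1.5),
})
print(gesture.label)  # -> "open_palm"
```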
The capturing a dynamic gesture within the first preset range comprises:
capturing at least one gesture feature point within the first preset range;
acquiring position information of the at least one gesture feature point in each sampling period of a motion period, where the motion period comprises a plurality of sampling periods;
generating a plurality of static gestures according to the position information in each sampling period, and acquiring the movement track of the at least one gesture feature point;
generating the dynamic gesture according to at least one of the plurality of static gestures and the movement track of the at least one gesture feature point.
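A dynamic gesture is then assembled from the per-sampling-period positions: the sequence of snapshots yields a series of static gestures, and the sequence of positions of a single feature point yields its movement track. The sketch below classifies a toy "swipe" from the palm's track; the sampling data and the threshold are invented for the example.

```python
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def generate_dynamic_gesture(samples: List[Dict[str, Point]]) -> str:
    """Generate a dynamic gesture from the position information captured in
    each sampling period of a motion period. Each element of `samples` holds
    the feature-point positions for one sampling period; the movement track
    is the sequence of palm positions across the periods."""
    track = [sample["palm"] for sample in samples]  # movement track of one point
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    # Toy classification: a large net horizontal displacement is a swipe.
    if abs(dx) > abs(dy) and abs(dx) > 3.0:
        return "swipe_right" if dx > 0 else "swipe_left"
    return "hold"

# Example: the palm moves right over five sampling periods.
samples = [{"palm": (float(i), 0.0)} for i in range(5)]
print(generate_dynamic_gesture(samples))  # -> "swipe_right"
```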
The capturing a human body feature within a second preset range of the static gesture and determining the reference object according to the human body feature comprises:
capturing a facial feature within the second preset range of the static gesture, and using the facial feature as the reference object.
Correspondingly, the capturing a human body feature within a second preset range of the dynamic gesture and determining the reference object according to the human body feature comprises:
capturing a facial feature within the second preset range of the dynamic gesture, and using the facial feature as the reference object.
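One plausible reading of "within the second preset range" is a simple distance test between the gesture and each detected face, taking the nearest in-range face as the reference object. The sketch below illustrates that reading; the coordinates and the range value are assumptions.

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]

SECOND_PRESET_RANGE = 5.0  # assumed radius around the gesture, arbitrary units

def determine_reference_object(gesture_center: Point,
                               face_centers: List[Point]) -> Optional[Point]:
    """Capture the facial feature within the second preset range of the
    gesture and use it as the reference object; return None when no face
    lies within range."""
    def dist2(a: Point, b: Point) -> float:
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    in_range = [f for f in face_centers
                if dist2(gesture_center, f) <= SECOND_PRESET_RANGE ** 2]
    return min(in_range, key=lambda f: dist2(gesture_center, f)) if in_range else None

print(determine_reference_object((0.0, 0.0), [(3.0, 4.0), (10.0, 0.0)]))
# -> (3.0, 4.0)
```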
The acquiring, according to the type of the application, the operation command corresponding to the gesture and the reference object comprises:
acquiring the relative position relationship between the gesture and the reference object;
when the application is the main desktop application, acquiring, according to the first preset correspondence relationship, the application identifier corresponding to the gesture, the reference object and the relative position relationship, where the first preset correspondence relationship comprises correspondences among gestures, reference objects, relative position relationships and application identifiers.
The acquiring, according to the type of the application, the operation command corresponding to the gesture and the reference object comprises:
acquiring the relative position relationship between the gesture and the reference object;
when the application is an application other than the main desktop application, acquiring, according to the second preset correspondence relationship, the control operation corresponding to the gesture, the reference object and the relative position relationship, where the second preset correspondence relationship comprises correspondences among gestures, reference objects, relative position relationships and control operations of the application.
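The relative position relationship itself can be reduced to a coarse label by comparing the centers of the gesture and the reference object, which then serves as the third key of the preset correspondence relationships sketched earlier. The axis convention and the labels below are assumptions for illustration.

```python
from typing import Tuple

Point = Tuple[float, float]

def relative_position(gesture_center: Point, reference_center: Point) -> str:
    """Classify where the gesture lies relative to the reference object
    (e.g. the user's face), using the dominant displacement axis."""
    dx = gesture_center[0] - reference_center[0]
    dy = gesture_center[1] - reference_center[1]
    if abs(dx) >= abs(dy):
        return "right_of" if dx > 0 else "left_of"
    # Screen coordinates: a smaller y means higher in the frame.
    return "below" if dy > 0 else "above"

print(relative_position((2.0, 0.5), (0.0, 0.0)))  # -> "right_of"
```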
The gesture feature points comprise at least one of a finger joint, a finger segment and a palm.
Other embodiments of the invention will be readily apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses or adaptations of the invention that follow the general principles of the invention and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (24)

1. An application control method, characterized in that the method comprises:
capturing a gesture and a reference object within a first preset range;
detecting the application running in the foreground;
acquiring, according to the type of said application, the operation command corresponding to said gesture and said reference object;
executing said operation command during the running of said application.
2. The method according to claim 1, characterized in that said acquiring, according to the type of said application, the operation command corresponding to said gesture and said reference object comprises:
when said application is the main desktop application, acquiring, according to a first preset correspondence relationship, the application identifier corresponding to said gesture and said reference object, wherein said first preset correspondence relationship comprises correspondences among gestures, reference objects and application identifiers.
3. The method according to claim 2, characterized in that said executing said operation command during the running of said application comprises:
during the running of said main desktop application, starting the application indicated by said application identifier.
4. The method according to claim 1, characterized in that said acquiring, according to the type of said application, the operation command corresponding to said gesture and said reference object comprises:
when said application is an application other than the main desktop application, acquiring, according to a second preset correspondence relationship, the control operation corresponding to said gesture and said reference object, wherein said second preset correspondence relationship comprises correspondences among gestures, reference objects and control operations of said application.
5. The method according to claim 4, characterized in that said executing said operation command during the running of said application comprises:
during the running of said application, performing said control operation on said application.
6. The method according to claim 1, characterized in that said capturing a gesture and a reference object within a first preset range comprises:
capturing a static gesture within said first preset range;
capturing a human body feature within a second preset range of said static gesture, and determining said reference object according to said human body feature; or,
capturing a dynamic gesture within said first preset range;
capturing a human body feature within a second preset range of said dynamic gesture, and determining said reference object according to said human body feature.
7. The method according to claim 6, characterized in that said capturing a static gesture within said first preset range comprises:
capturing at least one gesture feature point within said first preset range;
acquiring position information of said at least one gesture feature point;
generating said static gesture according to the position information of said at least one gesture feature point.
8. The method according to claim 6, characterized in that said capturing a dynamic gesture within said first preset range comprises:
capturing at least one gesture feature point within said first preset range;
acquiring position information of said at least one gesture feature point in each sampling period of a motion period, said motion period comprising a plurality of sampling periods;
generating a plurality of static gestures according to the position information in each sampling period, and acquiring the movement track of said at least one gesture feature point;
generating said dynamic gesture according to at least one of said plurality of static gestures and the movement track of said at least one gesture feature point.
9. The method according to claim 6, characterized in that said capturing a human body feature within a second preset range of said static gesture and determining said reference object according to said human body feature comprises:
capturing a facial feature within the second preset range of said static gesture, and using said facial feature as said reference object;
correspondingly, said capturing a human body feature within a second preset range of said dynamic gesture and determining said reference object according to said human body feature comprises:
capturing a facial feature within the second preset range of said dynamic gesture, and using said facial feature as said reference object.
10. The method according to claim 1, characterized in that said acquiring, according to the type of said application, the operation command corresponding to said gesture and said reference object comprises:
acquiring the relative position relationship between said gesture and said reference object;
when said application is the main desktop application, acquiring, according to said first preset correspondence relationship, the application identifier corresponding to said gesture, said reference object and said relative position relationship, wherein said first preset correspondence relationship comprises correspondences among gestures, reference objects, relative position relationships and application identifiers.
11. The method according to claim 1, characterized in that said acquiring, according to the type of said application, the operation command corresponding to said gesture and said reference object comprises:
acquiring the relative position relationship between said gesture and said reference object;
when said application is an application other than the main desktop application, acquiring, according to said second preset correspondence relationship, the control operation corresponding to said gesture, said reference object and said relative position relationship, wherein said second preset correspondence relationship comprises correspondences among gestures, reference objects, relative position relationships and control operations of said application.
12. The method according to claim 7 or 8, characterized in that said gesture feature points comprise at least one of a finger joint, a finger segment and a palm.
13. An application control device, characterized in that the device comprises:
a capture module, configured to capture a gesture and a reference object within a first preset range;
a detection module, configured to detect the application running in the foreground;
an instruction acquisition module, configured to acquire, according to the type of said application, the operation command corresponding to said gesture and said reference object;
an instruction execution module, configured to execute said operation command during the running of said application.
14. The device according to claim 13, characterized in that said instruction acquisition module comprises:
an application identifier acquiring unit, configured to, when said application is the main desktop application, acquire, according to a first preset correspondence relationship, the application identifier corresponding to said gesture and said reference object, wherein said first preset correspondence relationship comprises correspondences among gestures, reference objects and application identifiers.
15. The device according to claim 14, characterized in that said instruction execution module is configured to start, during the running of said main desktop application, the application indicated by said application identifier.
16. The device according to claim 13, characterized in that said instruction acquisition module comprises:
a control operation acquiring unit, configured to, when said application is an application other than the main desktop application, acquire, according to a second preset correspondence relationship, the control operation corresponding to said gesture and said reference object, wherein said second preset correspondence relationship comprises correspondences among gestures, reference objects and control operations of said application.
17. The device according to claim 16, characterized in that said instruction execution module is configured to perform said control operation on said application during the running of said application.
18. The device according to claim 13, characterized in that said capture module comprises:
a static capture unit, configured to capture a static gesture within said first preset range;
a first reference object determining unit, configured to capture a human body feature within a second preset range of said static gesture, and determine said reference object according to said human body feature; or,
a dynamic capture unit, configured to capture a dynamic gesture within said first preset range;
a second reference object determining unit, configured to capture a human body feature within a second preset range of said dynamic gesture, and determine said reference object according to said human body feature.
19. The device according to claim 18, characterized in that said static capture unit is configured to capture at least one gesture feature point within said first preset range; acquire position information of said at least one gesture feature point; and generate said static gesture according to the position information of said at least one gesture feature point.
20. The device according to claim 18, characterized in that said dynamic capture unit is configured to capture at least one gesture feature point within said first preset range; acquire position information of said at least one gesture feature point in each sampling period of a motion period, said motion period comprising a plurality of sampling periods; generate a plurality of static gestures according to the position information in each sampling period, and acquire the movement track of said at least one gesture feature point; and generate said dynamic gesture according to at least one of said plurality of static gestures and the movement track of said at least one gesture feature point.
21. The device according to claim 18, characterized in that said first reference object determining unit is configured to capture a facial feature within the second preset range of said static gesture, and use said facial feature as said reference object;
correspondingly, said second reference object determining unit is configured to capture a facial feature within the second preset range of said dynamic gesture, and use said facial feature as said reference object.
22. The device according to claim 13, characterized in that said instruction acquisition module further comprises:
a first relative position acquiring unit, configured to acquire the relative position relationship between said gesture and said reference object;
an identifier acquiring unit, configured to, when said application is the main desktop application, acquire, according to said first preset correspondence relationship, the application identifier corresponding to said gesture, said reference object and said relative position relationship, wherein said first preset correspondence relationship comprises correspondences among gestures, reference objects, relative position relationships and application identifiers.
23. The device according to claim 13, characterized in that said instruction acquisition module further comprises:
a second relative position acquiring unit, configured to acquire the relative position relationship between said gesture and said reference object;
an operation acquiring unit, configured to, when said application is an application other than the main desktop application, acquire, according to said second preset correspondence relationship, the control operation corresponding to said gesture, said reference object and said relative position relationship, wherein said second preset correspondence relationship comprises correspondences among gestures, reference objects, relative position relationships and control operations of said application.
24. The device according to claim 19 or 20, characterized in that said gesture feature points comprise at least one of a finger joint, a finger segment and a palm.
CN201410160815.6A 2014-04-21 2014-04-21 Application control method and apparatus Active CN103955274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410160815.6A CN103955274B (en) 2014-04-21 2014-04-21 Application control method and apparatus


Publications (2)

Publication Number Publication Date
CN103955274A true CN103955274A (en) 2014-07-30
CN103955274B CN103955274B (en) 2017-09-01

Family

ID=51332559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410160815.6A Active CN103955274B (en) 2014-04-21 2014-04-21 Application control method and apparatus

Country Status (1)

Country Link
CN (1) CN103955274B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104598265A (en) * 2014-12-12 2015-05-06 宇龙计算机通信科技(深圳)有限公司 Method and system for starting applications based on body resistance values
CN105988560A (en) * 2015-02-03 2016-10-05 中兴通讯股份有限公司 Application starting method and device
CN106453836A (en) * 2016-09-09 2017-02-22 珠海格力电器股份有限公司 Application closing method and device
CN107450717A (en) * 2016-05-31 2017-12-08 联想(北京)有限公司 A kind of information processing method and Wearable
CN107991893A (en) * 2017-11-14 2018-05-04 美的集团股份有限公司 Realize method, gesture identification module, main control module and the home appliance of communication
CN109684006A (en) * 2018-12-11 2019-04-26 东莞市步步高通信软件有限公司 A kind of terminal control method and device
CN110134232A (en) * 2019-04-22 2019-08-16 东风汽车集团有限公司 A kind of mobile phone support adjusting method and system based on gesture identification

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482772A (en) * 2008-01-07 2009-07-15 纬创资通股份有限公司 Electronic device and its operation method
US20110216075A1 (en) * 2010-03-08 2011-09-08 Sony Corporation Information processing apparatus and method, and program
CN102221885A (en) * 2011-06-15 2011-10-19 青岛海信电器股份有限公司 Television, and control method and device thereof
CN102253709A (en) * 2010-05-19 2011-11-23 禾瑞亚科技股份有限公司 Method and device for determining gestures
CN103226389A (en) * 2013-04-27 2013-07-31 苏州佳世达电通有限公司 Method for executing application program according to gesture


Also Published As

Publication number Publication date
CN103955274B (en) 2017-09-01

Similar Documents

Publication Publication Date Title
CN105657173B (en) Volume adjusting method, device and mobile terminal
CN105204742B (en) Control method, device and the terminal of electronic equipment
CN103955275A (en) Application control method and device
CN103955274A (en) Application control method and device
CN105653085B (en) Touch-responsive method and apparatus
CN104182173A (en) Camera switching method and device
CN104793739A (en) Play control method and device
CN104090721A (en) Terminal control method and device
CN106951884A (en) Gather method, device and the electronic equipment of fingerprint
CN104794382A (en) Application starting method and device
CN104461304A (en) Application control method and device
CN104850769A (en) Method and device for executing operation
CN104598130A (en) Mode switching method, terminal, wearable equipment and device
CN104867506A (en) Music automatic control method and device
CN104090741A (en) Statistical method and device for electronic book reading
CN106375782A (en) Video playing method and device
CN109144260B (en) Dynamic motion detection method, dynamic motion control method and device
CN106489113A (en) The method of VR control, device and electronic equipment
CN103995666A (en) Method and device for setting work mode
CN104615359A (en) Method and device for performing voice operation on application software
CN105447150A (en) Face album based music playing method and apparatus, and terminal device
CN105406882A (en) Terminal equipment control method and device
CN106600530A (en) Photograph synthetic method and apparatus
CN106201108B (en) Gloves control mode touch mode control method and device and electronic equipment
CN107529699A (en) Control method of electronic device and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant