CN103955274B - Application control method and apparatus - Google Patents

Application control method and apparatus

Info

Publication number
CN103955274B
CN103955274B · Application CN201410160815.6A · Grant CN 103955274 B
Authority
CN
China
Prior art keywords
gesture
application
preset range
corresponding relation
capture
Prior art date
Legal status
Active
Application number
CN201410160815.6A
Other languages
Chinese (zh)
Other versions
CN103955274A (en)
Inventor
王川
李创奇
刘小鹤
Current Assignee
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date
Filing date
Publication date
Application filed by Xiaomi Inc
Priority to CN201410160815.6A
Publication of CN103955274A
Application granted
Publication of CN103955274B
Legal status: Active


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to an application control method and apparatus, and belongs to the field of terminals. The application control method includes: capturing a gesture and a reference object within a first preset range; detecting the application running in the foreground; obtaining, according to the type of the application, an operation instruction corresponding to the gesture and the reference object; and executing the operation instruction during the running of the application. With the present invention, a user only needs to make a gesture around a reference object for the corresponding operation instruction to be executed during the running of the application, making operation simple and efficient.

Description

Application control method and apparatus
Technical field
The present disclosure relates to the field of terminals, and in particular to an application control method and apparatus.
Background technology
With the development of smart-television technology, multiple applications can be installed on a smart TV to provide different functions. However, the usage scenarios of smart TVs and the operating habits of users mean that a smart TV is still operated through a remote control. Because a remote control provides only buttons such as direction keys and a confirmation key, when many applications are installed on the smart TV the user must press multiple buttons just to launch a particular application, and must press further buttons to control the application while it runs. Taking a video application as an example, the user needs multiple button presses to find the video application, further presses to locate the desired video within it, and yet more presses to start playback. The control operations are cumbersome and time-consuming.
Summary of the invention
To solve the problems in the related art, the present disclosure provides an application control method and apparatus. The technical solutions are as follows:
According to a first aspect of the embodiments of the present disclosure, an application control method is provided, the method including:
capturing a gesture and a reference object within a first preset range;
detecting the application running in the foreground;
obtaining, according to the type of the application, an operation instruction corresponding to the gesture and the reference object; and
executing the operation instruction during the running of the application.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
when the application is the main desktop application, obtaining, according to a first preset correspondence, an application identifier corresponding to the gesture and the reference object, the first preset correspondence including correspondences between gestures, reference objects, and application identifiers.
Executing the operation instruction during the running of the application includes:
during the running of the main desktop application, starting the application indicated by the application identifier.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
when the application is an application other than the main desktop application, obtaining, according to a second preset correspondence, a control operation corresponding to the gesture and the reference object, the second preset correspondence including correspondences between gestures, reference objects, and control operations of the application.
Executing the operation instruction during the running of the application includes:
during the running of the application, performing the control operation on the application.
Capturing the gesture and the reference object within the first preset range includes:
capturing a static gesture within the first preset range; and
capturing a human-body feature within a second preset range of the static gesture and determining the reference object according to the human-body feature; or,
capturing a dynamic gesture within the first preset range; and
capturing a human-body feature within a second preset range of the dynamic gesture and determining the reference object according to the human-body feature.
Capturing the static gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point; and
generating the static gesture according to the position information of the at least one gesture feature point.
Capturing the dynamic gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods;
generating multiple static gestures according to the position information of each sampling period, and obtaining a motion trajectory of the at least one gesture feature point; and
generating the dynamic gesture according to at least one of the multiple static gestures and the motion trajectory of the at least one gesture feature point.
Capturing the human-body feature within the second preset range of the static gesture and determining the reference object according to the human-body feature includes:
capturing a facial feature within the second preset range of the static gesture and using the facial feature as the reference object.
Correspondingly, capturing the human-body feature within the second preset range of the dynamic gesture and determining the reference object according to the human-body feature includes:
capturing a facial feature within the second preset range of the dynamic gesture and using the facial feature as the reference object.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
obtaining a relative position relationship between the gesture and the reference object; and
when the application is the main desktop application, obtaining, according to the first preset correspondence, an application identifier corresponding to the gesture, the reference object, and the relative position relationship, the first preset correspondence including correspondences between gestures, reference objects, relative position relationships, and application identifiers.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
obtaining a relative position relationship between the gesture and the reference object; and
when the application is an application other than the main desktop application, obtaining, according to the second preset correspondence, a control operation corresponding to the gesture, the reference object, and the relative position relationship, the second preset correspondence including correspondences between gestures, reference objects, relative position relationships, and control operations of the application.
The gesture feature point includes at least one of a finger joint, a finger segment, and a palm.
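The two preset correspondences described above are, in essence, lookup tables keyed by gesture, reference object, and optionally the relative position relationship. A minimal sketch in Python, in which every gesture name, application identifier, and control operation is an invented placeholder rather than a mapping taken from the patent:

```python
# First preset correspondence:
# (gesture, reference object, relative position) -> application identifier.
# All keys and values are illustrative assumptions.
FIRST_CORRESPONDENCE = {
    ("telescope", "face", "in_front_of"): "com.example.video",
    ("heart", "face", "beside"): "com.example.music",
}

# Second preset correspondence:
# (gesture, reference object, relative position) -> control operation.
SECOND_CORRESPONDENCE = {
    ("palm_spread", "face", "in_front_of"): "pause",
    ("wave", "face", "beside"): "exit",
}

def lookup_instruction(is_main_desktop, gesture, reference, rel_pos):
    """Return the operation instruction for a captured gesture/reference pair.

    For the main desktop application the instruction is an application
    identifier to launch; for any other application it is a control
    operation. Returns None when the combination is not registered.
    """
    table = FIRST_CORRESPONDENCE if is_main_desktop else SECOND_CORRESPONDENCE
    return table.get((gesture, reference, rel_pos))

print(lookup_instruction(True, "telescope", "face", "in_front_of"))    # an app id
print(lookup_instruction(False, "palm_spread", "face", "in_front_of"))  # a control op
```

The table keyed without the relative position (the simpler variant of the claims) is the same structure with two-element keys.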
According to a second aspect of the embodiments of the present disclosure, an application control apparatus is provided, the apparatus including:
a capture module, configured to capture a gesture and a reference object within a first preset range;
a detection module, configured to detect the application running in the foreground;
an instruction acquisition module, configured to obtain, according to the type of the application, an operation instruction corresponding to the gesture and the reference object; and
an instruction execution module, configured to execute the operation instruction during the running of the application.
The instruction acquisition module includes:
an application identifier acquisition unit, configured to, when the application is the main desktop application, obtain an application identifier corresponding to the gesture and the reference object according to a first preset correspondence, the first preset correspondence including correspondences between gestures, reference objects, and application identifiers.
The instruction execution module is configured to start, during the running of the main desktop application, the application indicated by the application identifier.
The instruction acquisition module includes:
a control operation acquisition unit, configured to, when the application is an application other than the main desktop application, obtain a control operation corresponding to the gesture and the reference object according to a second preset correspondence, the second preset correspondence including correspondences between gestures, reference objects, and control operations of the application.
The instruction execution module is configured to perform the control operation on the application during the running of the application.
The capture module includes:
a static capture unit, configured to capture a static gesture within the first preset range;
a first reference object determination unit, configured to capture a human-body feature within a second preset range of the static gesture and determine the reference object according to the human-body feature; or,
a dynamic capture unit, configured to capture a dynamic gesture within the first preset range; and
a second reference object determination unit, configured to capture a human-body feature within a second preset range of the dynamic gesture and determine the reference object according to the human-body feature.
The static capture unit is configured to capture at least one gesture feature point within the first preset range, obtain position information of the at least one gesture feature point, and generate the static gesture according to the position information of the at least one gesture feature point.
The dynamic capture unit is configured to capture at least one gesture feature point within the first preset range; obtain position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods; generate multiple static gestures according to the position information of each sampling period and obtain a motion trajectory of the at least one gesture feature point; and generate the dynamic gesture according to at least one of the multiple static gestures and the motion trajectory of the at least one gesture feature point.
The first reference object determination unit is configured to capture a facial feature within the second preset range of the static gesture and use the facial feature as the reference object.
Correspondingly, the second reference object determination unit is configured to capture a facial feature within the second preset range of the dynamic gesture and use the facial feature as the reference object.
The instruction acquisition module further includes:
a first relative position acquisition unit, configured to obtain a relative position relationship between the gesture and the reference object; and
an identifier acquisition unit, configured to, when the application is the main desktop application, obtain an application identifier corresponding to the gesture, the reference object, and the relative position relationship according to the first preset correspondence, the first preset correspondence including correspondences between gestures, reference objects, relative position relationships, and application identifiers.
The instruction acquisition module further includes:
a second relative position acquisition unit, configured to obtain a relative position relationship between the gesture and the reference object; and
an operation acquisition unit, configured to, when the application is an application other than the main desktop application, obtain a control operation corresponding to the gesture, the reference object, and the relative position relationship according to the second preset correspondence, the second preset correspondence including correspondences between gestures, reference objects, relative position relationships, and control operations of the application.
The gesture feature point includes at least one of a finger joint, a finger segment, and a palm.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
With the method and apparatus provided by this embodiment, a gesture and a reference object within a first preset range of the terminal are captured; the application running in the foreground is detected; an operation instruction corresponding to the gesture and the reference object is obtained according to the type of the application; and the operation instruction is executed during the running of the application. A user only needs to make a gesture around a reference object for the corresponding operation instruction to be executed during the running of the application, making operation simple and efficient.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present invention and, together with the specification, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an application control method according to an exemplary embodiment;
Fig. 2 is a flowchart of an application control method according to an exemplary embodiment;
Fig. 3a is a schematic diagram of a static gesture according to an exemplary embodiment;
Fig. 3b is a schematic diagram of a play gesture and a reference object according to an exemplary embodiment;
Fig. 3c is a schematic diagram of a telescope gesture and a reference object according to an exemplary embodiment;
Fig. 4 is a flowchart of an application control method according to an exemplary embodiment;
Fig. 5a is a schematic diagram of a waving gesture and a reference object according to an exemplary embodiment;
Fig. 5b is a schematic diagram of a palm-spread gesture and a reference object according to an exemplary embodiment;
Fig. 5c is a schematic diagram of an index-finger-up gesture and a reference object according to an exemplary embodiment;
Fig. 5d is a schematic diagram of a closing gesture and a reference object according to an exemplary embodiment;
Fig. 5e is a schematic diagram of a heart-shaped gesture and a reference object according to an exemplary embodiment;
Fig. 5f is a schematic diagram of an index-finger-up gesture and a reference object according to an exemplary embodiment;
Fig. 6 is a schematic diagram of an application control apparatus according to an exemplary embodiment;
Fig. 7 is a block diagram of a device according to an exemplary embodiment.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present disclosure clearer, the disclosure is described in further detail below with reference to the embodiments and the accompanying drawings. The exemplary embodiments of the disclosure and their descriptions are used to explain the disclosure, not to limit it.
The embodiments of the present disclosure provide an application control method and apparatus, which are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an application control method according to an exemplary embodiment. As shown in Fig. 1, the application control method is used in a terminal and includes the following steps:
In step 101, a gesture and a reference object within a first preset range are captured.
In step 102, the application running in the foreground is detected.
In step 103, an operation instruction corresponding to the gesture and the reference object is obtained according to the type of the application.
In step 104, the operation instruction is executed during the running of the application.
With the method provided by this embodiment, a gesture and a reference object within a first preset range are captured; the application running in the foreground is detected; an operation instruction corresponding to the gesture and the reference object is obtained according to the type of the application; and the operation instruction is executed during the running of the application. A user only needs to make a gesture around a reference object for the corresponding operation instruction to be executed during the running of the application, making operation simple and efficient.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
when the application is the main desktop application, obtaining, according to a first preset correspondence, an application identifier corresponding to the gesture and the reference object, the first preset correspondence including correspondences between gestures, reference objects, and application identifiers.
Executing the operation instruction during the running of the application includes:
during the running of the main desktop application, starting the application indicated by the application identifier.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
when the application is an application other than the main desktop application, obtaining, according to a second preset correspondence, a control operation corresponding to the gesture and the reference object, the second preset correspondence including correspondences between gestures, reference objects, and control operations of the application.
Executing the operation instruction during the running of the application includes:
during the running of the application, performing the control operation on the application.
Capturing the gesture and the reference object within the first preset range includes:
capturing a static gesture within the first preset range; and
capturing a human-body feature within a second preset range of the static gesture and determining the reference object according to the human-body feature; or,
capturing a dynamic gesture within the first preset range; and
capturing a human-body feature within a second preset range of the dynamic gesture and determining the reference object according to the human-body feature.
Capturing the static gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point; and
generating the static gesture according to the position information of the at least one gesture feature point.
Capturing the dynamic gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods;
generating multiple static gestures according to the position information of each sampling period, and obtaining a motion trajectory of the at least one gesture feature point; and
generating the dynamic gesture according to at least one of the multiple static gestures and the motion trajectory of the at least one gesture feature point.
Capturing the human-body feature within the second preset range of the static gesture and determining the reference object according to the human-body feature includes:
capturing a facial feature within the second preset range of the static gesture and using the facial feature as the reference object.
Correspondingly, capturing the human-body feature within the second preset range of the dynamic gesture and determining the reference object according to the human-body feature includes:
capturing a facial feature within the second preset range of the dynamic gesture and using the facial feature as the reference object.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
obtaining a relative position relationship between the gesture and the reference object; and
when the application is the main desktop application, obtaining, according to the first preset correspondence, an application identifier corresponding to the gesture, the reference object, and the relative position relationship, the first preset correspondence including correspondences between gestures, reference objects, relative position relationships, and application identifiers.
Obtaining the operation instruction corresponding to the gesture and the reference object according to the type of the application includes:
obtaining a relative position relationship between the gesture and the reference object; and
when the application is an application other than the main desktop application, obtaining, according to the second preset correspondence, a control operation corresponding to the gesture, the reference object, and the relative position relationship, the second preset correspondence including correspondences between gestures, reference objects, relative position relationships, and control operations of the application.
The gesture feature point includes at least one of a finger joint, a finger segment, and a palm.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present invention, which are not described one by one here.
In one embodiment, Fig. 2 is an exemplary flowchart of the application control method, where the execution subject of this embodiment is a terminal. Referring to Fig. 2, the method includes the following steps:
In step 201, the terminal captures a gesture within the first preset range.
The terminal may be a device such as a TV, a computer, or a mobile phone, and is configured with a motion-sensing capture device through which gestures made by the user can be captured. The motion-sensing capture device may be a motion-sensing camera, a data glove, or the like. The first preset range is determined by the capture range of the motion-sensing capture device, which is not limited in this embodiment. Taking a TV as an example, the TV is configured with a motion-sensing camera electrically connected to the TV; the first preset range can be determined according to the camera's imaging range, and the camera can capture gestures made by the user within the first preset range.
A human hand can be divided into the palm, thumb, index finger, middle finger, ring finger, and little finger, and each finger consists of segments and joints; different gestures are formed as the finger joints move. In this embodiment, to capture a gesture within the first preset range, the terminal monitors the first preset range; when an object is detected within the first preset range, the terminal identifies the object and judges whether it is a human hand. When the object is determined to be a human hand, the terminal obtains the posture of the object as the gesture. Thus, when the user wishes to start an application, the user only needs to reach into the first preset range of the terminal and make the gesture corresponding to the application identifier within that range.
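The monitoring logic just described, watch the range, classify any detected object, and treat its posture as a gesture only if the object is a hand, can be sketched as follows. The frame format and the `is_hand` placeholder classifier are assumptions for illustration, not part of the patent:

```python
def is_hand(detected_object):
    # Placeholder classifier: in a real system this would be the terminal's
    # object-recognition step (e.g. matching against hand feature templates).
    return detected_object.get("kind") == "hand"

def monitor_first_preset_range(frames):
    """Scan a stream of detection frames and yield hand postures as gestures."""
    for frame in frames:
        obj = frame.get("object")
        if obj is None:
            continue  # nothing detected in the first preset range
        if is_hand(obj):
            yield obj["posture"]  # the hand's posture is taken as the gesture

frames = [
    {"object": None},
    {"object": {"kind": "cup"}},  # not a hand: ignored
    {"object": {"kind": "hand", "posture": "fist"}},
]
print(list(monitor_first_preset_range(frames)))  # ['fist']
```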
Gestures made by the user can be divided into static gestures and dynamic gestures. Accordingly, step 201, "the terminal captures a gesture within the first preset range", may include the following step 201a or 201b:
201a. The terminal captures a static gesture within the first preset range.
A static gesture is a gesture made by the user that remains stationary. When the user reaches into the first preset range of the terminal, makes a gesture, and holds it still, the terminal can capture the static gesture.
In this embodiment, the terminal may use at least one of the finger joints, finger segments, and palm of the hand as gesture feature points. The terminal detects whether any gesture feature point appears within the preset range; when the terminal captures a gesture feature point within the first preset range, the gesture can be captured according to the gesture feature point. That is, step 201a may include the following steps 201a-1 to 201a-3:
201a-1. The terminal captures at least one gesture feature point within the first preset range.
The terminal may extract the features of each gesture feature point in advance. When an object is detected within the first preset range, the terminal extracts the features of the object and judges whether they match the features of any gesture feature point; when the features of the object match those of a gesture feature point, it can be determined that the gesture feature point has been captured. For example, when the terminal determines that the features of the object match the features of a finger joint, it determines that a finger joint has been captured.
201a-2. The terminal obtains the position information of the at least one gesture feature point.
The terminal may establish a three-dimensional coordinate system. When the terminal captures the at least one gesture feature point, it determines the position of the at least one gesture feature point and represents that position as coordinate values in the three-dimensional coordinate system, thereby obtaining the position information of the at least one gesture feature point.
201a-3. The terminal generates the static gesture according to the position information of the at least one gesture feature point.
The terminal may perform curve fitting according to the position information of the at least one gesture feature point to obtain the static gesture. Taking finger joints as an example, the terminal captures all finger joints within the first preset range, obtains the position information of each finger joint, performs curve fitting according to that position information, and generates the static gesture.
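As a rough picture of turning feature-point positions into a gesture shape, the sketch below uses the simplest possible "curve fitting": a piecewise-linear interpolation through ordered 3-D joint coordinates. The joint data are invented, and a real system would use a richer curve model; this only illustrates the step of going from discrete position information to a continuous hand contour:

```python
def interpolate_curve(points, samples_per_segment=4):
    """Piecewise-linear stand-in for the patent's curve-fitting step.

    Given the ordered 3-D position information of gesture feature points,
    produce a denser polyline approximating the hand contour.
    """
    curve = []
    for (x0, y0, z0), (x1, y1, z1) in zip(points, points[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            curve.append((x0 + t * (x1 - x0),
                          y0 + t * (y1 - y0),
                          z0 + t * (z1 - z0)))
    curve.append(points[-1])  # close with the final joint itself
    return curve

# Hypothetical finger-joint positions in the terminal's 3-D coordinate system.
joints = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 1.5, 0.5)]
curve = interpolate_curve(joints)
print(len(curve))  # 9 points: 4 per segment over 2 segments, plus the last joint
print(curve[2])    # halfway along the first segment: (0.5, 0.5, 0.0)
```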
201b. The terminal captures a dynamic gesture within the first preset range.
A dynamic gesture is a gesture made by the user that is in motion. When the user reaches into the first preset range of the terminal and makes a moving gesture, the terminal can capture the dynamic gesture.
Based on the gesture feature points described above, step 201b may include the following steps 201b-1 to 201b-4:
201b-1. The terminal captures at least one gesture feature point within the first preset range.
201b-2. The terminal obtains the position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods.
The terminal may preset the duration of the motion period, which includes multiple sampling periods; the sampling period is the sampling time interval of the motion-sensing capture device configured on the terminal. When the terminal captures the at least one gesture feature point, the motion period begins and the terminal obtains the current position information of the at least one gesture feature point; thereafter, the terminal obtains one item of position information each time a sampling period elapses. When the motion period ends, the terminal has thus obtained multiple items of position information for the at least one gesture feature point.
For example, if the terminal sets the motion period to 1 s and the sampling period to 0.1 s, and the terminal captures the at least one gesture feature point at 0 s, it obtains the current position information at that moment and then obtains position information every 0.1 s until 1 s is reached, at which point the terminal has obtained 11 items of position information for the at least one gesture feature point.
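The arithmetic in this example (a 1 s motion period sampled every 0.1 s yields 11 position records, because the initial capture at 0 s counts as a record too) can be checked with a small sketch:

```python
def position_record_count(motion_period, sampling_period):
    """Number of position records collected over one motion period.

    One record is taken when the feature point is first captured (t = 0),
    then one more at the end of each sampling period until the motion
    period ends. Rounding guards against floating-point division error.
    """
    return int(round(motion_period / sampling_period)) + 1

print(position_record_count(1.0, 0.1))  # 11, matching the example above
```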
201b-3. The terminal generates multiple static gestures according to the position information of each sampling period, and obtains the motion trajectory of the at least one gesture feature point.
For one sampling period, the terminal performs curve fitting according to the position information of the at least one gesture feature point, obtaining one static gesture. And for one gesture feature point, the terminal performs curve fitting according to the position information of that gesture feature point in each sampling period, obtaining the motion trajectory of the gesture feature point.
201b-4. The terminal generates the dynamic gesture according to at least one of the multiple static gestures and the motion trajectory of the at least one gesture feature point.
In this embodiment, the terminal may simulate, according to the multiple static gestures, the motion the hand makes when switching from one static gesture to the next, thereby obtaining the dynamic gesture; or it may simulate the motion of the hand according to the motion trajectory of the at least one gesture feature point, generating the dynamic gesture; or it may generate the dynamic gesture according to both the multiple static gestures and the motion trajectory of the at least one gesture feature point.
Referring to Fig. 3a, the terminal obtains multiple static gestures and simulates the motion the hand makes when switching from one static gesture to the next, and can thereby obtain a palm-swinging gesture.
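As one way to picture the trajectory-based branch, the sketch below labels a feature point's motion as a left-right "swing" (as in a waving palm) when its x-coordinate reverses direction. The labels, the coordinate choice, and the noise threshold are all illustrative assumptions rather than the patent's actual recognition rules:

```python
def classify_trajectory(xs):
    """Classify a feature point's x-coordinate trajectory across sampling periods.

    Returns 'swing' when the horizontal motion reverses direction at least
    once, 'sweep' for monotonic motion, and 'still' when the point barely
    moves. Purely illustrative labels.
    """
    # Keep only movements above a small noise threshold.
    deltas = [b - a for a, b in zip(xs, xs[1:]) if abs(b - a) > 1e-6]
    if not deltas:
        return "still"
    # A reversal is a sign change between consecutive movements.
    reversals = sum(1 for d0, d1 in zip(deltas, deltas[1:]) if d0 * d1 < 0)
    return "swing" if reversals >= 1 else "sweep"

print(classify_trajectory([0.0, 0.5, 1.0, 0.5, 0.0]))  # 'swing'
print(classify_trajectory([0.0, 0.3, 0.6, 0.9]))       # 'sweep'
print(classify_trajectory([0.2, 0.2, 0.2]))            # 'still'
```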
In step 202, the terminal captures a facial feature within a second preset range of the gesture and takes the facial feature as the reference object.
In this embodiment, when the terminal captures the gesture, it also captures facial features within the second preset range of the gesture. When an object is detected within the second preset range of the gesture, the terminal identifies the object and judges whether it is a facial feature; when the object is determined to be a facial feature, a facial feature is deemed captured and is taken as the reference object. The second preset range may be set by technicians during development relative to the gesture, which is not limited in this embodiment.
More generally, the terminal may capture any human-body feature within the second preset range of the gesture and determine the reference object according to that feature, for example the facial features (eyes, ears, mouth, nose and eyebrows) as well as the limbs and the torso. This embodiment is described only by taking the capture of a facial feature as an example.
In step 203, the terminal detects the application running in the foreground and judges whether the application is the main desktop application.
In this embodiment, the application running in the foreground may be any application installed on the terminal; it may be the main desktop application or another application. The terminal detects the application running in the foreground and judges whether it is the main desktop application. When the application is the main desktop application, the terminal determines that the gesture and the reference object are used to start the application corresponding to them; when the application is an application other than the main desktop application, the terminal determines that the gesture and the reference object are used to control that application by performing the corresponding control operation.
In step 204, when the application is the main desktop application, the terminal obtains the application identifier corresponding to the gesture and the reference object according to a first preset correspondence.
This embodiment is described only by taking the case where the application running in the foreground is the main desktop application as an example. The first preset correspondence includes the correspondence between gestures, reference objects and application identifiers; an application identifier may be an application name, an application number or the like, which is not limited in this embodiment.
In this embodiment, the gesture may be above, below, to the left of, to the right of, in front of or behind the reference object, and different relative positions between the gesture and the reference object may correspond to different application identifiers. The first preset correspondence may therefore include the correspondence between gestures, reference objects, relative position relations and application identifiers. Accordingly, when capturing the gesture and the reference object, the terminal obtains the relative position relation between them and, according to the first preset correspondence, obtains the application identifier corresponding to the gesture, the reference object and the relative position relation, so as to start the application indicated by that identifier.
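A minimal sketch of such a first preset correspondence as a lookup table; the gesture names, reference objects, relative positions and application identifiers are hypothetical placeholders, not values from the patent:

```python
# (gesture, reference object, relative position) → application identifier.
FIRST_CORRESPONDENCE = {
    ("play_gesture", "torso", "front"): "music_app",
    ("telescope_gesture", "eyes", "front"): "query_app",
}

def lookup_app_id(gesture, reference, relative_position):
    """Return the application identifier for a captured combination,
    or None when the combination has no entry."""
    return FIRST_CORRESPONDENCE.get((gesture, reference, relative_position))

print(lookup_app_id("play_gesture", "torso", "front"))  # → music_app
```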
In addition, the terminal may show the user in advance, according to the first preset correspondence, images of the gesture and reference object corresponding to each application identifier, so that the user knows which gesture to make, and around which reference object, in order to start each application.
In step 205, the terminal starts the application indicated by the application identifier. Each application identifier indicates one application; once the terminal obtains the application identifier, it can start the application indicated by it.
In this embodiment, multiple applications are installed on the terminal, and the gestures and reference objects corresponding to their application identifiers are determined in advance. When the terminal captures any such gesture and a reference object within the second preset range of the gesture, it can obtain the application identifier corresponding to the gesture and the reference object according to the first preset correspondence and start the application indicated by that identifier, which is simple and efficient to operate.
For example, the method includes step (2-1) or (2-2):
(2-1) When the terminal captures a playing gesture and the reference object is the torso, it starts a music application.
Referring to Fig. 3b, when the terminal detects the five fingers bending and moving back and forth, it determines that a playing gesture is captured; when it determines that the human-body feature within the second preset range of the playing gesture is the torso, it determines that the reference object is the torso, and then starts the music application.
(2-2) When the terminal captures a telescope gesture and the reference object is the eyes, it starts a query application.
The query application may be an application for querying local files, an application for querying network files or the like, which is not limited in this embodiment. Referring to Fig. 3c, when the terminal detects that the thumb and the remaining four fingers of each hand form a ring and the two eyes face the terminal through the two rings, it determines that the telescope gesture is captured and the reference object is the eyes, and then starts the query application.
It should be noted that the examples above only provide several gestures and reference objects for starting applications, and do not constitute a limitation on gestures and reference objects.
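Putting steps 201 to 205 together, the launch branch and the control branch can be sketched as one dispatcher (the table contents and application names are illustrative assumptions):

```python
def handle_gesture(gesture, reference, foreground_app, first_table, second_tables):
    """When the main desktop application is in the foreground, look up an
    application to launch; otherwise look up a control operation for the
    application currently in the foreground."""
    if foreground_app == "main_desktop":
        return ("launch", first_table.get((gesture, reference)))
    ops = second_tables.get(foreground_app, {})
    return ("control", ops.get((gesture, reference)))

first = {("play_gesture", "torso"): "music_app",
         ("telescope_gesture", "eyes"): "query_app"}
second = {"music_app": {("palm_spread", "ear"): "volume_up"}}
print(handle_gesture("telescope_gesture", "eyes", "main_desktop", first, second))
# → ('launch', 'query_app')
```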
In the method provided by this embodiment, the terminal captures the gesture and the reference object within the first preset range, detects the application running in the foreground, obtains the operation instruction corresponding to the gesture and the reference object according to the type of the application, and performs the operation instruction while the application runs. The user only needs to make a gesture around a reference object to start the application corresponding to that gesture from among multiple applications, which is simple and efficient to operate. Moreover, the terminal captures both static and dynamic gestures within the preset range, which improves flexibility.
Fig. 4 is a flow chart of an application control method according to an exemplary embodiment. As shown in Fig. 4, the application control method is used in a terminal and includes the following steps:
In step 401, the terminal captures a gesture within a first preset range.
Step 401 is similar to step 201 and is not repeated here.
In step 402, the terminal captures a facial feature within a second preset range of the gesture and takes the facial feature as the reference object.
In step 403, the terminal detects the application running in the foreground and judges whether the application is the main desktop application.
In step 404, when the application is an application other than the main desktop application, the terminal obtains the control operation corresponding to the gesture and the reference object according to a second preset correspondence.
This embodiment is described only by taking the case where the application running in the foreground is not the main desktop application as an example. The second preset correspondence includes the correspondence between gestures, reference objects and control operations of the application; a control operation may be clicking any button in the application, closing the application or the like, which is not limited in this embodiment.
In this embodiment, the same gesture and reference object may correspond to different control operations in different applications. To distinguish between them, the terminal may determine a second preset correspondence for each application separately. For each application, the terminal obtains the second preset correspondence of that application, so that when the gesture and the reference object are captured, the corresponding control operation is obtained according to the second preset correspondence of that application.
In this embodiment, the gesture may be above, below, to the left of, to the right of, in front of or behind the reference object, and different relative positions between the gesture and the reference object may correspond to different control operations. The second preset correspondence may therefore include the correspondence between gestures, reference objects, relative position relations and control operations of the application. Accordingly, when capturing the gesture and the reference object, the terminal obtains the relative position relation between them and, according to the second preset correspondence, obtains the control operation corresponding to the gesture, the reference object and the relative position relation, so as to perform that control operation on the application.
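A sketch of a per-application second preset correspondence that also keys on the relative position relation; every name below is a hypothetical placeholder:

```python
# application → {(gesture, reference object, relative position) → control op}
SECOND_CORRESPONDENCE = {
    "video_app": {
        ("palm_spread", "ear", "beside"): "volume_up",
        ("forefinger_up", "face", "front"): "volume_down",
    },
}

def lookup_control_op(app, gesture, reference, relative_position):
    """Step 404: fetch the control operation from the table that belongs
    to the application currently running in the foreground."""
    return SECOND_CORRESPONDENCE.get(app, {}).get(
        (gesture, reference, relative_position))

print(lookup_control_op("video_app", "palm_spread", "ear", "beside"))  # → volume_up
```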
In addition, the terminal may show the user in advance, according to the second preset correspondence, images of the gesture and reference object corresponding to each control operation, so that the user knows which gesture to make, and around which reference object, in order to control the application.
In step 405, the terminal performs the control operation on the application.
In this embodiment, when the terminal captures any gesture while the application is running, and captures a facial feature within the second preset range of the gesture, it can obtain the control operation corresponding to the gesture and the reference object according to the second preset correspondence and perform that control operation on the application, which is simple and efficient to operate.
For example, the method includes any one of steps (4-1) to (4-6):
(4-1) When the terminal captures a waving gesture and the reference object is the arm, it activates the currently selected function.
While the application is running, the terminal may provide a variety of functions, and the function setting menu of the application allows the user to activate or close any of them.
Referring to Fig. 5a, when the terminal detects the palm facing the terminal and swinging, it confirms that a waving gesture is captured; when it determines that the only human-body feature within the second preset range of the waving gesture is the arm, it determines that the reference object is the arm, then determines the currently selected function and activates it.
(4-2) When the terminal captures a palm-spreading gesture and the reference object is an ear, it raises the volume.
Referring to Fig. 5b, when the terminal detects the palm spread with the five fingers together, it determines that the palm-spreading gesture is captured; when it determines that the human-body feature within the second preset range of the palm-spreading gesture is an ear, it determines that the reference object is the ear, and then raises the volume.
(4-3) When the terminal captures a forefinger-up gesture and the reference object is the face, it lowers the volume.
Referring to Fig. 5c, when the terminal detects the forefinger extended vertically upward with the remaining four fingers clenched, it determines that the forefinger-up gesture is captured; when it determines that the human-body feature within the second preset range of the forefinger-up gesture is the face, it determines that the reference object is the face, and then lowers the volume.
(4-4) When the terminal captures a closing gesture and the reference object is the torso, it closes the application.
Referring to Fig. 5d, when the terminal detects both hands crossed in front of the chest, it determines that the closing gesture is captured and the reference object is the torso, and then closes the application.
(4-5) When the terminal captures a heart-shaped gesture and the reference object is the head, it performs an upvote operation on the currently displayed information.
While the application is running, the terminal may display information published on an information display platform, and the user may comment on that information, for example by posting a comment, upvoting it or downvoting it. When the terminal captures the heart-shaped gesture, it performs the upvote operation on the currently displayed information.
Referring to Fig. 5e, when the terminal detects the fingers of both hands joined together, it determines that the heart-shaped gesture is captured; when it determines that the human-body feature within the second preset range of the heart-shaped gesture is the head, it determines that the reference object is the head, and then performs the upvote operation on the currently displayed information.
(4-6) When the terminal captures a forefinger-up gesture and the reference object is the nose, it performs a downvote operation on the currently displayed information.
Referring to Fig. 5f, when the terminal detects the forefinger extended vertically upward with the other fingers clenched, it confirms that the forefinger-up gesture is captured; when it determines that the human-body feature within the second preset range of the forefinger-up gesture is the nose, it determines that the reference object is the nose, and then performs the downvote operation on the currently displayed information.
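The six example pairs (4-1) to (4-6) can be collected into one table. Note that the same forefinger-up gesture maps to different operations depending on the reference object, which is precisely why the reference object is captured at all. The operation names are illustrative stand-ins for the patent's descriptions:

```python
CONTROL_EXAMPLES = {
    ("wave", "arm"): "activate_selected_function",         # (4-1)
    ("palm_spread", "ear"): "volume_up",                   # (4-2)
    ("forefinger_up", "face"): "volume_down",              # (4-3)
    ("close", "torso"): "close_application",               # (4-4)
    ("heart", "head"): "upvote_displayed_info",            # (4-5)
    ("forefinger_up", "nose"): "downvote_displayed_info",  # (4-6)
}

# Same gesture, different reference object, different operation:
print(CONTROL_EXAMPLES[("forefinger_up", "face")])  # → volume_down
print(CONTROL_EXAMPLES[("forefinger_up", "nose")])  # → downvote_displayed_info
```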
It should be noted that the examples above only provide several gestures and reference objects for controlling the application, and do not constitute a limitation on gestures and reference objects.
In the method provided by this embodiment, the terminal captures the gesture and the reference object within the first preset range, detects the application running in the foreground, obtains the operation instruction corresponding to the gesture and the reference object according to the type of the application, and performs the operation instruction while the application runs. The user only needs to make a gesture around a reference object to perform the corresponding control operation on the application, which is simple and efficient to operate.
Fig. 6 is a schematic diagram of an application control apparatus according to an exemplary embodiment. Referring to Fig. 6, the apparatus includes a capture module 601, a detection module 602, an instruction acquisition module 603 and an instruction execution module 604.
The capture module 601 is configured to capture a gesture and a reference object within a first preset range;
the detection module 602 is configured to detect the application running in the foreground;
the instruction acquisition module 603 is configured to obtain, according to the type of the application, the operation instruction corresponding to the gesture and the reference object;
the instruction execution module 604 is configured to perform the operation instruction while the application runs.
In the apparatus provided by this embodiment, the capture module captures the gesture and the reference object within the first preset range, the detection module detects the application running in the foreground, the instruction acquisition module obtains the operation instruction corresponding to the gesture and the reference object according to the type of the application, and the instruction execution module performs the operation instruction while the application runs. The user only needs to make a gesture around a reference object for the corresponding operation instruction to be performed while the application runs, which is simple and efficient to operate.
The instruction acquisition module 603 includes:
an application identifier acquisition unit, configured to obtain, when the application is the main desktop application, the application identifier corresponding to the gesture and the reference object according to a first preset correspondence, the first preset correspondence including the correspondence between gestures, reference objects and application identifiers.
The instruction execution module 604 is configured to start, while the main desktop application runs, the application indicated by the application identifier.
Alternatively, the instruction acquisition module 603 includes:
a control operation acquisition unit, configured to obtain, when the application is an application other than the main desktop application, the control operation corresponding to the gesture and the reference object according to a second preset correspondence, the second preset correspondence including the correspondence between gestures, reference objects and control operations of the application.
The instruction execution module 604 is configured to perform the control operation on the application while the application runs.
The capture module 601 includes:
a static capture unit, configured to capture a static gesture within the first preset range;
a first reference object determination unit, configured to capture a human-body feature within a second preset range of the static gesture and determine the reference object according to the human-body feature; or
a dynamic capture unit, configured to capture a dynamic gesture within the first preset range;
a second reference object determination unit, configured to capture a human-body feature within a second preset range of the dynamic gesture and determine the reference object according to the human-body feature.
The static capture unit is configured to capture at least one gesture feature point within the first preset range, obtain the position information of the at least one gesture feature point, and generate the static gesture according to the position information of the at least one gesture feature point.
The dynamic capture unit is configured to capture at least one gesture feature point within the first preset range; obtain the position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods; generate multiple static gestures according to the position information of each sampling period and obtain the movement trajectory of the at least one gesture feature point; and generate the dynamic gesture according to at least one of the multiple static gestures and the movement trajectory of the at least one gesture feature point.
The first reference object determination unit is configured to capture a facial feature within the second preset range of the static gesture and take the facial feature as the reference object;
correspondingly, the second reference object determination unit is configured to capture a facial feature within the second preset range of the dynamic gesture and take the facial feature as the reference object.
The instruction acquisition module 603 further includes:
a first relative position acquisition unit, configured to obtain the relative position relation between the gesture and the reference object;
an identifier acquisition unit, configured to obtain, when the application is the main desktop application, the application identifier corresponding to the gesture, the reference object and the relative position relation according to the first preset correspondence, the first preset correspondence including the correspondence between gestures, reference objects, relative position relations and application identifiers.
Alternatively, the instruction acquisition module 603 further includes:
a second relative position acquisition unit, configured to obtain the relative position relation between the gesture and the reference object;
an operation acquisition unit, configured to obtain, when the application is an application other than the main desktop application, the control operation corresponding to the gesture, the reference object and the relative position relation according to the second preset correspondence, the second preset correspondence including the correspondence between gestures, reference objects, relative position relations and control operations of the application.
A gesture feature point includes at least one of a finger joint, a finger segment and the palm.
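A possible in-memory representation of a gesture feature point of the kinds listed above; the class name, field names and validation are assumptions for illustration, not part of the patent:

```python
from dataclasses import dataclass

FEATURE_KINDS = ("finger_joint", "finger_segment", "palm")

@dataclass
class GestureFeaturePoint:
    kind: str   # one of FEATURE_KINDS
    x: float    # position information as captured in a sampling period
    y: float

    def __post_init__(self):
        if self.kind not in FEATURE_KINDS:
            raise ValueError(f"unknown feature kind: {self.kind}")

p = GestureFeaturePoint("palm", 0.4, 0.7)
print(p.kind, p.x, p.y)  # → palm 0.4 0.7
```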
All the optional technical solutions above may be combined in any manner to form alternative embodiments of the present invention, which are not described one by one here.
As for the apparatus in the embodiment above, the specific manner in which each module performs its operation has been described in detail in the embodiments of the related method, and is not elaborated here.
It should be noted that when the application control apparatus provided by the embodiment above controls an application, the division into the functional modules described above is only an example; in practical applications, the functions above may be assigned to different functional modules as needed, that is, the internal structure of the terminal may be divided into different functional modules to complete all or part of the functions described above. In addition, the application control apparatus provided by the embodiment above belongs to the same concept as the application control method embodiments; its specific implementation process is detailed in the method embodiments and is not repeated here.
Fig. 7 is a block diagram of a device 700 according to an exemplary embodiment. The device 700 may be used to start an application or control an application. For example, the device 700 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant or the like.
Referring to Fig. 7, the device 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714 and a communication component 716.
The processing component 702 typically controls the overall operation of the device 700, such as operations associated with display, telephone calls, data communication, camera operation and recording. The processing component 702 may include one or more processors 720 to execute instructions so as to complete all or part of the steps of the method above. In addition, the processing component 702 may include one or more modules to facilitate interaction between the processing component 702 and other components; for example, it may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support the operation of the device 700. Examples of such data include instructions for any application or method operated on the device 700, contact data, phone book data, messages, pictures, videos and so on. The memory 704 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk or an optical disc.
The power component 706 provides power to the various components of the device 700. The power component 706 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the device 700.
The multimedia component 708 includes a screen that provides an output interface between the device 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action but also the duration and pressure associated with the touch or swipe. In some embodiments, the multimedia component 708 includes a front camera and/or a rear camera. When the device 700 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front or rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a microphone (MIC), which is configured to receive external audio signals when the device 700 is in an operating mode, such as a call mode, a recording mode or a speech recognition mode. The received audio signals may be further stored in the memory 704 or sent via the communication component 716. In some embodiments, the audio component 710 also includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, a click wheel, buttons and the like. These buttons may include but are not limited to: a home button, volume buttons, a start button and a lock button.
The sensor component 714 includes one or more sensors for providing state assessments of various aspects of the device 700. For example, the sensor component 714 may detect the open/closed state of the device 700 and the relative positioning of components (e.g., the display and the keypad of the device 700); it may also detect a change in position of the device 700 or of one of its components, the presence or absence of user contact with the device 700, the orientation or acceleration/deceleration of the device 700, and a change in temperature of the device 700. The sensor component 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 714 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the device 700 and other devices. The device 700 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 also includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the device 700 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for performing the method above.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 704 including instructions, which can be executed by the processor 720 of the device 700 to complete the method above. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device or the like.
A non-transitory computer-readable storage medium is provided such that, when the instructions in the storage medium are executed by a processor of a mobile terminal, the mobile terminal is enabled to perform an application control method, the method including:
capturing a gesture and a reference object within a first preset range;
detecting the application running in the foreground;
obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object;
performing the operation instruction while the application runs.
Obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
when the application is the main desktop application, obtaining the application identifier corresponding to the gesture and the reference object according to a first preset correspondence, the first preset correspondence including the correspondence between gestures, reference objects and application identifiers.
Performing the operation instruction while the application runs includes:
starting, while the main desktop application runs, the application indicated by the application identifier.
Obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
when the application is an application other than the main desktop application, obtaining the control operation corresponding to the gesture and the reference object according to a second preset correspondence, the second preset correspondence including the correspondence between gestures, reference objects and control operations of the application.
Performing the operation instruction while the application runs includes:
performing the control operation on the application while the application runs.
Capturing the gesture and the reference object within the first preset range includes:
capturing a static gesture within the first preset range;
capturing a human-body feature within a second preset range of the static gesture and determining the reference object according to the human-body feature; or
capturing a dynamic gesture within the first preset range;
capturing a human-body feature within a second preset range of the dynamic gesture and determining the reference object according to the human-body feature.
Capturing the static gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining the position information of the at least one gesture feature point;
generating the static gesture according to the position information of the at least one gesture feature point.
Capturing the dynamic gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods;
generating multiple static gestures according to the position information in each sampling period, and obtaining a movement trajectory of the at least one gesture feature point;
generating the dynamic gesture according to at least one of the multiple static gestures and the movement trajectory of the at least one gesture feature point.
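The trajectory branch of the steps above can be sketched by taking one feature point's per-sampling-period positions and classifying the dominant displacement axis. Classifying by net displacement, and the four swipe labels, are illustrative choices rather than the patent's method.

```python
def generate_dynamic_gesture(trajectory):
    """Generate a dynamic gesture from the movement trajectory of a single
    gesture feature point, where `trajectory` holds its (x, y) position in
    each sampling period of the motion period. The label is chosen from the
    dominant axis of the net displacement (image coordinates: y grows down)."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"
```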
Capturing the human-body feature within the second preset range around the static gesture and determining the reference object according to the human-body feature includes:
capturing a facial feature within the second preset range around the static gesture and using the facial feature as the reference object;
correspondingly, capturing the human-body feature within the second preset range around the dynamic gesture and determining the reference object according to the human-body feature includes:
capturing a facial feature within the second preset range around the dynamic gesture and using the facial feature as the reference object.
Obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
obtaining the relative position relationship between the gesture and the reference object;
when the application is the main desktop application, obtaining, according to a first preset correspondence, the application identifier corresponding to the gesture, the reference object, and the relative position relationship, the first preset correspondence including correspondences between gestures, reference objects, relative position relationships, and application identifiers.
Obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
obtaining the relative position relationship between the gesture and the reference object;
when the application is an application other than the main desktop application, obtaining, according to a second preset correspondence, the control operation corresponding to the gesture, the reference object, and the relative position relationship, the second preset correspondence including correspondences between gestures, reference objects, relative position relationships, and control operations of the application.
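The relative position relationship can be sketched as a coarse direction of the gesture's position relative to the reference object (e.g. the user's face), which then extends the lookup key to a triple. The direction labels, table entries, and operation names below are hypothetical.

```python
def relative_position(gesture_pos, reference_pos):
    """Coarse relative position of the gesture with respect to the
    reference object, by dominant axis of the offset (image coordinates:
    y grows downward)."""
    dx = gesture_pos[0] - reference_pos[0]
    dy = gesture_pos[1] - reference_pos[1]
    if abs(dx) >= abs(dy):
        return "right_of" if dx > 0 else "left_of"
    return "below" if dy > 0 else "above"

# Hypothetical second preset correspondence keyed by the full triple of
# gesture, reference object, and relative position relationship.
CORRESPONDENCE = {
    ("open_palm", "face", "right_of"): "volume_up",
    ("open_palm", "face", "left_of"): "volume_down",
}

def lookup_operation(gesture, reference, gesture_pos, reference_pos):
    """Resolve the control operation from the triple-keyed correspondence."""
    rel = relative_position(gesture_pos, reference_pos)
    return CORRESPONDENCE.get((gesture, reference, rel))
```

Keying on the triple is what lets the same gesture trigger different operations depending on where it is made relative to the reference object.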
The gesture feature point includes at least one of a finger joint, a finger segment, and a palm.
Those skilled in the art will readily conceive of other embodiments of the present invention after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
It should be understood that the present invention is not limited to the precise constructions described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (20)

1. An application control method, characterized in that the method includes:
capturing a gesture and a reference object within a first preset range;
detecting an application running in the foreground;
obtaining, according to the type of the application, an operation instruction corresponding to the gesture and the reference object;
executing the operation instruction during the running of the application;
wherein capturing the gesture and the reference object within the first preset range includes:
capturing a static gesture within the first preset range, capturing a human-body feature within a second preset range around the static gesture, and determining the human-body feature as the reference object; or,
capturing a dynamic gesture within the first preset range, capturing a human-body feature within a second preset range around the dynamic gesture, and determining the human-body feature as the reference object;
wherein capturing the dynamic gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods;
generating multiple static gestures according to the position information in each sampling period, and obtaining a movement trajectory of the at least one gesture feature point;
generating the dynamic gesture according to at least one of the multiple static gestures and the movement trajectory of the at least one gesture feature point.
2. The method according to claim 1, characterized in that obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
when the application is the main desktop application, obtaining, according to a first preset correspondence, the application identifier corresponding to the gesture and the reference object, the first preset correspondence including correspondences between gestures, reference objects, and application identifiers.
3. The method according to claim 2, characterized in that executing the operation instruction during the running of the application includes:
during the running of the main desktop application, starting the application indicated by the application identifier.
4. The method according to claim 1, characterized in that obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
when the application is an application other than the main desktop application, obtaining, according to a second preset correspondence, the control operation corresponding to the gesture and the reference object, the second preset correspondence including correspondences between gestures, reference objects, and control operations of the application.
5. The method according to claim 4, characterized in that executing the operation instruction during the running of the application includes:
during the running of the application, performing the control operation on the application.
6. The method according to claim 1, characterized in that capturing the static gesture within the first preset range includes:
capturing at least one gesture feature point within the first preset range;
obtaining position information of the at least one gesture feature point;
generating the static gesture according to the position information of the at least one gesture feature point.
7. The method according to claim 1, characterized in that capturing the human-body feature within the second preset range around the static gesture and determining the human-body feature as the reference object includes:
capturing a facial feature within the second preset range around the static gesture and using the facial feature as the reference object;
correspondingly, capturing the human-body feature within the second preset range around the dynamic gesture and determining the human-body feature as the reference object includes:
capturing a facial feature within the second preset range around the dynamic gesture and using the facial feature as the reference object.
8. The method according to claim 1, characterized in that obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
obtaining the relative position relationship between the gesture and the reference object;
when the application is the main desktop application, obtaining, according to a first preset correspondence, the application identifier corresponding to the gesture, the reference object, and the relative position relationship, the first preset correspondence including correspondences between gestures, reference objects, relative position relationships, and application identifiers.
9. The method according to claim 1, characterized in that obtaining, according to the type of the application, the operation instruction corresponding to the gesture and the reference object includes:
obtaining the relative position relationship between the gesture and the reference object;
when the application is an application other than the main desktop application, obtaining, according to a second preset correspondence, the control operation corresponding to the gesture, the reference object, and the relative position relationship, the second preset correspondence including correspondences between gestures, reference objects, relative position relationships, and control operations of the application.
10. The method according to claim 1 or 6, characterized in that the gesture feature point includes at least one of a finger joint, a finger segment, and a palm.
11. An application control device, characterized in that the device includes:
a capture module, configured to capture a gesture and a reference object within a first preset range;
a detection module, configured to detect an application running in the foreground;
an instruction acquisition module, configured to obtain, according to the type of the application, an operation instruction corresponding to the gesture and the reference object;
an instruction execution module, configured to execute the operation instruction during the running of the application;
wherein the capture module includes:
a static capture unit, configured to capture a static gesture within the first preset range;
a first reference-object determination unit, configured to capture a human-body feature within a second preset range around the static gesture and determine the human-body feature as the reference object; or,
a dynamic capture unit, configured to capture a dynamic gesture within the first preset range;
a second reference-object determination unit, configured to capture a human-body feature within a second preset range around the dynamic gesture and determine the human-body feature as the reference object;
wherein the dynamic capture unit is configured to capture at least one gesture feature point within the first preset range; obtain position information of the at least one gesture feature point in each sampling period of a motion period, the motion period including multiple sampling periods; generate multiple static gestures according to the position information in each sampling period, and obtain a movement trajectory of the at least one gesture feature point; and generate the dynamic gesture according to at least one of the multiple static gestures and the movement trajectory of the at least one gesture feature point.
12. The device according to claim 11, characterized in that the instruction acquisition module includes:
an application identifier acquisition unit, configured to, when the application is the main desktop application, obtain, according to a first preset correspondence, the application identifier corresponding to the gesture and the reference object, the first preset correspondence including correspondences between gestures, reference objects, and application identifiers.
13. The device according to claim 12, characterized in that the instruction execution module is configured to start, during the running of the main desktop application, the application indicated by the application identifier.
14. The device according to claim 11, characterized in that the instruction acquisition module includes:
a control operation acquisition unit, configured to, when the application is an application other than the main desktop application, obtain, according to a second preset correspondence, the control operation corresponding to the gesture and the reference object, the second preset correspondence including correspondences between gestures, reference objects, and control operations of the application.
15. The device according to claim 14, characterized in that the instruction execution module is configured to perform the control operation on the application during the running of the application.
16. The device according to claim 11, characterized in that the static capture unit is configured to capture at least one gesture feature point within the first preset range; obtain position information of the at least one gesture feature point; and generate the static gesture according to the position information of the at least one gesture feature point.
17. The device according to claim 11, characterized in that the first reference-object determination unit is configured to capture a facial feature within the second preset range around the static gesture and use the facial feature as the reference object;
correspondingly, the second reference-object determination unit is configured to capture a facial feature within the second preset range around the dynamic gesture and use the facial feature as the reference object.
18. The device according to claim 11, characterized in that the instruction acquisition module further includes:
a first relative-position acquisition unit, configured to obtain the relative position relationship between the gesture and the reference object;
an identifier acquisition unit, configured to, when the application is the main desktop application, obtain, according to a first preset correspondence, the application identifier corresponding to the gesture, the reference object, and the relative position relationship, the first preset correspondence including correspondences between gestures, reference objects, relative position relationships, and application identifiers.
19. The device according to claim 11, characterized in that the instruction acquisition module further includes:
a second relative-position acquisition unit, configured to obtain the relative position relationship between the gesture and the reference object;
an operation acquisition unit, configured to, when the application is an application other than the main desktop application, obtain, according to a second preset correspondence, the control operation corresponding to the gesture, the reference object, and the relative position relationship, the second preset correspondence including correspondences between gestures, reference objects, relative position relationships, and control operations of the application.
20. The device according to claim 11 or 16, characterized in that the gesture feature point includes at least one of a finger joint, a finger segment, and a palm.
CN201410160815.6A 2014-04-21 2014-04-21 Application control method and apparatus Active CN103955274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410160815.6A CN103955274B (en) 2014-04-21 2014-04-21 Application control method and apparatus


Publications (2)

Publication Number Publication Date
CN103955274A CN103955274A (en) 2014-07-30
CN103955274B true CN103955274B (en) 2017-09-01

Family

ID=51332559

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410160815.6A Active CN103955274B (en) 2014-04-21 2014-04-21 Application control method and apparatus

Country Status (1)

Country Link
CN (1) CN103955274B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104598265A (en) * 2014-12-12 2015-05-06 宇龙计算机通信科技(深圳)有限公司 Method and system for starting applications based on body resistance values
CN105988560B (en) * 2015-02-03 2020-09-25 南京中兴软件有限责任公司 Application starting method and device
CN107450717B (en) * 2016-05-31 2021-05-18 联想(北京)有限公司 Information processing method and wearable device
CN106453836A (en) * 2016-09-09 2017-02-22 珠海格力电器股份有限公司 Application closing method and device
CN107991893A (en) * 2017-11-14 2018-05-04 美的集团股份有限公司 Realize method, gesture identification module, main control module and the home appliance of communication
CN109684006B (en) * 2018-12-11 2023-01-24 维沃移动通信(深圳)有限公司 Terminal control method and device
CN110134232A (en) * 2019-04-22 2019-08-16 东风汽车集团有限公司 A kind of mobile phone support adjusting method and system based on gesture identification

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482772A (en) * 2008-01-07 2009-07-15 纬创资通股份有限公司 Electronic device and its operation method
CN102221885A (en) * 2011-06-15 2011-10-19 青岛海信电器股份有限公司 Television, and control method and device thereof
CN102253709A (en) * 2010-05-19 2011-11-23 禾瑞亚科技股份有限公司 Method and device for determining gestures
CN103226389A (en) * 2013-04-27 2013-07-31 苏州佳世达电通有限公司 Method for executing application program according to gesture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5413673B2 (en) * 2010-03-08 2014-02-12 ソニー株式会社 Information processing apparatus and method, and program


Also Published As

Publication number Publication date
CN103955274A (en) 2014-07-30


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant