CN103513752B - Gesture operation method, gesture operation device and gesture operation system - Google Patents
- Publication number
- CN103513752B (application CN201210201760.XA)
- Authority
- CN
- China
- Prior art keywords
- gesture
- prompting
- application
- user
- path
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention discloses a gesture operation method, device and system, and relates to the field of communication network technology. The method, device and system allow gesture input to be performed with simple operations, improve operation efficiency, and provide visual prompts and feedback to the user, thereby improving the user experience. The gesture operation method includes: a first step of receiving and analyzing video data of a user that carries three-dimensional depth information, to obtain three-dimensional data of the user; a second step of generating, when an interrupt operation occurs, a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation; and a third step of detecting the trajectory of the gesture input by the user and, when the gesture trajectory matches the prompt gesture, executing the operation corresponding to the gesture. The method, device and system are suitable for gesture input.
Description
Technical field
The present invention relates to the technical field of communication networks, and in particular to a gesture operation method, device and system.
Background art
When control commands are input through human body gestures, problems of operation efficiency and of action standardization and consistency usually arise.
The prior art reduces the number of gestures needed for operation by limiting the service types offered by the device, thereby improving the efficiency of gesture input. For example, a small set of operating gestures, typically no more than ten, is fixed for the user; too many gestures would confuse the user, so the designed gestures and the UI (User Interface) must be kept few and the UI very simple. However, when switching between applications occurs, for example when the volume needs to be adjusted or the video fast-forwarded or rewound while browsing video, the user usually has to acquire the right of control first, then enter the volume adjustment interface or progress adjustment interface through multiple levels of control menus, and finally return using the few supported gestures after the control purpose has been achieved.
To solve the problems of action standardization and consistency, the prior art can design one-handed gestures for the user that do not cause consistency problems, for example drawing a circle with one hand. Alternatively, the cursor may be controlled with one hand: while the user controls the cursor with one hand, the terminal indicates the position in the UI currently represented by the user's hand, and then detects the gesture trajectory input by the user starting from that position.
However, when gesture input is performed with the prior art, the operation is complicated, the operation efficiency is low, and no visual prompts or feedback can be provided to the user.
Summary of the invention
Embodiments of the present invention provide a gesture operation method, device and system that allow gesture input to be performed with simple operations, improve operation efficiency, and provide visual prompts and feedback to the user, thereby improving the user experience.
Embodiments of the present invention adopt the following technical solutions:
A gesture operation method, including:
receiving and analyzing video data of a user that carries three-dimensional depth information, to obtain three-dimensional data of the user;
when an interrupt operation occurs, generating a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation;
detecting the gesture trajectory input by the user and, when the gesture trajectory matches the prompt gesture, executing the operation corresponding to the prompt gesture.
A gesture operation device, including:
an analysis unit, configured to receive and analyze video data of a user that carries three-dimensional depth information, to obtain three-dimensional data of the user;
a generating unit, configured to generate, when an interrupt operation occurs, a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation;
an execution unit, configured to detect the gesture trajectory input by the user and, when the gesture trajectory matches the prompt gesture, execute the operation corresponding to the prompt gesture.
A gesture operation system, including: a depth capture device, a central processing device, a gesture recognition device and a display device;
the depth capture device is configured to obtain video data of a user that carries three-dimensional depth information, and to send the video data to the central processing device;
the central processing device is configured to receive and analyze the video data to obtain three-dimensional data of the user; to generate, when an interrupt operation occurs, a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation; and to execute, when the gesture trajectory matches the prompt gesture, the operation corresponding to the prompt gesture;
the gesture recognition device is configured to detect the gesture trajectory input by the user;
the display device is configured to display the current operating interface of the current application, the prompt gesture, and the gesture trajectory input by the user.
Embodiments of the present invention provide a gesture operation method, device and system in which video data of a user carrying three-dimensional depth information is received and analyzed to obtain three-dimensional data of the user; when an interrupt operation occurs, a prompt gesture is generated according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation; the gesture trajectory input by the user is detected and, when it matches the prompt gesture, the operation corresponding to the prompt gesture is executed. Compared with the prior art, in which gesture input involves complicated operations, low operation efficiency and no visual prompts or feedback for the user, the solution provided by the embodiments of the present invention uses prompt gestures so that gesture input can be performed with simple operations, which improves operation efficiency and provides visual prompts and feedback to the user, improving the user experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required in the description of the embodiments or the prior art are briefly introduced below. Apparently, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a gesture operation method provided by Embodiment 1 of the present invention;
Fig. 2 is a block diagram of a gesture operation device provided by Embodiment 1 of the present invention;
Fig. 3 is a schematic diagram of a gesture operation system provided by Embodiment 1 of the present invention;
Fig. 4 is a flowchart of a gesture operation method provided by Embodiment 2 of the present invention;
Fig. 5 is a schematic flowchart, provided by Embodiment 2 of the present invention, of a second application interrupting a first application;
Fig. 6A-Fig. 6B are schematic diagrams, provided by Embodiment 2 of the present invention, of the relations among a first application, gesture packages and gestures;
Fig. 7 is a schematic diagram of prompt gestures provided by Embodiment 2 of the present invention;
Fig. 8 is a block diagram of a gesture operation device provided by Embodiment 2 of the present invention.
Detailed description of embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiment 1
An embodiment of the present invention provides a gesture operation method whose execution subject may be a central processing device, specifically a central processing system. As shown in Fig. 1, the method includes:
Step 101: receive and analyze video data of a user that carries three-dimensional depth information, to obtain three-dimensional data of the user.
The video data with three-dimensional depth information is obtained by a depth capture device and then sent to the central processing device. After receiving the video data, the central processing device analyzes it, filters the noise in the depth information, removes the background and scans for the human input target. According to the distance data of the human input target, the three-dimensional data of the human target is obtained through filters. Specifically, the three-dimensional data of the human target includes the face, the trunk, the posture (standing or sitting), and so on.
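The noise-filtering and background-removal step described above can be illustrated with a minimal sketch. The function name, units and thresholds below are assumptions chosen for the example; the patent does not specify any of them.

```python
# Hypothetical sketch of step 101 pre-processing: drop depth readings that
# are sensor noise, remove background pixels beyond a cut-off distance,
# and keep a mask of pixels likely belonging to the human input target.

def segment_human(depth_frame, background_mm=2500, noise_floor_mm=300):
    """depth_frame: 2D list of per-pixel depths in millimetres.

    Pixels closer than noise_floor_mm are treated as sensor noise and
    pixels farther than background_mm as background; both are masked out.
    """
    mask = []
    for row in depth_frame:
        mask.append([noise_floor_mm <= d <= background_mm for d in row])
    return mask

frame = [
    [100, 1200, 1300],   # 100 mm -> noise reading, filtered out
    [2600, 1250, 1280],  # 2600 mm -> background, removed
]
print(segment_human(frame))  # [[False, True, True], [False, True, True]]
```

A real implementation would go on to fit a skeletal model to the surviving pixels to extract the face, trunk and posture; that part is beyond this sketch.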
Step 102: when an interrupt operation occurs, generate a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation.
When the user or the system produces an interrupt operation, a corresponding prompt gesture package is generated according to the three-dimensional data and the gesture package matched to the current operating interface of the current first application running on the system; the prompt gesture package includes at least one prompt gesture. An interrupt operation is produced when, while the current application is running, the user triggers another application or operation, or the system receives a request operation from another application.
The prompt gestures in the prompt gesture package differ from the gestures in the gesture package matched to the current first application.
Further, the prompt gesture package is updated according to the different gesture packages matched to the current first application. The prompt gesture package includes a reject-second-application prompt gesture and an accept-second-application prompt gesture, where the reject-second-application prompt gesture guides the user to reject the second application and the accept-second-application prompt gesture guides the user to accept the second application. After the second application is accepted, the generated prompt gesture package includes a return-to-first-application prompt gesture and a cancel-reminder prompt gesture, where the return-to-first-application prompt gesture guides the user back to the first application and the cancel-reminder prompt gesture guides the user to cancel the reminder and continue executing the second application.
Each application matches different gesture packages in different operating interfaces or operating phases; a gesture package consists of at least one gesture.
Step 103: detect the gesture trajectory input by the user and, when the gesture trajectory matches the prompt gesture, execute the operation corresponding to the prompt gesture.
Specifically, when the gesture trajectory input by the user matches the reject-second-application prompt gesture in the prompt gesture package, the system refuses to respond to the second application;
when the gesture trajectory input by the user matches the accept-second-application prompt gesture in the prompt gesture package, the second application is run;
while the second application is running, when the gesture trajectory input by the user matches the return-to-first-application prompt gesture in the prompt gesture package, the first application is run;
while the second application is running, when the gesture trajectory input by the user matches the cancel-reminder prompt gesture in the prompt gesture package, the second application continues to run.
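The four trajectory-to-operation matches listed in step 103 amount to a small dispatch table. The sketch below illustrates this under invented labels; none of the names come from the patent.

```python
# Illustrative dispatch for step 103: the recognised input trajectory is
# matched against the current prompt-gesture package and the corresponding
# operation is executed. Labels and return strings are assumptions.

PROMPT_ACTIONS = {
    "reject_second_app": "refuse to respond to the second application",
    "accept_second_app": "run the second application",
    "return_first_app":  "run the first application",
    "cancel_reminder":   "continue running the second application",
}

def execute_matching(trajectory_label):
    """trajectory_label names the prompt gesture the trajectory matched."""
    action = PROMPT_ACTIONS.get(trajectory_label)
    if action is None:
        return "no match: ignore input"
    return action

print(execute_matching("accept_second_app"))  # run the second application
```

Because prompt gestures are generated so as not to collide with the active application's gesture package, this lookup never has to disambiguate between a prompt gesture and an in-application gesture.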
An embodiment of the present invention provides a gesture operation method in which a corresponding prompt gesture is generated according to the three-dimensional data of a human target and the user is guided by the prompt gesture through the operation to be performed, so that gesture input can be performed with simple operations, operation efficiency is improved, and visual prompts and feedback can be provided to the user, improving the user experience.
An embodiment of the present invention provides a gesture operation device; the device may be a central processing device. As shown in Fig. 2, the device includes: an analysis unit 201, a generating unit 202 and an execution unit 203.
The analysis unit 201 is configured to receive and analyze video data of a user that carries three-dimensional depth information, to obtain three-dimensional data of the user.
The generating unit 202 is configured to generate, when an interrupt operation occurs, a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation.
The generating unit 202 is specifically configured to generate, when an interrupt operation is produced by the user or the system, a corresponding prompt gesture package according to the three-dimensional data and the gesture package matched to the current operating interface of the current first application running on the system; the prompt gesture package includes at least one prompt gesture.
The device further includes a gesture storage unit, configured to store the prompt gesture package, where the prompt gestures in the prompt gesture package differ from the gestures in the gesture package matched to the current first application; and an update unit, configured to update the prompt gesture package according to the different gesture packages matched to the current first application.
The prompt gesture package includes a reject-second-application prompt gesture and an accept-second-application prompt gesture, where the reject-second-application prompt gesture guides the user to reject the second application and the accept-second-application prompt gesture guides the user to accept the second application. After the second application is accepted, the generated prompt gesture package includes a return-to-first-application prompt gesture and a cancel-reminder prompt gesture, where the return-to-first-application prompt gesture guides the user back to the first application and the cancel-reminder prompt gesture guides the user to cancel the reminder and continue executing the second application. It should be noted that the prompt gesture package at this point differs from the gestures in the gesture package matched to the current operating interface of the second application.
The gesture storage unit is further configured to store the gesture packages, where each application matches different gesture packages in different operating interfaces or operating phases, and a gesture package consists of at least one gesture.
The execution unit 203 is configured to detect the gesture trajectory input by the user and, when the gesture trajectory matches the prompt gesture, execute the operation corresponding to the prompt gesture.
A rejecting module in the execution unit 203 is configured to refuse to respond to the second application when the gesture trajectory input by the user matches the reject-second-application prompt gesture in the prompt gesture package;
a responding module in the execution unit 203 is configured to run the second application when the gesture trajectory input by the user matches the accept-second-application prompt gesture in the prompt gesture package;
a returning module in the execution unit 203 is configured to run the first application when, while the second application is running, the gesture trajectory input by the user matches the return-to-first-application prompt gesture in the prompt gesture package;
a cancelling module in the execution unit 203 is configured to continue running the second application when, while the second application is running, the gesture trajectory input by the user matches the cancel-reminder prompt gesture in the prompt gesture package.
An embodiment of the present invention provides a gesture operation device in which the generating unit generates a prompt gesture according to the three-dimensional data of a human target, and the execution unit executes the operation corresponding to the prompt gesture when the gesture trajectory input by the user matches it. This allows gesture input to be performed with simple operations, improves operation efficiency, and provides visual prompts and feedback to the user, improving the user experience.
An embodiment of the present invention provides a gesture operation system. As shown in Fig. 3, the system includes: a depth capture device 301, a central processing device 302, a gesture recognition device 303 and a display device 304.
The depth capture device 301 is configured to obtain video data of a user that carries three-dimensional depth information and to send the video data to the central processing device 302.
The central processing device 302 is configured to receive and analyze the video data to obtain three-dimensional data of the user; to generate, when an interrupt operation occurs, a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation; and to execute, when the gesture trajectory matches the prompt gesture, the operation corresponding to the prompt gesture.
The gesture recognition device 303 is configured to detect the gesture trajectory input by the user.
The display device 304 is configured to display the current operating interface of the current application, the prompt gesture, and the gesture trajectory input by the user.
The central processing device includes the gesture operation device described with reference to Fig. 2.
An embodiment of the present invention provides a gesture operation system in which the central processing device processes the depth-information video data obtained by the depth capture device and generates a prompt gesture, so that the user can provide input directed by the prompt gesture and the corresponding operation is executed. Gesture input can thus be performed with simple operations, operation efficiency is improved, and visual prompts and feedback can be provided to the user, improving the user experience.
Embodiment 2
An embodiment of the present invention provides a gesture operation method. As shown in Fig. 4, the method includes:
Step 401: the central processing device receives the video data of a user, carrying three-dimensional depth information, sent by a depth capture device.
Optionally, the depth capture device is a device capable of acquiring depth information or three-dimensional information. It may capture the user's video data with depth information using structured light, stereo imaging, TOF (Time Of Flight) or similar technologies; specifically, depth video data along the Z-axis of the depth camera can be obtained. The depth capture device may integrate an ordinary 2D (two-dimensional) RGB (red green blue) camera, and an integrated microphone array may serve as voice input for the user.
Step 402: the central processing device analyzes the video data to obtain the three-dimensional data of the user.
After receiving the video data, the central processing device analyzes it, filters the noise in the depth information, removes the background and scans for the human input target. According to the distance data of the human input target, the three-dimensional data of the user is obtained through filters; this three-dimensional data of the user is referred to as raw depth data. Specifically, the three-dimensional data of the user includes the face, the trunk, the posture (standing or sitting), and so on.
Step 403: when an interrupt operation occurs, the central processing device generates a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user through the interrupt operation.
An interrupt operation is produced when, while the current application is running, the user triggers another application or operation, or the system receives a request operation from another application. For example, while the current first application is running, say the user is watching a video, and the user triggers the volume adjustment application, the central processing device generates a prompt gesture according to the trigger condition upon receiving the trigger. Alternatively, while the user is watching a video, a second application interrupts the current first application; when the system judges that the priority of the second application is higher than that of the first application, the system prompts the user that the second application has appeared and generates a prompt gesture so that the user can operate further according to it.
The case of a second application interrupting a first application is described further below. As shown in Fig. 5, it includes the following sub-steps:
Step 501: run the current first application.
Step 502: when the second application needs to execute, judge whether the priority of the second application is greater than the priority of the first application.
Step 503: when the priority of the second application is greater than the priority of the first application, generate a corresponding first prompt gesture package according to the gesture package matched to the current operating interface of the first application; the first prompt gesture package includes at least one prompt gesture.
It should be noted that when the priority of the second application is not greater than the priority of the first application, the request of the second application is not prompted and step 501 continues.
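The priority test of steps 501-503 reduces to a single comparison. The sketch below assumes numeric priorities, which the patent does not specify; the function name is likewise an assumption.

```python
# Hypothetical sketch of steps 501-503: only a second application with a
# strictly higher priority interrupts the first one and triggers the prompt.

def on_second_app_request(first_priority, second_priority):
    """Return True when the second application may interrupt the first."""
    if second_priority > first_priority:
        return True   # step 503: generate the first prompt-gesture package
    return False      # step 501 continues: keep running the first app

print(on_second_app_request(3, 5))  # True  -> prompt the user
print(on_second_app_request(5, 5))  # False -> request is not prompted
```

Note the strict inequality: an equal priority does not interrupt, matching the "not greater than" branch of the flowchart.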
The first application can match multiple gesture packages in different operating interfaces or operating phases. As shown in Fig. 6A, the first application can match gesture package 1, gesture package 2, gesture package 3, and so on. For example, in a video telephony application, the contact-selection stage during dialing may need the support of gesture package 1, while adjusting the volume during a call needs gesture package 2, which the video telephony application adapts to automatically; that is, only gestures conforming to gesture package 2 will trigger a response to the corresponding user operation at that stage.
As shown in Fig. 6B, each gesture package is composed of multiple gestures, and a gesture package can support multiple gestures or postures as input. Each gesture represents a hand gesture or action, for example: gesture 1 represents waving from right to left, gesture 2 represents waving from left to right, and gesture 3 represents waving from top to bottom or from bottom to top.
It should be noted that the gestures contained in different gesture packages may have inclusion relations or repetitions. An inclusion relation means that in the different gesture packages of different applications, the same gesture represents different operation meanings. The gesture packages corresponding to different applications have association relations, and in a gesture package the corresponding relations of different gestures can be adjusted through configuration.
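The application/package/gesture relations of Fig. 6A and Fig. 6B suggest a simple nested mapping. The sketch below is illustrative only: the application name, stage labels and gesture names are all invented for the example.

```python
# Illustrative data model for Fig. 6A/6B: an application maps each operating
# interface or stage to one gesture package; a package is a set of gestures.

APP_GESTURE_PACKAGES = {
    "video_call": {
        "select_contact": {"wave_right_to_left", "wave_left_to_right"},  # package 1
        "in_call":        {"wave_up_to_down", "wave_down_to_up"},        # package 2
    },
}

def active_package(app, stage):
    """Only gestures in the package matched to the current stage respond."""
    return APP_GESTURE_PACKAGES[app][stage]

print("wave_up_to_down" in active_package("video_call", "in_call"))  # True
```

Because packages are keyed by stage, the same gesture name can appear in several packages with different meanings, which is exactly the inclusion relation the text describes.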
The prompt gestures in the first prompt gesture package differ from the gestures in the gesture package matched to the current first application, and the first prompt gesture package is updated according to the different gesture packages matched to the current first application. For example, if the current operating interface of the first application matches gesture package 1, the prompt gestures in the generated first prompt gesture package differ from the gestures in gesture package 1; when the first application matches gesture package 2, the first prompt gesture package is adjusted accordingly according to the gestures in gesture package 2, that is, when a gesture in gesture package 2 is identical to a prompt gesture in the first prompt gesture package, that prompt gesture is replaced by a prompt gesture that differs from every gesture in gesture package 2.
Optionally, when the second application interrupts the first application, the user can choose to respond to or reject the second application, so the generated first prompt gesture package contains two prompt gestures: a reject-second-application prompt gesture and an accept-second-application prompt gesture.
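The collision-avoiding update rule described above (replace any prompt gesture that coincides with a gesture in the newly matched package) might be sketched as follows. The helper, its arguments and the pool of spare gestures are assumptions, not part of the patent.

```python
# Hypothetical sketch of the prompt-package update: when the application
# switches to a new gesture package, any prompt gesture colliding with it
# is swapped for a spare gesture that collides with nothing in the package.
# Raises StopIteration if the spare pool runs out (not handled here).

def update_prompt_package(prompt_pkg, new_app_pkg, spare_gestures):
    spares = iter(g for g in spare_gestures if g not in new_app_pkg)
    updated = []
    for g in prompt_pkg:
        updated.append(next(spares) if g in new_app_pkg else g)
    return updated

# Prompt gesture "B" collides with the newly matched package {"B", "C"},
# so it is replaced by "D", the first spare outside that package.
print(update_prompt_package(["A", "B"], {"B", "C"}, ["C", "D", "E"]))  # ['A', 'D']
```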
Step 404: when a gesture trajectory input by the user is detected, judge whether the starting position of the gesture trajectory corresponds to the starting position of the prompt gesture.
An input gesture has an optimal input starting position and an optimal gesture input trajectory. When a gesture trajectory input by the user is detected, whether for the reject-second-application prompt gesture or the accept-second-application prompt gesture, it must first be judged whether the starting position of the gesture trajectory corresponds to the starting position of the prompt gesture.
As shown in Fig. 7, the two prompt gestures in the prompt gesture package are a reject-second-application prompt gesture and an accept-second-application prompt gesture, where the reject-second-application prompt gesture guides the user to reject the second application and the accept-second-application prompt gesture guides the user to accept the second application. For example, waving the hand horizontally to the right represents the reject-second-application prompt gesture, and waving the hand horizontally to the left represents the accept-second-application prompt gesture. Fig. 7 is a schematic diagram; the icon may be any prompt icon, such as an indicator lamp in the shape of a hand, and the waving direction of the gesture may also be indicated to the user by a prompt icon, an arrow or other means, so that the matching relation is prompted to the user through bright/dark changes of the prompt icon.
Taking the gesture of waving from right to left in Fig. 7 as an example (referred to below as gesture X1): because the input human targets may differ greatly in parameters such as build and height, or in posture (standing or sitting), different starting position coordinates are generated in real time. The starting position of gesture X1 should be: for the right hand, to the right of the body trunk, at a position to the lower right of the face, with the right hand parallel to the trunk or at a very small angle to it; this is called the standard starting position of the right hand. For the same gesture performed with the left hand, the hand also needs to be to the right of the body trunk, but may be closer to the trunk and the face; this is called the standard starting position for the left hand to complete gesture X1. The circle center in Fig. 7 is the set optimal input starting position, and the circle around the center represents the allowable action threshold; that is, any position within the circle can serve as the starting position.
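The circular threshold around the optimal starting position in Fig. 7 reduces to a distance test. A minimal sketch, assuming 2D coordinates in arbitrary units (the patent does not fix a coordinate system or radius):

```python
import math

# Fig. 7 threshold check: any starting point inside the circle around the
# optimal starting position counts as a valid gesture starting position.

def start_position_ok(start_xy, optimal_xy, radius):
    dx = start_xy[0] - optimal_xy[0]
    dy = start_xy[1] - optimal_xy[1]
    return math.hypot(dx, dy) <= radius

print(start_position_ok((1.1, 0.9), (1.0, 1.0), 0.2))  # True  (inside circle)
print(start_position_ok((2.0, 1.0), (1.0, 1.0), 0.2))  # False (outside circle)
```

In practice the optimal position would be recomputed per user from the three-dimensional data (trunk and face positions), since builds and postures differ, as the paragraph above notes.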
Step 405: when the starting position of the gesture trajectory does not correspond to the starting position of the prompt gesture, prompt the user through the bright/dark change of the prompt icon and continue judging whether the starting position of the gesture trajectory corresponds to the starting position of the prompt gesture.
When the starting position of the gesture trajectory does not correspond to the starting position of the prompt gesture, the gesture trajectory input by the user is not within the preset threshold range of the prompt gesture, and the central processing device detects no valid starting position. In this case, the user is prompted through the bright/dark change of the prompt icon, and the device continues to detect and judge whether the starting position of the gesture trajectory corresponds to the starting position of the prompt gesture. For example, the prompt icon is displayed brighter at the position of the input gesture trajectory and darker elsewhere, or the prompt icon is shown in red, to indicate that the starting position of the gesture trajectory input by the user does not correspond to the starting position of the prompt gesture. It should be noted that the bright/dark changes of the prompt icon need to be configured before this method is implemented.
Step 406: when the starting position of the gesture track corresponds to the starting position of the prompt gesture, judge whether the gesture track moves along the preset trajectory of the prompt gesture;
When the starting position of the gesture track corresponds to the starting position of the prompt gesture, that is, the starting position of the gesture track is within the preset threshold range of the prompt gesture's starting position, the central processing device outputs a prompt adjusted according to how far the gesture's starting position deviates. The user may then adjust the starting position further according to the prompt, so that the input gesture track starts at the optimal starting position, or may leave it unadjusted.
Further, it is also necessary to judge whether the user's input gesture track has a corresponding action; if no corresponding action is detected, the indicator icon remains still. The indicator icon is an icon that gives prompts according to whether the user's gesture track matches the prompt gesture. For example, the user is prompted through brightness changes of the icon: when the user inputs according to the prompt gesture, the prompt icon is fully lit; when the input deviates from the track, the deviating part is displayed dimmed. The indicator icon can be controlled by a GPU (Graphics Processing Unit).
Step 407: when the gesture track does not move along the preset trajectory of the prompt gesture, prompt the user through the brightness change of the prompt icon that the gesture is not being input along the preset trajectory;
When the gesture track moves along the preset trajectory of the prompt gesture, the corresponding prompt icon gradually brightens as the gesture track progresses; when the gesture track deviates from the preset trajectory of the prompt gesture, the prompt icon returns to the starting position, and the user needs to input again.
Step 408: when the gesture track moves along the preset trajectory of the prompt gesture, judge whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture;
As long as the gesture track stays within the preset threshold range of the prompt gesture's preset trajectory, the gesture track can be considered to move along that trajectory. The GPU can then update the icon position of the prompt gesture in real time; specifically, the GPU updates the icon position according to the relative gesture position provided by the gesture recognition engine.
When judging whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture, the judgment is made according to the three-dimensional data of the input human body target. For example, for a horizontal right-to-left wave performed with the right hand, the preset position can be judged from the relative position of the right hand and the torso, of the right hand and the face, and of the right hand and the arm: for instance, whether the right hand has passed the midline of the torso during the horizontal wave, and whether its position relative to the face and head has reached the corresponding threshold. In this way it is judged whether a right-handed horizontal right-to-left wave has reached the preset position.
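The torso-midline test just described can be sketched as follows. The threshold value and the coordinate convention (x in metres, increasing toward the subject's right) are illustrative assumptions, not values taken from the disclosure.

```python
def reached_preset_position(right_hand_x, torso_center_x, margin=0.05):
    """For a horizontal right-to-left wave, treat the gesture as having
    reached the preset position once the right hand has crossed the
    torso midline by at least `margin` metres."""
    return right_hand_x < torso_center_x - margin
```

In a full implementation the same kind of comparison would also be applied to the hand's position relative to the face and arm, as the paragraph above describes.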
Step 409: when the gesture track does not reach the preset position along the preset trajectory of the prompt gesture, prompt the user through the brightness change of the prompt icon, and continue to judge whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture;
For example, the part of the user's gesture track that has moved along the preset trajectory of the prompt gesture is shown highlighted, while the part of the preset trajectory where the user has not yet input a gesture track is shown dimmed, to feed back to the user that the preset position of the prompt gesture's trajectory has not yet been reached. Alternatively, the part of the preset trajectory where the user has input a gesture track is shown in green, and the part without input is shown in red. It should be noted that the brightness changes of the prompt icon need to be configured before this method is carried out.
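The bright/dim feedback along the preset trajectory might be modelled as below. The segment count and the two brightness levels are illustrative assumptions; a real display would map these values to pixel intensities or to the green/red scheme mentioned above.

```python
def segment_brightness(n_segments, completed):
    """Return a brightness level per trajectory segment: fully lit (1.0)
    for segments the user has already traced, dim (0.2) for the rest."""
    return [1.0 if i < completed else 0.2 for i in range(n_segments)]

segment_brightness(5, 2)  # [1.0, 1.0, 0.2, 0.2, 0.2]
```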
Step 410: when the gesture track input by the user matches the prompt gesture, execute the operation corresponding to the prompt gesture;
The gesture track input by the user matching the prompt gesture means that the gesture track has reached the preset position along the preset trajectory of the prompt gesture. The operation corresponding to the prompt gesture is then executed. Specifically, when the user's gesture track matches the reject-second-application prompt gesture in the prompt gesture package, responding to the second application is refused; when the user's gesture track matches the accept-second-application prompt gesture in the prompt gesture package, the second application is run.
As shown in Fig. 5, step 504: when the user inputs a gesture track according to the reject-second-application prompt gesture in the prompt gesture package, judge whether the gesture track matches that prompt gesture. If the gesture track does not match the prompt gesture, continue detecting whether it matches, that is, execute step 504; if it does match, continue running the first application, that is, execute step 501.
Step 505: when the user inputs a gesture track according to the accept-second-application prompt gesture in the prompt gesture package, judge whether the gesture track matches that prompt gesture. If the gesture track does not match the prompt gesture, continue detecting whether it matches, that is, execute step 505.
Step 506: when the gesture track matches the prompt gesture, run the second application.
Step 507: according to the gesture package matched with the current operation interface of the current second application, generate a corresponding second prompt gesture package. The second prompt gesture package is generated in the same way as the first prompt gesture package in step 503, which is not described again here. The difference is that the prompt gestures contained in the second prompt gesture package are a return-to-first-application prompt gesture and a cancel-reminder prompt gesture.
It should be noted that "first" and "second" in the first prompt gesture package and the second prompt gesture package are merely for convenience of description and do not indicate an ordering. There is only one prompt gesture package, which is updated in real time according to the application active at each stage.
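The single, real-time-updated prompt gesture package described in this note might be modelled as below. The package contents, gesture names and interface keys are hypothetical; only the update-in-place behaviour reflects the disclosure.

```python
# Hypothetical gesture packages keyed by the active operation interface.
PACKAGES = {
    "app1_main": ["reject_second_app", "accept_second_app"],
    "app2_main": ["return_first_app", "cancel_reminder"],
}

# A single prompt gesture package, refreshed in place as the active
# application changes; there is no separate "first" and "second" package.
prompt_package = []

def update_prompt_package(interface):
    prompt_package[:] = PACKAGES[interface]

update_prompt_package("app1_main")
# prompt_package is now ["reject_second_app", "accept_second_app"]
```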
Step 508: when the user inputs a gesture track according to the return-to-first-application prompt gesture in the second prompt gesture package, judge whether the gesture track matches that prompt gesture. If the gesture track does not match the prompt gesture, continue detecting whether it matches, that is, execute step 508.
Step 509: when the gesture track matches the prompt gesture, return to the first application and run the first application.
Step 510: when the user inputs a gesture track according to the cancel-reminder prompt gesture in the second prompt gesture package, judge whether the gesture track matches that prompt gesture. If the gesture track does not match the prompt gesture, continue detecting whether it matches, that is, execute step 510; if it does match, run the second application and continue executing step 506.
It should be noted that the way of judging whether a gesture track matches a prompt gesture is the same as in steps 404 to 410.
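The matching procedure of steps 404 to 410 can be summarised in one sketch, assuming 2-D track samples and illustrative tolerance values; the function and state names are invented for this sketch and are not part of the disclosure.

```python
import math

def _close(a, b, tol):
    return math.hypot(a[0] - b[0], a[1] - b[1]) <= tol

def match_gesture(track, prompt, start_radius=0.1, path_tol=0.15):
    """Minimal sketch of the matching loop of steps 404-410:
    (1) the first sample must start inside the tolerance circle,
    (2) every later sample must stay near the prompt trajectory,
    (3) success requires reaching the trajectory's preset end position."""
    # Steps 404-405: verify the starting position.
    if not track or not _close(track[0], prompt[0], start_radius):
        return "bad_start"      # keep prompting the start position
    # Steps 406-407: verify motion along the preset trajectory.
    for point in track[1:]:
        if not any(_close(point, p, path_tol) for p in prompt):
            return "off_track"  # icon returns to the starting position
    # Steps 408-410: matched only once the preset position is reached.
    if _close(track[-1], prompt[-1], path_tol):
        return "matched"        # execute the corresponding operation
    return "incomplete"         # keep prompting along the trajectory
```

A track that starts in the circle, follows the prompt, and ends at its last point returns "matched"; every other state corresponds to one of the brightness-change prompts described above.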
An embodiment of the present invention provides a gesture operation method that generates a corresponding prompt gesture from three-dimensional data and instructs the user to perform the required operation according to the prompt gesture. The invention thus allows gesture input with simple operations, improves operating efficiency, provides the user with visual prompts and feedback, and improves the user experience, while also solving the problem of standardizing and keeping consistent the actions used in gesture operation.
An embodiment of the present invention provides a gesture operation device; this device may be a central processing device. As shown in Fig. 8, the device includes: an analysis unit 801, a generation unit 802, an execution unit 803 (with a rejection module 8031, a response module 8032, a return module 8033 and a cancellation module 8034), a gesture storage unit 804, an update unit 805, and a judgment unit 806 (with a first judgment module 8061, a second judgment module 8062, a reminder module 8063 and a third judgment module 8064).
The analysis unit 801 is configured to receive and analyze video data of the user carrying three-dimensional depth information, and to obtain the user's three-dimensional data.
The generation unit 802 is configured to generate a prompt gesture according to the three-dimensional data when an interrupt operation occurs; the prompt gesture is used to instruct the user to handle the interrupt operation.
Further, the generation unit 802 is specifically configured to: when an interrupt operation is produced by the user or the system, generate a corresponding prompt gesture package according to the three-dimensional data and the gesture package matched with the current operation interface of the first application currently running on the system, the prompt gesture package including at least one prompt gesture.
The execution unit 803 is configured to detect the gesture track input by the user, and to execute the operation corresponding to the prompt gesture when the gesture track matches the prompt gesture.
Further, the device also includes: a gesture storage unit 804 for storing the prompt gesture package, where the prompt gestures in the prompt gesture package differ from the gestures in the gesture package matched with the current first application; and an update unit 805 for updating the prompt gesture package according to the different gesture package matched with the current first application.
The prompt gesture package includes a reject-second-application prompt gesture and an accept-second-application prompt gesture, where the reject-second-application prompt gesture is used to instruct the user to reject the second application, and the accept-second-application prompt gesture is used to instruct the user to accept the second application.
After the second application is accepted, the generated prompt gesture package includes a return-to-first-application prompt gesture and a cancel-reminder prompt gesture, where the return-to-first-application prompt gesture is used to instruct the user to return to the first application, and the cancel-reminder prompt gesture is used to instruct the user to cancel the reminder and continue executing the second application.
The gesture storage unit 804 is also used to store the gesture packages, where each application matches a different gesture package at each operation interface or operation stage, and a gesture package consists of at least one gesture.
Further, when the gesture track input by the user is detected, the judgment unit 806 judges whether the gesture track matches the prompt gesture, and prompts the user about the matching relationship through changes of the prompt icon or of the prompt gesture.
Specifically, the first judgment module 8061 in the judgment unit 806 is configured to judge whether the starting position of the gesture track corresponds to the starting position of the prompt gesture. When the starting position of the gesture track does not correspond to that of the prompt gesture, the first judgment module 8061 prompts the user through the brightness change of the prompt icon and continues to judge whether the starting position of the gesture track corresponds to that of the prompt gesture.
When the starting position of the gesture track corresponds to the starting position of the prompt gesture, the second judgment module 8062 in the judgment unit 806 judges whether the gesture track moves along the preset trajectory of the prompt gesture.
When the gesture track does not move along the preset trajectory of the prompt gesture, the reminder module 8063 in the judgment unit 806 prompts the user through the brightness change of the prompt icon that the gesture is not being input along the preset trajectory.
When the gesture track moves along the preset trajectory of the prompt gesture, the third judgment module 8064 in the judgment unit 806 judges whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture.
When the gesture track does not reach the preset position along the preset trajectory of the prompt gesture, the third judgment module 8064 prompts the user through the brightness change of the prompt icon and continues to judge whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture.
When the gesture track reaches the preset position along the preset trajectory of the prompt gesture, the gesture track matches the prompt gesture; the execution unit 803 prompts the user with a fully lit prompt icon and executes the operation corresponding to the prompt gesture.
Specifically, the rejection module 8031 in the execution unit 803 is configured to refuse to respond to the second application when the user's gesture track matches the reject-second-application prompt gesture in the prompt gesture package.
The response module 8032 in the execution unit 803 is configured to run the second application when the user's gesture track matches the accept-second-application prompt gesture in the prompt gesture package.
The return module 8033 in the execution unit 803 is configured to run the first application when, while the second application is running, the user's gesture track matches the return-to-first-application prompt gesture in the prompt gesture package.
The cancellation module 8034 in the execution unit 803 is configured to continue running the second application when, while the second application is running, the user's gesture track matches the cancel-reminder prompt gesture in the prompt gesture package.
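The division of the execution unit into modules 8031-8034 amounts to dispatching on whichever prompt gesture was matched. A minimal sketch follows; the gesture names and the string results are hypothetical placeholders for the four operations.

```python
# Hypothetical dispatch table mirroring modules 8031-8034: the matched
# prompt gesture selects which handler of the execution unit runs.
HANDLERS = {
    "reject_second_app": lambda: "refuse response to second application",
    "accept_second_app": lambda: "run second application",
    "return_first_app":  lambda: "run first application",
    "cancel_reminder":   lambda: "continue running second application",
}

def execute(matched_gesture):
    return HANDLERS[matched_gesture]()
```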
An embodiment of the present invention provides a gesture operation device in which the generation unit generates a prompt gesture according to the three-dimensional data of the human body target, and the execution unit executes the operation corresponding to the prompt gesture when the user's gesture track matches it. The invention thus allows gesture input with simple operations, improves operating efficiency, provides the user with visual prompts and feedback, and improves the user experience.
An embodiment of the present invention provides a gesture operation system; referring to Fig. 3, the system includes: a depth capture device 301, a central processing device 302, a gesture recognition device 303 and a display device 304.
The depth capture device 301 is configured to obtain video data of the user carrying three-dimensional depth information, and to send that video data to the central processing device 302.
The central processing device 302 is configured to receive and analyze the video data of the user carrying three-dimensional depth information and obtain the user's three-dimensional data; to generate a prompt gesture according to the three-dimensional data when an interrupt operation occurs, the prompt gesture being used to instruct the user to handle the interrupt operation; and to execute the operation corresponding to the prompt gesture when the gesture track matches it.
The gesture recognition device 303 is configured to detect the gesture track input by the user. It should be noted that the gesture recognition device 303 performs real-time recognition each time the user inputs a gesture track, so that the central processing device 302 can perform the matching judgment.
The display device 304 is configured to display the current operation interface of the current application, the prompt gesture, and the gesture track input by the user.
The central processing device includes the gesture operation device described with reference to Fig. 8.
An embodiment of the present invention provides a gesture operation system in which the central processing device processes the depth-information video data obtained by the depth capture device and generates a prompt gesture, so that the user can provide input against the prompt gesture and the corresponding operation is executed. The invention thus allows gesture input with simple operations, improves operating efficiency, provides the user with visual prompts and feedback, and improves the user experience.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the claims.
Claims (20)
1. A gesture operation method, characterized by comprising:
receiving and analyzing video data of a user carrying three-dimensional depth information, and obtaining three-dimensional data of the user;
when an interrupt operation occurs, generating a prompt gesture according to the three-dimensional data, the prompt gesture being used to instruct the user to handle the interrupt operation;
detecting a gesture track input by the user, and executing the operation corresponding to the prompt gesture when the gesture track matches the prompt gesture;
wherein generating a prompt gesture according to the three-dimensional data when the interrupt operation occurs comprises:
when the interrupt operation is produced by the user or the system, generating a corresponding prompt gesture package according to the three-dimensional data and the gesture package matched with the current operation interface of a first application currently running on the system, the prompt gesture package including at least one prompt gesture;
the prompt gestures in the prompt gesture package differing from the gestures in the gesture package matched with the current first application.
2. The method according to claim 1, characterized in that, after generating the prompt gesture according to the three-dimensional data when the interrupt operation occurs, the method further comprises:
updating the prompt gesture package according to the different gesture package matched with the current first application.
3. The method according to any one of claims 1-2, characterized in that the prompt gesture package comprises: a reject-second-application prompt gesture and an accept-second-application prompt gesture, wherein the reject-second-application prompt gesture is used to instruct the user to reject the second application, and the accept-second-application prompt gesture is used to instruct the user to accept the second application;
after the second application is accepted, the generated prompt gesture package comprises: a return-to-first-application prompt gesture and a cancel-reminder prompt gesture, wherein the return-to-first-application prompt gesture is used to instruct the user to return to the first application, and the cancel-reminder prompt gesture is used to instruct the user to cancel the reminder and continue executing the second application.
4. The method according to claim 1, characterized in that each application matches a different gesture package at each operation interface or operation stage, and a gesture package consists of at least one gesture.
5. The method according to claim 1, characterized in that, after detecting the gesture track input by the user and before executing the operation corresponding to the prompt gesture when the gesture track matches the prompt gesture, the method further comprises:
judging whether the gesture track matches the prompt gesture, and prompting the user about the matching relationship through changes of a prompt icon or of the prompt gesture.
6. The method according to claim 5, characterized in that judging whether the gesture track matches the prompt gesture comprises:
judging whether the starting position of the gesture track corresponds to the starting position of the prompt gesture;
when the starting position of the gesture track corresponds to the starting position of the prompt gesture, judging whether the gesture track moves along the preset trajectory of the prompt gesture;
when the gesture track moves along the preset trajectory of the prompt gesture, judging whether the gesture track reaches a preset position along the preset trajectory of the prompt gesture;
when the gesture track reaches the preset position along the preset trajectory of the prompt gesture, determining that the gesture track matches the prompt gesture, prompting the user with a fully lit prompt icon, and executing the operation corresponding to the prompt gesture;
when the gesture track does not reach the preset position along the preset trajectory of the prompt gesture, prompting the user through the brightness change of the prompt icon, and continuing to judge whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture.
7. The method according to claim 6, characterized in that, when the starting position of the gesture track does not correspond to the starting position of the prompt gesture, the user is prompted through the brightness change of the prompt icon, and judging whether the starting position of the gesture track corresponds to the starting position of the prompt gesture continues.
8. The method according to claim 6, characterized in that, when the gesture track does not move along the preset trajectory of the prompt gesture, the user is prompted through the brightness change of the prompt icon that the gesture is not being input along the preset trajectory.
9. The method according to claim 3, characterized in that executing the operation corresponding to the prompt gesture when the gesture track input by the user matches the prompt gesture comprises:
when the user's gesture track matches the reject-second-application prompt gesture in the prompt gesture package, refusing to respond to the second application;
when the user's gesture track matches the accept-second-application prompt gesture in the prompt gesture package, running the second application;
when the second application is running and the user's gesture track matches the return-to-first-application prompt gesture in the prompt gesture package, running the first application;
when the second application is running and the user's gesture track matches the cancel-reminder prompt gesture in the prompt gesture package, continuing to run the second application.
10. A gesture operation device, characterized by comprising:
an analysis unit for receiving and analyzing video data of a user carrying three-dimensional depth information, and obtaining three-dimensional data of the user;
a generation unit for generating a prompt gesture according to the three-dimensional data when an interrupt operation occurs, the prompt gesture being used to instruct the user to handle the interrupt operation;
an execution unit for detecting a gesture track input by the user, and executing the operation corresponding to the prompt gesture when the gesture track matches the prompt gesture;
the generation unit being specifically configured to: when the interrupt operation is produced by the user or the system, generate a corresponding prompt gesture package according to the three-dimensional data and the gesture package matched with the current operation interface of a first application currently running on the system, the prompt gesture package including at least one prompt gesture;
a gesture storage unit for storing the prompt gesture package, the prompt gestures in the prompt gesture package differing from the gestures in the gesture package matched with the current first application;
an update unit for updating the prompt gesture package according to the different gesture package matched with the current first application.
11. The device according to claim 10, characterized in that the prompt gesture package comprises: a reject-second-application prompt gesture and an accept-second-application prompt gesture, wherein the reject-second-application prompt gesture is used to instruct the user to reject the second application, and the accept-second-application prompt gesture is used to instruct the user to accept the second application;
after the second application is accepted, the generated prompt gesture package comprises: a return-to-first-application prompt gesture and a cancel-reminder prompt gesture, wherein the return-to-first-application prompt gesture is used to instruct the user to return to the first application, and the cancel-reminder prompt gesture is used to instruct the user to cancel the reminder and continue executing the second application.
12. The device according to claim 10, characterized in that the gesture storage unit is also used to store the gesture packages, wherein each application matches a different gesture package at each operation interface or operation stage, and a gesture package consists of at least one gesture.
13. The device according to claim 10, characterized in that the device further comprises:
a judgment unit for judging whether the gesture track matches the prompt gesture, and prompting the user about the matching relationship through changes of a prompt icon or of the prompt gesture.
14. The device according to claim 13, characterized in that the judgment unit comprises:
a first judgment module for judging whether the starting position of the gesture track corresponds to the starting position of the prompt gesture;
a second judgment module for judging, when the starting position of the gesture track corresponds to the starting position of the prompt gesture, whether the gesture track moves along the preset trajectory of the prompt gesture;
a third judgment module for judging, when the gesture track moves along the preset trajectory of the prompt gesture, whether the gesture track reaches a preset position along the preset trajectory of the prompt gesture.
15. The device according to claim 10, characterized in that the execution unit is specifically configured to: when the gesture track reaches the preset position along the preset trajectory of the prompt gesture, determine that the gesture track matches the prompt gesture, prompt the user with a fully lit prompt icon, and execute the operation corresponding to the prompt gesture.
16. The device according to claim 14, characterized in that the third judgment module is configured to: when the gesture track does not reach the preset position along the preset trajectory of the prompt gesture, prompt the user through the brightness change of the prompt icon and continue to judge whether the gesture track reaches the preset position along the preset trajectory of the prompt gesture.
17. The device according to claim 14, wherein
the first judging module is further configured to: when the start position of the gesture trajectory does not correspond to the start position of the prompt gesture, prompt the user through alternating bright and dark changes of the prompt icon, and continue to judge whether the start position of the gesture trajectory corresponds to the start position of the prompt gesture.
18. The device according to claim 13, wherein the judging unit further comprises:
a prompting module, configured to prompt the user, through alternating bright and dark changes of the prompt icon, that the gesture has not been input along the preset trajectory when the gesture trajectory does not move along the preset trajectory of the prompt gesture.
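Claims 15 through 18 share one feedback convention: all prompt icons light up on a full match, while an alternating bright/dark (blinking) icon tells the user the match is incomplete or off track. A minimal sketch of that mapping, with stage and directive names invented for illustration:

```python
def prompt_icon_feedback(stage):
    """Map matching progress to the prompt-icon behaviour of claims 15-18.
    Stage names and directive strings are illustrative, not from the patent:
      'matched'     -> light all prompt icons (claim 15)
      anything else -> alternate the icon bright/dark while the user is
                       guided back onto the preset trajectory (claims 16-18)
    """
    return 'all_icons_lit' if stage == 'matched' else 'icon_bright_dark'
```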
19. The device according to claim 11, wherein the execution unit comprises:
a rejection module, configured to refuse to respond to the second application when the gesture trajectory input by the user matches the prompt gesture for rejecting the second application among the prompt gestures;
a response module, configured to run the second application when the gesture trajectory input by the user matches the prompt gesture for accepting the second application among the prompt gestures;
a return module, configured to run the first application when, while the second application is running, the gesture trajectory input by the user matches the prompt gesture for returning to the first application among the prompt gestures;
a cancellation module, configured to continue running the second application when, while the second application is running, the gesture trajectory input by the user matches the prompt gesture for cancelling the reminder among the prompt gestures.
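The four modules of claim 19 form a small dispatch over which prompt gesture was matched and which application is currently running (the typical scenario being a second application, such as an incoming call, interrupting a first). A sketch under assumed names; the prompt identifiers and state layout are illustrative, not from the patent:

```python
def execute_prompt(matched_prompt, state):
    """Sketch of the execution unit of claim 19. `state['running']` names
    the active application; prompt identifiers are illustrative."""
    if matched_prompt == 'reject_second':
        pass                              # rejection module: refuse to respond to the second application
    elif matched_prompt == 'accept_second':
        state['running'] = 'second'       # response module: run the second application
    elif state['running'] == 'second':
        if matched_prompt == 'return_first':
            state['running'] = 'first'    # return module: go back to the first application
        elif matched_prompt == 'cancel_reminder':
            pass                          # cancellation module: keep running the second application
    return state
```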
20. A gesture operation system, comprising: a depth capture device, a central processing device, a gesture recognition device, and a display device;
the depth capture device is configured to obtain video data of a user carrying three-dimensional depth information, and send the video data of the user carrying three-dimensional depth information to the central processing device;
the central processing device is configured to receive and analyze the video data of the user carrying three-dimensional depth information to obtain three-dimensional data of the user; generate, when an interrupt operation occurs, a prompt gesture according to the three-dimensional data, the prompt gesture being used to guide the user in handling the interrupt operation; and execute, when the gesture trajectory matches the prompt gesture, the operation corresponding to the prompt gesture;
the gesture recognition device is configured to detect the gesture trajectory input by the user;
the display device is configured to display the current operation interface of the current application, the prompt gesture, and the gesture trajectory input by the user;
wherein the central processing device comprises the gesture operation device according to any one of claims 10 to 19.
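The system of claim 20 wires four devices into one data flow: depth capture feeds the central processor, which generates the prompt; the recognizer supplies the user's trajectory; the display shows both; a match triggers the operation. A minimal sketch with the devices modeled as plain callables; all interface names are assumptions made for illustration:

```python
class GestureOperationSystem:
    """Sketch of the claim-20 data flow under assumed device interfaces."""

    def __init__(self, depth_capture, central_processor, gesture_recognizer, display):
        self.depth_capture = depth_capture            # () -> video data with 3-D depth info
        self.central_processor = central_processor    # video -> user 3-D data
        self.gesture_recognizer = gesture_recognizer  # () -> user's gesture trajectory
        self.display = display                        # (prompt, trajectory) -> None

    def handle_interrupt(self, make_prompt, matches, execute):
        video = self.depth_capture()              # capture user video with depth info
        user_3d = self.central_processor(video)   # analyze into the user's 3-D data
        prompt = make_prompt(user_3d)             # generate the prompt gesture
        trajectory = self.gesture_recognizer()    # detect the input gesture trajectory
        self.display(prompt, trajectory)          # show prompt and input on screen
        if matches(trajectory, prompt):           # trajectory matches the prompt gesture
            execute(prompt)                       # run the corresponding operation
```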
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210201760.XA CN103513752B (en) | 2012-06-18 | 2012-06-18 | Gesture operation method, gesture operation device and gesture operation system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103513752A CN103513752A (en) | 2014-01-15 |
CN103513752B true CN103513752B (en) | 2017-02-22 |
Family
ID=49896621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210201760.XA Active CN103513752B (en) | 2012-06-18 | 2012-06-18 | Gesture operation method, gesture operation device and gesture operation system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103513752B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104615984B (en) * | 2015-01-28 | 2018-02-02 | 广东工业大学 | Gesture identification method based on user task |
CN105204743A (en) | 2015-09-28 | 2015-12-30 | 百度在线网络技术(北京)有限公司 | Interaction control method and device for speech and video communication |
CN107819962A (en) * | 2017-11-09 | 2018-03-20 | 上海市共进通信技术有限公司 | The system and method for intelligent call function is realized based on home gateway |
CN109068063B (en) * | 2018-09-20 | 2021-01-15 | 维沃移动通信有限公司 | Three-dimensional image data processing and displaying method and device and mobile terminal |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101349944A (en) * | 2008-09-03 | 2009-01-21 | 宏碁股份有限公司 | Gesticulation guidance system and method for controlling computer system by touch control gesticulation |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9875013B2 (en) * | 2009-03-16 | 2018-01-23 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
2012-06-18: Application CN201210201760.XA filed in China; granted as CN103513752B (status: Active)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101349944A (en) * | 2008-09-03 | 2009-01-21 | 宏碁股份有限公司 | Gesticulation guidance system and method for controlling computer system by touch control gesticulation |
Also Published As
Publication number | Publication date |
---|---|
CN103513752A (en) | 2014-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107817939B (en) | Image processing method and mobile terminal | |
CN105940365B (en) | A kind of notification information processing method, device and terminal | |
CN104461084B (en) | A kind of control device and method | |
CN105389111B (en) | A kind of operating method and electronic equipment of split screen display available | |
EP3059942B1 (en) | Method and apparatus for supporting image processing, and computer-readable recording medium for executing the method | |
CN103513752B (en) | Gesture operation method, gesture operation device and gesture operation system | |
WO2020063758A1 (en) | Game Picture Display Method and Apparatus, Storage Medium and Electronic Device | |
CN104571925B (en) | The one-handed performance method and device of mobile terminal | |
US9781252B2 (en) | Information processing method, system and mobile terminal | |
US20070110287A1 (en) | Remote input method using fingerprint recognition sensor | |
US20110246952A1 (en) | Electronic device capable of defining touch gestures and method thereof | |
JP2013505495A (en) | Input device and method for portable terminal | |
KR20100052378A (en) | Motion input device for portable device and operation method using the same | |
CN103365393A (en) | Display method and electronic device | |
WO2017185459A1 (en) | Method and apparatus for moving icons | |
CN105867818A (en) | Terminal interaction control device | |
CN102880420B (en) | Method and system based on touch screen for starting and implementing area selection operation | |
CN103279253A (en) | Method and terminal device for theme setting | |
CN103677266B (en) | Electronic equipment and display control method and system thereof | |
CN105094645A (en) | Information processing method and electronic equipment | |
CN103856798A (en) | Operation method and electronic equipment | |
CN110136718A (en) | The method and apparatus of voice control | |
CN109859307A (en) | A kind of image processing method and terminal device | |
CN108984096A (en) | touch operation method, device, storage medium and electronic equipment | |
CN109688253A (en) | A kind of image pickup method and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |