CN107450717A - Information processing method and wearable device - Google Patents
Information processing method and wearable device
- Publication number: CN107450717A
- Application number: CN201610378110.0A
- Authority
- CN
- China
- Prior art keywords
- gesture
- wearable
- control instruction
- frame images
- feature points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Abstract
The invention discloses an information processing method and a wearable device. The method is applied to a wearable device and includes: obtaining M frames of environment images of the wearable device, where M is a positive integer; identifying, from the M frames of environment images, a first operating gesture of a user of the wearable device; and determining and executing, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture. The method solves the prior-art technical problem that wearable electronic devices offer only a single control mode, and achieves the technical effect of diversifying the control modes of wearable electronic devices.
Description
Technical field
The present invention relates to the field of electronic technology, and in particular to an information processing method and a wearable device.
Background
With the continuous development of science and technology, electronic technology has advanced rapidly, and many wearable electronic devices, such as smart watches and smart glasses, have become necessities of daily life.

In the prior art, taking smart glasses as an example, for the user's convenience smart glasses can be controlled not only by voice but also through a touch interaction mode. Thus, when voice control fails, for example in a noisy environment, the smart glasses can be controlled by touch instead.

It can be seen that prior-art wearable electronic devices can be controlled in only two ways, by touch and by voice, so prior-art wearable electronic devices suffer from the technical problem of a single control mode.
Summary of the Invention
The embodiments of the present invention provide an information processing method and a wearable device, to solve the prior-art technical problem that wearable electronic devices offer only a single control mode, and to achieve the technical effect of diversifying the control modes of wearable electronic devices.
A first aspect of the embodiments of the present application provides an information processing method, applied to a wearable device, including:
obtaining M frames of environment images of the wearable device, where M is a positive integer;
identifying, from the M frames of environment images, a first operating gesture of a user of the wearable device; and
determining and executing, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
Optionally, the first operating gesture is an O-shaped gesture, where the O-shaped gesture is that the tip of the user's thumb is in contact with the tip of the index finger and the curvatures of the index finger, middle finger, ring finger and little finger are within a first preset range; or
the first operating gesture is a fist gesture, where the fist gesture is that the user's thumb partially fits against the index finger and the curvatures of the index finger, middle finger, ring finger and little finger are within a second preset range; or
the first operating gesture is a V-shaped gesture, where the V-shaped gesture is that the user's index finger and middle finger are spread apart, the thumb partially fits against the ring finger, and the curvatures of the ring finger and little finger are within a third preset range; or
the first operating gesture is a spread gesture, where the spread gesture is that the user's five fingers are spread open; or
the first operating gesture is a leftward swing gesture, where the leftward swing gesture is that the user's five fingers fit together and move leftward in a bending motion; or
the first operating gesture is a rightward swing gesture, where the rightward swing gesture is that the user's five fingers fit together and move rightward in a bending motion.
Correspondingly, identifying, from the M frames of environment images, the first operating gesture of the user of the wearable device includes:
identifying the O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or
identifying the fist gesture from the M frames of images, where the fist gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or
identifying the V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the connecting line of the feature points of each hand region in the M frames of images is V-shaped; or
identifying the spread gesture from the M frames of images, where the spread gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a palm-shaped curve; or
identifying the leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or
identifying the rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
Optionally, determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture includes:
based on the correspondence, determining and executing a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determining and executing a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determining and executing a confirm control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirm control instruction; or
based on the correspondence, determining and executing a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determining and executing a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determining and executing a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
Optionally, identifying, from the M frames of environment images, the first operating gesture of the user of the wearable device includes:
taking i from 1 to M in turn, identifying the i-th hand region of the user in the i-th frame of environment image among the M frames, and obtaining M hand regions when i reaches M;
taking j from 1 to M in turn, extracting a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, and obtaining M groups of at least three edge feature points when j reaches M, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
determining that the variation between any two groups of the M groups of at least three edge feature points is smaller than a preset variation value; and
determining the first operating gesture based on the connecting-line shape of the M groups of at least three edge feature points.
Optionally, after taking j from 1 to M in turn and extracting the j-th group of at least three edge feature points of the j-th hand region among the M hand regions, the method further includes:
determining that the variation between any two groups of the M groups of at least three edge feature points is larger than the preset variation value; and
determining the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
A second aspect of the embodiments of the present application provides a wearable device, including:
a housing;
a sensor, arranged in the housing and configured to obtain M frames of environment images of the wearable device, where M is a positive integer; and
a processor, arranged in the housing and configured to identify, from the M frames of environment images, a first operating gesture of a user of the wearable device, and to determine and execute, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
Optionally, the processor is specifically configured to:
identify the O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or
identify the fist gesture from the M frames of images, where the fist gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or
identify the V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the connecting line of the feature points of each hand region in the M frames of images is V-shaped; or
identify the spread gesture from the M frames of images, where the spread gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a palm-shaped curve; or
identify the leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or
identify the rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
Optionally, the processor is specifically configured to:
based on the correspondence, determine and execute a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determine and execute a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determine and execute a confirm control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirm control instruction; or
based on the correspondence, determine and execute a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determine and execute a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determine and execute a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
Optionally, the processor is specifically configured to:
take i from 1 to M in turn, identify the i-th hand region of the user in the i-th frame of environment image among the M frames, and obtain M hand regions when i reaches M;
take j from 1 to M in turn, extract a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, and obtain M groups of at least three edge feature points when j reaches M, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
determine that the variation between any two groups of the M groups of at least three edge feature points is smaller than a preset variation value; and
determine the first operating gesture based on the connecting-line shape of the M groups of at least three edge feature points.
Optionally, the processor is further configured to:
determine that the variation between any two groups of the M groups of at least three edge feature points is larger than the preset variation value; and
determine the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
A third aspect of the embodiments of the present application provides a wearable device, including:
a first obtaining unit, configured to obtain M frames of environment images of the wearable device, where M is a positive integer;
a first identifying unit, configured to identify, from the M frames of environment images, a first operating gesture of a user of the wearable device; and
a first executing unit, configured to determine and execute, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
One or more of the above technical solutions in the embodiments of the present application have at least the following technical effects:

The technical solutions in the embodiments of the present application adopt the technical means of obtaining M frames of environment images of the wearable device, where M is a positive integer; identifying, from the M frames of environment images, the first operating gesture of the user of the wearable device; and determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture. In this way, the wearable device can be controlled through the user gesture identified in the environment images, which provides the user with a new control mode in addition to touch and voice control, so that the user can select a suitable control mode according to actual needs. The technical problem that prior-art wearable electronic devices offer only a single control mode is thereby solved, and the technical effect of diversifying the control modes of wearable electronic devices is achieved.
Brief description of the drawings
To explain the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below; obviously, the drawings in the following description show only some embodiments of the present invention.
Fig. 1 is a flowchart of an information processing method provided in Embodiment 1 of the present application;
Fig. 2A is a schematic diagram of the O-shaped gesture in Embodiment 1 of the present application;
Fig. 2B is a schematic diagram of the fist gesture in Embodiment 1 of the present application;
Fig. 2C is a schematic diagram of the V-shaped gesture in Embodiment 1 of the present application;
Fig. 2D is a schematic diagram of the spread gesture in Embodiment 1 of the present application;
Fig. 2E is a schematic diagram of the leftward swing gesture in Embodiment 1 of the present application;
Fig. 2F is a schematic diagram of the rightward swing gesture in Embodiment 1 of the present application;
Fig. 3A is a schematic diagram of a first additional operating gesture in Embodiment 1 of the present application;
Fig. 3B is a schematic diagram of a second additional operating gesture in Embodiment 1 of the present application;
Fig. 4 is a flowchart of the specific implementation of step S102 in Embodiment 1 of the present application;
Fig. 5 is a schematic structural diagram of a wearable device provided in Embodiment 2 of the present application;
Fig. 6 is a structural block diagram of a wearable device provided in Embodiment 3 of the present application.
Detailed Description of the Embodiments
The embodiments of the present invention provide an information processing method and a wearable device, to solve the prior-art technical problem that wearable electronic devices offer only a single control mode, and to achieve the technical effect of diversifying the control modes of wearable electronic devices.
The technical solutions in the embodiments of the present application are directed at the above technical problem, and the general idea is as follows:
An information processing method, applied to a wearable device, including:
obtaining M frames of environment images of the wearable device, where M is a positive integer;
identifying, from the M frames of environment images, a first operating gesture of a user of the wearable device; and
determining and executing, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
The above technical solution adopts the technical means of obtaining M frames of environment images of the wearable device, where M is a positive integer; identifying, from the M frames of environment images, the first operating gesture of the user of the wearable device; and determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture. In this way, the wearable device can be controlled through the user gesture identified in the environment images, which provides the user with a new control mode in addition to touch and voice control, so that the user can select a suitable control mode according to actual needs. The technical problem that prior-art wearable electronic devices offer only a single control mode is solved, and the technical effect of diversifying the control modes of wearable electronic devices is achieved.
For a better understanding of the above technical solutions, they are described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific features in the embodiments of the present application are a detailed explanation of the technical solutions of the present invention rather than a limitation on them, and the technical features in the embodiments of the present application may be combined with each other where no conflict arises.
Embodiment 1
Referring to Fig. 1, a flowchart of the information processing method provided in Embodiment 1 of the present application, the method is applied to a wearable device and includes:
S101: obtaining M frames of environment images of the wearable device, where M is a positive integer;
S102: identifying, from the M frames of environment images, a first operating gesture of a user of the wearable device;
S103: determining and executing, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
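To make the S101-S103 flow concrete, the following is a minimal sketch in Python; all identifiers (run_pipeline, recognize_gesture, the instruction table) are illustrative assumptions, not names taken from the patent.

```python
# Minimal sketch of the S101-S103 pipeline; identifiers are illustrative.
from typing import Callable, Dict, List, Optional

import numpy as np


def recognize_gesture(frames: List[np.ndarray]) -> Optional[str]:
    """Placeholder for step S102; a real implementation would return a
    gesture label such as "fist" or "o_shape" (see Fig. 4)."""
    return None


def run_pipeline(capture_frame: Callable[[], np.ndarray],
                 m: int,
                 instruction_table: Dict[str, Callable[[], None]]) -> None:
    # S101: obtain M frames of environment images (M is a positive integer).
    frames = [capture_frame() for _ in range(m)]
    # S102: identify the user's first operating gesture from the M frames.
    gesture = recognize_gesture(frames)
    # S103: determine and execute the control instruction that the preset
    # correspondence maps to the recognized gesture.
    if gesture is not None and gesture in instruction_table:
        instruction_table[gesture]()
```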
In a specific implementation process, the wearable device may be smart glasses, a smart bracelet or the like, and may of course be any other wearable device capable of capturing images and performing image processing; the examples given here are merely illustrative. In the embodiments of the present application, the method is described in detail by taking smart glasses as the wearable device.
When information is processed using the method in the embodiment of the present application, step S101 is performed first: obtaining M frames of environment images of the wearable device, where M is a positive integer.
In a specific implementation process, taking smart glasses as an example of the wearable device, the smart glasses may be equipped with an image acquisition unit, such as an infrared camera, an ordinary camera or a depth camera. The smart glasses may also be provided with a control-mode switch key so that the user can select the desired control mode from multiple control modes, for example gesture recognition in a noisy environment and touch or voice in a quiet environment. When the smart glasses detect that the user's current control mode is gesture recognition, the camera of the smart glasses is turned on, and the camera collects the environment images within a preset duration, for example within 5 s. Assuming the camera captures one image per second, the smart glasses will obtain 5 frames of environment images.
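As an illustration of this acquisition step, the sketch below captures one frame per second for 5 s with OpenCV, matching the example above; the camera index and the use of cv2.VideoCapture are assumptions about the acquisition unit.

```python
# Sketch of the acquisition step: one frame per second for 5 s, so M = 5.
import time
from typing import List

import cv2
import numpy as np


def capture_environment_frames(duration_s: float = 5.0,
                               fps: float = 1.0) -> List[np.ndarray]:
    camera = cv2.VideoCapture(0)  # assumed camera index
    frames: List[np.ndarray] = []
    try:
        for _ in range(int(duration_s * fps)):
            ok, frame = camera.read()  # one environment image per interval
            if ok:
                frames.append(frame)
            time.sleep(1.0 / fps)  # wait until the next acquisition instant
    finally:
        camera.release()
    return frames
```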
After step S101 is completed, the method in the embodiment of the present application performs step S102: identifying, from the M frames of environment images, the first operating gesture of the user of the wearable device.
In Embodiment 1 of the present application, the first operating gesture is an O-shaped gesture, where the O-shaped gesture is that the tip of the user's thumb is in contact with the tip of the index finger and the curvatures of the index finger, middle finger, ring finger and little finger are within a first preset range; or
the first operating gesture is a fist gesture, where the fist gesture is that the user's thumb partially fits against the index finger and the curvatures of the index finger, middle finger, ring finger and little finger are within a second preset range; or
the first operating gesture is a V-shaped gesture, where the V-shaped gesture is that the user's index finger and middle finger are spread apart, the thumb partially fits against the ring finger, and the curvatures of the ring finger and little finger are within a third preset range; or
the first operating gesture is a spread gesture, where the spread gesture is that the user's five fingers are spread open; or
the first operating gesture is a leftward swing gesture, where the leftward swing gesture is that the user's five fingers fit together and move leftward in a bending motion; or
the first operating gesture is a rightward swing gesture, where the rightward swing gesture is that the user's five fingers fit together and move rightward in a bending motion.
In a specific implementation process, continuing the above example, the definitions of the various operating gestures are prestored in the memory of the smart glasses: the O-shaped gesture, as shown in Fig. 2A; the fist gesture, as shown in Fig. 2B; the V-shaped gesture, as shown in Fig. 2C; the spread gesture, as shown in Fig. 2D; the leftward swing gesture, as shown in Fig. 2E; and the rightward swing gesture, as shown in Fig. 2F. Of course, the gestures may also be the leftward-pointing and rightward-pointing gestures shown in Fig. 3A and Fig. 3B. After the smart glasses obtain the 5 frames of environment images, image processing is performed on them, for example obtaining a scene depth map of the images, performing recognition based on the depth map, extracting feature points, and obtaining the user gesture through continuous tracking.
In the embodiment of the present application, referring to Fig. 4, the specific implementation of step S102 is:
S401: taking i from 1 to M in turn, identifying the i-th hand region of the user in the i-th frame of environment image among the M frames, and obtaining M hand regions when i reaches M;
S402: taking j from 1 to M in turn, extracting a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, and obtaining M groups of at least three edge feature points when j reaches M, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
S403: determining that the variation between any two groups of the M groups of at least three edge feature points is smaller than a preset variation value;
S404: determining the first operating gesture based on the connecting-line shape of the M groups of at least three edge feature points;
S405: determining that the variation between any two groups of the M groups of at least three edge feature points is larger than the preset variation value;
S406: determining the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
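The branch between S403-S404 (static gesture) and S405-S406 (dynamic gesture) can be sketched as follows, assuming each group of edge feature points is an (N, 2) array of image coordinates with corresponding points in the same order across frames; the threshold parameter plays the role of the preset variation value.

```python
# Sketch of the static/dynamic branch (S403-S406); the representation of
# each group as an (N, 2) coordinate array is an assumption.
from typing import List

import numpy as np


def classify_motion(groups: List[np.ndarray], threshold: float) -> str:
    """Return "static" if the variation between every pair of groups of
    edge feature points stays below the preset variation value (S403),
    otherwise "dynamic" (S405)."""
    for a in range(len(groups)):
        for b in range(a + 1, len(groups)):
            # Variation between two groups: mean displacement of the points.
            variation = float(np.mean(
                np.linalg.norm(groups[a] - groups[b], axis=1)))
            if variation >= threshold:
                # S406: determine the gesture from the direction of motion.
                return "dynamic"
    # S404: determine the gesture from the connecting-line shape.
    return "static"
```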
In a specific implementation process, continuing the above example, gesture positioning and segmentation are first performed on each of the 5 obtained frames of environment images to obtain the hand region in each frame. For example, the user's skin colour is modelled, and the hand region is determined by judging whether the colour of each pixel in the image belongs to the skin colour; of course, those skilled in the art may also use other methods. After the hand region in each frame is determined, the feature points of the hand region in each frame are extracted. For example, after the binary image of the hand region is obtained, the hand image is projected onto the four sides of its bounding rectangle to obtain the palm contour feature points; the palm-centre position of the hand region is determined, a polar coordinate system is established with that position as the origin, the palm contour feature points are transformed into the polar coordinate system, and finger positions are extracted using the rule that a fingertip corresponds to a local maximum of the polar radius. The palm contour feature points and the position features of each fingertip are taken as the feature points of the hand region in each frame. The hand-region feature points in the frames are then compared to judge whether the variation of the hand-region feature points across the 5 frames exceeds a preset variation value, for example whether the change in position of the feature points exceeds a preset value of 0.5 cm. When the position change is smaller than 0.5 cm, it is determined that the current user gesture is a static gesture, and the gesture is obtained through static gesture recognition: for example, the hand-region feature points of a certain frame are obtained, the positional relations of the fingers are derived, and the current gesture is determined from those relations. When the position change is larger than 0.5 cm, it is determined that the current user gesture is a dynamic gesture, and the gesture is obtained through dynamic gesture recognition: for example, the first group of hand-region feature points of the first frame and the third group of the third frame are obtained, the change trend of the hand region from the first frame to the third frame is computed, and the current gesture is determined from that trend.
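A minimal sketch of this feature-point extraction follows, under the stated skin-colour and polar-coordinate approach; the HSV skin range, the moment-based palm-centre estimate and the 1.3× radius factor are illustrative assumptions, not values from the patent.

```python
# Sketch: skin-colour segmentation, palm-centre origin, fingertips as
# local maxima of the contour's polar radius.
import cv2
import numpy as np


def extract_hand_features(frame_bgr: np.ndarray):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array((0, 40, 60), dtype=np.uint8)    # assumed skin model
    upper = np.array((25, 255, 255), dtype=np.uint8)
    skin = cv2.inRange(hsv, lower, upper)
    contours, _ = cv2.findContours(skin, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)        # hand-region contour
    moments = cv2.moments(hand)
    if moments["m00"] == 0:
        return None
    cx = moments["m10"] / moments["m00"]             # palm-centre estimate
    cy = moments["m01"] / moments["m00"]
    pts = hand.reshape(-1, 2).astype(np.float64)     # palm contour points
    radius = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)  # polar radius
    # Fingertips: local maxima of the polar radius that stand clearly
    # above the mean contour radius (factor 1.3 is an assumption).
    tips = []
    for k in range(len(radius)):
        if (radius[k] > radius[k - 1]
                and radius[k] > radius[(k + 1) % len(radius)]
                and radius[k] > 1.3 * radius.mean()):
            tips.append((pts[k, 0], pts[k, 1]))
    return (cx, cy), tips
```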
In a specific implementation process, continuing the above example, the feature-point position relation corresponding to each gesture may be prestored in the memory of the smart glasses. The position relation of the O-shaped gesture is: the connecting line of the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and the first feature point corresponding to the first fingertip of each hand region is in contact with the second feature point corresponding to the second fingertip. The position relation of the fist gesture is: the connecting line of the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and the first area corresponding to the first finger of each hand region is in contact with the second area corresponding to the second finger. The position relation of the V-shaped gesture is: at least a portion of the connecting line of the feature points of each hand region in the M frames of images is V-shaped. The position relation of the spread gesture is: the connecting line of the feature points of each hand region in the M frames of images forms a palm-shaped curve. The position relation of the leftward swing gesture is: the direction of change of the M hand regions in the M frames of images is from right to left. The position relation of the rightward swing gesture is: the direction of change of the M hand regions in the M frames of images is from left to right.
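A much-simplified sketch of matching fingertip features against such prestored position relations follows; reducing each relation to a fingertip count and a contact-distance test is an assumption made for brevity, not the patent's full criterion.

```python
# Crude static-gesture matcher over fingertip candidates; thresholds and
# the count-based simplification are assumptions.
from typing import List, Tuple

import numpy as np


def match_static_gesture(fingertips: List[Tuple[float, float]],
                         contact_px: float = 15.0) -> str:
    tips = np.asarray(fingertips, dtype=np.float64)
    if len(tips) >= 5:
        return "spread"      # connecting line forms a palm-shaped curve
    if len(tips) >= 2:
        gaps = np.linalg.norm(tips[:, None, :] - tips[None, :, :], axis=2)
        np.fill_diagonal(gaps, np.inf)
        if gaps.min() < contact_px:  # two fingertip points in contact
            return "o_shape"
        return "v_shape"     # part of the connecting line is V-shaped
    return "fist"            # closed-area curve, no protruding fingertips
```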
Thus, in the embodiment of the present application, the specific implementation of step S102 is:
identifying the O-shaped gesture from the M frames of images; or
identifying the fist gesture from the M frames of images; or
identifying the V-shaped gesture from the M frames of images; or
identifying the spread gesture from the M frames of images; or
identifying the leftward swing gesture from the M frames of images; or
identifying the rightward swing gesture from the M frames of images.
After step S102 is completed, the method in the embodiment of the present application performs step S103: determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture.
In the embodiment of the present application, the specific implementation of step S103 is:
based on the correspondence, determining and executing a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determining and executing a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determining and executing a confirm control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirm control instruction; or
based on the correspondence, determining and executing a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determining and executing a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determining and executing a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
In a specific implementation process, continuing the above example, after the smart glasses identify the user's gesture, the smart glasses determine and execute the control instruction corresponding to the current gesture according to the gesture-to-instruction correspondence prestored in the memory. The correspondence stored in the memory is: the O-shaped gesture corresponds to the Home-key control instruction; the fist gesture corresponds to the zoom-out control instruction; the V-shaped gesture corresponds to the confirm control instruction; the spread gesture corresponds to the zoom-in control instruction; the leftward swing gesture corresponds to the page-forward control instruction; and the rightward swing gesture corresponds to the page-backward control instruction. Taking the current user gesture being the fist gesture as an example, the smart glasses determine from the prestored correspondence that the current control instruction is the zoom-out control instruction. The smart glasses then obtain the current display interface and determine that its font size is No. 4. The zoom-out rule prestored in the smart glasses is to reduce the currently displayed font by one size, so it is determined that the font size of the current display interface needs to be adjusted to No. 5. The smart glasses then adjust the font size of the current display interface to No. 5, re-typeset the page and display it again, thereby realizing the zoom-out function for the page.

When the user gesture is another gesture, the corresponding other control instruction is executed; details are not repeated in the embodiments of the present application.
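The prestored correspondence lends itself to a dispatch table; the sketch below mirrors the six mappings above, with handler bodies reduced to illustrative placeholders (the printed zoom-out message reflects the No. 4 to No. 5 font example).

```python
# Sketch of the prestored gesture -> control-instruction correspondence;
# handler names and bodies are illustrative assumptions.
from typing import Callable, Dict


def go_home() -> None:
    print("Home key: adjust display to desktop content")

def zoom_out() -> None:
    print("Zoom out: e.g. reduce font from No. 4 to No. 5 and re-typeset")

def confirm() -> None:
    print("Confirm: perform the confirmed operation")

def zoom_in() -> None:
    print("Zoom in: enlarge the display content")

def page_forward() -> None:
    print("Page forward: show the previous page")

def page_backward() -> None:
    print("Page backward: show the next page")


INSTRUCTION_TABLE: Dict[str, Callable[[], None]] = {
    "o_shape": go_home,            # O-shaped gesture -> Home-key instruction
    "fist": zoom_out,              # fist gesture -> zoom-out instruction
    "v_shape": confirm,            # V-shaped gesture -> confirm instruction
    "spread": zoom_in,             # spread gesture -> zoom-in instruction
    "swing_left": page_forward,    # leftward swing -> page forward
    "swing_right": page_backward,  # rightward swing -> page backward
}

INSTRUCTION_TABLE["fist"]()  # a recognized fist gesture triggers zoom-out
```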
Embodiment 2
Based on the same inventive concept as Embodiment 1 of the present application, and referring to Fig. 5, Embodiment 2 of the present application provides a wearable device, including:
a housing 10;
a sensor 20, arranged in the housing 10 and configured to obtain M frames of environment images of the wearable device, where M is a positive integer; and
a processor 30, arranged in the housing 10 and configured to identify, from the M frames of environment images, a first operating gesture of a user of the wearable device, and to determine and execute, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
In Embodiment 2 of the present application, the processor 30 is specifically configured to:
identify the O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or
identify the fist gesture from the M frames of images, where the fist gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or
identify the V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the connecting line of the feature points of each hand region in the M frames of images is V-shaped; or
identify the spread gesture from the M frames of images, where the spread gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a palm-shaped curve; or
identify the leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or
identify the rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
In Embodiment 2 of the present application, the processor 30 is specifically configured to:
based on the correspondence, determine and execute a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determine and execute a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determine and execute a confirm control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirm control instruction; or
based on the correspondence, determine and execute a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determine and execute a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determine and execute a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
In Embodiment 2 of the present application, the processor 30 is specifically configured to:
take i from 1 to M in turn, identify the i-th hand region of the user in the i-th frame of environment image among the M frames, and obtain M hand regions when i reaches M;
take j from 1 to M in turn, extract a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, and obtain M groups of at least three edge feature points when j reaches M, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
determine that the variation between any two groups of the M groups of at least three edge feature points is smaller than a preset variation value; and
determine the first operating gesture based on the connecting-line shape of the M groups of at least three edge feature points.
In Embodiment 2 of the present application, the processor 30 is further configured to:
determine that the variation between any two groups of the M groups of at least three edge feature points is larger than the preset variation value; and
determine the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
Embodiment 3
Based on the same inventive concept as Embodiment 1 of the present application, and referring to Fig. 6, Embodiment 3 of the present application provides a wearable device, including:
a first obtaining unit 101, configured to obtain M frames of environment images of the wearable device, where M is a positive integer;
a first identifying unit 102, configured to identify, from the M frames of environment images, a first operating gesture of a user of the wearable device; and
a first executing unit 103, configured to determine and execute, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
In Embodiment 3 of the present application, the first identifying unit 102 is specifically configured to:
identify the O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or
identify the fist gesture from the M frames of images, where the fist gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or
identify the V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the connecting line of the feature points of each hand region in the M frames of images is V-shaped; or
identify the spread gesture from the M frames of images, where the spread gesture is a gesture in which the connecting line of the feature points of each hand region in the M frames of images forms a palm-shaped curve; or
identify the leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or
identify the rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
In Embodiment 3 of the present application, the first executing unit 103 is specifically configured to:
based on the correspondence, determine and execute a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determine and execute a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determine and execute a confirm control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirm control instruction; or
based on the correspondence, determine and execute a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determine and execute a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determine and execute a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
In Embodiment 3 of the present application, the first identifying unit 102 is specifically configured to:
take i from 1 to M in turn, identify the i-th hand region of the user in the i-th frame of environment image among the M frames, and obtain M hand regions when i reaches M;
take j from 1 to M in turn, extract a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, and obtain M groups of at least three edge feature points when j reaches M, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
determine that the variation between any two groups of the M groups of at least three edge feature points is smaller than a preset variation value; and
determine the first operating gesture based on the connecting-line shape of the M groups of at least three edge feature points.
In Embodiment 3 of the present application, the first identifying unit 102 is further configured to:
determine that the variation between any two groups of the M groups of at least three edge feature points is larger than the preset variation value; and
determine the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
Through one or more technical solutions of the embodiments of the present application, one or more of the following technical effects can be achieved:

The technical solutions in the embodiments of the present application adopt the technical means of obtaining M frames of environment images of the wearable device, where M is a positive integer; identifying, from the M frames of environment images, the first operating gesture of the user of the wearable device; and determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture. In this way, the wearable device can be controlled through the user gesture identified in the environment images, which provides the user with a new control mode in addition to touch and voice control, so that the user can select a suitable control mode according to actual needs. The technical problem that prior-art wearable electronic devices offer only a single control mode is thereby solved, and the technical effect of diversifying the control modes of wearable electronic devices is achieved.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory and the like) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system) and computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Specifically, the computer program instructions corresponding to the information processing method in the embodiments of the present application may be stored in a storage medium such as an optical disc, a hard disk or a USB flash drive. When the computer program instructions in the storage medium corresponding to the information processing method are read or executed by an electronic device, the following steps are included:

obtaining M frames of environment images of a wearable device, where M is a positive integer;
identifying, from the M frames of environment images, a first operating gesture of a user of the wearable device; and
determining and executing, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
Optionally, the storage medium further stores computer program instructions corresponding to the step of identifying, from the M frames of ambient images, the first operating gesture of the user of the wearable device; when executed, these instructions perform the following (a shape-test sketch follows this list):

identifying an O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or

identifying a fist gesture from the M frames of images, where the fist gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or

identifying a V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the line connecting the feature points of each hand region in the M frames of images is V-shaped; or

identifying a spread gesture from the M frames of images, where the spread gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a palm-shaped curve; or

identifying a leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or

identifying a rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
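Purely as a hedged illustration of how such geometric tests might be coded — the pixel thresholds, the choice of the middle point as the V apex, and the index-based point ordering are all invented for this sketch, not taken from the patent:

```python
# Illustrative geometric tests over 2-D feature points of one hand region.
# All thresholds are demonstration values, not the patented parameters.
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def is_o_shape(edge_points, fingertip_a, fingertip_b, touch_px=10.0):
    """Closed connecting curve plus two fingertips in contact (O-shaped gesture)."""
    curve_closed = dist(edge_points[0], edge_points[-1]) < touch_px
    tips_touching = dist(fingertip_a, fingertip_b) < touch_px
    return curve_closed and tips_touching

def is_v_shape(edge_points, min_deg=20.0, max_deg=70.0):
    """At least part of the connecting line is V-shaped: two segments meeting
    at an acute angle; the middle point is taken as the apex in this sketch."""
    apex = edge_points[len(edge_points) // 2]
    a, b = edge_points[0], edge_points[-1]
    ang = abs(math.degrees(math.atan2(a[1] - apex[1], a[0] - apex[0])
                           - math.atan2(b[1] - apex[1], b[0] - apex[0])))
    ang = min(ang, 360.0 - ang)  # normalize to [0, 180] degrees
    return min_deg <= ang <= max_deg

def swing_direction(centroids, min_px=30.0):
    """Direction of change of the M hand regions across the M frames."""
    dx = centroids[-1][0] - centroids[0][0]
    if dx <= -min_px:
        return "swing_left"    # hand regions change from right to left
    if dx >= min_px:
        return "swing_right"   # hand regions change from left to right
    return None
```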
Optionally, the storage medium further stores computer program instructions corresponding to the step of determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture; when executed, these instructions perform the following (a dispatch-table sketch follows this list):

based on the correspondence, determining and executing a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or

based on the correspondence, determining and executing a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or

based on the correspondence, determining and executing a confirmation control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirmation control instruction; or

based on the correspondence, determining and executing a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or

based on the correspondence, determining and executing a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or

based on the correspondence, determining and executing a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
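The preset correspondence lends itself to a lookup table of handlers. In the hedged sketch below the handlers merely mutate a toy display-state dictionary; the wearable device's actual display APIs are not specified by this document, so these bodies are placeholders.

```python
# Illustrative dispatch table; handler bodies are placeholders for the
# wearable device's real display APIs, which this sketch does not know.
def go_home(display):       display["content"] = "desktop"   # O-shape -> Home
def zoom_out(display):      display["size"] *= 0.5           # fist: second size < first size
def confirm(display):       display["confirmed"] = True      # V-shape -> confirm
def zoom_in(display):       display["size"] *= 2.0           # spread: fourth size > third size
def page_forward(display):  display["page"] -= 1             # leftward swing -> previous page
def page_backward(display): display["page"] += 1             # rightward swing -> next page

CORRESPONDENCE = {
    "o_shape": go_home,
    "fist": zoom_out,
    "v_shape": confirm,
    "spread": zoom_in,
    "swing_left": page_forward,
    "swing_right": page_backward,
}

def execute_first_control_instruction(gesture, display):
    """Determine and execute the control instruction for a recognized gesture."""
    handler = CORRESPONDENCE.get(gesture)
    if handler is not None:
        handler(display)

# Usage: a leftward swing turns back to the previous page.
state = {"content": "app", "size": 1.0, "page": 3, "confirmed": False}
execute_first_control_instruction("swing_left", state)
print(state["page"])  # 2
```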
Optionally, the storage medium further stores computer program instructions corresponding to the step of identifying, from the M frames of ambient images, the first operating gesture of the user of the wearable device; when executed, these instructions perform the following (a variation-test sketch follows this list):

taking i from 1 to M in turn, identifying an i-th hand region of the user in the i-th frame of ambient image among the M frames of ambient images, so that when i reaches M, M hand regions have been obtained;

taking j from 1 to M in turn, extracting a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, so that when j reaches M, M groups of at least three edge feature points have been obtained, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;

determining that the variation between any two groups of at least three edge feature points among the M groups of at least three edge feature points is smaller than a preset variation value; and

determining the first operating gesture based on the shape of the line connecting the M groups of at least three edge feature points.
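A hedged sketch of this static-gesture test follows; correspondence of edge feature points across groups by index, and mean displacement as the "variation", are both assumptions of the sketch rather than definitions from the patent.

```python
import itertools
import math

def group_variation(group_a, group_b):
    """Mean displacement between index-corresponding edge feature points of
    two groups (index correspondence is an assumption of this sketch)."""
    pairs = list(zip(group_a, group_b))
    return sum(math.hypot(ax - bx, ay - by)
               for (ax, ay), (bx, by) in pairs) / len(pairs)

def is_static(groups, preset_variation=5.0):
    """True when the variation between ANY two of the M groups of at least
    three edge feature points stays below the preset variation value."""
    return all(group_variation(a, b) < preset_variation
               for a, b in itertools.combinations(groups, 2))
```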
Optionally, the storage medium further stores other computer program instructions, which are executed after the computer program instructions corresponding to the step of taking j from 1 to M in turn and extracting the j-th group of at least three edge feature points of the j-th hand region among the M hand regions; when executed, these other instructions perform the following (a combined sketch follows):

determining that the variation between any two groups of at least three edge feature points among the M groups of at least three edge feature points is larger than the preset variation value; and

determining the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
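Combining the two branches — variation below the preset value selects shape-based classification, above it motion-based classification — might look like the hedged sketch below, which reuses `is_static` and `swing_direction` from the earlier sketches; `classify_by_shape` is a hypothetical placeholder.

```python
def centroid(points):
    """Mean position of one group of edge feature points."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def classify_by_shape(group):
    """Hypothetical placeholder: a real version would apply shape tests such
    as is_o_shape / is_v_shape from the earlier sketch."""
    return "o_shape"

def classify(groups):
    """Static gestures by connection-line shape, dynamic ones by motion."""
    if is_static(groups):
        # Variation below the preset value: classify by the line shape.
        return classify_by_shape(groups[0])
    # Variation above the preset value: classify by the direction of motion.
    return swing_direction([centroid(g) for g in groups])
```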
Although preferred embodiments of the present invention have been described, those skilled in the art, once apprised of the basic inventive concept, may make additional changes and modifications to these embodiments. The appended claims are therefore intended to be construed as covering the preferred embodiments and all changes and modifications that fall within the scope of the present invention.

Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to include these changes and modifications as well.
Claims (11)
1. An information processing method, applied to a wearable device, comprising:
obtaining M frames of ambient images of the wearable device, where M is a positive integer;
identifying, from the M frames of ambient images, a first operating gesture of a user of the wearable device; and
based on a preset correspondence between operating gestures and control instructions, determining and executing a first control instruction corresponding to the first operating gesture.
2. The method according to claim 1, wherein the first operating gesture is an O-shaped gesture, the O-shaped gesture being a gesture in which the tip of the user's thumb is in contact with the tip of the index finger and the curvatures of the index finger, middle finger, ring finger and little finger are within a first preset range; or
the first operating gesture is a fist gesture, the fist gesture being a gesture in which the user's thumb is partially attached to the index finger and the curvatures of the index finger, middle finger, ring finger and little finger are within a second preset range; or
the first operating gesture is a V-shaped gesture, the V-shaped gesture being a gesture in which the user's index finger and middle finger are spread apart, the thumb is partially attached to the ring finger, and the curvatures of the ring finger and little finger are within a third preset range; or
the first operating gesture is a spread gesture, the spread gesture being a gesture in which the user's five fingers are spread open; or
the first operating gesture is a leftward swing gesture, the leftward swing gesture being a gesture in which the user's five fingers are held together and move leftward; or
the first operating gesture is a rightward swing gesture, the rightward swing gesture being a gesture in which the user's five fingers are held together and move rightward;
correspondingly, identifying, from the M frames of ambient images, the first operating gesture of the user of the wearable device comprises:
identifying the O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or
identifying the fist gesture from the M frames of images, where the fist gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or
identifying the V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the line connecting the feature points of each hand region in the M frames of images is V-shaped; or
identifying the spread gesture from the M frames of images, where the spread gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a palm-shaped curve; or
identifying the leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or
identifying the rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
3. The method according to claim 2, wherein determining and executing, based on the preset correspondence between operating gestures and control instructions, the first control instruction corresponding to the first operating gesture comprises:
based on the correspondence, determining and executing a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determining and executing a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determining and executing a confirmation control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirmation control instruction; or
based on the correspondence, determining and executing a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determining and executing a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determining and executing a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
4. The method according to claim 1, wherein identifying, from the M frames of ambient images, the first operating gesture of the user of the wearable device comprises:
taking i from 1 to M in turn, identifying an i-th hand region of the user in the i-th frame of ambient image among the M frames of ambient images, so that when i reaches M, M hand regions have been obtained;
taking j from 1 to M in turn, extracting a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, so that when j reaches M, M groups of at least three edge feature points have been obtained, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
determining that the variation between any two groups of at least three edge feature points among the M groups of at least three edge feature points is smaller than a preset variation value; and
determining the first operating gesture based on the shape of the line connecting the M groups of at least three edge feature points.
5. The method according to claim 4, wherein after taking j from 1 to M in turn and extracting the j-th group of at least three edge feature points of the j-th hand region among the M hand regions, the method further comprises:
determining that the variation between any two groups of at least three edge feature points among the M groups of at least three edge feature points is larger than the preset variation value; and
determining the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
6. A wearable device, comprising:
a housing;
a sensor, arranged in the housing and configured to obtain M frames of ambient images of the wearable device, where M is a positive integer; and
a processor, arranged in the housing and configured to identify, from the M frames of ambient images, a first operating gesture of a user of the wearable device, and to determine and execute, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
7. The wearable device according to claim 6, wherein the processor is specifically configured to:
identify an O-shaped gesture from the M frames of images, where the O-shaped gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a closed curve enclosing a closed area, and a first feature point corresponding to a first fingertip of each hand region is in contact with a second feature point corresponding to a second fingertip; or
identify a fist gesture from the M frames of images, where the fist gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a curve enclosing a closed area, and a first area corresponding to a first finger of each hand region is in contact with a second area corresponding to a second finger; or
identify a V-shaped gesture from the M frames of images, where the V-shaped gesture is a gesture in which at least a portion of the line connecting the feature points of each hand region in the M frames of images is V-shaped; or
identify a spread gesture from the M frames of images, where the spread gesture is a gesture in which the line connecting the feature points of each hand region in the M frames of images forms a palm-shaped curve; or
identify a leftward swing gesture from the M frames of images, where the leftward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from right to left; or
identify a rightward swing gesture from the M frames of images, where the rightward swing gesture is a gesture in which the direction of change of the M hand regions in the M frames of images is from left to right.
8. The wearable device according to claim 7, wherein the processor is specifically configured to:
based on the correspondence, determine and execute a Home-key control instruction corresponding to the O-shaped gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to desktop content; or
based on the correspondence, determine and execute a zoom-out control instruction corresponding to the fist gesture, to control the display content of the wearable device to be adjusted from a first size to a second size, the second size being smaller than the first size; or
based on the correspondence, determine and execute a confirmation control instruction corresponding to the V-shaped gesture, to control the wearable device to perform an operation corresponding to the confirmation control instruction; or
based on the correspondence, determine and execute a zoom-in control instruction corresponding to the spread gesture, to control the display content of the wearable device to be adjusted from a third size to a fourth size, the fourth size being larger than the third size; or
based on the correspondence, determine and execute a page-forward control instruction corresponding to the leftward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to first display content, the first display content being the previous page of the currently displayed content; or
based on the correspondence, determine and execute a page-backward control instruction corresponding to the rightward swing gesture, to control the display content of the wearable device to be adjusted from the currently displayed content to second display content, the second display content being the next page of the currently displayed content.
9. The wearable device according to claim 6, wherein the processor is specifically configured to:
take i from 1 to M in turn, identifying an i-th hand region of the user in the i-th frame of ambient image among the M frames of ambient images, so that when i reaches M, M hand regions have been obtained;
take j from 1 to M in turn, extracting a j-th group of at least three edge feature points of the j-th hand region among the M hand regions, so that when j reaches M, M groups of at least three edge feature points have been obtained, where each group of at least three edge feature points consists of feature points located at the edge portion of the corresponding hand region;
determine that the variation between any two groups of at least three edge feature points among the M groups of at least three edge feature points is smaller than a preset variation value; and
determine the first operating gesture based on the shape of the line connecting the M groups of at least three edge feature points.
10. The wearable device according to claim 9, wherein the processor is further configured to:
determine that the variation between any two groups of at least three edge feature points among the M groups of at least three edge feature points is larger than the preset variation value; and
determine the first operating gesture based on the direction of motion of the M groups of at least three edge feature points.
11. A wearable device, comprising:
a first acquisition unit, configured to obtain M frames of ambient images of the wearable device, where M is a positive integer;
a first recognition unit, configured to identify, from the M frames of ambient images, a first operating gesture of a user of the wearable device; and
a first execution unit, configured to determine and execute, based on a preset correspondence between operating gestures and control instructions, a first control instruction corresponding to the first operating gesture.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610378110.0A CN107450717B (en) | 2016-05-31 | 2016-05-31 | Information processing method and wearable device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610378110.0A CN107450717B (en) | 2016-05-31 | 2016-05-31 | Information processing method and wearable device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107450717A (en) | 2017-12-08 |
CN107450717B CN107450717B (en) | 2021-05-18 |
Family
ID=60485911
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610378110.0A Active CN107450717B (en) | 2016-05-31 | 2016-05-31 | Information processing method and wearable device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107450717B (en) |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102609093A (en) * | 2012-02-16 | 2012-07-25 | 中国农业大学 | Method and device for controlling video playing by using gestures |
CN103324274A (en) * | 2012-03-22 | 2013-09-25 | 联想(北京)有限公司 | Method and device for man-machine interaction |
CN103576840A (en) * | 2012-07-24 | 2014-02-12 | 上海辰戌信息科技有限公司 | Stereoscopic vision based gesture body-sense control system |
CN102880865A (en) * | 2012-09-28 | 2013-01-16 | 东南大学 | Dynamic gesture recognition method based on complexion and morphological characteristics |
CN103869954A (en) * | 2012-12-17 | 2014-06-18 | 联想(北京)有限公司 | Processing method as well as processing device and electronic device |
CN103902202A (en) * | 2012-12-24 | 2014-07-02 | 联想(北京)有限公司 | Processing method and electronic device |
US8744645B1 (en) * | 2013-02-26 | 2014-06-03 | Honda Motor Co., Ltd. | System and method for incorporating gesture and voice recognition into a single system |
CN103955274A (en) * | 2014-04-21 | 2014-07-30 | 小米科技有限责任公司 | Application control method and device |
CN104407694A (en) * | 2014-10-29 | 2015-03-11 | 山东大学 | Man-machine interaction method and device combining human face and gesture control |
CN104881122A (en) * | 2015-05-29 | 2015-09-02 | 深圳奥比中光科技有限公司 | Somatosensory interactive system activation method and somatosensory interactive method and system |
CN105302294A (en) * | 2015-09-07 | 2016-02-03 | 哈尔滨市一舍科技有限公司 | Interactive virtual reality presentation device |
Non-Patent Citations (1)
Title |
---|
TAN Jiapu et al.: "Fingertip detection and gesture recognition method based on Kinect", Journal of Computer Applications (《计算机应用》) *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107976183A (en) * | 2017-12-18 | 2018-05-01 | 北京师范大学珠海分校 | A kind of spatial data measuring method and device |
CN108668010A (en) * | 2018-03-30 | 2018-10-16 | 努比亚技术有限公司 | Method for controlling mobile terminal, bracelet and computer readable storage medium |
CN109814717A (en) * | 2019-01-29 | 2019-05-28 | 珠海格力电器股份有限公司 | Household equipment control method and device, control equipment and readable storage medium |
CN109814717B (en) * | 2019-01-29 | 2020-12-25 | 珠海格力电器股份有限公司 | Household equipment control method and device, control equipment and readable storage medium |
CN109917922A (en) * | 2019-03-28 | 2019-06-21 | 更藏多杰 | A kind of exchange method and wearable interactive device |
CN112947825A (en) * | 2021-01-28 | 2021-06-11 | 维沃移动通信有限公司 | Display control method, display control device, electronic device, and medium |
CN112947825B (en) * | 2021-01-28 | 2024-09-17 | 维沃移动通信有限公司 | Display control method, display control device, electronic equipment and medium |
Also Published As
Publication number | Publication date |
---|---|
CN107450717B (en) | 2021-05-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11861070B2 (en) | Hand gestures for animating and controlling virtual and graphical elements | |
US20220326781A1 (en) | Bimanual interactions between mapped hand regions for controlling virtual and graphical elements | |
US10394334B2 (en) | Gesture-based control system | |
CN104331168B (en) | Display adjusting method and electronic equipment | |
EP3090331B1 (en) | Systems with techniques for user interface control | |
Lv et al. | Touch-less interactive augmented reality game on vision-based wearable device | |
CN116724285A (en) | Micro-gestures for controlling virtual and graphical elements | |
CN107992188B (en) | Virtual reality interaction method, device and system | |
JP2019008351A (en) | Information processing apparatus, information processing method and recording medium | |
CN107450717A (en) | A kind of information processing method and Wearable | |
CN109634415B (en) | It is a kind of for controlling the gesture identification control method of analog quantity | |
CN104407694A (en) | Man-machine interaction method and device combining human face and gesture control | |
EP2577426A1 (en) | Information processing apparatus and method and program | |
US11009949B1 (en) | Segmented force sensors for wearable devices | |
CN108027655A (en) | Information processing system, information processing equipment, control method and program | |
CN110288715B (en) | Virtual necklace try-on method and device, electronic equipment and storage medium | |
CN108829239A (en) | Control method, device and the terminal of terminal | |
Hartanto et al. | Real time hand gesture movements tracking and recognizing system | |
US11500453B2 (en) | Information processing apparatus | |
Halarnkar et al. | Gesture recognition technology: A review | |
US11841920B1 (en) | Machine learning based gesture recognition | |
CN114327031A (en) | AR (augmented reality) glasses interaction method and device, computer-readable storage medium and AR glasses | |
Varma et al. | Gestural interaction with three-dimensional interfaces; current research and recommendations | |
Feng et al. | An HCI paradigm fusing flexible object selection and AOM-based animation | |
Cruz Bautista et al. | Hand features extractor using hand contour–a case study |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||