CN105302305A - Gesture control method and system - Google Patents
- Publication number: CN105302305A
- Application number: CN201510733994.2A
- Authority: CN (China)
- Prior art keywords
- gesture
- hand
- control interface
- interface
- setting space
- Prior art date: 2015-11-02
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a gesture control method. The method comprises the following steps: obtaining the size of a control interface; detecting the spatial position of a gesture within a set spatial range, and mapping the spatial position of the gesture to the position of an operation cursor in the control interface, wherein the set spatial range corresponds to the size of the control interface; and, during position mapping, if the gesture is detected to exceed the set spatial range, translating the set spatial range correspondingly by the amount by which the gesture exceeds it, and adjusting the mapping relationship between the control interface and the adjusted spatial range accordingly. The invention further relates to a gesture control system. With the method and the system, the user's gesture can always be kept within a range of a set size, so that the mapping between the control interface and the spatial range is simpler, and the gesture never moves out of the detectable range.
Description
Technical field
The present invention relates to the field of interactive control, and in particular to a gesture control method and a gesture control system.
Background art
With the development of image recognition technology, gesture interaction has become a convenient way to operate smart devices. When interacting with a control interface, it is sometimes necessary to position a screen cursor, which requires converting the spatial position of a gesture into coordinates on the control interface; only then can the cursor be positioned by gesture.
In traditional implementations, the user's gesture is generally confined to a fixed region, which makes it easy to detect the correspondence between positions in that region and the control interface. However, when the gesture falls outside the fixed region, detection fails, and a new fixed region must be activated before the user can resume operation at the new position, which is very inconvenient.
Summary of the invention
Based on this, it is necessary to provide a gesture control method that can track the position of the user's hand and allows convenient operation anywhere within the detectable region.
In addition, a gesture control system is also provided.
A gesture control method, which obtains the spatial position of a hand based on a depth camera, comprises the following steps:
obtaining the size of a control interface;
detecting whether the hand is within a preset spatial range, and if so, mapping the spatial position of the hand to the position of an operation cursor in the control interface;
during position mapping, if the hand is detected to exceed the set spatial range, translating the set spatial range correspondingly by the amount by which the hand exceeds it, and adjusting the mapping relationship between the control interface and the adjusted spatial range accordingly.
In one embodiment, the method further comprises:
initializing the reference position of the operation cursor in the control interface according to the initial position of the hand.
In one embodiment, the reference position of the operation cursor is the center of the control interface.
In one embodiment, the position at which an activation gesture is made is the initial position of the hand.
In one embodiment, the method further comprises:
detecting movement of the hand within the set spatial range, and converting movement of the hand in a first direction into a tap operation; wherein the first direction is from far from the control interface toward the control interface, and the speed of the hand in the first direction is greater than a set threshold.
A gesture control system comprises:
a mapping module, configured to obtain the size of a control interface and a set spatial range, detect whether a hand is within the preset spatial range, and if so, map the spatial position of the hand to the position of an operation cursor in the control interface; and
a mapping adjustment module, configured to, when it is detected that the hand exceeds the set spatial range, translate the set spatial range correspondingly by the amount by which the hand exceeds it, and adjust the mapping relationship between the control interface and the adjusted spatial range accordingly.
In one embodiment, the system further comprises an initialization module, configured to initialize the reference position of the operation cursor in the control interface according to the initial position of the hand.
In one embodiment, the initialization module places the reference position of the operation cursor at the center of the control interface.
In one embodiment, the initialization module starts working after an activation gesture and records the position at which the activation gesture was made as the initial position of the hand.
In one embodiment, the system further comprises a tap module, configured to detect movement of the hand within the set spatial range and convert movement of the hand in a first direction into a tap operation; wherein the first direction is from far from the control interface toward the control interface, and the speed of the hand in the first direction is greater than a set threshold.
With the above method and system, by translating the spatial range whenever its boundary is exceeded, the user's gesture can always be kept within a range of a set size, so that the mapping between the control interface and the spatial range is simpler, and the gesture never goes out of range and becomes undetectable.
Brief description of the drawings
Fig. 1 is a flowchart of a gesture control method according to an embodiment;
Fig. 2 is a schematic diagram of a set spatial range;
Fig. 3 is a schematic diagram of a set spatial range translated to the right;
Fig. 4a to 4f are schematic diagrams of a hand exceeding the set spatial range in different directions;
Fig. 5 is a flowchart of a gesture control method according to another embodiment;
Fig. 6 is a flowchart of a gesture control method according to yet another embodiment;
Fig. 7 is a module diagram of a gesture control system according to an embodiment.
Detailed description of the embodiments
The invention is further described below with reference to the drawings and embodiments.
Fig. 1 is a flowchart of a gesture control method according to an embodiment. The gesture control method comprises the following steps:
Step S101: obtain the size of the control interface. The method of this embodiment can be applied to remote-control interaction between a user and a smart TV, in which case the control interface is the display interface of the smart TV. The method can also be used to interact with other devices, such as projectors. The size of the control interface is generally expressed in display pixels (i.e., the image resolution), which is generally limited by the physical resolution of the display. Any position on the control interface can be represented by the position of a pixel in the image.
Obtaining the size of the control interface is, in effect, obtaining the pixel range of the control interface.
Step S102: detect the spatial position of the gesture within a set spatial range, and map the spatial position of the gesture to the position of the operation cursor in the control interface; the set spatial range corresponds to the size of the control interface.
The set spatial range is a three-dimensional region, usually chosen as a box-shaped region. Referring to Fig. 2, when the user's gesture moves within the set spatial range shown in Fig. 2, the motion of the gesture can be decomposed into components along the X, Y, and Z axes.
When mapping the spatial position of the gesture to the position of the operation cursor in the control interface, the Z-axis component can be ignored. The displacement of the gesture along the X axis maps to the horizontal displacement on the control interface, and the displacement along the Y axis maps to the vertical displacement.
That the set spatial range corresponds to the size of the control interface specifically means that the size of its XY cross-section corresponds to the size of the control interface; the aspect ratio of the XY cross-section can be made the same as that of the control interface. For example, if the current display resolution is 1024 × 768, the XY cross-section can be set to 1024 cm × 768 cm.
In this way, when the user's gesture moves 1 cm along the X axis within the set spatial range, the cursor moves one pixel horizontally on the control interface. It will be appreciated that the XY cross-section may have other similar sizes, with the mapping relationship between the control interface and the XY cross-section adjusted accordingly.
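For illustration only (this sketch is not part of the patent disclosure), the mapping of step S102 might look like the following, assuming a box-shaped set spatial range whose XY cross-section is measured in centimetres; all names, and the clamping behavior at the edges (described below for step S104), are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class SpaceRange:
    """Box-shaped set spatial range; (x0, y0) is one corner of the XY cross-section, in cm."""
    x0: float
    y0: float
    width: float   # X extent in cm, mapped to the interface width in pixels
    height: float  # Y extent in cm, mapped to the interface height in pixels

def map_hand_to_cursor(hx: float, hy: float, box: SpaceRange,
                       screen_w: int, screen_h: int) -> tuple[int, int]:
    """Map a hand position (hx, hy) in cm to an operation-cursor pixel.

    The Z component of the hand position is ignored for cursor mapping.
    The result is clamped so the cursor rests at the screen edge while
    the set spatial range is still being translated (see step S104).
    """
    px = (hx - box.x0) / box.width * screen_w
    py = (hy - box.y0) / box.height * screen_h
    px = min(max(px, 0), screen_w - 1)   # clamp horizontally
    py = min(max(py, 0), screen_h - 1)   # clamp vertically
    return int(px), int(py)
```

With a 1024 cm × 768 cm cross-section and a 1024 × 768 display, `map_hand_to_cursor` moves the cursor one pixel per centimetre of hand movement, matching the example above.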
Step S103: detect whether the gesture exceeds the set spatial range; if so, perform step S104; otherwise return to step S102.
Usually, the aforementioned set spatial range is generated with reference to an initial gesture position, for example as a box-shaped region centered on the initial gesture position. Once determined, the set spatial range does not change on its own, so the user's gesture may move outside it. When the gesture exceeds the set spatial range, the original correspondence between the control interface and the set spatial range can no longer position the operation cursor correctly, and further processing is required in step S104.
Step S104: translate the set spatial range correspondingly by the amount by which the gesture exceeds it, and adjust the mapping relationship between the control interface and the adjusted spatial range accordingly.
Referring to Fig. 3, the initial set spatial range is called the first spatial range 100, and the spatial range after the translation adjustment is called the second spatial range 200. Specifically, if the gesture exceeds a vertical boundary of the first spatial range 100 by a distance d, the first spatial range 100 is translated horizontally by the distance d to obtain the second spatial range 200, as shown in Fig. 3; similarly, if the gesture exceeds a horizontal boundary of the first spatial range 100 by a distance d, the first spatial range 100 is translated vertically by the distance d to obtain the second spatial range 200.
Since the second spatial range 200 is the first spatial range 100 shifted as a whole in space, the original mapping relationship between the control interface and the first spatial range 100 is no longer valid and must be adjusted to a mapping relationship with the second spatial range 200.
As Fig. 4 a ~ 4f, when operating, gesture may exceed setting space scope in all directions (forward and backward, upper and lower, left and right), now all readjusts setting space scope with reference to said method, makes this spatial dimension follow hand and move.Fig. 4 a is the off-limits situation of withdraw the arm backward; Fig. 4 b releases forward the off-limits situation of arm; Fig. 4 c is the off-limits situation of the hand that moves up; Fig. 4 d moves down the off-limits situation of hand; Fig. 4 e is the off-limits situation of the hand that moves right; Fig. 4 f is moved to the left the off-limits situation of hand.Be appreciated that the movement of non-above-mentioned standard moving direction all can be decomposed into above-mentioned standard and move, thus make this setting space scope follow hand all the time, hand is remained within the scope of this setting space.
When hand continues to move toward the outside edges of setting space, screen cursor rests on the edge of screen all the time.
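The follow-the-hand adjustment of step S104 could be sketched as below (again an illustration, not the patented implementation); each axis is handled independently, which is what lets arbitrary movements decompose into the standard directions of Figs. 4a to 4f:

```python
def follow_hand(box: SpaceRange, hx: float, hy: float) -> SpaceRange:
    """Translate `box` so the hand position (hx, hy) lies inside it again.

    If the hand exceeds a boundary by a distance d, the box is shifted by
    exactly d along that axis (step S104); otherwise the box is unchanged.
    """
    if hx < box.x0:                       # out past the left boundary
        box.x0 = hx
    elif hx > box.x0 + box.width:         # out past the right boundary
        box.x0 = hx - box.width
    if hy < box.y0:                       # out past the bottom boundary
        box.y0 = hy
    elif hy > box.y0 + box.height:        # out past the top boundary
        box.y0 = hy - box.height
    return box
```

Because `map_hand_to_cursor` expresses positions relative to the box origin, translating the box is exactly what re-establishes the mapping relationship with the second spatial range 200; while the hand keeps pushing past an edge, the clamping above holds the cursor at the screen edge.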
With the above method, the user's gesture can always be kept within a range of a set size, the mapping between the control interface and the spatial range is simpler, and the gesture never goes out of range and becomes undetectable.
Fig. 5 is a flowchart of a gesture control method according to another embodiment. The gesture control method comprises the following steps:
Step S201: detect an activation gesture.
Step S202: set the spatial range according to the position of the current gesture.
Step S203: initialize the reference position of the operation cursor in the control interface according to the position of the current gesture. The reference position of the operation cursor may be the center of the control interface.
Step S204: detect the spatial position of the gesture within the set spatial range, and map the spatial position of the gesture to the position of the operation cursor in the control interface; the set spatial range corresponds to the size of the control interface.
Step S205: detect whether the gesture exceeds the set spatial range; if so, perform step S206; otherwise return to step S204.
Step S206: translate the set spatial range correspondingly by the amount by which the gesture exceeds it, and adjust the mapping relationship between the control interface and the adjusted spatial range accordingly.
This embodiment differs from the previous one in that it includes a step of detecting and handling an activation gesture. The activation gesture is a specific gesture; once it is detected, the various defined gestures can subsequently be detected and used for interaction. The activation gesture activates gesture-related interaction processing in the interactive system; before activation, gesture interaction has no effect, which avoids unnecessary accidental operations.
In addition, after the activation gesture, the operation cursor is displayed on the control interface at a given reference position. The cursor displayed on the control interface can be used for selection, indication, and so on. The reference position of the operation cursor may be the center of the control interface, or another suitable position such as the upper-left corner.
Fig. 6 is a flowchart of a gesture control method according to yet another embodiment. The gesture control method comprises the following steps:
Step S301: obtain the size of the control interface.
Step S302: detect the spatial position of the gesture within the set spatial range, and map the spatial position of the gesture to the position of the operation cursor in the control interface; the set spatial range corresponds to the size of the control interface.
Step S303: detect whether the gesture exceeds the set spatial range; if so, perform step S304; otherwise return to step S302.
Step S304: translate the set spatial range correspondingly by the amount by which the gesture exceeds it, and adjust the mapping relationship between the control interface and the adjusted spatial range accordingly.
Step S305: detect movement of the gesture within the set spatial range, and convert movement of the gesture in a first direction into a tap operation; wherein the first direction is from far from the control interface toward the control interface, and the speed of the gesture in the first direction is greater than a set threshold.
This embodiment differs from the first embodiment in that, besides adjusting the spatial range to follow the gesture as needed, it also includes a step of processing a tap operation. A tap operation is similar to a mouse click or a touch on a touch-screen, and can be used to start a program, select a target, and so on.
When detecting and processing a tap, the movement of the gesture along the Z axis must be considered. Usually the user performs gesture operations facing the control interface, so the Z axis points from far from the control interface toward the control interface. It will be appreciated that other relationships between the Z axis and the control interface are also possible.
If the speed of the gesture along the Z axis is greater than the set threshold, the user is considered to have made a tap gesture. The speed threshold can be determined according to actual conditions: it should be large enough that the small forward-and-backward movements made while moving the cursor are ignored, yet not so large that taps become hard to detect.
Processing tap operations enriches the types of gesture operation; a tap is also an operation users are familiar with from other interaction modes, and is easy to learn and master. Using a speed threshold for detection also distinguishes cursor movement from taps well.
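For illustration only, Z-axis tap detection against a speed threshold could be sketched as follows; the threshold value, the sampling interval, and the convention that depth decreases as the hand approaches the interface are all assumptions, not values from the patent:

```python
TAP_SPEED_THRESHOLD = 30.0  # cm/s toward the interface; an assumed value

def detect_tap(prev_z: float, curr_z: float, dt: float,
               threshold: float = TAP_SPEED_THRESHOLD) -> bool:
    """Return True if the hand moved toward the interface fast enough.

    `prev_z` and `curr_z` are hand depths in cm from the depth camera at
    consecutive samples `dt` seconds apart (dt > 0); depth is assumed to
    decrease as the hand approaches the control interface.
    """
    speed_toward_interface = (prev_z - curr_z) / dt
    return speed_toward_interface > threshold
```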
It will be appreciated that steps S305 and S302 may be performed in either order.
Referring to Fig. 7, the gesture control system of an embodiment comprises an initialization module 310, a mapping module 320, a mapping adjustment module 330, and a tap module 340.
The initialization module 310 initializes the reference position of the operation cursor in the control interface according to the initial position of the gesture. The initialization module 310 may place the reference position of the operation cursor at the center of the control interface. The initialization module 310 starts working after the activation gesture, and records the position at which the activation gesture was made as the initial position of the gesture.
The mapping module 320 obtains the size of the control interface and the set spatial range, detects the spatial position of the gesture within the set spatial range, and maps the spatial position of the gesture to the position of the operation cursor in the control interface.
The mapping adjustment module 330, when it detects that the gesture exceeds the set spatial range, translates the set spatial range correspondingly by the amount by which the gesture exceeds it, and adjusts the mapping relationship between the control interface and the adjusted spatial range accordingly.
The tap module 340 detects movement of the gesture within the set spatial range and converts movement of the gesture in a first direction into a tap operation; wherein the first direction is from far from the control interface toward the control interface, and the speed of the gesture in the first direction is greater than the set threshold.
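For illustration only, the four modules of Fig. 7 could be wired together roughly as follows, reusing the sketches above; the class and method names are hypothetical:

```python
class GestureControlSystem:
    """A sketch of the module structure of Fig. 7, not the patented code."""

    def __init__(self, screen_w: int, screen_h: int,
                 box_w: float = 1024.0, box_h: float = 768.0):
        self.screen_w, self.screen_h = screen_w, screen_h
        self.box_w, self.box_h = box_w, box_h  # set spatial range size in cm
        self.box = None    # created by the initialization module
        self.prev = None   # previous (z, t) sample for tap detection

    def on_activation(self, hx: float, hy: float) -> tuple[int, int]:
        # Initialization module 310: box centered on the activation
        # position; cursor reference at the center of the interface.
        self.box = SpaceRange(hx - self.box_w / 2, hy - self.box_h / 2,
                              self.box_w, self.box_h)
        return self.screen_w // 2, self.screen_h // 2

    def on_sample(self, hx: float, hy: float, hz: float, t: float):
        # Mapping adjustment module 330: translate the box if the hand left it.
        self.box = follow_hand(self.box, hx, hy)
        # Mapping module 320: hand position -> operation-cursor pixel.
        cursor = map_hand_to_cursor(hx, hy, self.box,
                                    self.screen_w, self.screen_h)
        # Tap module 340: Z-axis speed against the set threshold.
        tapped = self.prev is not None and detect_tap(self.prev[0], hz,
                                                      t - self.prev[1])
        self.prev = (hz, t)
        return cursor, tapped
```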
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination involves no contradiction, it is considered to be within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their descriptions are relatively specific and detailed, but they should not be construed as limiting the scope of the patent. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the inventive concept, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent should be determined by the appended claims.
Claims (10)
1. A gesture control method, which obtains the spatial position of a hand based on a depth camera, comprising the following steps:
obtaining the size of a control interface;
detecting whether the hand is within a preset spatial range, and if so, mapping the spatial position of the hand to the position of an operation cursor in the control interface;
during position mapping, if the hand is detected to exceed the set spatial range, translating the set spatial range correspondingly by the amount by which the hand exceeds it, and adjusting the mapping relationship between the control interface and the adjusted spatial range accordingly.
2. The gesture control method according to claim 1, further comprising:
initializing the reference position of the operation cursor in the control interface according to the initial position of the hand.
3. The gesture control method according to claim 2, wherein the reference position of the operation cursor is the center of the control interface.
4. The gesture control method according to claim 2, wherein the position at which an activation gesture is made is the initial position of the hand.
5. The gesture control method according to claim 1, further comprising:
detecting movement of the hand within the set spatial range, and converting movement of the hand in a first direction into a tap operation; wherein the first direction is from far from the control interface toward the control interface, and the speed of the hand in the first direction is greater than a set threshold.
6. A gesture control system, comprising:
a mapping module, configured to obtain the size of a control interface and a set spatial range, detect whether a hand is within the preset spatial range, and if so, map the spatial position of the hand to the position of an operation cursor in the control interface; and
a mapping adjustment module, configured to, when it is detected that the hand exceeds the set spatial range, translate the set spatial range correspondingly by the amount by which the hand exceeds it, and adjust the mapping relationship between the control interface and the adjusted spatial range accordingly.
7. The gesture control system according to claim 6, further comprising an initialization module, configured to initialize the reference position of the operation cursor in the control interface according to the initial position of the hand.
8. The gesture control system according to claim 7, wherein the initialization module places the reference position of the operation cursor at the center of the control interface.
9. The gesture control system according to claim 7, wherein the initialization module starts working after an activation gesture and records the position at which the activation gesture was made as the initial position of the hand.
10. The gesture control system according to claim 6, further comprising a tap module, configured to detect movement of the hand within the set spatial range and convert movement of the hand in a first direction into a tap operation; wherein the first direction is from far from the control interface toward the control interface, and the speed of the hand in the first direction is greater than a set threshold.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510733994.2A | 2015-11-02 | 2015-11-02 | Gesture control method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105302305A (en) | 2016-02-03 |
Family
ID=55199657
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510733994.2A | Gesture control method and system | 2015-11-02 | 2015-11-02 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105302305A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102033702A (en) * | 2009-10-05 | 2011-04-27 | Hitachi Consumer Electronics Co., Ltd. | Image display device and display control method thereof |
US20140118252A1 (en) * | 2012-10-25 | 2014-05-01 | Min Ho Kim | Method of displaying cursor and system performing cursor display method |
CN104063040A (en) * | 2013-03-18 | 2014-09-24 | Vivo Mobile Communication Co., Ltd. | Intelligent control method and system for object detection and identification by front-facing camera in movable handheld equipment |
CN104808790A (en) * | 2015-04-08 | 2015-07-29 | Feng Shichang | Method of obtaining invisible transparent interface based on non-contact interaction |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019041238A1 (en) * | 2017-08-31 | 2019-03-07 | Huawei Technologies Co., Ltd. | Input method and intelligent terminal device |
CN110050249A (en) * | 2017-08-31 | 2019-07-23 | Huawei Technologies Co., Ltd. | Input method and intelligent terminal device |
CN110050249B (en) * | 2017-08-31 | 2020-08-25 | Huawei Technologies Co., Ltd. | Input method and intelligent terminal device |
US11429191B2 (en) | 2017-08-31 | 2022-08-30 | Huawei Technologies Co., Ltd. | Input method and smart terminal device |
CN107803024A (en) * | 2017-09-28 | 2018-03-16 | NetEase (Hangzhou) Network Co., Ltd. | A shooting calibration method and device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20160203 |