CN103365401A - Gesture control method and gesture control device - Google Patents

Gesture control method and gesture control device

Info

Publication number
CN103365401A
Authority
CN
China
Prior art keywords
gesture
user
planar
display screen
control
Prior art date
Legal status
Granted
Application number
CN2012100880089A
Other languages
Chinese (zh)
Other versions
CN103365401B (en)
Inventor
郭彦麟
Current Assignee
Acer Inc
Original Assignee
Acer Inc
Priority date
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN201210088008.9A priority Critical patent/CN103365401B/en
Publication of CN103365401A publication Critical patent/CN103365401A/en
Application granted granted Critical
Publication of CN103365401B publication Critical patent/CN103365401B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention provides a gesture control method and a gesture control device. The gesture control method is applicable to an electronic device having a display screen. The method first detects a first gesture operation performed by a user in the three-dimensional space in front of the display screen, so as to define an operation plane. Next, a ratio of the user's palm to the range covered by the operation plane is calculated. The operation plane is then divided into a plurality of operation areas according to this ratio, and the display plane of the display screen is divided into a plurality of corresponding display areas. Finally, a second gesture operation performed by the user on the operation plane is detected, and an object shown in the corresponding display area is controlled according to the operation area in which the second gesture operation takes place.

Description

Gesture control method and device
Technical field
The invention relates to a gesture control method and device, and more particularly to a gesture control method and device in which gesture control is performed in three-dimensional space.
Background
Consumers now expect electronic devices to be light, thin, short, and small, which places considerable restrictions on the volume and weight of such devices. Touch screens have recently become increasingly common: for portable devices such as smart phones and tablet computers, a touch screen serves simultaneously as the display and the input interface of the device, saving the cost and space of a conventional keyboard, while for desktop or notebook computers the consumer may choose between a conventional keyboard and touch-screen input.
However, existing touch screens, whether capacitive or resistive, require an additional layer of touch glass or film in front of the liquid crystal display (LCD) panel, which increases the thickness and weight of the display screen. As a result, when a notebook computer is open, touching the display screen with a finger easily causes the notebook computer to rock. Moreover, the display screen of a notebook computer sits at a fixed distance and angle from the user and, unlike a tablet computer, cannot be moved freely, so this device form is not well suited to direct finger touch. In addition, directly touching the display screen with a finger leaves fingerprint residue on the screen.
Summary of the invention
In view of this, the invention provides a gesture control method and device that control an object shown on a display screen according to gesture operations performed by the user in the three-dimensional space in front of the display screen.
The invention proposes a gesture control method applicable to an electronic device having a display screen. The method first detects a first gesture operation performed by a user in the three-dimensional space in front of the display screen, so as to define an operation plane. Next, a ratio of the user's palm to the range covered by the operation plane is calculated. The operation plane is then divided into a plurality of operation areas according to this ratio, and the display plane of the display screen is divided into a plurality of corresponding display areas. Finally, a second gesture operation performed by the user on the operation plane is detected, and an object shown in the corresponding display area is controlled according to the operation area in which the second gesture operation takes place.
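The overall flow can be pictured with the following minimal sketch in Python. It is a conceptual outline only, not the patented implementation; the helper objects sensor and display and their methods (detect_first_gesture, measure_palm, detect_second_gesture, split, locate, control) are hypothetical names introduced purely for illustration.

```python
# Conceptual sketch of the claimed flow; all helper APIs are hypothetical.
def gesture_control_loop(sensor, display):
    # Step 1: a first gesture (waving both hands) defines the operation plane.
    plane = sensor.detect_first_gesture()

    # Step 2: ratio of the user's palm to the plane it covers.
    palm_length, palm_width = sensor.measure_palm()
    cols = round(plane.length / palm_length)  # e.g. a 1:8 ratio gives 8 columns
    rows = round(plane.width / palm_width)    # e.g. a 1:3 ratio gives 3 rows

    # Step 3: divide both planes into matching grids of areas.
    operation_areas = plane.split(rows, cols)
    display_areas = display.split(rows, cols)

    # Step 4: route each second gesture to the display area whose index matches
    # the operation area in which the gesture takes place.
    while True:
        gesture = sensor.detect_second_gesture()
        index = operation_areas.locate(gesture.position)
        display_areas[index].control(gesture.kind)
```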
In one embodiment of the invention, the step of controlling the object in the corresponding display area according to the operation area in which the second gesture operation takes place further comprises determining a gesture kind of the second gesture operation and performing a control operation corresponding to that gesture kind on the object.
In one embodiment of the invention, after the step of detecting the second gesture operation performed by the user on the operation plane, the method further comprises displaying a gesture icon at a corresponding screen position on the display screen according to the operation position of the second gesture operation on the operation plane.
In one embodiment of the invention, the step of dividing the operation plane into a plurality of operation areas according to the ratio and dividing the display plane of the display screen into a plurality of corresponding display areas further comprises displaying the object in one or more of the display areas according to the ratio, and defining in each display area a sensing region for sensing the second gesture operation.
In one embodiment of the invention, the step of controlling the object shown in the corresponding display area according to the operation area in which the second gesture operation takes place further comprises determining, according to the ratio, whether a gesture area corresponding to the second gesture operation in the display area covers the sensing region, so as to decide whether to control the object shown in the display area according to the second gesture operation.
In one embodiment of the invention, the step of detecting the first gesture operation performed by the user in the three-dimensional space in front of the display screen so as to define the operation plane comprises detecting a viewing distance between the user and the display screen and an extension range of the user's hands, and accordingly determining a range and position of the operation plane suitable for the user to perform gesture operations.
In one embodiment of the invention, after the step of detecting the second gesture operation performed by the user on the operation plane, the method further comprises detecting a distance between the second gesture operation and the operation plane and comparing it with a preset value. If the distance is greater than the preset value, prompt information is displayed on the display screen to prompt the user to adjust the position of the second gesture operation so that it is suitable for performing gesture operations on the operation plane.
In one embodiment of the invention, the step of detecting the second gesture operation performed by the user on the operation plane further comprises detecting whether the second gesture operation moves to a lower edge of the operation plane. If so, the control mode of the electronic device is switched from a gesture control mode to a key-press input mode.
In one embodiment of the invention, after the step of switching the control mode of the electronic device from the gesture control mode to the key-press input mode, the method further comprises detecting a key-press operation of the user on a physical keyboard, or projecting a virtual keyboard and detecting a key-press operation of the user on the virtual keyboard, so as to perform key-press input.
The invention further proposes a gesture control device comprising a detection module, an operation plane definition module, a gesture operation mapping module, and a control module. The detection module detects the gesture operations performed by the user in front of the display screen. The operation plane definition module defines an operation plane according to the first gesture operation detected by the detection module. The gesture operation mapping module calculates a ratio of the user's palm to the range covered by the operation plane, divides the operation plane into a plurality of operation areas according to the ratio, and divides the display plane of the display screen into a plurality of corresponding display areas. The control module determines the operation area on the operation plane in which the second gesture operation detected by the detection module takes place, and accordingly controls an object shown in the corresponding display area.
In one embodiment of the invention, the detection module further detects a distance between the second gesture operation and the operation plane. The gesture control device further comprises a prompt module for comparing the distance detected by the detection module with a preset value; if the distance is greater than the preset value, the prompt module displays prompt information on the display screen to prompt the user to adjust the position of the second gesture operation so that it is suitable for performing gesture operations on the operation plane.
In one embodiment of the invention, the gesture control device further comprises a keyboard input module for detecting, in the key-press input mode, a key-press operation of the user on a physical keyboard or on a projected virtual keyboard, so as to perform the corresponding key-press input.
Based on the above, the gesture control method and device provided by the invention control the object shown in the corresponding display area by detecting the gesture operations performed by the user in the three-dimensional space in front of the display screen, thereby avoiding direct finger contact with the display screen, reducing the rocking of a notebook computer, and avoiding the problem of fingerprint residue on the display screen.
In order to make the above features and advantages of the invention more comprehensible, embodiments are described in detail below in cooperation with the accompanying drawings.
Description of the drawings
Fig. 1 is a block diagram of a gesture control device according to an embodiment of the invention;
Fig. 2 is a flowchart of a gesture control method according to an embodiment of the invention;
Fig. 3(a) and Fig. 3(b) are schematic diagrams of an application scenario of the gesture control method according to an embodiment of the invention;
Fig. 4 is a schematic diagram of the proportional division of the operation plane and the display plane according to an embodiment of the invention;
Fig. 5 is an enlarged schematic diagram of the display plane of Fig. 4;
Fig. 6 is a schematic diagram of gesture kinds according to an embodiment of the invention;
Fig. 7 is a block diagram of a gesture control device according to another embodiment of the invention;
Fig. 8 is a flowchart of a gesture control method according to another embodiment of the invention;
Fig. 9 is a schematic diagram of a gesture icon according to another embodiment of the invention.
Description of reference numerals:
10: notebook computer;
100, 700: gesture control device;
110: detection module;
120: operation plane definition module;
130: gesture operation mapping module;
140: control module;
102: display plane;
102a~102x: display areas;
20: user;
30: operation plane;
30a~30x: operation areas;
610~670: gestures;
750: prompt module;
760: keyboard input module;
A, GP1, GP2: positions;
GP1', GP2': display positions;
GA: gesture area;
I1~I8: objects;
Z: sensing region;
d1~d8: lengths;
g1, g2: gaps;
S210~S240: steps of the gesture control method of an embodiment;
S810~S880: steps of the gesture control method of another embodiment.
Embodiment
In order to allow the user to control input without directly touching the display screen of an electronic device with a finger, the invention configures a gesture control device in the electronic device to detect the gesture operations performed by the user in the three-dimensional space in front of the display screen and, by detecting the position and kind of each gesture operation, to control the corresponding object shown on the display screen. To make the content of the invention clearer, embodiments are given below as examples by which the invention can actually be implemented.
Fig. 1 is a block diagram of a gesture control device according to an embodiment of the invention. Referring to Fig. 1, the gesture control device 100 of the present embodiment is applicable to an electronic device having a display screen, such as a desktop computer or a notebook computer, without limitation. The gesture control device 100 comprises a detection module 110, an operation plane definition module 120, a gesture operation mapping module 130, and a control module 140, whose functions are described below.
The detection module 110 is, for example, an image sensor with a charge coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) element, which captures images of the space in front of the display screen so as to detect the position and kind of the gesture operations performed by the user in the three-dimensional space in front of the display screen.
The operation plane definition module 120 is coupled to the detection module 110 and analyzes the images captured by the image sensor to obtain the viewing distance between the user and the display screen and the extension range of the user's hands, and accordingly determines a range and position of the operation plane suitable for the user to perform gesture operations. In one embodiment, the operation plane definition module 120 may further comprise a projection unit that projects a virtual operation plane between the user and the display screen, so that the user can conveniently perform gesture operations on this virtual operation plane.
The gesture operation mapping module 130 is, for example, an arithmetic unit combined with logic circuit elements, which calculates the ratio of the user's palm to the operation plane defined by the operation plane definition module 120, divides the operation plane into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into a plurality of corresponding display areas.
The control module 140 is, for example, a processor with computing capability or a programmable controller, which determines, from the gesture operation position detected by the detection module 110, the operation area of the operation plane in which the user's hand is located, and accordingly controls the object shown in the corresponding display area.
Fig. 2 is a flowchart of a gesture control method according to an embodiment of the invention. Referring to Figs. 1 and 2, the method of the present embodiment is applicable to the gesture control device 100 of the above embodiment. The detailed steps of the gesture control method of the present embodiment are described below with reference to the modules of the gesture control device 100.
First, as described in step S210, the detection module 110 detects a first gesture operation performed by the user in the three-dimensional space in front of the display screen, so as to define an operation plane. Fig. 3(a) and Fig. 3(b) are schematic diagrams of an application scenario of the gesture control method according to an embodiment of the invention. Referring to Fig. 3(a), Fig. 3(b), and Fig. 1, the electronic device of the present embodiment is, for example, a notebook computer 10 in which the gesture control device 100 is configured, and the detection module 110 is, for example, an image sensor arranged at position A, which captures multiple images of the gesture operations performed by the user 20.
The first gesture operation referred to in this step is a simple up-and-down and left-and-right waving of both hands by the user, which is used to define a suitable operation plane. As shown in Fig. 3(a), the hands of the user 20 swing up and down; as shown in Fig. 3(b), the hands of the user 20 stretch to the left and right. The operation plane definition module 120 analyzes the images captured by the detection module 110 to obtain the viewing distance between the user 20 and the display screen (the viewing distance d1 of the present embodiment is about 600 millimeters) and the extension range of the hands of the user 20. In the present embodiment, the operation plane definition module 120 accordingly defines the width d2 of the operation plane 30 as 470 millimeters and the length d3 of the operation plane 30 as 770 millimeters, and places the operation plane 30 at a distance d4 of 400 millimeters from the display screen. The dimensions of the operation plane are related to the viewing distance and the extension range of the hands, and the size of the operation plane can be adjusted according to actual conditions; it is not limited herein.
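As a rough illustration of how an operation plane definition module of this kind might turn the measured quantities into plane geometry, consider the following Python sketch. The specific rules used here (the plane spans the measured hand extents and sits at two thirds of the viewing distance) are assumptions chosen only so that the output matches the figures of this embodiment; they are not rules stated by the patent.

```python
from dataclasses import dataclass

@dataclass
class OperationPlane:
    length_mm: float    # horizontal extent (d3)
    width_mm: float     # vertical extent (d2)
    distance_mm: float  # distance from the display screen (d4)

def define_operation_plane(viewing_distance_mm: float,
                           hand_extent_x_mm: float,
                           hand_extent_y_mm: float) -> OperationPlane:
    # Assumed rules: the plane spans the measured hand extents and is placed at
    # two thirds of the viewing distance (600 mm -> 400 mm in this embodiment).
    return OperationPlane(
        length_mm=hand_extent_x_mm,                  # e.g. 770 mm left-right stretch
        width_mm=hand_extent_y_mm,                   # e.g. 470 mm up-down swing
        distance_mm=viewing_distance_mm * 2.0 / 3.0,
    )

print(define_operation_plane(600, 770, 470))
```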
Next, in step S220, the gesture operation mapping module 130 calculates the ratio of the user's palm to the range covered by the operation plane. Fig. 4 is a schematic diagram of the proportional division of the operation plane and the display plane according to an embodiment of the invention. Referring to Fig. 4, suppose the length d5 of the palm of the user 20 is about 100 millimeters and the width d6 of the palm is about 170 millimeters. The ratio of the palm length d5 to the length d3 of the operation plane 30 is then about 1:8, and the ratio of the palm width d6 to the width d2 of the operation plane 30 is about 1:3.
After the ratio is obtained, step S230 can follow: the gesture operation mapping module 130 divides the operation plane 30 into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into a plurality of corresponding display areas. As shown in Fig. 4, the operation plane 30 is divided into 24 operation areas 30a~30x, and the display plane 102 of the notebook computer 10 is likewise divided into 24 display areas 102a~102x in the same proportion.
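A minimal Python sketch of this division step is given below: the palm-to-plane ratios determine the grid, and the same grid index addresses both an operation area and its corresponding display area. Rounding to whole cells and the specific function names are illustrative assumptions.

```python
# The embodiment's 1:8 and 1:3 ratios give an 8 x 3 grid of 24 areas.
def grid_from_ratios(plane_length_mm, plane_width_mm, palm_length_mm, palm_width_mm):
    cols = round(plane_length_mm / palm_length_mm)  # 770 / 100 -> 8 columns
    rows = round(plane_width_mm / palm_width_mm)    # 470 / 170 -> 3 rows
    return rows, cols

def area_index(x_mm, y_mm, plane_length_mm, plane_width_mm, rows, cols):
    # Clamp the position to the plane, then pick the cell; the same index
    # addresses the corresponding display area 102a~102x.
    col = min(int(x_mm / (plane_length_mm / cols)), cols - 1)
    row = min(int(y_mm / (plane_width_mm / rows)), rows - 1)
    return row * cols + col

rows, cols = grid_from_ratios(770, 470, 100, 170)   # -> (3, 8), i.e. 24 areas
print(area_index(500, 200, 770, 470, rows, cols))   # operation area of a sample point
```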
In addition, after the display areas 102a~102x of the display plane 102 have been divided, the gesture operation mapping module 130 displays one or more objects in the display areas 102a~102x of the display plane 102 according to the above ratio. The objects are, for example, application thumbnails or images, without limitation. Fig. 5 is an enlarged schematic diagram of the display plane of Fig. 4. In the present embodiment, the display plane 102 shows, for example, eight objects I1~I8; if every display area 102a~102x shows one object, at most 24 objects can be displayed.
Finally, in step S240, the detection module 110 detects a second gesture operation performed by the user on the operation plane, and the control module 140 determines the operation area in which the second gesture operation takes place so as to control the object shown in the corresponding display area. In detail, the gesture operation mapping module 130 first defines in each display area, according to the size of the display area, a sensing region for sensing the second gesture operation. The control module 140 then determines whether the gesture area corresponding to the second gesture operation in the display area covers the sensing region, so as to decide whether to control the object shown in the display area according to the second gesture operation.
Taking Figs. 4 and 5 as an example, suppose the display plane 102 shown in Fig. 5 measures 11.6 inches; its length d7 is then about 256 millimeters and its width d8 about 144 millimeters. Each display area 102a~102x therefore measures about 32 x 48 millimeters, the sensing region Z measures 24 x 36 millimeters, and each object I1~I8 measures 16 x 24 millimeters. It should be noted that a gap must be kept between the objects on the display plane 102, so as to avoid a misoperation of the device when the gesture area covers two objects at the same time. For instance, the gap g1 between object I2 and object I3 of the present embodiment is 8 millimeters, and the gap g2 between object I3 and object I4 is 12 millimeters.
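The per-area geometry of this example can be reproduced with the short sketch below; the 0.75 and 0.5 scale factors for the sensing region and the object are assumptions chosen only to reproduce the 24 x 36 mm and 16 x 24 mm figures quoted above.

```python
def area_geometry(display_length_mm=256, display_width_mm=144, cols=8, rows=3):
    area_w = display_length_mm / cols          # 32 mm per display area
    area_h = display_width_mm / rows           # 48 mm per display area
    sensing = (area_w * 0.75, area_h * 0.75)   # 24 x 36 mm sensing region Z (assumed factor)
    obj = (area_w * 0.5, area_h * 0.5)         # 16 x 24 mm object (assumed factor)
    return (area_w, area_h), sensing, obj

print(area_geometry())   # ((32.0, 48.0), (24.0, 36.0), (16.0, 24.0))
```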
In the present embodiment, the gesture area GA covers more than half of the sensing region Z, so the control module 140 may, for example, display the object I7 corresponding to this position in a highlighted state, thereby prompting the user that the position of the second gesture operation falls within the sensing range of object I7. Conversely, if the gesture area GA covers part of the sensing region Z but the covered proportion does not reach half, the control module 140 determines that the position of the second gesture operation does not fall within the sensing range of object I7, and object I7 is not displayed in the highlighted state.
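A small sketch of this "covers at least half of the sensing region" test follows. Treating the gesture area and the sensing region as axis-aligned rectangles is an assumption; the patent does not specify their shapes.

```python
def overlap_area(a, b):
    # a, b: (x_min, y_min, x_max, y_max) rectangles in display-plane millimeters.
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def covers_sensing_region(gesture_area, sensing_region, threshold=0.5):
    region_area = ((sensing_region[2] - sensing_region[0]) *
                   (sensing_region[3] - sensing_region[1]))
    return overlap_area(gesture_area, sensing_region) >= threshold * region_area

# A 24 x 36 mm sensing region Z centred in a 32 x 48 mm display area:
Z = (4, 6, 28, 42)
GA = (8, 10, 40, 60)                 # gesture area reported by the detection module
print(covers_sensing_region(GA, Z))  # True: the object would be shown highlighted
```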
After determining whether the position of the second gesture operation falls within the sensing range of an object, the control module 140 further determines the gesture kind of the second gesture operation and performs the control operation corresponding to that gesture kind on the object. The control operations include zooming in, zooming out, selecting, moving, and so on. Fig. 6 is a schematic diagram of gesture kinds according to an embodiment of the invention. Referring to Fig. 6, gestures 610~670 each represent a different control operation; for instance, gesture 640 represents the control operation "select object" and gesture 670 represents the control operation "confirm execution". The correspondence between gesture kinds and control operations can be set in advance by the user and is not limited herein.
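Since the correspondence between gesture kinds and control operations is user-configurable, it can be modeled as a simple lookup table, as in the sketch below. Apart from gesture 640 (select) and gesture 670 (confirm), which are named in this embodiment, the assignments and the perform method are illustrative assumptions.

```python
# Numeric labels follow the reference numerals of Fig. 6.
DEFAULT_GESTURE_MAP = {
    610: "zoom_in",   # assumed assignment
    620: "zoom_out",  # assumed assignment
    630: "move",      # assumed assignment
    640: "select",    # gesture 640: select object (per this embodiment)
    670: "confirm",   # gesture 670: confirm execution (per this embodiment)
}

def apply_gesture(obj, gesture_kind, gesture_map=DEFAULT_GESTURE_MAP):
    operation = gesture_map.get(gesture_kind)
    if operation is not None:
        obj.perform(operation)   # hypothetical object API
```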
It is worth mentioning that although the objects on the display plane 102 in the above embodiment are a plurality of different application thumbnails, the object shown on the display plane 102 may also be a single object, for example an image. If the display plane 102 shows only one image, the control module 140 can take the image portion located in the display area corresponding to the operation area of the second gesture operation as the object to be operated on, according to the gesture kind of the second gesture operation and the operation area in which it takes place. For instance, if the control module 140 determines that the gesture kind of the second gesture operation is a zoom-in gesture, the control module 140 zooms in on the whole image centered on the image portion in that display area.
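For this single-image case, a zoom-in gesture can be interpreted as magnifying the whole image about the centre of the display area in which the gesture falls. The sketch below returns the new viewport rectangle rather than resampling pixels, and the 2x factor is an illustrative assumption.

```python
def zoom_viewport(display_w, display_h, area_center_xy, factor=2.0):
    # Portion of the image (left, top, width, height) to stretch over the screen,
    # centred on the chosen display area and clamped to the image bounds.
    cx, cy = area_center_xy
    view_w, view_h = display_w / factor, display_h / factor
    left = min(max(cx - view_w / 2, 0), display_w - view_w)
    top = min(max(cy - view_h / 2, 0), display_h - view_h)
    return left, top, view_w, view_h

print(zoom_viewport(256, 144, (48, 24)))   # zoom centred near the top-left display area
```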
Thus, the gesture control method and device of the invention control the object shown in the corresponding display area by detecting the gesture operations performed by the user in the three-dimensional space in front of the display screen, thereby avoiding direct finger contact with the display screen and reducing the rocking of a notebook computer. Besides the gesture control mode, however, the user still often needs to use a keyboard for key-press input; for this situation the invention also provides a corresponding adjustment mechanism, which is elaborated below in another embodiment.
Fig. 7 is a block diagram of a gesture control device according to another embodiment of the invention. The gesture control device 700 of the present embodiment is likewise applicable to an electronic device having a display screen, such as a desktop computer or a notebook computer, without limitation.
In the present embodiment, in addition to the detection module 110, the operation plane definition module 120, the gesture operation mapping module 130, and the control module 140, the gesture control device 700 further comprises a prompt module 750 and a keyboard input module 760. The prompt module 750 can display prompt information on the display screen (not illustrated). The keyboard input module 760 detects the user's key-press operation and performs the corresponding key-press input when the control module 140 switches the control mode from the gesture control mode to the key-press input mode.
Fig. 8 is a flowchart of a gesture control method according to another embodiment of the invention. The operation of the gesture control device 700 is described below with reference to Fig. 8. Please refer to Figs. 7 and 8 together.
First, the detection module 110 detects a first gesture operation performed by the user in the three-dimensional space in front of the display screen, so as to define an operation plane (step S810). Then, the gesture operation mapping module 130 calculates the ratio of the user's palm to the range covered by the operation plane (step S820). After the ratio is obtained, the gesture operation mapping module 130 divides the operation plane into a plurality of operation areas according to this ratio, and divides the display plane of the display screen into a plurality of corresponding display areas (step S830). The details of steps S810~S830 are the same as or similar to steps S210~S230 of the previous embodiment and are not repeated here.
Next, the detection module 110 detects the distance between the second gesture operation and the operation plane. The prompt module 750 compares the distance detected by the detection module 110 with a preset value and determines whether this distance is greater than the preset value (step S840). If the distance is greater than the preset value, the prompt module 750 displays prompt information on the display screen to prompt the user to adjust the position of the second gesture operation so that it is suitable for performing gesture operations on the operation plane (step S850), and the flow returns to step S840, where the detection module 110 continues to detect the distance between the second gesture operation and the operation plane. The preset value is smaller than the distance between the operation plane and the display screen, and can be adjusted and set according to actual conditions by a person of ordinary skill in the art.
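A minimal sketch of the check in steps S840 and S850 is shown below. The 50 mm preset value is an illustrative assumption, since the embodiment only requires the preset value to be smaller than the distance between the operation plane and the display screen (400 mm in the earlier embodiment).

```python
PRESET_MM = 50.0   # assumed preset value

def check_gesture_distance(distance_to_plane_mm, show_prompt):
    if distance_to_plane_mm > PRESET_MM:
        # Step S850: prompt the user to move the hand back onto the operation
        # plane; the caller then keeps polling the distance (back to S840).
        show_prompt("Please move your hand closer to the operation plane.")
        return False
    return True

check_gesture_distance(120.0, print)   # too far: prints the prompt, returns False
```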
If the distance between the second gesture operation and the operation plane is not greater than the preset value, the control module 140 further determines, from the images detected by the detection module 110, whether the second gesture operation moves to the lower edge of the operation plane (step S860). If not, the control module 140 determines the operation position of the second gesture operation on the operation plane, displays a gesture icon at the corresponding screen position on the display screen, and controls the object in the corresponding display area (step S870). Fig. 9 is a schematic diagram of a gesture icon according to another embodiment of the invention. Referring to Figs. 1 and 9, in one embodiment, if the second gesture operation of the user is located at a first position GP1 on the operation plane 30, the control module 140 displays the corresponding gesture icon at a first display position GP1' on the display plane 102 according to the corresponding ratio. Likewise, if the second gesture operation of the user is located at a second position GP2 on the operation plane 30, the control module 140 displays the corresponding gesture icon at a second display position GP2' on the display plane 102 according to the corresponding ratio. In another embodiment, the gesture icon displayed at the position corresponding to the second gesture operation may also be a fixed mouse cursor or another pattern on the display plane 102, which is not limited herein.
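Mapping an operation-plane position such as GP1 or GP2 to the display position GP1' or GP2' at which the gesture icon is drawn can be done by proportional scaling between the two planes, as in the sketch below. Uniform linear scaling is an assumption; the embodiment only states that the mapping follows the corresponding ratio.

```python
def to_display_position(op_x_mm, op_y_mm,
                        plane_length_mm=770, plane_width_mm=470,
                        display_length_mm=256, display_width_mm=144):
    # Scale each axis by the ratio between the display plane and the operation plane.
    disp_x = op_x_mm * display_length_mm / plane_length_mm
    disp_y = op_y_mm * display_width_mm / plane_width_mm
    return disp_x, disp_y

print(to_display_position(385, 235))   # centre of the plane -> (128.0, 72.0), centre of the display
```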
After the gesture icon is displayed, the control module 140 can further determine the gesture kind of the second gesture operation and perform the control operation corresponding to that gesture kind.
Returning to step S860, if the control module 140 determines that the second gesture operation has indeed moved to the lower edge of the operation plane, the control module 140 switches the control mode of the electronic device from the gesture control mode to the key-press input mode (step S880). In one embodiment, the keyboard input module 760 detects, in the key-press input mode, the user's key-press operation on a physical keyboard of the electronic device. In another embodiment, the detection module 110 further projects a virtual keyboard, and the keyboard input module 760 detects the user's key-press operation on the projected virtual keyboard so as to perform the corresponding key-press input.
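The lower-edge test that triggers the switch from the gesture control mode to the key-press input mode (steps S860 and S880) can be sketched as follows; treating the lower edge as a thin band at the bottom of the operation plane (20 mm here) is an assumption.

```python
LOWER_EDGE_BAND_MM = 20.0   # assumed thickness of the "lower edge" band

def update_control_mode(gesture_y_mm, plane_width_mm, current_mode):
    # y is measured from the top of the operation plane; reaching the bottom
    # band switches the device into the key-press input mode (step S880).
    if current_mode == "gesture" and gesture_y_mm >= plane_width_mm - LOWER_EDGE_BAND_MM:
        return "key_press"
    return current_mode

print(update_control_mode(460, 470, "gesture"))   # hand at the lower edge -> "key_press"
```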
In summary, the invention controls the object shown in the corresponding display area by detecting the gesture operations performed by the user in the three-dimensional space in front of the display screen, thereby avoiding direct finger contact with the display screen, reducing the rocking of a notebook computer, and avoiding the problem of fingerprint residue on the display screen. In addition, the invention also provides a method for detecting the switch between the gesture control mode and the key-press input mode, so that the user can select a suitable control mode according to the actual operating situation.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the invention and not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments can still be modified, or some or all of the technical features therein can be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the invention.

Claims (18)

1. A gesture control method, applicable to an electronic device having a display screen, the method comprising the following steps:
detecting a first gesture operation performed by a user in a three-dimensional space in front of the display screen, so as to define an operation plane;
calculating a ratio of a palm of the user to a range covered by the operation plane;
dividing the operation plane into a plurality of operation areas according to the ratio, and dividing a display plane of the display screen into a plurality of display areas corresponding to the operation areas; and
detecting a second gesture operation performed by the user on the operation plane, and controlling an object shown in the corresponding display area according to the operation area in which the second gesture operation takes place.
2. The gesture control method according to claim 1, wherein the step of controlling the object in the corresponding display area according to the operation area in which the second gesture operation takes place further comprises:
determining a gesture kind of the second gesture operation; and
performing a control operation corresponding to the gesture kind on the object.
3. The gesture control method according to claim 1, further comprising, after the step of detecting the second gesture operation performed by the user on the operation plane:
displaying a gesture icon at a corresponding screen position on the display screen according to an operation position of the second gesture operation on the operation plane.
4. The gesture control method according to claim 1, wherein the step of dividing the operation plane into the operation areas according to the ratio and dividing the display plane of the display screen into the display areas corresponding to the operation areas further comprises:
displaying the object in one or more of the display areas according to the ratio, and defining in each of the display areas a sensing region for sensing the second gesture operation.
5. The gesture control method according to claim 4, wherein the step of controlling the object shown in the corresponding display area according to the operation area in which the second gesture operation takes place further comprises:
determining, according to the ratio, whether a gesture area corresponding to the second gesture operation in the display area covers the sensing region, so as to decide whether to control the object shown in the display area according to the second gesture operation.
6. The gesture control method according to claim 1, wherein the step of detecting the first gesture operation performed by the user in the three-dimensional space in front of the display screen so as to define the operation plane comprises:
detecting a viewing distance between the user and the display screen and an extension range of both hands of the user, and accordingly determining a range and position of the operation plane suitable for the user to perform gesture operations.
7. The gesture control method according to claim 1, wherein the step of detecting the second gesture operation performed by the user on the operation plane further comprises:
detecting a distance between the second gesture operation and the operation plane, and comparing the distance with a preset value; and
if the distance is greater than the preset value, displaying prompt information on the display screen to prompt the user to adjust a position of the second gesture operation so that it is suitable for performing gesture operations on the operation plane.
8. The gesture control method according to claim 1, wherein the step of detecting the second gesture operation performed by the user on the operation plane further comprises:
detecting whether the second gesture operation moves to a lower edge of the operation plane; and
if so, switching a control mode of the electronic device from a gesture control mode to a key-press input mode.
9. The gesture control method according to claim 8, further comprising, after the step of switching the control mode of the electronic device from the gesture control mode to the key-press input mode:
detecting a key-press operation of the user on a physical keyboard, or projecting a virtual keyboard and detecting a key-press operation of the user on the virtual keyboard, so as to perform key-press input.
10. A gesture control device, comprising:
a detection module, detecting gesture operations performed by a user in a three-dimensional space in front of a display screen;
an operation plane definition module, defining an operation plane according to a first gesture operation detected by the detection module;
a gesture operation mapping module, calculating a ratio of a palm of the user to a range covered by the operation plane, dividing the operation plane into a plurality of operation areas according to the ratio, and dividing a display plane of the display screen into a plurality of display areas corresponding to the operation areas; and
a control module, determining the operation area on the operation plane in which a second gesture operation detected by the detection module takes place, and accordingly controlling an object shown in the corresponding display area.
11. The gesture control device according to claim 10, wherein:
the control module further determines a gesture kind of the second gesture operation and performs a control operation corresponding to the gesture kind on the object.
12. The gesture control device according to claim 10, wherein:
the gesture operation mapping module further displays a gesture icon at a corresponding screen position on the display screen according to an operation position of the second gesture operation on the operation plane.
13. The gesture control device according to claim 10, wherein:
the gesture operation mapping module displays the object in one or more of the display areas according to the ratio, and defines in each of the display areas a sensing region for sensing the second gesture operation.
14. The gesture control device according to claim 13, wherein:
the control module determines, according to the ratio, whether a gesture area corresponding to the second gesture operation in the display area covers the sensing region, so as to decide whether to control the object shown in the display area according to the second gesture operation.
15. The gesture control device according to claim 10, wherein:
the detection module detects a viewing distance between the user and the display screen and an extension range of both hands of the user, and the operation plane definition module accordingly calculates a range and position of the operation plane suitable for the user to perform gesture operations.
16. The gesture control device according to claim 10, wherein the detection module further detects a distance between the second gesture operation and the operation plane, and the gesture control device further comprises:
a prompt module, comparing the distance detected by the detection module with a preset value, and, if the distance is greater than the preset value, displaying prompt information on the display screen to prompt the user to adjust a position of the second gesture operation so that it is suitable for performing gesture operations on the operation plane.
17. The gesture control device according to claim 10, wherein:
the control module further determines whether the second gesture operation moves to a lower edge of the operation plane, and, if so, switches a control mode from a gesture control mode to a key-press input mode.
18. The gesture control device according to claim 17, further comprising:
a keyboard input module, detecting, in the key-press input mode, a key-press operation of the user on a physical keyboard or on a projected virtual keyboard, so as to perform a corresponding key-press input.
CN201210088008.9A 2012-03-29 2012-03-29 Gestural control method and device Active CN103365401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210088008.9A CN103365401B (en) 2012-03-29 2012-03-29 Gestural control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210088008.9A CN103365401B (en) 2012-03-29 2012-03-29 Gestural control method and device

Publications (2)

Publication Number Publication Date
CN103365401A true CN103365401A (en) 2013-10-23
CN103365401B CN103365401B (en) 2016-08-10

Family

ID=49366932

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210088008.9A Active CN103365401B (en) 2012-03-29 2012-03-29 Gestural control method and device

Country Status (1)

Country Link
CN (1) CN103365401B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103713740A (en) * 2013-12-31 2014-04-09 华为技术有限公司 Wrist-wearing-type terminal device and display control method thereof
CN103759737A (en) * 2014-01-17 2014-04-30 深圳市凯立德欣软件技术有限公司 Gesture control method and navigation device
CN103941858B (en) * 2014-03-11 2017-01-25 何川丰 Electronic equipment display screen operation control system and method
CN103927118A (en) * 2014-04-15 2014-07-16 深圳市中兴移动通信有限公司 Mobile terminal and sliding control device and method thereof
CN104049772A (en) * 2014-05-30 2014-09-17 北京搜狗科技发展有限公司 Input method, device and system
CN104049772B (en) * 2014-05-30 2017-11-07 北京搜狗科技发展有限公司 Input method, device and system
CN105630134A (en) * 2014-10-27 2016-06-01 乐视致新电子科技(天津)有限公司 Operation event identification method and apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW561423B (en) * 2000-07-24 2003-11-11 Jestertek Inc Video-based image control system
TW200945174A (en) * 2008-04-14 2009-11-01 Pointgrab Ltd Vision based pointing device emulation
TW201019239A (en) * 2008-10-30 2010-05-16 Nokia Corp Method, apparatus and computer program product for providing adaptive gesture analysis
TW201129918A (en) * 2009-10-13 2011-09-01 Pointgrab Ltd Computer vision gesture based control of a device

Also Published As

Publication number Publication date
CN103365401B (en) 2016-08-10

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant