CN101989150A - Gesture recognition method and touch system using same - Google Patents
- Publication number: CN101989150A
- Application number: CN200910160654XA
- Authority
- CN
- China
- Prior art keywords
- touch
- image
- panel surface
- control system
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Position Input By Displaying (AREA)
Abstract
The invention discloses a gesture recognition method for a touch system, comprising the following steps: continuously capturing images across a panel surface with at least one image sensor; processing the images to determine a change in the contact state between a single pointer and the panel surface; and, if the change in the contact state exceeds a threshold, recognizing whether the relative movement between the single pointer and the panel surface matches a predetermined gesture. The invention also provides a touch system. Because the gesture recognition method and the touch system using the same can recognize gestures from a single pointer alone, the situation in which pointer coordinates cannot be calculated correctly because multiple pointers occlude one another is avoided.
Description
Technical field
The present invention relates to touch systems, and in particular to a gesture recognition method and a touch system using the same.
Background technology
Referring to Figs. 1a and 1b, the operation of a conventional touch system 9 is shown. The touch system 9 comprises a touch surface 90 and at least two cameras 91, 92 whose fields of view cover the entire touch surface 90 and which capture images across the touch surface 90. When a user 8 touches the touch surface 90 with a single finger 81, the cameras 91, 92 respectively capture image windows W91, W92 containing the shadow I81 cast by the tip of the finger 81. A processing unit then calculates, from the one-dimensional positions of the shadow I81 of the fingertip in the image windows W91, W92, the two-dimensional coordinates at which the finger 81 touches the touch surface 90. In this way the position and displacement of the finger 81 relative to the touch surface 90 can be obtained, and the processing unit controls a display to perform corresponding actions according to the change of the two-dimensional coordinates of the finger 81.
When the user 8 touches the touch surface 90 with two fingers 81 and 82 simultaneously, the image windows W91', W92' captured by the cameras 91, 92 respectively contain the shadows I81, I82 cast by the two fingers 81, 82. The processing unit calculates the two-dimensional coordinates of the two fingers 81, 82 from the one-dimensional positions of the shadows I81, I82 in the image windows W91', W92', and performs gesture recognition according to the changes of the coordinates of the two fingers 81, 82 relative to the touch surface 90.
However, the touch system 9 operates by calculating the two-dimensional coordinates of a finger on the touch surface 90 from the one-dimensional position of the fingertip shadow in each image window. When the user touches the touch surface 90 with a plurality of fingers (for example, the fingers 81, 82), the fingers may occlude one another with respect to the camera 92, so that the shadows of all fingertips do not necessarily appear in the image window W92' captured by the camera 92, as shown in Fig. 1b. The two-dimensional coordinates of each finger therefore may not be calculated correctly. Although this problem can be solved by installing additional cameras, doing so increases system cost.
In view of this, the present invention provides a gesture recognition method and a touch system using the same, so as to solve the above problems of conventional touch systems.
Summary of the invention
An object of the present invention is to provide a gesture recognition method and a touch system using the same, which can switch operating modes according to a change in the contact state between a single finger and a panel.
The present invention provides a gesture recognition method for a touch system, comprising the following steps: continuously capturing images across a panel surface with at least one image sensor; processing the images to determine a change in the contact state between a single pointer and the panel surface; and, if the change in the contact state exceeds a threshold, recognizing whether the relative movement between the single pointer and the panel surface matches a predetermined gesture.
The present invention proposes a kind of gesture discrimination method of touch-control system in addition, and this method comprises the following steps: to utilize the image of the continuous acquisition of at least one image sensor across panel surface; Handle this image, to detect the contact point of single indicant at described panel surface; And according to the state variation and the change in location of this contact point, whether the described single indicant of identification meets default gesture with contacting of panel surface.
The present invention further provides a touch system comprising a panel, at least one light source, at least one image sensor and a processing unit. The panel has a panel surface. The light source is disposed on the panel surface. The image sensor continuously captures, along the panel surface, image windows containing a shadow formed by a single pointer blocking the light source. The processing unit determines whether the change in width or area of the shadow in the image windows exceeds a threshold and, if so, recognizes whether the position change of the single pointer relative to the panel surface matches a predetermined gesture.
According to the gesture recognition method of the present invention and the touch system using the same, in a first mode the touch system controls the motion of a cursor according to the coordinate change (position change) of the pointer; in a second mode the touch system updates the display of an image display according to the coordinate change (position change) of the pointer, for example object selection, scrolling, object dragging, object zoom in/out or object rotation, wherein the objects include icons and windows.
In the gesture recognition method of the present invention and the touch system using the same, because gestures can be recognized from a single pointer alone, the situation in which pointer coordinates cannot be calculated correctly because multiple pointers occlude one another can be avoided.
Description of drawings
Fig. 1a is an operational schematic diagram of a conventional touch system;
Fig. 1b is another operational schematic diagram of the touch system of Fig. 1a;
Fig. 2a is a perspective view of a touch system according to an embodiment of the present invention;
Fig. 2b is a schematic diagram of part of the field of view of the image sensor of Fig. 2a and the captured image window;
Fig. 3 is a top view of the touch system of the first embodiment of the present invention;
Fig. 4a is an operational schematic diagram of the touch system of the first embodiment of the present invention;
Fig. 4b is a schematic diagram of the image window captured by the image sensor of Fig. 4a;
Fig. 5a is a perspective view of the touch system of the second embodiment of the present invention;
Fig. 5b is a schematic diagram of the image windows respectively captured by the two image sensors of Fig. 5a;
Figs. 6a-6c are operational schematic diagrams of the first mode of the touch system of the embodiment of the present invention;
Figs. 7a-7c are operational schematic diagrams of the second mode of the touch system of the embodiment of the present invention; and
Figs. 8a-8c are schematic diagrams of different gestures of the touch system of the embodiment of the present invention.
Description of main element symbols
10, 10' touch system; 100 panel
100a first side of panel; 100b second side of panel
100c third side of panel; 100d fourth side of panel
100d' fourth mirror image; 100s panel surface
11, 11' light-emitting unit; 11a reflecting surface
121 first light source; 121' second mirror image
122 second light source; 122' third mirror image
13, 13' image sensor; 14 processing unit
15 image display; 150 display screen
151 cursor; 20 image window
IS virtual-image space; RS real-image space
T81 contact point of pointer; T81' contact point of first mirror image
A81 first included angle; A81' second included angle
D1 distance between first side and third side; D2 distance between contact point and fourth side
R81 first sensing path; R81' second sensing path
I81, I82 shadows; I81' shadow
L, L' shadow width; BA bright zone
O, O' object; W13, W13' image windows
8 user; 81, 82 fingers
9 touch system; 90 touch surface
91, 92 cameras; W91, W92 image windows
W91', W92' image windows; VA field of view
Embodiment
In order that the above and other objects, features and advantages of the present invention may become more apparent, a detailed description is given below in conjunction with the accompanying drawings. It should be noted that, in the description of the present invention, identical elements are denoted by the same reference numerals.
Referring to Figs. 2a and 2b, Fig. 2a is a perspective view of a touch system 10 according to an embodiment of the present invention, and Fig. 2b is a schematic diagram of part of the field of view of the image sensor 13 in Fig. 2a and the captured image window 20. The touch system 10 comprises a panel 100, a light-emitting unit 11, a first light source 121, a second light source 122, an image sensor 13, a processing unit 14 and an image display 15.
The panel 100 comprises a first side 100a, a second side 100b, a third side 100c, a fourth side 100d and a panel surface 100s. The panel 100 may be, for example, a whiteboard or a touch screen. The panel surface 100s serves as the input area of the touch system 10.
In this embodiment, the light-emitting unit 11 is disposed on the panel surface 100s at the first side 100a of the panel 100. The light-emitting unit 11 may be an active light source or a passive light source. When the light-emitting unit 11 is an active light source, it is preferably a line light source. When the light-emitting unit 11 is a passive light source, it reflects the light emitted by other light sources (for example the first light source 121 and the second light source 122); in this case the light-emitting unit 11 comprises a reflecting surface 11a facing the third side 100c of the panel, and the reflecting surface 11a may be formed of any suitable material. The first light source 121 is disposed on the panel surface 100s at the second side 100b of the panel and preferably emits light toward the fourth side 100d. The second light source 122 is disposed on the panel surface 100s at the third side 100c of the panel and preferably emits light toward the first side 100a. The first light source 121 and the second light source 122 are preferably active light sources, for example line light sources, but are not limited thereto.
The image sensor 13 is preferably disposed at a corner of the panel 100; for example, in this embodiment, the image sensor 13 is disposed at the corner where the second light source 122 intersects the fourth side 100d of the panel, and the light-emitting unit 11 may be disposed on a side of the panel surface 100s not adjacent to the image sensor 13. The image sensor 13 captures, along the panel surface 100s, images across the panel surface 100s of the space defined by the light-emitting unit 11, the first light source 121, the second light source 122 and the fourth side 100d of the panel. When a pointer (for example, the finger 81) touches the panel surface 100s, the image of the tip of the finger 81 appears in the field of view of the image sensor 13, as shown in the upper drawing of Fig. 2b, where BA denotes a bright zone whose height is generally determined by the sizes of the light-emitting unit 11 and the light sources 121, 122. The image sensor 13 can therefore continuously capture image windows 20 containing the shadow I81 formed by the tip of the finger 81 blocking the light-emitting unit 11 or the light source 121, as shown in the lower drawing of Fig. 2b. The image sensor 13 may be, for example, a CCD image sensor or a CMOS image sensor, but is not limited thereto. It will be appreciated that the pointer may also be any other suitable object instead of a finger.
When the processing unit 14 detects that the change in the width or area of the pointer's shadow (which may become larger or smaller) exceeds a threshold, it controls the touch system 10 to operate in the second mode. The processing unit 14 then calculates the two-dimensional coordinates at which the pointer touches the panel surface 100s from the position of the pointer's shadow in the image window, performs gesture recognition according to the change of the two-dimensional coordinates obtained from successive image windows, and updates the display of the image display according to the recognized gesture, for example object selection, scrolling, object dragging, object zooming or object rotation, as detailed below. In addition, in the present invention the sensitivity of switching between the first mode and the second mode can be adjusted by dynamically adjusting the size of the threshold: the larger the threshold, the less sensitive the switching; the smaller the threshold, the more sensitive.
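The two-mode control described above can be summarized as a small per-frame decision: while in the first mode, a shadow-width change beyond the threshold switches the system into the second mode, after which the same coordinate changes are interpreted as gestures rather than cursor motion. The following is a minimal sketch under that reading; the function names, the (position, width) tuple representation and the ratio threshold are illustrative assumptions, not from the patent.

```python
def process_frame(shadow, prev_shadow, mode, threshold=1.5):
    """One step of the two-mode control loop (sketch).

    shadow / prev_shadow: hypothetical (position, width) pairs measured
    from the current and previous image windows.
    In the first mode the coordinate change drives the cursor; once the
    shadow width changes by more than `threshold` (as a ratio, so the
    shadow may grow or shrink), the system switches to the second mode,
    where the coordinate change is matched against preset gestures.
    """
    pos, width = shadow
    prev_pos, prev_width = prev_shadow
    if mode == "first" and max(width, prev_width) / min(width, prev_width) > threshold:
        mode = "second"                         # contact-state change detected
    delta = pos - prev_pos
    if mode == "first":
        action = ("move_cursor", delta)         # cursor follows the pointer
    else:
        action = ("recognize_gesture", delta)   # feed the gesture recognizer
    return mode, action
```

Raising `threshold` makes mode switching less sensitive, lowering it makes switching more sensitive, matching the tunable sensitivity described in the text.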
In Fig. 2a, for clarity of illustration, the panel 100 is shown separate from the image display 15, but this is not intended to limit the present invention. In other embodiments, the panel 100 may be integrated on the display screen 150 of the image display 15. Furthermore, when the panel 100 is a touch screen, the display screen 150 of the image display 15 may itself serve as the panel 100, with the light-emitting unit 11, the first light source 121, the second light source 122 and the image sensor 13 disposed on the surface of the display screen 150.
It should be understood that although in Fig. 2a the panel 100 is shown as a rectangle and the light-emitting unit 11, the first light source 121 and the second light source 122 are shown disposed orthogonally on three sides of the panel 100, this is merely one embodiment of the present invention and is not intended to limit it. In other embodiments, the panel 100 may have other shapes, and the light-emitting unit 11, the first light source 121, the second light source 122 and the image sensor 13 may be arranged on the panel surface 100s in other spatial relationships.
First embodiment
Referring to Fig. 3, a top view of the touch system 10 of the first embodiment of the present invention is shown. In this embodiment, the light-emitting unit 11 is a passive light source (for example a reflecting element) having a reflecting surface 11a facing the third side 100c of the panel. Thereby, the first light source 121 is mapped by the reflecting surface 11a to a second mirror image 121', the second light source 122 is mapped to a third mirror image 122', and the fourth side 100d of the panel is mapped to a fourth mirror image 100d'. The light-emitting unit 11, the first light source 121, the second light source 122 and the fourth side 100d of the panel jointly define a real-image space RS; the light-emitting unit 11, the second mirror image 121', the third mirror image 122' and the fourth mirror image 100d' jointly define a virtual-image space IS.
Referring to Figs. 4a and 4b, Fig. 4a is an operational schematic diagram of the touch system of the first embodiment of the present invention; Fig. 4b is a schematic diagram of the image window 20 captured by the image sensor 13 in Fig. 4a. As shown, when a pointer (for example, the tip of the finger 81) touches the panel surface 100s in the real-image space RS, its contact point is denoted T81; the first mirror image of the pointer, mapped by the reflecting surface 11a of the light-emitting unit 11 (a reflecting element in this embodiment), lies in the virtual-image space IS, and its contact point is denoted T81'. The image sensor 13 captures the image of the pointer tip along a first sensing path R81 to form a shadow I81 in the image window 20, and captures the image of the tip of the first mirror image along a second sensing path R81' to form a shadow I81' in the image window 20, as shown in Fig. 4b. In this embodiment, the processing unit 14 stores in advance the relation between the one-dimensional position of a shadow in the image window 20 and the angle between the corresponding sensing path and the third side 100c of the panel. Therefore, when the image sensor 13 captures the images of the pointer tip and its first mirror image to form the image window 20, the processing unit 14 obtains a first included angle A81 and a second included angle A81' from the one-dimensional positions of the shadows I81, I81' in the image window 20. The processing unit 14 then obtains the two-dimensional coordinates of the contact point T81 at which the pointer touches the panel surface 100s by trigonometry.
For example, in one embodiment the panel surface 100s defines a rectangular coordinate system in which the third side 100c serves as the X-axis, the fourth side 100d serves as the Y-axis, and the position of the image sensor 13 serves as the origin. The coordinates of the contact point T81 in this coordinate system are then expressed as (distance from the fourth side 100d, distance from the third side 100c). In addition, the processing unit 14 stores in advance the distance D1 between the first side 100a and the third side 100c of the panel. The processing unit then obtains the two-dimensional coordinates of the contact point T81 of the pointer 81 as follows: (a) the processing unit 14 obtains the first included angle A81 between the first sensing path R81 and the third side 100c of the panel, and the second included angle A81' between the second sensing path R81' and the third side 100c of the panel; (b) from the equation D2 = 2D1/(tanA81 + tanA81'), it obtains the distance D2 between the contact point T81 of the pointer 81 and the fourth side 100d of the panel; (c) from D2 × tanA81 it obtains the y coordinate of the contact point T81. The two-dimensional coordinates of the contact point T81 are therefore expressed as (D2, D2 × tanA81).
Second embodiment
Referring to Figs. 5a and 5b, Fig. 5a is a perspective view of the touch system 10' of the second embodiment of the present invention, and Fig. 5b is a schematic diagram of the image windows respectively captured by the image sensors 13, 13' in Fig. 5a. This embodiment differs from the first embodiment in that the light-emitting unit 11' is an active light source and the touch system 10' comprises two image sensors 13 and 13'.
In the second embodiment, the touch system 10' comprises the panel 100, the light-emitting unit 11', the first light source 121, the second light source 122, two image sensors 13, 13' and the processing unit 14. The light-emitting unit 11' is disposed on the panel surface 100s at the first side 100a of the panel and preferably emits light toward the third side 100c. The first light source 121 is disposed on the panel surface 100s at the second side 100b of the panel and preferably emits light toward the fourth side 100d. The second light source 122 is disposed on the panel surface 100s at the fourth side 100d of the panel and preferably emits light toward the second side 100b. The image sensor 13 is disposed at the intersection of the third side 100c and the fourth side 100d of the panel, with its field of view across the panel surface 100s. The image sensor 13' is disposed at the intersection of the second side 100b and the third side 100c of the panel, with its field of view across the panel surface 100s. When a pointer (for example, the finger 81) touches the panel surface 100s, the image sensor 13 captures an image window W13 containing the shadow I81 of the tip of the finger 81, and the image sensor 13' captures an image window W13' containing the shadow I81'. It will be appreciated that the touch system 10' may likewise comprise an image display coupled to the processing unit 14.
The processing unit 14 is coupled to the image sensors 13 and 13' and processes the images they capture to recognize changes in the width or area of the pointer's shadows I81, I81', thereby controlling the touch system 10' to operate in the first mode or the second mode. When the processing unit 14 detects that the pointer touches the panel surface 100s, it starts the touch system 10' in the first mode; the processing unit 14 then calculates the two-dimensional coordinates at which the pointer touches the panel surface 100s from the positions of the pointer's shadows I81, I81' in the image windows W13 and W13', and controls the motion of the cursor on the image display according to the change of the two-dimensional coordinates obtained from successive image windows W13 and W13'. When the processing unit 14 detects that the change in the width or area of the pointer's shadows I81, I81' exceeds a threshold, it controls the touch system 10' to operate in the second mode; the processing unit 14 then calculates the two-dimensional coordinates at which the pointer touches the panel surface 100s from the positions of the pointer's shadows in the image windows W13 and W13', performs gesture recognition according to the change of the two-dimensional coordinates obtained from successive image windows W13 and W13', and updates the display of the image display according to the recognized gesture, for example object selection, scrolling, object zooming, object dragging or object rotation. The two-dimensional coordinates can likewise be calculated by trigonometry, in a manner similar to that of the first embodiment, and the details are not repeated here.
The operation of the touch system of the embodiments of the present invention is described next. It should be noted that the following gesture recognition method applies to both the touch systems 10 and 10' of the first and second embodiments of the present invention.
Referring to Fig. 2a and Figs. 6a-6c, when the user touches the panel surface 100s with a pointer (for example, the finger 81), the image sensor 13 captures the shadow I81 of the tip of the finger 81 to form the image window 20, in which the width of the shadow I81 is, for example, L. After the processing unit 14 recognizes the touch, it starts the touch system 10 and controls the touch system 10 to enter the first mode. In the first mode, the processing unit 14 calculates the two-dimensional coordinates at which the finger 81 touches the panel surface 100s from the position of the shadow I81 in the image window 20, and controls the motion of the cursor 151 on the image display 15 according to the change of the two-dimensional coordinates, as shown in Fig. 6b.
When the panel 100 is a touch screen, the user may directly touch the panel surface 100s at the position of the object O with the finger 81 to start the touch system, as shown in Fig. 6c. The processing unit 14 likewise calculates the two-dimensional coordinates of the finger 81 relative to the panel surface 100s from the position of the shadow I81 in the image window 20.
Referring to Fig. 2a and Figs. 7a-7c, when the user changes the contact state of the finger 81 with the panel surface 100s, for example the contact area, the width and area of the shadow I81 in the image window 20 change; for example, in Fig. 7a the width of the shadow I81 in the image window 20 captured by the image sensor 13 increases to L'. When the processing unit 14 determines that the change in the shadow width exceeds a threshold, for example when L'/L or |L' - L| exceeds a preset value, it controls the touch system 10 to enter the second mode. Similarly, the change in the shadow area can be obtained from the absolute difference or the ratio of the areas in the two contact states. That is, the threshold may be a change ratio or a change value of the width or area of the shadow.
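The two alternative criteria (a ratio test L'/L or an absolute-difference test |L' - L|) can be written down directly. The sketch below is illustrative only; the function name and threshold values are assumptions, and the same test applies unchanged to shadow areas.

```python
def contact_change_exceeds(width_prev, width_now,
                           ratio_threshold=None, diff_threshold=None):
    """Return True when the shadow-width change exceeds the preset threshold.

    width_prev, width_now: shadow widths L and L' in two successive
    contact states.  Either criterion may be enabled:
      - ratio test (L'/L), made symmetric so the shadow may grow or shrink
      - absolute-difference test (|L' - L|)
    """
    if ratio_threshold is not None:
        if max(width_now, width_prev) / min(width_now, width_prev) > ratio_threshold:
            return True
    if diff_threshold is not None:
        if abs(width_now - width_prev) > diff_threshold:
            return True
    return False
```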
In the second mode, the processing unit 14 likewise calculates the two-dimensional coordinates of the finger 81 relative to the panel surface 100s from the position of the shadow I81 in the image window 20, and compares the change of the two-dimensional coordinates with preset gesture data stored in advance in the processing unit 14 to determine the gesture being performed. That is, in the second mode the coordinate changes obtained by the processing unit 14 are used not to control the motion of the cursor 151 but to determine the user's gesture, so as to perform a preset operation, for example object selection, scrolling, object dragging, object zooming or object rotation, but not limited thereto. In the present invention, the objects include icons and windows.
In the present invention, when switching the touch system 10 between the first mode and the second mode, the finger 81 may be moving or stationary relative to the panel surface 100s during the switch, and after changing the width or area of the shadow I81 the changed state is preferably maintained for at least a preset time, for example one second, but not limited thereto.
The relation between the user's gestures and the operational functions is described next with reference to Figs. 6a-8c. It should be understood that the following relations between gestures and functions are merely exemplary and are not intended to limit the present invention.
Object selection:
When the panel 100 is a whiteboard, the user first touches the panel surface 100s of the touch system with the pointer to start the touch system and control the touch system 10 to enter the first mode. The user then moves the cursor 151 to the object O to be selected by changing the relative position of the finger and the panel surface 100s, as shown in Fig. 6b. Next, the user changes the contact state of the finger 81 with the panel surface 100s, as shown in Fig. 7a, to enter the second mode; the object O' is then displayed with a characteristic change, for example a change of color or outline width, to indicate that it has been selected, as shown in Fig. 7b.
When the panel 100 is a touch screen, the user first touches the panel surface 100s above the object O to start the touch system 10, as shown in Fig. 6c. The user then changes the contact state of the finger 81 with the panel surface 100s so that the touch system 10 enters the second mode and the object O' is selected, as shown in Fig. 7c.
Scrolling:
The user first touches the panel surface 100s with the finger 81 to start the touch system 10 and control the touch system 10 to enter the first mode, for example as shown in Fig. 6a or Fig. 7a. The user then changes the contact state of the finger 81 with the panel surface 100s, for example from Fig. 6a to Fig. 7a or from Fig. 7a to Fig. 6a, and maintains the change for the preset time, so that the touch system 10 enters the second mode. Then, when the processing unit 14 detects that the finger 81 moves up and down relative to the panel surface 100s, as shown in Fig. 8a, it determines that the user is performing a scrolling gesture. The processing unit 14 controls the display screen 150 of the image display 15 to update the display accordingly.
Object dragging:
The user first touches the panel surface 100s with the finger 81 to start the touch system 10 and control the touch system 10 to enter the first mode, and then moves the cursor 151 to the object O to be selected by changing the relative position of the finger 81 and the panel surface 100s. The user then changes the contact state of the finger 81 with the panel surface 100s to enter the second mode, whereupon the object O' is shown as selected. Then, when the processing unit 14 detects that the finger 81 moves up and down relative to the panel surface 100s, as shown in Fig. 8a, it determines that the user is performing an object-dragging gesture. The processing unit 14 controls the display screen 150 of the image display 15 to update the display accordingly.
Object zooming:
The user first touches the panel surface 100s with the finger 81 to start the touch system 10 and control the touch system 10 to enter the first mode, and then moves the cursor 151 to the object O to be selected by changing the relative position of the finger 81 and the panel surface 100s. The user then changes the contact state of the finger 81 with the panel surface 100s to enter the second mode, whereupon the object O' is shown as selected. Then, when the processing unit 14 detects that the finger 81 moves obliquely relative to the panel surface 100s, as shown in Fig. 8b, it determines that the user is performing a zoom-in or zoom-out gesture. The processing unit 14 controls the display screen 150 of the image display 15 to update the display accordingly.
Object rotation:
The user first touches the panel surface 100s with the finger 81 to start the touch system 10 and control the touch system 10 to enter the first mode, and then moves the cursor 151 to the object O to be selected by changing the relative position of the finger 81 and the panel surface 100s. The user then changes the contact state of the finger 81 with the panel to enter the second mode, whereupon the object O' is shown as selected. Then, when the processing unit 14 detects that the finger 81 performs a rotating movement relative to the panel surface 100s, as shown in Fig. 8c, it determines that the user is performing an object-rotation gesture. The processing unit 14 controls the display screen 150 of the image display 15 to update the display accordingly.
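The gesture mappings above (up-and-down movement for scrolling/dragging per Fig. 8a, oblique movement for zooming per Fig. 8b, rotating movement for rotation per Fig. 8c) can be sketched as a simple classifier over the fingertip displacement. The angle bands and the separate rotation flag below are illustrative assumptions; the patent does not specify numeric criteria.

```python
import math

def classify_gesture(dx, dy, rotating=False):
    """Map a fingertip displacement to one of the example gestures (sketch).

    dx, dy  : displacement of the contact point between successive frames
    rotating: hypothetical flag set when the trajectory curves around a point
    Near-vertical movement -> scrolling/dragging (Fig. 8a),
    oblique movement       -> zoom in/out        (Fig. 8b),
    rotating trajectory    -> object rotation    (Fig. 8c).
    """
    if rotating:
        return "rotate"
    # angle of the motion: 0 degrees = horizontal, 90 degrees = vertical
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))
    if angle > 60:
        return "scroll_or_drag"
    if 20 <= angle <= 60:
        return "zoom"
    return "unknown"
```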
As mentioned above, when an existing touch system recognizes multiple indicators, the contact-point coordinates may not be correctly calculated because the indicators occlude one another. The present invention therefore proposes a touch system that can perform two operating modes (Fig. 2a, Fig. 3, and Fig. 5a) with only a single indicator. Moreover, the touch system of the present invention switches between operating modes simply by changing the contact state between the indicator and the panel surface, which also has the effect of reducing system cost.
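The mode switch just described — toggling between the cursor mode and the gesture mode whenever the change in the indicator's shadow width exceeds a threshold — can be sketched as follows. The per-frame width values, the threshold, and the function names are assumptions for illustration only:

```python
def detect_mode_switch(prev_width, curr_width, threshold=5.0):
    """Return True when the change in the pointer's shadow width between two
    captured images exceeds the threshold, i.e. the contact state changed
    enough to toggle the operating mode. Illustrative sketch."""
    return abs(curr_width - prev_width) > threshold

def select_mode(widths, threshold=5.0):
    """Walk a sequence of per-frame shadow widths, toggling between the
    first ("cursor") and second ("gesture") modes on each large change."""
    mode = "cursor"
    for prev, curr in zip(widths, widths[1:]):
        if detect_mode_switch(prev, curr, threshold):
            mode = "gesture" if mode == "cursor" else "cursor"
    return mode
```

With a steady shadow width, the system stays in cursor mode; one abrupt width change toggles it into gesture mode, and a second change toggles it back.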
Although the present invention has been disclosed by way of the above embodiments, these embodiments are not intended to limit the invention; any person skilled in the art of the invention may make various changes and modifications without departing from the spirit and scope of the invention. Therefore the protection scope of the invention shall be defined by the appended claims.
Claims (20)
1. A gesture recognition method for a touch system, the method comprising the following steps:
using at least one image sensor to continuously capture images across a panel surface;
processing the images to determine a change in the contact state between a single indicator and the panel surface; and
if the contact-state change is greater than a threshold, recognizing whether the relative change between the single indicator and the panel surface matches a preset gesture.
2. The gesture recognition method according to claim 1, wherein the contact-state change is determined according to a change in the width or area of the shadow cast by the single indicator in the images.
3. The gesture recognition method according to claim 1, wherein if the contact-state change is greater than the threshold and is maintained for a preset time after the change, it is then recognized whether the relative change between the single indicator and the panel surface matches a preset gesture.
4. The gesture recognition method according to claim 1, wherein the preset gesture is picture scrolling, object dragging, object zooming, or object rotation.
5. The gesture recognition method according to claim 1, further comprising the following step: if the single indicator is determined from the images to be in contact with the panel surface, starting operation of the touch system.
6. The gesture recognition method according to claim 1, further comprising the following step: if the contact-state change is less than the threshold, controlling the movement of a cursor on an image display according to the relative change between the single indicator and the panel surface.
7. The gesture recognition method according to claim 6, wherein the threshold can be dynamically adjusted.
8. A gesture recognition method for a touch system, the method comprising the following steps:
using at least one image sensor to continuously capture images across a panel surface;
processing the images to detect the contact point of a single indicator on the panel surface; and
recognizing, according to the state change and position change of the contact point, whether the contact between the single indicator and the panel surface matches a preset gesture.
9. The gesture recognition method according to claim 8, further comprising the following step: calculating the position change of the contact point according to the position of the shadow cast by the single indicator in the images.
10. The gesture recognition method according to claim 8, wherein the state change is determined according to a change in the width or area of the shadow cast by the single indicator in the images.
11. The gesture recognition method according to claim 10, wherein if the change in the width or area of the shadow is greater than a threshold, it is recognized, according to the position change of the contact point, whether the contact between the single indicator and the panel surface matches a preset gesture.
12. The gesture recognition method according to claim 8, wherein the preset gesture is picture scrolling, object dragging, object zooming, or object rotation.
13. The gesture recognition method according to claim 8, further comprising the following step: updating the display frame of an image display according to the recognized gesture.
14. A touch system, comprising:
a panel having a panel surface;
at least one light source disposed on the panel surface;
at least one image sensor continuously capturing, along the panel surface, image windows containing a shadow formed by a single indicator blocking the light source; and
a processing unit determining whether the change in the width or area of the shadow in the image windows is greater than a threshold and, if the width or area change is determined to be greater than the threshold, recognizing whether the position change between the single indicator and the panel surface matches a preset gesture.
15. The touch system according to claim 14, wherein the panel is a whiteboard or a touch screen.
16. The touch system according to claim 14, wherein the image sensor is disposed at a corner where two sides of the panel surface meet, and the touch system further comprises a reflecting element disposed on the sides of the panel surface not adjacent to the image sensor.
17. The touch system according to claim 16, wherein the image sensor continuously captures image windows containing two shadows formed by the single indicator blocking the light source and the reflecting element, and the processing unit calculates the position change between the single indicator and the panel surface according to the positions of the two shadows in the image windows.
18. The touch system according to claim 14, comprising two image sensors, each continuously capturing image windows containing a shadow formed by the single indicator blocking the light source, wherein the processing unit calculates the position change between the single indicator and the panel surface according to the positions of the shadows in the image windows.
19. The touch system according to claim 14, further comprising an image display coupled to the processing unit, wherein if the processing unit recognizes that the position change between the single indicator and the panel surface matches a preset gesture, it controls the image display to update its frame.
20. The touch system according to claim 14, wherein the preset gesture is picture scrolling, object dragging, object zooming, or object rotation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN200910160654XA CN101989150A (en) | 2009-07-29 | 2009-07-29 | Gesture recognition method and touch system using same |
Publications (1)
Publication Number | Publication Date |
---|---|
CN101989150A true CN101989150A (en) | 2011-03-23 |
Family
ID=43745720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN200910160654XA Pending CN101989150A (en) | 2009-07-29 | 2009-07-29 | Gesture recognition method and touch system using same |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101989150A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103105978A (en) * | 2012-09-28 | 2013-05-15 | 友达光电股份有限公司 | Optical touch control panel and brightness control method thereof |
CN103389793A (en) * | 2012-05-07 | 2013-11-13 | 深圳泰山在线科技有限公司 | Human-computer interaction method and human-computer interaction system |
CN104699362A (en) * | 2013-12-06 | 2015-06-10 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN105353933A (en) * | 2015-12-02 | 2016-02-24 | 北京海杭通讯科技有限公司 | Capacitive touch screen control method |
CN108132713A (en) * | 2012-12-19 | 2018-06-08 | 原相科技股份有限公司 | Switching device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1815428A (en) * | 2005-01-31 | 2006-08-09 | 株式会社东海理化电机制作所 | Touch input device |
CN101344828A (en) * | 2007-07-12 | 2009-01-14 | 索尼株式会社 | Input device, storage medium, information input method, and electronic apparatus |
- 2009-07-29: CN application CN200910160654XA filed (publication CN101989150A), status Pending
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103389793A (en) * | 2012-05-07 | 2013-11-13 | 深圳泰山在线科技有限公司 | Human-computer interaction method and human-computer interaction system |
CN103389793B (en) * | 2012-05-07 | 2016-09-21 | 深圳泰山在线科技有限公司 | Man-machine interaction method and system |
CN103105978A (en) * | 2012-09-28 | 2013-05-15 | 友达光电股份有限公司 | Optical touch control panel and brightness control method thereof |
CN103105978B (en) * | 2012-09-28 | 2016-08-10 | 友达光电股份有限公司 | Optical touch control panel and brightness control method thereof |
CN108132713A (en) * | 2012-12-19 | 2018-06-08 | 原相科技股份有限公司 | Switching device |
CN108132713B (en) * | 2012-12-19 | 2021-06-18 | 原相科技股份有限公司 | Switching device |
CN104699362A (en) * | 2013-12-06 | 2015-06-10 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN105353933A (en) * | 2015-12-02 | 2016-02-24 | 北京海杭通讯科技有限公司 | Capacitive touch screen control method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
TWI501121B (en) | Gesture recognition method and touch system incorporating the same | |
US8867791B2 (en) | Gesture recognition method and interactive system using the same | |
US10761610B2 (en) | Vehicle systems and methods for interaction detection | |
US9606618B2 (en) | Hand tracker for device with display | |
JP5412227B2 (en) | Video display device and display control method thereof | |
TWI483143B (en) | Hybrid pointing device | |
RU2541852C2 (en) | Device and method of controlling user interface based on movements | |
WO2013144599A2 (en) | Touch sensing systems | |
US8659577B2 (en) | Touch system and pointer coordinate detection method therefor | |
US10276133B2 (en) | Projector and display control method for displaying split images | |
CN103853321A (en) | Portable computer with pointing function and pointing system | |
CN102033656B (en) | Gesture identification method and interaction system using same | |
CN101989150A (en) | Gesture recognition method and touch system using same | |
JP2014026355A (en) | Image display device and image display method | |
KR20120136719A (en) | The method of pointing and controlling objects on screen at long range using 3d positions of eyes and hands | |
CN102033657B (en) | Touch system, method for sensing height of referent and method for sensing coordinates of referent | |
US9489077B2 (en) | Optical touch panel system, optical sensing module, and operation method thereof | |
TWI479363B (en) | Portable computer having pointing function and pointing system | |
WO2014181587A1 (en) | Portable terminal device | |
TWI444875B (en) | Multi-touch input apparatus and its interface method using data fusion of a single touch sensor pad and imaging sensor | |
KR20090037535A (en) | Method for processing input of touch screen | |
JP5118663B2 (en) | Information terminal equipment | |
TW201305853A (en) | Hybrid pointing device | |
CN102902419A (en) | Mixed type pointing device | |
TW201523395A (en) | Optical touch panel system, optical sensing module, and operation method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C02 | Deemed withdrawal of patent application after publication (patent law 2001) | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20110323 |