CN103019457A - Optical touch system - Google Patents

Optical touch system

Info

Publication number
CN103019457A
CN103019457A CN2011102843357A CN201110284335A
Authority
CN
China
Prior art keywords
image
pixel
control system
touch control
optical touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011102843357A
Other languages
Chinese (zh)
Inventor
苏宗敏
柯怡贤
林育佳
林志新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN2011102843357A priority Critical patent/CN103019457A/en
Publication of CN103019457A publication Critical patent/CN103019457A/en
Pending legal-status Critical Current

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention discloses an optical touch system comprising an image sensor and a processor. The image sensor detects at least one target. The processor is coupled to the image sensor and analyzes the number of pixel clusters generated by the at least one target in an image produced by the image sensor, generating gesture information when the number of pixel clusters is greater than a predetermined number of pixel clusters.

Description

Optical touch system
Technical field
The present invention relates to an optical touch system, and more particularly to a gesture detection method and an optical touch system using the gesture detection method.
Background
In an optical touch system, an image sensor captures an image of a target, the touch system analyzes the position of the target image within the captured image, and the coordinates of the target are then calculated from that position and from the geometry of the optical touch system.
U.S. Patent No. 4,782,328 discloses an optical touch screen system comprising two sensors and a processor coupled to the two sensors. The two sensors capture images of a target on the touch screen area. By analyzing the images produced by the two sensors, the processor determines the sensing paths that respectively link the target to the two sensors, and then calculates the location coordinates of the target from the sensing paths.
U.S. Patent No. 7,689,381 B2 discloses an optical touch screen system comprising a mirror, an image sensor, and a processor. The mirror is disposed at the periphery of the touch area and produces a mirror image of the target. The image sensor generates an image of the target and an image of the mirror image. The processor analyzes the sensing path through the image of the target and the sensing path through the image of the mirror image, and calculates the coordinates of the target from these sensing paths.
Multi-touch technology can identify two or more targets that are present on a touch surface at the same time, and multipoint awareness enables higher-level operations such as two-finger pinch-to-zoom or activating predefined programs.
Existing optical touch systems that support multi-touch calculate the coordinates of multiple targets at each predetermined time point and output the coordinates at the different time points to a multi-touch application. A multi-touch application predefines its multi-touch gestures; common gestures include single- or multi-finger pan (scroll), rotation, and two-finger zoom. Single-finger touch gestures generally include press, flick, select, and drag. In addition, gestures such as double-tap can be assigned to software-specific functions, for example launching a game.
Multi-touch applications are usually designed around an event-driven processing model: after the computing system captures a touch event, it calls the registered event handler, which produces the effect designed for that handler.
To perform a multi-touch operation, an existing multi-touch optical touch system must calculate the coordinates of multiple targets at different time points and then output those coordinates to the multi-touch application. Calculating and outputting the coordinates of multiple targets consumes considerable time and resources, which slows the response of the optical touch system.
Summary of the invention
To solve the above problems, the present invention provides an optical touch system.
One embodiment of the invention provides an optical touch system comprising an image sensor and a processor. The image sensor detects at least one target. The processor is coupled to the image sensor. The processor analyzes the number of pixel clusters produced by the at least one target in an image generated by the image sensor, and generates gesture information when the number of pixel clusters is greater than a predetermined number of pixel clusters.
Another embodiment of the invention provides an optical touch system comprising an image sensor and a processor. The image sensor detects at least one target. The processor is coupled to the image sensor. The processor analyzes the number of pixel clusters produced by the at least one target in an image, and compares the number of pixel clusters with a predetermined number of pixel clusters to determine whether to output a coordinate or gesture information.
Another embodiment of the invention provides an optical touch system comprising an image sensor and a processor. The image sensor detects at least one target and produces a plurality of images. The processor receives the plurality of images, identifies a plurality of pixel clusters in each image, and generates control information according to the change in the distance between the two farthest-apart pixel clusters in the plurality of images.
Another embodiment of the invention provides an optical touch system comprising an image sensor and a processor. The image sensor detects at least one target and produces a plurality of images. The processor receives the plurality of images and identifies at least one pixel cluster in each image. When the processor detects, in one of the plurality of images, a first pixel cluster and, while the first pixel cluster remains within a preset range, detects a second pixel cluster, the processor generates control information.
Another embodiment of the invention provides an optical touch system comprising an image sensor and a processor. The image sensor detects at least one target. The processor identifies a plurality of pixel clusters in a first image and a plurality of pixel clusters in a second image sequentially produced by the image sensor, the pixel clusters of the second image corresponding at least in part to the pixel clusters of the first image, wherein the number of pixel clusters in the second image is greater than or equal to the number of pixel clusters in the first image, and gesture information is generated when the size difference and/or position difference between each pixel cluster of the first image and the corresponding pixel cluster of the second image is less than a threshold value.
The foregoing outlines the technical features and advantages of the present invention so that the detailed description that follows may be better understood. Additional technical features and advantages that form the subject of the claims are described hereinafter. Those having ordinary skill in the art should appreciate that the concepts disclosed below may readily be used as a basis for modifying or designing other structures or processes that achieve the same purposes as the present invention. Those having ordinary skill in the art should also appreciate that such equivalent constructions do not depart from the spirit and scope of the invention as set forth in the appended claims.
Description of the drawings
Fig. 1 shows a schematic diagram of an optical touch system according to an embodiment of the invention;
Fig. 2A shows a schematic diagram of an image produced by the image sensor according to an embodiment of the invention;
Fig. 2B shows a brightness waveform according to an embodiment of the invention;
Fig. 2C shows a schematic diagram of a selected row of pixels in an image according to an embodiment of the invention;
Fig. 2D shows a schematic diagram of the distribution of selected pixels in an image according to an embodiment of the invention;
Fig. 3 shows a schematic diagram of two targets on the optical touch system according to an embodiment of the invention;
Fig. 4 shows a schematic diagram of the pixel clusters produced in an image by the two targets of Fig. 3;
Fig. 5 shows a schematic diagram of one target on the optical touch system according to an embodiment of the invention;
Fig. 6 shows a schematic diagram of two targets moving on the optical touch system according to an embodiment of the invention;
Fig. 7 shows a schematic diagram of the pixel clusters produced in an image by the two targets of Fig. 6;
Fig. 8 shows a schematic diagram of two targets moving on the optical touch system according to an embodiment of the invention;
Fig. 9 shows a schematic diagram of the pixel clusters produced in an image by the two targets of Fig. 8;
Fig. 10 shows a schematic diagram of two targets performing a rotation gesture on the optical touch system according to an embodiment of the invention;
Figs. 11 to 13 show schematic diagrams of the position changes of the pixel clusters produced in the images by the two targets of Fig. 10 performing the rotation gesture;
Fig. 14 shows a schematic diagram of an optical touch system according to another embodiment of the invention;
Fig. 15 shows a schematic diagram of the pixel clusters produced in an image by the two targets of Fig. 14;
Fig. 16 shows a schematic diagram of the pixel clusters after the two targets on the optical touch system of Fig. 14 perform a zoom-in gesture;
Fig. 17 shows a schematic diagram of the pixel clusters after the two targets on the optical touch system of Fig. 14 perform a rotation gesture;
Fig. 18 shows a schematic diagram of an optical touch system according to a further embodiment of the invention;
Fig. 19 shows a flow chart of a gesture detection method according to an embodiment of the invention;
Figs. 20 and 21 each show an image, the two images being obtained sequentially and each comprising a plurality of pixel clusters.
Wherein, the reference numerals are described as follows:
1: optical touch system
2: touch area
3, 4: targets
3', 4': mirror images
5, 6: optical touch systems
11: image sensor
12: processor
13: mirror element
14: image
15, 16: light projecting members
17, 18, 19, 20a, 20b: images
51, 52: image sensors
53: processor
54, 55, 56: images
141, 142, 171, 172, 173, 174: pixel clusters
141', 142': lower-brightness portions
143: pixels
181, 182: mirror elements
220: first image
230: second image
221, 222, 221', 222', 223, 541, 542: pixel clusters
1711, 1741: outer edges
1712, 1742: inner edges
d1, d2: position differences
L1, L2, L3, L4, L5, L6, L7, L8: distances
W1, W2, W1', W2': sizes
S201~S212: process steps
Detailed description of the embodiments
The optical touch system of one embodiment of the invention compares the number of pixel clusters that the targets produce in an image with a predetermined number of pixel clusters to determine whether a multi-touch operation is being performed. When a single target operates, the number of pixel clusters equals the predetermined number, and the optical touch system calculates the coordinates of the target from the captured image. When multiple targets operate, the number of pixel clusters is greater than the predetermined number, and the optical touch system outputs gesture information according to the change in position or in number of the pixel clusters produced across the captured images.
The optical touch system may comprise at least one image sensor for detecting at least one target. The optical touch system may also comprise a processor coupled to the image sensor. The processor analyzes the image produced by the image sensor, counts the pixel clusters in the image, compares the number of pixel clusters with the predetermined number of pixel clusters, and outputs gesture information when the number of pixel clusters is greater than the predetermined number.
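As an illustration only (the Python sketch below and its function and parameter names are not part of the patent), the comparison between the detected cluster count and the predetermined cluster count can be viewed as a routing step between coordinate calculation and gesture detection:

```python
def route_frame(pixel_clusters, predetermined_count, compute_coordinates, detect_gesture):
    """Decide whether a captured frame is handled as single-touch or as a gesture.

    pixel_clusters      -- clusters detected in the current image
    predetermined_count -- cluster count expected for a single target
    compute_coordinates -- callback used for the single-touch case
    detect_gesture      -- callback used for the multi-touch (gesture) case
    """
    if len(pixel_clusters) > predetermined_count:
        # More clusters than one target can explain: take the gesture path.
        return detect_gesture(pixel_clusters)
    # Otherwise treat the frame as a single target and output its coordinates.
    return compute_coordinates(pixel_clusters)

# Usage with placeholder callbacks: two clusters and a predetermined count of 1.
print(route_frame([(10, 14), (40, 44)], 1,
                  compute_coordinates=lambda c: ("coordinates", c),
                  detect_gesture=lambda c: ("gesture", len(c))))  # ('gesture', 2)
```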
The predetermined number of pixel clusters depends on the design of the optical touch system. In one embodiment, the predetermined number of pixel clusters is the number of pixel clusters that a single target produces in an image captured by each image sensor.
The following examples illustrate how the concept of the invention is applied to different optical touch systems.
Fig. 1 shows a schematic diagram of an optical touch system 1 according to an embodiment of the invention. Fig. 2A shows a schematic diagram of an image 14 produced by the image sensor according to an embodiment of the invention. Referring to Fig. 1 and Fig. 2A, the optical touch system 1 may comprise an image sensor 11, a processor 12, and a mirror element 13. The mirror element 13 is disposed beside the touch area 2 to produce a mirror image 3' of the target 3. The image sensor 11 may be disposed beside the touch area 2 with its light-receiving surface facing the touch area 2; it captures the target 3 on the touch area 2 and the mirror image 3' produced by the mirror element 13, and generates an image 14 comprising the pixel clusters 141 and 142 produced on the image 14 by the target 3 and the mirror image 3'. The processor 12 is coupled to the image sensor 11. The processor 12 analyzes the number of the pixel clusters 141 and 142 produced in the image 14 by the target 3 and the mirror image 3', and compares the number of pixel clusters with a predetermined number of pixel clusters to determine whether to calculate the coordinates of the target 3 or to output gesture information.
In this embodiment, the predetermined number of pixel clusters of the optical touch system 1 is 2; therefore, when an image captured by the optical touch system 1 contains 2 pixel clusters, the optical touch system 1 calculates the coordinates of the target from the captured image 14. The coordinates of the target may be calculated as in U.S. Patent No. 7,689,381 B2, although the invention is not limited to that method. In one embodiment, a single target 3 produces 2 pixel clusters in the image 14, so the predetermined number of pixel clusters is set to 2.
A pixel cluster 141 or 142 is a set of pixels whose brightness values are close to one another but different from the background brightness. The pixel clusters 141 and 142 may have any shape and are not limited to the shapes shown in Fig. 2A.
As shown in Figs. 2A to 2D, the processor 12 may use the brightness information of at least some of the pixels in the image 14 to generate a brightness waveform (Fig. 2B), and then identify the pixel clusters 141 and 142 from the brightness waveform.
If the pixel clusters 141 and 142 in the image 14 are produced by the target 3 blocking the light, corresponding lower-brightness portions 141' and 142' appear in the brightness waveform. Conversely, if the pixel clusters 141 and 142 are produced by light reflected from the target 3, corresponding higher-brightness portions appear in the brightness waveform.
In one embodiment, the processor 12 compares the brightness waveform with a predetermined threshold value or a brightness-change threshold value to find the lower-brightness or higher-brightness portions. In this way, the pixel clusters 141 and 142 can easily be identified.
There are many ways to generate the brightness waveform; several common methods are listed below, but the invention is not limited to these methods.
In one embodiment, the sum or the average of the pixel brightness values of each column of the image 14 is calculated to obtain a brightness waveform.
In one embodiment, as shown in Fig. 2C, the brightness waveform may be formed from the brightness values of the pixels of at least one selected row, or from the average or summed brightness values of the pixels of a plurality of rows.
In one embodiment, the brightness waveform may be formed from the brightness values of pixels 143 located in different rows, as shown in Fig. 2D.
In one embodiment, the brightness waveform may be obtained from pixels distributed over only part of the image 14 (for example, the darkest N or brightest N rows), or from the brightness values of the darkest N or brightest N pixels in each column.
In one embodiment, the target 3 comprises a finger.
Referring to Fig. 1, in one embodiment the optical touch system 1 may further comprise two light projecting members 15 and 16 that project light toward the touch area 2 to facilitate the detection of the target 3. The light projecting members 15 and 16 may be disposed beside the touch area 2.
Referring to Figs. 3 and 4, when two targets (3 and 4) touch the touch area 2, the mirror element 13 produces two mirror images (3' and 4'), and the image 17 captured by the image sensor 11 comprises the pixel clusters (171, 172, 173, and 174) produced by the targets (3 and 4) and the mirror images (3' and 4'). By analyzing the image 17, the processor 12 determines that the image 17 comprises 4 pixel clusters (171, 172, 173, and 174); after comparing the number of pixel clusters (171, 172, 173, and 174) with the predetermined number of pixel clusters (2 in this embodiment), the processor 12 analyzes the difference between the image 17 and the next image and outputs the corresponding gesture information.
In one embodiment, the next image is captured a predetermined sampling time after the image 17 is captured.
Referring to Figs. 3 to 5, after the two targets (3 and 4) have touched the touch area 2, the target 4 moves away from the touch area 2. After the target 4 leaves the touch area 2, the image sensor 11 captures an image comprising 2 pixel clusters, as shown in Fig. 2A. After comparing the image 17 of Fig. 4 with the next image comprising 2 pixel clusters, the processor 12 determines that the number of pixel clusters has decreased between the two images and outputs press-and-tap information. In one embodiment, the press-and-tap information corresponds to a right mouse button command.
In one embodiment, the processor 12 receives a plurality of images and identifies at least one pixel cluster in each image. When the processor 12 detects a first pixel cluster in an image and, while the first pixel cluster remains within a preset range, detects a second pixel cluster, the processor 12 generates control information. In one embodiment, press-and-tap information is output when the second pixel cluster disappears after having remained within a preset range.
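The press-and-tap condition described above can be sketched as follows; this is an illustrative reading only, the helper name detect_press_and_tap and the use of one-dimensional cluster centers are assumptions, and the preset range is a hypothetical movement tolerance for the pressing cluster:

```python
def detect_press_and_tap(frames, preset_range):
    """frames: list of per-image cluster-center lists, in capture order.
    preset_range: maximum movement allowed for the 'pressing' cluster."""
    if not frames or not frames[0]:
        return False
    anchor = frames[0][0]          # first pixel cluster acts as the pressed finger
    second_seen = False
    for centers in frames[1:]:
        # The pressing cluster must remain within the preset range in every frame.
        if not any(abs(c - anchor) <= preset_range for c in centers):
            return False
        others = [c for c in centers if abs(c - anchor) > preset_range]
        if others:
            second_seen = True     # a second cluster has appeared (the tap)
        elif second_seen:
            return True            # the second cluster has disappeared again
    return False

# Usage: the second cluster appears in the third frame and is gone in the fourth.
print(detect_press_and_tap([[10.0], [10.5], [10.2, 42.0], [10.4]], preset_range=2.0))  # True
```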
Referring to Figs. 3, 4, 6, and 7, after capturing the image 17, the processor 12 obtains the next image 18 (Fig. 7). The two targets (3 and 4) on the touch area 2 may move away from each other, as shown in Fig. 6, and the image 18 is captured by the image sensor 11 while the two targets (3 and 4) are separating. After obtaining the image 18, the processor 12 compares the number of pixel clusters (171, 172, 173, and 174) in the image 18 with the predetermined number of pixel clusters (2 in this embodiment). Because the number of pixel clusters (171, 172, 173, and 174) in the image 18 is greater than the predetermined number, the processor 12 calculates the distances L1 and L2 between the two farthest-apart pixel clusters (171 and 174) among the pixel clusters (171, 172, 173, and 174) in the image 17 and in the image 18, respectively, and compares the change between the two distances L1 and L2. In this embodiment, because the two targets (3 and 4) move apart, the distance increases, that is, the distance L2 is greater than the distance L1. Based on the analysis that the distance increases from L1 to L2, the processor 12 outputs zoom-in information. In one embodiment, the zoom-in information corresponds to a command such as the keyboard control key (Ctrl) combined with scrolling the mouse wheel upward.
The distance L1 or L2 may be the distance between representative points or edges of the two farthest-apart pixel clusters (171 and 174). In one embodiment, the distance L1 or L2 may be the distance between the outer edges (1711 and 1741) of the two farthest-apart pixel clusters (171 and 174). In one embodiment, the distance L1 or L2 may be the distance between the inner edges (1712 and 1742) of the two farthest-apart pixel clusters (171 and 174). In one embodiment, the distance L1 or L2 may be the distance between predetermined points on the outer edges (1711 and 1741) of the two farthest-apart pixel clusters (171 and 174), where the predetermined points comprise the end points or the midpoints of the outer edges (1711 and 1741). In one embodiment, the distance L1 or L2 may be the distance between predetermined points on the inner edges (1712 and 1742) of the two farthest-apart pixel clusters (171 and 174), where the predetermined points comprise the end points or the midpoints of the inner edges (1712 and 1742). In one embodiment, the distance L1 or L2 may be the distance between the center points of the two farthest-apart pixel clusters (171 and 174). In one embodiment, the distance L1 or L2 may be the distance between the centroids of the two farthest-apart pixel clusters (171 and 174).
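As a non-limiting illustration of one of the measurement choices listed above (cluster center points; outer edges, inner edges, edge midpoints, or centroids could be substituted), the following sketch computes the farthest-cluster distance of an image and compares it between two sequential images to decide between zoom-in and zoom-out; the cluster representation as (start, end) column ranges and all function names are assumptions:

```python
def cluster_center(cluster):
    start, end = cluster
    return (start + end) / 2.0              # representative point: the center column

def farthest_distance(clusters):
    """Distance between the two clusters that are farthest apart in one image."""
    centers = sorted(cluster_center(c) for c in clusters)
    return centers[-1] - centers[0]

def zoom_information(prev_clusters, next_clusters):
    """Compare the farthest-cluster distance of two sequential images."""
    l_prev = farthest_distance(prev_clusters)
    l_next = farthest_distance(next_clusters)
    if l_next > l_prev:
        return "zoom-in"                    # clusters moved apart
    if l_next < l_prev:
        return "zoom-out"                   # clusters moved closer together
    return None                             # no change in spread

# The four clusters of one image spread out in the next image: zoom-in.
print(zoom_information([(10, 14), (30, 34), (60, 64), (80, 84)],
                       [(5, 9), (28, 32), (62, 66), (90, 94)]))   # zoom-in
```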
Referring to Figs. 3, 4, 8, and 9, after capturing the image 17, the processor 12 obtains the next image 19 (Fig. 9). The two targets (3 and 4) on the touch area 2 may move toward each other, as shown in Fig. 8, and the image 19 is captured by the image sensor 11 while the two targets (3 and 4) are approaching each other. After obtaining the image 19, the processor 12 compares the number of pixel clusters (171, 172, 173, and 174) in the image 19 with the predetermined number of pixel clusters (2 in this embodiment). Because the number of pixel clusters (171, 172, 173, and 174) in the image 19 is greater than the predetermined number, the processor 12 calculates the distances L1 and L3 between the two farthest-apart pixel clusters (171 and 174) among the pixel clusters (171, 172, 173, and 174) in the image 17 and in the image 19, respectively, and compares the change between the two distances L1 and L3. In this embodiment, because the two targets (3 and 4) move closer together, the distance decreases, that is, the distance L3 is less than the distance L1. Based on the analysis that the distance decreases from L1 to L3, the processor 12 outputs zoom-out information. In one embodiment, the zoom-out information corresponds to a command such as the keyboard control key (Ctrl) combined with scrolling the mouse wheel downward.
Referring to Figs. 10 to 13, the two targets (3 and 4) on the touch area 2 may perform a rotation gesture. For example, the target 4 rotates clockwise or counterclockwise around the target 3, as shown in Fig. 10. In this embodiment, at the positions shown in Fig. 10, the two targets (3 and 4) respectively produce the corresponding pixel clusters 173 and 174 in the image 17, and the mirror images (3' and 4') respectively produce the corresponding pixel clusters 171 and 172 in the image 17. Because the position of the target 4 keeps changing while the position of the target 3 does not, the positions of the pixel clusters 171 and 173 do not change in successive images, whereas the pixel clusters 172 and 174 keep changing.
When the target 4 rotates around the target 3, the image sensor 11 sequentially captures the images 17 and 20a. Because the number of pixel clusters (171, 172, 173, and 174) in the images 17 and 20a is greater than the predetermined number of pixel clusters (2 in this embodiment), the processor 12 calculates the distances L1 and L4 between the two farthest-apart pixel clusters (171 and 174) among the pixel clusters (171, 172, 173, and 174) in the images 17 and 20a. When the distance L1 differs from the distance L4 and the position of at least one of the pixel clusters (171, 172, 173, and 174), for example the pixel cluster 171 or 173, does not change or remains within a preset range, the processor 12 outputs rotation information.
In another embodiment, the images 17, 20a, and 20b are produced sequentially, and the distances L1, L4, and L5 between the two farthest-apart pixel clusters (171 and 174) in the images 17, 20a, and 20b are calculated respectively. By comparing the distance L1 with the distance L4, the processor 12 finds that the distance between the two pixel clusters (171 and 174) increases; by comparing the distance L4 with the distance L5, the processor 12 finds that the distance between the two pixel clusters (171 and 174) decreases. When the processor 12 finds that the distance between the two pixel clusters (171 and 174) first increases and then decreases, or first decreases and then increases, it outputs rotation information.
In another embodiment, a plurality of images are produced sequentially, and the distance between the two farthest-apart pixel clusters in each image is calculated. When the processor 12 finds that the distance between the two pixel clusters increases and decreases periodically, it outputs rotation information.
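An illustrative sketch of the rotation test just described, under assumed conventions (clusters given as (start, end) ranges, distances measured between cluster centers, and a hypothetical tolerance for deciding that a cluster has stayed in place); it checks that the farthest-cluster distance first increases and then decreases, or vice versa, while at least one cluster remains essentially fixed:

```python
def _center(cluster):
    start, end = cluster
    return (start + end) / 2.0

def _spread(clusters):
    centers = sorted(_center(c) for c in clusters)
    return centers[-1] - centers[0]          # farthest-cluster distance

def has_fixed_cluster(frames, tolerance):
    """True if at least one cluster keeps roughly the same center in every frame."""
    for ref in (_center(c) for c in frames[0]):
        if all(any(abs(_center(c) - ref) <= tolerance for c in frame)
               for frame in frames[1:]):
            return True
    return False

def detect_rotation(frames, tolerance=1.5):
    """frames: per-image lists of (start, end) pixel clusters, in capture order."""
    if len(frames) < 3 or not has_fixed_cluster(frames, tolerance):
        return False
    distances = [_spread(f) for f in frames]
    grew_first = distances[1] > distances[0]
    # Rotation: the spread grows and then shrinks, or shrinks and then grows.
    return distances[2] < distances[1] if grew_first else distances[2] > distances[1]

# Target 4 swings around target 3: the spread grows, then shrinks; target 3 stays put.
frames = [[(10, 12), (40, 42)], [(10, 12), (50, 52)], [(10, 12), (44, 46)]]
print(detect_rotation(frames))  # True
```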
Fig. 14 shows a schematic diagram of an optical touch system 5 according to another embodiment of the invention. Referring to Fig. 14, the optical touch system 5 comprises two image sensors 51 and 52 and a processor 53, where the two image sensors 51 and 52 are coupled to the processor 53. The two image sensors 51 and 52 are disposed apart from each other beside the touch area 2, preferably at two corners of the touch area 2. The two image sensors 51 and 52 capture at least one target 3 or 4 on the touch area 2. The processor 53 calculates the coordinates of the target 3 or 4 from the images produced by the two image sensors 51 and 52, or outputs gesture information when it determines that the number of pixel clusters in an image is greater than the predetermined number of pixel clusters.
In one embodiment, the predetermined number of pixel clusters of the optical touch system 5 may be set to 1.
In one embodiment, in the optical touch system 5, the coordinates of a target 3 or 4 are calculated from a pixel cluster in an image produced by the image sensor 51 and a pixel cluster in another image produced by the image sensor 52; therefore the minimum number of pixel clusters required in the image produced by each image sensor 51 or 52 to calculate the coordinates of one target 3 or 4 is 1. The predetermined number of pixel clusters can therefore be 1.
Referring to Figs. 14 to 16, when two targets 3 and 4 are on the touch area 2, the image 54 captured by each image sensor 51 or 52 comprises two pixel clusters 541 and 542. Because the number of pixel clusters in the image 54 is greater than the predetermined number, the processor 53 compares the image 54 with the next captured image and generates the corresponding gesture information. In one embodiment, if only one pixel cluster appears in the next image, the processor 53, by comparing the image 54 with the next image, determines that one of the two targets 3 and 4 has left the touch area 2, and therefore outputs press-and-tap information.
Referring to Figs. 15 and 16, when the number of pixel clusters in the next image 55 is greater than the predetermined number, the processor 53 calculates the distances L6 and L7 between the two pixel clusters (541 and 542) in the images 54 and 55, respectively, and compares the change between the two distances L6 and L7. When the distance L6 is less than the distance L7, the processor 53 outputs zoom-in information; when the distance L6 is greater than the distance L7, the processor 53 outputs zoom-out information.
Referring to Figs. 15 and 17, when the number of pixel clusters in the next image 55 is greater than the predetermined number, the processor 53 calculates the distances L6 and L8 between the two pixel clusters (541 and 542) in the images 54 and 55, respectively, and compares the change between the two distances L6 and L8. The processor 53 also compares the position changes of the corresponding pixel clusters (541 and 542) in the images 54 and 55. When the distance L6 differs from the distance L8 and the position of one pixel cluster (541 or 542) does not change between the images 54 and 55, the processor 53 outputs rotation information.
Similarly, when the processor 53 analyzes a plurality of images and finds that the distance between the pixel clusters first increases and then decreases, or first decreases and then increases, it outputs rotation information.
As described above, the distance between pixel clusters may be the distance between representative points or edges of the pixel clusters.
Fig. 18 shows a schematic diagram of an optical touch system 6 according to a further embodiment of the invention. Referring to Fig. 18, the optical touch system 6 comprises an image sensor 11, a processor 12, and two mirror elements 181 and 182. The two mirror elements 181 and 182 are disposed beside the touch area 2, and may be disposed respectively on two adjacent sides of the touch area 2. When a target 3 is on the touch area 2, the two mirror elements 181 and 182 produce three mirror images 3' of the target 3, so the image captured by the image sensor 11 comprises 4 pixel clusters. In the optical touch system 6, the predetermined number of pixel clusters may be set to 4.
Similarly, when the processor 12 determines that the number of pixel clusters in an image is greater than the predetermined number of pixel clusters, it outputs the corresponding gesture information according to the next image. In one embodiment, when the number of pixel clusters in the next image decreases, press-and-tap information is output; when the distance between the two farthest-apart pixel clusters increases between the two images, zoom-in information is output; when the distance between the two farthest-apart pixel clusters decreases between the two images, the processor 12 outputs zoom-out information; and when the distance between the two farthest-apart pixel clusters differs between the two images while the position of at least one pixel cluster remains unchanged, rotation information is output. Similarly, when the processor 12 analyzes a plurality of images and finds that the distance between the pixel clusters first increases and then decreases, or first decreases and then increases, it outputs rotation information.
Fig. 19 shows a flow chart of a gesture detection method according to an embodiment of the invention. Referring to Fig. 19, in step S201, a sensor captures a first image. In step S202, the number of pixel clusters in the first image is calculated and the boundary positions of the pixel clusters are determined. In step S203, it is determined whether the number of pixel clusters in the first image is greater than a predetermined number of pixel clusters. In step S204, when the number of pixel clusters is not greater than the predetermined number, the coordinates of the target are calculated. In step S205, a first distance between the two farthest-apart pixel clusters in the first image is calculated. In step S206, a second image is captured and a second distance between the two farthest-apart pixel clusters in the second image is calculated. In step S207, it is determined whether the first distance and the second distance differ. In step S208, it is determined whether the number of pixel clusters in the second image equals the predetermined number. In step S209, when the number of pixel clusters in the second image equals the predetermined number, press-and-tap information is output. In step S210, it is determined whether the second distance is greater than the first distance. In step S211, when the second distance is greater than the first distance, zoom-in information is output. In step S212, when the second distance is less than the first distance, zoom-out information is output.
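The flow of Fig. 19 can be sketched in Python as follows; this is one reading of the flow chart, not the patent's reference implementation, and the image acquisition, cluster segmentation, and coordinate calculation are represented by placeholder callbacks while distances are taken between cluster centers:

```python
def gesture_flow(get_image, find_clusters, compute_coordinates, predetermined_count):
    """One pass through the gesture detection flow of Fig. 19 (S201-S212)."""
    center = lambda c: (c[0] + c[1]) / 2.0
    spread = lambda cs: max(center(c) for c in cs) - min(center(c) for c in cs)

    first = find_clusters(get_image())                 # S201-S202
    if len(first) <= predetermined_count:              # S203
        return compute_coordinates(first)              # S204: single-touch case
    d1 = spread(first)                                 # S205
    second = find_clusters(get_image())                # S206
    if len(second) == predetermined_count:             # S208
        return "press and tap"                         # S209
    d2 = spread(second)                                # S206
    if d2 == d1:                                       # S207: no change in spread
        return None
    return "zoom-in" if d2 > d1 else "zoom-out"        # S210-S212

# Usage with canned frames: four clusters that then spread apart (zoom-in).
frames = iter([[(10, 12), (30, 32), (60, 62), (80, 82)],
               [(5, 7), (28, 30), (62, 64), (90, 92)]])
print(gesture_flow(get_image=lambda: next(frames),
                   find_clusters=lambda image: image,   # frames already hold clusters
                   compute_coordinates=lambda cs: ("coordinates", cs),
                   predetermined_count=2))              # zoom-in
```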
Another embodiment of the invention determines whether to output gesture information from the features of the pixel clusters in sequentially captured images. Referring to Figs. 20 and 21, in this embodiment the image sensor of the optical touch system sequentially produces a first image 220 (Fig. 20) and a second image 230 (Fig. 21). The first image 220 may comprise pixel clusters 221 and 222. The second image 230 may comprise pixel clusters 221', 222', and 223. The pixel cluster 221 has a representative size W1, the pixel cluster 222 has a representative size W2, the pixel cluster 221' has a representative size W1', and the pixel cluster 222' has a representative size W2'. The processor of the optical touch system can identify the pixel clusters 221 and 222 of the first image 220 and the pixel clusters 221', 222', and 223 of the second image 230. The pixel cluster 221 of the first image 220 corresponds to the pixel cluster 221' of the second image 230, and the pixel cluster 222 of the first image 220 corresponds to the pixel cluster 222' of the second image 230; in other words, the pixel cluster 221 of the first image 220 and the pixel cluster 221' of the second image 230 are produced by the same target, and the pixel cluster 222 of the first image 220 and the pixel cluster 222' of the second image 230 are produced by the same target. When the processor determines that the number of pixel clusters in the second image 230 is greater than or equal to the number of pixel clusters in the first image 220 (for example, the second image 230 contains a pixel cluster 223 that has no counterpart in the first image 220), and the size difference (between W1 or W2 and W1' or W2') and/or the position difference (d1 or d2) between each pixel cluster 221 or 222 of the first image 220 and the corresponding pixel cluster 221' or 222' of the second image 230 is less than a threshold value, the processor generates gesture information. In particular, the position difference may be the difference between the edge positions, the centroid positions, the center positions, or other representative pixel positions of the corresponding pixel clusters (221 and 221', or 222 and 222').
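An illustrative sketch of the matching test of Figs. 20 and 21, assuming each pixel cluster is described by a (start, end) range so that its size is the range width and its position is the range center; the threshold values and function name are hypothetical:

```python
def matches_previous_frame(first, second, size_threshold=3.0, position_threshold=3.0):
    """Return True when the second image keeps (and possibly adds to) the clusters
    of the first image: it has at least as many clusters, and every cluster of the
    first image has a counterpart whose size and position differ by less than the
    thresholds. In that case the system may emit gesture information."""
    if len(second) < len(first):
        return False
    width = lambda c: c[1] - c[0]
    center = lambda c: (c[0] + c[1]) / 2.0
    for a in first:
        # Look for a corresponding cluster in the second image.
        if not any(abs(width(a) - width(b)) < size_threshold and
                   abs(center(a) - center(b)) < position_threshold for b in second):
            return False
    return True

# Clusters 221 and 222 reappear slightly shifted as 221' and 222', plus a new cluster 223.
first_image = [(10, 16), (40, 47)]
second_image = [(11, 17), (39, 46), (70, 75)]
print(matches_previous_frame(first_image, second_image))  # True
```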
The present invention discloses a gesture detection method and an optical touch system using the gesture detection method. The gesture detection method of the invention outputs gesture information directly from the number and the position changes of the pixel clusters in the images, without first calculating the coordinates of the targets and then analyzing the gesture from the coordinate changes; it therefore greatly reduces computation time and resources and speeds up the system response.
The technical content and technical features of the invention are disclosed above; however, those skilled in the art may still make substitutions and modifications based on the teachings and disclosure of the invention without departing from the spirit of the invention. Therefore, the scope of protection of the invention should not be limited to the disclosed embodiments, but should include various substitutions and modifications that do not depart from the invention, as covered by the following claims.

Claims (34)

1. An optical touch system, characterized by comprising:
an image sensor for detecting at least one target; and
a processor coupled to the image sensor, wherein the processor analyzes the number of pixel clusters produced by the at least one target in an image generated by the image sensor, and generates gesture information when the number of pixel clusters is greater than a predetermined number of pixel clusters.
2. The optical touch system according to claim 1, comprising two image sensors, wherein the predetermined number of pixel clusters is 1.
3. The optical touch system according to claim 1, characterized by further comprising a mirror element, wherein the mirror element produces a mirror image of the at least one target, and the predetermined number of pixel clusters is 2.
4. The optical touch system according to claim 1, characterized by further comprising two mirror elements, wherein the two mirror elements each produce a mirror image of the at least one target, and the predetermined number of pixel clusters is 4.
5. The optical touch system according to claim 1, characterized in that the predetermined number of pixel clusters is the number of pixel clusters that a single target produces in an image captured by the image sensor.
6. The optical touch system according to claim 1, characterized in that the image comprises a plurality of pixel clusters, and the processor calculates a distance between the two farthest-apart pixel clusters among the plurality of pixel clusters.
7. The optical touch system according to claim 6, characterized in that the image sensor captures two images, and the processor calculates the two distances between the two farthest-apart pixel clusters in the two images respectively and compares the change between the two distances.
8. The optical touch system according to claim 7, characterized in that the processor outputs zoom-in information when the change between the two distances is an increase.
9. The optical touch system according to claim 7, characterized in that the processor outputs zoom-out information when the change between the two distances is a decrease.
10. The optical touch system according to claim 7, characterized in that the processor outputs rotation information when the processor determines that the two distances differ and the position of at least one pixel cluster among the plurality of pixel clusters of the two images remains the same.
11. The optical touch system according to claim 1, characterized in that the image sensor captures two images, and the processor outputs press-and-tap information when the processor determines that the number of pixel clusters decreases between the two images.
12. The optical touch system according to claim 1, characterized in that the processor uses the brightness information of at least some of the pixels in the image to generate a brightness waveform, and then identifies the pixel clusters produced by the at least one target from the brightness waveform.
13. An optical touch system, characterized by comprising:
an image sensor for detecting at least one target; and
a processor coupled to the image sensor, wherein the processor analyzes the number of pixel clusters produced by the at least one target in an image and compares the number of pixel clusters with a predetermined number of pixel clusters to determine whether to output a coordinate or gesture information.
14. The optical touch system according to claim 13, characterized by comprising two image sensors, wherein the predetermined number of pixel clusters is 1.
15. The optical touch system according to claim 13, characterized by further comprising a mirror element, wherein the mirror element produces a mirror image of the at least one target, and the predetermined number of pixel clusters is 2.
16. The optical touch system according to claim 13, characterized by further comprising two mirror elements, wherein the two mirror elements each produce a mirror image of the at least one target, and the predetermined number of pixel clusters is 4.
17. The optical touch system according to claim 13, characterized in that the predetermined number of pixel clusters is the number of pixel clusters that a single target produces in an image captured by the image sensor.
18. The optical touch system according to claim 13, characterized in that the image comprises a plurality of pixel clusters, and the processor calculates a distance between the two farthest-apart pixel clusters among the plurality of pixel clusters.
19. The optical touch system according to claim 18, characterized in that the image sensor captures two images, and the processor calculates the two distances between the two farthest-apart pixel clusters in the two images respectively and compares the change between the two distances.
20. The optical touch system according to claim 19, characterized in that the processor outputs zoom-in information when the change between the two distances is an increase.
21. The optical touch system according to claim 19, characterized in that the processor outputs zoom-out information when the change between the two distances is a decrease.
22. The optical touch system according to claim 19, characterized in that rotation information is output when the processor determines that the two distances differ and the position of at least one pixel cluster among the plurality of pixel clusters of the two images remains the same.
23. The optical touch system according to claim 13, characterized in that the image sensor captures two images, and the processor outputs press-and-tap information when the processor determines that the number of pixel clusters decreases between the two images.
24. The optical touch system according to claim 13, characterized in that the processor uses the brightness information of at least some of the pixels in the image to generate a brightness waveform, and then identifies the pixel clusters produced by the at least one target from the brightness waveform.
25. An optical touch system, characterized by comprising:
an image sensor for detecting at least one target and producing a plurality of images; and
a processor that receives the plurality of images, identifies a plurality of pixel clusters in each image, and generates control information according to the change in the distance between the two farthest-apart pixel clusters in the plurality of images.
26. The optical touch system according to claim 25, characterized in that each image is a two-dimensional image, and the processor calculates a brightness waveform of at least one row of pixels in each image and identifies the plurality of pixel clusters from the brightness waveform.
27. The optical touch system according to claim 25, characterized in that each image is a two-dimensional image, and the processor calculates a brightness waveform formed from some of the pixels in each image and identifies the plurality of pixel clusters from the brightness waveform.
28. The optical touch system according to claim 25, characterized in that the control information is zoom-in information when the distance increases, and the control information is zoom-out information when the distance decreases.
29. An optical touch system, characterized by comprising:
an image sensor for detecting at least one target and producing a plurality of images; and
a processor that receives the plurality of images and identifies at least one pixel cluster in each image;
wherein the processor generates control information when, in one of the plurality of images, the processor detects a first pixel cluster and, while the first pixel cluster remains within a preset range, detects a second pixel cluster.
30. The optical touch system according to claim 29, characterized in that press-and-tap information is output when the second pixel cluster disappears after having remained within a preset range.
31. The optical touch system according to claim 29, characterized in that zoom-in information is generated when the second pixel cluster gradually moves away from the first pixel cluster, and zoom-out information is generated when the second pixel cluster gradually approaches the first pixel cluster.
32. The optical touch system according to claim 29, characterized in that each image is a two-dimensional image, and the processor calculates a brightness waveform of at least one row of pixels in each image and identifies the at least one pixel cluster from the brightness waveform.
33. The optical touch system according to claim 29, characterized in that each image is a two-dimensional image, and the processor calculates a brightness waveform formed from some of the pixels in each image and identifies the at least one pixel cluster from the brightness waveform.
34. An optical touch system, characterized by comprising:
an image sensor for detecting at least one target; and
a processor that identifies a plurality of pixel clusters in a first image and a plurality of pixel clusters in a second image sequentially produced by the image sensor, the pixel clusters of the second image corresponding at least in part to the pixel clusters of the first image, wherein the number of pixel clusters in the second image is greater than or equal to the number of pixel clusters in the first image, and gesture information is generated when the size difference and/or position difference between each pixel cluster of the first image and the corresponding pixel cluster of the second image is less than a threshold value.
CN2011102843357A 2011-09-23 2011-09-23 Optical touch system Pending CN103019457A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011102843357A CN103019457A (en) 2011-09-23 2011-09-23 Optical touch system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011102843357A CN103019457A (en) 2011-09-23 2011-09-23 Optical touch system

Publications (1)

Publication Number Publication Date
CN103019457A true CN103019457A (en) 2013-04-03

Family

ID=47968125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011102843357A Pending CN103019457A (en) 2011-09-23 2011-09-23 Optical touch system

Country Status (1)

Country Link
CN (1) CN103019457A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4782328A (en) * 1986-10-02 1988-11-01 Product Development Services, Incorporated Ambient-light-responsive touch screen data input method and system
CN1666248A (en) * 2002-06-26 2005-09-07 Vkb有限公司 Multifunctional integrated image sensor and application to virtual interface technology
US20080309632A1 (en) * 2007-06-13 2008-12-18 Apple Inc. Pinch-throw and translation gestures
US20090044988A1 (en) * 2007-08-17 2009-02-19 Egalax_Empia Technology Inc. Device and method for determining function represented by continuous relative motion between/among multitouch inputs on signal shielding-based position acquisition type touch panel
US20090090569A1 (en) * 2005-10-13 2009-04-09 Cho-Yi Lin Sensing System
US20090273571A1 (en) * 2008-05-01 2009-11-05 Alan Bowens Gesture Recognition
TW201030579A (en) * 2009-02-10 2010-08-16 Quanta Comp Inc Optical touch displaying device and operating method thereof
CN102033656A (en) * 2009-09-28 2011-04-27 原相科技股份有限公司 Gesture identification method and interaction system using same
CN102033660A (en) * 2009-10-06 2011-04-27 原相科技股份有限公司 Touch-control system and method for touch detection
US20110122099A1 (en) * 2011-02-03 2011-05-26 Hong Kong Applied Science and Technology Research Institute Company Limited Multiple-input touch panel and method for gesture recognition
CN102122350A (en) * 2011-02-24 2011-07-13 浙江工业大学 Skeletonization and template matching-based traffic police gesture identification method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106990874A (en) * 2016-01-21 2017-07-28 纬创资通股份有限公司 Optical touch device, touch finger position determination method and optical touch system
CN106990874B * 2016-01-21 2019-11-22 纬创资通股份有限公司 Optical touch device, touch finger position determination method and optical touch system
CN106095133A * 2016-05-31 2016-11-09 广景视睿科技(深圳)有限公司 Method and system for interactive projection
CN106095133B * 2016-05-31 2019-11-12 广景视睿科技(深圳)有限公司 Method and system for interactive projection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20130403