CN104808865B - Optical touch control system and its object analysis method - Google Patents


Info

Publication number
CN104808865B
CN104808865B (application CN201410043884.9A)
Authority
CN
China
Prior art keywords
pixel
image
touch surface
indicant
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410043884.9A
Other languages
Chinese (zh)
Other versions
CN104808865A (en)
Inventor
林育佳
林志新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201410043884.9A
Publication of CN104808865A
Application granted
Publication of CN104808865B

Landscapes

  • Position Input By Displaying (AREA)

Abstract

The present invention provides an optical touch control system and an object analysis method for it. The optical touch control system includes a panel and an image sensing device disposed on the panel. The object analysis method comprises the following steps. First, the image sensing device is driven to capture a first image across a touch surface, the first image containing an object image corresponding to a pointer. Next, an image region corresponding to the object image is defined in the first image. Then, according to the luminance differences among a plurality of pixels in the image region, it is judged whether the pointer is touching the touch surface or hovering over it.

Description

Optical touch control system and its object analysis method
Technical field
The present invention relates to a touch control system, and more particularly to an optical touch control system and an object analysis method thereof.
Background
With the development of touch technology, panels have gradually been integrated with display devices into touch screens, so that users can perform input operations directly by touch. Optical touch control systems, with their advantages of high accuracy, good reliability, low failure rate, support for multi-point touch, fast response and insensitivity to panel manufacturing processes, are now widely used in various medium and large electronic display products, such as visitor guidance systems and industrial control systems.
At present, an optical touch control system includes at least one image sensor and a plurality of light emitting diodes, such as infrared light emitting diodes (Infrared Light Emitting Diode, IR LED). When the optical touch control system operates, these light emitting diodes emit light to illuminate the touch surface of the system. When an object, such as a finger or a stylus, contacts the touch surface, the object blocks part of the light and casts a shadow on the touch surface. The optical touch control system uses the image sensor to capture an image across the touch surface, calculates the position of the object relative to the touch surface according to whether a shadow exists in the captured image and to the imaging position of the shadow, and thereby achieves the touch control function.
Summary of the invention
An embodiment of the present invention provides an optical touch control system and an object analysis method for it. The object analysis method can quickly and accurately judge whether an object near the touch surface of the optical touch control system is touching the touch surface or hovering over it, thereby effectively improving the touch-point recognition rate of the optical touch control system.
An embodiment of the present invention provides an object analysis method of an optical touch control system, comprising the following steps. First, an image sensing device is driven to capture a first image across the touch surface. The first image has an object image corresponding to a pointer. Next, an image region corresponding to the object image is defined in the first image. Then, according to the luminance differences among a plurality of pixels in the image region, it is judged whether the pointer is touching the touch surface or hovering over it. When the pointer is judged to be touching the touch surface, a touch coordinate of the pointer relative to the panel is calculated according to the imaging position of the object image in the first image.
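As a rough illustration, the claimed flow (capture, region definition, luminance-based judgment) can be sketched as below. This is a hedged reconstruction, not the patented implementation: the function name, the 50% shadow-detection ratio and the variation threshold are all assumptions, and each frame is modeled as a small grayscale array covering only the bright band of the captured image.

```python
import numpy as np

def analyze_pointer(frame, background, variation_threshold=40):
    """Sketch of the claimed flow; all names and thresholds are illustrative.

    frame / background: 2-D grayscale arrays (rows x columns) of the bright
    band captured across the touch surface; `background` was captured with
    no pointer present.
    """
    # Step 1: a pointer casts a shadow, i.e. columns noticeably darker
    # than the same columns of the background image.
    drop = background.astype(int).sum(axis=0) - frame.astype(int).sum(axis=0)
    if drop.max() <= 0:
        return None                      # no object image in this frame
    cols = np.flatnonzero(drop > 0.5 * drop.max())

    # Step 2: define the image region between its left/right boundaries.
    region = frame[:, cols[0]:cols[-1] + 1]

    # Step 3: judge touch vs. hover from per-column luminance variation:
    # a touching pointer darkens whole columns (small variation), while a
    # hovering one leaves a bright gap between its shadow and the mirror
    # shadow (large variation).
    variation = region.max(axis=0).astype(int) - region.min(axis=0).astype(int)
    return "touch" if variation.max() <= variation_threshold else "hover"
```

On synthetic frames, a full-height dark stripe yields "touch" and a stripe with a bright gap in its middle rows yields "hover".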
Another embodiment of the present invention provides an object analysis method of an optical touch control system that includes a panel, a first image sensing device and a second image sensing device. The first and second image sensing devices are disposed at different positions on the panel and have overlapping sensing ranges. The object analysis method comprises the following steps. First, the first and second image sensing devices are driven to capture a first background image and a second background image of the touch surface of the panel. Next, the first and second image sensing devices are driven to capture a first image and a second image across the touch surface, respectively. The first image has a first object image corresponding to a pointer, and the second image has a second object image corresponding to the pointer. Then, a first image region corresponding to the first object image is defined in the first image, and a second image region corresponding to the second object image is defined in the second image. Finally, according to the luminance differences among a plurality of pixels in the first image region and in the second image region, respectively, it is judged whether the pointer is touching the touch surface or hovering over it.
An embodiment of the present invention also provides an optical touch control system coupled to an image display device. The optical touch control system includes a panel, at least one light emitting element, a reflective mirror, at least one reflecting unit, an image sensing device and a processing unit. The panel has a touch surface. The at least one light emitting element produces light to illuminate the touch surface. The reflective mirror produces a mirror image of the panel. The at least one reflecting unit reflects the light produced by the light emitting element. The image sensing device captures a plurality of images across the touch surface, at least one of which has an object image and a mirror image corresponding to a pointer. The processing unit is coupled to the image sensing device and the light emitting element. When the processing unit drives the image sensing device to capture a first image across the touch surface, and the first image has the object image and the mirror image corresponding to the pointer, the processing unit defines in the first image an image region corresponding to the object image. According to the luminance differences among a plurality of pixels in the image region, the processing unit judges whether the pointer is touching the touch surface or hovering over it, and decides, according to the judgment result, whether to calculate a touch coordinate of the pointer relative to the panel.
An embodiment of the present invention also provides a non-transitory computer readable medium recording a set of computer-executable programs. When the non-transitory computer readable medium is read by a processor, the processor can perform the steps of the above object analysis method of the optical touch control system.
In summary, embodiments of the present invention provide an optical touch control system and an object analysis method for it. According to the luminance distribution of the shadow region in an image captured across the touch surface, the object analysis method can quickly and accurately judge whether a detected object is touching the touch surface or hovering over it, effectively recognizing the touch state of the pointer. The object analysis method can further decide, according to the judgment result, whether to calculate the touch coordinate of the detected object, thereby effectively improving the touch-point recognition rate and the operating efficiency of the optical touch control system.
The above outlines the technical features of the present invention and the technical effects achieved. To further understand the features and technical content of the present invention, please refer to the following detailed description and the accompanying drawings. These descriptions and drawings are intended only to illustrate the present invention, not to limit its scope in any way.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system structure of an optical touch control system provided by an embodiment of the present invention.
Fig. 2A is a schematic diagram of a two-dimensional image with an object image captured by the image sensing device provided by an embodiment of the present invention.
Fig. 2B is a schematic diagram of another two-dimensional image with an object image captured by the image sensing device provided by an embodiment of the present invention.
Fig. 3A is a schematic diagram of a background image captured by the image sensing device provided by an embodiment of the present invention and of its corresponding brightness curve.
Fig. 3B is a schematic diagram of a two-dimensional image with an object image captured by the image sensing device provided by an embodiment of the present invention.
Fig. 3C is a schematic diagram of the brightness curve corresponding to the two-dimensional image shown in Fig. 3B.
Fig. 4A~Fig. 4B are schematic diagrams of two-dimensional images with object images captured by the image sensing device provided by another embodiment of the present invention.
Fig. 5A is a schematic diagram of the operation of an optical touch control system provided by an embodiment of the present invention.
Fig. 5B is a schematic diagram of a partial image captured by the image sensing device corresponding to Fig. 5A.
Fig. 6 is a flowchart of the object analysis method of an optical touch control system provided by an embodiment of the present invention.
Fig. 7 is a flowchart of the object analysis method of an optical touch control system provided by another embodiment of the present invention.
Fig. 8 is a flowchart of the object analysis method of an optical touch control system provided by another embodiment of the present invention.
Fig. 9 is a schematic diagram of a two-dimensional image with an object image captured by the image sensing device provided by an embodiment of the present invention.
Fig. 10 is a flowchart of the object analysis method of an optical touch control system provided by another embodiment of the present invention.
Fig. 11 is a flowchart of the object analysis method of an optical touch control system provided by another embodiment of the present invention.
Fig. 12 is a flowchart of the method of defining the bright region in a background image provided by an embodiment of the present invention.
Fig. 13 is a schematic diagram of the system structure of an optical touch control system provided by another embodiment of the present invention.
Fig. 14A~Fig. 14B are schematic diagrams of images with object images captured by the image sensing device provided by another embodiment of the present invention.
Fig. 15 is a flowchart of the object analysis method of an optical touch control system provided by an embodiment of the present invention.
The reference numerals are described as follows:
1、3:Optical touch control system
11:Panel
110:Touch surface
111:First edge
113:Second edge
115:Third edge
117:Fourth edge
120:Light emitting element
130:Reflective mirror
140:First reflecting unit
150:Second reflecting unit
310:3rd reflecting unit
12:Image sensor apparatus
12a:First Image sensor apparatus
12b:Second Image sensor apparatus
13:Processing unit
14:Memory cell
15:Transmission unit
16:Display device
161:Cursor
2:User
21:Finger
DR:Background area
BR:Bright region
H:Height of the touch sensing region
P1~PN:Pixel
F1~F8, FB:Image
I21、I21’、I21a、I21b、I41、ITP、ITP’:Object image
LB、LB1、LB2:Left boundary
RB、RB1、RB2:Right boundary
H_UB:Upper boundary of the bright region
H_LB:Lower boundary of the bright region
IA、IA’、IA1、IA2:Imagery zone
IS:Virtual image space
RS:Real image space
TP:Touch point position
TP’:Mirror position of the touch point
120’:Mirror image of the light emitting element
140’:Mirror image of the first reflecting unit
150’:Mirror image of the second reflecting unit
SL:First sensing path
SL’:Second sensing path
D1、D2、D3:Distance
A1、A2:Angle
L:Number of pixel columns
VA:Field of view
i:The i-th pixel column
C10、C20、C30:Curve
S601~S609:Method steps
S701~S709:Method steps
S801~S807:Method steps
S1001~S1007:Method steps
S1101~S1107:Method steps
S1201~S1209:Method steps
S1501~S1515:Method steps
Embodiment
In the following, the present invention is described in detail through various exemplary embodiments. However, the concepts described in the present invention may be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. In addition, the same reference numerals may be used in the drawings to denote similar elements.
The optical touch control system of the present invention can actively define an image region in an image, captured by the image sensing device, that contains an object image corresponding to a pointer, and can accurately judge, according to the pixel luminance distribution information in the image region, whether a pointer near or approaching the touch area of the optical touch control system is touching the touch surface of the system or hovering over it. In this way, the touch-point recognition rate and the touch sensitivity of the optical touch control system are effectively improved.
(embodiment of optical touch control system)
Please refer to Fig. 1, which shows a schematic diagram of the system structure of an optical touch control system provided by an embodiment of the present invention. The optical touch control system 1 senses the touch position of at least one pointer. In the present embodiment, the pointer is the finger 21 of a user 2, but in other embodiments the pointer may be, for example, a touch object such as a stylus or a touch rod; the present invention is not limited in this respect.
The optical touch control system 1 includes a panel 11, an image sensing device 12, a light emitting element 120, a reflective mirror 130, a first reflecting unit 140, a second reflecting unit 150, a processing unit 13, a memory unit 14, a transmission unit 15 and a display device 16. The light emitting element 120, the image sensing device 12, the memory unit 14, the transmission unit 15 and the display device 16 are respectively coupled to the processing unit 13.
In simple terms, when the optical touch control system 1 operates, the processing unit 13 correspondingly controls the operation of the light emitting element 120, the image sensing device 12, the memory unit 14 and the transmission unit 15. The processing unit 13 also correspondingly controls the action of the cursor 161 on the display device 16 according to the sensing result of the image sensing device 12.
The image sensing device 12, the light emitting element 120, the reflective mirror 130, the first reflecting unit 140 and the second reflecting unit 150 are all disposed on the panel 11. The panel 11 may be, for example, a whiteboard, a transparent plate (such as a glass plate or a plastic plate) or a touch screen.
In the present embodiment, the panel 11 is a reflective mirror or a reflective surface, and is essentially a rectangular plate. The panel 11 has a touch surface 110, whose shape is also rectangular. Specifically, the touch surface 110 has four linear edges, namely a first edge 111, a second edge 113, a third edge 115 opposite the first edge 111, and a fourth edge 117 opposite the second edge 113. The first edge 111 intersects the second edge 113 to form a first corner; the first edge 111 intersects the fourth edge 117 to form a second corner; the second edge 113 intersects the third edge 115 to form a third corner; and the third edge 115 intersects the fourth edge 117 to form a fourth corner.
The region surrounded by the touch surface 110, the light emitting element 120, the reflective mirror 130, the first reflecting unit 140 and the second reflecting unit 150 is the touch sensing region TR of the optical touch control system 1. The touch sensing region TR has a height H, which is set according to the actual structure and operating requirements of the optical touch control system 1.
The light emitting element 120 is disposed on the first edge 111 of the touch surface 110 and provides the light source required for the operation of the optical touch control system 1. The light emitting element 120 emits invisible light, such as infrared light or ultraviolet light, to illuminate the whole touch surface 110.
In one embodiment, the light emitting element 120 may include a plurality of light emitters arranged along the first edge 111 of the touch surface 110. In another embodiment, the light emitting element 120 may also include a light emitter and a light guide, such as a light guide plate. The light emitter scatters the generated light throughout the light guide, and the light guide projects uniform light onto the touch surface 110. The light emitter may be, for example, an infrared light emitting diode (IR LED) or an ultraviolet light emitting diode (UV LED). It is worth noting that the light emitted by the light emitting element 120 may also be visible light, and that the actual implementation of the light emitting element 120 can be set according to the actual operating requirements of the optical touch control system 1; the present embodiment is not limited in this respect.
The reflective mirror 130 is disposed on the fourth edge 117 of the touch surface 110 and protrudes from the touch surface 110. Specifically, in the present embodiment the reflective mirror 130 extends upward from the touch surface 110 to the height H. The reflective mirror 130 includes a mirror surface facing the touch surface 110 to reflect the invisible light emitted by the light emitting element 120 onto the touch surface 110.
The reflective mirror 130 also forms a mirror image of the touch sensing region TR and produces a mirror image (not depicted in Fig. 1) of a pointer operating on the touch surface 110. The reflective mirror 130 can be realized with a plane mirror, with its mirror surface facing the touch sensing region TR.
The first reflecting unit 140 is disposed on the third edge 115 of the touch surface 110.
The second reflecting unit 150 is disposed on the second edge 113 of the touch surface 110. The first reflecting unit 140 and the second reflecting unit 150 each protrude from the touch surface 110. The first reflecting unit 140 and the second reflecting unit 150 may each be, for example, a reflective cloth, and face the touch sensing region TR to reflect the light emitted by the light emitting element 120. The first reflecting unit 140 and the second reflecting unit 150 each extend upward from the touch surface 110 to the height H.
In the present embodiment, the heights of the reflective mirror 130 and of the first and second reflecting units 140, 150 are all H. It should be understood that these heights may also differ according to the actual operating requirements of the optical touch control system 1.
The first reflecting unit 140 and the second reflecting unit 150 can use a retro-reflective material to achieve the reflecting effect, but the present embodiment is not limited thereto, as long as the first reflecting unit 140 and the second reflecting unit 150 can reflect the light of the light emitting element 120 onto the touch surface 110, and preferably do not form a mirror image of the touch sensing region TR. They can also be replaced by three light emitting elements, as long as these three light emitting elements all face and illuminate the touch surface 110.
The image sensing device 12 is disposed at the first corner of the touch surface 110. The image sensing device 12 can also be disposed at the second corner of the touch surface 110 or on the first edge 111 of the touch surface 110, as long as the image sensing device 12 is positioned opposite the reflective mirror 130.
The image sensing device 12 detects the touch operation of a pointer (i.e. the finger 21 of the user 2) in the touch sensing region TR. Specifically, the image sensing device 12 captures, across the touch surface 110, a plurality of images of the touch sensing region TR surrounded by the touch surface 110, the reflective mirror 130, the first reflecting unit 140 and the second reflecting unit 150. The plurality of images at least includes a background image and an image with an object image of the pointer. The background image is an image of the touch sensing region TR captured by the image sensing device 12 across the touch surface 110 when no pointer is near the panel 11.
The image sensing device 12 may further be provided with a filter module (such as an IR-pass filter), so that the image sensing device 12 receives only light of a specific wavelength, such as infrared light.
The field of view of the image sensing device 12 can be set to tilt towards the touch surface 110, and its tilt angle can be set according to the actual setup requirements and image capture range, as long as the image sensing device 12 can capture images of the touch sensing region TR across the touch surface 110. The longitudinal field of view of the image sensing device 12 is preferably greater than or equal to the height H of the touch sensing region TR.
The image sensing device 12 may be, for example, a complementary metal-oxide-semiconductor (CMOS) sensor or a charge coupled device (CCD). A person of ordinary skill in the art can design it according to the actual use situation; the present embodiment imposes no limitation here.
The processing unit 13 judges, according to the image of the touch surface 110 produced by the image sensing device 12, whether the pointer (i.e. the finger 21) is touching (or contacting) the touch surface 110 or hovering over it. According to the judgment result, the processing unit 13 decides whether to calculate a touch coordinate of the pointer (i.e. the finger 21) relative to the touch surface 110. The processing unit 13 also sends the information about the touch coordinate of the pointer to the display device 16 through the transmission unit 15, to manipulate the action of the cursor 161 on the screen of the display device 16.
The processing unit 13 can drive the image sensing device 12 to capture the plurality of images across the touch surface 110 at a default image capture frequency, determine from the plurality of images whether a pointer is near the touch sensing region TR, and judge whether the pointer (i.e. the finger 21) is touching the touch surface 110 or hovering over it. The image capture frequency can be set according to the actual operation and working environment of the optical touch control system 1 (for example, the ambient brightness around the system); the present embodiment imposes no limitation. In addition, in other embodiments, the processing unit 13 can also drive the image sensing device 12 to capture the plurality of images across the touch surface 110 continuously, and detect the touch operation of the pointer in the touch sensing region TR from the plurality of images, either continuously or at regular intervals (for example, every 3 images or every 2 seconds).
The memory unit 14 stores the plurality of images captured by the image sensing device 12 and the relevant parameters for judging whether a pointer is touching the touch surface 110 or hovering over it. The memory unit 14 can also be used to store the calculated touch coordinate of the pointer relative to the touch surface 110.
In simple terms, when the processing unit 13 detects, from one of the plurality of images captured by the image sensing device 12, that a pointer (such as the finger 21) is near the touch surface 110, the processing unit 13 can actively define in the captured image an image region corresponding to the object image of the pointer. According to the luminance differences among a plurality of pixels in the image region, the processing unit 13 judges whether the pointer (such as the finger 21) is touching (contacting) the touch surface 110 or hovering over it.
When the processing unit 13 judges that the pointer (such as the finger 21) is touching the touch surface 110, the processing unit 13 calculates the touch coordinate of the pointer relative to the touch surface 110 according to the imaging position of the pointer in the captured image and the imaging position in the image of the mirror image of the pointer (the image mirrored by the reflective mirror 130). It should be specially noted that a person of ordinary skill in the art will understand that contacting the touch surface 110 and touching the touch surface 110 have the same meaning herein and are used interchangeably.
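The coordinate computation from the direct image position and the mirror image position can be illustrated with simple plane geometry. This is a hedged sketch under assumed conventions (camera at the origin, mirror along the line y = D); the patent does not spell out these formulas, and the mapping from pixel column to viewing angle is omitted:

```python
import math

def touch_coordinate(angle_direct, angle_mirror, mirror_distance):
    """Illustrative triangulation with one camera plus an edge mirror.

    The camera sits at (0, 0); the reflective mirror lies along the line
    y = mirror_distance, so the virtual image of a point (x, y) appears
    at (x, 2 * mirror_distance - y). The two angles (radians, measured
    from the x-axis) would in practice be derived from the pixel columns
    of the object image and of its mirror image.
    """
    t1, t2 = math.tan(angle_direct), math.tan(angle_mirror)
    x = 2.0 * mirror_distance / (t1 + t2)   # from tan(a1) + tan(a2) = 2D / x
    y = x * t1                              # from tan(a1) = y / x
    return x, y
```

For example, with the mirror one unit away and a pointer at (0.5, 0.4), the measured angles would be atan2(0.4, 0.5) toward the pointer and atan2(1.6, 0.5) toward its virtual image, and the function recovers (0.5, 0.4).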
Furthermore, please refer to Fig. 2A~2B together with Fig. 1. Fig. 2A and Fig. 2B respectively show schematic diagrams of two-dimensional images with object images captured by the image sensing device provided by the present invention. Fig. 2A is a two-dimensional image in which the pointer (such as the finger 21) hovers over the touch surface 110, and Fig. 2B is a two-dimensional image in which the pointer (such as the finger 21) touches the touch surface 110.
Since the longitudinal field of view of the image sensing device 12 (namely its longitudinal sensing range) is greater than the height H of the touch sensing region TR, as shown in Fig. 2A, the image F1 captured by the image sensing device 12 includes a background region DR and a bright region BR.
The longitudinal height of the bright region BR is determined by the touch surface 110, the reflective mirror 130, the first reflecting unit 140 and the second reflecting unit 150. Specifically, since the light emitting element 120, the reflective mirror 130, the first reflecting unit 140 and the second reflecting unit 150 emit or reflect light, they form a region of higher brightness, the bright region BR, in the image captured by the image sensing device 12. The background region DR is the region other than the touch surface 110, the reflective mirror 130, the first reflecting unit 140 and the second reflecting unit 150; since the background region receives none of the light emitted by the light emitting element 120, it appears dark. In addition, the way of defining the bright region BR will be described later, and is therefore not repeated here.
When the finger 21 of the user 2 is near or adjacent to the touch surface 110 but not in contact with it, the image F1 captured by the image sensing device 12 includes object images I21, I21' corresponding to the finger 21 and a mirror image of the finger 21 produced by the reflective mirror 130 (not depicted in Fig. 2A and Fig. 2B).
Further, when the finger 21 of the user 2 blocks some of the light emitted directly or reflected by the light emitting element 120 and the reflective mirror 130, a dark object image I21 (also called object optical information, shading point or dark point) is formed in the image F1, together with a mirror image corresponding to the finger 21 (also called mirror optical information, shading point or dark point) produced by the reflective mirror 130. In addition, since the touch surface 110 is a reflective mirror in the present embodiment, the image F1 captured by the image sensing device 12 also includes a mirror image corresponding to the finger 21 produced by reflection from the touch surface 110, i.e. the object image I21'. The image F1 has M × N pixels, where M and N are positive integers.
As shown in Fig. 2A, when the finger 21 of the user 2 is not in contact with the touch surface 110, the object image I21 corresponding to the finger 21 and the object image I21' corresponding to the mirror image of the finger 21 are not connected in the image F1 and are separated by a distance. The distance corresponds to the height between the fingertip of the finger 21 and the touch surface 110. As shown in Fig. 2B, when the finger 21 of the user 2 contacts or touches the touch surface 110, the object image I21 and the object image I21' in the image F2 captured by the image sensing device 12 are connected to each other. The image F2 also has M × N pixels. The number of pixels of the images F1 and F2 is determined by the resolution of the image sensing device 12.
Then, processing unit 13 can be according to object image I21 and I21 ' in image F1 or F2 image space and optics One background video of a touch-control system 1 left margin LB defined in an image F1 or F2 and right margin RB, with image F1 or The corresponding object image I21 and I21 ' of a F2 definition imagery zone IA.It is, the imagery zone IA is by highlights domain BR And left margin LB and right margin RB is defined.The background video is that Image sensor apparatus 12 is not yet close tactile in finger 21 Control the image captured during sensing area TR.More specifically, background video and without the object image of corresponding indicant.Cause This, has object image I21 can be corresponding compared with background video with the brightness of I21 ' position in image F1 highlights domain BR The brightness in highlights domain is low, therefore can define imagery zone IA accordingly.
As shown in FIG. 2A, when the finger 21 does not touch the touch surface 110, the object image I21 and the mirror-image object image I21' of the finger 21 in the image F1 are not connected, so the brightness of the pixels in the image area IA is unevenly distributed. That is, the pixel values within each pixel row of the image area IA vary considerably. Conversely, when the pointer contacts the touch surface 110, as shown in FIG. 2B, the object images I21 and I21' in the image F2 are connected, so the brightness of the pixels in the image area IA is evenly distributed; the variation among the pixel values in each pixel row of the image area IA is small.

Therefore, the processing unit 13 can judge whether the finger 21 is touching the touch surface 110 or merely hovering above it according to the brightness differences among the pixels in the pixel rows of the image area IA in the image F1 or F2 (i.e., L pixel rows, where L is a positive integer and L is smaller than N).
Specifically, without having to analyze the image information of the object image I21' produced by the mirror image of the finger 21 on the touch surface 110, the processing unit 13 can judge whether the finger 21 touches the touch surface 110 or hovers above it according to a variation value computed from the pixel values (e.g., grey levels) of the pixels within each of the L pixel rows of the image area IA — for example, the pixel variation value of each pixel row, the ratio or difference between the maximum and minimum brightness values of each pixel row, or the average pixel value of each pixel row.
In one embodiment, the brightness difference between the background region DR and the bright region BR is pronounced, and when the finger 21 contacts the touch surface 110 it blocks part of the light of the bright region BR, so the pixel variation value of the blocked pixel rows decreases. The processing unit 13 first computes the pixel variation value of the pixels of each pixel row in the image area IA: the smaller the pixel variation value of a pixel row, the larger the blocked area; the larger the pixel variation value, the smaller the blocked area. The processing unit 13 then judges that the finger 21 is hovering above the touch surface 110 when the pixel variation value of the pixel row with the largest pixel variation value exceeds a preset pixel variation value.
In another embodiment, the processing unit 13 computes, for each pixel row in the image area IA, the ratio between the maximum pixel value and the minimum pixel value. The processing unit 13 judges that the finger 21 is hovering above the touch surface 110 when the ratio of the pixel row with the largest ratio exceeds a preset pixel ratio (e.g., 1).

In yet another embodiment, the processing unit 13 computes, for each pixel row in the image area IA, the difference between the maximum pixel value and the minimum pixel value. The processing unit 13 judges that the finger 21 is hovering above the touch surface 110 when the difference of the pixel row with the smallest difference exceeds a preset pixel difference (e.g., zero).

In still another embodiment, the processing unit 13 computes the average pixel value of each pixel row in the image area IA. The processing unit 13 judges that the finger 21 is touching the touch surface 110 when the average pixel value of the pixel row with the smallest average pixel value is below a default average pixel value.
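The four alternative tests above can be sketched as simple per-row statistics over the image area IA. This is only an illustrative reconstruction under assumed names and data layout — the patent does not specify an implementation, and NumPy here stands in for whatever arithmetic an embedded controller would actually use.

```python
import numpy as np

# Each function implements one of the alternative tests on the pixel rows of
# image area IA (a 2-D array of grey values). All names and thresholds are
# illustrative placeholders, not figures from the patent.

def hovering_by_variation(ia_rows, preset_variation):
    # Hovering when the row with the largest variation exceeds the preset value.
    return np.var(ia_rows, axis=1).max() > preset_variation

def hovering_by_ratio(ia_rows, preset_ratio=1.0):
    # Hovering when the largest per-row max/min pixel ratio exceeds the preset ratio.
    rows = np.asarray(ia_rows, dtype=float)
    ratios = rows.max(axis=1) / np.maximum(rows.min(axis=1), 1e-6)
    return ratios.max() > preset_ratio

def hovering_by_difference(ia_rows, preset_difference=0.0):
    # Hovering when even the smallest per-row (max - min) difference exceeds the preset.
    rows = np.asarray(ia_rows, dtype=float)
    diffs = rows.max(axis=1) - rows.min(axis=1)
    return diffs.min() > preset_difference

def touching_by_average(ia_rows, default_average):
    # Touching when the darkest row's average falls below the default average.
    return np.mean(ia_rows, axis=1).min() < default_average
```

A connected (touching) object image gives nearly uniform rows, so every test comes out on the "touching" side; a separated (hovering) pair of object images leaves a bright gap in some rows, driving the variation, ratio, and difference measures up.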
In a further embodiment, the processing unit 13 defines a first pixel group and a second pixel group within the image area IA, by which the processing unit 13 evaluates the brightness difference of the image area IA and thereby judges whether the pointer is touching or hovering above the touch surface 110.

Specifically, the first pixel group is a high-brightness group comprising at least one high-brightness pixel, i.e., a pixel in the image area IA whose pixel value exceeds a default threshold value. The second pixel group is a low-brightness group comprising at least one low-brightness pixel, i.e., a pixel in the image area IA whose pixel value is below the default threshold value. The default threshold value may be set according to the average brightness of the bright region BR in the image, for example 75% to 90% of the average brightness value of the bright region BR, although this embodiment is not so limited.
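The grouping step might be sketched as follows. The 0.8 weighting stands in for the 75%–90% range mentioned above, and the function and variable names are assumptions of this sketch, not terms from the patent.

```python
import numpy as np

def split_pixel_groups(image_area, bright_region_mean, weight=0.8):
    """Split the pixels of image area IA into the two brightness groups."""
    pixels = np.asarray(image_area, dtype=float).ravel()
    threshold = weight * bright_region_mean   # default threshold value
    high_group = pixels[pixels > threshold]   # first pixel group (high brightness)
    low_group = pixels[pixels <= threshold]   # second pixel group (low brightness)
    return high_group, low_group
```

The later comparison of the two groups' average pixel values (step S703 below in the flowchart embodiment) then operates on these two arrays.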
When the processing unit 13 judges that the finger 21 is hovering above the touch surface 110, the processing unit 13 does not compute a touch coordinate of the pointer relative to the touch surface 110 of the panel 11. The processing unit 13 may output no touch coordinate data at all, or may output to the display device 16 the touch coordinate data previously computed when the finger 21 touched the touch surface 110, so as to stabilize the position of the cursor 161 on the display device 16.

When the processing unit 13 judges that the finger 21 is touching the touch surface 110, the processing unit 13 computes the touch coordinate of the pointer relative to the touch surface 110 of the panel 11 according to the imaging position of the finger 21 in the image and the imaging position of the mirror image of the finger 21 (i.e., the mirror image produced by the reflective mirror 130, not depicted in FIG. 2A and FIG. 2B) in the image. The processing unit 13 then transmits the computed touch coordinate data to the display device 16 to control the actions of the cursor 161 on the display device 16, such as a moving operation, a writing operation, or a point-selection operation of the cursor 161.
The aforementioned preset pixel variation value, preset pixel ratio, preset pixel difference, default average pixel value, and default threshold value may be stored in advance in the memory unit 14 for the processing unit 13 to read. The preset pixel variation value, preset pixel ratio, preset pixel difference, and default average pixel value may be set according to the practical operating requirements of the optical touch control system 1, such as the desired touch sensitivity, or the sensitivity or noise level of the image sensing device 12.
The acquisition of the background image, the definition of the bright region BR, and the definition of the image area IA are further explained below. Please refer to FIG. 3A to FIG. 3C together with FIG. 1. FIG. 3A is a schematic diagram of a background image captured by the image sensing device according to an embodiment of the present invention, together with its corresponding brightness curve. FIG. 3B is a schematic diagram of a two-dimensional image with object images captured by the image sensing device according to an embodiment of the present invention. FIG. 3C is a schematic diagram of the brightness curves corresponding to the two-dimensional image shown in FIG. 3B. The curve C10 is the background brightness curve of the background image. The curve C20 is a default brightness threshold curve derived from the curve C10. The curve C30 is the brightness curve of the image F1.
When the optical touch control system 1 operates, before any pointer (e.g., the finger 21) approaches or enters the touch sensing region TR (for example, when the optical touch control system 1 has just started up), the processing unit 13 first drives the image sensing device 12 to sense and capture the background image FB across the touch surface 110. The background image FB, which has M×N pixels, comprises the background region DR and the bright region BR.

The processing unit 13 first computes the average brightness value of the background image FB, and then sets a preset pixel value according to this average brightness value and a default weight (e.g., 1.2). Next, by comparing the pixel value of each pixel in each pixel column of the background image FB with the preset pixel value, the processing unit 13 defines a strong-light upper boundary H_UB and a strong-light lower boundary H_LB in each pixel column, thereby defining the bright region BR in the background image FB. The pixels between the strong-light upper boundary H_UB and the strong-light lower boundary H_LB have pixel values exceeding the preset pixel value.
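A minimal sketch of this per-column boundary search follows. The 1.2 weight matches the example above; the function name, data layout, and handling of columns with no bright pixels are assumptions of the sketch.

```python
import numpy as np

def bright_region_bounds(column, mean_brightness, weight=1.2):
    """Locate the strong-light upper/lower boundaries in one pixel column of FB.

    Returns (H_UB, H_LB) as the first and last indices whose pixel value
    exceeds the preset pixel value, or None if the column has no bright pixel.
    """
    preset_pixel_value = weight * mean_brightness
    bright = np.flatnonzero(np.asarray(column, dtype=float) > preset_pixel_value)
    if bright.size == 0:
        return None
    return bright[0], bright[-1]
```

Applying this to every column of the background image FB yields the bright region BR as the band between the H_UB and H_LB boundaries.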
In addition, the default weight may be set according to a standard deviation of the average brightness value of the background image FB.
In another embodiment, the processing unit 13 may compute the average pixel value of the N pixels P1–PN in each pixel column of the background image FB. The processing unit 13 then sets a preset pixel value for each pixel column according to that column's average pixel value and a default weight. By comparing the pixel value of each pixel in each pixel column with the column's preset pixel value, the processing unit 13 designates the dense block of pixels whose brightness exceeds the preset pixel value as the bright block of that pixel column, and defines the strong-light upper boundary H_UB and the strong-light lower boundary H_LB of each pixel column of the background image FB according to its bright block.

In yet another embodiment, the processing unit 13 may compute the average pixel value of the N pixels P1–PN in each pixel column of the background image FB, set the preset pixel value according to the average pixel value of the pixel column with the largest average pixel value together with the aforementioned default weight, and then define the strong-light upper boundary H_UB and the strong-light lower boundary H_LB in each pixel column of the background image FB according to this preset pixel value.
After the bright region BR is defined in the background image FB, the processing unit 13 converts the background image FB into a background brightness curve (i.e., the curve C10) to produce background brightness curve data of the background image FB. Each brightness value along the X direction of the curve C10 is the brightness sum of the corresponding pixel column within the bright region BR of the background image FB. In other words, the background brightness curve (the curve C10) records the brightness distribution of the background image FB along the pixel-column direction.

In detail, the processing unit 13 adds up the pixel values of the pixels of each pixel column within the bright region BR of the background image FB and takes the sum as the pixel value of that pixel column. For example, the brightness of the i-th pixel column in the curve C10 is the sum of the pixel values of the pixels in the i-th column of the bright region BR of the background image FB.
Alternatively, in a further embodiment, the bright region BR need not be identified: from the N pixels of each pixel column of the background image FB (e.g., the i-th pixel column), the k pixels with high brightness (e.g., exceeding a preset pixel threshold) are selected, and the sum of their pixel values is taken as the pixel value of that pixel column. Or, k1 high-brightness pixels (e.g., exceeding the preset pixel threshold) and k2 low-brightness pixels (e.g., below the preset pixel threshold) are selected from the N pixels of each pixel column, and the sum of the differences between the pixel values of the k1 pixels and the k2 pixels is taken as the pixel value of that pixel column. Here k, k1, and k2 are positive integers.
The processing unit 13 may further operate on the curve C10 to produce the curve C20 as the background brightness curve data of the background image FB. The curve C20 is the product of the curve C10 and a default percentage (e.g., 80%). That is, the brightness of the curve C20 is lower than that of the curve C10, which provides a suitable brightness tolerance when the background brightness data is used to sense the touching state and touch position of a pointer.
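Producing the curves C10 and C20 can be sketched as below: C10 sums each pixel column inside the bright region BR, and C20 scales C10 by a default percentage (80% here, as in the example above). The function name and array layout are assumptions of the sketch.

```python
import numpy as np

def background_curves(fb, h_ub, h_lb, percentage=0.8):
    """Build curve C10 (per-column sum inside BR) and threshold curve C20."""
    fb = np.asarray(fb, dtype=float)
    c10 = fb[h_ub:h_lb + 1, :].sum(axis=0)  # per-column brightness sum inside BR
    c20 = percentage * c10                  # tolerance-adjusted threshold curve
    return c10, c20
```

For simplicity this sketch uses a single pair of boundaries for the whole image; the per-column boundaries described above would index each column separately.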
After the processing unit 13 receives an image output by the image sensing device 12 (e.g., the image F1 shown in FIG. 2A), the processing unit 13 first defines the bright region BR in the image F1 according to the background image FB, as shown in FIG. 3B. The processing unit 13 then converts the image F1 into a brightness curve (e.g., the curve C30) to produce the brightness curve data of the image F1.

As shown in FIG. 3C, the processing unit 13 compares the curve C30 (i.e., the brightness curve of the image F1) with the curve C20 to obtain the imaging positions of the object images I21 and I21' of the pointer in the image F1. In detail, the object images I21 and I21' are formed by the pointer (e.g., the finger 21 and its mirror image reflected by the touch surface 110) blocking part of the light emitted or reflected by the light-emitting element 120, the reflective mirror 130, and the first reflecting unit 140. Therefore, in the region of the curve C30 where the object images I21 and I21' are located in the image F1, the brightness of each pixel column is lower than the background brightness value of the corresponding pixel column of the curve C20, while in the other regions of the image F1 the brightness of each pixel column is higher than the background brightness of the corresponding pixel column of the curve C20.

By comparing the curve C30 with the curve C20, the processing unit 13 thus obtains the imaging range of the object images I21 and I21' in the image F1, i.e., their left boundary LB and right boundary RB in the image F1. The processing unit 13 then defines the image area (e.g., the image area IA of FIG. 2A) according to the bright region BR, the left boundary LB, and the right boundary RB of the image F1. Finally, the processing unit 13 performs the brightness analysis on the image area IA to judge whether the pointer located in the touch sensing region TR touches the touch surface 110.
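The boundary extraction just described reduces to finding where the image's brightness curve C30 dips below the threshold curve C20. A sketch, with names assumed by this example:

```python
import numpy as np

def object_boundaries(c30, c20):
    """Return (LB, RB): first and last columns where C30 falls below C20,
    or None if no object image darkens the frame."""
    dark = np.flatnonzero(np.asarray(c30) < np.asarray(c20))
    if dark.size == 0:
        return None
    return dark[0], dark[-1]
```

The span [LB, RB], intersected with the bright region BR, is the image area IA that the brightness-variation tests then analyze.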
It is worth noting that the touch surface 110 may also be non-reflective. As shown in FIG. 4A and FIG. 4B, the image captured by the image sensing device 12 across the touch surface 110 then covers only the area enclosed by the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150, and contains no mirror image produced by reflection from the touch surface 110. FIG. 4A and FIG. 4B are schematic diagrams of two-dimensional images with object images captured by the image sensing device according to another embodiment of the present invention.
As shown in FIG. 4A, when the pointer (e.g., the finger 21) approaches the touch surface 110 of the panel 11, the image F3 captured by the image sensing device 12 contains only the object image I41, formed by the fingertip or finger pad of the finger 21 blocking the reflective mirror 130, the first reflecting unit 140, or the second reflecting unit 150, together with the bright region BR and the background region DR. The height of the bright region BR is determined by the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150. The background region DR is the background area beyond the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150.

As shown in FIG. 4B, when the finger 21 contacts the touch surface 110, the object image I41 corresponding to the fingertip or finger pad of the finger 21 in the image F4 captured by the image sensing device 12 covers the last row of pixels in the image area IA' of the image F4. Upon receiving the image F3 or F4, the processing unit 13 defines the image area IA' (i.e., its left and right boundaries) according to the bright region BR of the image F3 or F4, performs the brightness-variation analysis within the image area IA', and judges from the result whether the pointer is touching or hovering above the touch surface 110.
The manner in which the optical touch control system 1 computes the position of a pointer may vary with its actual structure (e.g., the number of image sensing devices and the arrangement of the reflective mirror 130) and its mode of operation, and such techniques are well known to those of ordinary skill in the art. Therefore, only one way in which the processing unit 13 computes the touch coordinate of a pointer in the touch sensing region TR relative to the touch surface 110 is summarized below.

Please refer to FIG. 5A and FIG. 5B together with FIG. 1. FIG. 5A is an operation schematic diagram of the optical touch control system according to an embodiment of the present invention. FIG. 5B is a schematic diagram of a partial image captured by the image sensing device of FIG. 5A.
The first reflecting unit 140 is mirrored by the reflective mirror 130 as a first reflecting unit mirror image 140'. The second reflecting unit 150 is mirrored by the reflective mirror 130 as a second reflecting unit mirror image 150'. The light-emitting element 120 is mirrored by the reflective mirror 130 as a light-emitting element mirror image 120'. The light-emitting element 120, the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150 jointly define a real image space RS. The light-emitting element mirror image 120', the reflective mirror 130, the first reflecting unit mirror image 140', and the second reflecting unit mirror image 150' jointly define a virtual image space IS. The touch point TP represents, in the real image space RS, the position at which the pointer touches the touch surface 110, and the touch point TP' represents the corresponding touch position in the virtual image space IS.

The horizontal field of view VA of the image sensing device 12 extends across the touch surface 110 and covers at least the real image space RS and the virtual image space IS. The image sensing device 12 captures the real image space RS, the virtual image space IS, and the pointer (e.g., the finger 21 of FIG. 1) located in the touch sensing region TR of the real image space RS, and produces the image F5. As shown in FIG. 5B, the image F5 contains the object image ITP, formed by the fingertip or finger pad of the finger 21 blocking part of the light emitted or reflected by the light-emitting element 120, the reflective mirror 130, the first reflecting unit 140, the second reflecting unit 150, and/or the touch surface 110, together with the object image ITP' mirrored by the reflective mirror 130.

Specifically, when any pointer, such as the finger 21, enters the touch sensing region TR of the real image space RS, the image sensing device 12 captures the image of the pointer along a first sensing path SL, and the pointer forms the object image ITP in the image F5. At the same time, the image sensing device 12 captures, along a second sensing path SL', the mirror image of the pointer produced by the reflective mirror 130 in the virtual image space IS, and this mirror image forms the object image ITP' in the image F5.
The memory unit 14 may store in advance, by means of a reference image FREF (not shown), the relative relationship between the one-dimensional coordinate position of an object image (not shown) and the angle between the corresponding sensing path and the second reflecting unit 150. The memory unit 14 may also store in advance the distance D1 between the second edge 113 and the fourth edge 117 of the touch surface 110.

The processing unit 13 obtains a first angle A1 and a second angle A2 from the one-dimensional positions of the object images ITP and ITP' (i.e., of the touch points TP and TP') in the image F5. Using trigonometric relations, the processing unit 13 then computes the two-dimensional plane coordinate of the touch point of the pointer in the touch sensing region TR, thereby obtaining the touch coordinate of the pointer relative to the touch surface 110.
In more detail, referring again to FIG. 1 together with FIG. 5A, a rectangular coordinate system may be defined on the touch surface 110, with the position of the image sensing device 12 as its origin, the second edge 113 of the touch surface 110 as its X-axis, and the first edge 111 as its Y-axis. The touch point TP of the pointer is then expressed in this rectangular coordinate system as (D2, D3), where D2 is the distance of the touch point TP from the first edge 111 and D3 is the distance of the touch point TP from the second edge 113.
Accordingly, the processing unit 13 first obtains, by trigonometry, the first angle A1 between the first sensing path SL and the second edge 113 of the touch surface 110, and the second angle A2 between the second sensing path SL' and the second edge 113 of the touch surface 110. The processing unit 13 then computes D2 using the following equation (1):

D2 = 2H / (tan A1 + tan A2) ... (1)

where D2 is the distance between the touch point TP and the first edge 111; H is the height of the reflective mirror 130; A1 is the first angle between the first sensing path SL and the second edge 113 of the touch surface 110; and A2 is the second angle between the second sensing path SL' and the second edge 113 of the touch surface 110.

Next, the processing unit 13 computes D2·tanA1 to obtain the Y-axis coordinate of the touch point TP. The two-dimensional coordinate of the touch point TP is (D2, D2·tanA1).
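The triangulation can be checked numerically with a short sketch. Note that the body of equation (1) is not legible in this excerpt; the form D2 = 2H/(tan A1 + tan A2) used here is the standard mirror-based reconstruction consistent with the surrounding definitions (the second sensing path reflects off the mirror of height H, and D3 = D2·tan A1), so treat it as an inferred formula rather than the patent's verbatim equation.

```python
import math

def touch_point(h, a1, a2):
    """Return (D2, D3) from mirror height h and sensing-path angles a1, a2 (radians)."""
    d2 = 2.0 * h / (math.tan(a1) + math.tan(a2))  # equation (1), as reconstructed
    d3 = d2 * math.tan(a1)                        # Y-axis coordinate D2*tanA1
    return d2, d3
```

For a point at (D2, D3) = (20, 5) with mirror height H = 10, the direct path gives tan A1 = 5/20 and the mirrored path gives tan A2 = (2·10 − 5)/20, and the sketch recovers (20, 5).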
It is particularly noted that, in practical implementation, the image sensing device 12 may further include a lens or lens module for adjusting the horizontal field of view VA of the image sensing device 12, so that the image sensing device 12 can fully capture the images of the real image space RS and the virtual image space IS.
In addition, if the object image ITP of the pointer in the real image space RS overlaps the object image ITP' in the virtual image space IS (i.e., the mirror image of the object image ITP), the processing unit 13 may first obtain separate images of the object image ITP of the real image space RS and the object image ITP' of the virtual image space IS before performing the touch coordinate computation. There are many ways to obtain such separate images, all applicable to the present invention, including but not limited to: changing the illumination mode and light structure of the optical touch control system so that the captured image contains no mirror image (i.e., no object image ITP'); isolating the light that produces the mirror image so that only the light of the object image ITP of the real image space RS is projected onto the image sensing device 12; or shortening the height of the object image ITP' of the virtual image space IS and taking the portion of the image that does not overlap the object image ITP as the separate image of the object image ITP of the real image space RS.
It should be understood that, as mentioned above, the manner in which an optical touch system computes or obtains a touch point is prior art, and those of ordinary skill in the art may also use trigonometric functions together with other position calculations — for example, the calculation disclosed in Taiwan patent application No. 098120274 (corresponding to U.S. Patent No. US8269158 B2) — to obtain the two-dimensional coordinate of the touch point TP. The above manner of computing the two-dimensional coordinate of the touch point TP is merely exemplary, and the present invention is not limited thereto.
In addition, in this embodiment, the processing unit 13 may be implemented in program code on a processing chip such as a microcontroller or an embedded controller, but this embodiment is not so limited. The memory unit 14 may be implemented with a volatile or non-volatile memory chip such as a flash memory chip, a ROM chip, or a RAM chip, but this embodiment is not so limited. The transmission unit 15 may transmit the touch coordinate information to the display device 16 by wired transmission or wireless transmission (e.g., Bluetooth), but this embodiment is not so limited.
In other embodiments, if the light-emitting element 120 of the optical touch control system 1 is replaced with a passive light source, such as a reflective mirror, at least one illuminator may additionally be provided around the touch surface 110 (e.g., at the junction of the first edge 111 and the second edge 113), so that the whole touch surface 110 is illuminated by the light reflected from the passive element in place of the light-emitting element 120 and from the reflective mirror 130.

In another embodiment, the light-emitting element 120 may be attached to the image sensing device 12. For example, the light-emitting element 120 may be combined with the image sensing device 12 by sticking, screwing, or fastening, so as to be fixed on the image sensing device 12.

In yet another embodiment, the optical touch control system 1 may have no light-emitting element 120, and the image sensing device 12 may instead be equipped with a lighting device (e.g., an infrared lighting device with an infrared light-emitting diode). The image sensing device 12 may further be provided with an infrared filter module (e.g., an IR-pass filter), so that the image sensing device 12 captures the image of the touch surface 110 through the infrared filter module.
In addition, in this embodiment the panel 11 and the display device 16 of the optical touch control system 1 are separate elements, but in other embodiments the panel 11 may be combined with the display screen of the display device 16.

For example, when the panel 11 is a touch screen (e.g., a transparent touch screen), the display screen of the display device 16 may serve as the panel 11, and the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150 may be correspondingly arranged on the display screen of the display device 16.
In FIG. 1, the panel 11 is rectangular, and the light-emitting element 120, the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150 are arranged on the four mutually perpendicular sides of the panel 11. In other embodiments, however, the panel 11 may take other geometric shapes, such as square or circular, with the light-emitting element 120, the reflective mirror 130, the first reflecting unit 140, and the second reflecting unit 150 arranged on the panel 11 accordingly.

It is worth noting that the types, physical structures, and/or implementations of the panel 11, the image sensing device 12, the light-emitting element 120, the reflective mirror 130, the first reflecting unit 140, the second reflecting unit 150, the processing unit 13, the memory unit 14, the transmission unit 15, and the display device 16 are set according to the type, physical structure, and/or implementation of the optical touch control system 1, and the present invention is not limited in this respect.
(Embodiment of the object analysis method of the optical touch control system)
From the above embodiments, the present invention can further summarize an object analysis method applicable to the optical touch control system of the embodiment of FIG. 1. The object analysis method judges, from the brightness variation of the image of a pointer detected by the image sensing device, whether the pointer is touching the touch surface of the optical touch control system or hovering above the touch surface.

Please refer to FIG. 6 together with FIG. 1, FIG. 2A, and FIG. 2B. FIG. 6 is a flowchart of the object analysis method of the optical touch control system according to an embodiment of the present invention. When the optical touch control system 1 operates, the processing unit 13 drives the image sensing device 12 at a default image capture frequency to capture a plurality of images across the touch surface 110 of the panel 11, in order to detect whether a pointer approaches. The image capture frequency may be set according to the practical operation and working environment of the optical touch control system 1 (e.g., the ambient brightness around the optical touch control system); this embodiment is not so limited.
In step S601, the processing unit 13 drives the image sensing device 12 to capture a first image across the touch surface 110 of the panel 11, the first image having an object image corresponding to a pointer (e.g., a finger), such as the image F1 of FIG. 2A or the image F2 of FIG. 2B. The processing unit 13 stores the image data of the first image in the memory unit 14.

Next, in step S603, the processing unit 13 defines, in the first image, the left boundary LB and the right boundary RB of the object image of the pointer according to the background brightness curve data of the previously captured background image, so as to define in the first image the image area IA corresponding to the object image.

In detail, the processing unit 13 first defines the bright region BR in the first image according to the background image. The processing unit 13 then converts the first image into brightness curve data: as described above, it computes the pixel value sum of the pixels of each pixel column within the bright region of the first image, i.e., the brightness sum of each pixel column within the bright region BR of the first image, thereby producing the brightness curve data corresponding to the first image. The processing unit 13 then compares the background brightness curve data with this brightness curve data to define the left boundary LB and the right boundary RB of the object image in the first image.
Then, in step S605, the processing unit 13 judges, according to the brightness differences among the pixel values in the image area IA, whether the pointer touches the touch surface 110 or hovers above it. When the processing unit 13 judges that the pointer touches the touch surface 110, step S607 is performed. Conversely, when the processing unit 13 judges that the pointer hovers above the touch surface 110, step S609 is performed.

Specifically, when the processing unit 13 finds that the brightness variation of the image area IA is large, the processing unit 13 judges that the pointer is hovering above the touch surface 110 and performs step S609. When the processing unit 13 finds that the brightness of the image area IA is evenly distributed, i.e., its variation is small, the processing unit 13 judges that the pointer is touching the touch surface 110 and performs step S607.
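The decision step S605 with its two outcomes can be condensed to a few lines. The variance test and its threshold below are illustrative stand-ins for whichever of the embodiments described earlier is actually used; the names are assumptions of this sketch.

```python
import numpy as np

def analyze_object(image_area, preset_variation=100.0):
    """Return 'hovering' (branch to S609) or 'touching' (branch to S607)."""
    rows = np.asarray(image_area, dtype=float)
    uneven = rows.var(axis=1).max() > preset_variation  # large variation => hover
    return "hovering" if uneven else "touching"
```

A "hovering" result suppresses the coordinate computation of step S607, matching the behavior of step S609.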
In step S607, if the processing unit 13 judges that the pointer is touching the touch surface 110, the processing unit 13 computes the touch coordinate of the pointer relative to the touch surface 110 according to the imaging position of the pointer in the first image (i.e., the imaging position of the object image I21) and the imaging position, in the first image, of the mirror image of the pointer produced by the reflective mirror 130. The processing unit 13 then transmits the relevant information of the touch coordinate (including the resolution of the touch surface 110) to the display device 16 through the transmission unit 15, so as to control the action of the cursor 161 on the display screen of the display device 16, such as a moving action on the display screen.

In step S609, if the processing unit 13 judges that the pointer is hovering above the touch surface 110, the processing unit 13 does not compute the touch coordinate of the pointer relative to the touch surface 110.
Specific embodiments of how processing unit 13 determines whether the pointer is hovering above or touching touch surface 110 are described further below.
In one embodiment, processing unit 13 may divide the pixels in the image region into groups according to brightness, and then determine whether the pointer is hovering above or touching touch surface 110 by comparing the brightness differences between the different brightness groups.
Please refer to Fig. 7 together with Fig. 1, Fig. 2A and Fig. 2B. Fig. 7 shows a flow chart of an object analysis method of an optical touch system provided by another embodiment of the present invention. The steps of Fig. 7 may be performed when processing unit 13 performs step S605 of Fig. 6.
In step S701, processing unit 13 defines a first pixel group and a second pixel group in the image region IA of the captured first image. The first pixel group and the second pixel group are used by processing unit 13 to analyze the brightness difference of image region IA. More specifically, the first pixel group includes at least one high-brightness pixel, and the second pixel group includes at least one low-brightness pixel. A high-brightness pixel is a pixel in image region IA whose pixel value is greater than a predetermined threshold. A low-brightness pixel is a pixel in image region IA whose pixel value is less than the predetermined threshold. The predetermined threshold may be set according to the average brightness of the bright region BR in the image, for example 75%–90% of the average brightness value of the bright region BR, but the present embodiment is not limited thereto.
In step S703, processing unit 13 calculates the average pixel value of the first pixel group and the average pixel value of the second pixel group according to the first pixel group and the second pixel group in image region IA of the first image. Processing unit 13 may average the pixel values of the pixels in the first pixel group to obtain the average pixel value of the first pixel group, and average the pixel values of the pixels in the second pixel group to obtain the average pixel value of the second pixel group.
In step S705, processing unit 13 determines whether the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is greater than a predetermined pixel ratio (e.g., about 1). Specifically, processing unit 13 may first calculate the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group, and then compare this ratio with the predetermined pixel ratio.
If processing unit 13 finds that the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is greater than the predetermined pixel ratio (e.g., 1), step S707 is performed. Conversely, if the ratio is less than or equal to the predetermined pixel ratio (e.g., about 1), step S709 is performed.
In step S707, when the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is greater than the predetermined pixel ratio (e.g., 1), the brightness distribution of image region IA of the first image is uneven, i.e., the brightness difference is large, so processing unit 13 determines that the pointer is not in contact with touch surface 110 but is hovering above touch surface 110.
Referring again to Fig. 2A, the following description takes Fig. 2A as the first image. When the pointer hovers above touch surface 110, the brightness distribution within image region IA is uneven, and the brightness difference between the pixel having the maximum pixel value and the pixel having the minimum pixel value is large.
In step S709, when the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is less than or equal to the predetermined pixel ratio (e.g., 1), the brightness distribution of image region IA of the first image is even and the brightness difference is small, so processing unit 13 determines that the pointer is contacting or touching touch surface 110. Processing unit 13 then calculates the touch coordinates of the pointer in optical touch system 1 according to the imaging positions of the pointer and its mirror image in the image, so as to control the action of cursor 161 on the display screen of display device 16.
For example, referring again to Fig. 2B, when the pointer is contacting or touching touch surface 110, the brightness distribution within image region IA of the first image is substantially uniform, and the brightness difference between the pixel having the maximum pixel value and the pixel having the minimum pixel value is very small, or even approaches zero.
Accordingly, processing unit 13 can determine whether the pointer is touching touch surface 110 or hovering above touch surface 110 according to the brightness variation between the first pixel group and the second pixel group.
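The grouping test of steps S701–S709 can be sketched in a few lines. This is an illustrative reconstruction only, not code disclosed by the patent: the function name, the choice of 85% of the bright-region average as the threshold (a point within the 75%–90% range given above), and the representation of the image region as a list of pixel columns are all assumptions.

```python
def hover_by_pixel_groups(region, bright_avg, ratio_limit=1.0):
    """Steps S701-S709: split the pixels of the image region into a
    high-brightness and a low-brightness group, then compare the ratio
    of their average pixel values against a predetermined pixel ratio."""
    threshold = 0.85 * bright_avg  # assumed point in the 75%-90% range
    high = [p for col in region for p in col if p > threshold]
    low = [p for col in region for p in col if p < threshold]
    if not high or not low:
        return False  # uniformly bright region: treated as touching
    ratio = (sum(high) / len(high)) / (sum(low) / len(low))
    return ratio > ratio_limit  # True: hovering; False: touching
```

A hovering pointer leaves part of the region bright and part shadowed, driving the ratio well above 1; a touching pointer leaves the region uniformly bright, so the low-brightness group is empty and the region is treated as touched.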
In another embodiment, processing unit 13 may determine whether the pointer is hovering above or touching touch surface 110 by analyzing the pixel value variation of the pixels in each pixel column within image region IA of the first image. Please refer to Fig. 8 and Fig. 9 together with Fig. 1 and Fig. 2B. Fig. 8 shows a flow chart of an object analysis method of an optical touch system provided by another embodiment of the present invention. Fig. 9 shows a schematic diagram of a two-dimensional image with an object image captured by the image sensing device according to an embodiment of the present invention. The steps of Fig. 8 may be performed when processing unit 13 performs step S605 of Fig. 6.
In step S801, processing unit 13 calculates, according to the captured image F6, the pixel ratio between a maximum pixel value and a minimum pixel value of each pixel column within image region IA of the first image. Processing unit 13 may sequentially calculate, from the left boundary LB to the right boundary RB of image region IA of the first image, the pixel ratio between the maximum pixel value and the minimum pixel value in each pixel column.
In step S803, processing unit 13 determines whether the pixel ratio of the pixel column having the maximum pixel ratio within image region IA is greater than the aforementioned predetermined pixel ratio (e.g., about 1). When processing unit 13 determines that the pixel ratio of the pixel column having the maximum pixel ratio is greater than the predetermined pixel ratio (e.g., 1), step S805 is performed. Conversely, when processing unit 13 determines that this pixel ratio is less than or equal to the predetermined pixel ratio (e.g., about 1), step S807 is performed.
In step S805, since the brightness difference of the pixel column having the maximum pixel ratio is excessive, processing unit 13 determines that the pointer is hovering above touch surface 110. In step S807, since the brightness difference of the pixel column having the maximum pixel ratio is small, processing unit 13 determines that the pointer is touching touch surface 110.
For example, as shown in Fig. 9, because the area masked by the pointer in object images I21, I21' within image region IA of image F6 is small, the brightness difference within image region IA of image F6 is significantly large, so the pixel ratio of the pixel column having the maximum pixel ratio exceeds the predetermined pixel ratio, and processing unit 13 therefore determines that the pointer is hovering above touch surface 110.
As another example, if the first image is image F2 shown in Fig. 2B, the area masked by the pointer in object images I21, I21' within image region IA of image F2 is large, so the brightness difference of image region IA is small. Accordingly, the pixel ratio of the pixel column having the maximum pixel ratio within image region IA is less than or equal to the predetermined pixel ratio, and processing unit 13 therefore determines that the pointer is touching touch surface 110.
Processing unit 13 further calculates the touch coordinates of the pointer in optical touch system 1 according to the imaging positions of the pointer and its mirror image in the image, so as to control the action of cursor 161 on the display screen of display device 16.
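The per-column ratio test of steps S801–S807 can be sketched as follows; as before, this is an assumed reconstruction (the image region is represented as a list of pixel columns, and the guard against a zero minimum is an added safety measure the patent does not mention).

```python
def hover_by_column_ratio(region, ratio_limit=1.0):
    """Steps S801-S807: compute the max/min pixel ratio of every pixel
    column; the column with the LARGEST ratio decides the result."""
    max_ratio = max(max(col) / max(min(col), 1) for col in region)
    return max_ratio > ratio_limit  # True: hovering; False: touching
```

A column crossing the shadow of a hovering pointer mixes bright and dark pixels, so its max/min ratio shoots past 1; when the pointer touches, every column is uniformly shadowed or uniformly bright and no ratio exceeds the limit.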
In still another embodiment, processing unit 13 may determine whether the pointer is hovering above or touching touch surface 110 by analyzing the pixel value difference between the maximum pixel value and the minimum pixel value in each pixel column within image region IA. Please refer to Fig. 10 together with Fig. 9 and Fig. 2B. Fig. 10 shows a flow chart of an object analysis method of an optical touch system provided by another embodiment of the present invention. The steps of Fig. 10 may be performed when processing unit 13 performs step S605 of Fig. 6.
In step S1001, processing unit 13 may calculate, according to the captured first image, the pixel value difference between the maximum pixel value and the minimum pixel value of each pixel column within image region IA of the first image. Processing unit 13 may sequentially calculate, from the left boundary LB to the right boundary RB of the image region of the first image, the pixel value difference (i.e., the brightness value difference) between the maximum pixel value and the minimum pixel value in each pixel column.
In step S1003, processing unit 13 determines whether the pixel value difference of the pixel column having the minimum pixel difference within image region IA of the first image is greater than a predetermined pixel difference (e.g., zero). When processing unit 13 determines that the pixel value difference of the pixel column having the minimum pixel difference is greater than the predetermined pixel difference (e.g., zero), step S1005 is performed. Conversely, when processing unit 13 determines that this pixel value difference is less than or equal to the predetermined pixel difference (e.g., zero), step S1007 is performed.
In step S1005, since the brightness difference of the pixel column having the minimum pixel difference is excessive, processing unit 13 determines that the pointer is hovering above touch surface 110. In step S1007, since the brightness difference of the pixel column having the minimum pixel difference is small, processing unit 13 determines that the pointer is touching touch surface 110.
For example, taking image F6 shown in Fig. 9 as the first image, because the area masked by the pointer in object images I21, I21' within image region IA of image F6 is small, the brightness difference within any pixel column of image region IA of image F6 is large, so the pixel value difference of the pixel column having the minimum pixel difference exceeds the predetermined pixel difference, and processing unit 13 therefore determines that the pointer is hovering above touch surface 110.
As another example, if the first image is image F2 shown in Fig. 2B, the area masked by the pointer in object images I21, I21' within image region IA of image F2 is large, so the brightness difference within any pixel column of image region IA is small. Accordingly, the pixel value difference of the pixel column having the minimum pixel difference within image region IA is less than the predetermined pixel difference, and processing unit 13 therefore determines that the pointer is touching touch surface 110.
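The difference test of steps S1001–S1007 is the additive counterpart of the ratio test; a hedged sketch under the same assumed column-list representation:

```python
def hover_by_column_difference(region, diff_limit=0):
    """Steps S1001-S1007: compute the max-min pixel value difference of
    every pixel column; the column with the SMALLEST difference decides.
    A difference above the predetermined pixel difference (e.g. zero)
    even in that column means the pointer is hovering."""
    min_diff = min(max(col) - min(col) for col in region)
    return min_diff > diff_limit  # True: hovering; False: touching
```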
In yet another embodiment, processing unit 13 may determine whether the pointer is hovering above or touching touch surface 110 by analyzing the average pixel value (i.e., the average brightness value) of each pixel column within the image region. Please refer to Fig. 11 together with Fig. 9 and Fig. 2B. Fig. 11 shows a flow chart of an object analysis method of an optical touch system provided by another embodiment of the present invention. The steps of Fig. 11 may be performed when processing unit 13 performs step S605 of Fig. 6.
In step S1101, processing unit 13 may calculate, according to the captured first image (image F6 of Fig. 9), the average pixel value of each pixel column within image region IA of the first image. Processing unit 13 may sequentially calculate the average pixel value of each pixel column from the left boundary LB to the right boundary RB of the image region of the first image.
In step S1103, processing unit 13 determines whether the average pixel value of the pixel column having the minimum average pixel value within image region IA of the first image is greater than a default average pixel value (e.g., the average pixel value of a pixel column within the bright region of the background image). When processing unit 13 determines that the average pixel value of the pixel column having the minimum average pixel value is greater than the default average pixel value, step S1105 is performed. Conversely, when processing unit 13 determines that this average pixel value is less than or equal to the default average pixel value, step S1107 is performed.
In step S1105, since the average pixel value of the pixel column having the minimum average pixel value is large, indicating that the brightness of image region IA is excessive, processing unit 13 determines that the pointer is hovering above touch surface 110. In step S1107, since the average pixel value of the pixel column having the minimum average pixel value is small, indicating that the brightness of the image region is low, processing unit 13 determines that the pointer is touching touch surface 110.
For example, if the first image is image F6 shown in Fig. 9, because the area masked by the pointer in object images I21, I21' within image region IA of image F6 is small, the average pixel value of any pixel column within image region IA of image F6 is large, so the average pixel value of the pixel column having the minimum average pixel value exceeds the default average pixel value, and processing unit 13 therefore determines that the pointer is hovering above touch surface 110.
As another example, if the first image is image F2 shown in Fig. 2B, the area masked by the pointer in object images I21, I21' within image region IA of image F2 is large, so the average pixel value of any pixel column within image region IA is small. The average pixel value of the pixel column having the minimum average pixel value within image region IA is then less than the default average pixel value, and processing unit 13 therefore determines that the pointer is touching touch surface 110.
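The average-value test of steps S1101–S1107 reduces to a one-line comparison; this sketch assumes, as the text suggests, that the default average pixel value is derived from a bright-region column of the background image, and the column-list representation remains an assumption.

```python
def hover_by_column_average(region, default_avg):
    """Steps S1101-S1107: compute the average pixel value of every pixel
    column; if even the column with the SMALLEST average stays above the
    default average pixel value, too little light is blocked and the
    pointer is hovering."""
    min_avg = min(sum(col) / len(col) for col in region)
    return min_avg > default_avg  # True: hovering; False: touching
```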
The predetermined pixel ratio, the predetermined pixel difference and the default average pixel value may be designed in advance into processing unit 13 as firmware according to practical application requirements. They may also be stored in advance in storage unit 14, and read and applied by processing unit 13 at run time.
In addition, the above predetermined pixel ratio, predetermined pixel difference and default average pixel value may be set according to the practical operating requirements of the optical touch system, such as the sensitivity for sensing a pointer touching the touch surface, the brightness produced by the light-emitting element, the sensitivity and noise intensity of the image sensing device, and the ambient brightness around optical touch system 1. A person having ordinary skill in the art can, according to the above description, set an appropriate predetermined pixel ratio, predetermined pixel difference and default average pixel value for each case, so as to accurately define the image region range of detection. Therefore, the present invention does not limit the specific settings and implementations of the above predetermined pixel ratio, predetermined pixel difference, default average pixel value and the like.
The object analysis method of Fig. 6 and the image region analysis methods of Fig. 7, Fig. 8, Fig. 10 and Fig. 11 may be implemented by writing firmware into the processing chip of processing unit 13, so that processing unit 13 performs the calculation methods described in Fig. 6, Fig. 7, Fig. 8 and Fig. 10 at run time; the present invention is not limited thereto. In practical applications, processing unit 13 may also sequentially perform the pointer touch state determination methods described in Fig. 7, Fig. 8, Fig. 10 and Fig. 11 when performing step S605 of Fig. 6. It should be noted that Fig. 6, Fig. 7, Fig. 8, Fig. 10 and Fig. 11 merely illustrate embodiments of the object analysis method provided by the present embodiment, and are not intended to limit the present invention.
The present embodiment further provides a method of defining the bright region and generating the background brightness curve data. Please refer to Fig. 12 together with Fig. 1, Fig. 3A and Fig. 3B. Fig. 12 shows a flow chart of a method of defining the bright region in a background image according to an embodiment of the present invention.
In step S1201, before the pointer approaches the touch sensing area TR on touch surface 110 of panel 11 (for example, when optical touch system 1 has just started), image sensing device 12 is driven to capture a background image FB across touch surface 110. The background image FB includes a background region DR and a bright region BR.
The longitudinal height of the bright region BR is determined by touch surface 110, mirror 130, first reflecting unit 140 and second reflecting unit 150. The background region DR is the background area beyond touch surface 110, mirror 130, first reflecting unit 140 and second reflecting unit 150 (including the real-image and mirror-image areas of touch sensing area TR).
In step S1203, each pixel value in each pixel column of background image FB is compared with a predetermined pixel value. The predetermined pixel value may, as in the foregoing embodiments, be set according to the average brightness value of background image FB and a default weight value (e.g., 1.2). The default weight value is set according to the practical operating requirements of optical touch system 1 (such as the image sensing capability of the image sensing device or the ambient brightness around the optical touch system). Alternatively, as described in the foregoing embodiments, a predetermined pixel value corresponding to each pixel column may be set according to the average brightness value of that pixel column in background image FB and a default weight value (e.g., 1.2); the present embodiment is not limited thereto.
In step S1205, a strong-light upper boundary H_UB and a strong-light lower boundary H_LB are defined for each pixel column in background image FB, wherein at least the pixel values between strong-light upper boundary H_UB and strong-light lower boundary H_LB are greater than the above predetermined pixel value.
Specifically, processing unit 13 may compare the pixel value of each pixel in each pixel column of background image FB with the predetermined pixel value, and define strong-light upper boundary H_UB and strong-light lower boundary H_LB according to the dense block, in each pixel column, where the pixels whose brightness exceeds the predetermined pixel value are located.
In step S1207, processing unit 13 defines a bright region in background image FB according to strong-light upper boundary H_UB and strong-light lower boundary H_LB.
Next, in step S1209, processing unit 13 calculates the brightness sum of the pixel values of each pixel column within bright region BR of background image FB, to generate the background brightness curve data. Specifically, processing unit 13 may add up the pixel values of the pixels of each pixel column within bright region BR of background image FB as the brightness value of that pixel column. Processing unit 13 then stores the background brightness curve data in storage unit 14.
In another embodiment, processing unit 13 may also, as described above, further generate a background brightness curve with lower brightness according to a preset percentage (e.g., 80%), to provide a brightness tolerance. Processing unit 13 then generates the background brightness curve data correspondingly from the calculated background brightness curve.
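A simplified sketch of the Fig. 12 flow follows. It thresholds individual pixels instead of locating the contiguous strong-light block between H_UB and H_LB, so it is only an approximation of the method described above; the weight of 1.2 and the 80% tolerance are the example values from the text, and the column-list image representation is an assumption.

```python
def background_curve(background, weight=1.2, tolerance=0.8):
    """Steps S1203-S1209 (simplified): find strong-light pixels in each
    column of the background image, sum them into a per-column brightness
    value, and scale by a tolerance percentage (e.g. 80%)."""
    flat = [p for col in background for p in col]
    threshold = weight * sum(flat) / len(flat)  # avg brightness x weight
    return [tolerance * sum(p for p in col if p > threshold)
            for col in background]
```

The resulting list plays the role of the background brightness curve data that later frames are compared against when locating the object image boundaries.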
The bright-region and background brightness data generation method of Fig. 12 may also be implemented by writing firmware into the processing chip of processing unit 13. Fig. 12 merely illustrates one embodiment of the bright-region and background brightness data generation method provided by the present embodiment, and is not intended to limit the present invention.
(Another embodiment of the optical touch system)
Please refer to Fig. 13, Fig. 14A and Fig. 14B. Fig. 13 shows a system structure diagram of an optical touch system provided by another embodiment of the present invention. Fig. 14A and Fig. 14B are schematic diagrams of images with object images captured by the image sensing devices provided by another embodiment of the present invention.
Optical touch system 3 of Fig. 13 differs from optical touch system 1 of Fig. 1 in that optical touch system 3 includes two image sensing devices, namely first image sensing device 12a and second image sensing device 12b, so as to avoid misjudgment caused by a blind spot of a single image sensing device resulting from its installation position or the installation position of the light-emitting element. In addition, mirror 130 of optical touch system 1 is replaced in Fig. 13 by a third reflecting unit 310. The area enclosed by touch surface 110, light-emitting element 120, first reflecting unit 140, second reflecting unit 150 and third reflecting unit 310 is the touch sensing area TR of optical touch system 3. The touch sensing area TR has a height H, and height H is set according to the practical structure and operational requirements of optical touch system 3.
Furthermore, first image sensing device 12a is disposed at the first corner of touch surface 110 where first edge 111 intersects second edge 113. Second image sensing device 12b is disposed at the second corner of touch surface 110 where first edge 111 intersects fourth edge 117. First image sensing device 12a and second image sensing device 12b are disposed at different positions on panel 11 and have overlapping image sensing ranges, thereby improving the touch recognition rate of optical touch system 3.
First image sensing device 12a and second image sensing device 12b each capture an image across touch surface 110, which may or may not include touch surface 110 itself. The vertical fields of view of first image sensing device 12a and second image sensing device 12b are preferably greater than height H of touch sensing area TR, so as to capture the image of the pointer completely.
In the present embodiment, touch surface 110 is not a mirror surface and does not produce a reflected image, so the images captured by first image sensing device 12a and second image sensing device 12b across touch surface 110 only cover the area enclosed by first reflecting unit 140, second reflecting unit 150 and third reflecting unit 310, without any image produced by mirror reflection from touch surface 110.
In short, processing unit 13 may drive first image sensing device 12a and second image sensing device 12b respectively to capture a plurality of images across touch surface 110 according to a default image capture frequency. The plurality of images are used by processing unit 13 to sense whether a pointer, such as finger 21 or a stylus of user 2, enters touch sensing area TR, and to sense the associated operation of the pointer within touch sensing area TR.
In detail, first image sensing device 12a may capture a first image F7 containing an object image I21a formed by the pointer (e.g., the fingertip and finger pad of finger 21) blocking light. Second image sensing device 12b may capture a second image F8 containing an object image I21b formed by the pointer (e.g., the fingertip and finger pad of finger 21) blocking light. First image F7 and second image F8 each include a bright region BR and a background region DR, wherein the way of defining bright region BR is specified in the foregoing embodiments and is not repeated here.
Processing unit 13 may determine whether the pointer is touching or hovering above touch surface 110 according to the sensing results of first image sensing device 12a and second image sensing device 12b (i.e., first image F7 and second image F8).
When the sensing results of first image sensing device 12a and second image sensing device 12b both show that the pointer is touching touch surface 110, processing unit 13 determines that the pointer is touching touch surface 110. When the sensing result of either first image sensing device 12a or second image sensing device 12b shows that the pointer is hovering above touch surface 110, processing unit 13 determines that the pointer is hovering above touch surface 110.
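The two-sensor decision rule can be sketched as follows; the per-region uniformity check inside it reuses the minimum column difference test of Fig. 10 purely for illustration, and the function name and column-list representation are assumptions.

```python
def is_touching(region_a, region_b, diff_limit=0):
    """Fig. 15, steps S1509-S1513: the pointer counts as touching only
    when BOTH sensors' image regions are uniformly bright; a single
    hovering verdict is enough to suppress coordinate calculation."""
    def uniform(region):
        return min(max(col) - min(col) for col in region) <= diff_limit
    return uniform(region_a) and uniform(region_b)
```

Requiring agreement from both sensors is what guards against the blind-spot misjudgment described above: a pointer that only grazes one sensor's line of sight cannot register as a touch.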
When processing unit 13 determines that the pointer is touching touch surface 110, processing unit 13 calculates the touch coordinates of the pointer relative to touch surface 110 according to the imaging positions of the pointer (e.g., finger 21) in first image F7 and second image F8. Processing unit 13 then transmits the related data of the calculated touch coordinates to display device 16, to control the action of cursor 161 on display device 16.
In order to further understand the operation of optical touch system 3, the present invention provides an object analysis method applied to optical touch system 3. Please refer to Fig. 15 together with Fig. 13, Fig. 14A and Fig. 14B. Fig. 15 shows a flow chart of an object analysis method of an optical touch system provided by an embodiment of the present invention.
In step S1501, before the pointer enters touch sensing area TR (for example, when optical touch system 3 has just started or has not yet detected a pointer), processing unit 13 drives first image sensing device 12a and second image sensing device 12b respectively to capture a first background image FB1 (not shown) and a second background image FB2 (not shown) across touch surface 110 of panel 11.
In step S1503, processing unit 13 drives first image sensing device 12a and second image sensing device 12b respectively to capture first image F7 and second image F8 across touch surface 110. First image F7 has a first object image I21a corresponding to the pointer (e.g., the fingertip and finger pad of finger 21), and the second image has a second object image I21b corresponding to the pointer.
In step S1505, processing unit 13 defines, in first image F7, a first left boundary LB1 and a first right boundary RB1 corresponding to first object image I21a by comparing first image F7 with the corresponding first background image FB1, so as to define a first image region IA1 corresponding to first object image I21a in first image F7.
In step S1507, processing unit 13 defines, in second image F8, a second left boundary LB2 and a second right boundary RB2 corresponding to second object image I21b by comparing second image F8 with the corresponding second background image FB2, so as to define a second image region IA2 corresponding to second object image I21b in second image F8.
In step S1509, processing unit 13 determines, according to the brightness differences among the pixel values in first image region IA1 and second image region IA2 respectively, whether the pointer is touching touch surface 110 or hovering above touch surface 110.
In step S1511, if the brightness differences of first image region IA1 and second image region IA2 both show that the pointer is touching touch surface 110, processing unit 13 determines that the pointer is touching touch surface 110 and performs step S1515.
In step S1513, if the brightness difference of either first image region IA1 or second image region IA2 shows that the pointer is hovering above touch surface 110, processing unit 13 determines that the pointer is hovering above touch surface 110, and processing unit 13 does not calculate the touch coordinates of the pointer relative to touch surface 110.
In one embodiment, processing unit 13 may determine whether the pointer is touching or hovering above touch surface 110 according to the brightness difference between a first pixel group and a second pixel group in first image region IA1 and the brightness difference between a third pixel group and a fourth pixel group in second image region IA2.
Specifically, the first pixel group and the third pixel group each include at least one high-brightness pixel, and the second pixel group and the fourth pixel group each include at least one low-brightness pixel. A high-brightness pixel is a pixel in first image region IA1 or second image region IA2 whose pixel value is greater than a predetermined threshold. A low-brightness pixel is a pixel in first image region IA1 or second image region IA2 whose pixel value is less than the predetermined threshold.
In another embodiment, processing unit 13 may determine whether the brightness distribution in first image region IA1 or second image region IA2 is uniform, or whether its brightness difference is excessive, by calculating the pixel ratio or the pixel value difference between the maximum pixel value and the minimum pixel value of each pixel column in first image region IA1 or second image region IA2, or by calculating the average pixel value of each pixel column in first image region IA1 or second image region IA2.
When processing unit 13 judges the first imagery zone IA1 or the second imagery zone IA2 any possessed brightness not Uniformly or whether luminance difference is excessive (i.e. maximum excessive with minimum pixel value difference), that is, judges that indicant is suspended in touch surface 110。
When processing unit 13 while judge brightness uniformity possessed by the first imagery zone IA1 or the second imagery zone IA2, Or maximum has luminance difference smaller with minimum pixel value, you can it is touching touch surface 110 to judge indicant.
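The per-pixel-column uniformity check described above can be sketched as follows. This is a minimal illustration assuming 8-bit pixel values; the function name and the preset ratio are assumptions for the example:

```python
import numpy as np

def is_hovering(region: np.ndarray, preset_ratio: float = 2.0) -> bool:
    """Return True when the luminance inside the region is non-uniform.

    For each pixel column, the ratio between its maximum and minimum
    pixel value is computed; if the worst-case (largest) ratio exceeds
    `preset_ratio`, the luminance difference is deemed excessive and
    the pointer is judged to hover above the touch surface.
    """
    col_max = region.max(axis=0).astype(float)
    col_min = np.maximum(region.min(axis=0).astype(float), 1.0)  # avoid /0
    ratios = col_max / col_min
    return bool(ratios.max() > preset_ratio)

# Uniform shadow (touching): every column stays dark.
touch_region = np.array([[60, 55], [58, 52]])
# Bright gap under a hovering pointer: one column spans dark to bright.
hover_region = np.array([[200, 60], [45, 55]])
print(is_hovering(touch_region))  # False
print(is_hovering(hover_region))  # True
```

The pixel-difference and average-pixel-value variants mentioned above differ only in the per-column statistic compared against the preset value.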
In step S1515, the processing unit 13 calculates the touch coordinate of the pointer relative to the touch surface 110 according to the imaging position of the first object image I21a in the first image F7 and the imaging position of the second object image I21b in the second image F8. The processing unit 13 also transmits the data of the calculated touch coordinate to the display device 16 to control the action of the cursor 161 on the display device 16, for example a moving operation, a writing operation, or a selecting operation of the cursor 161.
The above touch coordinate can be calculated using trigonometric functions. The detailed calculation is similar to that described in the previous embodiments and belongs to the prior art, so it is not repeated here.
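For reference, the usual trigonometric triangulation for two image sensing devices mounted at adjacent corners of a rectangular touch surface works roughly as below. This is a generic sketch: the mapping from imaging position to viewing angle, the corner placement, and all names are assumptions for illustration, not the patent's exact formulas:

```python
from math import tan, radians

def touch_coordinate(theta1_deg: float, theta2_deg: float,
                     width: float) -> tuple[float, float]:
    """Triangulate a touch point on a panel of the given width.

    `theta1_deg` / `theta2_deg` are the viewing angles of the pointer
    as seen from the two sensors at corners (0, 0) and (width, 0),
    each measured from the top edge of the panel (these conventions
    are assumed for the example).
    """
    t1, t2 = tan(radians(theta1_deg)), tan(radians(theta2_deg))
    # Intersect the two sight lines: x*tan(theta1) = (width - x)*tan(theta2)
    x = width * t2 / (t1 + t2)
    y = x * t1
    return x, y

# A pointer seen at 45 degrees from both corners lies on the centre line.
print(touch_coordinate(45, 45, 100))  # approximately (50.0, 50.0)
```
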
It is worth noting that the first and second image sensing devices 12a, 12b may be, for example, complementary metal-oxide-semiconductor (CMOS) or charge-coupled device (CCD) sensors, and those of ordinary skill in the art may choose them according to actual operating conditions; this embodiment imposes no limitation in this respect.
In addition, in order to increase the reflecting effect of the first, second, and third reflecting units 140, 150, 310, in other implementations a further light-emitting element with the same structure as the light-emitting element 120 may optionally be added to illuminate the touch surface 110. This additional light-emitting element may, for example, be integrated with the second image sensing device 12b and arranged at the second corner, and may be fixed on the image sensing device 12b by sticking, screwing, or fastening.
In this way, the optical touch system 3 can improve its touch discrimination by providing two image sensing devices. In other embodiments, the optical touch system 3 may instead include three, four, or more image sensing devices arranged at different positions with overlapping fields of view, to improve the touch discrimination of the optical touch system 3. In other words, the number and positions of the image sensing devices in the optical touch system 3 are designed according to the actual structure and operational requirements of the optical touch system 3, and this embodiment is not intended to be limiting.
The object analysis method of Figure 15 may also be implemented as firmware written into the processing chip of the processing unit 13. Figure 15 merely illustrates one implementation of the object analysis method provided by this embodiment and is not intended to limit the present invention.
In addition, the present invention may also use a computer-readable storage medium that stores computer program code for the object analysis methods described in Figures 6 and 15, the object bright-region defining method described in Figure 12, and the determination methods described in Figures 7, 8, 10, and 11, so that the foregoing steps are performed when the computer-readable storage medium is read by a processor. The computer-readable medium may be a floppy disk, a hard disk, an optical disc, a flash drive, a magnetic tape, a database accessible over a network, or any storage medium with the same function that a person skilled in the art can readily conceive.
(Possible effects of the embodiments)
In summary, in the optical touch system and its object analysis method, the object analysis method can automatically capture an image corresponding to a pointer when the pointer approaches or is located in the touch sensing area of the optical touch system. The optical touch system can quickly and accurately determine whether the detected pointer contacts the touch surface of the touch panel or hovers above the touch surface according to the luminance distribution of the object image formed in the captured image by the portion of light shaded by the pointer. The object analysis method can then, according to the determination result, decide whether to calculate the touch coordinate of the detected object, thereby effectively improving the touch-point discrimination and operational efficiency of the optical touch system.
The concepts and specific embodiments disclosed above may serve as a basis for suitably modifying or designing other system architectures, processes, or modes of operation that achieve the same purposes as the present invention. Therefore, the foregoing are merely embodiments of the present invention and are not intended to limit the scope of the claims. Those of ordinary skill in the art should appreciate that the above structures may be modified without departing from the spirit and scope of the present invention as set forth in the appended claims.

Claims (21)

  1. An object analysis method of an optical touch system, characterized in that the method comprises:
    driving an image sensing device to capture a first image across a touch surface, wherein the first image has an object image corresponding to a pointer;
    defining, in the first image, an image region corresponding to the object image; and
    determining, according to the luminance difference among a plurality of pixels in the image region, whether the pointer touches the touch surface or hovers above the touch surface;
    wherein the step of defining the image region of the object image defines, in the first image, a left boundary and a right boundary corresponding to the object image, so as to define the image region corresponding to the object image in the first image;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    determining, according to the luminance difference between a first pixel group and a second pixel group in the image region, whether the pointer touches the touch surface or hovers above the touch surface;
    wherein the first pixel group includes at least one high-luminance pixel and the second pixel group includes at least one low-luminance pixel, the pixel value of the high-luminance pixel being greater than a predetermined threshold and the pixel value of the low-luminance pixel being less than the predetermined threshold;
    wherein, if the pointer is determined to touch the touch surface, a touch coordinate of the pointer relative to the touch surface is calculated according to the imaging position of the object image in the first image.
  2. The object analysis method as claimed in claim 1, characterized in that the step of determining whether the pointer touches the touch surface or hovers above the touch surface further comprises:
    if the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is greater than a preset pixel ratio, determining that the pointer hovers above the touch surface; and
    if the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is less than or equal to the preset pixel ratio, determining that the pointer touches the touch surface.
  3. An object analysis method of an optical touch system, characterized in that the method comprises:
    driving an image sensing device to capture a first image across a touch surface, wherein the first image has an object image corresponding to a pointer;
    defining, in the first image, an image region corresponding to the object image; and
    determining, according to the luminance difference among a plurality of pixels in the image region, whether the pointer touches the touch surface or hovers above the touch surface;
    wherein the step of defining the image region of the object image defines, in the first image, a left boundary and a right boundary corresponding to the object image, so as to define the image region corresponding to the object image in the first image;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    calculating, for each pixel column in the image region, a pixel ratio between a maximum pixel value and a minimum pixel value;
    comparing the pixel ratio of the pixel column having the maximum pixel ratio in the image region with a preset pixel ratio, to determine whether the pointer touches the touch surface or hovers above the touch surface;
    if the pixel ratio of the pixel column having the maximum pixel ratio is greater than the preset pixel ratio, determining that the pointer hovers above the touch surface; and
    if the pixel ratio of the pixel column having the maximum pixel ratio is less than or equal to the preset pixel ratio, determining that the pointer touches the touch surface;
    wherein, if the pointer is determined to touch the touch surface, a touch coordinate of the pointer relative to the touch surface is calculated according to the imaging position of the object image in the first image.
  4. An object analysis method of an optical touch system, characterized in that the method comprises:
    driving an image sensing device to capture a first image across a touch surface, wherein the first image has an object image corresponding to a pointer;
    defining, in the first image, an image region corresponding to the object image; and
    determining, according to the luminance difference among a plurality of pixels in the image region, whether the pointer touches the touch surface or hovers above the touch surface;
    wherein the step of defining the image region of the object image defines, in the first image, a left boundary and a right boundary corresponding to the object image, so as to define the image region corresponding to the object image in the first image;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    calculating, for each pixel column in the image region, a pixel difference between a maximum pixel value and a minimum pixel value;
    comparing the pixel difference of the pixel column having the minimum pixel difference in the image region with a preset pixel difference, to determine whether the pointer touches the touch surface or hovers above the touch surface;
    if the pixel difference of the pixel column having the minimum pixel difference is greater than the preset pixel difference, determining that the pointer hovers above the touch surface; and
    if the pixel difference of the pixel column having the minimum pixel difference is less than the preset pixel difference, determining that the pointer touches the touch surface;
    wherein, if the pointer is determined to touch the touch surface, a touch coordinate of the pointer relative to the touch surface is calculated according to the imaging position of the object image in the first image.
  5. An object analysis method of an optical touch system, characterized in that the method comprises:
    driving an image sensing device to capture a first image across a touch surface, wherein the first image has an object image corresponding to a pointer;
    defining, in the first image, an image region corresponding to the object image; and
    determining, according to the luminance difference among a plurality of pixels in the image region, whether the pointer touches the touch surface or hovers above the touch surface;
    wherein the step of defining the image region of the object image defines, in the first image, a left boundary and a right boundary corresponding to the object image, so as to define the image region corresponding to the object image in the first image;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    calculating the average pixel value of each pixel column in the image region;
    comparing the average pixel value of each pixel column in the image region with a default average pixel value, to determine whether the pointer touches the touch surface;
    if the average pixel value of the pixel column having the minimum average pixel value is greater than the default average pixel value, determining that the pointer hovers above the touch surface; and
    if the average pixel value of the pixel column having the minimum average pixel value is less than the default average pixel value, determining that the pointer touches the touch surface;
    wherein, if the pointer is determined to touch the touch surface, a touch coordinate of the pointer relative to the touch surface is calculated according to the imaging position of the object image in the first image.
  6. The object analysis method as claimed in claim 5, characterized in that the step of defining the left boundary and the right boundary corresponding to the object image further comprises:
    driving the image sensing device to capture a background image across the touch surface, wherein the background image has no object image corresponding to the pointer;
    comparing each pixel value in each pixel column of the background image with a preset pixel value, to define a strong-light upper boundary and a strong-light lower boundary in each pixel column, wherein the pixel values between the strong-light upper boundary and the strong-light lower boundary are greater than the preset pixel value;
    defining a bright region in the background image according to the strong-light upper boundary and the strong-light lower boundary of each pixel column;
    calculating a luminance sum of the pixel values of each pixel column within the bright region of the background image, to generate background luminance curve data; and
    defining the left boundary and the right boundary corresponding to the object image in the first image according to the background luminance curve data of the background image.
  7. The object analysis method as claimed in claim 6, characterized in that the step of defining the left boundary and the right boundary corresponding to the object image further comprises:
    calculating, in the first image, a luminance sum of the pixel values of each pixel column within the region corresponding to the bright region of the background image;
    generating, according to the calculation result, luminance curve data distributed along the pixel column direction of the first image; and
    defining the left boundary and the right boundary corresponding to the object image in the first image according to the luminance difference between the luminance curve data and the background luminance curve data.
  8. The object analysis method as claimed in claim 6, characterized by further comprising:
    calculating an average luminance value of the background image; and
    setting the preset pixel value according to the average luminance value and a default weight value.
  9. An object analysis method of an optical touch system, the optical touch system including a panel, a first image sensing device, and a second image sensing device, the first image sensing device and the second image sensing device being arranged at different positions on the panel and having overlapping image sensing ranges, characterized in that the object analysis method comprises:
    driving the first image sensing device and the second image sensing device to capture a first background image and a second background image corresponding to a touch surface of the panel, wherein the first background image and the second background image are each captured before a pointer approaches the touch surface;
    driving the first image sensing device and the second image sensing device to respectively capture a first image and a second image across the touch surface, wherein the first image has a first object image corresponding to the pointer and the second image has a second object image corresponding to the pointer;
    defining, in the first image, a first image region corresponding to the first object image;
    defining, in the second image, a second image region corresponding to the second object image; and
    determining, according to the luminance differences among a plurality of pixel values in the first image region and in the second image region respectively, whether the pointer touches the touch surface or hovers above the touch surface;
    when the luminance difference of either the first image region or the second image region indicates that the pointer hovers above the touch surface, determining that the pointer hovers above the touch surface and not calculating a touch coordinate of the pointer relative to the panel;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    determining, according to the luminance difference between a first pixel group and a second pixel group in the first image region and the luminance difference between a third pixel group and a fourth pixel group in the second image region, whether the pointer touches the touch surface or hovers above the touch surface;
    wherein the first pixel group and the third pixel group each include at least one high-luminance pixel, and the second pixel group and the fourth pixel group each include at least one low-luminance pixel; the pixel value of the high-luminance pixel is greater than a predetermined threshold and the pixel value of the low-luminance pixel is less than the predetermined threshold;
    wherein, when the luminance differences of both the first image region and the second image region indicate that the pointer touches the touch surface, determining that the pointer touches the touch surface; and
    calculating the touch coordinate of the pointer relative to the panel according to the imaging position of the first object image in the first image and the imaging position of the second object image in the second image.
  10. The object analysis method as claimed in claim 9, characterized in that the step of determining whether the pointer touches the touch surface or hovers above the touch surface further comprises:
    if either the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group or the ratio between the average pixel value of the third pixel group and the average pixel value of the fourth pixel group is greater than a preset pixel ratio, determining that the pointer hovers above the touch surface; and
    if the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group and the ratio between the average pixel value of the third pixel group and the average pixel value of the fourth pixel group are both less than or equal to the preset pixel ratio, determining that the pointer touches the touch surface.
  11. An object analysis method of an optical touch system, the optical touch system including a panel, a first image sensing device, and a second image sensing device, the first image sensing device and the second image sensing device being arranged at different positions on the panel and having overlapping image sensing ranges, characterized in that the object analysis method comprises:
    driving the first image sensing device and the second image sensing device to capture a first background image and a second background image corresponding to a touch surface of the panel, wherein the first background image and the second background image are each captured before a pointer approaches the touch surface;
    driving the first image sensing device and the second image sensing device to respectively capture a first image and a second image across the touch surface, wherein the first image has a first object image corresponding to the pointer and the second image has a second object image corresponding to the pointer;
    defining, in the first image, a first image region corresponding to the first object image;
    defining, in the second image, a second image region corresponding to the second object image; and
    determining, according to the luminance differences among a plurality of pixel values in the first image region and in the second image region respectively, whether the pointer touches the touch surface or hovers above the touch surface;
    when the luminance difference of either the first image region or the second image region indicates that the pointer hovers above the touch surface, determining that the pointer hovers above the touch surface and not calculating a touch coordinate of the pointer relative to the panel;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    calculating, for each pixel column in the first image region, a first pixel ratio between the maximum pixel value and the minimum pixel value;
    calculating, for each pixel column in the second image region, a second pixel ratio between the maximum pixel value and the minimum pixel value;
    respectively comparing the first pixel ratio of the pixel column having the maximum first pixel ratio in the first image region and the second pixel ratio of the pixel column having the maximum second pixel ratio in the second image region with a preset pixel ratio; and
    if the first pixel ratio of the pixel column having the maximum first pixel ratio in the first image region and the second pixel ratio of the pixel column having the maximum second pixel ratio in the second image region are both less than or equal to the preset pixel ratio, determining that the pointer touches the touch surface;
    wherein, when the luminance differences of both the first image region and the second image region indicate that the pointer touches the touch surface, determining that the pointer touches the touch surface; and
    calculating the touch coordinate of the pointer relative to the panel according to the imaging position of the first object image in the first image and the imaging position of the second object image in the second image.
  12. The object analysis method as claimed in claim 11, characterized in that the step of determining whether the pointer touches the touch surface or hovers above the touch surface further comprises:
    if either the first pixel ratio of the pixel column having the maximum first pixel ratio in the first image region or the second pixel ratio of the pixel column having the maximum second pixel ratio in the second image region is greater than the preset pixel ratio, determining that the pointer hovers above the touch surface.
  13. An object analysis method of an optical touch system, the optical touch system including a panel, a first image sensing device, and a second image sensing device, the first image sensing device and the second image sensing device being arranged at different positions on the panel and having overlapping image sensing ranges, characterized in that the object analysis method comprises:
    driving the first image sensing device and the second image sensing device to capture a first background image and a second background image corresponding to a touch surface of the panel, wherein the first background image and the second background image are each captured before a pointer approaches the touch surface;
    driving the first image sensing device and the second image sensing device to respectively capture a first image and a second image across the touch surface, wherein the first image has a first object image corresponding to the pointer and the second image has a second object image corresponding to the pointer;
    defining, in the first image, a first image region corresponding to the first object image;
    defining, in the second image, a second image region corresponding to the second object image; and
    determining, according to the luminance differences among a plurality of pixel values in the first image region and in the second image region respectively, whether the pointer touches the touch surface or hovers above the touch surface;
    when the luminance difference of either the first image region or the second image region indicates that the pointer hovers above the touch surface, determining that the pointer hovers above the touch surface and not calculating a touch coordinate of the pointer relative to the panel;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    calculating, for each pixel column in the first image region, a first pixel difference between the maximum pixel value and the minimum pixel value;
    calculating, for each pixel column in the second image region, a second pixel difference between the maximum pixel value and the minimum pixel value;
    respectively comparing the first pixel difference of the pixel column having the minimum first pixel difference in the first image region and the second pixel difference of the pixel column having the minimum second pixel difference in the second image region with a preset pixel difference; and
    if the first pixel difference of the pixel column having the minimum first pixel difference in the first image region and the second pixel difference of the pixel column having the minimum second pixel difference in the second image region are both less than the preset pixel difference, determining that the pointer touches the touch surface;
    wherein, when the luminance differences of both the first image region and the second image region indicate that the pointer touches the touch surface, determining that the pointer touches the touch surface; and
    calculating the touch coordinate of the pointer relative to the panel according to the imaging position of the first object image in the first image and the imaging position of the second object image in the second image.
  14. The object analysis method as claimed in claim 13, characterized in that the step of determining whether the pointer touches the touch surface or hovers above the touch surface further comprises:
    if either the first pixel difference of the pixel column having the minimum first pixel difference in the first image region or the second pixel difference of the pixel column having the minimum second pixel difference in the second image region is greater than the preset pixel difference, determining that the pointer hovers above the touch surface.
  15. An object analysis method of an optical touch system, the optical touch system including a panel, a first image sensing device, and a second image sensing device, the first image sensing device and the second image sensing device being arranged at different positions on the panel and having overlapping image sensing ranges, characterized in that the object analysis method comprises:
    driving the first image sensing device and the second image sensing device to capture a first background image and a second background image corresponding to a touch surface of the panel, wherein the first background image and the second background image are each captured before a pointer approaches the touch surface;
    driving the first image sensing device and the second image sensing device to respectively capture a first image and a second image across the touch surface, wherein the first image has a first object image corresponding to the pointer and the second image has a second object image corresponding to the pointer;
    defining, in the first image, a first image region corresponding to the first object image;
    defining, in the second image, a second image region corresponding to the second object image; and
    determining, according to the luminance differences among a plurality of pixel values in the first image region and in the second image region respectively, whether the pointer touches the touch surface or hovers above the touch surface;
    when the luminance difference of either the first image region or the second image region indicates that the pointer hovers above the touch surface, determining that the pointer hovers above the touch surface and not calculating a touch coordinate of the pointer relative to the panel;
    wherein the step of determining whether the pointer touches the touch surface or hovers above the touch surface comprises:
    calculating a first average pixel value of the pixel values in each pixel column of the first image region;
    calculating a second average pixel value of the pixel values in each pixel column of the second image region;
    respectively comparing the first average pixel values of the pixel columns in the first image region and the second average pixel values of the pixel columns in the second image region with a default average pixel value; and
    if the first average pixel value of the pixel column having the minimum first average pixel value in the first image region and the second average pixel value of the pixel column having the minimum second average pixel value in the second image region are both less than the default average pixel value, determining that the pointer touches the touch surface;
    wherein, when the luminance differences of both the first image region and the second image region indicate that the pointer touches the touch surface, determining that the pointer touches the touch surface; and
    calculating the touch coordinate of the pointer relative to the panel according to the imaging position of the first object image in the first image and the imaging position of the second object image in the second image.
  16. The object analysis method as claimed in claim 15, characterized in that the step of determining whether the pointer touches the touch surface or hovers above the touch surface further comprises:
    if either the first average pixel value of the pixel column having the minimum first average pixel value in the first image region or the second average pixel value of the pixel column having the minimum second average pixel value in the second image region is greater than the default average pixel value, determining that the pointer hovers above the touch surface.
  17. An optical touch control system, characterised by including:
    a panel having a touch surface;
    at least one light-emitting component, producing a light that illuminates the touch surface;
    a reflective mirror, producing a mirror image of the panel;
    at least one reflecting unit, reflecting the light of the light-emitting component;
    an image sensor apparatus, capturing multiple images across the touch surface, at least one of the multiple images having an object image and a mirror image corresponding to an indicant; and
    a processing unit, coupled to the light-emitting component, the image sensor apparatus and a display device;
    wherein when the processing unit drives the image sensor apparatus to capture a first image across the touch surface, and the first image has the object image and the mirror image corresponding to the indicant, the processing unit defines in the first image an imagery zone corresponding to the object image, the processing unit judges, according to the luminance difference between multiple pixel values in the imagery zone, whether the indicant touches the touch surface or is suspended above the touch surface, and the processing unit decides, according to the judgment result, whether to calculate a touch-control coordinate of the indicant relative to the touch surface;
    the processing unit defines, according to the image position of the object image in the first image, a left boundary and a right boundary corresponding to the object image, so as to define in the first image the imagery zone corresponding to the object image;
    wherein, when the processing unit judges that the indicant touches the touch surface, the processing unit calculates, according to the image position of the object image in the first image and the mirror image of the indicant, the touch-control coordinate of the indicant relative to the touch surface, and outputs the touch-control coordinate to the display device to control a corresponding action of a cursor on the display device;
    the processing unit judges, according to the luminance difference between a first pixel group and a second pixel group in the imagery zone, whether the indicant touches the touch surface or is suspended above the touch surface; and
    wherein the first pixel group includes at least one high-luminance pixel, and the second pixel group includes at least one low-luminance pixel; a pixel value of the high-luminance pixel is greater than a predetermined threshold value, and a pixel value of the low-luminance pixel is less than the predetermined threshold value.
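The pixel grouping recited at the end of claim 17 can be sketched as follows. This is an illustrative partition only; the threshold of 128 and the pixel values are assumed examples, not values from the patent.

```python
# Hypothetical sketch of the claim-17 pixel grouping: pixels brighter
# than a predetermined threshold form the first (high-luminance) group,
# pixels darker than it form the second (low-luminance) group.
def split_pixel_groups(zone_pixels, threshold=128):
    high = [p for p in zone_pixels if p > threshold]  # first pixel group
    low = [p for p in zone_pixels if p < threshold]   # second pixel group
    return high, low

high, low = split_pixel_groups([200, 210, 40, 35, 190, 50])
```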
  18. The optical touch control system as claimed in claim 17, characterised in that when the processing unit calculates that a ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is greater than a preset pixel ratio value, the processing unit judges that the indicant is suspended above the touch surface; when the processing unit calculates that the ratio between the average pixel value of the first pixel group and the average pixel value of the second pixel group is less than or equal to the preset pixel ratio value, the processing unit judges that the indicant touches the touch surface.
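A minimal sketch of the claim-18 criterion, assuming example group contents and a preset pixel ratio value of 4.0 (neither value is specified by the patent):

```python
# Hypothetical sketch of the claim-18 average-ratio hover test.
def is_hovering_by_group_ratio(high_group, low_group, preset_ratio=4.0):
    avg_high = sum(high_group) / len(high_group)
    avg_low = sum(low_group) / len(low_group)
    # Ratio above the preset value -> hovering; otherwise touching.
    return (avg_high / avg_low) > preset_ratio

hover = is_hovering_by_group_ratio([200, 210, 190], [40, 35, 50])  # ratio ~4.8
touch = not is_hovering_by_group_ratio([120, 130], [60, 70])       # ratio ~1.9
```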
  19. The optical touch control system as claimed in claim 17, characterised in that the processing unit calculates, in each pixel column of the imagery zone, a pixel ratio between the maximum pixel value and the minimum pixel value, and the processing unit compares the pixel ratio of the pixel column having the maximum pixel ratio value in the imagery zone with a preset pixel ratio value, to judge whether the indicant touches the touch surface; when the processing unit judges that the pixel ratio of the pixel column having the maximum pixel ratio value is greater than the preset pixel ratio value, the processing unit judges that the indicant is suspended above the touch surface; when the processing unit judges that the pixel ratio of the pixel column having the maximum pixel ratio value is less than or equal to the preset pixel ratio value, the processing unit judges that the indicant touches the touch surface.
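A minimal sketch of the claim-19 criterion, with assumed column data and an assumed preset ratio of 5.0:

```python
# Hypothetical sketch of the claim-19 per-column max/min ratio test.
def is_hovering_by_column_ratio(columns, preset_ratio=5.0):
    # Largest max/min pixel ratio over all pixel columns in the zone.
    max_ratio = max(max(col) / min(col) for col in columns)
    return max_ratio > preset_ratio

# A bright gap under the indicant leaves one column with both very
# bright and very dark pixels, giving a large ratio -> hovering.
hover = is_hovering_by_column_ratio([[200, 180, 30], [190, 25, 210]])
touch = not is_hovering_by_column_ratio([[90, 80, 70], [85, 60, 75]])
```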
  20. The optical touch control system as claimed in claim 17, characterised in that the processing unit calculates, in each pixel column of the imagery zone, a pixel value difference between the maximum pixel value and the minimum pixel value, and the processing unit compares the pixel value difference of the pixel column having the maximum pixel value difference in the imagery zone with a preset pixel difference, to judge whether the indicant touches the touch surface; when the processing unit judges that the pixel value difference of the pixel column having the maximum pixel value difference is greater than the preset pixel difference, the processing unit judges that the indicant is suspended above the touch surface; when the processing unit judges that the pixel value difference of the pixel column having the maximum pixel value difference is less than the preset pixel difference, the processing unit judges that the indicant touches the touch surface.
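A minimal sketch of the claim-20 criterion, with assumed column data and an assumed preset pixel difference of 100:

```python
# Hypothetical sketch of the claim-20 per-column max-minus-min test.
def is_hovering_by_column_diff(columns, preset_diff=100):
    # Largest (max - min) pixel difference over all pixel columns.
    max_diff = max(max(col) - min(col) for col in columns)
    return max_diff > preset_diff

hover = is_hovering_by_column_diff([[200, 180, 30], [190, 150, 160]])  # diff 170
touch = not is_hovering_by_column_diff([[90, 80, 70], [85, 60, 75]])   # diff 25
```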
  21. The optical touch control system as claimed in claim 17, characterised in that the processing unit calculates the average pixel value of the multiple pixels in each pixel column of the imagery zone, and the processing unit compares the minimum average pixel value in the imagery zone with a default average pixel value, to judge whether the indicant touches the touch surface; when the processing unit judges that the average pixel value of the pixel column having the minimum average pixel value is greater than the default average pixel value, the processing unit judges that the indicant is suspended above the touch surface; when the processing unit judges that the average pixel value of the pixel column having the minimum average pixel value is less than the default average pixel value, the processing unit judges that the indicant touches the touch surface.
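A minimal sketch of the claim-21 criterion, with assumed column data and an assumed default average pixel value of 60. Intuitively, a touching indicant fully occludes some column (very low average), while a hovering indicant leaves every column partially lit:

```python
# Hypothetical sketch of the claim-21 minimum-column-average test.
def is_hovering_by_min_column_avg(columns, default_avg=60):
    min_avg = min(sum(col) / len(col) for col in columns)
    # Darkest column still above the default average -> hovering.
    return min_avg > default_avg

hover = is_hovering_by_min_column_avg([[90, 100, 110], [80, 95, 85]])
touch = not is_hovering_by_min_column_avg([[30, 40, 20], [90, 100, 95]])
```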
CN201410043884.9A 2014-01-29 2014-01-29 Optical touch control system and its object analysis method Expired - Fee Related CN104808865B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410043884.9A CN104808865B (en) 2014-01-29 2014-01-29 Optical touch control system and its object analysis method

Publications (2)

Publication Number Publication Date
CN104808865A CN104808865A (en) 2015-07-29
CN104808865B true CN104808865B (en) 2018-03-23

Family

ID=53693749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410043884.9A Expired - Fee Related CN104808865B (en) 2014-01-29 2014-01-29 Optical touch control system and its object analysis method

Country Status (1)

Country Link
CN (1) CN104808865B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI718378B (en) * 2018-05-23 2021-02-11 友達光電股份有限公司 Optical detection device and detection method thereof

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201108058A (en) * 2009-08-28 2011-03-01 Pixart Imaging Inc Touch system and pointer coordinate detecting method therefor
CN103076925A (en) * 2011-10-26 2013-05-01 原相科技股份有限公司 Optical touch system, optical sensing module and method for operating optical touch system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI450154B (en) * 2010-09-29 2014-08-21 Pixart Imaging Inc Optical touch system and object detection method therefor


Also Published As

Publication number Publication date
CN104808865A (en) 2015-07-29

Similar Documents

Publication Publication Date Title
CN101663637B (en) Touch screen system with hover and click input methods
US8786576B2 (en) Three-dimensional space touch apparatus using multiple infrared cameras
KR101123932B1 (en) Optical touch system and method
TWI534687B (en) Optical touch detection system and object analyzation method thereof
TWI420357B (en) Touch system and pointer coordinate detecting method therefor
TW201535204A (en) Object detection method and calibration apparatus of optical touch system
KR20070038430A (en) Display apparatus and display method
KR20120058594A (en) Interactive input system with improved signal-to-noise ratio (snr) and image capture method
KR20110005738A (en) Interactive input system and illumination assembly therefor
US9501160B2 (en) Coordinate detection system and information processing apparatus
US9471180B2 (en) Optical touch panel system, optical apparatus and positioning method thereof
TWI511008B (en) Touch detection method and optical touch system thereof
US20130127704A1 (en) Spatial touch apparatus using single infrared camera
US20110241987A1 (en) Interactive input system and information input method therefor
US20140098062A1 (en) Optical touch panel system and positioning method thereof
CN104808865B (en) Optical touch control system and its object analysis method
KR101071864B1 (en) Touch and Touch Gesture Recognition System
US9489077B2 (en) Optical touch panel system, optical sensing module, and operation method thereof
CN103076928B (en) The electronic whiteboard spot identification method and device of imageing sensor is combined based on light film
CN104915065A (en) Object detecting method and correcting device used for optical touch system
TWI536228B (en) An inductive motion-detective device
US20130162592A1 (en) Handwriting Systems and Operation Methods Thereof
KR101358842B1 (en) Detecting Module for Detecting Touch Point Selectively By Plural Methods and Screen Input Device Having the Same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180323

Termination date: 20200129
