WO2011071305A2 - Optical touch screen - Google Patents

Optical touch screen

Info

Publication number
WO2011071305A2
WO2011071305A2 PCT/KR2010/008728 KR2010008728W
Authority
WO
WIPO (PCT)
Prior art keywords
infrared
micro coordinate
light sources
coordinates
coordinate light
Prior art date
Application number
PCT/KR2010/008728
Other languages
English (en)
Korean (ko)
Other versions
WO2011071305A3 (fr)
Inventor
김성한
Original Assignee
Kim Sung-Han
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kim Sung-Han filed Critical Kim Sung-Han
Priority to JP2012543024A priority Critical patent/JP5459689B2/ja
Priority to EP10836200.5A priority patent/EP2511801B1/fr
Priority to CN201080054237.8A priority patent/CN102713807B/zh
Publication of WO2011071305A2 publication Critical patent/WO2011071305A2/fr
Publication of WO2011071305A3 publication Critical patent/WO2011071305A3/fr
Priority to US13/491,669 priority patent/US8780087B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/04164Connections between sensors and controllers, e.g. routing lines between electrodes and connection pads
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/1336Illuminating devices
    • G02F1/133615Edge-illuminating devices, i.e. illuminating from the side
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04103Manufacturing, i.e. details related to manufacturing processes specially suited for touch sensitive devices

Definitions

  • the present invention relates to an optical touch screen capable of recognizing touch coordinates by touching a screen using a finger or a touch pen.
  • touch screens are one of the most efficient input devices for making the interface between the display device and the user simple and easy.
  • Such a touch screen can be operated with a finger or a touch pen on various devices such as computers, mobile phones, financial terminals, and game machines, so its field of application is very wide.
  • Electric methods include the resistive film method and the capacitive method.
  • the resistive type and the capacitive type are used mainly for small touch screens, because as the screen size increases their price becomes high and their technical limitations grow.
  • Optical methods include the infrared matrix method and the camera method.
  • the infrared matrix method is partially used for medium and large touch screens.
  • however, the larger the screen size, the higher the power consumption and the price, and the greater the risk of malfunction due to external conditions such as sunlight and lighting.
  • the camera method basically calculates the touch coordinates from the angles at which the touch objects are seen by two cameras.
  • like the infrared matrix method, the conventional camera method is prone to malfunction under external conditions such as sunlight and lighting.
  • in addition, accuracy is lowered by angle-measurement errors caused by distortion of the camera lens.
  • furthermore, when two or more simultaneous touches are sensed, virtual points (ghost points) arise in the calculation, and it is difficult to distinguish these virtual image coordinates from the real ones.
  • An object of the present invention is to provide an optical touch screen that is not affected by shadows or external light and can accurately obtain the coordinates of a touch object without measurement errors due to distortion of the camera lens itself.
  • another object of the present invention is to provide an optical touch screen that can determine exact actual coordinates by distinguishing the virtual image coordinates generated when sensing two or more simultaneous touches.
  • the main body is installed to surround the border of the touch area of the screen;
  • infrared micro coordinate light source generators are installed on the two upper and lower horizontal sides and the two left and right vertical sides, respectively, and generate infrared micro coordinate light sources at regular intervals toward the touch area, thereby providing the coordinate references of the horizontal axis and the vertical axis;
  • two or more infrared cameras are installed in the main body to detect the infrared micro coordinate light sources generated by the infrared micro coordinate light source generators;
  • and a controller is configured to calculate the coordinates of the touch object touched in the touch area based on the data sensed by the infrared cameras.
  • since the infrared micro coordinate light sources are generated toward the touch area and the coordinates of the touch object are obtained by sensing the positions of those light sources, the device is not affected by sunlight, shadows, or external light, and the coordinates of the touch object can be obtained stably without measurement errors due to aberration and distortion of the camera lens itself.
  • since each infrared micro coordinate light source generation unit distributes the light of one or two infrared light emitting units into as many infrared micro coordinate light sources as there are micro grooves, power consumption can be reduced and manufacture of the touch screen is facilitated.
  • when there are two or more multi-touches, the virtual image coordinates generated in the calculation can be distinguished, so that accurate actual coordinates are obtained.
  • FIG. 1 is a block diagram of an optical touch screen according to an embodiment of the present invention.
  • Figure 2 is a front view showing an example of the infrared micro coordinate light source generator.
  • FIG. 3 is a partial perspective view illustrating a portion of the micro coordinate light source generating unit illustrated in FIG. 2.
  • FIG. 4 is a perspective view showing another example of the micro coordinate light source generator.
  • FIG. 5 is a front view showing still another example of the micro coordinate light source generator.
  • FIG. 6 is a partial perspective view illustrating a portion of the micro coordinate light source generator illustrated in FIG. 5.
  • FIG. 7 illustrates an example of a lookup table.
  • FIG. 8 is a view for explaining an example of a process of measuring the angles of the points where each infrared micro coordinate light source is positioned by the infrared camera.
  • FIG. 9 is a view for explaining an example in which infrared micro coordinate light sources are sensed in a straight line in an image sensor.
  • FIG. 10 is a diagram for explaining a process of obtaining touch coordinates.
  • the optical touch screen 100 includes a main body 110, infrared micro coordinate light source generators 120A, 120B, 120C, and 120D, infrared cameras 130A, 130B, and 130C, and a controller 140.
  • the main body 110 is installed to surround the edge of the screen touch area 10.
  • the screen touch area 10 may correspond to screen touch areas of various display devices such as a liquid crystal display device.
  • the main body 110 mounts and supports the infrared micro coordinate light source generators 120A, 120B, 120C, and 120D and the infrared cameras 130A, 130B, and 130C.
  • the infrared micro coordinate light source generators 120A, 120B, 120C, and 120D provide the coordinate references of the horizontal and vertical axes in the screen touch area 10. They are installed on the two horizontal sides and the two vertical sides of the main body 110, respectively.
  • the infrared micro coordinate light source generators 120A, 120B, 120C, and 120D generate infrared micro coordinate light sources at regular intervals from the four inner sides of the main body 110 toward the touch area 10.
  • the light emitting regions of the infrared micro coordinate light sources are located in front of the touch region 10 and are arranged at regular intervals along its four sides. Therefore, the infrared micro coordinate light sources function as the references of the horizontal and vertical axes in the touch area 10.
  • the infrared cameras 130A, 130B, and 130C are cameras having sufficient sensitivity to infrared rays, and are installed in the main body 110 so as to detect the infrared micro coordinate light sources generated by the infrared micro coordinate light source generators 120A, 120B, 120C, and 120D. Although three infrared cameras are shown, two or four may also be provided.
  • the infrared cameras 130A, 130B, and 130C may include a lens and an image sensor, respectively.
  • the lens may be configured to have an angle of view of 90 ° or more.
  • the image sensor receives an optical image of the subject formed by the lens and converts it into an electrical signal.
  • the image sensor may be a charge-coupled device (CCD) image sensor or a complementary metal-oxide semiconductor (CMOS) image sensor.
  • CCD charge-coupled device
  • CMOS complementary metal-oxide semiconductor
  • the infrared cameras 130A, 130B, and 130C detect the positions of the infrared micro coordinate light sources blocked by the touch object and provide the detected data to the controller 140. The controller 140 then calculates the coordinates of the touch object touched in the touch area 10 based on the data detected by the infrared cameras 130A, 130B, and 130C.
  • since the infrared micro coordinate light sources are generated toward the touch area 10 and the coordinates of the touch object are obtained by sensing the positions of those light sources, the touch coordinates are not affected by sunlight, shadows, or external light, and can be obtained stably without measurement errors due to aberration and distortion of the camera lens itself.
  • the infrared micro coordinate light source generators 120A, 120B, 120C, and 120D may each include at least one infrared light emitter 121 and a micro coordinate light source distributor 122.
  • the infrared light emitting unit 121 may use an infrared light emitting diode (LED).
  • the micro coordinate light source distributor 122 distributes the light emitted from the infrared light emitter 121 to the infrared micro coordinate light sources at a predetermined interval.
  • the micro coordinate light source distributor 122 may include a transparent bar 123 and a diffuser 124.
  • the transparent bar 123 may be formed of a transparent plastic or glass material having a high refractive index.
  • An infrared light emitting unit 121 is disposed at at least one end of the transparent bar 123.
  • the transparent bar 123 may have a shape having a rectangular cross section.
  • the transparent bar 123 has a structure in which fine grooves 123a are formed at regular intervals along one side portion thereof.
  • at each of the fine grooves 123a, an infrared micro coordinate light source is generated. Therefore, infrared micro coordinate light sources at a regular interval may be generated from the transparent bar 123.
  • an infrared light emitting part may be additionally disposed at the opposite end of the transparent bar 123, or a reflective mirror may be disposed.
  • the diffusion unit 124 is intended to enable the infrared micro coordinate light sources to emit light evenly at all angles when the infrared micro coordinate light sources are generated from the fine grooves 123a.
  • a diffusion film may be used as the diffusion unit 124.
  • the diffusion film may be formed of a diffuse reflection surface treatment, and may be attached to a portion where fine grooves 123a are formed in the transparent bar 123.
  • since the infrared micro coordinate light source generator 120 having the above-described structure distributes the light of one or two infrared light emitters 121 into as many infrared micro coordinate light sources as there are fine grooves 123a, power consumption and cost can be reduced, and the manufacture of large touch screens is facilitated.
  • in another example, the transparent bar 223 of the micro coordinate light source distributor 222 has fine grooves 224 formed at regular intervals along the length direction of one side 223a, and infrared micro coordinate light sources at a regular interval are respectively generated on the side 223b opposite to the side 223a in which the fine grooves 224 are formed.
  • An infrared light emitting unit 121 is disposed at at least one end of the transparent bar 223.
  • the infrared micro coordinate light sources may be generated at regular intervals on the opposite side 223b of the transparent bar 223.
  • the transparent bar 223 is disposed so that the infrared micro coordinate light sources face the screen touch area 10.
  • the side portion 223b in which the infrared micro coordinate light sources are positioned on the transparent bar 223 may be curved to serve as a lens. Accordingly, the condensing effect may be enhanced when some of the light diffused in each of the fine grooves 224 passes through the inside of the transparent bar 223 through the opposite side 223b of the transparent bar 223.
  • the transparent bar 223 may also be curved on the side portion 223a in which the fine grooves 224 are formed. In that case, part of the light diffusely reflected at each of the fine grooves 224 is focused toward the inside of the transparent bar 223, so the amount of light emitted through the opposite side 223b of the transparent bar 223 may be increased.
  • the reflective member 225 may be further provided on the side portion 223a in which the fine grooves 224 are formed in the transparent bar 223.
  • the reflective member 225 reflects light diffused outwardly from the fine grooves 224 to the transparent bar 223 to increase brightness of the infrared micro coordinate light sources.
  • in still another example, the micro coordinate light source distributor 322 may include a base film 323, light passages 324, a coating 325, and a diffuser 326.
  • the base film 323 is made of a film having a low refractive index.
  • the light passages 324 are formed of a transparent resin having a high refractive index to be spaced apart on the base film 323. In this case, the light passages 324 may be formed on the base film 323 by printing or etching.
  • the coating 325 is formed of a resin having a low refractive index and covers the light passages 324 on the base film 323.
  • the coating 325 may be formed over the entire base film 323.
  • the diffuser 326 allows the infrared micro coordinate light sources to emit light evenly at all angles from the light paths 324.
  • as the diffuser 326, a diffuse-reflection surface-treated diffusion film may be used, attached to the light emitting portions of the micro coordinate light sources in the micro coordinate light source distributor 322.
  • when the light of the infrared light emitting unit 121 is incident on at least one side of the base film 323, the incident light propagates by total internal reflection inside each of the light passages 324 until it reaches the emission point of that passage, where it is diffused by the diffuser 326. Therefore, the light of the infrared light emitting unit 121 may be distributed into as many infrared micro coordinate light sources as there are light passages 324, emitting at a regular interval.
  • the three infrared cameras 130A, 130B, and 130C may be disposed at three corners of the main body 110.
  • the three infrared cameras 130A, 130B, and 130C may be disposed at the lower left corner, the lower right corner, and the upper right corner of the main body 110.
  • the center of the field of view of each of the infrared cameras 130A, 130B, and 130C is directed at 45° with respect to the horizontal and vertical sides of the main body 110.
  • each of the infrared cameras 130A, 130B, and 130C senses the infrared micro coordinate light sources generated by the infrared micro coordinate light source generators 120A, 120B, 120C, and 120D installed on the opposite horizontal and vertical sides.
  • the controller 140 may include a camera interface 141, a memory 142, and a calculator 143.
  • the memory 142 stores the lookup table shown in FIG. 7 in advance. The lookup table can be created as follows. The inner lengths of the four sides of the main body 110 on which the four infrared micro coordinate light source generators 120A, 120B, 120C, and 120D are installed are already determined at the time of manufacturing the main body 110. Likewise, the respective positions of the infrared micro coordinate light sources generated by the generators 120A, 120B, 120C, and 120D are already determined at the time of manufacture.
  • accordingly, the angles of the points at which the infrared micro coordinate light sources are positioned can be measured at the points where the three infrared cameras 130A, 130B, and 130C are located. For example, as shown in FIG. 8, the infrared camera 130C at the upper right corner can measure the angles of the positions of the n infrared micro coordinate light sources from d1 to dn generated by the generator 120D on the left vertical side in the opposite diagonal direction, and of the m infrared micro coordinate light sources from c1 to cm generated by the generator 120C on the lower horizontal side.
  • the infrared camera 130A at the lower left corner and the infrared camera 130B at the lower right corner can likewise measure the angles of the points where the infrared micro coordinate light sources are located. From these measurements, a lookup table can be made that uses the position numbers assigned to all infrared micro coordinate light sources as index values, and the angles measured from the three infrared cameras 130A, 130B, and 130C to each light source position as table values. The lookup table thus created is stored in advance in the memory 142.
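  • the precomputation described above can be sketched in a few lines: because the side lengths and the light-source positions are fixed when the main body is manufactured, the angle from each camera to every light source follows from plane geometry. Everything concrete below (the 400 × 300 touch area, the source counts N and M, and the coordinate frame with the origin at camera 130A) is an illustrative assumption, not a value from the patent.

```python
import math

# Illustrative geometry: touch area W x H (mm), origin at the lower-left corner.
W, H = 400.0, 300.0
N, M = 60, 80  # assumed counts of light sources on a vertical side (n) and a horizontal side (m)

# Cameras at three corners: lower left 130A, lower right 130B, upper right 130C.
cameras = {"130A": (0.0, 0.0), "130B": (W, 0.0), "130C": (W, H)}

# Position numbers -> physical locations of the micro coordinate light sources.
# Left vertical side sources d1..dn, lower horizontal side sources c1..cm.
sources = {}
for i in range(1, N + 1):
    sources[f"d{i}"] = (0.0, H * (i - 0.5) / N)   # left vertical side
for j in range(1, M + 1):
    sources[f"c{j}"] = (W * (j - 0.5) / M, 0.0)   # lower horizontal side

def build_lookup(cam_xy):
    """Angle (degrees) of each light source as seen from one camera position."""
    cx, cy = cam_xy
    return {pos: math.degrees(math.atan2(sy - cy, sx - cx))
            for pos, (sx, sy) in sources.items()}

# Lookup table: camera -> (position number -> angle), computed once at manufacture.
lookup = {cam: build_lookup(xy) for cam, xy in cameras.items()}
```

At run time no trigonometry is needed: a blocked light source's position number indexes straight into `lookup`.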
  • the memory 142 stores an address map in advance.
  • the address map is constructed as follows.
  • the infrared camera 130C at the upper right corner senses together the n infrared micro coordinate light sources from d1 to dn generated by the infrared micro coordinate light source generator 120D on the left vertical side in the opposite diagonal direction, and the m infrared micro coordinate light sources from c1 to cm generated by the generator 120C on the lower horizontal side. Accordingly, as illustrated in FIG. 9, n + m infrared micro coordinate light sources from d1 to cm are detected in a straight line in the image sensor 131 provided in the infrared camera 130C at the upper right corner.
  • similarly, the image sensor provided in the infrared camera 130A at the lower left corner detects n + m infrared micro coordinate light sources, comprising those from bn to b1 and those from am to a1.
  • the image sensor provided in the infrared camera 130B at the lower right corner detects n + m infrared micro coordinate light sources, comprising those from dn to d1 and those from a1 to am.
  • the controller 140 finds the data addresses of the image sensor pixels exposed by the infrared micro coordinate light sources in the image data of each of the infrared cameras 130A, 130B, and 130C, and assigns an identification number to each address.
  • in the address maps, these identification numbers are matched with the position numbers of the infrared micro coordinate light sources.
  • the address maps thus generated are previously stored in the memory 142.
  • using the lookup table and the address maps stored in the memory 142, the angle of a touch position can be calculated as follows.
  • when a touch object such as a finger touches the touch area 10, the infrared micro coordinate light sources blocked by the touch object are no longer detected by the infrared cameras 130A, 130B, and 130C. Accordingly, the exposure of the pixels corresponding to the blocked infrared micro coordinate light sources stops on each image sensor of the infrared cameras 130A, 130B, and 130C.
  • the operation unit 143 periodically checks the photosensitive data of the pixels in the address maps. If there are pixels whose exposure has stopped, it reads from the address map, via the identification numbers assigned to the addresses of those pixels, the position numbers of the corresponding infrared micro coordinate light sources. Next, the calculator 143 obtains the angle values of those infrared micro coordinate light sources from the lookup table stored in the memory 142.
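  • the read-out path just described — blocked pixel address, then position number via the address map, then angle via the lookup table — can be sketched as follows. The function name, the data layout, and the darkness threshold are assumptions for illustration, not details from the patent.

```python
def blocked_angles(pixel_levels, address_map, lookup_for_camera, threshold=10):
    """Map pixels whose photosensing has stopped back to lookup-table angles.

    pixel_levels: pixel address -> measured light level (hypothetical layout)
    address_map: pixel address -> light-source position number (e.g. "d3")
    lookup_for_camera: position number -> angle from this camera (degrees)
    """
    angles = []
    for addr, pos in address_map.items():
        # a level below the (assumed) threshold means the source is blocked
        if pixel_levels.get(addr, 0) < threshold:
            angles.append(lookup_for_camera[pos])
    return angles

# hypothetical data: source d1 is blocked by a touch, d2 is still lit
example = blocked_angles({100: 0, 101: 200},
                         {100: "d1", 101: "d2"},
                         {"d1": 45.0, "d2": 46.0})
```

Running this per camera yields one set of angles per camera, which is exactly the input the coordinate calculation needs.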
  • the calculator 143 calculates the coordinates of the touch object based on the obtained angle values.
  • the process of calculating the coordinates of the touch object may be performed as follows. As shown in FIG. 10, assuming that the position where a touch occurs on the screen touch area 10 is P1, the calculator 143 obtains the angles αP1 and βP1 indicating P1 from the lookup table. αP1 is the angle obtained from the infrared camera 130A at the lower left corner, and βP1 is the angle obtained from the infrared camera 130B at the lower right corner.
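  • the patent's Equation 1 is not reproduced in this extract, but the intersection of the two camera rays can be written with standard trigonometry. In the sketch below, camera 130A sits at the origin and camera 130B at (W, 0), and both angles are measured from the bottom side toward the touch area; this coordinate convention and the width value are assumptions.

```python
import math

def touch_point(alpha_deg, beta_deg, W=400.0):
    """Intersect the ray from camera 130A (at the origin, angle alpha from
    the bottom side) with the ray from camera 130B (at (W, 0), angle beta
    from the bottom side toward the touch area).

    A generic stand-in for the patent's Equation 1, which is not reproduced
    in this extract. The rays are y = x*tan(alpha) and y = (W - x)*tan(beta);
    solving for the intersection gives the touch point.
    """
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    x = W * tb / (ta + tb)
    y = x * ta
    return x, y
```

For example, a point seen at 45° from both bottom corners lies at the center of the bottom-to-top diagonal cross, (W/2, W/2).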
  • in the multi-touch case, the calculation unit 143 calculates the coordinates of the touch objects based on the angle values obtained from two of the three infrared cameras, 130A and 130B, and distinguishes the actual coordinates from the virtual image coordinates based on the calculated coordinates and the angle values obtained from the remaining infrared camera 130C.
  • the positions of the multitouch are called P1 and P2.
  • the coordinates (X1, Y1) of P1 and the coordinates (X2, Y2) of P2 are obtained as follows. A total of four intersection points are generated by the combinations of the angles αP1 and αP2 obtained from the infrared camera 130A at the lower left corner with the angles βP1 and βP2 obtained from the infrared camera 130B at the lower right corner.
  • the four intersections are: the intersection of αP1 and βP1, which is P1; the intersection of αP2 and βP2, which is P2; the intersection of αP1 and βP2, which is G1; and the intersection of αP2 and βP1, which is G2.
  • among P1, P2, G1, and G2, P1 and P2 are actual coordinates, while G1 and G2 are virtual image coordinates.
  • G1 and G2 exist only in the calculation, because they do not lie on the angle lines γP1 and γP2 obtained from the infrared camera 130C at the upper right corner.
  • the distinction between real coordinates and virtual coordinates can be made as follows.
  • first, the calculation unit 143 calculates the coordinate values of P1, P2, G1, and G2 by applying Equation 1 to the combinations of αP1, αP2, βP1, and βP2.
  • next, the calculation unit 143 substitutes the coordinate values of P1 and G1 into X and Y in Equation 2, and substitutes the angle value of γP1 into θ.
  • the calculation unit 143 determines that the point for which the left-side value and the right-side value of Equation 2 are equal is the actual coordinate, and that the point for which they differ is the virtual image coordinate.
  • likewise, the calculation unit 143 substitutes the coordinate values of P2 and G2 into X and Y, and the angle value of γP2 into θ, in Equation 2.
  • again, the point for which the two side values are equal is determined to be the actual coordinate, and the point for which they differ is determined to be the virtual image coordinate.
  • the virtual image coordinates may be removed and the actual coordinates may be obtained using the same principle as described above.
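  • the ghost-point test can be sketched as follows: with the candidate intersections already computed from cameras 130A and 130B, a point is kept as real only if it lies on one of the angle lines observed by the third camera 130C — substituting its (X, Y) makes the two sides of the line equation (nearly) equal, mirroring the Equation 2 comparison described above. The exact form of Equation 2 is not in this extract, so a generic point-on-ray check stands in for it; the tolerance and function names are assumptions.

```python
import math

def on_ray(point, cam_xy, gamma_deg, tol=1e-6):
    """Equation-2-style check: the point is accepted if substituting its
    (X, Y) into the line through the third camera at angle gamma makes
    the left side and the right side (nearly) equal.

    Note: angles near ±90° (a vertical ray) would need a separate check,
    since tan() diverges there.
    """
    x, y = point
    cx, cy = cam_xy
    lhs = y - cy
    rhs = math.tan(math.radians(gamma_deg)) * (x - cx)
    return abs(lhs - rhs) < tol

def real_points(candidates, cam_c, gammas):
    """Keep candidates lying on at least one ray gamma seen by camera 130C;
    the remaining candidates are ghost (virtual image) points."""
    return [p for p in candidates if any(on_ray(p, cam_c, g) for g in gammas)]
```

With camera 130C at the upper-right corner, a true touch point satisfies the check for its own γ, while a ghost point formed by crossing the wrong angle pair does not.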
  • the optical touch screen 100 may include only two infrared cameras to reduce the manufacturing cost.
  • the two infrared cameras may be installed one at each of two diagonally opposite corners among the four corners of the main body 110, positioned so as to detect all infrared micro coordinate light sources generated toward the touch area 10.
  • the infrared camera 130B at the lower right corner may be omitted from the three infrared cameras 130A, 130B, and 130C.
  • alternatively, the two infrared cameras may be installed one at each of two adjacent corners of the four corners of the main body 110, each detecting the micro coordinate light sources generated on the horizontal and vertical sides opposite that corner.
  • the infrared camera 130C at the upper right corner of the three infrared cameras 130A, 130B, and 130C may be omitted.
  • the optical touch screen 100 may include four infrared cameras to more accurately detect the coordinates of the multi-touch.
  • the four infrared cameras may be installed one at each of the four corners of the main body 110, positioned so as to detect all infrared micro coordinate light sources generated toward the touch area 10.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Nonlinear Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Mathematical Physics (AREA)
  • Optics & Photonics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention relates to an optical touch screen capable of recognizing touch coordinates when the screen is touched by a finger, a touch pen, or the like. The optical touch screen comprises a main body, infrared micro coordinate light source generating units, at least two infrared cameras, and a control unit. The main body is installed so as to surround the border of a touch area of a screen. The infrared micro coordinate light source generating units are arranged on the two horizontal sides and the two vertical sides of the main body, generate infrared micro coordinate light sources at regular intervals toward the touch area, and provide references for the coordinates on a horizontal axis and a vertical axis. The infrared cameras are installed in the main body to detect the infrared micro coordinate light sources generated by these units. The control unit calculates the coordinates of the touched object in the touch area on the basis of the data detected by the infrared cameras.
PCT/KR2010/008728 2009-12-11 2010-12-08 Écran tactile optique WO2011071305A2 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2012543024A JP5459689B2 (ja) 2009-12-11 2010-12-08 光学式タッチスクリーン
EP10836200.5A EP2511801B1 (fr) 2009-12-11 2010-12-08 Écran tactile optique
CN201080054237.8A CN102713807B (zh) 2009-12-11 2010-12-08 光学式触摸屏
US13/491,669 US8780087B2 (en) 2009-12-11 2012-06-08 Optical touch screen

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2009-0123099 2009-12-11
KR20090123099 2009-12-11
KR1020100123939A KR101070864B1 (ko) 2009-12-11 2010-12-07 광학식 터치스크린
KR10-2010-0123939 2010-12-07

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/491,669 Continuation US8780087B2 (en) 2009-12-11 2012-06-08 Optical touch screen

Publications (2)

Publication Number Publication Date
WO2011071305A2 true WO2011071305A2 (fr) 2011-06-16
WO2011071305A3 WO2011071305A3 (fr) 2011-11-03

Family

ID=44399597

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2010/008728 WO2011071305A2 (fr) 2009-12-11 2010-12-08 Écran tactile optique

Country Status (6)

Country Link
US (1) US8780087B2 (fr)
EP (1) EP2511801B1 (fr)
JP (1) JP5459689B2 (fr)
KR (1) KR101070864B1 (fr)
CN (1) CN102713807B (fr)
WO (1) WO2011071305A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130278563A1 (en) * 2012-04-19 2013-10-24 Hsun-Hao Chang Optical touch device and touch sensing method
US20140055419A1 (en) * 2012-08-24 2014-02-27 Sung-han Kim Camera module for optical touchscreen

Families Citing this family (18)

Publication number Priority date Publication date Assignee Title
US9925264B2 (en) 2011-05-10 2018-03-27 Itochu Chemical Frontier Corporation Non-aqueous patch
KR101260341B1 (ko) * 2011-07-01 2013-05-06 주식회사 알엔디플러스 멀티 터치 인식 장치
JP6021269B2 (ja) 2011-09-27 2016-11-09 伊藤忠ケミカルフロンティア株式会社 非水性貼付剤
KR101380676B1 (ko) * 2012-01-17 2014-04-04 주식회사 스마트센스테크놀러지 지시물체 위치인식장치
TWM443861U (en) * 2012-06-26 2012-12-21 Wistron Corp Touch display module and positioner thereof
KR101333076B1 (ko) * 2012-08-09 2013-11-26 서울시립대학교 산학협력단 복수의 영상입력기기 배치를 통한 인터랙티브 스크린 구현 시스템 및 구현 방법, 그 기록매체
CN103853387B (zh) * 2012-11-30 2017-04-12 汉王科技股份有限公司 一种显示装置、系统及坐标定位方法
KR102026131B1 (ko) * 2013-03-27 2019-09-30 삼성디스플레이 주식회사 표시 장치 및 이를 포함하는 광학 터치 시스템
US9360888B2 (en) 2013-05-09 2016-06-07 Stephen Howard System and method for motion detection and interpretation
US10891003B2 (en) 2013-05-09 2021-01-12 Omni Consumer Products, Llc System, method, and apparatus for an interactive container
US9465488B2 (en) 2013-05-09 2016-10-11 Stephen Howard System and method for motion detection and interpretation
KR20150080298A (ko) 2013-12-31 2015-07-09 현대자동차주식회사 곡면 디스플레이의 터치 인식 장치
TWI529583B (zh) 2014-12-02 2016-04-11 友達光電股份有限公司 觸控系統與觸控偵測方法
CA3138907C (fr) 2014-12-30 2023-08-01 Omni Consumer Products, Llc Systeme et procede de projection interactive
CN104793374B (zh) * 2015-05-15 2018-05-11 合肥京东方光电科技有限公司 不良定位装置,方法和目视检查装置
JP6774409B2 (ja) * 2015-07-17 2020-10-21 富士電機株式会社 光学式タッチパネル及び自動販売機
US10310674B2 (en) 2015-07-22 2019-06-04 Semiconductor Components Industries, Llc Optical touch screen system using radiation pattern sensing and method therefor
WO2019147612A1 (fr) 2018-01-25 2019-08-01 Neonode Inc. Capteur de coordonnées polaires

Family Cites Families (31)

Publication number Priority date Publication date Assignee Title
JPS5629204A (en) * 1979-08-16 1981-03-24 Oki Electric Ind Co Ltd Optical branching circuit
US4746770A (en) * 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US5969343A (en) * 1995-08-24 1999-10-19 Matsushita Electric Industrial Co., Ltd. Linear illumination device
JP4057200B2 (ja) * 1999-09-10 2008-03-05 株式会社リコー 座標入力装置および座標入力装置の記録媒体
JP3905670B2 (ja) * 1999-09-10 2007-04-18 株式会社リコー 座標入力検出装置、情報記憶媒体及び座標入力検出方法
JP4001705B2 (ja) * 2000-04-05 2007-10-31 株式会社リコー 座標入力/検出装置及び電子黒板システム
JP4059620B2 (ja) * 2000-09-20 2008-03-12 株式会社リコー 座標検出方法、座標入力/検出装置及び記憶媒体
US6954197B2 (en) * 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
US6972401B2 (en) * 2003-01-30 2005-12-06 Smart Technologies Inc. Illuminated bezel and touch system incorporating the same
JP4590295B2 (ja) * 2005-04-15 2010-12-01 キヤノン株式会社 座標入力装置及びその制御方法、プログラム
US7599520B2 (en) * 2005-11-18 2009-10-06 Accenture Global Services Gmbh Detection of multiple targets on a plane of interest
US7302156B1 (en) * 2006-07-12 2007-11-27 Lumio Inc. Optical system
JP5415954B2 (ja) * 2006-09-28 2014-02-12 ルミオ インコーポレイテッド 光学式タッチパネル
US8615151B2 (en) * 2006-11-14 2013-12-24 Modilis Holdings Llc Lightguide arrangement and related applications
JP4864761B2 (ja) 2007-02-19 2012-02-01 日東電工株式会社 タッチパネル用光導波路
US8243048B2 (en) * 2007-04-25 2012-08-14 Elo Touch Solutions, Inc. Touchscreen for detecting multiple touches
US7809221B2 (en) * 2007-05-02 2010-10-05 Poa Sana Liquidating Trust Shadow detection in optical touch sensor through the linear combination of optical beams and grey-scale determination of detected shadow edges
CA2688214A1 (fr) * 2007-05-11 2008-11-20 Rpo Pty Limited Corps transmissif
KR101374418B1 (ko) * 2007-05-11 2014-03-17 엘지디스플레이 주식회사 멀티 터치 장치
KR20080100111A (ko) * 2007-08-08 2008-11-14 아페리오(주) 고밀도 패키지 기판 제조 방법
KR100942431B1 (ko) 2007-12-11 2010-02-17 주식회사 토비스 촬상소자와 광원을 이용한 터치 좌표 인식 방법 및 이를이용한 터치스크린 시스템
KR101338114B1 (ko) * 2007-12-31 2013-12-06 엘지디스플레이 주식회사 Ir 광원을 적용한 액정표시장치 및 이를 이용한 멀티터치시스템
US20090213093A1 (en) * 2008-01-07 2009-08-27 Next Holdings Limited Optical position sensor using retroreflection
KR100910024B1 (ko) * 2008-10-13 2009-07-30 호감테크놀로지(주) 선형 적외선 발광체를 이용한 카메라 방식의 터치 스크린
US8339378B2 (en) * 2008-11-05 2012-12-25 Smart Technologies Ulc Interactive input system with multi-angle reflector
US7931396B2 (en) * 2008-12-10 2011-04-26 Sharp Kabushiki Kaisha Backlight and display
WO2010134899A1 (fr) * 2009-05-20 2010-11-25 Tom Chang Panneau tactile optique
JP2010277122A (ja) * 2009-05-26 2010-12-09 Xiroku:Kk 光学式位置検出装置
TWI398804B (zh) * 2009-06-30 2013-06-11 Pixart Imaging Inc 光學觸控螢幕之位移偵測系統及其方法
KR101649314B1 (ko) 2009-09-01 2016-08-18 위순임 수광량을 이용한 터치 감지 장치 및 이를 이용한 터치 감지 방법
US8558804B2 (en) * 2009-12-14 2013-10-15 Silicon Motion, Inc. Touch control apparatus and touch point detection method

Non-Patent Citations (2)

Title
None
See also references of EP2511801A4

Cited By (8)

Publication number Priority date Publication date Assignee Title
US20130278563A1 (en) * 2012-04-19 2013-10-24 Hsun-Hao Chang Optical touch device and touch sensing method
CN103376954A (zh) * 2012-04-19 2013-10-30 纬创资通股份有限公司 光学触控装置及触控感测方法
US9235293B2 (en) 2012-04-19 2016-01-12 Wistron Corporation Optical touch device and touch sensing method
CN103376954B (zh) * 2012-04-19 2016-06-15 纬创资通股份有限公司 光学触控装置及触控感测方法
US20140055419A1 (en) * 2012-08-24 2014-02-27 Sung-han Kim Camera module for optical touchscreen
CN104317460A (zh) * 2012-08-24 2015-01-28 Mos株式会社 光学式触摸屏用摄像模块
US9367175B2 (en) * 2012-08-24 2016-06-14 Mos Co., Ltd. Camera module for optical touchscreen
CN104317460B (zh) * 2012-08-24 2017-07-11 Mos株式会社 光学式触摸屏用摄像模块

Also Published As

Publication number Publication date
KR101070864B1 (ko) 2011-10-10
EP2511801A4 (fr) 2015-01-14
JP2013513852A (ja) 2013-04-22
US20120299879A1 (en) 2012-11-29
CN102713807A (zh) 2012-10-03
CN102713807B (zh) 2015-04-29
US8780087B2 (en) 2014-07-15
JP5459689B2 (ja) 2014-04-02
EP2511801A2 (fr) 2012-10-17
KR20110066858A (ko) 2011-06-17
EP2511801B1 (fr) 2017-07-19
WO2011071305A3 (fr) 2011-11-03

Similar Documents

Publication Publication Date Title
WO2011071305A2 (fr) Écran tactile optique
US20100315383A1 (en) Touch screen adopting an optical module system using linear infrared emitters
CN102193685B (zh) 触摸位置检测设备
WO2018066761A1 (fr) Dispositif d'affichage
KR101604030B1 (ko) 어레이 방식의 후방 카메라를 이용한 멀티터치 센싱 장치
CN106934379A (zh) 一种指纹识别装置及指纹识别方法、触控显示装置
CN102449584A (zh) 光学位置检测设备
JP2010257089A (ja) 光学式位置検出装置
WO2009035227A2 (fr) Écran tactile utilisant une caméra à infra-rouge difficilement affectée par la lumière externe gênante
KR101657216B1 (ko) 터치 패널 및 터치 패널의 접촉 위치 검출 방법
WO2010137843A2 (fr) Écran tactile adoptant un système de balayage infrarouge
US11822747B2 (en) Electronic devices having moisture-insensitive optical touch sensors
KR20120063423A (ko) 광학식 터치스크린
WO2014038804A1 (fr) Système d'entrée utilisant un stylo électronique
WO2015102137A1 (fr) Film optique et système de crayon numérique utilisant celui-ci
JP2001175415A (ja) 座標入力/検出装置
WO2013035990A2 (fr) Système de stylo numérique et dispositif d'affichage
JP2010282463A (ja) タッチパネル装置
TWM617636U (zh) 立體座標觸控裝置
CN107193428A (zh) 光学式触摸屏及其触摸定位方法、以及光学畸变标定方法
WO2017030397A1 (fr) Dispopsitif d'affichage doté d'une fonction d'écran tactile optique
CN201859424U (zh) 显示器触控系统
WO2012121464A1 (fr) Écran tactile utilisant un corps émetteur de lumière
CN102122221B (zh) 光学式触摸屏、显示装置
JP2006260474A (ja) 遮光型座標入力装置

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080054237.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10836200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2012543024

Country of ref document: JP

REEP Request for entry into the european phase

Ref document number: 2010836200

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2010836200

Country of ref document: EP