WO2010006883A2 - Determining the location of one or more objects on a touch surface - Google Patents


Info

Publication number
WO2010006883A2
Authority
WO
WIPO (PCT)
Prior art keywords
light
touch
panel
points
touch surface
Prior art date
Application number
PCT/EP2009/057724
Other languages
French (fr)
Other versions
WO2010006883A3 (en)
Inventor
Tomas Christiansson
Ola Wassvik
Mattias Bryborn
Original Assignee
FlatFrog Laboratories AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FlatFrog Laboratories AB
Priority to US 12/737,016 (granted as US8482547B2)
Priority to EP 09779863.1A (granted as EP2318905B1)
Publication of WO2010006883A2
Publication of WO2010006883A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F3/0423Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen using sweeping light beams, e.g. using rotating or vibrating mirror
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04109FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location

Definitions

  • the present invention relates to touch-sensitive panels and data processing techniques in relation to such panels.
  • GUI: graphical user interface.
  • a fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel.
  • a dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.
  • US2004/0252091 discloses an alternative technique which is based on frustrated total internal reflection (FTIR).
  • Light is coupled into a panel to propagate inside the panel by total internal reflection.
  • Arrays of light sensors are located around the perimeter of the panel to detect the light.
  • the location of the object is determined by triangulation based on the attenuation of the light from each source at the array of light sensors.
  • US 3,673,327 discloses a similar technique in which arrays of light beam transmitters are placed along two edges of a panel to set up a grid of intersecting light beams that propagate through the panel by internal reflection. Corresponding arrays of beam sensors are placed at the opposite edges of the panel. When an object touches a surface of the panel, the beams that intersect at the point of touch will be attenuated. The attenuated beams on the arrays of detectors directly identify the location of the object.
  • In order for these FTIR devices to gain acceptance in the market, they must be operable in real-life situations. For example, after some use, it is likely that the touch surface becomes contaminated by deposits, such as fingerprints, fluids, dust, etc. Such contaminants may reduce the performance of the FTIR device. There is thus a general desire to design FTIR devices that provide adequate performance in terms of ability to determine the location and optionally also the size of one or more touching objects, with adequate precision even after substantial use in a real-life environment.
  • a first aspect of the invention is a method in a touch-sensing apparatus, which comprises a light transmissive panel that defines a touch surface and an opposite surface, and a light source arrangement for providing sheets of light inside the panel, wherein each sheet comprises light that propagates by internal reflection between the touch surface and the opposite surface from one or more incoupling points to a set of outcoupling points.
  • the apparatus further comprises a light sensor arrangement for generating one or more output signals that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object touching the touch surface locally attenuates at least two sheets of light.
  • the method comprises: identifying, in said output signal(s), a set of signal profiles originating from said object; determining at least part of an attenuated light path across the panel based on each signal profile; and identifying the position of the object on the touch surface based on the thus-determined attenuated light paths.
  • the attenuated light paths are determined by applying a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface and the opposite surface.
  • the inventive method makes it possible to deliberately add a light scattering structure to the touch surface or the opposite surface, in order to reduce the influence of fingerprints and other deposits, as well as scratches, that may occur during use of the touch-sensing apparatus.
  • the impact of light scattering caused by deposits may be accounted for by intermittently measuring/estimating the width function during use of the touch-sensing apparatus, and then applying the thus-obtained width function in the determination of attenuation paths.
  • the inventive method enables determination of both the position and the true size/area/shape of a touching object.
  • the method can be used to improve the position determination when multiple (two or more) objects are brought in contact with the touch surface. For multiple objects, the position determination will result in a larger number of candidate touch points than the actual number of objects. Thus, the candidate touch points will contain both true touch points and ghost touch points.
  • the inventive method makes it possible to determine the size/area/shape of the candidate touch points.
  • the thus-determined size/area/shape can be used to validate the candidate touch points, so as to distinguish between ghost touch points and true touch points.
  • the inventive method may improve the precision of the positions determined for multiple touching objects, and the method may also increase the number of touches that can be resolved for a given number of light sheets and/or outcoupling points of the touch-sensing apparatus.
  • the width function represents the factual width of the object given the detected signal profile, as a function of distance to the incoupling point.
  • At least one of the sheets is generated by sweeping an essentially collimated beam across a set of incoupling points on the panel.
  • said determining comprises, for each signal profile: reconstructing a center ray of the attenuated light path by geometrically retracing a center point of the signal profile to one of said incoupling points; determining a signal width of the signal profile; and determining an object width at one or more candidate positions along the center ray by applying said width function, thereby determining part of said attenuated light path.
  • said one or more candidate positions may be determined by triangulation using a set of center rays that are reconstructed from said set of signal profiles.
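The triangulation of candidate positions from back-traced center rays can be sketched as a small linear solve per ray pair. The following Python snippet is an illustrative assumption, not taken from the application; the function name and the test geometry are invented:

```python
import numpy as np

def intersect_rays(p1, d1, p2, d2):
    """Intersect two 2-D rays, each given by an origin point and a direction.

    Solves p1 + t1*d1 == p2 + t2*d2 for (t1, t2) and returns the
    intersection point, or None if the rays are (nearly) parallel.
    """
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    A = np.column_stack([d1, -d2])
    if abs(np.linalg.det(A)) < 1e-12:
        return None  # parallel center rays: no unique candidate position
    t1, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t1 * d1

# One center ray per sheet: beam 1 swept horizontally, beam 2 vertically.
candidate = intersect_rays([0.0, 3.0], [1.0, 0.0],
                           [5.0, 0.0], [0.0, 1.0])
```

With more than two sheets, each pair of non-parallel center rays contributes one candidate position; parallel pairs are simply skipped.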
  • At least one of the sheets is generated in the form of a diverging beam originating at the one or more incoupling points.
  • said determining may comprise, for each signal profile associated with said at least one sheet generated in the form of a diverging beam: identifying outcoupling points corresponding to limits of the signal profile; reconstructing limiting rays of the attenuated light path by geometrically retracing the thus-identified outcoupling points to a respective incoupling point; and modifying the distance between the limiting rays by applying said width function, thereby determining the attenuated light path.
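The fan-beam reconstruction above can be sketched as follows: the limits of the signal profile identify two outcoupling points, the wedge between the retraced limiting rays is interpolated at a candidate distance from the entry point, and the width function then narrows it. The linear wedge interpolation and the linear width-function coefficient below are assumptions for illustration only:

```python
import math

def wedge_width(entry, out_lo, out_hi, distance):
    """Uncorrected width of the attenuation wedge at `distance` from the entry point."""
    span = math.dist(out_lo, out_hi)   # signal profile width at the outcoupling side
    reach = math.dist(entry, out_lo)   # approximate length of the light path
    return span * distance / reach     # linear interpolation along the wedge

def corrected_width(entry, out_lo, out_hi, distance, broadening_per_mm=0.02):
    """Narrow the wedge by an assumed linear width function (scattering correction)."""
    return max(wedge_width(entry, out_lo, out_hi, distance)
               - broadening_per_mm * distance, 0.0)
```

For example, with an entry point at (0, 0) and profile limits at (400, 0) and (400, 20), the uncorrected wedge is 10 units wide halfway across, and the assumed width function reduces it to 6.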
  • said determining results in a set of candidate positions, and wherein said identifying the positions comprises: calculating a shape measure and/or an area measure for at least one candidate position based on the thus-determined attenuated light paths; and validating said at least one candidate position based on the shape measure and/or area measure.
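The validation step can be illustrated with a toy shape and area measure: a true touch point yields mutually consistent object widths along its attenuated light paths, while a ghost point typically does not. All names, the ellipse approximation, and the aspect-ratio threshold are illustrative assumptions:

```python
import math

def area_measure(widths):
    """Approximate the contact area as an ellipse spanned by two path widths."""
    w1, w2 = widths
    return math.pi * (w1 / 2.0) * (w2 / 2.0)

def shape_measure(widths):
    """Aspect ratio of the path widths; close to 1.0 for a round contact."""
    w1, w2 = widths
    return max(w1, w2) / min(w1, w2)

def validate(candidates, max_aspect=2.0):
    """Keep candidates whose widths are mutually consistent; cull ghost points."""
    return [c for c in candidates if shape_measure(c["widths"]) <= max_aspect]

# Two candidate touch points from triangulation; the second is a ghost
# point whose widths along the two attenuated paths disagree strongly.
candidates = [
    {"pos": (120, 80), "widths": (8.0, 9.0)},
    {"pos": (120, 200), "widths": (8.0, 40.0)},
]
true_points = validate(candidates)
```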
  • a second aspect of the invention is a computer program product comprising computer code which, when executed on a data-processing system, is adapted to carry out the method of the first aspect.
  • a third aspect of the invention is a device for determining a position of an object on a touch surface included in a touch-sensing apparatus, which comprises a light transmissive panel that defines the touch surface and an opposite surface, a light source arrangement for providing sheets of light inside the panel, wherein each sheet comprises light that propagates by internal reflection between the touch surface and the opposite surface from one or more incoupling points to a set of outcoupling points, a light sensor arrangement for generating one or more output signals that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object touching the touch surface locally attenuates at least two sheets of light.
  • the device comprises: an element for identifying, in said output signal(s), a set of signal profiles originating from the object; an element for determining at least part of an attenuated light path across the panel based on each signal profile; and an element for identifying the position of the object on the touch surface based on the thus-determined attenuated light paths.
  • the determining element is configured to apply a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface and the opposite surface.
  • a fourth aspect of the invention is a touch-sensing apparatus, comprising: a light transmissive panel that defines a touch surface and an opposite surface; a light source arrangement for providing sheets of light inside the panel, wherein each sheet comprises light that propagates by internal reflection between the touch surface and the opposite surface from one or more incoupling points to a set of outcoupling points; a light sensor arrangement for generating one or more output signals that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object touching the touch surface locally attenuates at least two sheets of light; and a device for determining a position according to the third aspect. Any one of the embodiments of the first aspect can be combined with the second to fourth aspects.
  • Fig. 1A is a side view of a touch-sensing apparatus
  • Fig. 1B is a top plan view of a fan beam embodiment of such a touch-sensing apparatus
  • Fig. 1C is a top plan view of a scan beam embodiment of such a touch-sensing apparatus.
  • Fig. 2A illustrates a spatial transmission signal generated in a fan beam embodiment
  • Fig. 2B illustrates a spatial transmission signal generated in a scan beam embodiment
  • Fig. 2C illustrates decoding by triangulation.
  • Figs 3A-3B are graphs of dispersion functions caused by scattering in a touch-sensing apparatus.
  • Figs 4A-4D are top plan views of a beam propagating inside a light transmissive panel, serving to illustrate the origin of the dispersion functions in Figs 3A-3B.
  • Figs 5A-5D are top plan views of a scan beam embodiment, to illustrate a reconstruction of attenuation paths.
  • Figs 6A-6B are top plan views of another scan beam embodiment, to illustrate a reconstruction of attenuation paths.
  • Fig. 7 is a flow chart of an exemplary decoding process.
  • Figs 8A-8D are top plan views of a fan beam embodiment, to illustrate a reconstruction of attenuation paths.
  • Fig. 9 is a graph of a dispersion function based on measurement data.
  • Fig. 10 is a block diagram of an embodiment of a device for determining touch locations.
  • Figs 11A-11B are top plan views of hybrid embodiments.
  • the present invention relates to techniques for detecting the location of an object on a touch surface of a touch-sensing apparatus.
  • the description starts out by presenting the use of frustrated total internal reflection (FTIR) for touch determination, in relation to a number of exemplifying arrangements for illuminating the interior of a light transmissive panel. Then, the influence of signal dispersion caused by scattering in the light transmissive panel is discussed. Finally, the use of signal dispersion for improving the touch determination process is discussed in relation to two principal arrangements for illuminating the light transmissive panel.
  • Fig. 1A is a side view of an exemplifying arrangement in a touch-sensing apparatus.
  • the arrangement includes a light transmissive panel 1, one or more light emitters 2 (one shown) and one or more light sensors 3 (one shown).
  • the panel defines two opposite and generally parallel surfaces 4, 5 and may be planar or curved.
  • a radiation propagation channel is provided between two boundary surfaces of the panel, wherein at least one of the boundary surfaces allows the propagating light to interact with a touching object 6.
  • the light from the emitter(s) 2 propagates by total internal reflection (TIR) in the radiation propagation channel, and the sensors 3 are arranged at the periphery of the panel 1 to generate a respective output signal which is indicative of the energy of received light.
  • the light may be coupled into and out of the panel 1 directly via the edge portion that connects the top and bottom surfaces 4, 5 of the panel 1.
  • Alternatively, the light may be coupled into and out of the panel 1 via a separate coupling element, e.g. in the shape of a wedge.
  • When the object 6 touches a boundary surface of the panel, e.g. the top surface 4, part of the light may be scattered by the object 6, part of the light may be absorbed by the object 6, and part of the light may continue to propagate unaffected.
  • Thus, the total internal reflection is frustrated and the energy of the transmitted light is decreased.
  • the location of the touching object 6 may be determined by measuring the energy of the light transmitted through the panel 1 from a plurality of different directions. This may, e.g., be done by operating a number of spaced-apart emitters 2 to generate a corresponding number of sheets of light inside the panel 1, and by operating the sensors 3 to detect the transmitted energy of each sheet of light. As long as the touching object attenuates at least two sheets of light, the position of the object can be determined by triangulation.
  • a data processing device 7 is configured to process the output signal(s) from the sensor(s) 3 to determine the location of the touching object 6. As indicated in Fig. 1A, the light will not be blocked by the touching object 6.
  • each touch point has a transmission in the range 0-1, but normally in the range 0.7-0.99.
  • Figs 1B and 1C illustrate exemplary light source arrangements for generating sheets of light inside the light transmissive panel 1, and light sensor arrangements for detecting the transmitted energy of each sheet.
  • each emitter 2 generates a beam B1, B2 of light that expands in the plane of the panel 1 while propagating away from the emitter 2.
  • Such a beam is denoted a fan beam, and this type of embodiment is generally referred to as a "fan beam embodiment" in the following.
  • Each fan beam B1, B2 propagates from one or more entry or incoupling points within an incoupling site on the panel 1 to form a sheet of light, which in this example is distributed essentially throughout the entire panel 1.
  • Arrays of light sensors 3 are located around the perimeter of the panel 1 to receive the light from the emitters 2 at a number of spaced-apart outcoupling points within an outcoupling site on the panel 1. The location of the object is determined by triangulation based on the attenuation of the light from each emitter 2 at the array of light sensors 3.
  • This type of touch-sensing apparatus is, e.g., known from aforesaid US2004/0252091, which is incorporated herein by reference.
  • each beam B1, B2 is generated and swept along a set of entry or incoupling points within an incoupling site on the panel 1 by an input scanner arrangement 8.
  • the entry points are located at the left and top edges of the panel 1.
  • the transmitted energy at a number of outcoupling points within an outcoupling site on the panel 1 is measured by an output scanner arrangement 9 which is synchronized with the input scanner arrangement 8 to receive the beam B1, B2 as it is swept across the panel 1.
  • the outcoupling points are located at the right and bottom edges of the panel 1 opposite to the entry points.
  • Each output scanner arrangement 9 typically includes a beam scanner and one light sensor (not shown).
  • each input scanner arrangement 8 typically includes a light emitter and a beam scanner (not shown).
  • Alternatively, two or more input scanner arrangements may share one and the same light emitter, and/or two or more output scanner arrangements may share one and the same light sensor.
  • more than two beams can be swept across the panel.
  • the beams are translated across the panel, i.e. they have an essentially invariant angle (scan angle) in the plane of the panel.
  • dedicated optical components are associated with the incoupling and outcoupling sites to re-direct the incoming light from input scanner arrangement 8 into a desired direction ("scan angle") and to re-direct the transmitted light towards a common focal area/point on the output scanner arrangement 9, respectively.
  • the beams B1, B2 are essentially parallel to a respective edge of the panel 1.
  • the scan angle of the beams B1, B2 changes as they are swept across the panel 1.
  • the illustrated embodiment and alternative configurations thereof are further disclosed in Applicant's U.S. provisional applications No. 61/129,372 and No. 61/129,373, both filed on June 23, 2008 and incorporated herein by reference.
  • each output scanner arrangement 9 and the redirecting optical components are replaced by a respective elongate sensor, which extends along the panel edge and is optically connected to the panel.
  • Each such elongate sensor is controlled to measure the received energy as a function of time, while a beam is swept along an incoupling site on the panel.
  • the transmitted energy is measured at a number of outcoupling points within an outcoupling site on the panel, wherein the outcoupling points correspond to different time points in the output signal of the elongate sensor.
  • each output scanner arrangement 9 is replaced by a stationary radiation detector, which is arranged in the aforesaid common focal area/point, spaced from the panel edge.
  • the stationary radiation detector is controlled to measure the received energy as a function of time, while a beam is swept along an incoupling site on the panel.
  • the output scanner arrangements 9 are omitted and replaced by retro-reflectors, such that the beams B1, B2 are reflected back to the respective input scanner arrangement 8.
  • the input scanner arrangements 8 are configured as transceivers that both sweep and receive a beam, to measure the transmitted energy.
  • the output signals of the sensor(s) 3 may be aggregated into a spatial transmission signal for each sheet of light.
  • the spatial transmission signal represents the received energy at different locations around the perimeter of the panel.
  • the spatial transmission signal could optionally be normalized by a background signal to represent the true transmission of light at the different locations, as will be further exemplified below.
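The normalisation mentioned above amounts to an element-wise division of the measured energy profile by a background profile recorded with no object on the surface. A minimal sketch; the array contents are invented for illustration:

```python
import numpy as np

def to_transmission(measured, background, eps=1e-9):
    """Normalise a spatial energy signal to transmission values in [0, 1]."""
    t = np.asarray(measured, float) / (np.asarray(background, float) + eps)
    return np.clip(t, 0.0, 1.0)

background = np.array([10.0, 10.0, 10.0, 10.0])   # no object on the surface
measured = np.array([10.0, 8.5, 9.0, 10.0])       # dip = touch signature
transmission = to_transmission(measured, background)
```

The dip in the resulting transmission signal (0.85 at its deepest point here) corresponds to the 0.7-0.99 range quoted for typical touch points.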
  • Fig. 2A illustrates a spatial transmission signal obtained from the array of sensors 3 at the right-end panel edge of the embodiment in Fig. 1B.
  • Fig. 2A also schematically illustrates a set of idealized light rays r1, ..., rN in the sheet of light that is generated by the bottom left-side emitter in Fig. 1B.
  • the spatial resolution of the transmission signal S1 depends, i.a., on the density of sensors 3 in the array of sensors.
  • the transmission signal S1 is illustrated to contain a signal profile P1 that originates from a touching object (not shown). Such a signal profile P1 is also denoted "touch signature" in the following.
  • Fig. 2B illustrates a spatial transmission signal S1 obtained from the right-end output scanner arrangement 9 in Fig. 1C, or any of the alternatives thereof as discussed above.
  • Fig. 2B also schematically illustrates a set of instances r1, ..., rN of the center ray of a light beam while it is swept across the panel 1.
  • the spatial transmission signal S1 could, e.g., be given as a function of scanning angle or time, which is equivalent to location along the right-end edge of the panel 1. It should be understood that the spatial resolution of the transmission signal S1 depends, i.a., on the sampling rate of the output scanner arrangement 9.
  • the transmission signal S1 is also illustrated to contain a touch signature P1 of a touching object (not shown).
  • Fig. 2C illustrates two spatial transmission signals S1, S2 obtained from the output scanner arrangements 9 in Fig. 1C.
  • all touch signatures P1, P2 are identified in the transmission signals S1, S2.
  • an attenuation path is determined, typically by tracing the center of the touch signature P1, P2 back to the corresponding entry point.
  • the back-traced center rays are illustrated by dashed lines. The location of the touching object 6 is given by the intersection of the center rays.
  • the present Applicant has now realized that the decoding process may be improved by deliberately causing the propagating light to be scattered at one or both of the boundary surfaces 4, 5 of the light transmissive panel 1 (Fig. 1), provided that the decoding process is appropriately designed to take the effects of scattering into account.
  • the scattering may be caused by a diffusing surface structure, also called an antiglare (AG) structure, on the touch surface 4.
  • An AG structure may be used to reduce glare from external lighting on the touch surface.
  • When the touching object is a naked finger, the contact between the finger and the touch surface normally leaves a fingerprint on the touch surface. On a perfectly flat surface, such fingerprints are clearly visible and usually unwanted.
  • With an AG structure, the visibility of fingerprints is reduced.
  • the friction between the finger and the touch surface decreases when an AG structure is used, which may thus improve the user experience.
  • AG structures are specified in gloss units (GU), where lower GU values result in less glare.
  • each internal reflection against such a scattering boundary surface will cause some light to be diverted away from the main direction of the beam and may also cause radiation to escape through the boundary surface.
  • the provision of an AG structure generally causes the beam to be broadened in the plane of the panel as the beam propagates from its entry point(s) on the panel. This broadening causes the shape of the touch signature in the spatial transmission signal to depend on the location of the touching object on the panel, specifically the distance between the touching object and the relevant incoupling/entry point.
  • Fig. 3A illustrates an exemplifying dependence between the width of the touch signature caused by a touching object and the distance between the touching object and the entry point.
  • the factual width of the touching object is denoted Wn.
  • the detected touch signature will be distinct and have a width similar to the factual width.
  • the detected touch signature will gradually broaden. Close to the outcoupling point, the width of the touch signature may again become slightly smaller. It is to be understood that the actual functional dependence between width and touch location is greatly dependent on the actual optical design of the touch-sensing apparatus, and that Fig. 3A is merely given as an example.
  • In Fig. 3A it can be seen that a small touching object located centrally between the entry and outcoupling points will yield the same touch signature width as a larger touching object located closer to the entry point.
  • This type of functional dependence is denoted a dispersion function in the following.
  • Fig. 3B is a graph of a dispersion function determined for the data in Fig. 3A.
  • Fig. 3B illustrates the factual object width at different locations that will generate the same touch signature width in the spatial transmission signal.
  • a dispersion function can be used to improve the precision and/or consistency in determining the location and/or size of one or more touching objects.
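As a toy illustration of how such a dispersion function might be applied, assume the signature width grows linearly with distance from the entry point; the factual object width is then recovered by subtracting the distance-dependent broadening. The linear model and its coefficient are assumptions only; as noted above, the true dependence is specific to the optical design of the apparatus:

```python
def factual_width(signature_width, distance, broadening_per_mm=0.02):
    """Invert an assumed linear dispersion model to estimate the factual object width."""
    return max(signature_width - broadening_per_mm * distance, 0.0)

# The same detected signature width implies a smaller object the farther
# it lies from the entry point, as in Fig. 3B:
near = factual_width(12.0, 50.0)    # object close to the entry point -> 11.0
far = factual_width(12.0, 400.0)    # object far from the entry point -> 4.0
```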
  • the origin of the dispersion function will now be further explained in relation to the scan beam embodiment of Fig. 1C.
  • the shape of the diverging set of rays from the entry point depends on many different factors, e.g. panel thickness, internal angle of incidence onto the boundary surfaces, AG structure, etc.
  • the resulting touch signature depends, apart from the diverging set of rays, on a number of other factors, e.g. detector surface area, detector numerical aperture, cross-section of injected light, etc.
  • detector-specific parameters typically have more impact on the touch signature for touch locations close to the outcoupling point.
  • emitter-specific properties mainly affect the touch signature for touch locations close to the entry point.
  • Fig. 4A is a plan view of the panel 1 in which a beam B1 is injected at an entry side and propagates to an outcoupling side. At the outcoupling side, the energy of the beam B1 is sensed within a confined area (indicated by ⁄ and denoted "receiving area" in the following).
  • the length of the receiving area ⁇ is dependent on the numerical aperture of the light sensor arrangement (e.g. the output scanner arrangement 9 in Fig. 1C), i.e. the range of angles over which the light sensor arrangement can accept light.
  • the beam B1 diverges as it propagates through the panel. Since the receiving area ⁄ has a finite length, it will only receive the central parts of the diverging beam B1 that reach the outcoupling side.
  • Fig. 4B indicates the outer rays that reach the receiving area ⁇ .
  • Fig. 4C illustrates the situation when an object 6 touches the panel 1 close to the entry side, in this example the left side.
  • The following discussion considers a touching object 6 that moves with respect to the beam B1, but the conclusions will be equally applicable for a stationary touching object and a moving beam (as in the scan beam embodiment).
  • Four different locations of the touching object 6 are shown in the left-hand part of Fig. 4C.
  • the touching object 6 interacts with the beam Bl over a short distance.
  • Fig. 4C also indicates that the touching object 6 interacts with a large part of the beam B1.
  • the resulting touch signature will be narrow (small width) and strong (low transmission).
  • Fig. 4D illustrates the situation when the object 6 touches the panel 1 further away from the entry side.
  • the touching object 6 interacts with the beam B1 over a longer distance. It is also seen that the touching object 6 interacts with a smaller portion of the beam B1. Therefore, the resulting touch signature will be wider and weaker.
  • the width of the touch signature will decrease slightly for locations to the right of the touching object 6 in Fig. 4D.
  • Such signature behaviour is also illustrated in the graph of Fig. 3A. It should be noted that such a decrease in signature width is only observed when the length of the receiving area ⁇ is smaller than the width of the dispersed beam at the outcoupling side (e.g. as shown in Fig. 4A). For example, in the above-mentioned variant where a single elongate sensor is arranged at the outcoupling side instead of the output scanner arrangement, a decrease in touch signature width is unlikely to be observed.
  • Figs 5A-5D illustrate a scan beam embodiment, in which three collimated non-parallel beams are swept (translated) across the panel, resulting in three transmission signals.
  • Fig. 5A illustrates the three beams B1-B3 and the resulting spatial transmission signals S1-S3.
  • a first beam B1, which is parallel to the top and bottom edges of the panel 1, is injected at the left side and detected at the right side of the panel 1, while being swept from the bottom to the top (or vice versa).
  • the resulting transmission signal S1 is shown to the right side of the panel 1.
  • a second beam B2, with a scan angle which is non-parallel to the edges of the panel 1, is injected at the top and is detected at the bottom, while being swept from left to right (or vice versa).
  • the resulting transmission signal S2 is shown at the bottom.
  • the resulting transmission signal S3 is shown at the top.
  • Each transmission signal S1-S3 contains a respective touch signature P1-P3, resulting from the touching object 6.
  • Fig. 5B illustrates the attenuated paths determined based on the touch signatures P1-P3, without considering the signal dispersion caused by scattering.
  • the attenuated paths have been reconstructed by tracing the limits of the touch signatures P1-P3 back to the corresponding entry points, as illustrated by the straight parallel lines extending from the limits of each peak P1-P3 along the associated beam path.
  • Fig. 5C illustrates the reconstruction of the attenuation path for the first beam B1 in Fig. 5A, using a dispersion function determined for this scan beam embodiment.
  • the dispersion function may be calculated theoretically or may be derived from measured data.
  • Fig. 5C includes two dispersion lines showing the factual width of a touching object 6 yielding the detected touch signature width as a function of the distance from the entry point. It is seen that if the touching object 6 is located close to the entry point, the factual width is essentially equal to the width of the touch signature. If the touching object 6 is located farther away from the entry point, its factual width has to be smaller in order to generate the detected touch signature Pl.
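As an illustration only, a width function of the kind described above could be sketched in code as below. The linear model, the function name and the slope k are assumptions made for this example; in practice the dispersion function is measured or calculated for the specific embodiment.

```python
def factual_width(signature_width, distance, k=0.05):
    """Estimate the factual width of a touching object from the width of
    its detected touch signature, assuming (illustratively) that the
    signature widens linearly with the distance between the entry point
    and the object, at a dispersion rate k per unit distance.

    Close to the entry point, the factual width is essentially equal to
    the signature width; farther away, it must be smaller to yield the
    same detected signature.
    """
    return max(signature_width - k * distance, 0.0)
```

Such a function would play the role of the dispersion lines in Fig. 5C: for a given detected signature width, it returns the implied object width at each candidate distance along the beam.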
  • Fig. 5D illustrates the reconstructed attenuation paths for the touch signatures P1-P3.
  • a first transmission signal S1 is generated by sensing the transmitted energy of a beam B1 which is parallel to the top and bottom edges of the panel 1 and which is injected at the left side and outcoupled at the right side of the panel 1.
  • a second transmission signal S2 is generated by sensing the transmitted energy of a beam B2 which is parallel to the left and right edges of the panel 1 and which is injected at the bottom side and outcoupled at the top side of the panel 1.
  • each transmission signal S1, S2 contains two touch signatures P1a, P1b, P2a, P2b, each resulting from one of the touching objects 6.
  • Fig. 6A also illustrates the attenuation paths (corrected attenuation paths) that have been reconstructed based on the touch signatures P1a, P1b, P2a, P2b while applying the dispersion function for this embodiment.
  • Fig. 6A also illustrates the attenuation paths (uncorrected attenuation paths) that are obtained without applying the dispersion function.
  • the attenuation paths form four polygonal intersections, with each intersection being a candidate location c1-c4.
  • Fig. 6B illustrates the spatial transmission signals S1, S2 that are generated when the two touching objects 6 are located at the ghost locations in Fig. 6A.
  • the intersections c1, c4 at the ghost points are also square, but one intersection has a very small area, and the other intersection has a significantly larger area.
  • Fig. 7 is a flow chart for an exemplifying decoding process that may be used to identify touch locations in any one of the above-described scan beam arrangements.
  • the process obtains the output signals from the light sensors, typically by sampling data values from the output signal at given time intervals.
  • the output signals are processed to form a sample vector for each sheet of light, each sample vector including a set of data values associated with different time points.
  • this processing may involve filtering the output signals for suppression of noise and/or ambient light, combining output signals from different sensors, interpolating the output signals, etc.
  • the sample vector is then used as a spatial transmission signal, optionally after dividing the sample vector by background data.
  • the background data may be a corresponding sample vector that represents the received energy without any object touching the touch surface.
  • the background data may be pre-set or obtained during a separate calibration step.
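The normalisation against background data could, as a minimal sketch, look as follows (the function name and the use of NumPy are choices made for this example, not part of the embodiments):

```python
import numpy as np

def spatial_transmission_signal(sample_vector, background, eps=1e-9):
    """Divide a sample vector by background data recorded without any
    object touching the touch surface. Values near 1.0 indicate full
    transmission; dips below 1.0 are candidate touch signatures."""
    sample = np.asarray(sample_vector, dtype=float)
    bg = np.asarray(background, dtype=float)
    # Guard against zero background values (e.g. dead sensor samples).
    return sample / np.maximum(bg, eps)
```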
  • each spatial transmission signal is processed to identify one or more peaks that may originate from touching objects.
  • the identified peaks correspond to the above-discussed touch signatures.
  • In step 704, the center point of each peak is identified. This step may or may not involve interpolating the data values in the transmission signal. Using the center point, and knowing the scan angle of the beam at each data value in the spatial transmission signal, the process determines a center ray (cf. Fig. 2C) for each center point. Further, the width of each peak in the spatial transmission signals is determined.
  • In step 705, the intersections between the center rays are determined by triangulation. These intersections form candidate touch points.
  • In step 706, the factual width at each intersection is calculated for each peak in the transmission signal, using a dispersion function and the peak width.
  • the peak width and location data for an intersection may be input to a function of the type shown in Fig. 3B, to output the factual width at the intersection.
  • Step 706 results in width data for each candidate touch point.
  • In step 707, the process determines the most probable set of true touch points among the candidate touch points.
  • the true touch points may be identified by calculating an area value for each candidate touch point and matching the area values to an area measure, or by calculating a shape value for each candidate touch point and matching the shape values to a shape measure, or a combination thereof.
  • the true touch points are output by the process.
  • a second validation sub-step may be configured to compute the area of the candidate points, e.g. as wx*wy. If the bottom-left candidate touch point c1 is significantly larger than the other candidate touch points, at the same time as the top-right candidate point c4 is smaller than the other candidate touch points, the process concludes that the bottom-left and top-right candidate touch points c1, c4 are ghost points (cf. Fig. 6B).
  • the process could be configured to validate only the top-left candidate point c2 or the bottom-right candidate point c3 according to the first validation sub-step.
  • the skilled person understands that there are numerous alternative implementations of the validation step 707, depending e.g. on the number of touches to be resolved, the dispersion function, the shape and area of the objects, the shape and area variations among the objects, etc.
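For the orthogonal two-beam geometry of Fig. 6, steps 703-706 can be outlined in code as below. This is a simplified sketch under assumed names: peaks are taken as contiguous dips below a fixed threshold, the default dispersion function is a pass-through, and the validation of step 707 is left to the caller.

```python
import numpy as np
from itertools import product

def find_peaks(signal, threshold=0.9):
    """Steps 703-704: locate contiguous dips below the threshold in a
    spatial transmission signal and return (center, width) per peak."""
    below = np.flatnonzero(np.asarray(signal) < threshold)
    peaks, start = [], None
    for i, idx in enumerate(below):
        if start is None:
            start = idx
        if i + 1 == len(below) or below[i + 1] != idx + 1:
            peaks.append(((start + idx) / 2.0, float(idx - start + 1)))
            start = None
    return peaks

def decode(s1, s2, dispersion=lambda w, d: w):
    """Steps 705-706: each peak in s1 (beam swept bottom to top) gives a
    y coordinate, each peak in s2 (beam swept left to right) gives an x
    coordinate; every pairing is a candidate touch point whose area is
    the product of the dispersion-corrected factual widths."""
    candidates = []
    for (y, wy), (x, wx) in product(find_peaks(s1), find_peaks(s2)):
        area = dispersion(wy, x) * dispersion(wx, y)
        candidates.append({"pos": (x, y), "area": area})
    return candidates
```

In step 707, the candidate areas returned here would then be matched against an area measure (and/or a shape measure) to separate true touch points from ghost points.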
  • the above example demonstrates that it is generally possible to improve the decoding process by applying a dispersion function in the reconstruction of attenuation paths based on spatial transmission signals generated by sweeping a number of collimated non-parallel beams inside a light transmissive panel.
  • a center ray is first reconstructed for each touch signature by geometrically retracing a center point of the touch signature to a corresponding entry point (step 704). Then, a set of candidate touch points is determined by triangulating the reconstructed center rays (step 705), whereupon the dispersion function is applied to determine factual widths at each candidate touch point (step 706). Thus, the corrected attenuation path is only determined at the candidate touch points.
  • corrected attenuation paths are determined before the triangulation, i.e. the dispersion function is first applied to reconstruct the full attenuation path from the outcoupling side to the entry side. Then, the full attenuation paths are intersected in a triangulation step, which thus results in both the locations and the factual widths of the candidate touch points.
  • the spatial transmission signals may be generated to represent only part of the sample vector.
  • steps 702 and 703 may be combined such that touch signatures are first identified in the sample vectors, whereupon spatial transmission signals are generated only for one or more sample points within the touch signatures in the sample vectors.
  • the corrected attenuation paths may be calculated at a reduced accuracy or resolution, e.g. by approximating the outer limits of the corrected attenuation paths by a straight line.
  • the decoding process may be based on any available image reconstruction algorithm, and especially few-view algorithms that are used in, e.g., the field of tomography. Any such algorithm can be modified to account for dispersion, as long as the dispersion function is known.
  • the use of the dispersion function in the decoding process will be discussed in relation to fan beam embodiments.
  • the fan beam generally originates from a smaller incoupling site and expands in the plane of the panel even in the absence of scattering, i.e. without an AG structure.
  • the touch signatures contain information about the distance between the entry point and the touching object, even if the panel is not provided with an AG structure.
  • the presence of scattering causes the fan beam to broaden in the plane of the panel while propagating from the incoupling site towards the outcoupling site.
  • a dispersion function can be measured or calculated for a specific fan beam embodiment.
  • in Figs 8A-8D, the panel is internally illuminated by two fan beams.
  • several sensors are arranged at two sides of the panel to measure the transmitted energy of the sheets of light formed by the fan beams.
  • Fig. 8A illustrates a touch-sensing apparatus with no AG structure on the boundary surfaces.
  • the thin lines indicate the boundaries of the light cones from two different emitters (not shown) that interact with two touching objects 6. It should be noted that although the light cones are drawn as if they originate from a singularity (i.e. a single incoupling point), in practice they generally originate from an extended incoupling site that may include more than one incoupling point for the decoding process.
  • Fig. 8A also illustrates the intersections of attenuation paths that are reconstructed from the resulting touch signatures P1a, P1b, P2a, P2b in the transmission signals S1, S2. The intersections form four candidate locations c1-c4, of which at least three are similar in size. Thus, it may be difficult to identify the true locations based on the touch signatures.
  • Fig. 8B illustrates the spatial transmission signals S1, S2 obtained in the touch-sensing apparatus of Fig. 8A if at least one of its boundary surfaces is provided with an AG structure.
  • Fig. 8B shows that the touch signatures P1a, P1b, P2a, P2b have become wider and slightly weaker as a result of the scattering.
  • Fig. 8B indicates the light cones (in thin lines) that interact with the objects 6, as well as the corrected attenuation paths (in thicker lines), which are obtained by applying the appropriate dispersion function to the touch signatures P1a, P1b, P2a, P2b in the transmission signals S1, S2.
  • intersections at the true locations c1, c3 remain essentially unchanged, whereas the areas of the intersections at the ghost locations c2, c4 change considerably (decreasing and increasing in area, respectively). Both of these changes in intersection area make it easier to distinguish true locations from ghost locations among the candidate locations in a multi-touch scenario.
  • Fig. 8C further illustrates the areas of the intersections between uncorrected attenuation paths (i.e. the aforesaid light cones) in the touch-sensing apparatus of Fig. 8B.
  • the size of the touches will be incorrectly estimated if the dispersion function is not taken into account.
  • Fig. 8D illustrates the spatial transmission signals S1, S2 that are generated when the two touching objects 6 are located at the ghost locations c2, c4 in Fig. 8C.
  • the corrected attenuation paths are shown in thicker lines.
  • the intersections that correspond to the touching objects 6 have essentially the same shape and area
  • the intersections at the ghost locations c1, c3 are thin and elongate.
  • true locations can be distinguished from ghost locations among the candidate locations in a multi-touch scenario.
  • the decoding process could be implemented according to any of the decoding embodiments discussed above in relation to the scan beam embodiments, e.g. according to the process shown in Fig. 7.
  • a center ray is first reconstructed for each touch signature by geometrically retracing a center point of the touch signature to a corresponding entry point. Then, a set of candidate touch points is determined by triangulating the reconstructed center rays, whereupon the factual width at each candidate touch point is determined by applying a geometric function to reconstruct the uncorrected fan beam width at the candidate touch point (resulting in the width between the outer lines at each intersection c1-c4 in Fig. 8B) and by applying the dispersion function to modify the thus-determined fan beam width (resulting in the width between the inner lines at each intersection c1-c4 in Fig. 8B), or vice versa.
  • the geometric function and the dispersion function could be combined into a single correction function which is applied to determine the factual widths.
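A single correction function of this kind could be sketched as below, assuming a point-like incoupling site, a linear geometric expansion of the fan, and a linear scattering term; all names and coefficients are illustrative, not taken from the embodiments.

```python
def fan_factual_width(signature_width, distance, total_path, k=0.05):
    """Combine the geometric function and the dispersion function for a
    fan beam originating at a point-like incoupling site.

    Dispersion part: subtract the scattering-induced broadening picked
    up over the remaining path from the object to the outcoupling side.
    Geometric part: a shadow cast at `distance` from the incoupling
    point expands by the factor total_path / distance before it is
    detected, so the de-scattered signature width is rescaled by the
    inverse of that factor.
    """
    descattered = max(signature_width - k * (total_path - distance), 0.0)
    return descattered * (distance / total_path)
```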
  • corrected attenuation paths are determined before the triangulation, i.e. the geometric function and the dispersion function are first applied to reconstruct the full attenuation path from the outcoupling side to the entry point. Then, the full attenuation paths are intersected in a triangulation step, which thus results in both the locations and the factual widths of the candidate touch points.
  • Fig. 9 is a graph of measurement data obtained from a scan beam embodiment of the type shown in Fig. 6, wherein the measurement data has been obtained for a rectangular light transmissive panel with a 37-inch diagonal.
  • the graph shows the measured half-width of the touch signature as a function of the distance between the entry point (e.g. located on the left side of the panel in Fig. 6A) and the touching object.
  • this graph corresponds to the graph in Fig. 3A.
  • the touch signature width is clearly dependent on the distance from the entry point (and also on the distance to the outcoupling point).
  • the dispersion function may be given by the actual measurement data, suitably after recalculation into a function as shown in Fig. 3B, or the dispersion function may be derived based on a suitable function that is fitted to the measurement data.
  • a fitted function may be linear, polygonal, a spline, etc.
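A least-squares fit of a linear dispersion model to such measurement data might be sketched as below. The numeric pairs are made-up placeholders standing in for the measured data of Fig. 9, and the linear model is only one of the admissible choices.

```python
import numpy as np

# Made-up (distance, half-width) pairs standing in for the Fig. 9 data.
distance = np.array([50.0, 150.0, 300.0, 450.0, 600.0])
half_width = np.array([6.0, 9.5, 15.0, 20.4, 26.1])

# Fit half_width ~= k * distance + w0 by least squares; a polynomial or
# spline fit could be used instead for non-linear dispersion data.
k, w0 = np.polyfit(distance, half_width, deg=1)

def predicted_half_width(d):
    """Predicted touch signature half-width at entry-to-object distance d."""
    return k * d + w0
```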
  • a corresponding dispersion function can be measured without any difficulty in a fan beam embodiment.
  • the decoding process is typically executed by a data processing device (cf. 7 in Fig. 1A) connected to receive the output signal(s) of the sensor(s).
  • An example of such a data processing device 7 is shown in Fig. 10.
  • the device includes an element (or means) 10 for identifying, in the output signal(s), a set of touch signatures originating from touching objects.
  • the device further includes an element (or means) 11 for determining at least part of an attenuated light path across the panel based on each touch signature, by applying the dispersion function as described in the foregoing.
  • an element (or means) 12 for identifying the positions of the touching objects on the touch surface based on the attenuated light paths.
  • the device 7 may be implemented by special-purpose software (or firmware) run on one or more general-purpose or special-purpose computing devices.
  • each "element” or “means” of such a computing device refers to a conceptual equivalent of a method step; there is not always a one-to-one correspondence between elements/means and particular pieces of hardware or software routines.
  • One piece of hardware sometimes comprises different means/elements.
  • a processing unit serves as one element/means when executing one instruction, but serves as another element/means when executing another instruction.
  • one element/means may be implemented by one instruction in some cases, but by a plurality of instructions in some other cases.
  • Such a software controlled computing device may include one or more processing units, e.g. a CPU ("Central Processing Unit"), a DSP ("Digital Signal Processor"), an ASIC ("Application-Specific Integrated Circuit"), discrete analog and/or digital components, or some other programmable logical device, such as an FPGA ("Field Programmable Gate Array").
  • the computing device may further include a system memory and a system bus that couples various system components including the system memory to the processing unit.
  • the system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
  • the system memory may include computer storage media in the form of volatile and/or non-volatile memory such as read only memory (ROM), random access memory (RAM) and flash memory.
  • the special-purpose software may be stored in the system memory, or on other removable/non-removable volatile/non-volatile computer storage media which is included in or accessible to the computing device, such as magnetic media, optical media, flash memory cards, digital tape, solid state RAM, solid state ROM, etc.
  • the computing device may include one or more communication interfaces, such as a serial interface, a parallel interface, a USB interface, a wireless interface, a network adapter, etc, as well as one or more data acquisition devices, such as an A/D converter.
  • One or more I/O devices may be connected to the computing device, via a communication interface, including e.g. a keyboard, a mouse, a touch screen, a display, a printer, a disk drive, etc.
  • the special-purpose software may be provided to the computing device on any suitable computer-readable medium, including a record medium, a read-only memory, or an electrical carrier signal.
  • the emitters 2 can operate in any suitable wavelength range, e.g. in the infrared or visible wavelength region. All sheets of light could be generated with identical wavelength. Alternatively, different sheets could be generated with light in different wavelength ranges, permitting differentiation between the sheets based on wavelength. Furthermore, the emitters 2 can output either continuous or pulsed radiation. Still further, the emitters 2 may be activated concurrently or sequentially. Any type of emitter capable of emitting light in a desired wavelength range could be used, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc.
  • the transmitted energy may be measured by any type of light sensor 3 capable of converting light into an electrical signal.
  • a "light sensor” implies a O-dimensional light detector.
  • the light sensor 3 may be a single light sensing element such as a photo-detector or a pixel on a CCD or CMOS detector.
  • the light sensor 3 may be formed by a group of light sensing elements that are combined for 0-dimensional light detection, by summing/averaging the output of the individual elements in hardware or software.
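In software, combining a group of sensing elements into a single 0-dimensional reading could be as simple as the sketch below (the function name is invented for the example, and summing versus averaging is a design choice):

```python
import numpy as np

def combine_elements(pixel_values, average=False):
    """Combine the outputs of a group of light sensing elements (e.g. a
    row of CCD/CMOS pixels) into one 0-dimensional energy reading."""
    values = np.asarray(pixel_values, dtype=float)
    return float(values.mean() if average else values.sum())
```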
  • the light transmissive panel 1 may be made of any material that transmits a sufficient amount of light in the relevant wavelength range to permit a sensible measurement of transmitted energy.
  • the panel 1 may be made of a single material or may be formed by layers of different materials.
  • the internal reflections in the touch surface are caused by total internal reflection (TIR), resulting from a difference in refractive index between the material of the panel and the surrounding medium, typically air.
  • the reflections in the opposite boundary surface may be caused either by TIR or by a reflective coating applied to the opposite boundary surface.
  • the AG structure may be provided as a dedicated layer attached to the touch surface.
  • the AG structure may be integrated into the touch surface, e.g. by etching, engraving, stamping or moulding.
  • the AG structure may be provided on the surface opposite to the touch surface.
  • With an AG structure on the touch surface, the touch-sensing apparatus will have inherently large signal dispersion, reducing the influence of additional sources of scattering, such as contaminants and scratches.
  • both boundary surfaces of the panel may be used as touch surfaces.
  • the desired scattering may be caused by other means than a dedicated AG structure, e.g. bulk scattering in the panel.
  • a gloss unit (GU) value of 10-200, e.g. 100-120, for one of the boundary surfaces of the panel would result in a sufficient amount of scattering for the purpose of the present invention.
  • Fig. 11A is a variant of the fan beam embodiments discussed in the foregoing.
  • several light emitters 2 are arranged around the perimeter of the panel 1 to emit a respective fan beam (as indicated by diverging lines).
  • the fan beam propagates inside the panel by TIR, at least in the touch surface 4.
  • a large number of light sensors 3 are arranged around the perimeter to measure the received energy from the different sheets of light generated by the emitters 2. It has been found that the benefit from applying the dispersion function when reconstructing the attenuation paths increases with the number of fan beams, since the inconsistencies in area/shape for the ghost points will become larger and larger whilst the true touch points will have better and better correspondence in area/shape.
  • Another exemplifying arrangement is shown in Fig. 11B, which is a variant of the scan beam embodiments discussed in the foregoing.
  • two groups of light emitters 2 are arranged along two edges of the panel 1.
  • Each emitter 2 is configured to emit a collimated beam that is coupled into the panel to propagate by TIR, at least in the touch surface 4.
  • the emitters 2 are activated sequentially, in any order.
  • the emitters 2 may be activated to simulate a beam sweep, as in the scan beam embodiments.
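Sequential activation to emulate a sweep could be sketched as below; the emitter interface (on/off) and the read_sensor callback are hypothetical names invented for this example, not part of the embodiments.

```python
def simulated_sweep(emitters, read_sensor):
    """Emulate a swept beam by activating stationary emitters one at a
    time and recording the transmitted energy for each; the resulting
    list plays the role of the sample vector in the scan beam decoding."""
    samples = []
    for emitter in emitters:
        emitter.on()                # hypothetical emitter interface
        samples.append(read_sensor())
        emitter.off()
    return samples
```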
  • a respective elongate light sensor 3 is arranged at the opposite edges to measure the received energy from the different sheets of light generated by the emitters 2. It should be understood that the decoding process described for the above scan beam embodiments is equally applicable to this arrangement.
  • Other exemplifying arrangements include hybrids of the fan beam and scan beam embodiments.
  • One such hybrid may be formed based on one of the above-described fan beam embodiments, by replacing one or more stationary emitters generating a fan beam with one or more stationary emitters generating a collimated beam.
  • Another hybrid may be formed based on one of the above-described scan beam embodiments, by sweeping fan beams or converging beams instead of collimated beams. The skilled person will readily realize how to implement the above teachings in the decoding process for the hybrid embodiments.

Abstract

A device operates on output signals (S1, S2) from a light sensor arrangement in a touch-sensing apparatus to determine a position of an object (6) on a touch surface (4). The apparatus includes a light transmissive panel (1) that defines the touch surface (4) and an opposite surface. A light source arrangement provides sheets of light inside the panel (1), wherein each sheet comprises light that propagates by internal reflection between the touch surface (4) and the opposite surface from one or more incoupling points to a set of outcoupling points. The light sensor arrangement generates the output signals (S1, S2), which represent light reaching the outcoupling points. The apparatus is configured such that an object (6) touching the touch surface (4) locally attenuates at least two sheets of light. To determine the position, the device identifies, in the output signals (S1, S2), a set of signal profiles (P1a, P1b, P2a, P2b) originating from said object (6). The device determines at least part of an attenuated light path across the panel (1) based on each signal profile (P1a, P1b, P2a, P2b), and identifies the position of the object (6) on the touch surface (4) based on the thus-determined attenuated light paths. In determining the attenuated light path, the device applies a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface (4) and the opposite surface (5).

Description

DETERMINING THE LOCATION OF ONE OR MORE OBJECTS ON A
TOUCH SURFACE
Cross-Reference to Related Applications
The present application claims the benefit of Swedish patent application No. 0801467-2, filed on June 23, 2008, U.S. provisional application No. 61/129,372, filed on June 23, 2008, Swedish patent application No. 0900138-9, filed on February 5, 2009, and U.S. provisional application No. 61/202,208, filed on February 5, 2009, all of which are incorporated herein by reference.
Technical Field
The present invention relates to touch-sensitive panels and data processing techniques in relation to such panels.
Background Art
To an increasing extent, touch-sensitive panels are being used for providing input data to computers, electronic measurement and test equipment, gaming devices, etc. The panel may be provided with a graphical user interface (GUI) for a user to interact with using e.g. a pointer, stylus or one or more fingers. The GUI may be fixed or dynamic. A fixed GUI may e.g. be in the form of printed matter placed over, under or inside the panel. A dynamic GUI can be provided by a display screen integrated with, or placed underneath, the panel or by an image being projected onto the panel by a projector.
There are numerous known techniques for providing touch sensitivity to the panel, e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, or by incorporating resistive wire grids, capacitive sensors, strain gauges, etc into the panel.
US2004/0252091 discloses an alternative technique which is based on frustrated total internal reflection (FTIR). Light is coupled into a panel to propagate inside the panel by total internal reflection. Arrays of light sensors are located around the perimeter of the panel to detect the light. When an object comes into contact with a surface of the panel, the light will be locally attenuated at the point of touch. The location of the object is determined by triangulation based on the attenuation of the light from each source at the array of light sensors.
US 3,673,327 discloses a similar technique in which arrays of light beam transmitters are placed along two edges of a panel to set up a grid of intersecting light beams that propagate through the panel by internal reflection. Corresponding arrays of beam sensors are placed at the opposite edges of the panel. When an object touches a surface of the panel, the beams that intersect at the point of touch will be attenuated. The attenuated beams on the arrays of detectors directly identify the location of the object.
In order for these FTIR devices to gain acceptance in the market, they must be operable in real-life situations. For example, after some use, it is likely that the touch surface becomes contaminated by deposits, such as fingerprints, fluids, dust, etc. Such contaminants may reduce the performance of the FTIR device. There is thus a general desire to design FTIR devices that provide adequate performance in terms of ability to determine the location and optionally also the size of one or more touching objects, with adequate precision even after substantial use in a real-life environment.
Summary of the Invention
It is an object of the invention to at least partly overcome one or more of the above- identified limitations of the prior art.
This and other objects, which may appear from the description below, are at least partly achieved by means of a method, a computer program product, a device for determining positions, and a touch-sensing apparatus according to the independent claims, embodiments thereof being defined by the dependent claims.
A first aspect of the invention is a method in a touch-sensing apparatus, which comprises a light transmissive panel that defines a touch surface and an opposite surface, and a light source arrangement for providing sheets of light inside the panel, wherein each sheet comprises light that propagates by internal reflection between the touch surface and the opposite surface from one or more incoupling points to a set of outcoupling points. The apparatus further comprises a light sensor arrangement for generating one or more output signals that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object touching the touch surface locally attenuates at least two sheets of light. The method comprises: identifying, in said output signal(s), a set of signal profiles originating from said object; determining at least part of an attenuated light path across the panel based on each signal profile; and identifying the position of the object on the touch surface based on the thus-determined attenuated light paths. In this method, the attenuated light paths are determined by applying a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface and the opposite surface.
By applying the width function, it is possible to reduce or essentially eliminate any negative impact of light scattering on the ability of the touch-sensing apparatus to determine the position of the object. In fact, by accounting for light scattering at the touch surface and/or the opposite surface, the inventive method makes it possible to deliberately add a light scattering structure to the touch surface or the opposite surface, in order to reduce the influence of fingerprints and other deposits, as well as scratches, that may occur during use of the touch-sensing apparatus. Alternatively or additionally, the impact of light scattering caused by deposits may be accounted for by intermittently measuring/estimating the width function during use of the touch- sensing apparatus, and then applying the thus-obtained width function in the determination of attenuation paths.
Since the inventive method enables determination of attenuation paths that are more or less corrected for the influence of light scattering, it should be understood that the method enables determination of both the position and the true size/area/shape of a touching object. By the same token, the method can be used to improve the position determination when multiple (two or more) objects are brought in contact with the touch surface. For multiple objects, the position determination will result in a larger number of candidate touch points than the actual number of objects. Thus, the candidate touch points will contain both true touch points and ghost touch points. As noted above, the inventive method makes it possible to determine the size/area/shape of the candidate touch points. The thus-determined size/area/shape can be used to validate the candidate touch points, so as to distinguish between ghost touch points and true touch points. Thus, the inventive method may improve the precision of the positions determined for multiple touching objects, and the method may also increase the number of touches that can be resolved for a given number of light sheets and/or outcoupling points of the touch-sensing apparatus.
In one embodiment, the width function represents the factual width of the object given the detected signal profile, as a function of distance to the incoupling point.
In one embodiment, at least one of the sheets is generated by sweeping an essentially collimated beam across a set of incoupling points on the panel. In one embodiment, said determining comprises, for each signal profile: reconstructing a center ray of the attenuated light path by geometrically retracing a center point of the signal profile to one of said incoupling points; determining a signal width of the signal profile; and determining an object width at one or more candidate positions along the center ray by applying said width function, thereby determining part of said attenuated light path. Here, said one or more candidate positions may be determined by triangulation using a set of center rays that are reconstructed from said set of signal profiles.
In one embodiment, at least one of the sheets is generated in the form of a diverging beam originating at the one or more incoupling points. In such an embodiment, said determining may comprise, for each signal profile associated with said at least one sheet generated in the form of a diverging beam: identifying outcoupling points corresponding to limits of the signal profile; reconstructing limiting rays of the attenuated light path by geometrically retracing the thus-identified outcoupling points to a respective incoupling point; and modifying the distance between the limiting rays by applying said width function, thereby determining the attenuated light path.
In one embodiment, said determining results in a set of candidate positions, and wherein said identifying the positions comprises: calculating a shape measure and/or an area measure for at least one candidate position based on the thus-determined attenuated light paths; and validating said at least one candidate position based on the shape measure and/or area measure.
A second aspect of the invention is a computer program product comprising computer code which, when executed on a data-processing system, is adapted to carry out the method of the first aspect.
A third aspect of the invention is a device for determining a position of an object on a touch surface included in a touch-sensing apparatus, which comprises a light transmissive panel that defines the touch surface and an opposite surface, a light source arrangement for providing sheets of light inside the panel, wherein each sheet comprises light that propagates by internal reflection between the touch surface and the opposite surface from one or more incoupling points to a set of outcoupling points, a light sensor arrangement for generating one or more output signals that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object touching the touch surface locally attenuates at least two sheets of light. The device comprises: an element for identifying, in said output signal(s), a set of signal profiles originating from the object; an element for determining at least part of an attenuated light path across the panel based on each signal profile; and an element for identifying the position of the object on the touch surface based on the thus-determined attenuated light paths. The determining element is configured to apply a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface and the opposite surface.
A fourth aspect of the invention is a touch-sensing apparatus, comprising: a light transmissive panel that defines a touch surface and an opposite surface; a light source arrangement for providing sheets of light inside the panel, wherein each sheet comprises light that propagates by internal reflection between the touch surface and the opposite surface from one or more incoupling points to a set of outcoupling points; a light sensor arrangement for generating one or more output signals that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object touching the touch surface locally attenuates at least two sheets of light; and a device for determining a position according to the third aspect. Any one of the embodiments of the first aspect can be combined with the second to fourth aspects.
Still other objectives, features, aspects and advantages of the present invention will appear from the following detailed description, from the attached claims as well as from the drawings.
Brief Description of Drawings
Embodiments of the invention will now be described in more detail with reference to the accompanying schematic drawings. Fig. 1A is a side view of a touch-sensing apparatus, Fig. 1B is a top plan view of a fan beam embodiment of such a touch-sensing apparatus, and Fig. 1C is a top plan view of a scan beam embodiment of such a touch-sensing apparatus.
Fig. 2A illustrates a spatial transmission signal generated in a fan beam embodiment, Fig. 2B illustrates a spatial transmission signal generated in a scan beam embodiment, and Fig. 2C illustrates decoding by triangulation.
Figs 3A-3B are graphs of dispersion functions caused by scattering in a touch-sensing apparatus.
Figs 4A-4D are top plan views of a beam propagating inside a light transmissive panel, serving to illustrate the origin of the dispersion functions in Figs 3A-3B. Figs 5A-5D are top plan views of a scan beam embodiment, to illustrate a reconstruction of attenuation paths.
Figs 6A-6B are top plan views of another scan beam embodiment, to illustrate a reconstruction of attenuation paths.
Fig. 7 is a flow chart of an exemplary decoding process. Figs 8A-8D are top plan views of a fan beam embodiment, to illustrate a reconstruction of attenuation paths.
Fig. 9 is a graph of a dispersion function based on measurement data. Fig. 10 is a block diagram of an embodiment of a device for determining touch locations. Figs 11A-11B are top plan views of hybrid embodiments.
Detailed Description of Example Embodiments
The present invention relates to techniques for detecting the location of an object on a touch surface of a touch-sensing apparatus. The description starts out by presenting the use of frustrated total internal reflection (FTIR) for touch determination, in relation to a number of exemplifying arrangements for illuminating the interior of a light transmissive panel. Then, the influence of signal dispersion caused by scattering in the light transmissive panel is discussed. Finally, the use of signal dispersion for improving the touch determination process is discussed in relation to two principal arrangements for illuminating the light transmissive panel.
Throughout the description, the same reference numerals are used to identify corresponding elements.
Fig. 1A is a side view of an exemplifying arrangement in a touch-sensing apparatus. The arrangement includes a light transmissive panel 1, one or more light emitters 2 (one shown) and one or more light sensors 3 (one shown). The panel defines two opposite and generally parallel surfaces 4, 5 and may be planar or curved. A radiation propagation channel is provided between two boundary surfaces of the panel, wherein at least one of the boundary surfaces allows the propagating light to interact with a touching object 6. Typically, the light from the emitter(s) 2 propagates by total internal reflection (TIR) in the radiation propagation channel, and the sensors 3 are arranged at the periphery of the panel 1 to generate a respective output signal which is indicative of the energy of received light.
As shown in Fig. 1A, the light may be coupled into and out of the panel 1 directly via the edge portion that connects the top and bottom surfaces 4, 5 of the panel 1. Alternatively, a separate coupling element (e.g. in the shape of a wedge) may be attached to the edge portion or to the top or bottom surface 4, 5 of the panel 1 to couple the light into and/or out of the panel 1. When the object 6 is brought sufficiently close to the boundary surface, part of the light may be scattered by the object 6, part of the light may be absorbed by the object 6, and part of the light may continue to propagate unaffected. Thus, when the object 6 touches a boundary surface of the panel (e.g. the top surface 4), the total internal reflection is frustrated and the energy of the transmitted light is decreased.
The location of the touching object 6 may be determined by measuring the energy of the light transmitted through the panel 1 from a plurality of different directions. This may, e.g., be done by operating a number of spaced-apart emitters 2 to generate a corresponding number of sheets of light inside the panel 1, and by operating the sensors 3 to detect the transmitted energy of each sheet of light. As long as the touching object attenuates at least two sheets of light, the position of the object can be determined by triangulation. In the embodiment of Fig. 1A, a data processing device 7 is configured to process the output signal(s) from the sensor(s) 3 to determine the location of the touching object 6. As indicated in Fig. 1A, the light will not be blocked by the touching object 6.
Thus, if two objects happen to be placed after each other along a light path from an emitter 2 to a sensor 3, part of the light will interact with both objects. Provided that the light energy is sufficient, a remainder of the light will reach the sensor 3 and generate an output signal that allows both interactions (touch points) to be identified. Each such touch point has a transmission in the range 0-1, typically in the range 0.7-0.99. The total transmission T along a light path is the product of the individual transmissions tn of the touch points on that light path: T = ∏ tn. Thus, it may be possible for the data processing device 7 to determine the locations of multiple touching objects, even if they are located in line with a light path.
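As a simple illustration of the multiplicative transmission model above, the total transmission along one light path can be computed from the individual touch-point transmissions. This is a minimal sketch, not from the patent itself; the function name and example values are illustrative:

```python
from math import prod

def total_transmission(touch_transmissions):
    """Total transmission T along a light path: the product of the
    individual transmissions tn of the touch points on that path."""
    return prod(touch_transmissions)

# Two objects in line with the same light path, each transmitting 90%:
T = total_transmission([0.9, 0.9])  # 0.81, so a remainder still reaches the sensor
```

Since each individual transmission is typically 0.7-0.99, even several in-line touches leave a measurable amount of light at the sensor.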
Figs 1B and 1C illustrate exemplary light source arrangements for generating sheets of light inside the light transmissive panel 1, and light sensor arrangements for detecting the transmitted energy of each sheet.
In the embodiment of Fig. 1B, light from two spaced-apart emitters 2 is coupled into the panel 1 to propagate inside the panel 1 by total internal reflection. Each emitter 2 generates a beam B1, B2 of light that expands in the plane of the panel 1 while propagating away from the emitter 2. Such a beam is denoted fan beam, and this type of embodiment is generally referred to as a "fan beam embodiment" in the following. Each fan beam B1, B2 propagates from one or more entry or incoupling points within an incoupling site on the panel 1 to form a sheet of light, which in this example is distributed essentially throughout the entire panel 1. Arrays of light sensors 3 are located around the perimeter of the panel 1 to receive the light from the emitters 2 at a number of spaced-apart outcoupling points within an outcoupling site on the panel 1. The location of the object is determined by triangulation based on the attenuation of the light from each emitter 2 at the array of light sensors 3. This type of touch-sensing apparatus is, e.g., known from aforesaid US2004/0252091, which is incorporated herein by reference.
In the embodiment of Fig. 1C, two collimated beams B1, B2 are swept across the panel in two different directions, and the energy of each transmitted beam is measured during the sweep. This type of embodiment is generally referred to as a "scan beam embodiment" in the following. The sweeping of a collimated beam B1, B2 forms a sheet of light. Specifically, each beam B1, B2 is generated and swept along a set of entry or incoupling points within an incoupling site on the panel 1 by an input scanner arrangement 8. In the illustrated example, the entry points are located at the left and top edges of the panel 1. The transmitted energy at a number of outcoupling points within an outcoupling site on the panel 1 is measured by an output scanner arrangement 9 which is synchronized with the input scanner arrangement 8 to receive the beam B1, B2 as it is swept across the panel 1. In the illustrated example, the outcoupling points are located at the right and bottom edges of the panel 1 opposite to the entry points.
Each output scanner arrangement 9 typically includes a beam scanner and one light sensor (not shown). Similarly, each input scanner arrangement 8 typically includes a light emitter and a beam scanner (not shown). However, it is conceivable that two or more input scanner arrangements share one and the same light emitter, and/or that two or more output scanner arrangements share one and the same light sensor. It is also to be understood that more than two beams can be swept across the panel. Typically, the beams are translated across the panel, i.e. they have an essentially invariant angle (scan angle) in the plane of the panel. Although not shown in Fig. 1C, dedicated optical components are associated with the incoupling and outcoupling sites to re-direct the incoming light from the input scanner arrangement 8 into a desired direction ("scan angle") and to re-direct the transmitted light towards a common focal area/point on the output scanner arrangement 9, respectively. In the example of Fig. 1C, the beams B1, B2 are essentially parallel to a respective edge of the panel 1. However, it is conceivable that the scan angle of the beams B1, B2 changes as they are swept across the panel 1. The illustrated embodiment and alternative configurations thereof are further disclosed in Applicant's U.S. provisional applications No. 61/129,372 and No. 61/129,373, both filed on June 23, 2008 and incorporated herein by reference.
In an alternative (not shown) to the embodiment in Fig. 1C, each output scanner arrangement 9 and the redirecting optical components are replaced by a respective elongate sensor, which extends along the panel edge and is optically connected to the panel. Each such elongate sensor is controlled to measure the received energy as a function of time, while a beam is swept along an incoupling site on the panel. Thus, similarly to the embodiment in Fig. 1C, the transmitted energy is measured at a number of outcoupling points within an outcoupling site on the panel, wherein the outcoupling points correspond to different time points in the output signal of the elongate sensor. In a variant, the redirecting optical components are retained but each output scanner arrangement 9 is replaced by a stationary radiation detector, which is arranged in the aforesaid common focal area/point, spaced from the panel edge. In this variant, the stationary radiation detector is controlled to measure the received energy as a function of time, while a beam is swept along an incoupling site on the panel. Such alternatives and variants are further disclosed in Applicant's U.S. provisional application No. 61/202,874, filed on April 15, 2009 and incorporated herein by reference.
In a further alternative (not shown) to the embodiment in Fig. 1C, the output scanner arrangements 9 are omitted and replaced by retro-reflectors, such that the beams B1, B2 are reflected back to the respective input scanner arrangement 8. Thus, the input scanner arrangements 8 are configured as transceivers that both sweep and receive a beam, to measure the transmitted energy. Such alternatives are further disclosed in Applicant's PCT application WO 2009/048365, which is incorporated herein by reference. In all of the above embodiments, the output signals of the sensor(s) 3 may be aggregated into a spatial transmission signal for each sheet of light. The spatial transmission signal represents the received energy at different locations around the perimeter of the panel. The spatial transmission signal could optionally be normalized by a background signal to represent the true transmission of light at the different locations, as will be further exemplified below. Fig. 2A illustrates a spatial transmission signal obtained from the array of sensors 3 at the right-end panel edge of the embodiment in Fig. 1B. Fig. 2A also schematically illustrates a set of idealized light rays r1, ..., rN in the sheet of light that is generated by the bottom left-side emitter in Fig. 1B. It should be understood that the spatial resolution of the transmission signal S1 depends, i.a., on the density of sensors 3 in the array of sensors. The transmission signal S1 is illustrated to contain a signal profile P1 that originates from a touching object (not shown). Such a signal profile P1 is also denoted "touch signature" in the following.
Fig. 2B illustrates a spatial transmission signal S1 obtained from the right-end output scanner arrangement 9 in Fig. 1C, or any of the alternatives thereof as discussed above. Fig. 2B also schematically illustrates a set of instances r1, ..., rN of the center ray of a light beam while it is swept across the panel 1. The spatial transmission signal S1 could, e.g., be given as a function of scanning angle or time, which is equivalent to location along the right-end edge of the panel 1. It should be understood that the spatial resolution of the transmission signal S1 depends, i.a., on the sampling rate of the output scanner arrangement 9. The transmission signal S1 is also illustrated to contain a touch signature P1 of a touching object (not shown).
To further illustrate the process for determining the location of a touching object, Fig. 2C illustrates two spatial transmission signals S1, S2 obtained from the output scanner arrangements 9 in Fig. 1C. In this so-called decoding process, all touch signatures P1, P2 are identified in the transmission signals S1, S2. For each touch signature P1, P2, an attenuation path is determined, typically by tracing the center of the touch signature P1, P2 back to the corresponding entry point. In Fig. 2C, the back-traced center rays are illustrated by dashed lines. The location of the touching object 6 is given by the intersection of the center rays.
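The back-tracing step above amounts to intersecting straight lines in the plane of the panel. A minimal sketch of that geometry follows; the function name and the example coordinates are illustrative, not taken from the patent:

```python
def intersect_rays(p1, d1, p2, d2):
    """Intersection of two lines p1 + s*d1 and p2 + t*d2 in the panel plane.
    Returns None if the rays are (nearly) parallel."""
    det = d1[0] * d2[1] - d1[1] * d2[0]  # 2D cross product of the direction vectors
    if abs(det) < 1e-12:
        return None
    s = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / det
    return (p1[0] + s * d1[0], p1[1] + s * d1[1])

# A horizontal center ray entering the left edge at y=3 and a vertical
# center ray entering the bottom edge at x=5 intersect at the touch location:
loc = intersect_rays((0.0, 3.0), (1.0, 0.0), (5.0, 0.0), (0.0, 1.0))  # (5.0, 3.0)
```

With more than two sheets of light, the same routine is applied pairwise, and mutually consistent intersections are kept as candidate touch points.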
The present Applicant has now realized that the decoding process may be improved by deliberately causing the propagating light to be scattered at one or both of the boundary surfaces 4, 5 of the light transmissive panel 1 (Fig. 1), provided that the decoding process is appropriately designed to take the effects of scattering into account. The scattering may be caused by a diffusing surface structure, also called an antiglare (AG) structure, on the touch surface 4. An AG structure may be used to reduce glare from external lighting on the touch surface. Further, when the touching object is a naked finger, the contact between the finger and the touch surface normally leaves a fingerprint on the touch surface. On a perfectly flat surface, such fingerprints are clearly visible and usually unwanted. By adding an AG structure to the touch surface, the visibility of fingerprints is reduced. Furthermore, the friction between the finger and the touch surface decreases when an AG structure is used, which may thus improve the user experience. AG structures are specified in gloss units (GU), where lower GU values result in less glare.
When a beam of light propagates by internal reflection in a light transmissive panel that has an AG structure on one or both of its boundary surfaces, each internal reflection against such a scattering boundary surface will cause some light to be diverted away from the main direction of the beam and may also cause radiation to escape through the boundary surface. Thus, the provision of an AG structure generally causes the beam to be broadened in the plane of the panel as the beam propagates from its entry point(s) on the panel. This broadening causes the shape of the touch signature in the spatial transmission signal to depend on the location of the touching object on the panel, specifically the distance between the touching object and the relevant incoupling/entry point. Fig. 3A illustrates an exemplifying dependence between the width of the touch signature caused by a touching object and the distance between the touching object and the entry point. The factual width of the touching object is Wn. When the touching object is located close to the entry point, the detected touch signature will be distinct and have a width similar to the factual width. As the touching object is moved away from the entry point, the detected touch signature will gradually broaden. Close to the outcoupling point, the width of the touch signature may again become slightly smaller. It is to be understood that the actual functional dependence between width and touch location is greatly dependent on the actual optical design of the touch-sensing apparatus, and that Fig. 3A is merely given as an example.
In Fig. 3A, it can be seen that a small touching object located centrally between the entry and outcoupling points will yield the same touch signature width as a larger touching object located closer to the entry point. Based on the data in Fig. 3A, it is possible to determine the factual width of a touching object that yields a certain touch signature width, as a function of the distance between the touching object and the entry point. This type of functional dependence is denoted dispersion function in the following. Fig. 3B is a graph of a dispersion function determined for the data in Fig. 3A. Thus, Fig. 3B illustrates the factual object width at different locations that will generate the same touch signature width in the spatial transmission signal. As will be further explained in the following, such a dispersion function can be used to improve the precision and/or consistency in determining the location and/or size of one or more touching objects.
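By way of illustration, a dispersion function of the kind shown in Fig. 3B can be tabulated and applied to a detected touch signature to estimate the factual object width at a candidate distance. The table values and the simple subtractive broadening model below are assumptions made for the sake of the sketch; a real dispersion function would be calculated theoretically or measured for the specific optical design:

```python
import numpy as np

# Hypothetical calibration data: signature broadening (mm) versus distance
# from the entry point (mm), with the slight decrease near the outcoupling
# side suggested by Fig. 3A.
DISTANCES = np.array([0.0, 100.0, 200.0, 300.0, 400.0])
BROADENING = np.array([0.0, 4.0, 9.0, 12.0, 11.0])

def factual_width(signature_width, distance):
    """Estimate the factual object width that would yield the detected
    signature width if the object sits at the given distance from the
    entry point (assumed model: width adds linearly with broadening)."""
    b = np.interp(distance, DISTANCES, BROADENING)
    return max(signature_width - b, 0.0)

# The same 10 mm signature implies a smaller object the farther away it is:
w_near = factual_width(10.0, 0.0)    # 10.0 mm
w_far = factual_width(10.0, 200.0)   # 1.0 mm
```

The monotone narrowing of the inferred object width with distance is exactly the behaviour of the dispersion lines drawn in Fig. 5C.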
The origin of the dispersion function will now be further explained in relation to the scan beam embodiment of Fig. 1C. To understand the behaviour of a specific touch-sensing apparatus, it is necessary to analyse its optical design. The shape of the diverging set of rays from the entry point depends on many different factors, e.g. panel thickness, internal angle of incidence onto the boundary surfaces, AG structure, etc. The resulting touch signature depends, apart from the diverging set of rays, on a number of other factors, e.g. detector surface area, detector numerical aperture, cross-section of injected light, etc. When beams are swept parallel to the edges of the panel, detector-specific parameters typically have more impact on the touch signature for touch locations close to the outcoupling point. Conversely, emitter-specific properties mainly affect the touch signature for touch locations close to the entry point.
As explained above, a beam of light that is transmitted through the panel will be broadened each time it interacts with the AG structure. Fig. 4A is a plan view of the panel 1 in which a beam B1 is injected at an entry side and propagates to an outcoupling side. At the outcoupling side, the energy of the beam B1 is sensed within a confined area (indicated by Υ and denoted "receiving area" in the following). The length of the receiving area Υ is dependent on the numerical aperture of the light sensor arrangement (e.g. the output scanner arrangement 9 in Fig. 1C), i.e. the range of angles over which the light sensor arrangement can accept light.
As shown in Fig. 4A, the beam B1 diverges as it propagates through the panel. Since the receiving area Υ has a finite length, it will only receive the central parts of the diverging beam B1 that reaches the outcoupling side. Fig. 4B indicates the outer rays that reach the receiving area Υ.
Fig. 4C illustrates the situation when an object 6 touches the panel 1 close to the entry side, in this example the left side. For simplicity, we consider a touching object 6 that moves with respect to the beam B1, but the conclusions will be equally applicable for a stationary touching object and a moving beam (as in the scan beam embodiment). Four different locations of the touching object 6 are shown in the left-hand part of Fig. 4C. Clearly, the touching object 6 interacts with the beam B1 over a short distance. Fig. 4C also indicates that the touching object 6 interacts with a large part of the beam B1. Thus, the resulting touch signature will be narrow (small width) and strong (low transmission). Fig. 4D illustrates the situation when the object 6 touches the panel 1 further away from the entry side. Clearly, the touching object 6 interacts with the beam B1 over a longer distance. It is also seen that the touching object 6 interacts with a smaller portion of the beam B1. Therefore, the resulting touch signature will be wider and weaker. In the example of Fig. 4, the width of the touch signature will decrease slightly for locations to the right of the touching object 6 in Fig. 4D. Such signature behaviour is also illustrated in the graph of Fig. 3A. It should be noted that such a decrease in signature width is only observed when the length of the receiving area Υ is smaller than the width of the dispersed beam at the outcoupling side (e.g. as shown in Fig. 4A). For example, in the above-mentioned variant where a single elongate sensor is arranged at the outcoupling side instead of the output scanner arrangement, a decrease in touch signature width is unlikely to be observed.
Above, it was shown that the width and height of a touch signature changes with the location of the touching object due to the effects of scattering. Below, it will now be explained how the resulting dispersion function can be used to improve the decoding process. For reasons of explanation, the dispersive effects are slightly exaggerated in the figures accompanying the following disclosure.
DECODING IN SCAN BEAM EMBODIMENTS
Figs 5A-5D illustrate a scan beam embodiment, in which three collimated non-parallel beams are swept (translated) across the panel, resulting in three transmission signals.
Fig. 5A illustrates the three beams B1-B3 and the resulting spatial transmission signals S1-S3. A first beam B1, which is parallel to the top and bottom edges of the panel 1, is injected at the left side and detected at the right side of the panel 1, while being swept from the bottom to the top (or vice versa). The resulting transmission signal S1 is shown to the right side of the panel 1. A second beam B2, with a scan angle which is non-parallel to the edges of the panel 1, is injected at the top and is detected at the bottom, while being swept from left to right (or vice versa). The resulting transmission signal S2 is shown at the bottom. A third beam B3, which is parallel to the left and right edges of the panel 1, is injected at the bottom and detected at the top, while being swept from left to right (or vice versa). The resulting transmission signal S3 is shown at the top. Each transmission signal S1-S3 contains a respective touch signature P1-P3, resulting from the touching object 6.
Fig. 5B illustrates the attenuated paths determined based on the touch signatures P1-P3, without considering the signal dispersion caused by scattering. Here, the attenuated paths have been reconstructed by tracing the limits of the touch signatures P1-P3 back to the corresponding entry points, as illustrated by the straight parallel lines extending from the limits of each peak P1-P3 along the associated beam path. Clearly, there is an inconsistency in the estimated size of the touching object 6 at the intersection of the attenuated paths. Fig. 5C illustrates the reconstruction of the attenuation path for the first beam B1 in Fig. 5A, using a dispersion function determined for this scan beam embodiment. The dispersion function may be calculated theoretically or may be derived from measured data. Fig. 5C includes two dispersion lines showing the factual width of a touching object 6 yielding the detected touch signature width as a function of the distance from the entry point. It is seen that if the touching object 6 is located close to the entry point, the factual width is essentially equal to the width of the touch signature. If the touching object 6 is located farther away from the entry point, its factual width has to be smaller in order to generate the detected touch signature P1. Fig. 5D illustrates the reconstructed attenuation paths for the touch signatures P1-P3 in the transmission signals S1-S3 generated by the beams B1-B3, by applying the dispersion function to the width of each touch signature P1-P3. Clearly, the resulting factual widths at the intersection of the attenuated paths are consistent. Thus, by applying the dispersion function, it is possible to verify the determined position by checking the consistency of the factual widths at the intersection.
As will be shown in the following, further advantages may be obtained when spatial transmission signals are processed to determine the locations of two or more touching objects on the panel. These advantages will be explained in relation to a scan beam embodiment shown in Figs 6A-6B. In this embodiment, two collimated beams B1, B2 are swept (translated) across the panel, resulting in two spatial transmission signals S1, S2. A first transmission signal S1 is generated by sensing the transmitted energy of a beam B1 which is parallel to the top and bottom edges of the panel 1 and which is injected at the left side and outcoupled at the right side of the panel 1. A second transmission signal S2 is generated by sensing the transmitted energy of a beam B2 which is parallel to the left and right edges of the panel 1 and which is injected at the bottom side and outcoupled at the top side of the panel 1.
In Fig. 6A, each transmission signal S1, S2 contains two touch signatures P1a, P1b, P2a, P2b, each resulting from one of the touching objects 6. Fig. 6A also illustrates the attenuation paths (corrected attenuation paths) that have been reconstructed based on the touch signatures P1a, P1b, P2a, P2b while applying the dispersion function for this embodiment. Fig. 6A also illustrates the attenuation paths (uncorrected attenuation paths) that are obtained without applying the dispersion function. The attenuation paths form four polygonal intersections, with each intersection being a candidate location c1-c4. Looking at the corrected attenuation paths, it can be seen that two of the intersections are almost square whereas the other two intersections are thin and elongate. If the touching objects 6 are known to be approximately regular in shape, it can be concluded that the touching objects are located at the square intersections c1, c4. Thus, based on the shape/area of the intersections, true locations can be distinguished from ghost locations among the candidate locations in a multi-touch scenario.
Fig. 6B illustrates the spatial transmission signals S1, S2 that are generated when the two touching objects 6 are located at the ghost locations in Fig. 6A. Looking at the corrected attenuation paths, it can again be seen that the intersections that correspond to the touching objects 6 are almost square and have similar areas. The intersections c1, c4 at the ghost points are also square, but one intersection has a very small area, and the other intersection has a significantly larger area. Thus, by assessing the areas of the intersections c1-c4, it is possible to determine the two most probable touch locations. It should be realized that it would be much more difficult, if not impossible, to distinguish between true locations and ghost locations based on the uncorrected attenuation paths, since all intersections would have approximately the same shape and area.
Fig. 7 is a flow chart for an exemplifying decoding process that may be used to identify touch locations in any one of the above-described scan beam arrangements. In step 701, the process obtains the output signals from the light sensors, typically by sampling data values from the output signal at given time intervals.
Then, in step 702, the output signals are processed to form a sample vector for each sheet of light, each sample vector including a set of data values associated with different time points. Depending on implementation, this processing may involve filtering the output signals for suppression of noise and/or ambient light, combining output signals from different sensors, interpolating the output signals, etc. The sample vector is then used as a spatial transmission signal, optionally after dividing the sample vector with background data. The background data may be a corresponding sample vector that represents the received energy without any object touching the touch surface. The background data may be pre-set or obtained during a separate calibration step.
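The normalization in step 702 can be sketched as an element-wise division of the sample vector by touch-free background data. The function name and the epsilon guard are illustrative choices, not prescribed by the patent:

```python
import numpy as np

def to_transmission_signal(sample_vector, background, eps=1e-9):
    """Divide a sample vector by background data (recorded with no object
    touching the surface) to obtain a spatial transmission signal in [0, 1]."""
    s = np.asarray(sample_vector, dtype=float)
    bg = np.asarray(background, dtype=float)
    return s / np.maximum(bg, eps)  # guard against division by zero

# A touch signature appears as a dip below 1.0 in the normalized signal:
signal = to_transmission_signal([2.0, 1.4, 2.0], [2.0, 2.0, 2.0])  # [1.0, 0.7, 1.0]
```

As the text notes, the background data may be pre-set or obtained in a separate calibration step with no object on the touch surface.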
In step 703, each spatial transmission signal is processed to identify one or more peaks that may originate from touching objects. The identified peaks correspond to the above-discussed touch signatures.
In step 704, the center point of each peak is identified. This step may or may not involve interpolating the data values in the transmission signal. Using the center point, and knowing the scan angle of the beam at each data value in the spatial transmission signal, the process determines a center ray (cf. Fig. 2C) for each center point. Further, the width of each peak in the spatial transmission signals is determined.
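In their simplest form, the peak identification and center/width determination of steps 703-704 might look like the sketch below (illustrative only; a real implementation would interpolate to sub-sample precision, handle multiple peaks, and map sample indices to scan angles):

```python
def find_peak(signal, threshold=0.95):
    """Find one touch signature (cf. steps 703-704): a run of samples whose
    transmission drops below a threshold. Returns (center, width) in
    sample-index units, or None if no signature is present."""
    dips = [i for i, v in enumerate(signal) if v < threshold]
    if not dips:
        return None
    center = sum(dips) / len(dips)      # centroid of the dip
    width = dips[-1] - dips[0] + 1      # extent of the dip
    return center, width
```

The returned center index would then be converted, via the known scan angle per sample, into a center ray as described above.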
In step 705, the intersections between the center rays are determined by triangulation. These intersections form candidate touch points.
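The triangulation of step 705 amounts to intersecting pairs of center rays. One way to sketch it, with each ray given as an origin point plus a direction vector (an illustrative formulation, not the only possible one):

```python
def intersect(p1, d1, p2, d2):
    """Intersect two center rays, each given as an origin point and a
    direction vector (cf. step 705). Solves p1 + t*d1 = p2 + u*d2 for t
    using Cramer's rule; returns None for (near-)parallel rays."""
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:
        return None
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

Applying this to every pair of center rays from different sheets yields the candidate touch points.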
In step 706, the factual width at each intersection is calculated for each peak in the transmission signal, using a dispersion function and the peak width. For example, the peak width and location data for an intersection may be input to a function of the type shown in Fig. 3B, to output the factual width at the intersection. Thus, step 706 results in width data for each candidate touch point.
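Step 706 might be sketched as follows; the linear broadening model below is only a placeholder assumption, whereas a real apparatus would use its measured or calculated dispersion function of the type shown in Fig. 3B:

```python
def factual_width(peak_width, distance, slope=0.02):
    """Map a detected peak width back to an object width at a given distance
    from the entry point (cf. step 706). The linear broadening model (a fixed
    slope per unit distance) is illustrative only; in practice the measured
    dispersion function of the specific device would be applied."""
    return max(peak_width - slope * distance, 0.0)
```

Evaluating this function at each candidate touch point along a center ray yields the width data used in the subsequent validation.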
In step 707, the process determines the most probable set of true touch points among the candidate touch points. As indicated in the foregoing, the true touch points may be identified by calculating an area value for each candidate touch point and matching the area values to an area measure, or by calculating a shape value for each candidate touch point and matching the shape values to a shape measure, or a combination thereof. In step 708, the true touch points are output by the process.
To further exemplify the validation step 707, we consider the situation in Fig. 6A. After applying the above steps 701-706, the process has determined four candidate touch points: c1 = (x1, y1), c2 = (x1, y2), c3 = (x2, y1), and c4 = (x2, y2), together with width data (wx, wy) for each candidate point. A first validation sub-step may be configured to test for ghost points that have an elongated shape. For each candidate point, the ratio r = min(wx, wy)/max(wx, wy) is calculated using the width data. If the ratio r is significantly smaller for the top-left and bottom-right candidate touch points c2, c3, the process concludes that the top-right and bottom-left candidate touch points c4, c1 are the true touch points. A second validation sub-step may be configured to compute the area of the candidate points, e.g. as wx*wy. If the bottom-left candidate touch point c1 is significantly larger than the other candidate touch points, while at the same time the top-right candidate point c4 is smaller than the other candidate touch points, the process concludes that the bottom-left and top-right candidate touch points c1, c4 are ghost points (cf. Fig. 6B). In a simplified validation example, the process could be configured to validate only the top-left candidate point c2 or the bottom-right candidate point c3 according to the first validation sub-step. The skilled person understands that there are numerous alternative implementations of the validation step 707, depending e.g. on the number of touches to be resolved, the dispersion function, the shape and area of the objects, the shape and area variations among the objects, etc. The above example demonstrates that it is generally possible to improve the decoding process by applying a dispersion function in the reconstruction of attenuation paths based on spatial transmission signals generated by sweeping a number of collimated non-parallel beams inside a light transmissive panel.
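The first validation sub-step above can be sketched as a ranking of candidates by shape regularity (the data layout is hypothetical, and a real implementation would also compare areas against an expected object size as in the second sub-step):

```python
def rank_candidates(candidates):
    """Order candidate touch points by shape regularity (cf. step 707).

    Each candidate is ((x, y), (wx, wy)). The ratio r = min(wx, wy) /
    max(wx, wy) is close to 1 for a roughly regular object and small for
    the thin, elongate intersections at ghost points."""
    def shape_ratio(candidate):
        (_, (wx, wy)) = candidate
        return min(wx, wy) / max(wx, wy)
    return sorted(candidates, key=shape_ratio, reverse=True)

# Four candidates as in Fig. 6A: two square, two elongate (units arbitrary).
cands = [((1.0, 1.0), (5.0, 5.0)), ((1.0, 2.0), (5.0, 1.0)),
         ((2.0, 1.0), (1.0, 5.0)), ((2.0, 2.0), (5.0, 5.0))]
true_points = [pos for pos, _ in rank_candidates(cands)[:2]]
```

With this data, the two square candidates are ranked first, matching the conclusion drawn for Fig. 6A.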
In the example of Fig. 7, a center ray is first reconstructed for each touch signature by geometrically retracing a center point of the touch signature to a corresponding entry point (step 704). Then, a set of candidate touch points is determined by triangulating the reconstructed center rays (step 705), whereupon the dispersion function is applied to determine factual widths at each candidate touch point (step 706). Thus, the corrected attenuation path is only determined at the candidate touch points.
In a variant (not shown), corrected attenuation paths are determined before the triangulation, i.e. the dispersion function is first applied to reconstruct the full attenuation path from the outcoupling side to the entry side. Then, the full attenuation paths are intersected in a triangulation step, which thus results in both the locations and the factual widths of the candidate touch points.
The skilled person realizes that there are many variants and alternatives to the above-described decoding process. For example, the spatial transmission signals may be generated to represent only part of the sample vector. In particular, steps 702 and 703 may be combined such that touch signatures are first identified in the sample vectors, whereupon spatial transmission signals are generated only for one or more sample points within the touch signatures in the sample vectors.
Still further, the corrected attenuation paths may be calculated at a reduced accuracy or resolution, e.g. by approximating the outer limits of the corrected attenuation paths by a straight line.
Also, the decoding process may be based on any available image reconstruction algorithm, and especially few-view algorithms that are used in, e.g., the field of tomography. Any such algorithm can be modified to account for dispersion, as long as the dispersion function is known.
DECODING IN FAN BEAM EMBODIMENTS
In the following, the use of the dispersion function in the decoding process will be discussed in relation to fan beam embodiments. One difference compared to the scan beam embodiments is that the fan beam generally originates from a smaller incoupling site and expands in the plane of the panel even in the absence of scattering, i.e. without an AG structure. Thus, in a fan beam embodiment, the touch signatures contain information about the distance between the entry point and the touching object, even if the panel is not provided with an AG structure. However, the presence of scattering causes the fan beam to broaden in the plane of the panel while propagating from the incoupling site towards the outcoupling site. Thus, a dispersion function can be measured or calculated for a specific fan beam embodiment. By providing scattering and by taking the resulting dispersion function into account, it is possible to eliminate inconsistencies that may occur when evaluating the intersections of reconstructed attenuation paths in the decoding process. The basic principle for applying the dispersion function is the same as in the scan beam embodiments. This will be further explained with reference to Figs 8A-8D. In the example of Figs 8A-8D, the panel is internally illuminated by two fan beams. Although not shown on the drawings, several sensors are arranged at two sides of the panel to measure the transmitted energy of the sheets of light formed by the fan beams.
Fig. 8A illustrates a touch-sensing apparatus with no AG structure on the boundary surfaces. The thin lines indicate the boundaries of the light cones from two different emitters (not shown) that interact with two touching objects 6. It should be noted that although the light cones are drawn as if they originate from a singularity (i.e. a single incoupling point), in practice they generally originate from an extended incoupling site that may include more than one incoupling point for the decoding process. Fig. 8A also illustrates the intersections of attenuation paths that are reconstructed from the resulting touch signatures P1a, P1b, P2a, P2b in the transmission signals S1, S2. The intersections form four candidate locations c1-c4, of which at least three are similar in size. Thus, it may be difficult to identify the true locations based on the touch signatures.
Fig. 8B illustrates the spatial transmission signals S1, S2 obtained in the touch-sensing apparatus of Fig. 8A if at least one of its boundary surfaces is provided with an AG structure. Compared to Fig. 8A, Fig. 8B shows that the touch signatures P1a, P1b, P2a, P2b have become wider and slightly weaker as a result of the scattering. Fig. 8B indicates the light cones (in thin lines) that interact with the objects 6, as well as the corrected attenuation paths (in thicker lines), which are obtained by applying the appropriate dispersion function to the touch signatures P1a, P1b, P2a, P2b in the transmission signals S1, S2. Compared to Fig. 8A, the intersections at the true locations c1, c3 (i.e. the locations of the touching objects 6) remain essentially unchanged, whereas the areas of the intersections at the ghost locations c2, c4 change considerably (decreasing and increasing in area, respectively). Both of these changes in intersection area make it easier to distinguish true locations from ghost locations among the candidate locations in a multi-touch scenario.
Fig. 8C further illustrates the areas of the intersections between uncorrected attenuation paths (i.e. the aforesaid light cones) in the touch-sensing apparatus of Fig. 8B. Clearly, the size of the touches will be incorrectly estimated if the dispersion function is not taken into account.
Fig. 8D illustrates the spatial transmission signals S1, S2 that are generated when the two touching objects 6 are located at the ghost locations c2, c4 in Fig. 8C. Looking at the corrected attenuation paths (in thicker lines), it can be seen that the intersections that correspond to the touching objects 6 have essentially the same shape and area, whereas the intersections at the ghost locations c1, c3 are thin and elongate. Clearly, based on the shape/area of the intersections, true locations can be distinguished from ghost locations among the candidate locations in a multi-touch scenario. The decoding process could be implemented according to any of the decoding embodiments discussed above in relation to the scan beam embodiments, e.g. according to the process shown in Fig. 7. In such a process, a center ray is first reconstructed for each touch signature by geometrically retracing a center point of the touch signature to a corresponding entry point. Then, a set of candidate touch points is determined by triangulating the reconstructed center rays, whereupon the factual width at each candidate touch point is determined by applying a geometric function to reconstruct the uncorrected fan beam width at the candidate touch point (resulting in the width between the outer lines at each intersection c1-c4 in Fig. 8B) and by applying the dispersion function to modify the thus-determined fan beam width (resulting in the width between the inner lines at each intersection c1-c4 in Fig. 8B), or vice versa. Alternatively, the geometric function and the dispersion function could be combined into a single correction function which is applied to determine the factual widths.
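A combined geometric-plus-dispersion correction of the kind described above might be sketched as follows. Both models are illustrative assumptions only: the fan width is retraced linearly back to the candidate point, and the dispersion term is taken to be linear in the propagation distance.

```python
def fan_factual_width(peak_width, distance, total_path, slope=0.02):
    """Combine the geometric function and the dispersion function into a
    single correction (illustrative models). The peak width measured at the
    outcoupling side is first retraced geometrically to the candidate point
    at `distance` from the entry point (the fan is assumed to expand
    linearly over `total_path`), and the result is then reduced by an
    assumed linear dispersion term."""
    geometric = peak_width * (distance / total_path)  # uncorrected fan width
    return max(geometric - slope * distance, 0.0)     # dispersion-corrected
```

In practice the linear placeholder terms would be replaced by the geometric function of the actual fan geometry and the measured dispersion function of the panel.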
In a variant (not shown), corrected attenuation paths are determined before the triangulation, i.e. the geometric function and the dispersion function are first applied to reconstruct the full attenuation path from the outcoupling side to the entry point. Then, the full attenuation paths are intersected in a triangulation step, which thus results in both the locations and the factual widths of the candidate touch points.
OBTAINING THE DISPERSION FUNCTION
As mentioned above, the dispersion function can be obtained either by theoretical calculations for a specific touch-sensing apparatus or by measurements. Fig. 9 is a graph of measurement data obtained from a scan beam embodiment of the type shown in Fig. 6, wherein the measurement data has been obtained for a rectangular light transmissive panel with a 37 inch diameter. The graph shows the measured half-width of the touch signature as a function of the distance between the entry point (e.g. located on the left side of the panel in Fig. 6A) and the touching object. Thus, this graph corresponds to the graph in Fig. 3A. The touch signature width is clearly dependent on the distance from the entry point (and also on the distance to the outcoupling point). In this particular example, there is no decrease in touch signature width when the touching object is located close to the outcoupling point. The dispersion function may be given by the actual measurement data, suitably after recalculation into a function as shown in Fig. 3B, or the dispersion function may be derived from a suitable function that is fitted to the measurement data. Such a fitted function may be linear, polygonal, spline, etc. It should also be noted that a corresponding dispersion function can be measured without any difficulty in a fan beam embodiment.

GENERAL
The decoding process is typically executed by a data processing device (cf. 7 in Fig. 1A) connected to receive the output signal(s) of the sensor(s). An example of such a data processing device 7 is shown in Fig. 10. In the illustrated example, the device includes an element (or means) 10 for identifying, in the output signal(s), a set of touch signatures originating from touching objects. The device further includes an element (or means) 11 for determining at least part of an attenuated light path across the panel based on each touch signature, by applying the dispersion function as described in the foregoing. There is also provided an element (or means) 12 for identifying the positions of the touching objects on the touch surface based on the attenuated light paths. The device 7 may be implemented by special-purpose software (or firmware) run on one or more general-purpose or special-purpose computing devices. In this context, it is to be understood that each "element" or "means" of such a computing device refers to a conceptual equivalent of a method step; there is not always a one-to-one correspondence between elements/means and particular pieces of hardware or software routines. One piece of hardware sometimes comprises different means/elements. For example, a processing unit serves as one element/means when executing one instruction, but serves as another element/means when executing another instruction. In addition, one element/means may be implemented by one instruction in some cases, but by a plurality of instructions in some other cases. Such a software-controlled computing device may include one or more processing units, e.g. a CPU ("Central Processing Unit"), a DSP ("Digital Signal Processor"), an ASIC ("Application-Specific Integrated Circuit"), discrete analog and/or digital components, or some other programmable logical device, such as an FPGA ("Field Programmable Gate Array").
The computing device may further include a system memory and a system bus that couples various system components including the system memory to the processing unit. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory may include computer storage media in the form of volatile and/or non-volatile memory such as read only memory (ROM), random access memory (RAM) and flash memory. The special-purpose software may be stored in the system memory, or on other removable/non-removable volatile/non-volatile computer storage media which is included in or accessible to the computing device, such as magnetic media, optical media, flash memory cards, digital tape, solid state RAM, solid state ROM, etc. The computing device may include one or more communication interfaces, such as a serial interface, a parallel interface, a USB interface, a wireless interface, a network adapter, etc, as well as one or more data acquisition devices, such as an A/D converter. One or more I/O devices may be connected to the computing device, via a communication interface, including e.g. a keyboard, a mouse, a touch screen, a display, a printer, a disk drive, etc. The special-purpose software may be provided to the computing device on any suitable computer-readable medium, including a record medium, a read-only memory, or an electrical carrier signal.
In all of the above embodiments, the emitters 2 can operate in any suitable wavelength range, e.g. in the infrared or visible wavelength region. All sheets of light could be generated with identical wavelength. Alternatively, different sheets could be generated with light in different wavelength ranges, permitting differentiation between the sheets based on wavelength. Furthermore, the emitters 2 can output either continuous or pulsed radiation. Still further, the emitters 2 may be activated concurrently or sequentially. Any type of emitter capable of emitting light in a desired wavelength range could be used, for example a diode laser, a VCSEL (vertical-cavity surface-emitting laser), an LED (light-emitting diode), an incandescent lamp, a halogen lamp, etc. The transmitted energy may be measured by any type of light sensor 3 capable of converting light into an electrical signal. It should be noted that in the context of this specification, a "light sensor" implies a 0-dimensional light detector. Thus, the light sensor 3 may be a single light sensing element such as a photo-detector or a pixel on a CCD or CMOS detector. Alternatively, the light sensor 3 may be formed by a group of light sensing elements that are combined for 0-dimensional light detection, by summing/averaging the output of the individual elements in hardware or software. The light transmissive panel 1 may be made of any material that transmits a sufficient amount of light in the relevant wavelength range to permit a sensible measurement of transmitted energy. Such material includes glass, poly(methyl methacrylate) (PMMA) and polycarbonates (PC). The panel 1 may be made of a single material or may be formed by layers of different materials. The internal reflections in the touch surface are caused by total internal reflection (TIR), resulting from a difference in refractive index between the material of the panel and the surrounding medium, typically air.
The reflections in the opposite boundary surface may be caused either by TIR or by a reflective coating applied to the opposite boundary surface.
The AG structure may be provided as a dedicated layer attached to the touch surface. Alternatively, the AG structure may be integrated into the touch surface, e.g. by etching, engraving, stamping or moulding. Alternatively or additionally, the AG structure may be provided on the surface opposite to the touch surface. In certain embodiments, it may be advantageous for at least the touch surface to be provided with an AG structure, to prevent deposition of contaminants (e.g. fingerprints) and formation of scratches from causing the scattering, and thus the dispersion function, to change over time. With an AG structure on the touch surface, the touch-sensing apparatus will have inherently large signal dispersion, reducing the influence of additional sources of scattering, such as contaminants and scratches. It is also to be noted that in certain embodiments, both boundary surfaces of the panel may be used as touch surfaces. Furthermore, the desired scattering may be caused by other means than a dedicated AG structure, e.g. bulk scattering in the panel. In one example, a GU value of 10-200, e.g. 100-120, of one of the boundary surfaces of the panel would result in a sufficient amount of scattering for the purpose of the present invention.
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope and spirit of the invention, which is defined and limited only by the appended patent claims. For example, there are numerous other arrangements in which the process of decoding the output signals of light sensors can be improved by applying a dispersion function. One such alternative arrangement is shown in Fig. 11A, which is a variant of the fan beam embodiments discussed in the foregoing. Here, several light emitters 2 are arranged around the perimeter of the panel 1 to emit a respective fan beam (as indicated by diverging lines). The fan beam propagates inside the panel by TIR, at least in the touch surface 4. A large number of light sensors 3 are arranged around the perimeter to measure the received energy from the different sheets of light generated by the emitters 2. It has been found that the benefit from applying the dispersion function when reconstructing the attenuation paths increases with the number of fan beams, since the inconsistencies in area/shape for the ghost points will become larger and larger whilst the true touch points will have better and better correspondence in area/shape.
Another exemplifying arrangement is shown in Fig. 11B, which is a variant of the scan beam embodiments discussed in the foregoing. Here, two groups of light emitters 2 are arranged along two edges of the panel 1. Each emitter 2 is configured to emit a collimated beam that is coupled into the panel to propagate by TIR, at least in the touch surface 4. Within each group, the emitters 2 are activated sequentially, in any order. Thus, the emitters 2 may be activated to simulate a beam sweep, as in the scan beam embodiments. A respective elongate light sensor 3 is arranged at the opposite edges to measure the received energy from the different sheets of light generated by the emitters 2. It should be understood that the decoding process described for the above scan beam embodiments is equally applicable to this arrangement.
Further variants include hybrids of the fan beam and scan beam embodiments. One such hybrid (not shown) may be formed based on one of the above-described fan beam embodiments, by replacing one or more stationary emitters generating a fan beam with one or more stationary emitters generating a collimated beam. Another hybrid (not shown) may be formed based on one of the above-described scan beam embodiments, by sweeping fan beams or converging beams instead of collimated beams. The skilled person will readily realize how to implement the above teachings in the decoding process for the hybrid embodiments.

Claims

1. A method in a touch-sensing apparatus, said apparatus comprising a light transmissive panel (1) that defines a touch surface (4) and an opposite surface (5), a light source arrangement (2; 8) for providing sheets of light inside the panel (1), wherein each sheet comprises light that propagates by internal reflection between the touch surface (4) and the opposite surface (5) from one or more incoupling points to a set of outcoupling points, said apparatus further comprising a light sensor arrangement (3; 9) for generating one or more output signals (S1, S2, S3) that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object (6) touching the touch surface (4) locally attenuates at least two sheets of light, said method comprising: identifying, in said output signal(s) (S1, S2, S3), a set of signal profiles (P1, P2, P3; P1a, P1b, P2a, P2b) originating from said object (6), determining at least part of an attenuated light path across the panel (1) based on each signal profile (P1, P2, P3; P1a, P1b, P2a, P2b), and identifying the position of the object (6) on the touch surface (4) based on the thus-determined attenuated light paths, wherein said determining comprises applying a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface (4) and the opposite surface (5).
2. The method of claim 1, wherein the width function represents the factual width of the object given the detected signal profile, as a function of distance to the incoupling point.
3. The method of claim 1 or 2, wherein at least one of the sheets is generated by sweeping an essentially collimated beam (B1, B2, B3) across a set of incoupling points.
4. The method of any preceding claim, wherein said determining comprises, for each signal profile (P1, P2, P3; P1a, P1b, P2a, P2b): reconstructing a center ray of the attenuated light path by geometrically retracing a center point of the signal profile (P1, P2, P3; P1a, P1b, P2a, P2b) to one of said incoupling points, determining a signal width of the signal profile (P1, P2, P3; P1a, P1b, P2a, P2b), and determining an object width at one or more candidate positions (c1-c4) along the center ray by applying said width function, thereby determining part of said attenuated light path.
5. The method of claim 4, wherein said one or more candidate positions (c1-c4) are determined by triangulation using a set of center rays that are reconstructed from said set of signal profiles (P1, P2, P3; P1a, P1b, P2a, P2b).
6. The method of any preceding claim, wherein at least one of the sheets is generated in the form of a diverging beam (B1, B2) originating at the one or more incoupling points.
7. The method of claim 6, wherein said determining comprises, for each signal profile (P1, P2, P3; P1a, P1b, P2a, P2b) associated with said at least one sheet generated in the form of a diverging beam (B1, B2): identifying outcoupling points corresponding to limits of the signal profile (P1, P2, P3; P1a, P1b, P2a, P2b), reconstructing limiting rays of the attenuated light path by geometrically retracing the thus-identified outcoupling points to a respective incoupling point, and modifying the distance between the limiting rays by applying said width function, thereby determining the attenuated light path.
8. The method of claim 1 or 2, wherein said determining results in a set of candidate positions (c1-c4), and wherein said identifying the positions comprises: calculating a shape measure and/or an area measure for at least one candidate position (c1-c4) based on the thus-determined attenuated light paths, and validating said at least one candidate position (c1-c4) based on the shape measure and/or area measure.
9. A computer program product comprising computer code which, when executed on a data-processing system, is adapted to carry out the method of claim 8.
10. A device for determining a position of an object (6) on a touch surface (4) included in a touch-sensing apparatus, said touch-sensing apparatus comprising a light transmissive panel (1) that defines the touch surface (4) and an opposite surface (5), a light source arrangement (2; 8) for providing sheets of light inside the panel (1), wherein each sheet comprises light that propagates by internal reflection between the touch surface (4) and the opposite surface (5) from one or more incoupling points to a set of outcoupling points, a light sensor arrangement (3; 9) for generating one or more output signals (S1, S2, S3) that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object (6) touching the touch surface (4) locally attenuates at least two sheets of light, said device comprising: an element (10) for identifying, in said output signal(s) (S1, S2, S3), a set of signal profiles (P1, P2, P3; P1a, P1b, P2a, P2b) originating from the object (6), an element (11) for determining at least part of an attenuated light path across the panel (1) based on each signal profile (P1, P2, P3; P1a, P1b, P2a, P2b), and an element (12) for identifying the position of the object (6) on the touch surface (4) based on the thus-determined attenuated light paths, wherein the determining element (11) is configured to apply a predetermined width function which is representative of a dependence of signal profile width on distance to the incoupling point due to light scattering caused by at least one of the touch surface (4) and the opposite surface (5).
11. A touch-sensing apparatus, comprising: a light transmissive panel (1) that defines a touch surface (4) and an opposite surface (5), a light source arrangement (2; 8) for providing sheets of light inside the panel (1), wherein each sheet comprises light that propagates by internal reflection between the touch surface (4) and the opposite surface (5) from one or more incoupling points to a set of outcoupling points, a light sensor arrangement (3; 9) for generating one or more output signals (S1, S2, S3) that represent light reaching the outcoupling points, wherein the touch-sensing apparatus is configured such that an object (6) touching the touch surface (4) locally attenuates at least two sheets of light, and a device (7) for determining a position according to claim 10.
PCT/EP2009/057724 2008-06-23 2009-06-22 Determining the location of one or more objects on a touch surface WO2010006883A2 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US12937208P 2008-06-23 2008-06-23
SE0801467-2 2008-06-23
US61/129,372 2008-06-23
SE0801467 2008-06-23
US20220809P 2009-02-05 2009-02-05
US61/202,208 2009-02-05
SE0900138 2009-02-05
SE0900138-9 2009-02-05

Publications (2)

Publication Number Publication Date
WO2010006883A2 true WO2010006883A2 (en) 2010-01-21
WO2010006883A3 WO2010006883A3 (en) 2010-12-02

US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
EP3250993A4 (en) * 2015-01-28 2018-10-10 FlatFrog Laboratories AB Dynamic touch quarantine frames
US10161886B2 (en) 2014-06-27 2018-12-25 Flatfrog Laboratories Ab Detection of surface contamination
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
US10401546B2 (en) 2015-03-02 2019-09-03 Flatfrog Laboratories Ab Optical component for light coupling
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10481737B2 (en) 2017-03-22 2019-11-19 Flatfrog Laboratories Ab Pen differentiation for touch display
US10496227B2 (en) 2015-02-09 2019-12-03 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US10761657B2 (en) 2016-11-24 2020-09-01 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US11256371B2 (en) 2017-09-01 2022-02-22 Flatfrog Laboratories Ab Optical component
US11301089B2 (en) 2015-12-09 2022-04-12 Flatfrog Laboratories Ab Stylus identification
US11474644B2 (en) 2017-02-06 2022-10-18 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus
US11943563B2 (en) 2019-01-25 2024-03-26 Flatfrog Laboratories Ab Videoconferencing terminal and method of operating the same

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9471170B2 (en) 2002-11-04 2016-10-18 Neonode Inc. Light-based touch screen with shift-aligned emitter and receiver lenses
US8902196B2 (en) * 2002-12-10 2014-12-02 Neonode Inc. Methods for determining a touch location on a touch screen
US8553014B2 (en) * 2008-06-19 2013-10-08 Neonode Inc. Optical touch screen systems using total internal reflection
JP5588982B2 (en) 2008-08-07 2014-09-10 ラプト アイピー リミテッド Optical control system with modulated emitter
US9092092B2 (en) * 2008-08-07 2015-07-28 Rapt Ip Limited Detecting multitouch events in an optical touch-sensitive device using touch event templates
US9063615B2 (en) 2008-08-07 2015-06-23 Rapt Ip Limited Detecting multitouch events in an optical touch-sensitive device using line images
JP5378519B2 (en) 2008-08-07 2013-12-25 ドラム,オウエン Method and apparatus for detecting multi-touch events in optical touch sensitive devices
US8619056B2 (en) * 2009-01-07 2013-12-31 Elan Microelectronics Corp. Ghost resolution for a capacitive touch panel
US9158416B2 (en) 2009-02-15 2015-10-13 Neonode Inc. Resilient light-based touch surface
US9063614B2 (en) 2009-02-15 2015-06-23 Neonode Inc. Optical touch screens
CN101751186B (en) * 2009-12-30 2011-12-21 鸿富锦精密工业(深圳)有限公司 Touch electronic device and control method thereof
US8648815B2 (en) * 2010-02-15 2014-02-11 Elo Touch Solutions, Inc. Touch panel that has an image layer and detects bending waves
US9557837B2 (en) 2010-06-15 2017-01-31 Pixart Imaging Inc. Touch input apparatus and operation method thereof
US20130271429A1 (en) * 2010-10-06 2013-10-17 Pixart Imaging Inc. Touch-control system
TWI438669B (en) * 2011-04-14 2014-05-21 Wistron Corp Optical touch module and method thereof
CN104094203B (en) 2011-07-22 2017-02-15 拉普特知识产权公司 Optical coupler for use in optical touch sensitive device
US9524060B2 (en) 2012-07-13 2016-12-20 Rapt Ip Limited Low power operation of an optical touch-sensitive device for detecting multitouch events
US9405382B2 (en) 2012-07-24 2016-08-02 Rapt Ip Limited Augmented optical waveguide for use in an optical touch sensitive device
US10282034B2 (en) 2012-10-14 2019-05-07 Neonode Inc. Touch sensitive curved and flexible displays
US9921661B2 (en) 2012-10-14 2018-03-20 Neonode Inc. Optical proximity sensor and associated user interface
US9164625B2 (en) 2012-10-14 2015-10-20 Neonode Inc. Proximity sensor for determining two-dimensional coordinates of a proximal object
US9207800B1 (en) 2014-09-23 2015-12-08 Neonode Inc. Integrated light guide and touch screen frame and multi-touch determination method
JP6195980B2 (en) * 2013-06-05 2017-09-13 エーファウ・グループ・エー・タルナー・ゲーエムベーハー Measuring apparatus and method for measuring pressure map
CN103729096A (en) * 2013-12-25 2014-04-16 京东方科技集团股份有限公司 Interaction recognition system and display unit provided with same
US20170068392A1 (en) * 2015-09-03 2017-03-09 Ceebus Technologies, Llc Touchscreen system usable in a variety of media
CN106502475B (en) * 2016-10-26 2019-07-16 青岛海信电器股份有限公司 Infrared touch point identification method, infrared touch device and display device
CN108256416B (en) * 2017-11-30 2021-04-02 北京集创北方科技股份有限公司 Biological characteristic detection method and system
US11068116B2 (en) 2019-07-24 2021-07-20 Samsung Electronics Company, Ltd. Touch detection and position reconstruction
CN115039060A (en) 2019-12-31 2022-09-09 内奥诺德公司 Non-contact touch input system
WO2022072331A1 (en) 2020-09-30 2022-04-07 Neonode Inc. Optical touch sensor

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR1452041A (en) * 1965-04-26 1966-02-25 Electronique & Automatisme Sa Communication device with an electronic calculator
US3673327A (en) * 1970-11-02 1972-06-27 Atomic Energy Commission Touch actuable data input panel assembly
US4129384A (en) * 1977-06-08 1978-12-12 Battelle Memorial Institute Optical extensometer
US4213707A (en) * 1979-04-25 1980-07-22 Eastman Kodak Company Device for improving the accuracy of optical measuring apparatus and the like
US4294543A (en) * 1979-11-13 1981-10-13 Command Control & Communications Corporation Optical system for developing point coordinate information
US4420261A (en) * 1980-09-02 1983-12-13 Lowbar, Inc. Optical position location apparatus
JPS58111705A (en) * 1981-12-25 1983-07-02 Mitsutoyo Mfg Co Ltd Optical measuring device
GB2131544B (en) 1982-12-07 1986-03-05 Lowbar Inc Optical position location apparatus
US4507557A (en) * 1983-04-01 1985-03-26 Siemens Corporate Research & Support, Inc. Non-contact X,Y digitizer using two dynamic ram imagers
US4746770A (en) * 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
CH683370A5 (en) * 1992-04-10 1994-02-28 Zumbach Electronic Ag Method and apparatus for measuring the dimension of an object.
DE69318677T2 (en) * 1992-11-25 1999-02-18 Sumitomo Electric Industries Method of detecting contaminants in molten plastic
US5764223A (en) * 1995-06-07 1998-06-09 International Business Machines Corporation Touch-screen input device using the monitor as a light source operating at an intermediate frequency
EP0897161B1 (en) 1997-08-07 2007-10-10 Fujitsu Limited Optical scanning-type touch panel
JP3827450B2 (en) * 1998-08-18 2006-09-27 富士通株式会社 Optical scanning touch panel
US6972753B1 (en) * 1998-10-02 2005-12-06 Semiconductor Energy Laboratory Co., Ltd. Touch panel, display device provided with touch panel and electronic equipment provided with display device
CA2393164C (en) * 1999-12-02 2008-04-01 Elo Touchsystems, Inc. Apparatus and method to improve resolution of infrared touch systems
US6724489B2 (en) * 2000-09-22 2004-04-20 Daniel Freifeld Three dimensional scanning camera
JP4768143B2 (en) * 2001-03-26 2011-09-07 株式会社リコー Information input / output device, information input / output control method, and program
US7254775B2 (en) * 2001-10-03 2007-08-07 3M Innovative Properties Company Touch panel system and method for distinguishing multiple touch inputs
US7265748B2 (en) * 2003-12-11 2007-09-04 Nokia Corporation Method and device for detecting touch pad input
JP4522113B2 (en) * 2004-03-11 2010-08-11 キヤノン株式会社 Coordinate input device
JP2006039686A (en) * 2004-07-22 2006-02-09 Pioneer Electronic Corp Touch panel device, touch region detecting method, and touch region detecting program
US8599140B2 (en) * 2004-11-17 2013-12-03 International Business Machines Corporation Providing a frustrated total internal reflection touch interface
WO2006095320A2 (en) 2005-03-10 2006-09-14 Koninklijke Philips Electronics, N.V. System and method for detecting the location, size and shape of multiple objects that interact with a touch screen display
US7705835B2 (en) * 2005-03-28 2010-04-27 Adam Eikman Photonic touch screen apparatus and method of use
US7629968B2 (en) * 2005-07-29 2009-12-08 Avago Technologies Fiber Ip (Singapore) Pte. Ltd. Methods and systems for detecting selections on a touch screen display
KR100782431B1 (en) 2006-09-29 2007-12-05 주식회사 넥시오 Multi position detecting method and area detecting method in infrared rays type touch screen
WO2008068607A2 (en) 2006-12-08 2008-06-12 Flatfrog Laboratories Ab Position determination in optical interface systems
CN101075168B (en) 2007-06-22 2014-04-02 北京汇冠新技术股份有限公司 Method for discriminating multiple points on infrared touch screen
EP2212763A4 (en) * 2007-10-10 2012-06-20 Flatfrog Lab Ab A touch pad and a method of operating the touch pad
KR20100121512A (en) * 2008-02-11 2010-11-17 넥스트 홀딩즈 리미티드 Systems and methods for resolving multitouch scenarios for optical touchscreens
TW201001258A (en) 2008-06-23 2010-01-01 Flatfrog Lab Ab Determining the location of one or more objects on a touch surface
TW201007530A (en) 2008-06-23 2010-02-16 Flatfrog Lab Ab Detecting the location of an object on a touch surface
TW201005606A (en) 2008-06-23 2010-02-01 Flatfrog Lab Ab Detecting the locations of a plurality of objects on a touch surface
US8542217B2 (en) 2008-06-23 2013-09-24 Flatfrog Laboratories Ab Optical touch detection using input and output beam scanners
US9317159B2 (en) * 2008-09-26 2016-04-19 Hewlett-Packard Development Company, L.P. Identifying actual touch points using spatial dimension information obtained from light transceivers

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001084251A2 (en) * 2000-05-01 2001-11-08 Tulbert David J Human-machine interface
US20030160155A1 (en) * 2001-10-09 2003-08-28 Liess Martin Dieter Device having touch sensitivity functionality
US20040252091A1 (en) * 2003-06-14 2004-12-16 Massachusetts Institute Of Technology Input device based on frustrated total internal reflection
US20070052684A1 (en) * 2005-09-08 2007-03-08 Gruhlke Russell W Position detection system using laser speckle
WO2007112742A1 (en) * 2006-03-30 2007-10-11 Flatfrog Laboratories Ab A system and a method of determining a position of a scattering/reflecting element on the surface of a radiation transmissive element

Cited By (104)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9134854B2 (en) 2008-06-23 2015-09-15 Flatfrog Laboratories Ab Detecting the locations of a plurality of objects on a touch surface
US8890843B2 (en) 2008-06-23 2014-11-18 Flatfrog Laboratories Ab Detecting the location of an object on a touch surface
WO2010006886A2 (en) * 2008-06-23 2010-01-21 Flatfrog Laboratories Ab Determining the location of one or more objects on a touch surface
WO2010006886A3 (en) * 2008-06-23 2011-05-26 Flatfrog Laboratories Ab Determining the location of one or more objects on a touch surface
US8542217B2 (en) 2008-06-23 2013-09-24 Flatfrog Laboratories Ab Optical touch detection using input and output beam scanners
US10474249B2 (en) 2008-12-05 2019-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
WO2010064983A3 (en) * 2008-12-05 2010-08-05 Flatfrog Laboratories Ab A touch sensing apparatus and method of operating the same
US10048773B2 (en) 2008-12-05 2018-08-14 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US9442574B2 (en) 2008-12-05 2016-09-13 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US8581884B2 (en) 2008-12-05 2013-11-12 Flatfrog Laboratories Ab Touch sensing apparatus and method of operating the same
US8692807B2 (en) 2009-09-02 2014-04-08 Flatfrog Laboratories Ab Touch surface with a compensated signal profile
WO2011049513A1 (en) * 2009-10-19 2011-04-28 Flatfrog Laboratories Ab Determining touch data for one or more objects on a touch surface
US9024916B2 (en) 2009-10-19 2015-05-05 Flatfrog Laboratories Ab Extracting touch data that represents one or more objects on a touch surface
US9430079B2 (en) 2009-10-19 2016-08-30 Flatfrog Laboratories Ab Determining touch data for one or more objects on a touch surface
WO2011139213A1 (en) 2010-05-03 2011-11-10 Flatfrog Laboratories Ab Touch determination by tomographic reconstruction
EP3012721A1 (en) 2010-05-03 2016-04-27 FlatFrog Laboratories AB Touch determination by tomographic reconstruction
US9996196B2 (en) 2010-05-03 2018-06-12 Flatfrog Laboratories Ab Touch determination by tomographic reconstruction
US9547393B2 (en) 2010-05-03 2017-01-17 Flatfrog Laboratories Ab Touch determination by tomographic reconstruction
US8780066B2 (en) 2010-05-03 2014-07-15 Flatfrog Laboratories Ab Touch determination by tomographic reconstruction
US9158401B2 (en) 2010-07-01 2015-10-13 Flatfrog Laboratories Ab Data processing in relation to a multi-touch sensing apparatus
US9710101B2 (en) 2010-07-01 2017-07-18 Flatfrog Laboratories Ab Data processing in relation to a multi-touch sensing apparatus
US10013107B2 (en) 2010-07-01 2018-07-03 Flatfrog Laboratories Ab Data processing in relation to a multi-touch sensing apparatus
US9411444B2 (en) 2010-10-11 2016-08-09 Flatfrog Laboratories Ab Touch determination by tomographic reconstruction
EP2652582A4 (en) * 2010-12-15 2017-06-21 FlatFrog Laboratories AB Touch determination with signal enhancement
US9594467B2 (en) 2010-12-15 2017-03-14 Flatfrog Laboratories Ab Touch determination with signal enhancement
US9274645B2 (en) 2010-12-15 2016-03-01 Flatfrog Laboratories Ab Touch determination with signal enhancement
US8872098B2 (en) 2010-12-16 2014-10-28 Flatfrog Laboratories Ab Scanning FTIR systems for touch detection
US8872801B2 (en) 2010-12-16 2014-10-28 Flatfrog Laboratories Ab Touch apparatus with separated compartments
EP2466428A2 (en) 2010-12-16 2012-06-20 FlatFrog Laboratories AB Touch apparatus with separated compartments
EP2466429A1 (en) 2010-12-16 2012-06-20 FlatFrog Laboratories AB Scanning ftir systems for touch detection
EP3173914A1 (en) 2011-02-02 2017-05-31 FlatFrog Laboratories AB Optical incoupling for touch-sensitive systems
WO2012105893A1 (en) 2011-02-02 2012-08-09 Flatfrog Laboratories Ab Optical incoupling for touch-sensitive systems
US9552103B2 (en) 2011-02-02 2017-01-24 Flatfrog Laboratories Ab Optical incoupling for touch-sensitive systems
US10151866B2 (en) 2011-02-02 2018-12-11 Flatfrog Laboratories Ab Optical incoupling for touch-sensitive systems
US8890849B2 (en) 2011-09-27 2014-11-18 Flatfrog Laboratories Ab Image reconstruction for touch determination
WO2013055282A2 (en) 2011-10-11 2013-04-18 Flatfrog Laboratories Ab Improved multi-touch detection in a touch system
US9377884B2 (en) 2011-10-11 2016-06-28 Flatfrog Laboratories Ab Multi-touch detection in a touch system
US9317168B2 (en) 2011-12-16 2016-04-19 Flatfrog Laboratories Ab Tracking objects on a touch surface
CN104081323B (en) * 2011-12-16 2016-06-22 平蛙实验室股份公司 Follow the tracks of the object on touch-surface
EP3506069A1 (en) 2011-12-16 2019-07-03 FlatFrog Laboratories AB Tracking objects on a touch surface
CN104081323A (en) * 2011-12-16 2014-10-01 平蛙实验室股份公司 Tracking objects on a touch surface
WO2013089622A2 (en) 2011-12-16 2013-06-20 Flatfrog Laboratories Ab Tracking objects on a touch surface
US9927920B2 (en) 2011-12-16 2018-03-27 Flatfrog Laboratories Ab Tracking objects on a touch surface
WO2013089623A3 (en) * 2011-12-16 2013-09-12 Flatfrog Laboratories Ab Tracking objects on a touch surface
US8982084B2 (en) 2011-12-16 2015-03-17 Flatfrog Laboratories Ab Tracking objects on a touch surface
US9639210B2 (en) 2011-12-22 2017-05-02 Flatfrog Laboratories Ab Touch determination with interaction compensation
US10372265B2 (en) 2012-01-31 2019-08-06 Flatfrog Laboratories Ab Performance monitoring and correction in a touch-sensitive apparatus
US9588619B2 (en) 2012-01-31 2017-03-07 Flatfrog Laboratories Ab Performance monitoring and correction in a touch-sensitive apparatus
CN104094206A (en) * 2012-02-08 2014-10-08 微软公司 Optical touch navigation
WO2013126005A3 (en) * 2012-02-21 2013-12-19 Flatfrog Laboratories Ab Touch determination with improved detection of weak interactions
US10031623B2 (en) 2012-02-21 2018-07-24 Flatfrog Laboratories Ab Touch determination with improved detection of weak interactions
US9811209B2 (en) 2012-02-21 2017-11-07 Flatfrog Laboratories Ab Touch determination with improved detection of weak interactions
WO2013133756A1 (en) 2012-03-09 2013-09-12 Flatfrog Laboratories Ab Efficient tomographic processing for touch determination
US9684414B2 (en) 2012-03-09 2017-06-20 Flatfrog Laboratories Ab Efficient tomographic processing for touch determination
WO2013133757A2 (en) 2012-03-09 2013-09-12 Flatfrog Laboratories Ab Efficient tomographic processing for touch determination
US9760233B2 (en) 2012-03-09 2017-09-12 Flatfrog Laboratories Ab Efficient tomographic processing for touch determination
US10318041B2 (en) 2012-05-02 2019-06-11 Flatfrog Laboratories Ab Object detection in touch systems
WO2013165305A2 (en) 2012-05-02 2013-11-07 Flatfrog Laboratories Ab Object detection in touch systems
WO2013165306A2 (en) 2012-05-02 2013-11-07 Flatfrog Laboratories Ab Object detection in touch systems
US9626018B2 (en) 2012-05-02 2017-04-18 Flatfrog Laboratories Ab Object detection in touch systems
US10168835B2 (en) 2012-05-23 2019-01-01 Flatfrog Laboratories Ab Spatial resolution in touch displays
WO2013176615A3 (en) * 2012-05-23 2014-03-20 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
US10001881B2 (en) 2012-05-23 2018-06-19 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
WO2013176614A3 (en) * 2012-05-23 2014-04-10 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
US9626040B2 (en) 2012-05-23 2017-04-18 Flatfrog Laboratories Ab Touch-sensitive apparatus with improved spatial resolution
EP2852879A4 (en) * 2012-05-23 2016-02-17 Flatfrog Lab Ab Touch-sensitive apparatus with improved spatial resolution
US10019113B2 (en) 2013-04-11 2018-07-10 Flatfrog Laboratories Ab Tomographic processing for touch detection
US9874978B2 (en) 2013-07-12 2018-01-23 Flatfrog Laboratories Ab Partial detect mode
US10126882B2 (en) 2014-01-16 2018-11-13 Flatfrog Laboratories Ab TIR-based optical touch systems of projection-type
WO2015108480A1 (en) * 2014-01-16 2015-07-23 Flatfrog Laboratories Ab Improvements in tir-based optical touch systems of projection-type
US10146376B2 (en) 2014-01-16 2018-12-04 Flatfrog Laboratories Ab Light coupling in TIR-based optical touch systems
WO2015108479A1 (en) * 2014-01-16 2015-07-23 Flatfrog Laboratories Ab Light coupling in tir-based optical touch systems
US10161886B2 (en) 2014-06-27 2018-12-25 Flatfrog Laboratories Ab Detection of surface contamination
US11182023B2 (en) 2015-01-28 2021-11-23 Flatfrog Laboratories Ab Dynamic touch quarantine frames
EP3250993A4 (en) * 2015-01-28 2018-10-10 FlatFrog Laboratories AB Dynamic touch quarantine frames
US10318074B2 (en) 2015-01-30 2019-06-11 Flatfrog Laboratories Ab Touch-sensing OLED display with tilted emitters
US11029783B2 (en) 2015-02-09 2021-06-08 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US10496227B2 (en) 2015-02-09 2019-12-03 Flatfrog Laboratories Ab Optical touch system comprising means for projecting and detecting light beams above and inside a transmissive panel
US10401546B2 (en) 2015-03-02 2019-09-03 Flatfrog Laboratories Ab Optical component for light coupling
US11301089B2 (en) 2015-12-09 2022-04-12 Flatfrog Laboratories Ab Stylus identification
CN105808023A (en) * 2016-03-14 2016-07-27 青岛海信电器股份有限公司 Touch point identification method and infrared touch control device
CN105808023B (en) * 2016-03-14 2019-01-29 青岛海信电器股份有限公司 Touch point identification method and infrared touch device
US10761657B2 (en) 2016-11-24 2020-09-01 Flatfrog Laboratories Ab Automatic optimisation of touch signal
US11281335B2 (en) 2016-12-07 2022-03-22 Flatfrog Laboratories Ab Touch device
US10775935B2 (en) 2016-12-07 2020-09-15 Flatfrog Laboratories Ab Touch device
US11579731B2 (en) 2016-12-07 2023-02-14 Flatfrog Laboratories Ab Touch device
US10282035B2 (en) 2016-12-07 2019-05-07 Flatfrog Laboratories Ab Touch device
US11740741B2 (en) 2017-02-06 2023-08-29 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US11474644B2 (en) 2017-02-06 2022-10-18 Flatfrog Laboratories Ab Optical coupling in touch-sensing systems
US10606414B2 (en) 2017-03-22 2020-03-31 Flatfrog Laboratories Ab Eraser for touch displays
US11016605B2 (en) 2017-03-22 2021-05-25 Flatfrog Laboratories Ab Pen differentiation for touch displays
US10481737B2 (en) 2017-03-22 2019-11-19 Flatfrog Laboratories Ab Pen differentiation for touch display
US11099688B2 (en) 2017-03-22 2021-08-24 Flatfrog Laboratories Ab Eraser for touch displays
US11269460B2 (en) 2017-03-28 2022-03-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10606416B2 (en) 2017-03-28 2020-03-31 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11281338B2 (en) 2017-03-28 2022-03-22 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10437389B2 (en) 2017-03-28 2019-10-08 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10845923B2 (en) 2017-03-28 2020-11-24 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US10739916B2 (en) 2017-03-28 2020-08-11 Flatfrog Laboratories Ab Touch sensing apparatus and method for assembly
US11256371B2 (en) 2017-09-01 2022-02-22 Flatfrog Laboratories Ab Optical component
US11650699B2 (en) 2017-09-01 2023-05-16 Flatfrog Laboratories Ab Optical component
US11567610B2 (en) 2018-03-05 2023-01-31 Flatfrog Laboratories Ab Detection line broadening
US11943563B2 (en) 2019-01-25 2024-03-26 Flatfrog Laboratories Ab Videoconferencing terminal and method of operating the same
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus

Also Published As

Publication number Publication date
US20110090176A1 (en) 2011-04-21
US8482547B2 (en) 2013-07-09
WO2010006883A3 (en) 2010-12-02
TW201013492A (en) 2010-04-01
EP2318905A2 (en) 2011-05-11
EP2318905B1 (en) 2017-08-16

Similar Documents

Publication Publication Date Title
EP2318905B1 (en) Determining the location of one or more objects on a touch surface
JP5782446B2 (en) Determination of contact data for one or more objects on the contact surface
EP2318904B1 (en) Determining the location of one or more objects on a touch surface
US10474249B2 (en) Touch sensing apparatus and method of operating the same
US9134854B2 (en) Detecting the locations of a plurality of objects on a touch surface
JP5582622B2 (en) Contact surface with compensation signal profile
US20170185230A1 (en) Touch determination with interaction compensation
US20120200538A1 (en) Touch surface with two-dimensional compensation
WO2010134865A1 (en) Determining the location of an object on a touch surface
KR20120025336A (en) Infrared touch screen devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09779863

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 12737016

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2009779863

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2009779863

Country of ref document: EP