WO2013179168A1 - User-input interface with a light guide - Google Patents
User-input interface with a light guide
- Publication number
- WO2013179168A1 (PCT/IB2013/053905)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- light
- light guide
- photodetector
- input interface
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0428—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04109—FTIR in optical digitiser, i.e. touch detection by frustrating the total internal reflection within an optical waveguide due to changes of optical properties or deformation at the touch location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/0418—Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/04182—Filtering of noise external to the device and not generated by digitiser components
Definitions
- the one or more LEDs 50 may be driven using pulse width modulation (PWM). This allows the brightness of the light emitted by the LEDs 50 to be varied.
- the electronic circuitry 26 may be configured so as to separate out, by filtering, the high frequency elements of the photodetector signal (i.e. the elements resulting from the PWM-driven LEDs 50) from the low frequency elements of the signal that are due to external light.
- the frequency of the PWM may be used to amplify, for example using a lock-in amplifier, those elements of the photodetector signal that result from the LEDs 50.
- the electronic circuitry 26 may cause actions to be performed based only on the elements of the signal that are due to the LEDs 50 or based on only the elements of the signal that result from external light.
- the electronic circuitry 26 may be operable to switch between detecting user inputs based on only external light sources and detecting user input based on only the light emitted by the LEDs 50.
- although the LEDs 50 have been described as being driven by PWM, it will be appreciated that they may alternatively be driven using other modulation methods, such as amplitude modulation. As long as the modulation is at a frequency that is distinguishable from the fluctuations of external light, the LED light can be distinguished from external light using filtering or amplification.
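As an illustration of how the modulated LED contribution might be separated from slowly varying external light, the following sketch demodulates a simulated photodetector signal against a square-wave reference at the PWM frequency and low-pass filters the product. The sampling rate, modulation frequency and signal shape are assumptions made for illustration only.

```python
import numpy as np

def lock_in_demodulate(signal: np.ndarray, fs: float, f_mod: float, window_s: float = 0.05) -> np.ndarray:
    """Recover the amplitude of the component of `signal` modulated at f_mod (Hz).

    Multiplies the photodetector samples by a square-wave reference at the PWM
    frequency and low-pass filters the product with a moving average, which
    rejects slowly varying external light. All parameters are illustrative.
    """
    t = np.arange(len(signal)) / fs
    reference = np.sign(np.sin(2.0 * np.pi * f_mod * t))   # square-wave reference at the PWM frequency
    mixed = signal * reference                              # shifts the LED component down to DC
    n = max(1, int(window_s * fs))                          # moving-average low-pass window
    kernel = np.ones(n) / n
    return np.convolve(mixed, kernel, mode="same")

# Synthetic example: slow daylight drift plus a 1 kHz PWM component whose
# amplitude dips while a finger is on the light guide (values are assumed).
fs, f_pwm = 20_000.0, 1_000.0
t = np.arange(0, 2.0, 1.0 / fs)
daylight = 1.0 + 0.2 * np.sin(2 * np.pi * 0.5 * t)              # external light fluctuation
led_amplitude = np.where((t > 0.8) & (t < 1.2), 0.25, 0.5)      # touch reduces the LED light reaching the detector
photodetector = daylight + led_amplitude * np.sign(np.sin(2 * np.pi * f_pwm * t))
led_component = lock_in_demodulate(photodetector, fs, f_pwm)
print("LED component before touch: %.2f, during touch: %.2f"
      % (led_component[int(0.4 * fs)], led_component[int(1.0 * fs)]))
```

A band-pass filter centred on the modulation frequency, as mentioned above, would achieve a similar separation.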
- by detecting user inputs using only the modulated LED light, a more controlled environment can be created, since changes in external light levels not caused by the user, such as other people passing near the surface of the light guide or fluctuating external light sources (e.g. televisions), do not affect the detection.
- alternatively, the electronic circuitry 26 may instead be configured to detect user inputs using only the external light arriving at the photodetector 22. In this case, the light injected by the LEDs 50 may make changes in the light from external sources difficult to detect, so by filtering out the light from the LEDs 50 a higher signal-to-noise ratio can be achieved. It may be preferable to detect light from external sources as opposed to light emitted by the LEDs because it is desirable for the user to be able to provide user inputs without actually touching the surface of the light guide.
- the electronic circuitry 26 may be configured to control the operation of the one or more LEDs 50 in dependence on signals received from the user-input interface.
- Figures 5A and 5B are qualitative graphs showing the signal output by the photodetector 22 (and thus also the light level arriving at the photodetector) when the user's finger is at different locations on the surface of the light guide 20.
- Figure 5A illustrates the light level pattern when the user's finger is aligned with the photodetector 22 and is moved perpendicularly away from the first edge 20-1 (in other words, directly away from the photodetector).
- Figure 5B illustrates the light level pattern when the user's finger 24 is moved in a direction parallel to the first edge 20-1.
- the reduction in the amount of light reaching the photodetector 22 is most pronounced when the user's finger 24 is aligned with the photodetector 22.
- the reduction in the amount of light quickly drops off to almost zero as the user moves their finger to either side of the aligned position.
- the electronic circuitry 26 may be configured to recognize specific types of user input, or gestures, from the pattern of fluctuations in the received signal.
- the electronic circuitry 26 may be configured to cause a particular action to be performed, depending on the type of user input detected. For example, the electronic circuitry 26 may be configured to recognize "swipe inputs" (i.e. linear movements of the user's finger on or proximate to the surface) and to cause different actions to be performed based on the type of the swipe input (e.g. in a parallel direction or a perpendicular direction).
- the electronic circuitry 26 may recognize a high photodetector signal, followed by a sudden significant decrease, followed by a sudden return to the original high level as a parallel swipe, and may, in emissive window examples, cause the LEDs 50 to be switched on or off.
- the electronic circuitry 26 may be configured to respond to the detection of a perpendicular swipe input (i.e. a high signal followed by a gradual reduction or a low signal followed by a gradual increase) by causing the brightness of the LEDs 50 to be changed.
- the user-input interface is operable also to detect the momentary presence of a non-moving finger on or proximate to the surface 20-2 of the light guide 20. This type of gesture may be referred to as a "tap input".
- the electronic circuitry 26 may also or alternatively associate different actions with different types of tap inputs (e.g. a short tap or long tap).
- complex gestures may be required in order for the electronic circuitry 26 to recognize, or detect, a user input and to respond by causing a certain action to occur.
- Complex gestures comprise more than one successive (or linked) gesture and result in a recognizable, specific series of photodetector 22 signal changes, which is less likely to occur as a result of an unintentional event.
- the electronic circuitry 26 may be configured to detect a user input only in response to a specific series of changes, or fluctuations, in the signal received from the photodetector 22. Examples of complex gestures include a "back-and-forth" swipe (i.e. a swipe across the surface in one direction and then back again) and a "double tap" (i.e. two taps in quick succession).
- the specific actions that are caused to be performed (or are triggered) by the electronic circuitry 26 may vary depending on the purpose for which the user-input interface is being used. As an example, when the user-input interface is part of an emissive window, the different actions may include, but are not limited to, switching the LEDs on or off, dimming the LEDs up or down, changing the color of the emitted light and causing dynamic light effects to occur.
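By way of illustration, recognized gesture types might be dispatched to such actions as in the sketch below; the gesture names, the LedDriver stand-in and its methods are assumptions, not details taken from the text.

```python
from typing import Callable, Dict

class LedDriver:
    """Minimal stand-in for the circuitry driving the one or more LEDs 50 (assumed interface)."""
    def __init__(self) -> None:
        self.on = False
        self.brightness = 1.0

    def toggle(self) -> None:
        self.on = not self.on

    def dim(self, step: float) -> None:
        self.brightness = min(1.0, max(0.0, self.brightness + step))

def build_gesture_actions(leds: LedDriver) -> Dict[str, Callable[[], None]]:
    # Each recognized gesture type triggers one action, as described in the text.
    return {
        "parallel_swipe": leds.toggle,                           # switch the LEDs on or off
        "perpendicular_swipe_towards": lambda: leds.dim(+0.1),   # brighten
        "perpendicular_swipe_away": lambda: leds.dim(-0.1),      # dim
    }

leds = LedDriver()
actions = build_gesture_actions(leds)
actions["parallel_swipe"]()            # e.g. recognized from a sudden dip and recovery in the signal
actions["perpendicular_swipe_away"]()
print(leds.on, leds.brightness)
```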
- Figure 6 shows another example of a user-input interface 6 in accordance with the invention.
- the user-input interface 6 of Figure 6 differs from that of Figure 2 only in that it comprises plural photo detectors 22, each operable to detect an amount of light arriving thereat from within the light guide 20.
- the electronic circuitry 26 receives a signal from each photodetector 22.
- the user-input interface 6 comprises plural photo detectors 22 provided on the first edge 20-1 and plural photo detectors 22 provided on a different edge 20-5, which in this example is perpendicular to the first edge 20-1.
- plural photo detectors 22 may be provided on just a single edge.
- one or more photo detectors 22 may be provided on each of plural edges (e.g. 20-1 and 20-5).
- the provision of plural photo detectors 22 on a single edge 20-1 enables more information to be determined regarding the horizontal movement of the user's finger on the surface 20-2 of the light guide 20. For example, it can be determined whether a parallel swipe is left-to-right or right-to-left. Also, the provision of plural photo detectors 22 on a single edge enables more than one simultaneous input (i.e. "multi-touch") to be detected.
- the provision of one or more photo detectors 22 on each of plural edges allows the sensitivity of the user-input interface to be increased and/or allows more complex gestures to be identified.
- the provision of one or more photo detectors 22 on opposite edges e.g. 20-1, 20-4 provides greater sensitivity. This is because a movement of the user's finger 24 away from a first photodetector 22 on the first edge 20-1, which reduces the detectability of the finger by the first photodetector 22, is necessarily towards a second photodetector 22 on the opposite edge 20-4, and so increases the detectability of the finger by the second photodetector 22. As such, the user's finger 24 is more likely to be detected regardless of its location on the surface 20-2.
- the provision of one or more photo detectors 22 on perpendicular edges (e.g. 20-1, 20-5) allows two-dimensional gestures, such as circular swipes, to be more easily distinguished from linear, one-dimensional gestures.
- the provision of plural photo detectors 22 at each of plural edges of the light guide 20 may allow the location of the user input to be determined based on the locations of the two or more photo detectors 22 which are detecting a reduced amount of light.
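One possible way to turn the per-detector reductions into a location estimate, assuming the detector positions are known, is an attenuation-weighted centroid. This is only an illustrative sketch; the text does not prescribe a particular localization method.

```python
import numpy as np

def estimate_touch_location(detector_xy: np.ndarray, baseline: np.ndarray, signal: np.ndarray) -> np.ndarray:
    """Estimate a touch location as the attenuation-weighted centroid of the
    photodetector positions (illustrative only). `detector_xy` is an (N, 2)
    array of detector coordinates on the edges of the light guide; `baseline`
    and `signal` are per-detector light levels without and with the touch.
    """
    attenuation = np.clip(baseline - signal, 0.0, None)   # reduction in light seen by each detector
    if attenuation.sum() == 0.0:
        raise ValueError("no detector reports a reduction in light")
    weights = attenuation / attenuation.sum()
    return weights @ detector_xy

# Three detectors on the bottom edge and three on the left edge of a 1 x 1 guide (assumed layout).
detectors = np.array([[0.2, 0.0], [0.5, 0.0], [0.8, 0.0],
                      [0.0, 0.2], [0.0, 0.5], [0.0, 0.8]])
baseline = np.full(6, 1.0)
signal = np.array([0.95, 0.7, 0.9, 0.9, 0.75, 0.95])      # strongest dips near the middle of each edge
print(estimate_touch_location(detectors, baseline, signal))
```

The estimate is deliberately crude; it only indicates which region of the guide the input occurred in, which is consistent with the coarse location information described above.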
- user-input interfaces in accordance with the invention may comprise plural photo detectors at various locations around the perimeter of the light guide in addition to one or more LEDs.
- the light guide 20 is planar. In other words, the light guide is formed of a flat sheet, or layer, of material. However, in some examples, the light guide 20 may be formed of a curved sheet of material. Also, in the examples of Figures 2, 4 and 6, the light guide 20 is rectangular, specifically square, and as such has two pairs of opposing parallel edges. However, in other examples, the light guide 20 may be a different shape, such as circular or triangular and so may have only one continuous edge or three edges respectively. In such examples, instead of two photo detectors 22 being provided on different perpendicular edges (so as to increase detectability of two dimensional gestures), they may instead be provided on two different edge portions that are nonparallel (i.e. are not parallel with one another).
- the at least one processor 26-1 may comprise any number of processors and/or microprocessors.
- the at least one memory 26-2 may comprise any number of discrete memory media of any suitable type (e.g. ROM, RAM, EEPROM, FLASH etc.).
- the electronic circuitry 26 may comprise a microcontroller.
- the electronic circuitry 26 may comprise any combination of suitable analogue electrical components, as well as or instead of the at least one processor 26-1 and the at least one memory 26-2.
- although a finger is described as providing the user inputs, it will be appreciated that the object could be something other than the user's finger.
- a user's hand could be detected on or near the surface of the light guide.
- a stylus could be used to provide the user inputs.
- the stylus must be configured such that the contact area between the stylus and the surface of the light guide is sufficiently large for the user input to be detected.
- Figure 7 is a flow chart illustrating an example of the way in which the electronic circuitry 26 may operate to cause actions to be performed based on user inputs received at the user input interface.
- in step S1, a signal indicative of the amount of light arriving at the photodetector 22 is continuously received from the photodetector 22.
- if the user-input interface includes plural photodetectors 22, plural signals, one from each photodetector, are received by the electronic circuitry 26.
- in step S2, the raw photodetector signal (or signals) is pre-processed, for example, by filtering. Pre-processing can be used, for example, to remove noise from the raw signal before processing for input detection is carried out. In some examples, the pre-processing of step S2 is omitted.
- in step S3, the pre-processed (or raw) signal is stored in a buffer in the at least one memory 26-2.
- in step S4, the stored signal is processed so as to provide a processed signal from which recognizable parameters can be extracted.
- the signal processing of step S4 may be carried out using many different types of filtering algorithm. For example, a high pass filter may be used to isolate those components of the photodetector signal that are due to deliberate user inputs. Another example of a suitable filtering algorithm is to take the derivative over time of the received signal.
- in step S5, the signal parameters are extracted or calculated from the processed signal.
- in step S6, the extracted parameters are compared with stored reference signal signatures.
- in step S7, it is determined whether there is a match between the extracted signal parameters and one of the reference signal signatures. If so, the method proceeds to step S8, in which the electronic circuitry 26 causes an action that is associated with the matched reference signal signature to be performed. If no match is detected, the method returns to step S1.
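A minimal sketch of the processing loop of Figure 7 follows. The buffer length, the smoothing and differentiation filters, the extracted parameters and the signature format are all assumptions made for illustration.

```python
from collections import deque
import numpy as np

BUFFER_LEN = 2000   # assumed number of samples kept for analysis

def preprocess(raw: np.ndarray) -> np.ndarray:
    """Step S2: simple smoothing to remove high-frequency noise (assumed filter)."""
    kernel = np.ones(5) / 5
    return np.convolve(raw, kernel, mode="same")

def process(buffered: np.ndarray) -> np.ndarray:
    """Step S4: differentiate over time so slow background drift is removed
    and sudden, user-induced dips stand out."""
    return np.diff(buffered)

def extract_parameters(processed: np.ndarray, threshold: float = 0.1) -> dict:
    """Step S5: count and time the sudden dips (a stand-in for the parameters
    described in the text, such as number of movements, timing and magnitude)."""
    dips = np.flatnonzero(processed < -threshold)
    return {"num_dips": len(dips),
            "max_magnitude": float(-processed.min()) if len(processed) else 0.0}

def matches(params: dict, signature: dict) -> bool:
    """Steps S6 and S7: compare the extracted parameters against a stored reference signature."""
    return params["num_dips"] >= signature["min_dips"] and params["max_magnitude"] >= signature["min_magnitude"]

SIGNATURES = {"double_back_and_forth_swipe": {"min_dips": 4, "min_magnitude": 0.1}}   # assumed format

def handle_samples(sample_buffer: deque, new_samples: np.ndarray) -> None:
    sample_buffer.extend(new_samples)                        # steps S1 and S3: receive and buffer the signal
    processed = process(preprocess(np.array(sample_buffer)))
    params = extract_parameters(processed)
    for name, signature in SIGNATURES.items():
        if matches(params, signature):                       # step S8: trigger the associated action
            print(f"detected {name}: triggering the associated action")

buffer: deque = deque(maxlen=BUFFER_LEN)
# Synthetic signal with four dips, roughly what a double back-and-forth swipe produces.
handle_samples(buffer, np.concatenate([np.ones(100), 0.3 * np.ones(20), np.ones(100)] * 4))
```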
- Figure 8A shows a "double back-and-forth swipe" gesture. This is illustrated by the four arrows M1 to M4, which show the distinct movements that constitute the gesture. Although M1 to M4 are shown at different distances from the edge 20-1 at which the photodetector 22 is provided, this is for illustration purposes only.
- Figure 8B shows, in the top graph, the resulting raw signal output by the photodetector 22 and, in the bottom graph, the processed signal (following step S4).
- the raw signal of Figure 8B has a main low-frequency envelope E and four sets of sudden signal variations V1 to V4.
- the low frequency envelope E is due to changes in background illumination caused by, for example, clouds passing in front of the sun or people passing by the light guide.
- the four sets of sudden signal variations V1 to V4 result from sudden reductions in light caused by the movements M1 to M4 respectively.
- the signal is processed by differentiating the raw photodetector signal. The result of this can be seen in the lower graph. In this instance, differentiating the raw signal retains the characteristic fluctuations V1 to V4 that are due to the user input and virtually eliminates the variation in signal intensity that is due to the changes in background illumination. From this differentiated signal, various parameters can be identified.
- the proximity of the user's hand to the photodetector may be identifiable from the magnitude of the fluctuations in the processed signal.
- the frequency at which the user's hand is moved back and forth across the surface of the light guide can be identified from the separation between the fluctuations.
- Figures 9A and 9B show a user input gesture and the resulting received photodetector signals from three photodetectors 22-1 , 22-2, 22-3 provided at a single edge 20-1 of the light guide.
- the user input gesture of Figure 9A is the same as that shown on Figure 8A.
- Figure 9B shows, in the topmost graph, the raw signal received from the first photodetector 22-1, in the next graph down, the signal received from the second photodetector 22-2 and, in the third graph down, the signal received from the third photodetector 22-3.
- the lower three graphs show the processed versions of these signals. Specifically, the fourth graph shows the processed first photodetector signal, the fifth graph shows the processed second photodetector signal and the sixth graph shows the processed third photodetector signal.
- the provision of plural photodetectors 22-1, 22-2, 22-3 allows additional parameters to be identified. Specifically, the direction of movement and the speed of movement can also be determined. These can be determined by comparing the times at which sudden signal variations occur at each of the photodetectors 22-1, 22-2, 22-3.
- Dashed boxes P1 to P4 on Figure 9B each show three fluctuations, one in each photodetector signal, that result from a single movement of the gesture.
- P1 shows the fluctuations that result from the first movement M1.
- P2 shows the fluctuations that result from the second movement M2.
- P3 shows the fluctuations that result from the third movement M3.
- P4 shows the fluctuations that result from the fourth movement M4.
- the direction of each movement is determined from the order in which the three fluctuations occur.
- in P1, a fluctuation occurs first in the signal from the first photodetector 22-1, then in the signal from the second photodetector 22-2 and finally in the signal from the third photodetector 22-3.
- as the first photodetector 22-1 is the leftmost of the three and the third photodetector 22-3 is the rightmost of the three (see Figure 9A), it can be determined that the direction of movement M1 is from left to right.
- in P2, the fluctuations occur in the opposite order and so it can be determined that the direction of movement M2 is from right to left.
- the speed of a movement can be determined from the time difference between the first-occurring fluctuation and the occurrence of a fluctuation in a signal from a different photodetector 22-2, 22-3.
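The timing-based determination of direction and speed might look as follows; the detector positions and dip times are assumed values for illustration.

```python
def direction_and_speed(dip_times: dict, detector_x: dict) -> tuple:
    """Infer direction and speed of a swipe from the times at which each
    photodetector sees its dip. `dip_times` maps detector id -> time (s) of
    the dip; `detector_x` maps detector id -> position (m) along the edge.
    Both mappings are illustrative assumptions.
    """
    ids = sorted(dip_times, key=dip_times.get)               # order in which the dips occurred
    first, last = ids[0], ids[-1]
    direction = "left_to_right" if detector_x[last] > detector_x[first] else "right_to_left"
    dt = dip_times[last] - dip_times[first]
    speed = abs(detector_x[last] - detector_x[first]) / dt if dt > 0 else float("inf")
    return direction, speed

# Dips seen first at the leftmost detector 22-1 and last at the rightmost detector 22-3.
times = {"22-1": 0.10, "22-2": 0.16, "22-3": 0.22}
positions = {"22-1": 0.1, "22-2": 0.3, "22-3": 0.5}          # assumed detector spacing in metres
print(direction_and_speed(times, positions))                 # ('left_to_right', ~3.3 m/s)
```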
- the parameters that may be extracted from the processed signal of Figure 9B include: the number of distinct movements, the direction of each movement, the speed of each movement, the time at which each movement started or finished, and the proximity of each movement to the photodetector(s) 22.
- the extracted signal parameters may be as follows:
- the gesture is a "double back-and-forth swipe"
- the speeds of the four movements are all approximately equal (i.e. v1 ≈ v2 ≈ v3 ≈ v4)
- the time differences between the start/end of one movement and the start/end of the next are approximately equal (i.e. t1-t2 ≈ t2-t3 ≈ t3-t4)
- the proximities are all approximately equal (i.e. p1 ≈ p2 ≈ p3 ≈ p4).
- the signal signature may comprise four successive movements, each being in an opposite direction to the last and having approximately equal speeds and equal proximities to the photodetectors. If the parameters extracted in step S5 of Figure 7 match this signature, then the electronic circuitry 26 recognizes the gesture as a double back-and-forth swipe and causes the associated action to be performed.
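Expressed as a check on extracted parameters, such a signature might be tested as in the sketch below; the movement representation and the tolerance are assumptions.

```python
def is_double_back_and_forth(movements: list, tol: float = 0.25) -> bool:
    """Check extracted movement parameters against the signature described above:
    four successive movements, each reversing the previous one, with approximately
    equal speeds and proximities. Each movement is a dict with 'direction' (+1 or -1),
    'speed' and 'proximity'; the representation and tolerance are assumed.
    """
    if len(movements) != 4:
        return False
    if any(movements[i]["direction"] == movements[i + 1]["direction"] for i in range(3)):
        return False                                  # each movement must be opposite to the last
    def roughly_equal(values):
        return (max(values) - min(values)) <= tol * max(values)
    return (roughly_equal([m["speed"] for m in movements])
            and roughly_equal([m["proximity"] for m in movements]))

moves = [{"direction": +1, "speed": 1.0, "proximity": 0.8},
         {"direction": -1, "speed": 1.1, "proximity": 0.8},
         {"direction": +1, "speed": 0.9, "proximity": 0.7},
         {"direction": -1, "speed": 1.0, "proximity": 0.75}]
print(is_double_back_and_forth(moves))   # True
```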
- Figure 10A shows a circular clockwise gesture in respect of a user-input interface having three photodetectors 22-1, 22-2, 22-3.
- Figure 10B shows the resulting raw signals from each photodetector. In this example, as there is no change in background illumination, the raw signal can be used to identify the signal parameters.
- the gesture of Figure 10A results in a first set of three signal fluctuations F1 that occur when the user's hand is approximately at the first location on Figure 9A.
- in the first set of fluctuations F1, the first fluctuation appears in the signal from the third photodetector 22-3 and the final fluctuation appears in the signal from the first photodetector 22-1.
- the direction can be identified as right to left.
- in the second set of fluctuations F2, the first fluctuation appears in the signal from the first photo detector 22-1 and the final fluctuation appears in the signal from the third photodetector 22-3.
- the direction can be identified as left to right.
- in the first set F1, the magnitude of the fluctuations is greater than it is in set F2. This is because the user's hand is closer to the photodetectors 22 and so prevents a greater amount of light from reaching the photodetectors 22.
- the speeds and times of fluctuations may be calculated as described above.
- the signature that identifies the clockwise circular gesture may comprise two sets of fluctuations having approximately equal speeds (i.e. v1 ≈ v2) and opposite directions of movement, with the right-to-left movement being nearer to the photodetectors than is the left-to-right movement (i.e. p1 > p2). It will be appreciated that t1 may be greater or less than t2, depending on the starting position of the user's hand. If the parameters extracted from the processed signal match this signature, the electronic circuitry 26 may cause an action that is associated with a clockwise circular gesture to be performed.
- when user inputs are detected using the light emitted by the LEDs 50, the photodetector signal fluctuations are caused by extracting light from the light guide. In this case, direct physical contact between the user and the light guide is required in order to provide a user input.
- the requirement of direct physical contact between the user and the light guide greatly reduces the probability of user inputs being erroneously detected as a result of changes in background illumination. This is particularly true when the light from external sources is distinguished from light from the LEDs using the methods described above (for instance, using PWM to generate the LED light and subsequently filtering, for example Fourier filtering, the received photodetector signal).
- Initial filtering to distinguish between the different sources of light may be carried out in the pre-processing step S2 of Figure 7.
- when user inputs are detected using only external light, the photodetector signal changes are caused by preventing the in-coupling of external light into the light guide. In this case, no direct contact between the user and the light guide is required to provide a user input.
- consequently, avoiding the erroneous detection of a user input becomes more important. Fast, repetitive gestures, which are easier to distinguish from changes in background illumination, are preferable for providing user inputs than slower or singular movements.
- the electronic circuitry 26 may set the user input interface into a "standby mode". In this mode, only a relatively small number (e.g. one or two) of photodetectors may be active. As such, only limited processing power may be used while still allowing simple gestures, such as swiping repetitively back and forth across the light guide (as in Figures 8A and 9A), to be detectable. Once a simple gesture is detected, the system may set itself to an "active mode" in which all photodetectors become active and more complex gestures and user inputs can be identified.
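A sketch of this standby/active behaviour is given below. The number of detectors polled in standby, the name of the wake gesture and the condition for returning to standby are illustrative assumptions; the text does not say how the interface leaves the active mode.

```python
class InputInterfaceController:
    """Sketch of the standby/active behaviour: in standby only a couple of
    photodetectors are polled, and a simple repetitive swipe wakes the full set.
    The detector count, gesture names and timeout are assumptions for illustration.
    """
    def __init__(self, all_detectors: list) -> None:
        self.all_detectors = all_detectors
        self.mode = "standby"
        self.active_detectors = all_detectors[:2]        # only one or two detectors polled in standby

    def on_event(self, event: str) -> None:
        if self.mode == "standby" and event == "back_and_forth_swipe":
            self.mode = "active"
            self.active_detectors = self.all_detectors   # all photodetectors become active
        elif self.mode == "active" and event == "timeout":
            self.mode = "standby"
            self.active_detectors = self.all_detectors[:2]

controller = InputInterfaceController(["22-1", "22-2", "22-3", "22-4", "22-5", "22-6"])
controller.on_event("back_and_forth_swipe")
print(controller.mode, controller.active_detectors)
```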
Abstract
A user-input interface comprises a light guide having first and second surfaces and at least one edge extending between the first and second surfaces, a photodetector provided at the at least one edge and operable to output a signal indicative of an amount of light arriving at the photodetector from within the light guide, and electronic circuitry configured to detect an object in physical contact with one of the first and second surfaces of the light guide in response to receipt of a signal from the photodetector that is indicative of a reduction in the amount of light arriving at the photodetector. The user-input interface may be of single-layered construction and may be particularly suitable for controlling an emissive window.
Description
USER-INPUT INTERFACE WITH A LIGHT GUIDE
FIELD OF THE INVENTION
The invention relates to a user-input interface and in particular to a user-input interface comprising a light guide.
BACKGROUND OF THE INVENTION
'Transparent emissive windows' are windows that are transparent and appear as ordinary windows when turned off. However, when turned on, they can emit light. These transparent emissive windows can be used for both general lighting and as atmosphere providing luminaires.
An emissive window typically consists of an edge-lit light guide on which out-coupling structures are provided. When emissive windows are equipped with multiple white, or colored, light emitting diodes (LEDs) on one or more edges, a multitude of different lighting patterns and/or colors can be created.
Figure 1 shows a cross-sectional schematic of an emissive window 1. The emissive window 1 comprises a transparent light guide 10, an LED 12 provided at an edge 10-1 of the light guide 10, and a plurality of out-coupling structures 14 provided on a surface 10-2 of the light guide. The out-coupling structures 14 may comprise for example dots of white paint. The LED 12 is operable to emit light into the light guide 10 via the edge 10-1. Each of the plurality of out-coupling structures 14 is configured to scatter the LED light 16 travelling in the light guide 10, thereby to disrupt the total internal reflection of the LED light 16 such that some of it is emitted from the light guide 10 via a surface 10-3 of the light guide 10.
WO2011/067719 relates to emissive windows, and more detail regarding emissive windows can be found therein.
For users to interact with and control the light settings of an emissive window, an additional remote control (either wired or wireless) could be provided. However, as emissive windows constitute unobtrusive luminaires which are embedded in the architecture of a structure, such as a building, it may be preferable for the user interface to be unobtrusively embedded in the emissive window.
A comparable issue was encountered in the development of portable electronic devices, such as mobile phones and tablet computers, in which there was a desire to embed the user interface into the screen. This feature (i.e. touch screens) has been commercially available in consumer devices for some time and can be provided using many different technologies. Examples of common touch screen technologies include resistive, capacitive and pressure sensitive touch screens. All of these technologies require one or more layers, either continuous or patterned, to be provided on the surface of the screen. However, such layers are not completely transparent and so, if they are applied to a transparent emissive window, they might change the appearance of the window, for example by reducing the transmission of light through the window or by shifting the color of the light. This would constitute a major compromise of the window's main function (i.e. being transparent and unobtrusive). The application of these layers to large areas is also more difficult and more expensive than it is for smaller areas, such as touch screens.
SUMMARY OF THE INVENTION
In a first example of the present invention, a user-input interface is provided. The user interface comprises a light guide having first and second surfaces and at least one edge extending between the first and second surfaces, a photo detector provided at the at least one edge of the light guide and operable to output a signal indicative of an amount of light arriving at the photo detector from within the light guide, and electronic circuitry configured to detect an object in physical contact with one of the first and second surfaces of the light guide in response to receipt of a signal from the photo detector that is indicative of a reduction in the amount of light arriving at the photo detector. The user-input interface does not require a multilayered structure in order to detect objects in contact therewith. As such, transparency of the user-input interface may be increased and the cost of production may be reduced.
The electronic circuitry may be configured to detect a user input in response to receipt from the photo detector of a signal indicative of a reduction in the amount of light arriving at the photo detector; and to respond to the detection of the user input by triggering an action. User inputs may be provided by a user positioning an object adjacent to the surface.
The electronic circuitry may be configured to detect a user input in response to receipt of a signal from the photo detector that is indicative of a specific series of fluctuations in the amount of light arriving at the photo detector, and to respond to the detection of the user input by triggering an action. This reduces the chance that incidental interactions with the user-input interface are interpreted as user inputs.
The user-input interface may comprise one or more light emitting diodes provided at an edge of the light guide and operable to emit light into the light guide. Consequently, no external light may be necessary in order to utilize the user-input interface. The photo detector may be constituted by, or combined with, one of the one or more light emitting diodes. This maintains a low cost of production, as a smaller number of different components are required. The light guide may comprise a plurality of out-coupling structures. The user-input interface may be part of an emissive window.
The one or more light emitting diodes may be modulated and the electronic circuitry may be configured to detect user inputs using only those elements of the signal from the photo detector that result from the light emitted by the light emitting diodes. The electronic circuitry may be configured to detect user inputs using only those elements of the signal from the photo detector that result from external light sources. The ability to detect user inputs using only the light due to external sources may facilitate the detection of user inputs which are proximate to the surface of the light guide, but which are not actually in contact with it. The detection of user inputs using only the light from the LEDs reduces errors that may otherwise be introduced by fluctuations in the external light sources. The electronic circuitry may be configured to filter elements of the signal corresponding to a frequency of the modulation of the light emitting diodes, or may be configured to amplify the elements of the signal from the photo detector that correspond to a frequency of the modulation of the light emitting diodes.
The electronic circuitry may be configured to control the operation of the one or more light emitting diodes in response to detection of a user input. The electronic circuitry may be configured to respond to a first specific series of fluctuations in the received signal by controlling the one or more light emitting diodes in a first manner, and to respond to a second, different specific series of fluctuations in the received signal by controlling the one or more light emitting diodes in a second manner.
The user-input interface may comprise a plurality of photo detectors provided at one or more edges of the light guide. Plural photo detectors may be provided along a single edge of the light guide. This facilitates the determination of the direction of movement of a moving user input.
A first of the plurality of the photo detectors may be provided at a first edge of the light guide and a second of the plurality of photo detectors may be provided at a second edge of the light guide. This may facilitate detection of "two-dimensional user inputs" and/or may facilitate detection of user inputs irrespective of their location on the light guide. A first of the plurality of photo detectors may be provided at a first edge portion of the light guide and a second of the plurality of photo detectors may be provided at a second edge portion of the light guide, wherein the first and second edge portions are nonparallel. This facilitates detection of "two-dimensional user inputs".
BRIEF DESCRIPTION OF THE DRAWINGS
For a more complete understanding of example embodiments of the present invention, reference is now made to the following description taken in connection with the accompanying drawings in which:
Figure 1 is a cross-sectional schematic of a prior art transparent emissive window;
Figure 2 is a schematic perspective view of an example user-input interface in accordance with the invention;
Figures 3A and 3B are schematic cross-sectional views through the user-input interface of Figure 2, which illustrate the operation of user-input interfaces in accordance with the invention;
Figure 4 is a schematic perspective view of an alternative example of a user-input interface in accordance with the invention;
Figures 5A and 5B are qualitative graphs which illustrate the operation of user-input interfaces in accordance with the invention;
Figure 6 is a schematic perspective view of another alternative example of a user-input interface in accordance with the invention;
Figure 7 is a flow chart depicting an example of a method performed by features of the invention; and
Figures 8A and 8B, 9A and 9B, and 10A and 10B are explanatory illustrations of various steps of the method of Figure 7.
DETAILED DESCRIPTION OF THE EMBODIMENTS
In the drawings and the following description, like reference numerals refer to like elements.
Figure 2 is a schematic perspective view of a user-input interface 2 in accordance with the invention.
The user-input interface 2 comprises a light guide 20 and a photo detector 22. The light guide has first and second surfaces 20-2, 20-3 and at least one edge 20-1, 20-4, 20-5, 20-6 extending between the first and second surfaces 20-2, 20-3. The photo detector 22 is provided at one of the edges 20-1, 20-4, 20-5, 20-6 of the light guide 20. The photo detector 22 is operable to detect light arriving at the edge 20-1 of the light guide 20 from the interior of the light guide 20. The photo detector 22 is operable to output a signal indicative of the amount of light arriving at the photo detector 22 from inside the light guide 20. As will be understood from the following description, user-input interfaces according to the invention (such as that of Figure 2) are configured such that the amount of light arriving at the photo detector 22 is reduced when an object 24, which in Figure 2 is a user's finger, is in physical contact with a surface 20-2 of the light guide 20.
The light guide 20 is transparent. The light guide 20 is planar. The two surfaces 20-2, 20-3 of the light guide are separated by the thickness of the light guide 20. The thickness of the light guide 20 is significantly less than the dimensions of the surfaces 20-2, 20-3. As such, the first and second surfaces 20-2, 20-3 may be referred to as "main surfaces". The edges 20-1, 20-4, 20-5, 20-6 may be referred to as "edge surfaces" or "minor surfaces" that connect the two main surfaces. In the examples of Figures 2, 4 and 6, the light guide 20 is square and so comprises four edges 20-1, 20-4, 20-5, 20-6. However, it will be appreciated that the light guide may have a different shape and so may have a different number of edges. The light guide 20 is of a material that is optically matched with its surroundings in such a way that, when light is travelling within the light guide, it may be confined by means of total internal reflection. The light guide 20 may be comprised of, for example, glass or a transparent polymer or any other transparent medium, preferably having a low absorption of light. Other examples include polymethyl methacrylate (PMMA), polycarbonate and quartz.
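As a point of reference, the confinement condition mentioned above can be written out explicitly. The refractive index used below is a typical value for PMMA, assumed here for illustration rather than taken from the application:

```latex
% Light injected into the guide remains trapped by total internal reflection when its
% internal angle of incidence exceeds the critical angle at the guide-air boundary
% (Snell's law); n_guide = 1.49 is a typical value for PMMA, assumed for illustration.
\theta_c = \arcsin\left(\frac{n_{\mathrm{air}}}{n_{\mathrm{guide}}}\right)
         \approx \arcsin\left(\frac{1.00}{1.49}\right) \approx 42^\circ .
```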
The photo detector 22 may comprise any component that is suitable for detecting an amount of incident light. For example, the photo detector 22 may comprise a photodiode, a photovoltaic cell, a photo resistor or a phototransistor. In some embodiments, the photo detector 22 may comprise an LED acting as a photodiode. Typically, the signal output by the photo detector 22 increases with the amount of light arriving at the photo detector 22.
The user-input interface 2 further comprises electronic circuitry 26. The electronic circuitry, in this example, comprises at least one processor 26-1 and at least one memory 26-2. The at least one processor 26-1 is operable, under the control of computer readable code 26-2A stored on the at least one memory 26-2, to receive the signal from the photo detector 22 and to perform actions based on the signal. The electronic circuitry 26 is configured to detect an object 24 in physical contact with one of the first and second surfaces 20-2, 20-3 of the light guide 20 in response to receipt of a signal from the photo detector 22 that is indicative of a reduction in the amount of light arriving at the photo detector 22. The electronic circuitry 26 is configured to detect a user input in response to receipt from the photo detector 22 of a signal indicative of a reduction in the amount of light arriving at the photodetector; and to respond to the detection of the user input by triggering an action. As will be understood from the explanation below, a user input is constituted by a user positioning an object adjacent to a surface. In other words, a user input may be provided by a user positioning an object in contact with a surface 20-2, 20-3 of the light guide 20 or simply positioning the object proximate, but not actually in contact with, the surface 20-2, 20-3 of the light guide 20.
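To make the detection step concrete, a minimal sketch of reduction-based detection is given below. The slowly adapting baseline and the fixed drop threshold are assumptions made for illustration; the application does not specify how the reduction is recognized.

```python
class TouchDetector:
    """Minimal sketch of the detection logic: track a slowly adapting baseline of
    the photodetector signal and report a touch when the signal drops a given
    fraction below it. The adaptation rate and threshold are assumed values.
    """
    def __init__(self, drop_fraction: float = 0.15, adapt: float = 0.001) -> None:
        self.drop_fraction = drop_fraction
        self.adapt = adapt
        self.baseline = None

    def update(self, sample: float) -> bool:
        if self.baseline is None:
            self.baseline = sample
        touched = sample < (1.0 - self.drop_fraction) * self.baseline
        if not touched:
            # only let the baseline follow slow changes in ambient light
            self.baseline += self.adapt * (sample - self.baseline)
        return touched

detector = TouchDetector()
readings = [1.00, 1.01, 0.99, 0.80, 0.78, 1.00]    # a dip while the finger is on the guide
print([detector.update(r) for r in readings])       # [False, False, False, True, True, False]
```

The drop threshold sets the trade-off between sensitivity to light touches and robustness against ordinary fluctuations in background illumination.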
The operation of the user-input interface will now be described with reference to Figures 3A and 3B, which show cross-sectional views through the user-input interface of Figure 2.
As can be seen in Figures 3A and 3B, the user-input interface 2 additionally comprises a plurality of out-coupling structures 28 provided on a first surface 20-2 of the light guide 20. The out-coupling structures 28 may alternatively be provided on the second, opposite surface 20-3 of the light guide, or on both surfaces 20-2, 20-3. Alternatively or additionally, the out-coupling structures may be provided within the bulk of the light guide 20 and may comprise, for example, small scattering particles. Such out-coupling structures 28 may be required when the user-input interface is part of an emissive window. However, when user-input interfaces in accordance with the invention are used for other applications, they may not comprise out-coupling structures 28. Other applications include touch screens, or indeed any other device or apparatus requiring a user-input interface.
In Figure 3A, light (denoted by arrows L1 and L2) arrives at the first and second main surfaces 20-2, 20-3 of the light guide 20. The light L1, L2 is from external sources such as the sun or other luminaires near to the light guide 20. The light L1, L2 is scattered by the out-coupling structures 28. Some of the light scatters over a sufficiently large angle that it is captured in the light guide 20. Some of the captured light then travels within the light guide 20, via total internal reflection (TIR), until it reaches the edge 20-1 of the light guide 20 at which the photodetector 22 is provided. The paths of the captured light are denoted by arrows S1 and S2. The photodetector 22 detects the arriving light and provides to the electronic circuitry 26 (not shown in Figures 3A and 3B) a signal indicative of the amount of light arriving at the photodetector 22.
In Figure 3B, the user's finger 24 is incident on the first surface 20-2 of the light guide. The presence of the finger 24 on the surface 20-2 blocks some of the light that might otherwise enter the light guide 20 (see arrow L3) via the location on the surface 20-2 at which the finger 24 is incident. In addition, optical contact is created between the finger 24 and the light guide 20 and so some of the captured light that is incident on the location of the surface 20-2 at which the finger 24 is also incident is scattered (see arrow S1). This scattering causes some of the captured light to be emitted from the second surface 20-3 of the light guide 20. Also, as well as being scattered, some of the trapped light is absorbed by the finger 24. All of these effects combine to result in a reduced amount of light arriving at the photodetector 22. Consequently, the electronic circuitry 26 is able to identify the presence of the user's finger 24 in contact with the surface 20-2 of the light guide 20 when the signal received from the photodetector 22 is indicative of a reduction in the amount of light.
In fact, due to the blocking of light by the user's finger 24, a reduction in the amount of light arriving at the photodetector 22 is also apparent when the user's finger 24 is not actually in contact with the surface 20-2 of the light guide but is simply proximate to the surface 20-2 of the light guide. As long as the user's finger 24, or any other object, casts a shadow on the surface 20-2 of the light guide, the presence of the object is detectable by the electronic circuitry 26.
The user-input interface according to the invention may also be utilized when there is no external light. For use in this situation, the user-input interface must also comprise one or more LEDs 50. An example of such a user interface is shown in Figure 4. The user-input interface 4 of Figure 4 is substantially the same as the interface 2 described with reference to Figure 2, but also includes one or more LEDs 50. In this example, the user-input interface 4 comprises a plurality of LEDs 50. The one or more LEDs 50 are provided at an edge 20-4 of the light guide 20. The LEDs 50 are operable to emit light into the light guide 20 via the edge 20-4 at which they are provided.
The presence of the user's finger in contact with the surface 20-2 of the light guide 20 causes some of the LED light to be scattered out of the light guide 20. The user's finger also absorbs some of the LED light. Consequently, as with the example of Figures 3 A and 3B, when the user's finger 24 is in contact with a surface of the light guide 20, a reduced amount of light reaches the photodetector 22, and the photodetector signal is indicative of such.
When the user-input interface comprises out-coupling structures 28 and one or more LEDs 50, the out-coupling structures 28 cause some of the LED light to be scattered from the light guide 20. However, a sufficient amount of light remains trapped within the light guide 20 that the presence of the user's finger 24 is still detectable.
In the example of Figure 4, the one or more LEDs 50 are provided on a second edge 20-4 of the light guide 20. The second edge 20-4 is opposite to the edge 20-1 on which the photodetector 22 is provided (hereafter referred to as the first edge 20-1). However, due to the spreading of the light within the light guide 20, it is also possible for the LEDs 50 to be provided on edges 20-5, 20-6 that are perpendicular to the first edge 20-1. Also, when the user-input interface includes out-coupling structures 28, which cause scattering of light within the light guide 20, it is even possible for the one or more LEDs 50 to be provided on the first edge 20-1 along with the photodetector 22.
The one or more LEDs 50 may be driven using pulse width modulation (PWM). This allows the brightness of the light emitted by the LEDs 50 to be varied. In examples in which PWM is used, the electronic circuitry 26 may be configured to separate out, by filtering, the high frequency elements of the photodetector signal (i.e. the elements resulting from the PWM-driven LEDs 50) from the low frequency elements of the signal that are due to external light. Alternatively, the frequency of the PWM may be used to amplify, for example using a lock-in amplifier, those elements of the photodetector signal that result from the LEDs 50. As such, using one of these alternative methods, it is possible for the electronic circuitry 26 to cause actions to be performed based only on the elements of the signal that are due to the LEDs 50 or based only on the elements of the signal that result from external light. The electronic circuitry 26 may be operable to switch between detecting user inputs based only on external light sources and detecting user inputs based only on the light emitted by the LEDs 50.
Although the LEDs 50 have been described as being driven by PWM, it will be appreciated that they may alternatively be driven using other modulation methods, such as amplitude modulation. As long as the modulation is at a frequency that is distinguishable from the fluctuations of external light, the LED light can be distinguished from external light using filtering or amplification.
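As an illustrative sketch only (Python; the window length and the choice of a simple moving average are assumptions, not taken from the patent), the separation of slowly varying external light from a modulated LED contribution might look as follows. A practical implementation could equally use a band-pass filter or a lock-in amplifier, as the text suggests.

```python
# Illustrative sketch: split the sampled photodetector trace into a slowly
# varying ambient part and a residual that carries the modulated LED part.
import numpy as np

def split_led_and_ambient(signal, window=64):
    """Return (ambient_estimate, led_component) for a sampled photodetector trace."""
    signal = np.asarray(signal, dtype=float)
    kernel = np.ones(window) / window
    # A moving average keeps only slow changes such as external daylight.
    ambient = np.convolve(signal, kernel, mode="same")
    # What remains fluctuates at the (much higher) modulation frequency of the LEDs.
    led_component = signal - ambient
    return ambient, led_component

# The circuitry could then act on np.abs(led_component).mean() when LED light is
# the reference, or on the ambient estimate when external light is the reference.
```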
By being able to make a distinction between the origins of the light arriving at the photodetector 22, a more controlled environment can be created. For example, when using the LEDs 50 as the light source, changes in external light levels not caused by the user, such as other people passing near the surface of the light guide or fluctuating external light
sources (e.g. televisions), can be separated and ignored. This makes the user-input interface more robust and less prone to errors. Conversely, when the electronic circuitry 26 is instead configured to detect the external light arriving at the photodetector, the light injected by the LEDs 50 may make changes in the light from external sources difficult to detect. As such, by filtering out the light from the LEDs 50, a higher signal-to-noise ratio can be achieved. It may be preferable to detect light from external sources as opposed to light emitted by the LEDs because it is desirable for the user to be able to provide user inputs without actually touching the surface of the light guide.
As will be discussed in more detail below, when the user- input interface comprises one or more LEDs, the electronic circuitry 26 may be configured to control the operation of the one or more LEDs 50 in dependence on signals received from the
photodetector 22.
Figures 5A and 5B are qualitative graphs showing the signal output by the photodetector 22 (and thus also the light level arriving at the photodetector) when the user's finger is at different locations on the surface of the light guide 20. Specifically, Figure 5A illustrates the light level pattern when the user's finger is aligned with the photodetector 22 and is moved perpendicularly away from the first edge 20-1 (in other words, directly away from the photodetector). Figure 5B illustrates the light level pattern when the user's finger 24 is moved in a direction parallel to the first edge 20-1.
Due to the spreading of light within the light guide 20, the reduction in the amount of light that reaches the photodetector 22 is more pronounced when the user's finger is close to the photodetector 22. This can be seen in Figure 5A, which shows the lowest light level when the user's finger 24 is closest to the photodetector 22. As the perpendicular distance from the photodetector increases, so too does the amount of light reaching the photodetector 22. It will thus be appreciated that the user's finger 24 is easier to detect (i.e. is more detectable) when it is closer to the photodetector 22.
As can be seen from Figure 5B, the reduction in the amount of light reaching the photodetector 22 is most pronounced when the user's finger 24 is aligned with the photodetector 22. The reduction in the amount of light quickly drops off to almost zero as the user moves their finger to either side of the aligned position.
The electronic circuitry 26 may be configured to recognize specific
photodetector 22 signal patterns that result from specific gestures as user inputs. In response to detecting a particular user input, the electronic circuitry 26 may be configured to cause a particular action to be performed, depending on the type of user input detected. For example,
the electronic circuitry 26 may be configured to recognize "swipe inputs" (i.e. linear movements of the user's finger on or proximate to the surface) and to cause different actions to be performed based on the type of the swipe input (e.g. in a parallel direction or a perpendicular direction). So, the electronic circuitry 26 may recognize a high photodetector signal, followed by a sudden significant decrease, followed by a sudden return to the original high level as a parallel swipe, and may, in emissive window examples, cause the LEDs 50 to be switched on or off. Similarly, the electronic circuitry 26 may be configured to respond to the detection of a perpendicular swipe input (i.e. a high signal followed by a gradual reduction or a low signal followed by a gradual increase) by causing the brightness of the LEDs 50 to be changed.
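Purely as a hedged illustration (Python; the thresholds and example traces are assumptions, not values from the patent), a crude classifier distinguishing the two swipe types described above might look like this:

```python
# Sketch: tell a "parallel swipe" (sharp dip and quick recovery) from a
# "perpendicular swipe" (gradual change) in a single photodetector trace.
import numpy as np

def classify_swipe(trace, sharp_step=0.3):
    trace = np.asarray(trace, dtype=float)
    scale = max(trace.max(), 1e-9)
    steps = np.diff(trace) / scale
    if (steps < -sharp_step).any() and (steps > sharp_step).any():
        return "parallel swipe"        # sudden drop followed by sudden return
    if abs(trace[-1] - trace[0]) / scale > sharp_step:
        return "perpendicular swipe"   # gradual net change in light level
    return "no swipe"

print(classify_swipe([100, 100, 40, 42, 100, 100]))   # parallel swipe
print(classify_swipe([100, 92, 81, 70, 61, 50]))      # perpendicular swipe
```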
The user-input interface is also operable to detect the momentary presence of a non-moving finger on or proximate to the surface 20-2 of the light guide 20. This type of gesture may be referred to as a "tap input". The electronic circuitry 26 may also or alternatively associate different actions with different types of tap inputs (e.g. a short tap or a long tap).
In other examples, complex gestures may be required in order for the electronic circuitry 26 to recognize, or detect, a user input and to respond by causing a certain action to occur. Complex gestures comprise more than one successive (or linked) gesture and result in a recognizable, specific series of photodetector 22 signal changes, which is less likely to occur as a result of an unintentional event. As such, the electronic circuitry 26 may be configured to detect a user input only in response to a specific series of changes, or fluctuations, in the signal received from the photodetector 22. Examples of complex gestures include a "back-and-forth" swipe (i.e. across the surface in one direction and then back again), a "double tap" (i.e. two successive short taps), and a tap followed by a swipe. The recognition of only complex gestures as user inputs may be particularly useful when external sources, and not the LEDs, are providing the light detected by the photodetector. This is because in those circumstances unintentional fluctuations, resulting from, for example, moving shadows or varying external light sources, may occur more commonly. The requirement that a user input is only recognized in response to receipt of a specific series of photodetector signal fluctuations (due to a complex gesture) reduces the chance that unwanted actions will be triggered by the electronic circuitry 26.
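The following Python sketch, offered only as an illustration with assumed timing thresholds, shows how a "double tap" could be recognized as two short dips occurring within a limited window:

```python
# Sketch: recognize a "double tap" as two short dips in the photodetector
# signal occurring within a limited time window. All values are assumptions.

def find_dips(samples, baseline=100.0, drop=0.3, max_len=5):
    """Return the start indices of short dips below the baseline."""
    dips, start = [], None
    for i, s in enumerate(samples):
        if s < (1.0 - drop) * baseline:
            start = i if start is None else start
        elif start is not None:
            if i - start <= max_len:           # short enough to count as a tap
                dips.append(start)
            start = None
    return dips

def is_double_tap(samples, max_gap=20):
    dips = find_dips(samples)
    return any(0 < dips[i + 1] - dips[i] <= max_gap for i in range(len(dips) - 1))

trace = [100] * 10 + [60, 62] + [100] * 8 + [58, 61] + [100] * 10
print(is_double_tap(trace))   # True
```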
Obviously, the specific actions that are caused to be performed (or are triggered) by the electronic circuitry 26 may vary depending on the purpose for which the user-input interface is being used. As an example, when the user-input interface is part of an
emissive window, the different actions may include, but are not limited to, switching the LEDs on or off, dimming the LEDs up or down, changing the color of the emitted light and causing dynamic light effects to occur.
Figure 6 shows another example of a user-input interface 6 in accordance with the invention. The user-input interface 6 of Figure 6 differs from that of Figure 2 only in that it comprises plural photo detectors 22, each operable to detect an amount of light arriving thereat from within the light guide 20. The electronic circuitry 26 receives a signal from each photodetector 22.
In the example of Figure 6, the user-input interface 6 comprises plural photo detectors 22 provided on the first edge 20-1 and plural photo detectors 22 provided on a different edge 20-5, which, in this example, is perpendicular to the first edge 20-1.
However, in other examples, plural photo detectors 22 may be provided on just a single edge. Alternatively, one or more photo detectors 22 may be provided on each of plural edges (e.g. 20-1 and 20-5).
The provision of plural photo detectors 22 on a single edge 20-1 enables more information to be determined regarding the horizontal movement of the user's finger on the surface 20-2 of the light guide 20. For example, it can be determined whether a parallel swipe is left-to-right or right-to-left. Also, the provision of plural photo detectors 22 on a single edge enables more than one simultaneous input (i.e. "multi-touch") to be detected.
The provision of one or more photo detectors 22 on each of plural edges allows the sensitivity of the user-input interface to be increased and/or allows more complex gestures to be identified. Specifically, the provision of one or more photo detectors 22 on opposite edges (e.g. 20-1, 20-4) provides greater sensitivity. This is because a movement of the user's finger 24 away from a first photodetector 22 on the first edge 20-1, which reduces the detectability of the finger by the first photodetector 22, is necessarily towards a second photodetector 22 on the opposite edge 20-4, and so increases the detectability of the finger by the second photodetector 22. As such, the user's finger 24 is more likely to be detected regardless of its location on the surface 20-2. The provision of one or more photo detectors 22 on perpendicular edges (e.g. 20-1, 20-5) allows two-dimensional gestures, such as circular swipes, to be more easily distinguished from linear, one-dimensional gestures. The provision of plural photo detectors 22 at each of plural edges of the light guide 20 may allow the location of the user input to be determined based on the locations of the two or more photo detectors 22 which are detecting a reduced amount of light.
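As a non-authoritative sketch (Python; the detector coordinates and baseline values are assumptions, not from the patent), one way to combine the reductions seen by several photodetectors into a rough touch location is a weighted average of the detector positions:

```python
# Sketch: estimate the touch position from per-detector light reductions,
# weighting each detector's known position by how much light it has lost.

def estimate_position(baselines, readings, positions):
    """baselines/readings: light levels per detector; positions: (x, y) per detector."""
    drops = [max(b - r, 0.0) for b, r in zip(baselines, readings)]
    total = sum(drops)
    if total == 0:
        return None  # no detector sees a reduction, so there is no touch to locate
    x = sum(d * p[0] for d, p in zip(drops, positions)) / total
    y = sum(d * p[1] for d, p in zip(drops, positions)) / total
    return (x, y)

# Example: detectors on two perpendicular edges of a unit-square light guide.
print(estimate_position([100, 100, 100], [60, 90, 80],
                        [(0.0, 0.5), (0.5, 0.0), (1.0, 0.5)]))
```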
Although not shown in any of the figures, it will be appreciated that user-input interfaces in accordance with the invention may comprise plural photo detectors at various locations around the perimeter of the light guide in addition to one or more LEDs.
In the examples of Figures 2, 4 and 6, the light guide 20 is planar. In other words, the light guide is formed of a flat sheet, or layer, of material. However, in some examples, the light guide 20 may be formed of a curved sheet of material. Also, in the examples of Figures 2, 4 and 6, the light guide 20 is rectangular, specifically square, and as such has two pairs of opposing parallel edges. However, in other examples, the light guide 20 may be a different shape, such as circular or triangular and so may have only one continuous edge or three edges respectively. In such examples, instead of two photo detectors 22 being provided on different perpendicular edges (so as to increase detectability of two dimensional gestures), they may instead be provided on two different edge portions that are nonparallel (i.e. are not parallel with one another).
The at least one processor 26-1 may comprise any number of processors and/or microprocessors. The at least one memory 26-2 may comprise any number of discrete memory media of any suitable type (e.g. ROM, RAM, EEPROM, FLASH etc.). The electronic circuitry 26 may comprise a microcontroller. The electronic circuitry 26 may comprise any combination of suitable analogue electrical components, as well as or instead of the at least one processor 26-1 and the at least one memory 26-2.
Although, in the above examples, a finger is described as providing the user inputs, it will be appreciated that the object could be something other than the user's finger. For example, a user's hand could be detected on or near the surface of the light guide.
Alternatively, a stylus could be used to provide the user inputs. Where LEDs are used as the light source, the stylus must be configured such that the contact area between the stylus and the surface of the light guide is sufficiently large for the user input to be detected.
Figure 7 is a flow chart illustrating an example of the way in which the electronic circuitry 26 may operate to cause actions to be performed based on user inputs received at the user input interface.
In step S1, a signal indicative of the amount of light arriving at the photodetector 22 is continuously received from the photodetector 22. In embodiments in which the user-input interface includes plural photodetectors 22, plural signals, one from each photodetector, are received by the electronic circuitry 26.
In step S2, the raw photodetector signal (or signals) is pre-processed, for example, by filtering. Pre-processing can be used, for example, to remove noise from the
raw signal before processing for input detection is carried out. In some examples, the preprocessing of step S2 is omitted.
Next, in step S3, the pre-processed (or raw) signal is stored in a buffer in the at least one memory 26-2.
Subsequently, in step S4, the stored signal is processed so as to provide a processed signal from which recognisable parameters can be extracted. The signal processing of step S4 may be carried out using many different types of filtering algorithm. For example, a high pass filter may be used to isolate those components of the photodetector signal that are due to deliberate user inputs. Another example of a suitable filtering algorithm is to take the derivative over time of the received signal.
In step S5, the signal parameters are extracted or calculated from the processed signal.
In step S6, the extracted parameters are compared with stored reference signal signatures.
In step S7, it is determined whether there is a match between the extracted signal parameters and one of the reference signal signatures. If so, the method proceeds to step S8, in which the electronic circuitry 26 causes an action that is associated with the matched reference signal signature to be performed. If no match is detected, the method returns to step S1.
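The following Python sketch mirrors the loop of steps S1 to S8; the buffer length, the derivative-based processing, the parameter extraction and the single stored signature are all placeholder assumptions rather than the patent's implementation.

```python
# Sketch of the S1-S8 loop of Figure 7: receive, pre-process, buffer, process,
# extract parameters, compare with stored signatures, trigger the matched action.
from collections import deque

import numpy as np

BUFFER = deque(maxlen=256)                       # S3: rolling buffer in memory

def preprocess(sample):                          # S2: placeholder pre-processing
    return float(sample)

def process(buffer):                             # S4: derivative over time
    return np.diff(np.asarray(buffer, dtype=float))

def extract_parameters(processed):               # S5: crude fluctuation count
    return {"fluctuations": int((np.abs(processed) > 5.0).sum())}

SIGNATURES = {"double_tap": {"fluctuations": 4}} # S6: stored reference signatures

def handle_sample(sample, on_action=print):
    BUFFER.append(preprocess(sample))            # S1 + S2 + S3
    params = extract_parameters(process(BUFFER)) # S4 + S5
    for name, signature in SIGNATURES.items():   # S6 + S7
        if params == signature:
            on_action(name)                      # S8: trigger the associated action
```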
The signal parameters that can be extracted from the processed signal and the signal signatures that are associated with certain gestures are discussed below with reference to Figures 8A and 8B, 9A and 9B and 10A and 10B.
Figure 8A shows a "double back-and- forth swipe" gesture. This is illustrated by the four arrows Mi to M4, which show the distinct movements that constitute the gesture. Although Mi to M4 are shown at different distances from the edge 20-1 at which the photodetector 22 is provided, this is for illustration purposes only. Figure 8B shows, in the top graph, the resulting raw signal output by the photodetector 22 and, in the bottom graph, the processed signal (following step S4).
The raw signal of Figure 8B has a main low frequency envelope E and four sets of sudden signal variations V1 to V4. The low frequency envelope E is due to changes in background illumination caused by, for example, clouds passing in front of the sun or people passing by the light guide. The four sets of sudden signal variations V1 to V4 result from sudden reductions in light caused by the movements M1 to M4 respectively.
In the example of Figure 8B, the signal is processed by differentiating the raw photodetector signal. The result of this can be seen in the lower graph. In this instance, differentiating the raw signal retains the characteristic fluctuations V1 to V4 that are due to the user input and virtually eliminates the variation in signal intensity that is due to the changes in background illumination. From this differentiated signal, various parameters can be identified. For example, the proximity of the user's hand to the photodetector may be identifiable from the magnitude of the fluctuations in the processed signal. Also, the frequency at which the user's hand is moved back and forth across the surface of the light guide can be identified from the separation between the fluctuations.
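A small Python illustration of this point, with made-up numbers standing in for the envelope E and the fluctuations V1 to V4, is given below; it simply compares the largest slope of the slow background with the largest slope caused by the dips.

```python
# Sketch: differentiating the raw trace suppresses the slow background envelope
# while preserving sharp fluctuations caused by the swipes. Values are illustrative.
import numpy as np

t = np.linspace(0, 4, 400)
background = 100 + 10 * np.sin(0.5 * t)            # slow drift (clouds, passers-by)
raw = background.copy()
for centre in (0.5, 1.5, 2.5, 3.5):                # four quick dips, one per movement
    raw -= 30 * np.exp(-((t - centre) / 0.03) ** 2)

processed = np.diff(raw)                           # step S4: derivative over time
print("largest background slope:", np.abs(np.diff(background)).max())
print("largest swipe-induced slope:", np.abs(processed).max())
```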
Figures 9A and 9B show a user input gesture and the resulting received photodetector signals from three photodetectors 22-1, 22-2, 22-3 provided at a single edge 20-1 of the light guide. The user input gesture of Figure 9A is the same as that shown in Figure 8A.
Figure 9B shows, in the topmost graph, the raw signal received from the first photodetector 22-1, in the next graph down, the signal received from the second
photodetector 22-2 and, in the third graph down, the signal received from the third photodetector 22-3. The lower three graphs show the processed versions of these signals. Specifically, the fourth graph shows the processed first photodetector signal, the fifth graph shows the processed second photodetector signal and the sixth graph shows the processed third photodetector signal. Once again, the raw signals have been processed by
differentiating the raw signal with respect to time. For ease of illustration, the raw signals of Figure 9B are shown without the effects of background illumination.
The provision of plural photodetectors 22-1, 22-2, 22-3 allows additional parameters to be identified. Specifically, the direction of movement and the speed of movement can also be determined. These can be determined by comparing the times at which sudden signal variations occur at each of the photodetectors 22-1, 22-2, 22-3.
Dashed boxes P1 to P4 on Figure 9B each show three fluctuations, one in each photodetector signal, that result from a single movement of the gesture. P1 shows the fluctuations that result from the first movement M1, P2 shows the fluctuations that result from the second movement M2, P3 shows the fluctuations that result from the third movement M3, and P4 shows the fluctuations that result from the fourth movement M4.
In order to determine the direction of movement, the order in which the three fluctuations occur is determined. In P1, a fluctuation occurs first in the signal from the first photodetector 22-1, then in the signal from the second photodetector 22-2 and finally in the signal from the third photodetector 22-3. As the first photodetector 22-1 is the leftmost of the three and the third photodetector 22-3 is the rightmost of the three (see Figure 9A), it can be determined that the movement M1 is from left to right. In P2, the fluctuations occur in the opposite order and so it can be determined that the direction of movement M2 is from right to left.
The speed of a movement can be determined from the time difference between the first-occurring fluctuation and the occurrence of a fluctuation in a signal from a different photodetector 22-2, 22-3.
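As an illustrative sketch (Python; the detector spacing and example times are assumptions, not from the patent), direction and speed could be derived from the fluctuation times as follows:

```python
# Sketch: derive direction and speed of a movement from the times at which
# fluctuations appear at three photodetectors on one edge (left to right).

def direction_and_speed(fluctuation_times, detector_spacing=0.05):
    """fluctuation_times: fluctuation time at detectors 22-1, 22-2, 22-3 (seconds)."""
    t1, t2, t3 = fluctuation_times
    direction = "left to right" if t1 < t3 else "right to left"
    # Speed from the time taken to traverse the span between the outer detectors.
    speed = 2 * detector_spacing / abs(t3 - t1) if t3 != t1 else float("inf")
    return direction, speed

print(direction_and_speed([0.00, 0.04, 0.08]))   # e.g. movement M1: left to right
print(direction_and_speed([0.30, 0.26, 0.22]))   # e.g. movement M2: right to left
```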
In the example of Figures 9A and 9B, the parameters that may be extracted from the processed signal of Figure 9B include: the number of distinct movements, the direction of each movement, the speed of each movement, the time at which each movement started or finished, and the proximity of each movement to the photodetector(s) 22.
For the example of Figures 9A and 9B, the extracted signal parameters may be as follows:
- M1 => direction = left to right; speed = v1; time = t1; proximity = p1;
- M2 => direction = right to left; speed = v2; time = t2; proximity = p2;
- M3 => direction = left to right; speed = v3; time = t3; proximity = p3;
- M4 => direction = right to left; speed = v4; time = t4; proximity = p4.
In this instance, as the gesture is a "double back-and-forth swipe", there are four distinct movements. The speeds of the four movements are all approximately equal (i.e. v1 ≈ v2 ≈ v3 ≈ v4), the time differences between the start/end of one movement and the start/end of the next are approximately equal (i.e. t1-t2 ≈ t2-t3 ≈ t3-t4) and the proximities are all approximately equal (i.e. p1 ≈ p2 ≈ p3 ≈ p4).
Some or all of these parameters and/or their values relative to one another constitute the signal signature of the gesture. For example, for a double back-and-forth swipe gesture, the signal signature may comprise four successive movements, each being in an opposite direction to the last and having approximately equal speeds and equal proximities to the photodetectors. If the parameters extracted in step S5 of Figure 7 match this signature, then the electronic circuitry 26 recognizes the gesture as a double back-and-forth swipe and causes the associated action to be performed.
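A hedged Python sketch of such a comparison, with an assumed tolerance for "approximately equal" values, might be:

```python
# Sketch: check extracted movement parameters against the signature of a
# "double back-and-forth swipe": four movements, alternating directions,
# roughly equal speeds and proximities. The tolerance is an assumption.

def is_double_back_and_forth(movements, tolerance=0.2):
    if len(movements) != 4:
        return False
    directions = [m["direction"] for m in movements]
    if any(directions[i] == directions[i + 1] for i in range(3)):
        return False                            # each movement must reverse the last
    def roughly_equal(values):
        mean = sum(values) / len(values)
        return all(abs(v - mean) <= tolerance * mean for v in values)
    return (roughly_equal([m["speed"] for m in movements]) and
            roughly_equal([m["proximity"] for m in movements]))

movements = [
    {"direction": "left to right", "speed": 0.9, "proximity": 0.5},
    {"direction": "right to left", "speed": 1.0, "proximity": 0.5},
    {"direction": "left to right", "speed": 1.0, "proximity": 0.5},
    {"direction": "right to left", "speed": 0.9, "proximity": 0.5},
]
print(is_double_back_and_forth(movements))      # True
```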
Figure 10A shows a circular clockwise gesture in respect of a user-input interface having three photodetectors 22-1, 22-2, 22-3. Figure 10B shows the resulting raw
signals from each photodetector. In this example, as there is no change in background illumination, the raw signal can be used to identify the signal parameters.
The gesture of Figure 10A results in a first set of three signal fluctuations F1 that occur when the user's hand is approximately at the first location in Figure 10A.
Subsequently, when the user's hand is at the second location and so is not aligned with any of the photodetectors 22, no fluctuations occur. Next, when the user's hand is approximately at the third location, a second set of fluctuations F2 are detected. Finally, when the user's hand is at the fourth location, there is another absence of fluctuations.
In the first set of fluctuations F1, the first fluctuation appears in the signal from the third photodetector 22-3 and the final fluctuation appears in the signal from the first photodetector 22-1. As such, the direction can be identified as right to left. In the second set of fluctuations F2, the first fluctuation appears in the signal from the first photodetector 22-1 and the final fluctuation appears in the signal from the third photodetector 22-3. As such, the direction can be identified as left to right.
In the first set F1, the magnitude of the fluctuations is greater than in set F2. This is because the user's hand is closer to the photodetectors 22 and so prevents a greater amount of light from reaching the photodetectors 22. The speeds and times of the fluctuations may be calculated as described above.
The extracted parameters of a clockwise circular gesture may be as follows:
- For set F1 => direction = right to left; speed = v1; time = t1; proximity = p1;
- For set F2 => direction = left to right; speed = v2; time = t2; proximity = p2.
The signature that identifies the clockwise circular gesture may comprise two sets of fluctuations having approximately equal speeds (i.e. v1 ≈ v2) and opposite directions of movement, with the right-to-left movement being nearer to the photodetectors than is the left-to-right movement (i.e. p1 > p2). It will be appreciated that t1 may be greater or less than t2, depending on the starting position of the user's hand. If the parameters extracted from the processed signal match this signature, the electronic circuitry 26 may cause an action that is associated with a clockwise circular gesture to be performed.
As is discussed above, if the user input interface uses an 'internal' light source (i.e. LEDs emitting light into an edge of the light guide), the photodetector signal fluctuations are caused by extracting light from the light guide. As such, direct physical contact between the user and the light guide is required in order to provide a user input. The requirement of direct physical contact between the user and the light guide greatly reduces the probability of user inputs being erroneously detected as a result of changes in background illumination.
This is particularly true when the light from external sources is distinguished from light from the LEDs using the methods described above (for instance, using PWM to generate the LED light and subsequently filtering, for example Fourier filtering, the received photodetector signal). Initial filtering to distinguish between the different sources of light may be carried out in the pre-processing step S2 of Figure 7.
If the user-input interface instead uses an external light source, the photodetector signal changes are caused by preventing the in-coupling of external light into the light guide. As such, no direct contact between the user and the light guide is required to provide a user input. In this case, avoiding the erroneous detection of a user input becomes more important. As such, fast, repetitive gestures, which are easier to distinguish from changes in background illumination, are preferable for providing user inputs to slower or singular movements.
In order to save energy and processing power, when no user input is occurring (or has recently occurred), the electronic circuitry 26 may set the user input interface into a "standby mode". In this mode, only a relatively small number (e.g. one or two) of photodetectors may be active. As such, only limited processing power may be used while still allowing simple gestures, such as swiping repetitively back and forth across the light guide (as in Figures 8A and 9A), to be detectable. Once a simple gesture is detected, the system may set itself to an "active mode" in which all photodetectors become active and more complex gestures and user inputs can be identified.
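The following Python sketch is an assumed, simplified rendering of such a standby/active scheme; the wake criterion (a count of dips seen by the single active photodetector) is illustrative only.

```python
# Sketch: a simple standby/active state machine. In standby only one detector
# is polled; a crude repetitive wake gesture switches the interface to active
# mode, in which all photodetectors would be enabled. Values are assumptions.

class InputInterfaceMode:
    def __init__(self, wake_dips=4):
        self.mode = "standby"
        self.wake_dips = wake_dips
        self.recent_dips = 0

    def on_standby_sample(self, light_level, baseline=100.0, drop=0.3):
        if self.mode != "standby":
            return
        if light_level < (1.0 - drop) * baseline:
            self.recent_dips += 1
        if self.recent_dips >= self.wake_dips:
            self.mode = "active"               # enable all photodetectors
            self.recent_dips = 0

iface = InputInterfaceMode()
for sample in [100, 60, 100, 55, 100, 58, 100, 62]:   # repetitive swipes
    iface.on_standby_sample(sample)
print(iface.mode)   # "active"
```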
It will be appreciated that the term "comprising" does not exclude other elements or steps and that the indefinite article "a" or "an" does not exclude a plurality. A single processor may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to an advantage. Any reference signs in the claims should not be construed as limiting the scope of the claims.
Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combinations of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of features during the prosecution of the present application or of any further application derived therefrom.
Other modifications and variations falling within the scope of the claims hereinafter will be evident to those skilled in the art.
Claims
1. A user-input interface comprising:
a light guide having first and second surfaces and at least one edge extending between the first and second surfaces;
a photodetector provided at the at least one edge and operable to output a signal indicative of an amount of light arriving at the photodetector from within the light guide; and
electronic circuitry configured to detect an object in physical contact with one of the first and second surfaces of the light guide in response to receipt of a signal from the photodetector that is indicative of a reduction in the amount of light arriving at the photodetector.
2. The user-input interface of claim 1, the electronic circuitry being configured:
to detect a user input in response to receipt from the photodetector of a signal indicative of a reduction in the amount of light arriving at the photodetector; and
to respond to the detection of the user input by triggering an action.
3. The user-input interface of any preceding claim, the electronic circuitry being configured:
to detect a user input in response to receipt of a signal from the photodetector that is indicative of a specific series of fluctuations in the amount of light arriving at the photodetector; and
to respond to the detection of the user input by triggering an action.
4. The user-input interface of any preceding claim, comprising one or more light emitting diodes provided at the at least one edge of the light guide and operable to emit light into the light guide.
5. The user-input interface of claim 4, wherein the photodetector is constituted by one of the one or more light emitting diodes.
6. The user-input interface of claim 4 or claim 5, wherein the light guide comprises a plurality of out-coupling structures.
7. The user-input interface of claim 6, wherein the user-input interface is an emissive window.
8. The user-input interface of any of claims 4 to 7, wherein the one or more light emitting diodes are modulated and the electronic circuitry is configured:
to detect user inputs using only elements of the signal from the photodetector that result from the light emitted by the light emitting diodes; or
to detect user inputs using only elements of the signal from the photodetector that result from external light sources.
9. The user-input interface of claim 8, wherein the electronic circuitry is configured to filter elements of the signal corresponding to a frequency of the modulation of the light emitting diodes, or wherein the electronic circuitry is configured to amplify elements of the signal from the photodetector that correspond to a frequency of the modulation of the light emitting diodes.
10. The user-input interface of any of claims 4 to 9, wherein the electronic circuitry is configured to control the operation of the one or more light emitting diodes in response to detection of a user input.
11. The user-input interface of claim 10, wherein the electronic circuitry is configured to respond to a first specific series of fluctuations in the received signal by controlling the one or more light emitting diodes in a first manner, and to respond to a second, different specific series of fluctuations in the received signal by controlling the light emitting diodes in a second manner.
12. The user-input interface of any preceding claim, comprising a plurality of photo detectors provided at one or more edges of the light guide.
13. The user-input interface of claim 12, wherein plural photo detectors are provided along a single edge of the light guide.
14. The user-input interface of claim 12 or claim 13, wherein a first of the plurality of photo detectors is provided at a first edge of the light guide and a second of the plurality of photo detectors is provided at a second edge of the light guide.
15. The user-input interface of claim 12 or claim 13, wherein a first of the plurality of photo detectors is provided at a first edge portion of the light guide and wherein a second of the plurality of photo detectors is provided at a second edge portion of the light guide, wherein the first and second edge portions are nonparallel.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261652486P | 2012-05-29 | 2012-05-29 | |
US61/652,486 | 2012-05-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013179168A1 (en) | 2013-12-05 |
Family
ID=48703621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2013/053905 (WO2013179168A1) | User-input interface with a light guide | | 2013-05-14 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013179168A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100187422A1 (en) * | 2009-01-23 | 2010-07-29 | Qualcomm Mems Technologies, Inc. | Integrated light emitting and light detecting device |
US20100295821A1 (en) * | 2009-05-20 | 2010-11-25 | Tom Chang | Optical touch panel |
WO2011067719A1 (en) | 2009-12-03 | 2011-06-09 | Koninklijke Philips Electronics N.V. | Transparent emissive window element |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015155508A1 (en) * | 2014-04-11 | 2015-10-15 | David Dearn | Optical touch screen with a lossy dispersive ftir layer |
GB2539597A (en) * | 2014-04-11 | 2016-12-21 | T-Phy Ltd | Optical touch screen with a lossy dispersive FTIR layer |
CN106471449A (en) * | 2014-04-11 | 2017-03-01 | T-物理有限公司 | There is the optical touch screen of loss dispersion FTIR layer |
JP2017510928A (en) * | 2014-04-11 | 2017-04-13 | ティー‐ファイ リミテッド | Optical touch screen using loss dispersion FTIR layer |
US10175822B2 (en) | 2014-04-11 | 2019-01-08 | T-PHY Ltd. | Optical touch screen with a lossy dispersive FTIR layer |
US10684727B2 (en) | 2014-04-11 | 2020-06-16 | Uniphy Limited | Optical touch screen with a lossy dispersive FTIR layer |
GB2539597B (en) * | 2014-04-11 | 2022-03-02 | Uniphy Ltd | Optical touch screen with a lossy dispersive FTIR layer |
EP2975768A1 (en) * | 2014-07-18 | 2016-01-20 | SMR Patents S.à.r.l. | Operating device for motor vehicles |
WO2016009407A1 (en) | 2014-07-18 | 2016-01-21 | Smr Patents S.A.R.L. | Operating device for motor vehicles |
GB2545575A (en) * | 2014-07-18 | 2017-06-21 | Smr Patents Sarl | Operating device for motor vehicles |
US9891755B2 (en) | 2014-07-18 | 2018-02-13 | SMR Patents S.à.r.l. | Operating device for motor vehicles |
GB2545575B (en) * | 2014-07-18 | 2021-05-05 | Smr Patents Sarl | Operating device for motor vehicles |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106471449B (en) | Optical touch screen with loss dispersive FTIR layer | |
US20180225006A1 (en) | Partial detect mode | |
TWI439907B (en) | Optical touch device and detection method thereof | |
KR102292993B1 (en) | Orthogonal signaling touch user, hand and object discrimination systems and methods | |
US20090267919A1 (en) | Multi-touch position tracking apparatus and interactive system and image processing method using the same | |
US20110050650A1 (en) | Interactive input system with improved signal-to-noise ratio (snr) and image capture method | |
US20150035799A1 (en) | Optical touchscreen | |
US12045420B2 (en) | Optical touch screen | |
EP3019937B1 (en) | Gesture-sensitive display | |
US20140111478A1 (en) | Optical Touch Control Apparatus | |
WO2013179168A1 (en) | User -input interface with a light guide | |
TWI502447B (en) | Touch panel and operating method using the same | |
CN101593067B (en) | Screen controlled by adopting optical signals | |
CN103425275A (en) | Sensing module and sensing method both applied to optical mouse and provided with electricity-saving function | |
CN101825797A (en) | Photo induction touch-control liquid crystal display device | |
CN101813994B (en) | Touch position identifying method | |
TW201337649A (en) | Optical input device and input detection method thereof | |
CN202257514U (en) | Non-contact mechanical keyboard device with motion sensing function | |
TWI493415B (en) | Operating system and operatiing method thereof | |
CN101916151B (en) | Optical plate structure, touch display panel and touch liquid crystal display comprising same | |
KR101131768B1 (en) | Multi touch screen for high speed sampling | |
TWI536228B (en) | An inductive motion-detective device | |
CN103425227A (en) | Sensing module with power saving function and sensing method thereof | |
CN102654810B (en) | Touch module and touch display using same | |
KR102101565B1 (en) | Media display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13732628; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 13732628; Country of ref document: EP; Kind code of ref document: A1